
SATISH KUMAR KANTA

San Antonio, TX

Summary

Data Engineer with 10+ years of experience, specializing in developing robust data platforms and solutions.

Proven expertise in SAS and SQL programming, with a strong focus on query optimization and performance tuning.

Adept at designing and implementing data conversion and normalization protocols, contributing to the seamless delivery of timely and accurate data for various business units.

Expertise includes SAS/BASE, SAS/SQL, SAS/MACROS, SAS/ODS, SAS/STAT, SAS/ACCESS, SAS Enterprise Guide, SAS Web Report Studio, SAS Management Console, IBM DB2, Oracle, UNIX, LINUX, R, Python, Dataiku, Dremio, Tableau, Power BI, QlikView, and MicroStrategy.

Expertise in Dremio, Dataiku, Hive SQL, DbVisualizer, SQL, SAS, and Python. Leverage advanced data modeling techniques, incorporating Snowflake SQL and Collibra.

Utilize Business Objects for effective data visualization and reporting. Collaborate with cross-functional teams, adapting to new technologies and methodologies. Showcase adaptability and a strong aptitude for learning.

Exhibit excellent customer service skills, understanding client requirements and delivering solutions that exceed expectations.

Engage in continuous communication to ensure client satisfaction.

Excellent hands-on experience in SAS Grid Computing, SAS Grid Manager, Distributed Enterprise Scheduling, Workload Balancing, and parallelization of workloads.

Expertise with Platform RTM for SAS to monitor and configure the Grid Environment. Excellent experience in Business Analysis, Design, and Training of Users.

Excellent stakeholder management skills and effective cross-functional management expertise; seeking a challenging environment that encourages continuous learning and creativity, providing exposure to new ideas that support professional growth.

Developed and maintained MongoDB databases for high-traffic applications. Implemented real-time data aggregation and analysis using MongoDB. Designed and implemented a distributed data processing pipeline using MongoDB. Utilized MongoDB to access and manage unstructured data.

Excellent hands-on experience with RDBMS like Oracle 9i, 10g, Teradata, DB2, MS SQL Server, MS Access, Informix, Sybase, Oracle Clinical & Clinical Trials Databases.

Strong experience in writing complex UNIX shell scripts.

Expertise in migrating SAS Platforms, Applications, Data, and Catalogs from older SAS versions. Re-engineered SAS code to leverage the enhanced features of SAS for streamlining complex job streams and processes.

Strong experience in complete Software Development Life Cycle (SDLC), requirements gathering, Data Management Principles, Data Warehouse Design, Data Mapping, Dimensional Modeling, Star Schema, Data Analysis and Design, Data Quality Techniques, Database Development, and Test Plan Development.

Excellent experience in Extraction, Cleansing, Transformation & Loading (ETL) as per business requirements. Strong knowledge in OLTP, Data Warehouse, Data Mining, OLAP & Data Marts applications.

Created and administered several Standard Operating Procedures (SOPs), Technical Operating Procedures (TOPs), and Validation Guidelines.

Strong experience in working with SAS under different platforms such as Mainframe, UNIX, and Windows. Good experience in writing Installation Qualifications (IQs), Operational Qualifications (OQs), and Performance Qualifications (PQs) for various operating systems and numerous software packages.

Expertise in loading DB2 tables into Oracle using DB2 gateway, SQL Loader, and PDW (Product Data Warehouse) Server.

Used Bulk load to load SAS Datasets into DB2 tables.

Strong experience in installation, configuration, maintenance, and administration of SAS/EBI tools such as SAS/Data Integration Studio, SAS Information Maps Studio, SAS Management Console, SAS/Campaign Management Studio, SAS/OLAP Cube Studio, SAS Information Delivery Portal, SAS Web Report Studio, SAS Web Report Viewer, SAS Web OLAP Viewer & SAS/Enterprise Guide. Installed several software packages, including Base SAS, EBI, SAS/Warehouse Administrator, and SAS Forecast Server, in multiple instances such as development, staging, QA, testing, training, and production environments. Wrote complex shell scripts and configured run setups and profiles to execute the instances.

Strong experience in collaborating with stakeholders and working with people across 24 countries and regions, including Hong Kong, Singapore, China, Indonesia, Malaysia, the UAE, the UK, and several African markets. Deployed complex dashboards and visualizations using different reporting tools: Tableau, QlikView, Power BI & SAS VA. Experienced in ETL and data analytics across data warehousing, data profiling, data migration, and data visualization.

Advanced skills in data cleaning, manipulation, data migration, and creation of analytical data marts with SAS/DI Studio.

Strong experience in statistical modeling/machine learning and visualization tools. Strong exposure to all steps of the project life cycle, such as original data collection, secondary data query, data manipulation and analysis, optimization, summarizing findings, developing analysis indications, communicating strategic insights, and presenting results.

Experienced with SAS Visual Analytics (VA) and Tableau.

Hands-on experience in Azure development; worked on Azure web applications, App Services, Azure Storage, Azure SQL Database, virtual machines, Fabric Controller, Azure AD, Azure Search, and Notification Hubs. Designed, configured, and deployed Microsoft Azure for a multitude of applications utilizing the Azure stack (including Compute, Web & Mobile, Blobs, Resource Groups, Azure SQL, and Cloud Services), focusing on high availability, fault tolerance, and auto-scaling.

Developed methodologies for cloud migration, implemented best practices and helped to develop backup and recovery techniques for applications and database virtualization platform.

Adept at managing hosting plans for Azure infrastructure and implementing and deploying workloads on Azure virtual machines (VMs).

Good understanding of web application deployment and maintenance of IIS 5.0 and 7.0 and Apache on Amazon Web Services (AWS).

Demonstrated understanding of AWS data migration tools and technologies including Storage Gateway, Database Migration and Import Export services.

Practical database engineer possessing in-depth knowledge of data manipulation techniques and computer programming, paired with expertise in integrating and implementing new software packages and products into existing systems.

Managing various aspects of development, design and delivery of database solutions. Tech-savvy and independent professional bringing outstanding communication and organizational abilities.

History of mining, warehousing and analyzing data at the company-wide level.

Knowledgeable about the principles and implementation of machine and deep learning.

Results-oriented and proactive with top-notch skills in project management and communication.

Highly motivated employee with a desire to take on new challenges.

Strong work ethic, adaptability, and exceptional interpersonal skills.

Adept at working effectively unsupervised and quickly mastering new skills. Hardworking, with customer service, multitasking, and time management abilities. Devoted to giving every customer a positive and memorable experience, with a history of meeting company needs through consistent and organized practices.

Skilled in working under pressure and adapting to new situations and challenges to best enhance the organizational brand.

Organized and motivated, eager to apply time management and organizational skills in various environments while expanding skills and facilitating company growth.

Overview

12 years of professional experience
1 Certification

Work History

Data Engineer

SourceOn IT INC
Atlanta, USA
10.2023 - Current
  • Design and implement database structures using SAS and SQL to meet the organization's data architecture requirements
  • Develop and maintain data models for efficient storage and retrieval
  • Create and optimize ETL processes for extracting, transforming, and loading data into the data warehouse
  • Implement data quality checks and ensure data integrity across various data sources
  • Monitor and optimize database performance, identifying and resolving bottlenecks
  • Work closely with data analysts to understand their data needs and provide support in data extraction and transformation
  • Lead the design, development, and maintenance of a scalable data platform to support product and data engineering teams
  • Collaborate with cross-functional teams to understand data requirements and translate them into effective technical solutions
  • Leverage Python for building data pipelines, ensuring efficiency and reliability in data processing
  • Incorporate best practices in Python coding standards and contribute to the team's codebase
  • Address issues related to data completeness and quality, ensuring high standards in data-related activities
  • Work closely with the software development team to ensure solutions meet the highest standards for solving complex data challenges
  • Utilize SQL for data manipulation, extraction, and transformation processes
  • Develop and optimize complex SQL queries to extract meaningful insights from diverse datasets
  • Design and implement data conversion processes to ensure seamless integration of diverse data sources
  • Implement normalization protocols to maintain data consistency and integrity across the platform
  • Conduct query optimization and performance tuning to enhance database responsiveness and efficiency
  • Collaborate with product and data engineering teams to understand their data needs and provide timely solutions
  • Communicate complex technical concepts to non-technical stakeholders, ensuring a clear understanding of data-related processes
  • Analyze requirements and understand functional specifications, creating SQL tables using SQL queries
  • Implement robust testing procedures to validate the accuracy and reliability of data processing workflows
  • Conduct regular data quality checks and address any discrepancies in collaboration with relevant teams
  • Create and maintain comprehensive documentation for data engineering processes, ensuring knowledge transfer and team continuity
  • Created various Excel documents to assist with pulling metrics data and presenting information to stakeholders, providing concise explanations of the best placement for needed resources
  • Administered, supported, and monitored databases by proactively resolving database issues and maintaining servers
  • Worked with stakeholders to evaluate business use cases and brought the NoSQL Cassandra database on board
  • Deployed a multi-data-center, multi-rack Cassandra cluster in the production environment and a single-data-center cluster in the testing environment
  • Participated in requirements meetings and data mapping sessions to understand business needs
  • Researched and resolved issues regarding integrity of data flow into databases
  • Developed tables, views and materialized views using SQL
  • Identified and documented detailed business rules and use cases based on requirements analysis
  • Recommended data standardization and usage for protection of data integrity
  • Transformed project data requirements into project data models
  • Built library of models and reusable knowledge-based assets to produce consistent and streamlined business intelligence results
  • Optimized existing queries to improve query performance by creating indexes on tables.
  • Developed and implemented data models, database designs, data access and table maintenance codes.
  • Analyzed user requirements, designed and developed ETL processes to load enterprise data into the Data Warehouse.
  • Created stored procedures for automating periodic tasks in SQL Server.
  • Designed and developed reports using Business Objects, Tableau, and QlikView.
  • Developed Python scripts for extracting data from web services API's and loading into databases.
  • Integrated multiple sources of structured and unstructured datasets into a single platform.
  • Managed performance monitoring and tuning while identifying and repairing issues within database realm.
  • Collaborated with solution architects to define database and analytics engagement strategies for operational territories and key accounts.
  • Trained non-technical users and answered technical support questions.
  • Created conceptual, logical and physical data models for use in different business areas.
  • Worked as part of project teams to coordinate database development and determine project scopes and limitations.
  • Planned and installed database management system software upgrades to enhance systemic performance.
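
A rough illustration of the pipeline work described above — loading extracted records into a relational table, indexing the lookup column to avoid full table scans, and aggregating for reporting. All table names, columns, and figures are hypothetical, and SQLite stands in here for the DB2/SQL Server/Oracle systems named in the bullets:

```python
import sqlite3

def load_records(conn, records):
    """Create the target table, bulk-insert records, and index the lookup column."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, region TEXT, revenue REAL)"
    )
    conn.executemany(
        "INSERT INTO customers (id, region, revenue) VALUES (?, ?, ?)",
        records,
    )
    # Index the column used in WHERE/GROUP BY clauses so lookups avoid a full scan
    conn.execute("CREATE INDEX IF NOT EXISTS idx_customers_region ON customers (region)")
    conn.commit()

def region_totals(conn):
    """Aggregate revenue by region, mirroring a simple reporting query."""
    rows = conn.execute(
        "SELECT region, SUM(revenue) FROM customers GROUP BY region ORDER BY region"
    ).fetchall()
    return dict(rows)

conn = sqlite3.connect(":memory:")
load_records(conn, [(1, "APAC", 120.0), (2, "EMEA", 80.0), (3, "APAC", 50.0)])
print(region_totals(conn))  # {'APAC': 170.0, 'EMEA': 80.0}
```

In a real deployment the records would come from a web service API (as in the Python extraction bullet) and land in the warehouse rather than an in-memory database.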

Data Analyst

Standard Chartered Global Business Services
Bengaluru, Karnataka
10.2016 - 09.2023
  • Design, develop, and maintain scalable and efficient databases using SAS, SQL, and Snowflake
  • Perform data modeling to ensure data integrity and optimize database performance
  • Develop, implement, and manage ETL processes to extract, transform, and load data from various sources into the data warehouse or analytical systems
  • Utilize Python for scripting and automation of data workflows, ensuring efficiency and repeatability in data processing tasks
  • Gain experience with Snowflake to design and implement cloud-based data warehousing solutions
  • Utilize Python for scripting and automation, enhancing efficiency in data processing tasks
  • Contribute to the establishment of data governance policies and standards to ensure data quality and compliance
  • Create Tableau dashboards for data visualization, enabling stakeholders to derive insights from the data
  • Develop and maintain Excel-based reporting processes, ensuring accurate and timely data presentation
  • Demonstrate proficiency in SQL (DB2) for efficient data querying and manipulation
  • Utilize JIRA for project management and collaboration
  • Work with API integrations and ETL processes to enhance data flows
  • Lead software and data testing efforts, ensuring the reliability and accuracy of data-driven solutions
  • Implement and execute robust testing strategies
  • Demonstrate strong communication skills and problem-solving abilities
  • Work closely with clients, understanding their needs and translating them into effective data solutions
  • Effectively collaborated with stakeholders to build business dashboards
  • Analyzed the performance of the Credit Card and Personal Loan Telemarketing team and identified areas of improvement
  • Conceptualized and designed Credit Cards, Deposits & Loans Tier 3 ETL Frameworks for Retail Banking of different countries from the T1 Layer (source) to generate KPI dashboards for front-line visualization of performance analysis
  • Designed and created Datamarts for Credit Cards and Loans
  • Conceptualized and designed a Credit Card EAM data mart for India retail banking for monitoring application conversion KPIs and card activation across different demographics
  • Developed SAS programs for data extraction from multiple input sources
  • Data extraction from different transaction processing systems through Dremio using SQL queries
  • Scheduling ETL jobs to generate Daily / Weekly / Monthly Business dashboards for Bank Performance analysis
  • Converting SAS code to Dataiku process flows
  • Perform Trend Analysis and Forecasting
  • Designed Client level / Product level revenue reporting for Client Acquisition Channel
  • Supporting Data management and reporting for different geographies
  • Led both onshore and offshore teams consisting of ~20 members in driving deliverables
  • Created and modified several Oracle, DB2 stored procedures, functions, packages, triggers, alerts, and objects
  • Built library of models and reusable knowledge-based assets to produce consistent and streamlined business intelligence results
  • Designed and developed schema data models
  • Analyzed SAP transactions to build logical business intelligence model for real-time reporting needs
  • Created and improved intelligence resources to facilitate consistent data management strategies
  • Updated organizational systems and subsystems to improve and streamline data collection
  • Evaluated trends to understand competitive environments and assess current strategies
  • Compiled, evaluated and reviewed engineered data for internal system
  • Documented business workflows for stakeholder review
  • Identified and documented project constraints, assumptions, business impacts, risks and scope exclusions
  • Run transformations using functions (string, date and time, aggregate) and procedures (Proc SQL, Proc Freq, Proc Means, Proc Tabulate, Proc Transpose, Proc Rank, Proc Report)
  • Synthesized current business intelligence data to produce dashboards and polished presentations, highlighting findings and recommending changes
  • Trained employees on software to improve data management, monitored use and suggested improvements.
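
The transformation bullets above reference SAS summary procedures such as PROC MEANS and PROC FREQ; the following standalone Python sketch shows the equivalent group-wise summarization. Column names and figures are purely illustrative:

```python
from collections import defaultdict
from statistics import mean

def summarize(rows, group_key, value_key):
    """Group rows by a key and report count and mean of a numeric column,
    roughly what PROC MEANS with a CLASS statement produces in SAS."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[group_key]].append(row[value_key])
    return {
        key: {"n": len(vals), "mean": round(mean(vals), 2)}
        for key, vals in sorted(groups.items())
    }

# Hypothetical card-application records, standing in for the bank datasets
applications = [
    {"country": "IN", "ticket": 1200.0},
    {"country": "IN", "ticket": 800.0},
    {"country": "SG", "ticket": 1500.0},
]
print(summarize(applications, "country", "ticket"))
# {'IN': {'n': 2, 'mean': 1000.0}, 'SG': {'n': 1, 'mean': 1500.0}}
```

The `n` count per group corresponds to a PROC FREQ one-way table, and the per-group mean to a PROC MEANS statistic.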

Data Engineer

Dot Pro Software Solutions Pvt Ltd. (Edelweiss)
Tamil Nadu, India
05.2012 - 09.2016
  • Cigna - Connecticut General Life Insurance Company is one of the largest healthcare providers in the United States, providing managed medical and dental care products, group health insurance, and related services
  • Objective: Cigna gathers data from four different sources, i.e., online personal health record self-management, e-enrollment in wellness programs, and information about claims
  • Cigna has created a separate data mart for Customer Business Healthcare
  • The project's main goal is to convert the entire set of Customer Business Healthcare dashboards from the Benchmark tool to SAS, reducing manual iteration and time taken
  • Responsibilities: Created a separate data mart for Cigna Behavioral dashboards
  • Converted all the Cigna Behavioral dashboards from the Benchmark tool to SAS to reduce manual effort and improve performance
  • Extracted data from the database by using the PTF application or LIBNAME statement
  • Ran transformations using functions (string, date and time, aggregate) and procedures (Proc SQL, Proc Freq, Proc Means, Proc Tabulate, Proc Transpose, Proc Rank, Proc Report)
  • Developed SAS programs for data extraction, validation, and data mart creation
  • Performed various analyses on wealth products (insurance, FX, investment)
  • Generated customized dashboards in HTML, Rich Text Format, and PDF by using ODS

Elite Trade Relationship Group Portfolio Analyst

ICICI BANK
Chennai, India
05.2012 - 09.2016
  • The purpose of this dashboard is to identify customer-level opportunities across Trade, BLG, CMS, and attachment products; the dashboard is enabled for all Regional Managers (RMs) to have one view of all opportunities at the existing-customer level
  • This dashboard gives the following information: activation status across attachment solutions (CIB, tax payment, etc.), trade products applicable to the customers, and a customer-level one view of various campaigns (Trade, BLG, CMS, CA)
  • Responsibilities: Extract data from flat files and databases by using the PTF application, Proc Import, and Data step
  • Run transformations using functions (string, date and time, aggregate) and procedures (Proc SQL, Proc Freq, Proc Means, Proc Tabulate, Proc Transpose, Proc Rank, Proc Report)
  • Generate customization dashboards in HTML, RTF, and PDF by using ODS
  • Submit the dashboard for the User acceptance test to get sign-off and make the dashboard live.

Education

Bachelor of Science - Computer Science & Engineering

Jawaharlal Nehru Technological University
India
05.2010

Skills

  • SAS/BASE
  • SAS/SQL
  • SAS/MACROS
  • SAS/ODS
  • SAS/STAT
  • SAS/ACCESS
  • SAS Enterprise Guide
  • SAS Web Report Studio
  • SAS Management Console
  • Advanced SAS
  • SAS
  • SAS EG 7.1
  • SAS Graphs
  • SAS SQL
  • SAS Studio
  • SQL
  • Scripting
  • Dataiku
  • Dremio
  • R
  • Python
  • Windows Server 2012 R2/2012/2008 R2
  • Windows Vista/7/8/10
  • UNIX
  • LINUX
  • Logical and Physical Database Design
  • MS Project
  • Database Design and Normalization
  • Agile
  • IBM DB2
  • Oracle
  • Microsoft Access
  • Snowflake
  • MongoDB
  • NoSQL
  • Informix
  • Teradata
  • Microsoft Office
  • Business Objects
  • Tableau
  • Power BI
  • QlikView
  • MicroStrategy
  • ChatGPT
  • Spotfire
  • Minitab
  • Data Modeling
  • Data Warehousing
  • Data Migration
  • Database Design
  • Database Administration
  • Data Analysis
  • SQL Transactional Replications
  • Risk Analysis
  • SQL and Databases
  • Database Development
  • Time Management
  • Report Generation
  • Excellent Communication
  • Continuous Improvement
  • Teamwork and Collaboration
  • Multitasking
  • Technical Support

Certification

  • SQL Certification
  • BASE SAS Certification

Accomplishments

  • GEM award for outstanding performance for successful completion of all project assignments at SCB.
  • Trained new joiners in SQL and Excel and received the Best Trainee award in 2015.
  • Designed the JNTU Kakinada website.

Languages

English
Professional
Hindi
Limited
Telugu
Professional
Tamil
Elementary

Timeline

Data Engineer

SourceOn IT INC
10.2023 - Current

Data Analyst

Standard Chartered Global Business Services
10.2016 - 09.2023

Data Engineer

Dot Pro Software Solutions Pvt Ltd. (Edelweiss)
05.2012 - 09.2016

Elite Trade Relationship Group Portfolio Analyst

ICICI BANK
05.2012 - 09.2016

Bachelor of Science - Computer Science & Engineering

Jawaharlal Nehru Technological University