Satya Vattikuti

Summary

Data Engineer with 8+ years of ETL expertise in the Healthcare industry, delivering client-focused data solutions. Specializes in database design, ETL development, testing, and implementing business applications for the Banking, Healthcare, and Retail sectors. Known for high productivity and efficient task completion. Skilled in big data processing frameworks such as Hadoop and Apache Spark, database management with SQL, and data visualization with Tableau. Excels at problem-solving and collaboration to develop innovative data solutions across diverse environments.

Overview

11 years of professional experience

Work History

Data Engineer

CHG HealthCare
Salt Lake City, UT
01.2020 - Current
  • Extracted data from third-party applications, including Workday and locum tenens physician and nurse practitioner platforms, to recruit travel nurses globally for both short-term and long-term assignments
  • Utilized APIs from Facebook, LinkedIn, Google AdWords, and RingCentral to identify and engage high-quality healthcare providers and staff, contributing to the development of API lifecycle and governance strategies
  • Collaborated on Workday applications to source fresh data for CHG Health Care's physicians
  • Worked alongside senior peers on Kimball dimensional modeling, creating dimension and fact tables for robust data models
  • Transformed flat-file data to meet specific business requirements and loaded it into a SQL data warehouse, updating Salesforce data incrementally based on the last modified date (see the incremental-load sketch at the end of this role)
  • Collaborated with the testing team to rectify data quality issues in data pulled from Salesforce via ETL processes
  • Engaged in team mobbing sessions to ensure the delivery of high-quality and meaningful data to end-users
  • Worked closely with end-users to comprehensively understand their data requirements for report generation
  • Extracted data from the SAS data warehouse to develop detailed sales reports, particularly for hospital pricing and contracting purposes
  • Contributed to hospital segmentation and targeting efforts using sales data, SDI hospital discharge data, and HMS procedure & diagnosis data for reporting
  • As a member of the Clinical Analytics team, processed and analyzed CHG Health Care Claims data
  • Assisted in prioritizing hospital regions and conducted co-pay sensitivity analysis based on source data
  • Participated in physician early adoption analysis using New to Brand Rx data for analog products
  • Tableau: Created dashboards and drill-down reports as needed and published them to the server for internal users to access for reporting
  • Production Support: Rotated production-support duties among team members to address ongoing issues
  • Used ServiceNow and Jira tickets to track and document all production-related integration service incidents for better organization and resolution tracking
  • Environment: Snowflake, DBT, Matillion, SSIS/SSRS, Informatica 10.6.6, Visual Studio 2017, Tableau 2020.3, SQL Server Management Studio, Workday, Microsoft Azure (get and commit of ETL packages), Jira, Scrum
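
The flat-file bullet above references an incremental pull keyed on the last modified date; the sketch below illustrates that pattern in generic SQL. The table and column names (etl_control, stg_salesforce_contact, salesforce_contact) are illustrative assumptions, not objects from the project.

    -- 1. Read the high-water mark left by the previous run.
    SELECT MAX(last_loaded_at) AS high_water_mark
    FROM   etl_control
    WHERE  source_name = 'salesforce_contact';

    -- 2. Stage only the rows changed since that mark.
    INSERT INTO stg_salesforce_contact (contact_id, email, status, last_modified_date)
    SELECT contact_id, email, status, last_modified_date
    FROM   salesforce_contact
    WHERE  last_modified_date > :high_water_mark;   -- bind the value read in step 1

    -- 3. Advance the high-water mark for the next run.
    UPDATE etl_control
    SET    last_loaded_at = (SELECT MAX(last_modified_date) FROM stg_salesforce_contact)
    WHERE  source_name = 'salesforce_contact';

In Matillion or SSIS this pattern is usually configured through components rather than hand-written SQL; the queries above show only the underlying idea.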

ETL Developer/Testing

HCL Global System
Michigan
04.2019 - 12.2019
  • Used BODS applications for an in-house project
  • Created efficient batch jobs in SAP BODS, improving performance by handling large data volumes using ABAP data flows
  • Applied BODS transforms such as Map Operation, Hierarchy, Case, Data Transfer, and Table Comparison (see the change-detection sketch at the end of this role)
  • Designed solutions for ETL framework
  • Automated and scheduled BODS job processes, including data loading via LSMW and BODS using IDOCs
  • Conducted ETL job testing with different test scripts and managed project deployment across system landscapes using Import/Export
  • Handled code packaging and deployment
  • Automated error handling and reconciliation processes using repository metadata
  • Managed BOIS administration and profiling activities, including scorecard creation for evaluating enterprise data quality
  • Environment: SSIS, SQL Server, Informatica Power Center 10.0, SAP BODS, MicroStrategy, Oracle, HP Vertica, Jira, Scrum/Agile methodologies
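
The Table Comparison transform mentioned above flags each incoming row as an insert or an update against the target table; a rough SQL equivalent of that change detection is sketched below. customer_src and customer_dim are assumed names used only for illustration.

    -- New rows: present in the source, absent from the target.
    SELECT s.customer_id, 'INSERT' AS change_type
    FROM   customer_src s
    LEFT JOIN customer_dim d ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL
    UNION ALL
    -- Changed rows: key exists in both, but a compared column differs.
    SELECT s.customer_id, 'UPDATE' AS change_type
    FROM   customer_src s
    JOIN   customer_dim d ON d.customer_id = s.customer_id
    WHERE  s.customer_name   <> d.customer_name
       OR  s.customer_status <> d.customer_status;

In BODS the transform performs this comparison internally and emits row operation codes that downstream loaders act on.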

ETL Developer

Myriad Genetics
Salt Lake City, UT
03.2018 - 03.2019
  • Enhanced and optimized lengthy and intricate SQL code, Informatica, and SAP BODS jobs, reducing job runtimes to under 10 minutes
  • Developed ETL mappings to extract data from SFDC, Oracle, and SQL Server, loading it into an Oracle DB
  • Designed ETL flows to generate flat-file extracts required by a third-party broker system to validate health plan information
  • Performed performance tuning at the Mapping, Session, Source, and Target levels, particularly for Slowly Changing Dimensions (Type 1 and Type 2); see the SCD sketch at the end of this role
  • Extracted data from APIs (Facebook, Google AdWords, Jira) into Stage tables
  • Prepared Unit testing and Systems Test Plans along with test cases for the developed mappings during QA testing
  • Implemented Deployment Groups to facilitate code migration (Workflows)
  • Gathered user requirements for modified reports and verified through testing that they execute correctly with the appropriate prompts
  • Utilized Data Masking and Obfuscation techniques to secure sensitive production data in testing environments, making it realistic yet protected from unauthorized access
  • Environment: Informatica Power Center 10.0, SAP BODS, SQL Server, MicroStrategy, Oracle, HP Vertica, Jira, Scrum/Agile methodologies
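
The SCD work noted above follows the standard Type 2 pattern of expiring the current row and inserting a new version; a minimal sketch in generic SQL is shown below. dim_member and stg_member are assumed names, and the change check is reduced to a single column for brevity.

    -- Step 1: close out current rows whose tracked attribute changed.
    UPDATE dim_member d
    SET    effective_end_date = CURRENT_DATE,
           is_current         = 'N'
    WHERE  d.is_current = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_member s
                   WHERE  s.member_id = d.member_id
                     AND  s.plan_code <> d.plan_code);

    -- Step 2: insert a new current version for changed and brand-new members.
    INSERT INTO dim_member (member_id, plan_code, effective_start_date, effective_end_date, is_current)
    SELECT s.member_id, s.plan_code, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_member s
    LEFT JOIN dim_member d
           ON d.member_id = s.member_id AND d.is_current = 'Y'
    WHERE  d.member_id IS NULL;

Type 1 changes simply overwrite the attribute in place, with no new row and no history kept.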

ETL Developer/QA Tester

Catalina Marketing
Florida
09.2016 - 02.2018
  • The client maintains arguably the largest consumer purchase-history database in the industry
  • It covers transactions at 35,000 stores made by 230 million unique shoppers per month
  • The client mainly deals with large sets of coupon data for multiple vendors; the main goal of this project was to build a new data warehouse
  • Catalina's existing warehouse held CEDW2x legacy data; the project built the new CEDW4x big data warehouse and matched the data between the two
  • The ETL tools used were Informatica and Aginity AMP
  • VTBD (Vault to Big Data) was the sub-project I worked on
  • Responsibilities: Used Informatica 9.6.1 to load data from legacy tables to Netezza tables
  • All tasks were assigned as tickets and tracked in JIRA
  • The primary source was the Vault, which had to be pulled from Hive and loaded into our landing area
  • Delivered all assigned tasks on time; work was divided into two-week sprints
  • Attended all business requirement meetings along with my lead and played a crucial role in building the requirements
  • Worked intensively with the team to build easily accessible code that met the client's standards
  • Prepared 301 business documents covering the full data flow from source to target
  • Built data integration components using Informatica Power Center, following the local DI framework and recipes (the ETL Cookbook), against Netezza, Oracle, and SQL Server on Unix/Linux and Windows operating systems
  • Extracted data from our cloud sources (Hive via HUE)
  • The HDFS system dumped files into a Kafka queue in our cloud environment, referred to as Hive Prod HUE
  • Used the Hive web interface (HUE 2.6.1-2) to query the data
  • Used Aginity AMP to generate natural keys from combinations of multiple foreign keys (see the natural-key sketch at the end of this role)
  • Wrote complex SQL overrides to join heterogeneous databases
  • Wrote and tuned complex SQL queries and shell scripts
  • Took part in ETL production support once every three weeks, working the issues assigned during that rotation
  • Performed unit testing for the code built
  • Worked closely with the Quality Analyst to ensure the code met the required standards and the data was valid
  • Used Autosys to schedule the jobs by creating JIL scripts
  • Actively participated in code migration with DBA and admins
  • Maintained all documents, keeping them current and available in SharePoint
  • Environment: Informatica 9.6.1, HUE 2.6.1-2, Aginity AMP, Hive, Oracle, Netezza, Autosys, JIRA
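
The natural-key bullet above refers to deriving a single key from several foreign keys; the sketch below shows the general idea in SQL. fact_coupon_redemption and its columns are assumed names for illustration only.

    -- Concatenate the business/foreign keys that uniquely identify a row,
    -- with a delimiter so adjacent values cannot collide
    -- (cast numeric keys to text first on stricter platforms).
    SELECT store_id,
           shopper_id,
           coupon_id,
           transaction_date,
           store_id || '|' || shopper_id || '|' || coupon_id || '|' || transaction_date AS natural_key
    FROM   fact_coupon_redemption;

In practice the concatenated string is often hashed with the platform's hash function to keep the key compact; the query above shows only the derivation idea, not the Aginity AMP implementation.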

ETL Developer

Moffitt Cancer Center
Florida
02.2016 - 08.2016
  • Development: Prepared the source-to-target column mapping sheet (covering 21 source systems) for loading data into the data warehouse, and prepared database scripts in Oracle and SQL Server
  • Designed, built, tested, and deployed source-system mappings based on Jira tickets, extracting data from flat files and other RDBMS databases into the staging area and populating the data warehouse
  • Created mappings, sessions, and workflows to extract data from different sources into a single target, loading about 2,000 tables from Oracle and SQL Server source systems
  • Conducted weekly code reviews before deploying to test and production environments, and prepared migration documents to move mappings from development to testing and then to production repositories
  • Developed reusable transformations and mapplets for downstream work, and performance-tuned several ETL mappings implementing business rules
  • Verified record counts and checked for data loss between source and target, covering data load, insert, update, incremental auto-load, and data accuracy (see the reconciliation sketch at the end of this role)
  • Production Support: Provided ongoing support for existing applications on an as-needed basis
  • Environment: Informatica 9.6.1, Power Exchange, Oracle, SQL Server
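
The record-count and data-loss checks mentioned above reduce to a pair of reconciliation queries; a minimal sketch is shown below. src_table, tgt_table, and business_key are assumed names for illustration.

    -- Row-count reconciliation between source and target.
    SELECT (SELECT COUNT(*) FROM src_table) AS source_rows,
           (SELECT COUNT(*) FROM tgt_table) AS target_rows;

    -- Keys present in the source but missing from the target (potential data loss).
    SELECT business_key FROM src_table
    MINUS                        -- use EXCEPT on SQL Server
    SELECT business_key FROM tgt_table;

Incremental and update checks extend the same idea by filtering on the load date or comparing changed columns.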

ETL Developer

Catalina Marketing
Florida
10.2015 - 02.2016
  • As part of a big data project for Catalina, contributed ETL expertise to convert legacy data from various source systems using Informatica, drawing on a strong background in analyzing, designing, developing, customizing, implementing, and testing software applications and Catalina Marketing products
  • Strong experience in all phases of the data warehouse development lifecycle, from requirements gathering through testing and implementation, while supporting existing Catalina Marketing manufacturing systems
  • Handled diverse data types, including retailer basket, customer, finance, promotions, and redemptions data from multiple retailers
  • Migrated a significant volume of legacy data to the new data warehouse as part of the big data project and converted it into multiple data marts
  • Optimized mapping performance through various tests on sources, targets, and transformations
  • Identified and eliminated bottlenecks, implementing performance-tuning strategies on targets, sources, mappings, and sessions to achieve maximum efficiency
  • Environment: Informatica 9.6.1, Power Exchange, Hive, Autosys, Aginity, Archipelago, Oracle, SQL Server, data modeling, profiling, Scrum/Agile methodologies

ETL Developer

BBVA Compass Official Bank
Birmingham, Alabama
12.2013 - 10.2015
  • Project #1 (REALMS): The Relationship Aggregation and Limits Management System is a tool that lets the bank efficiently manage entity grouping and streamline aggregation reporting
  • This grouping information had been scattered across different systems of record (SOR); successfully delivered data from thirteen distinct sources in a year-end release, including consumer and commercial loan origination systems, deposits, leasing, and various other financial data
  • Developed multiple Informatica mappings, using a variety of transformations, to load this data into the data warehouse
  • Actively participated in daily Scrum meetings throughout the REALMS work
  • Project #2 (EDW): Ongoing development of existing mappings, performance tuning, and data masking; EDW is the main application system for BBVA Compass Bank and receives data from multiple vendor products and internal systems for customer operational transaction processing
  • These systems run on daily or monthly schedules to create and transmit files for inclusion in the EDW
  • Data Masking & Obfuscation: Used production data in testing environments with sensitive fields masked yet realistic, applying the Data Masking transformation in the mapping (see the masking sketch at the end of this role)
  • Protected sensitive and confidential information from unauthorized access by blocking or masking it
  • Night Production Support: Provided 24/7 production support for BBVA Compass, resolving ongoing issues (checking session logs for failure analysis) and troubleshooting problems such as integration services or databases being down
  • Project #3 (Informatica Admin): Created folders and logins for group members and assigned the necessary access
  • Environment: Power Center 9.5.x (Designer, Repository Manager, Workflow Manager), Oracle 11g, TOAD, UNIX shell scripting, PL/SQL, stored procedures, Scrum/Agile environment, Wealth Management
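
The Data Masking bullet above relies on Informatica's Data Masking transformation; the SQL sketch below shows the equivalent idea of keeping test data realistic while hiding sensitive values. customer_account and its columns are assumed names for illustration, not the bank's actual schema.

    SELECT customer_id,
           'XXX-XX-' || SUBSTR(ssn, -4)               AS ssn_masked,      -- keep only the last 4 digits
           SUBSTR(email, 1, 1) || '*****@example.com' AS email_masked,    -- realistic-looking but fake
           ROUND(account_balance, -3)                 AS balance_blurred  -- hide exact amounts
    FROM   customer_account;

Informatica's transformation adds rule-driven, format-preserving masking options beyond what ad-hoc SQL expressions like these provide.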

Education

Bachelor of Business Management Information System

RGVG College of Engineering

Master of Software Engineering

Stratford University

Skills

  • Informatica Power Center
  • Informatica Intelligence Cloud Service
  • AWS
  • Redshift
  • IDQ
  • Salesforce
  • Power Exchange
  • Oracle Exadata
  • SQL Server 2008
  • DB2
  • Netezza
  • UC4
  • AutoSys
  • Control-M
  • Web services
  • DbVisualizer 10.0
  • Ipswitch FTP
  • Azure DevOps
  • SSIS
  • IICS
  • Visual Studio
  • Data Extraction
  • Data Migration
  • Data Quality
  • Data Cleansing
  • ETL tools
  • Onshore and Offshore project models
  • Type 1 and Type 2 Slowly Changing Dimensions (SCDs)
  • ODS tables
  • Star and Snowflake schemas
  • Relational, dimensional, and multidimensional modeling
  • Incremental loading
  • Scheduling tools
  • Informatica Workflow Manager
  • Requirement gathering
  • Performance Tuning
  • Data Masking
  • Data Warehouse Testing
  • ETL mappings
  • ETL Processes
  • Test plan and test cases
  • Deployment Groups

Timeline

Data Engineer

CHG HealthCare
01.2020 - Current

ETL Developer/Testing

HCL Global System
04.2019 - 12.2019

ETL Developer

Myriad Genetics
03.2018 - 03.2019

ETL Developer/QA Tester

Catalina Marketing
09.2016 - 02.2018

ETL Developer

Moffitt Cancer Center
02.2016 - 08.2016

ETL Developer

Catalina Marketing
10.2015 - 02.2016

ETL Developer

BBVA Compass Official Bank
12.2013 - 10.2015

Bachelor of Business Management Information System

RGVG College of Engineering

Master of Software Engineering

Stratford University