
Srinivasulu Dommireddy

Richmond, VA

Summary

Data Engineer with proven expertise in ETL processes utilizing Ab Initio and cloud migration to AWS. Enhanced data processing efficiency and automated workflows, resulting in significant time savings. Strong analytical skills and effective collaboration in dynamic environments. Proficient in SQL development, committed to delivering high-quality data solutions.

Overview

23 years of professional experience

Work History

Data Engineer

FEPOC CareFirst
01.2022 - Current

Snowflake Engineer

USAA
Herndon / San Antonio, TX
09.2020 - 12.2021
  • Built new Ab Initio graphs and plans to replace the current Sqoop-based ingestion process
  • Built Ab Initio continuous flows to replace the current streaming process (Kafka, NiFi, and Kudu)
  • Built the Ab Initio process to
  • Wrote complex SQL queries to convert variant (JSON) data into fielded data
  • Built Ab Initio processes/plans to perform the initial load from Mainframe to Hive
  • Built the CDC process to read data from Ab Initio queues
  • Created the Ab Initio process to migrate data from on-prem to the cloud and convert timestamps from UTC to EST
  • Created shell scripts for ETL process automation
  • Prepared design documents for new requirements
  • Participated in design and code reviews
  • Designed and developed Ab Initio graphs for data transformations and populating the stage area
  • Tuned performance of existing graphs using the latest techniques, improving parallelism and using GDE tracking details to improve CPU performance
  • Prepared and executed component and assembly testing
  • Captured all test cases and results and validated the expected results
  • Wrote shell scripts to execute Ab Initio jobs through Control-M
  • Executed the release end to end (from design to implementation)
  • USAA, a diversified financial services group of companies, is among the leading providers of insurance, investing, and banking solutions to members of the U.S. military community
  • PROJECT # 1 : Loss Product Analytics; Environment : Ab Initio, Informatica, Python, DBT, Control-M
  • Team Size : 5
  • Performed requirement analysis and translated business needs into functional and non-functional requirements
  • Prepared documents such as the LDD, solution architecture, security interaction, and test case documents
  • Wrote complex queries and replicated data from Oracle to Snowflake
  • Wrote complex SQL queries to flatten variant (JSON) data into fielded data (a sketch follows this list)
  • Built Python packages to load/unload data from the Snowflake database
  • Created shell scripts for the reconciliation process between the on-prem and Snowflake databases
  • Created shell scripts for ETL process automation
  • Prepared design documents for new requirements
  • Participated in design and code reviews
  • Involved in developing the graphs using various Ab Initio Components as per the business requirements
  • Used transformations (Data Conversion, Derived Column, Aggregate, Sort, PBK, Reformat, Join, Normalize, and Scan) to process data
  • Designed and developed Ab Initio graphs for data transformations and populating the stage area
  • Tuned performance of existing graphs using the latest techniques, improving parallelism and using GDE tracking details to improve CPU performance
  • Prepared and executed component and assembly testing
  • Captured all test cases and results and validated the expected results
  • Wrote shell scripts to execute Ab Initio jobs through Control-M
  • Executed the release end to end (from design to implementation)
  • Coordinated onsite and provided support in case of issues
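
The VARIANT-to-fielded-data work above follows the pattern in this minimal sketch, which assumes the snowflake-connector-python package; the connection settings, table, and JSON attribute names are illustrative placeholders, not the actual project objects.

import snowflake.connector

# Hypothetical claims table with a VARIANT column `raw`; LATERAL FLATTEN explodes
# the nested JSON array into one row per element so it can be selected as fields.
FLATTEN_SQL = """
SELECT
    raw:claim_id::STRING            AS claim_id,
    raw:member:id::STRING           AS member_id,
    line.value:amount::NUMBER(12,2) AS line_amount
FROM claims_raw,
     LATERAL FLATTEN(input => raw:lines) line
"""

def flatten_variant_rows():
    """Run the flatten query and return the fielded rows."""
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder connection settings
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="STAGE_DB",
        schema="PUBLIC",
    )
    try:
        # execute() returns the cursor, so fetchall() can be chained directly.
        return conn.cursor().execute(FLATTEN_SQL).fetchall()
    finally:
        conn.close()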

AWS Developer

Capital One
04.2019 - 09.2020

Lead Python Developer

CAPITAL ONE
Richmond, VA
04.2019 - 09.2020
  • Capital One Financial Corporation is an American bank holding company specializing in credit cards, auto loans, banking, and savings accounts, headquartered in McLean, Virginia with operations primarily in the United States
  • It is on the list of the largest banks in the United States and has developed a reputation as a technology-focused bank
  • PROJECT # 1 : Dealer Marketing (Auto Finance); Environment : Snowflake, AWS, Marketing Cloud, Salesforce
  • Team Size : 2
  • Performed requirement analysis and translated business needs into functional and non-functional requirements
  • Prepared documents such as the LDD, solution architecture, security interaction, and test case documents
  • Wrote complex queries and replicated data from Salesforce to Snowflake
  • Conducted kick-off meetings with the Brand Manager
  • Created the EDD document based on the requirements
  • Once data was replicated to Snowflake, created reports using Python DataFrames and SQLAlchemy
  • Ran campaigns based on the EDD and servicing level
  • Ran suppressions for customers who had already opted out and performed post-campaign bounce analysis
  • Wrote Python scripts with boto3 to connect to AWS
  • Automated the Python scripts on a schedule using CloudWatch
  • Created backups of EC2 instances via Lambda scripts run on a schedule (a sketch follows this list)
  • Migrated Control-M jobs to AROW (cloud scheduler)
  • Migrated Ab Initio graphs into the cloud environment
  • Ingested data into AWS S3 buckets using Python scripts
  • Added S3 buckets and loaded the load-ready files into AWS
  • Loaded data from the S3 bucket into the Snowflake database
  • Created AROW scheduling scripts and scheduled the jobs
  • Migrated CA Enterprise scheduling to AROW jobs
  • Wrote parsers in Python to extract useful data from the design database
  • Pulled data from Snowflake using SQLAlchemy and the SQL connector, applied transformation logic with pandas, created reports, and uploaded them into Marketing Cloud
  • Pulled data from the AWS S3 bucket and performed the required operations
  • Created pie charts using the seaborn and matplotlib libraries
  • PROJECT # 2 : Lift & Shift Cloud Migration
  • Designation : Tech Lead
  • Environment : Ab Initio, scripting, Teradata, Snowflake, AWS, PySpark, and Python
  • Team Size : 8
  • Loaded/extracted data from multiple cloud and legacy applications using ETL (Ab Initio), cloud services, and Snowflake
  • Designed, developed, and tested ETL processes per business requirements
  • Worked on Jenkins pipeline automation using Bogie gears; built the AWS infrastructure and application code using CloudFormation Templates (JSON-based CFTs) and Jenkins, and deployed on AWS platform technologies (EC2, ELB, S3, IAM, EMR, SNS, Route 53, Lambda, CNAME, Catalog, Load Balancer, CloudFormation, CloudWatch Logs, and CloudTrail)
  • Created standards for data extracts, analysis, reports, change management processes, etc.
  • Generated automated, repeatable, and stable processes to support data load, data validation, and data synchronization for third-party solutions such as our core-system BI data warehouse
  • Maintained the data warehouse metadata documentation and associated data repositories (e.g., data dictionaries) for the data warehouse environment and complex in-house databases
  • Investigated ETL job failures due to resource constraints, missing or invalid data and files, bugs, and infrastructure component issues
  • Optimized queries and Snowflake UDFs that impacted the performance of loads and data processing
  • Followed the Agile-Scrum methodology for project implementation and took part in daily scrum and sprint meetings
  • Performed deep dives into the root causes of incidents and implemented permanent resolutions to avoid recurrence
  • Proposed and implemented the fixes required for broken functionality
  • Worked with business units/data owners to classify data for appropriate security, backup, and performance characteristics
  • Developed and supported complex data queries and advanced data visualizations
  • Coordinated between onsite and offshore teams
  • Assisted the team to ensure that tasks were delivered on schedule.
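
The CloudWatch-scheduled EC2 backup described above can be approximated by the hedged sketch below: a Lambda-style handler that snapshots the EBS volumes of tagged instances, assuming it is wired to a CloudWatch Events/EventBridge schedule rule; the Backup tag convention is a hypothetical example, not the actual project configuration.

import boto3
from datetime import datetime, timezone

ec2 = boto3.client("ec2")

def handler(event, context):
    """Snapshot every EBS volume attached to instances tagged Backup=true."""
    reservations = ec2.describe_instances(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]  # hypothetical tag
    )["Reservations"]

    for reservation in reservations:
        for instance in reservation["Instances"]:
            instance_id = instance["InstanceId"]
            for mapping in instance.get("BlockDeviceMappings", []):
                volume_id = mapping["Ebs"]["VolumeId"]
                stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
                # One snapshot per attached volume, labeled with instance and date.
                ec2.create_snapshot(
                    VolumeId=volume_id,
                    Description=f"Backup of {instance_id} ({volume_id}) on {stamp}",
                )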

Associate

JPMorgan
12.2016 - 04.2019

Tech Lead

JPMORGAN CHASE
12.2016 - 04.2019
  • JPMorgan Chase & Co. is a financial holding company
  • It provides financial and investment banking services
  • The firm offers a range of investment banking products and services in all capital markets, including advising on corporate strategy and structure, capital raising in equity and debt markets, risk management, market making in cash securities and derivative instruments, and brokerage and research
  • It operates through the following segments: Consumer and Community Banking, Corporate and Investment Bank, Commercial Banking and Asset and Wealth Management
  • The Consumer and Community Banking segment serves consumers and businesses through personal service at bank branches and through automated teller machine, online, mobile, and telephone banking
  • The Corporate and Investment Bank segment offers a suite of investment banking, market-making, prime brokerage, and treasury and securities products and services to a global client base of corporations, investors, financial institutions, and government and municipal entities
  • Environment : Ab Initio, Informatica, Unix, Sybase, Spark, Scala, Impala, Hive, Python
  • Team Size : 4

Tech Lead

AMERICAN EXPRESS
Scottsdale, AZ / Chennai, IN
08.2011 - 12.2016
  • American Express is a global financial, travel and network service provider
  • AMEX issues corporate cards to its corporate clients through whom it helps companies and institutions to manage their travel, entertainment and purchasing needs
  • The GDR is the single large data warehouse that stores/manages all the AMEX Corporate card transactions worldwide
  • Feeds to this database are received from all over the world
  • Using Ab Initio, a very strong universal database ETL tool, data is captured from the GDR; transformations are carried out according to the business requirements, and the data is loaded into the respective target tables or sent as flat files to downstream systems; in some cases reports are sent to the clients as data files
  • Environment : Management System Support Utility (MSSU), MSTR
  • Team Size : 4
  • The project is aimed at sending transaction/profile information to end users on request in the form of files/reports
  • The files/reports can be sent in any one of several file formats, such as ASCII, EBCDIC, or XML, according to the end-user request
  • Reports are also sent on request, on a daily or cyclic basis, or according to the frequency specified; MSSU is one such process, which identifies and pulls requests from database tables and passes the report requests to the end graphs, which in turn create the reports and send them to the end users
  • Roles & Responsibilities:
  • Performed requirement analysis and translated business needs into functional and non-functional requirements
  • Prepared documents such as the LDD, solution architecture, security interaction, and test case documents
  • Created Flat files using dataset components like input file, output file, and intermediate file in Ab Initio graphs
  • Developed graphs with complicated functionality using multistage components
  • Implemented component parallelism, pipeline parallelism and data parallelism in Ab Initio for ETL process for Data warehouse to improve performance
  • Developed Ab Initio graphs that run on a daily basis per business requirements
  • Designed and developed ETL using Ab Initio (creation of Ab Initio graphs)
  • Created sandbox-level and graph-level parameters according to the requirements
  • Created Control-M scheduling scripts and scheduled the jobs
  • Validation of Component/Assembly Testing outcomes with Client
  • Created the structure and scripts for Control-M automation
  • Worked in a sandbox environment while extensively interacting with EME to maintain version control on objects
  • Sandbox features like Check In and Check Out were used for this purpose
  • Performed transformations on source data with transform components such as Reformat, Filter-by-Expression, and Rollup
  • Created and Performed reviews of all ETL designs, graphs, and test plans
  • Performed appropriate unit-level testing, logged defects in QC, and managed the review process for the Ab Initio deliverables
  • Provided 90 days of warranty support for the application
  • PROJECT : JAPA Data Client Files
  • The objective of this project is to create the KR1025 & KR1205 daily and cyclic transaction files for India and Singapore and send them to the client through different delivery channels (SFT, CIW); we are part of the GDFS process, meaning the Global Data File Services team creates the files and sends them to the client
  • The files created have three formats: ASCII, EBCDIC, and CSV; the user creates a setup on the web screen, and the MSSU process then creates requests daily based on the setup's active state
  • After the requests are created, the GDFS process triggers, pulls data from the unbilled and card account tables, and creates the files; the KR1025 file has record types 01, 02, 03, and 04, at the transaction, card member, company, and requesting account levels respectively
  • Jobs are triggered in Control-M through a common process (Amex trigger services)
  • Roles & Responsibilities:
  • Prepared specifications for new programs as required by the client
  • Prepared design documents for new requirements
  • Participated in design and code reviews
  • Involved in developing the graphs using various Ab Initio Components as per the business requirements
  • Used transformations (Data Conversion, Derived Column, Aggregate, Sort, PBK, Reformat, Join, Normalize, and Scan) to process data
  • Designed and developed Ab Initio graphs for data transformations and populating the stage area
  • Tuned performance of existing graphs using the latest techniques, improving parallelism and using GDE tracking details to improve CPU performance
  • Prepared and executed component and assembly testing
  • Captured all test cases and results and validated the expected results
  • Wrote shell scripts to execute Ab Initio jobs through Control-M
  • Executed the release end to end (from design to implementation)
  • Coordinated onsite and provided support in case of issues
  • Involved in implementation calls with external teams and provided support throughout the implementation
  • Added value to the project by identifying processes that involved manual intervention and providing optimal solutions, decreasing the time spent on them and increasing employee efficiency
  • Maintained proper documentation from analysis through roll-out for each deliverable and was responsible for obtaining sign-off for each phase
  • Delivered quality deliverables within deadlines according to client standards
  • Provided warranty support for the delivered project for 90 days

Ab Initio Developer

Amex
07.2011 - 12.2016

Change Management, Team Member

HOME DEPOT
04.2008 - 08.2011
  • Environment : Ab Initio, UNIX, DB2 UDB, Teradata, MSTR
  • Client : Home Depot
  • Description: Home Depot is the world's largest home improvement retailer, based in the United States
  • The Home Depot operates 2,248 stores across the United States
  • Demand Chain Management (DCM) is the e-commerce application
  • The DCM COM project currently forecasts inventory movement and generates inventory orders using a manual, spreadsheet-based process
  • The nature of the process creates the risk that forecasts and orders are not accurate, resulting in sub-optimal inventory levels
  • The spreadsheets are also not always available, with the potential that a critical inventory item does not get ordered into the YOW
  • DCM is now improving the overall integrity and accuracy of orders using a forecast tool based on historical sales and other factors by area
  • This increased sales, reduced out-of-stocks, and right-sized inventory levels
  • Roles and Responsibilities
  • Prepared specifications for new programs as required by the client
  • Prepared design documents for new requirements
  • Participated in design and code reviews
  • Involved in developing the graphs using various Ab Initio Components as per the business requirements
  • Used transformations (Data Conversion, Derived Column, Aggregate, Sort, PBK, Reformat, Join, Normalize, and Scan) to process data
  • Designed and developed Ab Initio graphs for data transformations and populating the stage area
  • Tuned performance of existing graphs using the latest techniques, improving parallelism and using GDE tracking details to improve CPU performance
  • Prepared and executed component and assembly testing
  • Captured all test cases and results and validated the expected results
  • Wrote shell scripts to execute Ab Initio jobs through Control-M
  • Executed the release end to end (from design to implementation)
  • Coordinated onsite and provided support in case of issues
  • Involved in implementation calls with external teams and provided support throughout the implementation
  • Added value to the project by identifying processes that involved manual intervention and providing optimal solutions, decreasing the time spent on them and increasing employee efficiency
  • Maintained proper documentation from analysis through roll-out for each deliverable and was responsible for obtaining sign-off for each phase
  • Delivered quality deliverables within deadlines according to client standards
  • Provided warranty support for the delivered project for 90 days.

Software Engineer

Cyon Technologies
04.2002 - 07.2011

Team Member

JPMorgan Chase
Columbus, OH / Chennai, IN
  • J.P. Morgan Core Advisory Portfolio is a professionally managed investment strategy that is broadly diversified across multiple asset classes and draws on the full breadth of our market insights and investment capabilities
  • Your portfolio benefits from the investment insights of J.P. Morgan, with our strategy team identifying investment risks and opportunities as well as making portfolio adjustments that conform to our holistic view
  • Roles & Responsibilities:
  • Performed requirement analysis and translated business needs into functional and non-functional requirements
  • Prepared documents such as the LDD, solution architecture, security interaction, and test case documents
  • Created Flat files using dataset components like input file, output file, and intermediate file in Ab Initio
  • Developed stored procedures in Sybase scripts
  • Migrated Informatica mappings to Ab Initio graphs
  • Loaded data from Sybase to HDFS using Spark and Scala
  • Created sandbox-level and graph-level parameters according to the requirements
  • Created Control-M scheduling scripts and scheduled the jobs
  • Migrated CA Enterprise scheduling to Control-M jobs
  • Validation of Component/Assembly Testing outcomes with Client
  • Created the structure and scripts for Control-M automation
  • Worked in a sandbox environment while extensively interacting with EME to maintain version control on objects
  • Sandbox features like Check In and Check Out were used for this purpose
  • Created and performed reviews of all ETL designs, workflows, and test plans
  • Performed appropriate unit-level testing, logged defects in QC, and managed the review process for the Ab Initio deliverables
  • Loaded data into HDFS using Python and created Hive tables
  • Created Hive scripts based on the transformation logic
  • Created a Spark/Scala framework to run the Hive scripts and create Parquet files
  • Created Impala tables on top of the Parquet files for analysis
  • Worked on different types of tables, such as external tables and managed tables
  • Developed Python scripts for automation
  • Scheduled the jobs in Autosys
  • Developed Python scripts to automate the applications along with other applications
  • Pulled data from Sybase by connecting through Python
  • Ingested data from Oracle using Kafka, applied transformations with Spark Streaming and Python, and loaded the results into HDFS as Parquet files (a sketch follows this list)
  • Created Impala tables on top of the Parquet files for further analysis and Tableau reporting
  • PROJECT : Reef Phase 3 will implement the interface between IRI and Bridger for the Entity, Associated Individuals, Hierarchy, and Corporate Card Member (JAPA only) screening process for JAPA and Mexico
  • The goal of this project is to address the gaps identified in the screening process and improve efficiencies for Sanctions and PEPs screening across AMEX GCP JAPA markets by the implementation of an automated solution, to reduce the risk inherent in manual processes
  • Another goal is to address the gaps in the screening process in Mexico as a result of Pegasus Mexico project roll-out
  • At a high level, Project Reef Phase 3 is to develop the Backend Sanctions and PEPs screening from IRI to Bridger for GCP JAPA markets and Mexico
  • Roles & Responsibilities:
  • Responsible for end-to-end application development and integration while handling multiple projects at the same time
  • Responsible for translating customer business needs into functional requirements, designing the solution, and developing code to deliver the solution to the customer
  • Developed generic graphs and made them reusable for future requirements
  • Handled releases end to end (from analysis to implementation)
  • Developed new programs following the standard guidelines
  • Involved in data cleansing, data validation and data transformation at various stages of ETL process by using Ab Initio component library
  • Worked on various Ab Initio common components for design, development of the graphs
  • Assisting the project manager for estimation/requisition of resources for the project duration and their respective allocation/release
  • Coordinating and assisting the team to ensure that the tasks are delivered on schedule
  • Involved in Onsite-Offshore calls for providing clarifications and updating work status
  • Interacted with the various teams involved in the project
  • Conducted several reviews with clients in various phases to ensure design and code developed meets the requirement
  • Provided review comments and methods to enhance Ab Initio job performance by following standard guidelines
  • As the project SPOC, responded to business users' queries and provided quick resolutions
  • Executed projects as per quality processes to ensure zero-defect and on-time deliveries
  • Added value to the project by identifying processes that involved manual intervention and providing optimal solutions, decreasing the time spent on them and increasing employee efficiency.
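
The Kafka-to-HDFS flow mentioned above (ingest from Oracle via Kafka, transform with Spark Streaming, land Parquet files) roughly follows this minimal sketch; the broker address, topic, schema, and HDFS paths are assumed placeholders, and Spark Structured Streaming is used here as one way to express the pattern, not the project's exact framework.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("kafka_to_hdfs_parquet").getOrCreate()

# Hypothetical shape of the change records published to Kafka.
schema = (
    StructType()
    .add("txn_id", StringType())
    .add("account_id", StringType())
    .add("amount", DoubleType())
)

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "oracle_cdc_topic")             # placeholder topic
    .load()
    # Kafka delivers raw bytes; parse the JSON payload into typed columns.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
    .withColumn("load_ts", F.current_timestamp())
)

query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/stage/transactions")            # placeholder path
    .option("checkpointLocation", "hdfs:///checkpoints/transactions")
    .outputMode("append")
    .start()
)
query.awaitTermination()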

Education

Bachelor of Science - Computer Science

Nagarjuna University
A.P.
01.2007

Skills

  • ETL tools: Ab Initio and Informatica
  • Data warehousing and migration
  • SQL development and scripting
  • Cloud computing: AWS, GCP, and Azure
  • Data analysis and visualization
  • OLAP tools: SSRS
  • Programming languages: Python, Scala, and PL/SQL
  • Databases: Snowflake, DB2, Sybase, Oracle, Teradata, and SQL Server
  • Domain expertise: Banking, Federal, and Retail
  • UI technologies: HTML and CSS
  • Messaging systems: Kafka and Flume
  • Version control: Git, Bitbucket, and GitLab
  • Problem resolution
  • Performance tuning

Accomplishments

  • Received one of Syntel's Value Awards and was recognized as 'SMART' for providing a LEAN idea to the client.
  • Completed AMEX certifications in Agile, Waterfall, SDLC, Capital Markets, Payments, and Banking.

Languages

English
Full Professional
Telugu
Professional
Hindi
Professional
Tamil
Professional

Timeline

Data Engineer

FEPOC CareFirst
01.2022 - Current

Snowflake Engineer

USAA
09.2020 - 12.2021

AWS Developer

Capital One
04.2019 - 09.2020

Lead Python Developer

CAPITAL ONE
04.2019 - 09.2020

Associate

JPMorgan
12.2016 - 04.2019

Tech Lead

JPMORGAN CHASE, JPMorgan Chase & Co
12.2016 - 04.2019

Tech Lead

AMERICAN EXPRESS
08.2011 - 12.2016

Ab Initio Developer

Amex
07.2011 - 12.2016

Change Management, Team Member

HOME DEPOT
04.2008 - 08.2011

Software Engineer

Cyon Technologies
04.2002 - 07.2011

Team Member

JPMorgan Chase

Bachelor of Science - Computer Science

Nagarjuna University