Sandeep Natta

Dayton, OH

Summary

Detail-oriented Data Engineer with expertise in designing, developing, and maintaining highly scalable, secure, and reliable data structures. Experienced in working closely with system architects, software architects, and design analysts to translate business and industry requirements into comprehensive data models. Proficient in developing database architectural strategies at the modeling, design, and implementation stages.

Overview

17 years of professional experience

Work History

Lead Data Engineer

Fifth Third Bank
04.2023 - Current


  • Worked on the Fraud Detection team, enabling the development of models to detect check fraud, disputes, dark web fraud, and anti-money laundering (AML) by delivering clean, reliable, and timely data pipelines.
  • Built and maintained data workflows that powered model training and scoring, ensuring high-quality feature sets were consistently delivered to data scientists.
  • Took ownership of 200+ legacy SAS jobs previously maintained by analysts, freeing up their time to focus on high-impact fraud detection initiatives.
  • Analyzed and documented each SAS job, identifying pain points and recurring failures to standardize support and ensure smooth operations.
  • Migrated legacy SAS data pipelines to Snowflake using DBT, replacing permanent SAS datasets with modular, efficient DBT models that improved performance and reduced storage costs.
  • Led knowledge transfer sessions and onboarded an offshore support team, delegating routine support tasks and ensuring 24/7 coverage.
  • Incrementally improved job design by identifying inefficiencies and modernizing data workflows, reducing failure rates and improving maintainability.
  • Created and maintained Switchboard data flows, including both auto-generated and custom (non-auto-gen) pipelines, by configuring YAML-based definitions to streamline and standardize data ingestion into Snowflake.
  • Leveraged YAML templates to manage metadata-driven pipeline creation, ensuring consistency, maintainability, and scalability across multiple data sources.
  • Implemented slowly changing dimensions (SCDs), snapshots, and data normalization using DBT (see the snapshot sketch after this list).
  • Automated DBT job runs and deployments through CI/CD pipelines and DBT Cloud.
  • Partnered with cross-functional teams to validate data needs and design Snowflake tables, ensuring that migrated data was reliable, reusable, and scalable.
  • Created and maintained project documentation in Confluence, including runbooks for DBT models, migration plans, SAS job overviews, and Snowflake data mappings.
  • Ensured compliance with data privacy requirements (e.g., HIPAA and PHI handling) via secure data zones.
  • Managed and maintained version control system using Git for multiple projects, ensuring code integrity and collaboration among team members.
  • Implemented branching and merging strategies to streamline development workflows and facilitate efficient code reviews.
  • Documented incident response guides, troubleshooting steps, and job dependencies to support the offshore team and reduce knowledge gaps across the organization.
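
As an illustration of the DBT snapshot work mentioned above, the following is a minimal sketch of an SCD Type 2 snapshot in dbt; the source, table, and column names (fraud, accounts, account_id, updated_at) are hypothetical placeholders rather than the actual Fifth Third models:

    {% snapshot accounts_snapshot %}
    {{ config(
         target_schema='snapshots',
         unique_key='account_id',
         strategy='timestamp',
         updated_at='updated_at'
    ) }}
    -- dbt compares incoming rows on updated_at and maintains dbt_valid_from /
    -- dbt_valid_to columns, yielding SCD Type 2 history without hand-written merge logic
    select * from {{ source('fraud', 'accounts') }}
    {% endsnapshot %}

Downstream dbt models can then read current or historical versions from this snapshot via ref('accounts_snapshot').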


Senior ETL Developer

PNC
12.2015 - 04.2023
  • Worked with a team of five to migrate the PNC data warehouse to a new data center (GFB).
  • Implemented Protegrity, an advanced data security solution, for RRW, where PNC generates compliance reports for the federal government.
  • Implemented the Informatica upgrade from 10.2 to 10.5.
  • Handled firewall requests, relational connections, server certifications, mainframe file migration, netmaps, and cross-environment data migration during the Informatica upgrade process.
  • Proficient in ETL transformations such as Joiner, Lookup, Expression, Filter, and Router.
  • Created Session, Command, Decision, and Email tasks.
  • Gathered requirements from the enterprise data warehouse team and collaborated with application users to implement Protegrity.
  • Mapped data sources and designed and configured the OFSAA data model and metadata for Basel I & II calculations, which are used for quarterly federal reporting.
  • Automated several Informatica jobs based on data availability from the EDW; handled PII data while protecting data integrity when working with external vendors.
  • Expertise in the banking domain, including mortgage and regulatory reporting.
  • Collaborated with multi-functional roles to communicate and align development efforts.
  • Automated reports & jobs using Shell Scripts and Informatica.
  • Worked with Flat Files, Mainframe files and XML files as Source Data.
  • Designed metadata tables for report backtracking.
  • Strong expertise in shell scripting.
  • Worked with Axiom, OFSAA, and OBIEE for reporting.

Senior ETL Developer

CareFirst BlueCross BlueShield
05.2014 - 12.2015
  • Developed mappings, workflows, and tasks using Informatica.
  • Gathered requirements from SMEs and contributed to meetings with BAs for new enhancements and ad hoc reports.
  • Extensively used Informatica PowerCenter Designer to create mappings with various transformations.
  • Created mappings, mapplets, workflows, and sessions. Created parameter files and validation scripts; validated warehouse data structure and accuracy.
  • Collaborated with multi-functional roles to communicate and align development efforts.
  • Worked with Informatica B2B DT to parse HL7-format and HIPAA files.
  • Modified and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routines based on redesign activities.
  • Migrated Informatica and UNIX code and maintained versioning using Subversion.
  • Created SCD1 and SCD2 mappings to load history tables and capture changes.
  • Created efficient error-logging methodologies using post-session commands and error-logging tables.
  • Created spreadsheets using Microsoft Excel for daily, weekly and monthly reporting.

ETL Developer/Support Analyst

UBS Wealth Management
01.2013 - 05.2014
  • Created mappings, sessions, and workflows, both reusable and non-reusable, depending on the requirement.
  • Used shell scripts to invoke workflows and capture the log files.
  • Worked with DB2 mainframe databases and flat files with UTF-8 and UTF-16 encodings.
  • Created Autosys jobs to schedule the developed workflows based on time and calendar constraints.
  • Provided production support for Autosys jobs and resolved job failures.
  • Extensively used most transformation types, including Filter, Joiner, Sorter, Rank, Update Strategy, Expression, Java, and Aggregator.
  • Improved the workflow runtime by tuning the mappings.
  • Implemented CDC (Change Data Capture) and SCD (Slowly Changing Dimension) Type 2 and Type 3 logic (a SQL sketch of the Type 2 pattern follows this list).
  • Deployed Informatica, UNIX, Oracle, and Autosys code into production using Subversion.
  • Created scripts for database monitoring, such as space availability, index status, and job failures.
  • Created UNIX scripts to transfer files from vendors via both FTP and SFTP, and shell scripts to invoke workflows and run jobs via Autosys.
  • Created the Technical Specification Document (TSD) for the project; involved in gathering business requirements, logical modeling, physical database design, data sourcing, data transformation, data loading, SQL, and performance tuning.
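
The SCD Type 2 logic mentioned above was implemented as Informatica mappings; purely for illustration, a plain-SQL sketch of the same pattern follows, using hypothetical dim_customer and stg_customer tables and columns:

    -- Step 1: close out current dimension rows whose source attributes have changed
    update dim_customer d
       set current_flag = 'N',
           effective_end_date = current_date
     where d.current_flag = 'Y'
       and exists (select 1
                     from stg_customer s
                    where s.customer_id = d.customer_id
                      and s.address <> d.address);

    -- Step 2: open a new current row for customers with no active version
    --         (covers brand-new customers and those expired in step 1)
    insert into dim_customer
           (customer_id, address, effective_start_date, effective_end_date, current_flag)
    select s.customer_id, s.address, current_date, null, 'Y'
      from stg_customer s
     where not exists (select 1
                         from dim_customer d
                        where d.customer_id = s.customer_id
                          and d.current_flag = 'Y');

A Type 3 variant would instead keep a prior-value column on the same row rather than inserting new version rows.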

ETL Developer/Production Support

Maryland State Education Association
08.2012 - 01.2013
  • Worked closely with business analysts to gather Business Requirement Specifications (BRS) and prepare technical specifications.
  • Extensively worked on PowerCenter 9.1 Designer client tools such as Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created sessions, command tasks, reusable worklets, and workflows in Workflow Manager.
  • Modified and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routines based on redesign activities. Created parameter files and validation scripts.
  • Used most transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), while transforming data according to the business logic.
  • Utilized several transformations in OWB, such as Sequence, Splitter, Deduplicator, and Constant.
  • Migrated data from staging to ODS and from ODS to RPT.
  • Moved mappings, sessions, workflows, and mapplets from one environment to another.
  • Worked with numerous flat files and loaded data from flat files to Oracle.
  • Handled multiple tasks at a time and completed them on time without any loss of quality.
  • Developed shell scripts for pre- and post-session commands for the developed mappings and for scheduling.
  • Worked with UNIX shell scripts to automatically run sessions, abort sessions, and create parameter files; wrote a number of shell scripts to run various batch jobs.

ETL Developer/Support Analyst

Medtronic
05.2009 - 08.2012
  • Worked with the architect in designing the architecture and building the tables.
  • Extensively used Informatica Power Designer to create tables and generate shell scripts.
  • Created Parameter files and validation scripts.
  • Extensively used most transformation types, including Filter, Joiner, Sorter, Rank, Update Strategy, Expression, and Aggregator.
  • Used Repository Manager to migrate code across environments, from staging to development and development to test.
  • Worked with the pmcmd command-line program to communicate with the Informatica server and to start, stop, and schedule workflows.
  • Created several UNIX job scripts to run jobs automatically.

PHP Developer

Deniro Marketing
07.2008 - 05.2009
  • Developed themes and modified modules according to requirements.
  • Designed forms and created dynamic front-end validation using JavaScript. Worked with Smarty templates to develop pages that allow users to register on the site.
  • Worked on CSS to design the layouts.
  • Modified SQL logic to implement changes per requirements. Worked with different video formats and players, such as .flv and .swf.
  • Worked with Sessions and Cookies to store the user information while navigating the website.
  • Presented ideas in several meetings to improve the website and increase site traffic.
  • Modified images using Photoshop per requirements.

Education

Master's degree - Management Information Systems, General

Texas A&M International University
Laredo, TX
12.2009

Skills

  • Data Warehousing
  • SQL data analysis techniques
  • ETL process management
  • Unix
  • Root Cause Analysis
  • Data integration solutions
  • Expertise in banking and healthcare

Software

Informatica

SQL

Snowflake

DBT

Oracle

Datastage

SAS
