
Midhun Putchakayala

Raleigh, NC

Summary

Data Engineer/Data Modeler with over 16 years of experience building and managing Data Warehouses, Data Marts, and Data Lakes.
  • Proficient in all facets of the data lifecycle, including requirements gathering, design, development, implementation, testing, and support
  • Experienced in dimensional modeling: designing star schemas, snowflake schemas, dimensions, fact tables, and aggregate reporting tables
  • Good understanding of the Insurance Information Warehouse (IIW) data vault model, the party model, and relational database design for applications
  • Experienced in providing data integration solutions using ETL and ELT methodologies to build stable, scalable data warehouses, data marts, operational data stores, and downstream extracts
  • Developed and maintained data processing logic using Apache Spark within Azure Databricks notebooks
  • Collaborated with cross-functional teams to gather requirements, design solutions, and provide technical expertise on Azure cloud architecture and services
  • Developed many complex ETL jobs using Informatica PowerCenter (8.x/9.x/10.2), Talend, and SSIS, loading various target instances from sources such as relational tables and flat files; also worked on Informatica Cloud
  • Strong experience in data profiling, data mapping, and data quality
  • Strong problem-solving skills; a quick learner, open to new ideas
  • Experienced in production support activities such as ETL load recovery, outage handling, and tool upgrades
  • Built several Cognos framework packages, data modules, and dashboards using IBM Cognos Analytics that are well received by a variety of audiences and used for day-to-day operations, FDIC audits, and leadership review.

Overview

17 years of professional experience
1 Certification

Work History

Lead Data Engineer

Medical Solutions
Remote
07.2022 - Current
  • Led and managed a dynamic data engineering team, consisting of 8 members, to effectively support the data requirements of data science, data governance, and business intelligence initiatives
  • Orchestrated cross-functional projects with data governance teams, ensuring compliance with data quality standards, privacy regulations, and industry best practices throughout the data engineering lifecycle
  • Implemented agile methodologies within the data engineering team, enabling rapid iteration and adaptation to changing business needs while maintaining a focus on delivering high-quality solutions
  • Contributed to strategic discussions on data architecture, scalability, and data platform evolution, ensuring that the organization's data ecosystem remained modern, efficient, and future proof
  • Spearheaded the design and implementation of scalable data pipelines and ETL processes that facilitated the seamless movement, transformation, and integration of data from various sources into data warehouses or lakes
  • Mentored and coached data engineers, fostering a collaborative and innovative environment that encouraged skill development, knowledge sharing, and continuous improvement within the team
  • Demonstrated ability to architect and deliver scalable enterprise solutions combining various Azure services
  • Led DataOps teams to ensure all production-level Azure Data Factory and Databricks jobs completed within SLA; created escalation processes and built a knowledge base of recurring issues and resolutions
  • Pioneered automation and optimization strategies for data operations, leading to streamlined workflows, reduced manual interventions, and increased efficiency by leveraging Azure Functions, PowerShell scripting, and CI/CD practices
  • Expertly navigated complex troubleshooting scenarios, applying a data-driven methodology to pinpoint root causes and execute impactful solutions across Azure-based ecosystems including Dynamics 365, Data Lakes, and more.

Senior Data Modeler and ETL Developer

Bangor Savings Bank
Bangor, ME
09.2017 - 07.2022
  • Built Data models, ETL Packages and Cognos Framework Package to build Scorecards for Banks different leadership committees
  • These scorecards include a variety of KPIs for the Loans, Deposits, Accounting, and IT departments
  • Created Dimensional model/ETL/ BI Package for Workday DM, Wealth Management group performance Dashboard, Business Loans Underwriting Process, Payroll Protection Program, Customer Demographics Dashboard, Prepaid Cards Launch
  • Created Party model for internal CRM, identified data quality issues for initial and ongoing data loads, created party merging process, loaded different customer relationships/interactions into party model tables
  • Developed data pipelines to ingest data by querying various APIs provided by external vendors such as Paywith, Marqeta, and Moody's
  • Performed data analysis to identify parameters for reassigning customers to branches, estimated the true value of each branch, and coordinated the resulting moves with the core banking team
  • Performed several Cognos upgrades and coordinated core banking upgrades, mitigating the reporting impacts
  • Performed data analysis using Python and the Google Distance Matrix API to calculate distances from customer addresses to several nearby branch locations and thus identify the nearest branch
  • Developed SSIS packages which calculate Fiscal Year to Date calculations for different metrics along with monthly and FYTD growth calculations using complex SQL queries
  • Developed ETL to build data lake for lending cloud and built business conformance layer to build reports like underwriter productivity, average time at each step of lending process
  • Built BUOY Local/Prepaid Cards reports and Dashboards in Cognos Analytics and was responsible for creating data model/ETL and data modules/data sets in Cognos
  • Performed data profiling and analysis using complex SQL queries to identify data quality issues
  • Experienced in creating road maps, data flow diagrams, effort estimates, and sprint plans
  • Completed POCs on Snowflake, Talend, and Power BI.

Data Architect/ETL Developer

Tabner Inc. (Client: IDEXX Laboratories)
Westbrook, ME
03.2016 - 09.2017
  • Designed and documented ETL load strategies needed to support required staging, warehouse, or operation data store
  • Prepared ETL specifications for two-way integration of data (esp. Contacts & Accounts) from BEACON to ODS and to Salesforce
  • Created external and custom objects, process builder flows, triggers, formulas in salesforce
  • Created OData connections and objects in salesforce that calls Informatica cloud real-time services
  • Created data replication and data synchronization tasks in Informatica Cloud to sync on-premises database tables with Salesforce objects
  • Worked with salesforce developers and suggested relationships to be used to fetch data & show on Account page layouts
  • Created views (Orders, Activities, Opportunities, Cases, Sales history) in on-premise databases to be accessible in Salesforce through OData
  • Planned Initial load of Business data (static data like Territories, user assignment) into Salesforce
  • Provided ETL solutions to load data to Salesforce and capture errors and report on them
  • Provided solutions for near real-time data integration (Credit Block on Accounts) between on-premise database and Salesforce functionality
  • Identified all the systems where data is created, edited, or deleted and synchronized the changes to other systems as required
  • Developed ETL jobs that consume RESTful services and update salesforce objects
  • Developed UNIX scripts to copy required files from FTP server and to archive them after ETL completion
  • Understood Salesforce objects and relationships and guided Salesforce developers in writing SOQL queries and building Process Builder jobs.

ETL Lead/Architect

Capgemini, India (Client: Zurich Insurance, UK)
11.2014 - 01.2016
  • Served as ETL Lead for data mapping from 4 legacy claim systems to the IBM Insurance Information Warehouse data model, and as Data Modeler for the data marts
  • Implemented an in-house data profiling tool using SQL Server procedures, producing results comparable to the Informatica IDQ tool and saving the team 3 weeks of effort
  • Prepared High-level plan for different ETL streams (Operation Mart, Downstream/Interface Extracts, and Data Marts) in the scope of this project
  • Reviewed the project plan, adding dependencies and raising risks from the data warehouse team's perspective to program management
  • Designed and Developed Complex ETL mappings for special business scenarios like Policy Chaining, Policy Split - Unified view of policy from multiple source systems
  • Maintained a requirements-to-code traceability matrix and prepared scheduling of new jobs along with phase 1 jobs
  • Designed Incremental load strategy for pulling data from multiple legacy source systems
  • Proposed solutions for complex data scenarios by sub-line of business and prepared low-level design documents for ETL mappings
  • Defined and maintained the stage layer to act as a canonical model for integrating data from multiple legacy systems like FAME/CHS/UNITY
  • Reported data quality issues in the FAME & UNITY source systems, saving significant design rework
  • Acted as IIW specialist for CLAIM, EVENT, PARTY, and AGREEMENT subject areas
  • Identified Subset data in legacy systems for Unit Testing & System Testing
  • Developed complex Type-II mappings for the POLICY, COVERAGE, and CLAIM fundamental entities.

Senior ETL Developer/ETL Lead

Optum (Formerly United Health Care Information Services)
03.2010 - 11.2014

ETL/BI Developer

Mahindra Satyam at GE Energy
06.2007 - 03.2010

Education

Bachelor of Engineering - Electrical and Electronics

Jawaharlal Nehru Technological University
05.2007

Skills

  • Azure Data Factory
  • Azure Databricks
  • Apache Spark
  • Informatica PowerCenter 10.2
  • Informatica Cloud
  • SSIS
  • Talend
  • Relational Junction
  • Microsoft Azure Cloud
  • Python
  • PySpark
  • Scala
  • IBM Cognos Analytics
  • Tableau
  • Power BI
  • Dashboards
  • ETL
  • Dimensional Modelling
  • Data warehouse
  • Data Lake
  • Hadoop
  • Spark
  • Google Cloud
  • Azure
  • Data Analysis
  • Data Mapping (Lineage)
  • Data Profiling
  • Data Quality
  • DB2
  • Oracle
  • SQL Server
  • Netezza
  • SQL
  • T-SQL
  • SOQL (salesforce)
  • MongoDB
  • Erwin Data Modeler
  • Toad Data Modeler

Certification

  • Informatica Certified Developer
  • Data Visualization and Storytelling, Northeastern University
  • Introduction to Data Analytics and R, Northeastern University

Accomplishments

  • Impact Award at Bangor Savings Bank
  • Star Award for innovative solution at Capgemini, India
  • Sustaining Edge Award for major data warehouse integration at United Health Group
  • Several Client (GE Energy) appreciations at Mahindra Satyam
