
Swapna Thankappan

Summary

To work in a challenging and rewarding environment that utilizes my skills and expertise in data engineering, data migration, data quality, data governance, and data analysis. Excellent problem solver with strong business, technical, and analytical skills, able to design and implement valuable data management systems and solutions that identify key opportunities for greater effectiveness and efficiency.

Results-oriented data engineering professional with a proven track record of developing and implementing comprehensive data engineering and data management solutions. More than fifteen years of IT experience spanning requirement gathering, business analysis, data analysis, design, development, and testing across a range of technologies and tools.

  • In-depth hands-on experience in database and ETL/ELT design and development, with excellent data profiling, data analysis, and data governance skills and thorough knowledge of project delivery
  • Well experienced with the Informatica product suite, including ETL data integration with Informatica PowerCenter, Informatica PowerExchange for mainframe sources, Informatica Cloud, Informatica Administrator, Informatica Data Quality (IDQ) for data profiling, Informatica Metadata Manager, data masking, Informatica Analyst, and Informatica Data Catalog
  • Expertise with multiple relational databases (Oracle, DB2, Teradata, Microsoft SQL Server, Netezza) for the design, development, production deployment, and maintenance of database objects; expert in writing complex SQL queries and PL/SQL scripts
  • Excellent experience with enterprise job scheduling and workload automation tools (Tivoli, Control-M), including integration with applications and databases
  • Experienced in UNIX shell scripting, with working knowledge of Python
  • Working experience with business intelligence and reporting tools (Power BI, Tableau, Cognos); involved in developing and maintaining data visualizations and dashboards that support business decision making
  • Skilled in designing, developing, and delivering data integration solutions using a variety of tools and technologies
  • Exposure to cloud-based platforms such as AWS, including the AWS RDS cloud database
  • Expertise in version control (Bitbucket, GitHub) and CI/CD processes using Git and Jenkins
  • Experienced with Agile project management methodologies (Scrum, Kanban) and tools (Jira); participate in and drive quarterly PI (Program Increment) planning meetings, working with squad leads to build backlogs for upcoming sprints
  • Served as SME for the data engineering and data analysis teams; expert in resolving production issues that could impact the business; provided technical mentorship to the team and collaborated with stakeholders to explain technical details
  • Delivered multiple programs on time, taking the initiative to confirm requirements before starting work and identifying ways to improve on the job; well experienced in driving projects end to end, from requirement gathering through production deployment

Overview

17
years of professional experience
1
Certification

Work History

Principal Data Engineer

Fidelity Investments
03.2015 - Current
  • Working as Principal Data Engineer on the CDP (Common Data Platform, formerly DAL SDS – Data Access Layer / Shared Data Service) team under FFIO
  • CDP is a centralized data hub platform that provides data and solutions to multiple internal and external consumers in the form of APIs and feeds
  • Key resource on the team working on multiple initiatives, including the design, development, and review of database, data integration, and data quality solutions; ETL using Informatica on the cloud-based FDIP platform; and scheduling of data integration pipelines with the Control-M scheduler
  • Performed a program management role within the CDP business group for the decommissioning of a major application, SDS Integration, in the CDP/SDS space
  • Coordinated with around 30 business units, internal and external to Fidelity, on changes, non-production/production testing, production deployment, and consumer validation
  • Coordinated changes with the Data Strategy & Governance team (DRAGON – Data Registry, Analytics, Governance, Oversight, and Navigation, which handles data governance and data quality) and UDZ (Unified Drop Zone, a file hub platform within the Data Strategy & Governance team) to ensure data quality and data cataloging
  • Played a pivotal role in the Fundref decommission program, working diligently to produce detailed data analysis of the attributes consumed by multiple applications across Fidelity via APIs, feeds, and direct database access
  • Worked on Domain API conversion projects, re-pointing API attributes sourced from the Fund Reference DB to the MDM database
  • Worked with the database team on the divisional schema for new MDM tables, data model reviews, and DB object development
  • Collaborated with stakeholders across Fidelity (both internal and external) to understand business requirements and develop the required solutions
  • Developed and implemented Control-M jobs to automate Informatica data processing and report generation across multiple environments
  • Proficient in Control-M functionality such as job creation, calendar management, and dependency management, ensuring proper execution order for complex workflows
  • Set up dependencies and deployed Control-M jobs to the production data center using Auto Install packages
  • Expertise in job scheduling, monitoring job executions, identifying issues, and troubleshooting failures within Control-M environments
  • Completed a cloud migration initiative moving the feed ecosystem from on premises to Amazon AWS
  • Involved in EC2 IAM role creation, KMS key linking, IAM role setup, and linking S3 IAM roles with KMS keys across multiple environments
  • Part of the cloud database migration team moving databases from on premises to cloud RDS
  • Experienced in using AWS DynamoDB, Amazon RDS, etc.
  • Experienced in implementing data integration, data movement, and ETL using AWS Glue, the Glue Data Catalog, and Glue DataBrew
  • Also experienced with Amazon Kinesis Data Streams, EMR, and Amazon QuickSight for creating dashboards
  • Implemented data integration solutions and pipelines for inbound files received from upstream, loading them into staging and target tables via Informatica ETL processes
  • Implemented and deployed many ETL processes to production using Informatica and the Control-M scheduler
  • Also handled multiple Informatica version upgrades
  • Worked with the database team to set up GoldenGate replication of MDM data into the DAL/CDP applications
  • Handled multiple ad hoc and major production deployment releases of database, ETL, and Control-M changes, including solutioning, development, code review, testing support, code deployment, and release support
  • As part of the CDP cloud migration project, actively working on design discussions covering the data consolidation strategy, data migration initiatives, and the legacy application migration strategy
  • Worked with external consumers and auditing teams (PwC, Deloitte, FIC PwC) on walkthrough sessions of the data strategies and processes implemented for SOC 1 requests
  • Led Informatica version upgrades from 10.4 to 10.5, SMI upgrades from TCP to SSL, and the testing associated with the upgrades
  • Working on onboarding the Snowflake database in CD ODS
  • Contributed to the design, development, testing support, code deployment, and code review of database, ETL, and Control-M objects for the Mainframe Migration project across all regions
  • Managed the historical data migration effort from the legacy DB to the cloud DB: worked with team members in design brainstorming sessions, obtained approval from the architecture team, and implemented the chosen design approach
  • Played a major role in migrating the ETL data engineering platform from PaasDI to FDIP on the cloud
  • Contributed a major role in the JET Money Market cutover project
  • Development, testing support, code review, and code deployment of DB, ETL, and Control-M objects in production
  • Also played a major role in install support and post-install validation
  • Provided necessary support for disaster recovery switch activities
  • Addressed critical production issues and provided resolution to consumers at the earliest
  • Mentored junior members of the team and helped with the onboarding process
  • Worked closely with teammates, often helping others resolve their issues efficiently
  • Conducted KT sessions for new resources during onboarding and provided project overviews
  • Environment: Oracle (AWS RDS), SQL Developer, Informatica 10.5.2, Control-M v21, Power BI, Enterprise Jira, Jenkins Core, GitHub
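The inbound-file pattern described above (load the upstream file into a staging table, then merge staging into the target) can be sketched in plain Python. This is a minimal stand-in, not the production pipeline: sqlite3 substitutes for Oracle/Informatica, and all table and column names (`stg_fund`, `fund`) are illustrative.

```python
import csv
import io
import sqlite3

# Minimal sketch of a staging-to-target load, assuming a single-key upsert.
# sqlite3 stands in for the Oracle/Informatica stack; names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_fund (fund_id TEXT, nav REAL)")
cur.execute("CREATE TABLE fund (fund_id TEXT PRIMARY KEY, nav REAL)")

# Inbound file from upstream (inlined here for the sketch).
inbound = io.StringIO("fund_id,nav\nF001,10.25\nF002,99.10\n")
rows = [(r["fund_id"], float(r["nav"])) for r in csv.DictReader(inbound)]

# Step 1: truncate-and-load the staging table from the inbound file.
cur.execute("DELETE FROM stg_fund")
cur.executemany("INSERT INTO stg_fund VALUES (?, ?)", rows)

# Step 2: merge staging into the target (insert new keys, update existing).
cur.execute("""
    INSERT INTO fund (fund_id, nav)
    SELECT fund_id, nav FROM stg_fund WHERE fund_id IS NOT NULL
    ON CONFLICT(fund_id) DO UPDATE SET nav = excluded.nav
""")
conn.commit()
loaded = cur.execute("SELECT COUNT(*) FROM fund").fetchone()[0]
print(loaded)  # 2
```

Re-running the same load is idempotent: existing keys are updated in place rather than duplicated, which is the property the stage-then-merge pattern buys.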

Sr. Data Engineer - Consultant

Fidelity Talent Source (Veritude)
03.2015 - 09.2021
  • Company Overview: Client - Fidelity Investments
  • Worked as a Sr. Data Engineer in Fidelity Investments' FFIO division (formerly FPCMS)
  • Implemented the External Money Manager (EMM) solution using Informatica ETL, Data Transformation Studio, and the Unstructured Data (UDT) transformation
  • Led multiple projects through all phases: analysis, design, development (both DB and ETL), testing, and code deployment to production
  • Created database objects and ETL processes to deliver data from the DAL database to different downstream systems via point-to-point feeds
  • Converted Excel UDAs into feed-file deliverables for downstream systems by implementing database and ETL solutions
  • Developed and maintained Change Data Capture (CDC) solutions using Informatica PowerExchange, PowerCenter, and Oracle Database
  • Automated and scheduled the previously manual generation of auditing-team (PwC & Deloitte) reports using Oracle Database, Informatica ETL, and Control-M
  • Responsible for the monthly delivery of auditing-team reports and demos of report generation using Informatica ETL, Oracle Database, and Control-M
  • Responsible for Informatica version migrations from 9.6 to 10.1 and 10.2, and for SHA-1 to SHA-2 certificate upgrades
  • Led production code deployments and validations for database, ETL, and Control-M
  • Set up TRA review meetings with different teams
  • Involved in the migration and testing of moving the Informatica infrastructure from the legacy PaasDI environment to the new FDIP cloud environment
  • Provided support to the production support team for job failures and monitoring
  • Experienced in data cleansing using Informatica Data Quality: profiled data by creating profiles, scorecards, and rules, and monitored data anomalies using Informatica Developer
  • Performed data analysis using Informatica Analyst and used Metadata Manager to identify data lineage
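The CDC work above (PowerExchange capturing changes, PowerCenter applying them) boils down to replaying insert/update/delete records against a target in order. A minimal pure-Python stand-in, with hypothetical record shapes rather than the actual Informatica implementation:

```python
# Sketch of applying change-data-capture (CDC) records to a target table,
# in the spirit of a PowerExchange/PowerCenter CDC flow. The (op, key, data)
# record shape is illustrative, not a real PowerExchange format.
def apply_cdc(target, changes):
    # Each change carries an op code: I(nsert), U(pdate), or D(elete).
    for op, key, data in changes:
        if op == "I":
            target[key] = dict(data)
        elif op == "U":
            target[key].update(data)   # assumes the key already exists
        elif op == "D":
            target.pop(key, None)      # deleting a missing key is a no-op
    return target

tgt = {}
apply_cdc(tgt, [("I", 1, {"nav": 10.0}),
                ("U", 1, {"nav": 10.5}),
                ("I", 2, {"nav": 99.0}),
                ("D", 2, None)])
print(tgt)  # {1: {'nav': 10.5}}
```

The essential point is ordering: changes must be applied in capture sequence, or an update can land before its insert and a delete can resurrect stale data.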

Data Migration Lead

TMX Finance
Carrolton, United States
01.2013 - 03.2015
  • Company Overview: TMX Finance is a consumer specialty finance company in the United States, headquartered in Savannah, Georgia, that provides title loans to customers based on vehicle titles
  • Responsible for maintaining the Data Governance strategy within a data area and for executing Data Governance activities
  • Worked with subject matter experts within the business to drive the development and publication of data dictionaries for all business-critical data sets and data items
  • Pioneered the creation of a data catalog and business glossary, standardizing terminology across multiple departments and improving cross-functional collaboration
  • Established rules and procedures for how data is used, managed, shared, and accessed with customers, third parties, and internal business units, in conjunction with Information Security, Global Risk & Assurance, Compliance, and Legal
  • Partnered with key stakeholders to understand current and future business needs and how data is leveraged to enable effective and efficient decision making
  • Ensured the Data Governance and Management program aligned with the strategic objectives of the company
  • Provided technical leadership to effectively implement key concepts such as data lineage, metadata management, data quality measurement, and data stewardship
  • Conducted and maintained source-system analysis and data profiling for lifecycle management of critical data elements
  • Identified the root causes of data quality issues and created and managed project plans to remediate them
  • Involved in data integration projects for the EDW: interacted with business users for requirement gathering, analysis, and design; prepared source-to-target data mapping documents; developed complex mappings; performed unit testing of system functionality; and corrected defects during various phases of testing
  • Also participated in testing in QA environments
  • Participated in business requirements and design sessions, translating business and functional requirements into detailed technical specifications
  • Performed extract, transform, and load into the enterprise data warehouse using the Informatica 9.5.1 suite of products, and worked as Informatica administrator for non-production environments
  • Worked on a proof of concept (POC) creating scorecards and setting up profiles and rules for customer and address validation using Informatica Data Quality (IDQ)
  • Worked on multiple projects including data integration using Informatica PowerCenter tools, Informatica administration, and ETL/Salesforce integration using Informatica Cloud
  • Involved in Informatica development bringing different POS systems into the EDW using Type 2 SCDs from a variety of relational sources (Oracle, SQL Server) and flat files
  • Created shell scripts for backups of the repository and domain databases
  • Also responsible for release management activities such as deployment walkthroughs and change control coordination for deployments
  • Worked with Informatica Corporation on issue resolution and upgrades, and served as primary account holder for Informatica products
  • Responsible for creating and managing Informatica Cloud accounts (users, user groups), maintaining licenses for the organization, and migrating objects between orgs
  • Worked closely with the Analytics and Reporting team to support business analytics and end-user reporting using Cognos and Business Objects
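The Type 2 SCD loads mentioned above preserve history by expiring the current dimension row and inserting a new version whenever a tracked attribute changes. A minimal sketch of that rule, with illustrative field names (not the actual EDW schema):

```python
from datetime import date

# Sketch of a Type 2 slowly-changing-dimension update: close the current
# row and append a new version when tracked attributes change. Pure-Python
# stand-in for the Informatica mapping; row shape is illustrative.
def scd2_apply(dim_rows, key, new_attrs, as_of):
    current = [r for r in dim_rows
               if r["key"] == key and r["end_date"] is None]
    for row in current:
        if row["attrs"] != new_attrs:
            row["end_date"] = as_of            # expire the old version
            dim_rows.append({"key": key, "attrs": dict(new_attrs),
                             "start_date": as_of, "end_date": None})
    if not current:                            # brand-new key: first version
        dim_rows.append({"key": key, "attrs": dict(new_attrs),
                         "start_date": as_of, "end_date": None})
    return dim_rows

dim = []
scd2_apply(dim, "C001", {"city": "Dallas"}, date(2014, 1, 1))
scd2_apply(dim, "C001", {"city": "Irving"}, date(2014, 6, 1))
print(len(dim))  # 2 versions: the first expired, the second current
```

Unchanged attributes produce no new row, so the dimension only grows when something actually changed, and point-in-time queries can filter on `start_date`/`end_date`.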

Lead ETL/DB Developer

HMS Holdings Inc
Irving, United States
02.2012 - 11.2013
  • Company Overview: HMS is a healthcare company that helps clients ensure healthcare claims are paid correctly by the responsible party and that those enrolled to receive program benefits meet qualifying criteria
  • Involved in requirement gathering, design and analysis, preparation of source-to-target documents, development, implementation, unit testing, data reconciliation and validation, QA testing, production deployment, job automation, and production support of ETL processes for multiple jobs
  • Worked with Informatica Data Explorer (IDE) to set up the data framework and mapping templates for data profiling
  • Worked with business analysts to gather requirements and to design, develop, test, and deploy mapping templates
  • Built Data Hub extracts pulling data from the Consolidated Claims and Paid Claims tables in the Teradata warehouse using FastExport, writing it to flat files with a specified layout for the EDI team
  • Involved in the design of the claims data warehouse built on Teradata; designed Teradata BTEQ scripts for data transfer, Teradata Parallel Transporter (TPT) scripts for loading data into Teradata staging tables, and merge scripts for loading data from stage to base tables
  • Extensively involved in designing, developing, implementing, testing, and supporting in production ETL processes using Teradata utilities (Teradata Parallel Transporter, SQL, BTEQ, MultiLoad, FastLoad) and Informatica for data loading and extraction
  • Designed, developed, maintained, and tested ETL and BI processes
  • Wrote Linux scripts to execute Teradata and Informatica code
  • Wrote FastExport scripts to export data from the Teradata warehouse
  • Worked closely with business users and business analysts to develop requirements, then developed, tested, and deployed mappings and workflows to production and automated jobs in Tivoli
  • Provided process and performance improvements to existing Informatica processes, mentored junior developers, and performed code reviews and performance optimization of their projects
  • Worked on multiple projects simultaneously, participated in on-call production support on a rotating basis, and provided hands-on technical support for critical ETL break/fix issues
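The FastExport extracts above deliver rows to the EDI team as flat files with a fixed-width layout. A small sketch of that formatting step in Python; the field names and widths here are hypothetical, not the actual EDI layout:

```python
# Sketch of writing extracted rows to a fixed-width flat file with a
# specified layout, as a FastExport-style extract would.
# Field names and widths are hypothetical, not the real EDI spec.
LAYOUT = [("claim_id", 10), ("member_id", 8), ("amount", 12)]

def to_fixed_width(row):
    # Left-justify text fields; right-justify the numeric amount field.
    out = []
    for name, width in LAYOUT:
        val = str(row[name])
        out.append(val.rjust(width) if name == "amount" else val.ljust(width))
    return "".join(out)

rows = [{"claim_id": "CLM001", "member_id": "M123", "amount": "150.00"}]
line = to_fixed_width(rows[0])
print(len(line))  # 30 = 10 + 8 + 12: every record is exactly layout-width
```

Because every record is exactly the sum of the column widths, the downstream parser can slice fields by position with no delimiters, which is the point of the agreed layout.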

Sr. DB/Informatica Developer

HMS Holdings Inc
Irving, United States
07.2010 - 02.2012
  • Worked on different Informatica projects under the Cost Avoidance & Verification team
  • Expertise in using Informatica PowerExchange to create data maps for COBOL/mainframe files and importing those data maps for use in Informatica PowerCenter mappings

Allied Health care Inc
Farmers Branch, United States
03.2010 - 06.2010

People Logics
Dallas, United States
11.2007 - 05.2008
  • Company Overview: Client - RIA Services Inc

Education

Bachelor of Technology - Computer Science & Engineering

Post Graduate Diploma - Computer Science & Engineering

Diploma - Computer Engineering

Skills

  • Oracle 21/19C/12C
  • Teradata 14.10
  • Netezza
  • IBM DB2
  • MS SQL Server 2008/2005
  • MS Access
  • Informatica Products
  • Informatica PowerCenter 10.5.2/10.4/10.2/9.5.1/9.1/8.6.1
  • Informatica PowerExchange 9.5.1/8.6
  • Informatica Cloud
  • Informatica Administrator
  • MicroStrategy
  • Cognos Series 7.0/6.0 Impromptu
  • Cognos ReportNet 1.1.2
  • Business Objects 6.5/6.0/5.1/5.0
  • Tableau
  • Power BI
  • SQL
  • PL/SQL
  • Unix Shell Scripting
  • Python
  • Informatica Developer (IDQ)
  • Informatica Metadata Manager
  • Informatica Analyst
  • Informatica MDM
  • Data catalog
  • GitHub
  • Bitbucket
  • Harvest
  • Tivoli
  • Control-M
  • Aginity Workbench for Netezza
  • SQL Developer
  • PL/SQL Developer 7.0.3
  • Teradata SQL Assistant
  • TOAD
  • SQL Navigator 4.0
  • SQL*Loader
  • Oracle Forms/Reports 9i/10g
  • Microsoft Excel
  • Word
  • Access
  • PowerPoint
  • Visio
  • Project
  • AWS Glue
  • Kinesis
  • Lambda
  • AWS Lake Formation
  • Amazon Athena
  • DynamoDB
  • Amazon QuickSight
  • Leading a team of data engineers
  • Mentoring & Coaching Junior engineers
  • Good communication skills
  • Quick learner
  • Technical learning
  • Excellent team player
  • Strong problem-solving skills
  • Quality and process improvement oriented
  • Ability to work on multiple projects
  • Good knowledge in writing software/system requirements documents
  • Data integration
  • Data warehousing
  • Data modeling
  • Data pipeline design
  • SQL expertise
  • Data migration
  • ETL development

Certification

  • Teradata 12 Certified Professional
  • Oracle Certified Professional (OCP)
  • Certified Scrum Master

Training

  • Informatica Power Center 9.x Administrator
  • Informatica Cloud Services
  • Informatica Power Center Developer Level 1 & Level II
  • Data Quality 9.x Developer
  • Data Quality Administrator
  • ILM Dynamic Data Masking 9.x Developer, Administrator
  • Informatica Metadata Manager
  • Informatica Analyst
  • Informatica Data Catalog
  • Data Governance Training Development
  • Data Governance Tools and Technologies
  • SnapLogic ETL Beginner & Advanced Training
  • Architecting on AWS
  • AWS Cloud Practitioner
  • AWS Data Engineer
  • Snowflake Foundations Training

Personal Information

Visa Status: US Citizen

Timeline

Principal Data Engineer

Fidelity Investments
03.2015 - Current

Sr. Data Engineer - Consultant

Fidelity Talent Source (Veritude)
03.2015 - 09.2021

Data Migration Lead

TMX Finance
01.2013 - 03.2015

Lead ETL/DB Developer

HMS Holdings Inc
02.2012 - 11.2013

Sr. DB/Informatica Developer

HMS Holdings Inc
07.2010 - 02.2012

Allied Health care Inc
03.2010 - 06.2010

People Logics
11.2007 - 05.2008

Bachelor of Technology - Computer Science & Engineering

Post Graduate Diploma - Computer Science & Engineering

Diploma - Computer Engineering
