With extensive experience in ETL development and Informatica PowerCenter, I significantly enhanced data integration processes at the University of Maryland, demonstrating strong analytical skills and a commitment to precision. My collaborative approach and expertise in Oracle and AWS services have driven successful project outcomes, underscoring my ability to lead and innovate in challenging environments.
Overview
17 years of professional experience
Work History
Admin, Sr ETL Developer
University Of Maryland
Maryland
01.2021 - 02.2023
Served in an admin role, taking backups of the repositories in the Informatica PowerCenter Admin Console before jobs were converted
Performed end-to-end (E2E) testing of ETL functionality in parallel in both systems to ensure the code worked as expected in both PowerCenter and IICS
Worked with the XML Parser transformation and loaded data into Workday
Worked with web services and pushed data to Workday
Migrated jobs from PowerCenter to IICS with the help of the PC Analyzer tool
Developed a pre-release environment in IICS and coordinated with the Informatica vendor team before the PowerCenter jobs were migrated
Integrated data using IICS for reporting needs
Used the Web Services transformation to pull data from Workday and load it into interface tables and files
Used the XML Parser transformation to pull contingent, rescind, and regular employee data from Workday and push it to ADP
Developed jobs loading data from SAP HANA and SAP BW into an Oracle database
Worked with the SAP and networking teams to verify that the firewall allowed the SAP systems and IICS to communicate with each other
Coordinated with the QA team to perform data loads and prepare test cases
Worked with the reporting team to generate files per reporting standards
Coordinated with the production team to migrate code from IICS UAT to IICS Prod
Executed both positive and negative test cases while developing the code
Attended daily business meetings and created mapping documents based on business needs to perform ETL operations
Coordinated with the business teams to simulate test data in the Dev, QA, and UAT environments
Used UltraEdit to compare and validate data
Used Control-M to schedule jobs
Project: IICS Data Integration/IICS Application Integration, Environment: Informatica 10.2, IICS, Oracle 12c, SQL, T-SQL, PL/SQL, SQL Loader, Salesforce, AWS, Tableau (reporting tool), ServiceNow (ticketing tool), JIRA
Responsibilities
Developed Informatica Cloud Data Integration mappings and taskflows to extract and load data between on-premises systems, AWS RDS, Amazon S3, Redshift, Azure SQL Data Warehouse, and Azure Data Lake Store; created and configured all kinds of cloud connections and runtime environments with Informatica IICS
Operated the AWS console to configure services
Developed Redshift and RDS queries to confirm data was loaded correctly
Created Salesforce connections and implemented Salesforce business processes with Informatica IICS Data Integration
Loaded Student Information System data from Salesforce to Redshift tables through IICS with the help of Synchronization and Replication tasks
Worked with IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter, and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields, bulk field renaming, and more
Integrated data using IICS for reporting needs
Coordinated with the production team to migrate code from IICS UAT to IICS Prod
Handled production issues
Executed both positive and negative test cases while developing the code
Attended daily business meetings and created mapping documents based on business needs to perform ETL operations
Coordinated with the Salesforce team to simulate test data in the Dev, QA, and UAT environments
Integrated the existing Salesforce CRM application with the new Salesforce CRM application through IICS
Migrated data to the Azure cloud with the help of IICS
Loaded FSC data from Workday into Oracle tables
Migrated ETL from Alteryx to the IICS cloud.
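The Workday extract-and-push flow above can be sketched in Python; the XML layout, element names, and worker types below are illustrative assumptions, not the actual Workday schema, and a real extract would come from a web service call rather than a string.

```python
import xml.etree.ElementTree as ET

# Illustrative Workday-style response; the real Workday schema and
# field names differ and would arrive via a Web Services call.
SAMPLE_RESPONSE = """
<Workers>
  <Worker type="Regular"><ID>1001</ID><Name>Ann Lee</Name></Worker>
  <Worker type="Contingent"><ID>1002</ID><Name>Raj Patel</Name></Worker>
  <Worker type="Rescind"><ID>1003</ID><Name>Sam Cho</Name></Worker>
</Workers>
"""

def parse_workers(xml_text):
    """Flatten worker records so they can be loaded into interface tables."""
    root = ET.fromstring(xml_text)
    return [
        {"id": w.findtext("ID"), "name": w.findtext("Name"), "type": w.get("type")}
        for w in root.findall("Worker")
    ]

rows = parse_workers(SAMPLE_RESPONSE)
# Split by worker type (contingent/rescind/regular) before pushing downstream.
contingent = [r for r in rows if r["type"] == "Contingent"]
print(len(rows), contingent[0]["name"])  # 3 Raj Patel
```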
Sr ETL Developer
San Francisco, CA
03.2020 - 12.2020
Environment: Informatica 10.2, Oracle 12c, SQL, T-SQL, PL/SQL, Toad, SQL Loader, Salesforce, Qlik (reporting tool), ServiceNow (ticketing tool), JIRA
Responsibilities
Performed end-to-end (E2E) testing of ETL functionality
Involved in the installation, configuration, and upgrade of Informatica 10.1 to the 10.2 hotfix
Created Replication and Synchronization tasks through IICS and tested that all transformations worked as expected.
Sr. ETL Informatica Developer
Holland America/Princess Cruises
Santa Clarita, CA
03.2018 - 03.2020
Coordinated between onsite and offshore teams
Performed hands-on business analysis and requirements gathering
Reviewed the Business Requirement Documents and Functional Specifications
Prepared the test plan from the business requirements and functional specifications
Developed test cases for deployment verification, ETL data validation, cube testing, and report testing
Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformations
Verified that all data was synchronized after troubleshooting, and used SQL to verify/validate test cases
Reviewed Informatica mappings and test cases before delivering to Client
Responsible for requirements/ETL analysis, ETL testing, and design of the flow and logic for the data warehouse project
Wrote several UNIX scripts to invoke data reconciliation
Experienced in writing complex SQL queries for extracting data from multiple tables
Performed testing based on change requests and defect requests
Prepared system test results after test case execution
Performed functional, regression, data integrity, system, and compatibility testing
Extensively executed T-SQL queries to verify successful data transactions and to validate data in the SQL Server database
Extensively used Informatica PowerCenter for the extraction, transformation, and loading process
Used TOAD to perform manual tests on a regular basis
Wrote shell scripts and SQL queries using UNIX and Oracle for this project
Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records
Experienced in writing test cases, test scripts, test plans and execution of test cases reporting and documenting the test results using Mercury Quality Center
Prepared Test status reports for each stage and logged any unresolved issues into Issues log
Used T-SQL for Querying the SQL Server database for data validation
Involved in gathering and analyzing the requirements and preparing business rules
Designed and developed complex mappings using Lookup, Expression, Update, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic
Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository Manager
Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle
Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager
Involved in the data analysis for source and target systems and good understanding of Enterprise Data Warehousing (EDW), staging tables, Dimensions, Facts and Star Schema, Snowflake Schema
Experience in Oracle BI Apps that include OBIEE/Siebel Analytics, ETL Tools and DAC
Developed mappings in Informatica PowerCenter to extract data from OLTP applications such as Oracle EBS, Oracle BRM, and Siebel CRM, and legacy systems such as MSA, and loaded the data into the Oracle data warehouse for reporting in OBIEE
Customized the out-of-the-box ETL, reports, and dashboards in OBI Apps for Siebel and Oracle EBS applications to fulfill the company's reporting needs
Created new table structures and modified existing tables to fit the existing data model
Developed a Python script to initiate a web service call that extracts operational data in XML form and loads it into SQL tables
Extracted data from different databases like Oracle and external source systems like flat files using ETL tool
Collaborated with team members to streamline and continuously improve reporting solutions
Generated ad-hoc visualization reports using Alteryx
Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data
Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 9.5
Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements
Writing complex SQL queries to analyze source data and communicating data quality issues to the business
Involved in Performance Tuning of mappings in Informatica
Good understanding of source to target data mapping and Business rules associated with the ETL processes
Generated ship reports for users on a weekly basis
Design:
Actively participated in business requirements gathering
Key player in generating STTMs (source-to-target mappings)
Documented Technical Design Documents
Process adherence:
Maintained SOX evidence document like Rollout document, Backout document, CRL before requesting for migration to higher environment levels
Designed Test case templates and presented to business before PROD rollouts
Overall SDLC (Software Development Life Cycle) experience:
Gained end-to-end implementation experience during the build of the EIM organization model and the integrated conceptual data model
Exposed to various other technologies like Business Objects, MDM.
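The source-versus-warehouse SQL validation described above (row counts, duplicate identification, missing rows) can be sketched against an in-memory SQLite stand-in; the table and column names are illustrative assumptions, not the project's actual schema.

```python
import sqlite3

# In-memory stand-in for a source system and the warehouse (names illustrative).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_bookings (booking_id INTEGER, amount REAL);
CREATE TABLE dw_bookings  (booking_id INTEGER, amount REAL);
INSERT INTO src_bookings VALUES (1, 100.0), (2, 250.0), (3, 75.0);
-- The warehouse copy has a duplicate of id 2 and is missing id 3.
INSERT INTO dw_bookings VALUES (1, 100.0), (2, 250.0), (2, 250.0);
""")

# Row-count reconciliation between source and warehouse.
src_count = con.execute("SELECT COUNT(*) FROM src_bookings").fetchone()[0]
dw_count = con.execute("SELECT COUNT(*) FROM dw_bookings").fetchone()[0]

# Duplicate records in the warehouse.
dupes = con.execute("""
    SELECT booking_id, COUNT(*) FROM dw_bookings
    GROUP BY booking_id HAVING COUNT(*) > 1
""").fetchall()

# Rows present in the source but absent from the warehouse.
missing = con.execute("""
    SELECT booking_id FROM src_bookings
    EXCEPT SELECT booking_id FROM dw_bookings
""").fetchall()

print(src_count, dw_count, dupes, missing)  # 3 3 [(2, 2)] [(3,)]
```

The same GROUP BY/HAVING and EXCEPT patterns apply unchanged against Oracle or SQL Server (Oracle uses MINUS in place of EXCEPT).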
ETL Informatica Developer
NYPD
New York
10.2017 - 03.2018
Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Web services, DB2 Mainframe, Tidal (scheduler), Cognos, Remedy (ticketing tool), GitHub, HP QC (testing tool)
Responsibilities
Led the analysis, design, development, and implementation of logical data models, physical database objects, and data conversion, integration, and loading processes
Worked with business analysts on requirements gathering and business analysis, and translated the business requirements into technical specifications to build the enterprise data warehouse (EDW)
Worked on an onshore/offshore model working with development team in Bangalore
Designed ETL high level workflows and documented technical design documentation (TDD) before the development of ETL components to load DB2 from Flat Files, Oracle, DB2 systems to build Type 2 EDW using Change data capture
Created stored procedures, views based on project needs
Involved in the migration of maps from IDQ to PowerCenter
Applied the rules and profiled the source and target table's data using IDQ
Developed and coded the real-time and batch-mode loads
Developed standard framework to handle restart ability, auditing, notification alerts during the ETL load process
Used Informatica as ETL tool to pull data from source systems/ files, cleanse, transform and load data into the Teradata using Teradata Utilities
Performed data masking and data subsetting using Informatica TDM
Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder
Performed IDQ development around data profiling, cleansing, parsing, standardization, validation, matching, and data-quality exception monitoring and handling
Involved in performance tuning and optimization of mapping to manage very large volume of data
Prepared technical design/specifications for data Extraction, Transformation and Loading
Worked on Informatica Utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer
Created mapping using various transformations like Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator and Update Strategy, data standardization, address validator etc
Developed complex ETL mappings on the Informatica 10.x platform as part of the Risk data integration efforts
Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables
Implemented error handling for invalid and rejected rows by loading them into error tables
Implemented the Change Data Capture process using Informatica PowerExchange
Extensively worked on the batch framework used to schedule all Informatica jobs
Analyzed the sources, transformed the data, mapped the data, and loaded the data into targets using Informatica PowerCenter Designer
Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer
Worked extensively on Informatica partitioning when dealing with huge volumes of data, and partitioned the tables in Teradata for optimal performance
Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic
Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings
Used the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems
Used Variables and Parameters in the mappings to pass the values between mappings and sessions
Created Stored Procedures, Functions, Packages and Triggers using PL/SQL
Implemented restart strategy and error handling techniques to recover failed sessions
Used Unix Shell Scripts to automate pre-session and post-session processes
Performed performance tuning to improve data extraction, data processing, and load times
Mapped Dashboard field requirements to Siebel OLTP/OLAP Physical/Business Model
Worked with data modelers to understand financial data model and provided suggestions to the logical and physical data model
Designed presentations based on the test cases and obtained UAT signoffs
Documented test scenarios as a part of Unit testing before requesting for migration to higher environment levels and handled production deployments
Recorded defects as a part of Defect tracker during SIT and UAT
Identified performance bottlenecks and suggested improvements
Performed unit testing of the developed jobs to ensure they met the requirements
Prepared/reviewed the technical design documentation (TDD) before the development of ETL and BOBj components
Collaborated with BI and BO teams to observe how reports are affected by a change to the corporate data model
Scheduled the jobs using Tidal
Used HP QC to track defects
Handled major Production GO-LIVE and User acceptance test activities
Created architecture diagrams for the project based on industry standards
Defined escalation process metrics for any aborts and met SLAs for production support tickets
Handled Production issues and monitored Informatica workflows in production.
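The SCD Type 2 time-stamping loads described above can be sketched as follows; the key and attribute names are illustrative assumptions, and a production version would run as an Informatica mapping or set-based SQL rather than row-by-row Python.

```python
from datetime import date

def scd2_merge(dimension, incoming, load_date):
    """Type 2 merge: expire changed rows and insert new open-ended versions."""
    current = {r["key"]: r for r in dimension if r["end_date"] is None}
    for rec in incoming:
        old = current.get(rec["key"])
        if old is None:
            # Brand-new key: insert an open-ended row.
            dimension.append({**rec, "start_date": load_date, "end_date": None})
        elif old["attr"] != rec["attr"]:
            # Changed attribute: time-stamp the old version, open a new one.
            old["end_date"] = load_date
            dimension.append({**rec, "start_date": load_date, "end_date": None})
    return dimension

dim = [{"key": "C1", "attr": "Bronx", "start_date": date(2017, 1, 1), "end_date": None}]
dim = scd2_merge(dim, [{"key": "C1", "attr": "Queens"}, {"key": "C2", "attr": "Harlem"}],
                 date(2018, 1, 5))
print(len(dim))  # 3: expired C1, new C1, new C2
```

An SCD Type 1 variant would simply overwrite `old["attr"]` in place instead of expiring and re-inserting.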
ETL Technical Developer
State Street
New York, NJ
01.2015 - 10.2017
Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Web services, DB2 Mainframe, Tidal (scheduler), Cognos, Remedy (ticketing tool), HP QC (testing tool)
Responsibilities
Developed ETL programs using Informatica to implement the business requirements
Communicated with business customers to discuss the issues and requirements
Created shell scripts to fine tune the ETL flow of the Informatica workflows
Used Informatica file-watch events to poll the FTP sites for the external mainframe files
Provided production support to resolve ongoing issues and troubleshoot problems
Performance tuning was done at the functional level and map level
Used relational SQL wherever possible to minimize the data transfer over the network
Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections
Involved in the migration of data from PeopleSoft Financials using Informatica PowerCenter; scheduled the loads through the Data Warehouse Administration Console (DAC) for metadata management and scheduling
Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements
Effectively worked in Informatica version based environment and used deployment groups to migrate the objects
Effectively worked on Onsite and Offshore work model
Pre and post session assignment variables were used to pass the variable values from one session to other
Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs
Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting
Performed unit testing at various levels of the ETL and actively involved in team code reviews
Identified problems in existing production data and developed one-time scripts to correct them
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment
Created new mapping designs using various tools in Informatica Designer, including Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer
Developed the mappings using the needed transformations in Informatica according to technical specifications
Created complex mappings that involved implementation of Business Logic to load data in to staging area
Used Informatica reusability at various levels of development
Developed mappings/sessions using Informatica Power Center 8.6 for data loading
Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union
Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor
Built reports according to user requirements
Extracted data from Oracle and SQL Server then used Teradata for data warehousing
Implemented slowly changing dimension methodology for accessing the full history of accounts
Wrote shell scripts to run workflows in the UNIX environment
Performed performance tuning at the source, target, mapping, and session levels
Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings
Performed AS-IS analysis on the current stored procedure code of HDFC
Designed the STTM
Prepared the technical specifications required for ETL development
Extensively developed mappings, sessions, worklets, and workflows using Informatica 8.6.1
Consolidated the code and tested it on the onsite server
Involved in performance tuning of stored procedures
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
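The rejected-row handling used in these loads (diverting invalid rows into error tables instead of failing the load) can be sketched as follows; the validation rules and column names are illustrative assumptions, not the project's actual rules.

```python
def route_rows(rows):
    """Split a batch into loadable rows and rows destined for an error table."""
    valid, errors = [], []
    for row in rows:
        if row.get("account_id") and isinstance(row.get("amount"), (int, float)):
            valid.append(row)
        else:
            # Tag the rejected row with a reason before writing it to the error table.
            errors.append({**row, "error_reason": "missing account_id or non-numeric amount"})
    return valid, errors

batch = [
    {"account_id": "A1", "amount": 10.5},
    {"account_id": None, "amount": 7.0},     # rejected: missing key
    {"account_id": "A3", "amount": "oops"},  # rejected: bad amount
]
valid, errors = route_rows(batch)
print(len(valid), len(errors))  # 1 2
```

Keeping the reason column on each error row is what makes the error table usable for reprocessing after the source data is corrected.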
ETL Developer
IBM
United Kingdom
04.2009 - 10.2011
Environment: Informatica Power Center 9.1, Informatica MDM 9.1, XML, Oracle, PL/SQL, DB2, RALLY, HP Quality Center, Informatica Power Exchange 9.1, JIRA
Responsibilities
Performed requirements analysis and prepared business rules
Used the Hierarchy Manager tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD
Implemented IDD applications and created subject area groups, subject areas, subject area children, and IDD display packages in the hub
Documented unit test cases and provided QA support to help testers understand the business rules and code implemented in Informatica
Responsible for investigation, characterization, and communication of build and release problems, implementing corrective and preventive actions
Resolved all issues raised via JIRA tickets on a priority basis
Analyzed change requests (CRs) per requests from TeamTrack/JIRA
Created SRs with Informatica Inc. for any PowerCenter product issues
Defined and configured the schema, staging tables, landing tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries
Raised change requests, handled incident management, analyzed and coordinated the resolution of program flaws in the development environment, and hot-fixed them in the QA, pre-prod, and prod environments during the runs using the JIRA ticketing system
Worked extensively on different types of transformations like Source qualifier, expression, Aggregator, Router, filter, update strategy, lookup, sorter, Normalizer, sequence generator, etc
Worked with XSD and XML files generation through ETL process
Defined and worked with mapping parameters and variables
Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool
Performed the performance evaluation of the ETL for full load cycle
Checked sessions and error logs to troubleshoot problems, and used the debugger for complex mappings
Parameterized all variables and connections at all levels in UNIX
Used cleanse functions to cleanse and standardize data while loading into stage tables
Enabled delta detection to extract the Incremental Data
Defined the Systems, Trust Scores and Validation rules
Created the Match/Merge rule sets to get the right master records
Performed data steward operations by editing the records using data manager and merge manager
Imported and exported the ORS using Metadata Manager
Configured Address Doctor to cleanse customer address data.
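The delta detection enabled above (extracting only incremental data) can be sketched as a hash comparison against the previous run's snapshot; the column names and the MD5-over-concatenated-values scheme are illustrative assumptions, not MDM's internal mechanism.

```python
import hashlib

def record_hash(rec):
    """Stable hash of a record's column values (columns sorted by name)."""
    return hashlib.md5("|".join(str(rec[c]) for c in sorted(rec)).encode()).hexdigest()

def detect_delta(prev_hashes, current_records):
    """Return new/changed records plus the refreshed hash snapshot."""
    new_hashes, delta = {}, []
    for rec in current_records:
        h = record_hash(rec)
        new_hashes[rec["id"]] = h
        if prev_hashes.get(rec["id"]) != h:
            delta.append(rec)  # unseen id, or same id with changed values
    return delta, new_hashes

# Previous run saw only record 1; the current extract adds record 2 unchanged record 1.
prev = {1: record_hash({"id": 1, "city": "London"})}
current = [{"id": 1, "city": "London"}, {"id": 2, "city": "Leeds"}]
delta, snapshot = detect_delta(prev, current)
print([r["id"] for r in delta])  # [2]
```

Only the new record is extracted; unchanged records are skipped, which is the point of delta detection on large staging loads.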
Senior ETL Developer
IBM
India
06.2008 - 04.2009
Environment: Informatica 8.1.6, Oracle, SQL Server 2005, HP QC, Control-M, JIRA
Responsibilities
Analysis and design of warehouse workflow from the existing Stored Procedures
Prepared the mapping diagrams and technical specifications required for ETL development
Worked with Informatica and UNIX
Performed AS-IS analysis on the current stored procedures
Designed the STTM
Extensively developed mappings, sessions, worklets, and workflows using Informatica 8.6.1
Consolidated the code and tested it on the onsite server
Involved in performance tuning of stored procedures
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis
Used Control-M for scheduling jobs
Assisted the QA/UAT cycle by resolving defects quickly
Wrote configuration files for performance in the production environment.