12+ years of professional experience in the IT industry, involved in various data warehouse projects, with strong expertise in tools such as Informatica PWX, IDQ, IICS, IDMC and AWS Redshift, and extensive use of the ETL and reporting tools Tableau and Power BI.
➢ Working on health insurance data received from the client and creating mappings and workflows.
➢ Created mappings as per the requirements to load data into the SQL Server Tables.
➢ Used different types of transformations to build the required logic.
➢ Created tables, keys (unique and primary) and indexes in SQL Server.
➢ Tuning of mappings and SQL scripts for better performance.
➢ Using Informatica as the primary ETL tool, along with other tools and the SQL Server interface, to work efficiently in a very large data warehouse.
➢ Testing ETL applications following all ETL standards and architecture
➢ Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, reporting and monitoring capabilities.
➢ Created a batch control process to debug and run the workflows in sequential order while skipping sessions that have already run (a minimal sketch follows this section).
➢ Worked on scheduling jobs in Control-M to schedule the workflows and manage their dependencies.
Tools: Informatica PowerCenter 10.1, IDQ 9.6.1, Control-M, SQL Server, Visual Studio.
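The batch-control idea above can be sketched roughly as follows. This is an illustrative Python driver, not the actual project code: the workflow names, folder, service and pmcmd arguments are assumptions.

    # Illustrative batch-control driver: runs workflows in a fixed order via
    # pmcmd and skips any workflow already recorded in a control file, so a
    # failed batch can be restarted without re-running completed sessions.
    import subprocess
    from pathlib import Path

    CONTROL_FILE = Path("/tmp/batch_control.done")                       # assumed control file
    WORKFLOWS = ["wf_stage_members", "wf_load_claims", "wf_load_facts"]  # hypothetical names

    def already_done() -> set:
        return set(CONTROL_FILE.read_text().split()) if CONTROL_FILE.exists() else set()

    def mark_done(workflow: str) -> None:
        with CONTROL_FILE.open("a") as f:
            f.write(workflow + "\n")

    def run_workflow(workflow: str) -> None:
        # pmcmd arguments differ per environment; these are placeholders only.
        cmd = ["pmcmd", "startworkflow", "-sv", "INT_SVC", "-d", "DOMAIN",
               "-u", "infa_user", "-p", "secret", "-f", "HealthIns", "-wait", workflow]
        subprocess.run(cmd, check=True)        # raise if the workflow fails

    if __name__ == "__main__":
        done = already_done()
        for wf in WORKFLOWS:
            if wf in done:
                continue                       # already ran in a previous batch
            run_workflow(wf)
            mark_done(wf)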
➢ Knowledge of full life cycle data warehouse development in Agile.
➢ ETL Team Leader.
➢ Have worked extensively with Informatica PowerCenter 10.x/9.x/8.x and on-premise data integration platforms.
➢ IICS (Informatica Cloud) Data Integration, App & API Integration.
➢ Have worked extensively on designing and developing ETL programs supporting data extraction, transformation and loading using Informatica PowerCenter, IDQ, IICS and IDMC.
➢ Data Architect for the feature team, actively engaged in EPIC exploration, sprint refinement, sprint planning, daily scrum and retrospective meetings.
➢ Data cleansing using Informatica IDQ features such as Address Doctor, CAMEO, etc.
➢ Experience in Informatica IICS and knowledge of cloud ETL.
➢ Experience with dimensional modeling using star schema and snowflake models.
➢ Proficiency in developing SQL with various relational databases like Amazon Redshift, Oracle, Teradata.
➢ Understand business rules thoroughly from high-level design specifications and implement the data transformation methodologies.
➢ Strong grasp of relational database design concepts.
➢ Developed several reusable transformations and mapplets that were used in other mappings.
➢ Experienced in performance tuning of Informatica sources, mappings, targets and sessions, using the pushdown optimization technique and SQL query tuning.
➢ Independently perform complex troubleshooting, root-cause analysis and solution development.
➢ Able to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities, flexible work schedules and good communication skills.
➢ Hands on experience in preparing Test plans, Test cases, Automated Tests and Test Data and executing
the same.
➢ Expertise in analyzing test results and reporting suggestions and defects.
➢ Good exposure to the bug life cycle.
➢ Team player, Motivated, able to grasp things quickly with analytical and problem-solving skills.
➢ Comprehensive technical, oral and written communication skills.
➢ Experience with Agile project management using tools such as ATC JIRA, Confluence and Bitbucket.
➢ Deploy and maintain CI/CD pipelines across multiple environments.
➢ A drive towards automating repetitive tasks using scripting via Bash or Python.
➢ Version control experience with Git.
➢ Knowledge of Python coding for API development using the AWS SAM CLI (a minimal handler is sketched below).
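A minimal sketch of the kind of Python API handler that might be packaged with the AWS SAM CLI; the function and request fields are hypothetical, and deployment would typically go through sam build and sam deploy.

    # Minimal Lambda handler behind an API Gateway endpoint, as packaged by SAM.
    import json

    def lambda_handler(event, context):
        # With the API Gateway proxy integration, the HTTP body arrives as a string.
        body = json.loads(event.get("body") or "{}")
        name = body.get("name", "world")       # hypothetical request field
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"hello, {name}"}),
        }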
➢ Attending client meetings for requirement gathering and ETL design.
➢ Have worked as a Data Architect in agile feature team
➢ Designed and developed ETL jobs to extract data from CRM and load it into the EDW on Redshift.
➢ Wrote various data normalization jobs for new data ingested into Redshift.
➢ Optimizing and tuning Informatica mappings and the Redshift environment, enabling mappings/queries to perform up to 100x faster for end-user tools such as Adobe Campaign, Tableau reporting and RStudio analytics.
➢ Coordinating with the offshore team on deliverables.
➢ Developing/Designing ETL Informatica processes/mappings.
➢ Tuning ETL/Informatica mappings/processes.
➢ Best practices for developing ETL jobs.
➢ UNIX Shell Scripting and ETL automation.
➢ Tuning of complex SQL Queries/Processes.
➢ SCD 1, 2, 3 and Change Data Capture (CDC).
➢ Resolving production ETL failures.
➢ Accomplished several other development activities, such as performance tuning of ETL loads, troubleshooting Informatica mappings using the Mapping Debugger, and deploying code to higher environments.
➢ Worked on encryption for PII (secure/sensitive) data.
➢ Collaborated with and supported various other project teams on their Informatica migrations.
➢ Managed coordination between onshore and offshore teams for effective project deliverables.
➢ API development using AWS SAM CLI
➢ Using Python programming for rule validation per business logic.
➢ Converting JSON to structured data and loading it into AWS Redshift using Python scripts (see the sketch after this list).
➢ Built several error handling processes to track rejected records.
➢ Built several audit tables for tracking the daily running jobs.
➢ Troubleshooting ETL code using Informatica debugger options.
➢ Identifying duplicate customers using the Informatica IDQ match transformation.
➢ Enabling KMS encryption in AWS cloud to achieve data privacy and data governance.
➢ Develop business-critical Informatica entities using Informatica Intelligent Cloud Services (IICS/IDMC): Cloud Application Integration (CAI) and Cloud Data Integration (CDI).
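The JSON-to-Redshift pattern mentioned above can be illustrated with a small Python sketch; the bucket, IAM role, table and column names are placeholders, and the rule check stands in for the actual business validation.

    # Flatten validated JSON records to CSV, stage the file in S3, and COPY it
    # into Redshift. All identifiers below are illustrative.
    import csv
    import json
    import boto3
    import psycopg2

    def json_to_csv(json_path: str, csv_path: str) -> None:
        with open(json_path) as f:
            records = json.load(f)                       # expected: list of flat dicts
        with open(csv_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["customer_id", "name", "amount"])
            writer.writeheader()
            for rec in records:
                if rec.get("amount", 0) < 0:             # stand-in business rule
                    continue                             # rejected records handled elsewhere
                writer.writerow({k: rec.get(k) for k in writer.fieldnames})

    def copy_to_redshift(csv_path: str) -> None:
        boto3.client("s3").upload_file(csv_path, "my-etl-bucket", "staging/data.csv")
        conn = psycopg2.connect(host="redshift-host", port=5439, dbname="edw",
                                user="etl_user", password="secret")
        with conn, conn.cursor() as cur:
            cur.execute("""
                COPY staging.customer_txn
                FROM 's3://my-etl-bucket/staging/data.csv'
                IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
                FORMAT AS CSV IGNOREHEADER 1;
            """)
        conn.close()

    if __name__ == "__main__":
        json_to_csv("input.json", "staged.csv")
        copy_to_redshift("staged.csv")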
➢ Daily meetings with BMW business stakeholders to gather requirements and understand their needs.
➢ Creating ETL design documents from the business requirements.
➢ Getting the development done by the offshore team.
➢ Managed coordination between onshore and offshore teams for effective project deliverables.
➢ Accomplished several other development activities, such as performance tuning of ETL loads, troubleshooting Informatica mappings using the Mapping Debugger, and deploying code to higher environments.
➢ Worked on encryption for PII (secure/sensitive) data.
➢ Developed BigQuery queries to extract and deliver meaningful insights to
stakeholders.
➢ Implemented ETL processes to streamline the import of data from various sources into the BigQuery warehouse.
➢ Built datasets and tables in BigQuery and loaded data from Cloud Storage (see the sketch after this list).
➢ Using Python programming for rule validation per business logic.
➢ Converting JSON to structured data and loading it into AWS Redshift using Python scripts.
➢ Built several error handling processes to track rejected records.
➢ Built several audit tables for tracking the daily running jobs.
➢ Troubleshooting ETL code using Informatica debugger options.
➢ Identifying duplicate customers using the Informatica IDQ match transformation.
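A rough sketch of the BigQuery loading and querying described above, using the google-cloud-bigquery client; the project, dataset, bucket and columns are placeholders, not actual project values.

    # Load a CSV file from Cloud Storage into a BigQuery table, then run a
    # simple insight query over it. All names are illustrative.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")
    table_id = "my-gcp-project.analytics.daily_sales"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,                       # infer the schema for this sketch
        write_disposition="WRITE_APPEND",
    )
    load_job = client.load_table_from_uri(
        "gs://my-etl-bucket/exports/daily_sales.csv", table_id, job_config=job_config
    )
    load_job.result()                          # block until the load finishes

    rows = client.query(
        f"SELECT region, SUM(amount) AS total FROM `{table_id}` GROUP BY region"
    ).result()
    for row in rows:
        print(row.region, row.total)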
➢ Extract, transform and load data from different source systems (Teradata, Salesforce.com, SQL Server, Oracle) to target systems using Informatica, including the FastLoad and MultiLoad utilities. Extracted source data from flat files and Oracle and loaded it into Oracle.
➢ Developed mappings in Informatica PowerCenter Designer to load data from the staging environment to the warehouse. Fixed many issues during on-call production support and implemented permanent fixes for recurring issues.
➢ Created mappings using the transformations such as the Source qualifier,
Aggregator, Expression, Router, Filter, Sequence generator and Update Strategy.
➢ Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
➢ Involved in creation of mappings, sessions, command task and workflows.
➢ Developed various mappings combining sources and targets from legacy data systems with the required transformations.
➢ Tested the ETL jobs end to end and fixed discrepancies. Worked as a Configuration Controller.
➢ Deployed workflows across domains using Informatica deployment and XML export/import.
➢ The mainframe plays the main role in loading data from the legacy system to Teradata (using the BTEQ utility).
➢ Data loading using Teradata utilities such as FastLoad, MultiLoad, BTEQ and FastExport (an illustrative wrapper is sketched after this list).
➢ Developed mappings to fetch data from SAP BW to Teradata using the ABAP code method and the Open Hub (OH) extraction method. Have knowledge of creating SAP connections in Informatica.
➢ Implemented the file mode and stream mode methods used to fetch records from SAP BW using ABAP.
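The Teradata loads above were driven by the native utilities; the sketch below only illustrates invoking a small BTEQ script from Python, with placeholder host, credentials and table names.

    # Feed an inline BTEQ script to the bteq utility and check its return code.
    import subprocess

    BTEQ_SCRIPT = """
    .LOGON tdprod/etl_user,secret;
    INSERT INTO edw.stg_customer
    SELECT * FROM legacy.customer_extract;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    """

    result = subprocess.run(["bteq"], input=BTEQ_SCRIPT, text=True, capture_output=True)
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError("BTEQ load failed")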
➢ Strong hands-on experience with Teradata utilities such as BTEQ, FLOAD, MLOAD and TPUMP, and analysis using SQL SERVER and TERADATA (V13, V14, V15.10, V16.20).
➢ Experience in Query Optimization and Performance Tuning of Stored Procedures and
Functions.
➢ Proficient in data warehousing techniques such as data cleansing, Slowly Changing Dimensions (SCDs) and Change Data Capture (CDC).
➢ Experience in integration of various data sources from Databases like Teradata, Oracle,
SQL Server, and text files.
➢ Developed complex Teradata SQL code in BTEQ script using OLAP and Aggregate
functions.
➢ Excellent knowledge and experience with documents such as BSDs, TSDs and mapping documents.
➢ Extensively involved in identifying performance bottlenecks in targets, sources and
transformations and successfully tuned them for maximum performance using best
practices.
➢ Experience in documenting Design specs, Unit test plan and deployment plan.
➢ Extensive knowledge of Teradata SQL Assistant. Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements. Tuned existing BTEQ scripts to enhance performance.
➢ Expert in Teradata RDBMS: initial Teradata DBMS environment setup, development and production DBA support, and use of the FASTLOAD, MULTILOAD, TPUMP, Teradata SQL and BTEQ utilities.
➢ Created shell scripts to run DataStage jobs from UNIX and scheduled these scripts through a scheduling tool.
➢ Experience in writing UNIX Korn shell scripts to support and automate the ETL process
➢ Work on planning and grooming user requirements to provide solution designs using OLAP architecture, and create estimates for end-to-end development.
➢ Experience with Data Warehouse concepts, and methodologies, as well as strong
knowledge with star schema and snowflake schema data models.
➢ Knowledge of query performance tuning using Explain, Collect Statistics, Compression and Join Indexes, including Sparse Join Indexes (a tuning sketch follows below).
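The Explain/Collect Statistics tuning mentioned above could look roughly like this with the teradatasql Python driver; the connection details, tables and columns are placeholders.

    # Refresh statistics on the join columns, then inspect the optimizer plan.
    import teradatasql

    QUERY = """
    SELECT c.region, SUM(f.sales_amt) AS total_sales
    FROM edw.fact_sales f
    JOIN edw.dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.region
    """

    con = teradatasql.connect(host="tdprod", user="etl_user", password="secret")
    cur = con.cursor()
    cur.execute("COLLECT STATISTICS ON edw.fact_sales COLUMN (customer_key)")
    cur.execute("COLLECT STATISTICS ON edw.dim_customer COLUMN (customer_key)")
    cur.execute("EXPLAIN " + QUERY)
    for (line,) in cur.fetchall():             # each row is one line of the plan
        print(line)
    con.close()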
➢ Design and document ETL solutions for projects, estimate work effort and provide status of tasks.
➢ Provide technical support for the administration and production support of Informatica.
➢ Provide first and second level troubleshooting, and baseline and operational support.
➢ Adhere to best practices and methodologies for ETL development team.
➢ Present all work in design and code review sessions to senior team members.
➢ Prepare documents that describe and support the code developed.
➢ Identify improvements to the data warehouse environment. Work with the team to implement these
solutions.
➢ Escalate issues to the right party when required.
➢ Development of new Load programs for ETL.
➢ Error handling within mappings, depending on requirements.
➢ Creation of Mappings, Sessions and Workflows for the ETL requirements requested by the clients.
➢ Testing for the ETL requirements
➢ Creating JIRA tickets for INFA code and DB Code
➢ Labeling the Informatica components and checking the SQL scripts (DM sheet, stored procedures) into SVN.
➢ Building the Informatica components in Jenkins.
➢ Deploying to downstream environments.
➢ Develop new ETL and enhancements and modify existing code using Informatica PowerCenter.
➢ Work independently to develop, configure, and unit test programs from specs (source to target
mappings).
➢ Worked on scheduling jobs in Control-M to schedule the workflows and manage their dependencies.
➢ Work closely with data architects, reporting team and application team on ETL development efforts.
➢ Designing the requirements.
➢ Development of new fact and dimension tables.
➢ Development of new Load programs for ETL.
➢ Creation of Mappings, Sessions and Workflows for the ETL requirements requested by the clients.
➢ Developing shell scripts for FTP/SFTP transfers with other source systems (a rough Python equivalent is sketched after this list).
➢ Testing for the ETL requirements and DWH
➢ Deploying the developed Informatica objects, scripts and tables.
➢ Modifying existing code with error handling logic as per new design
➢ Modifying session level properties to achieve performance at session level.
➢ Implementing pushdown optimization for selected mappings.
➢ Implementing partitioning in Informatica mappings.
➢ Generating and modifying ABAP code for SAP ECC source type.
➢ Performance tuning at the target table level (Netezza).
➢ Validation scripts for file-based source systems.
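The FTP/SFTP transfers above were implemented as UNIX shell scripts; as a rough Python equivalent using paramiko (host, credentials and paths are placeholders):

    # Pull a daily extract file from a partner SFTP server.
    import paramiko

    def fetch_source_file(remote_path: str, local_path: str) -> None:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect("sftp.partner.example.com", username="etl_user", password="secret")
        try:
            sftp = client.open_sftp()
            sftp.get(remote_path, local_path)          # download the source file
            sftp.close()
        finally:
            client.close()

    if __name__ == "__main__":
        fetch_source_file("/outbound/daily_extract.csv", "/data/inbound/daily_extract.csv")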
➢ Development of new Load programs for ETL.
➢ Creation of Mappings, Sessions and Workflows for the ETL requirements requested by the clients.
➢ Involved in requirement analysis, implementation, testing for the ETL requirements.
➢ Performance tuning on long running sessions.
➢ Modifying existing code with error handling logic as per new design
➢ Modifying session level properties to achieve performance at session level.
➢ Implementing pushdown optimization for selected mappings.
➢ Implementing partitioning in Informatica mappings.
➢ Generating and modifying ABAP code for SAP ECC source type.
➢ Performance tuning at database level (Teradata)
➢ Creating reporting queries in OBIEE.
➢ Creating OBIEE queries in the physical, business and presentation layers.
ETL Tools: Informatica PowerCenter, Informatica IDQ, Informatica IICS, Informatica IDMC, Informatica Admin
Cloud: AWS Cloud, AWS Redshift, AWS S3, Snowflake, etc.
Languages/Scripting: SQL/PLSQL, Shell Scripting, Python
Databases: PostgreSQL, Teradata, Netezza, Oracle, SQL Server, GCP BigQuery query development
Reporting: Tableau, Power BI
AWS Certified Cloud Practitioner
Issuing Organization: Amazon Web Services (AWS)
Status: Active until Feb 2026