Abdul M Shaik

Newark, DE

Summary

  • 9+ years of experience in ETL architecture, design, development, testing, support, and implementation of enterprise data warehouses (EDW), operational data stores (ODS), data marts, and decision support systems (DSS); built facts and slowly changing dimensions (SCD Type I and Type II) using MD5 hash-based change detection (a sketch follows this list); worked with OLTP and OLAP systems, loading data between sources and targets such as Oracle, MS SQL Server, COBOL VSAM files, Excel, fixed-width files, and flat files.
  • Experienced in implementing change data capture (CDC) using Informatica, and in working within the software development life cycle (SDLC) and CI/CD environments.
  • Excellent interpersonal and communication skills, working with senior-level managers and business stakeholders.
  • Designed and developed mappings with optimal performance using Aggregator, Sorter, Rank, Joiner, Router, Filter, Expression, Stored Procedure, Normalizer, Source Qualifier, connected and unconnected Lookup (cached and uncached), B2B, Sequence Generator, Update Strategy, and Union transformations, along with source/target pre- and post-load SQL.
  • Extensively implemented incremental/delta loads using mapping variables, mapping parameters, and parameter tables.
  • Wrote ETL code to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
  • Provided daily status reports for all Informatica applications to the customer; monitored and tracked critical daily applications and code migrations during deployments.
  • Designed and developed mappings to implement full and incremental loads from source systems.
  • Migrated ETL code, database code changes, and script changes to higher environments; supported the code in production and QA.
  • Identified bottlenecks and fixed performance issues.
  • Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance-tuning mappings and sessions.
  • Extensive experience writing UNIX shell scripts for repository backups and ETL process automation, including scripts that deliver reports to clients over FTP/SFTP and generate log files that preserve the transfer history.
  • Analytical individual with a deep understanding of ETL processes and expertise in data mapping and transformation using Informatica; dedicated to delivering high-quality solutions that drive operational efficiency and data accuracy.
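
A minimal sketch of the MD5 hash-based change detection mentioned in the first bullet above. This is illustrative, not project code: the column names and helper functions are hypothetical. Tracked attributes are concatenated with a delimiter and hashed; comparing the hash of an incoming row with the hash stored on the current dimension row decides whether a new SCD Type 2 version is needed.

```python
import hashlib

def compute_row_hash(row: dict, tracked_columns: list[str]) -> str:
    """MD5 over the delimited concatenation of tracked attribute values.
    The delimiter prevents ('ab', 'c') and ('a', 'bc') from colliding."""
    payload = "|".join(str(row.get(col, "")) for col in tracked_columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def needs_new_version(incoming: dict, stored_hash: str, tracked_columns: list[str]) -> bool:
    """True when the incoming record differs from the current dimension row."""
    return compute_row_hash(incoming, tracked_columns) != stored_hash

# Hypothetical usage: an address change produces a different hash,
# signalling that the current row should be expired and a new version inserted.
tracked = ["customer_name", "address", "status"]
old = {"customer_name": "Acme", "address": "9 Oak Ave", "status": "ACTIVE"}
new = {"customer_name": "Acme", "address": "12 Main St", "status": "ACTIVE"}
print(needs_new_version(new, compute_row_hash(old, tracked), tracked))  # True
```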

Overview

10
years of professional experience

Work History

Senior ETL Informatica/IICS Developer

Tata Consultancy Services
10.2016 - Current


State Farm Insurance Feb 2022 – Present


The scope of the project is to integrate the company's new cloud applications into the existing enterprise data warehouse to support the needs of the Finance business. There are two main data warehousing systems: the Cloud Data Warehouse (CDW) and the Enterprise Data Warehouse (EDW).

Responsibilities:

· Worked with IT architects and program managers on requirements gathering, analysis, and project coordination.

· Analyzed the existing ETL data warehouse processes and ERP/non-ERP application interfaces, and created design specifications for the new target cloud data warehouse (Azure Synapse) and Data Lake Store.

· Created IICS connections using various cloud connectors in the IICS Administrator.

· Developed Data Integration Platform components/processes using Informatica Cloud Platform, Azure SQL Data Warehouse, Azure Data Lake Store, and Azure Blob Storage technologies.

· Installed and configured the Windows Secure Agent and registered it with the IICS org.

· Integrated data from on-premises and cloud databases, applications, and file systems using Informatica Intelligent Cloud Services.

· Worked with cloud-based database solutions including Azure Synapse, Azure Data Lake, Azure Data Factory, AWS Redshift, S3, and Snowflake.

· Performed loads into a Snowflake instance using the Snowflake connector in IICS for a separate project supporting data analytics and insight use cases for the Sales team.

· Extensively used cloud connectors, including Azure Synapse (SQL DW) and Azure Data Lake Store V3.

· Developed parameterized cloud-integration mapping templates (database and table object parameterization) for Stage, Dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and Fact load processes.

· Extensively used parameters (input and in-out parameters), expression macros, and source partitioning.

· Extensively used the Pushdown Optimization option to push processing into Azure Synapse and exploit its scale.

· Created Python scripts to start and stop cloud tasks (a hedged sketch follows this list).

· Developed mass ingestion tasks to ingest large datasets from on-prem to Azure Data Lake Store (file ingestion).

· Experience in AWS cloud services like S3 and Redshift.

· Worked with various session properties to extract data from Salesforce objects using the standard and Bulk APIs.
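
A minimal sketch of the kind of Python start/stop script mentioned in the list above, written against the Informatica Cloud (IICS) REST API v2. The pod URL, credentials, and task ID are placeholders, and the exact endpoints and payloads should be confirmed against the IICS REST API documentation for the org in question.

```python
import requests

# Pod-specific login endpoint; placeholder, varies by IICS pod/region.
LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def login(username: str, password: str) -> tuple[str, str]:
    """Authenticate and return (serverUrl, icSessionId) for subsequent calls."""
    resp = requests.post(LOGIN_URL, json={"@type": "login",
                                          "username": username,
                                          "password": password})
    resp.raise_for_status()
    body = resp.json()
    return body["serverUrl"], body["icSessionId"]

def start_task(server_url: str, session_id: str, task_id: str,
               task_type: str = "MTT") -> dict:
    """Start a cloud task ("MTT" = mapping task). Stopping a running job uses
    a similar job resource; check the API docs for the supported actions."""
    resp = requests.post(f"{server_url}/api/v2/job",
                         headers={"icSessionId": session_id,
                                  "Accept": "application/json"},
                         json={"@type": "job", "taskId": task_id,
                               "taskType": task_type})
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    server, session = login("svc_etl_user", "********")    # placeholders
    print(start_task(server, session, "0123456789ABCDEF"))  # hypothetical task id
```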


Huntsman Chemicals, Woodlands, TX July 2020–Jan 2022

Senior ETL Informatica Developer

The primary objective of this project is to provide improved levels of service delivery to customers through effective analysis of data collected from different sources. The secondary objective is to improve the management reporting and analysis process by providing a multi-dimensional analysis capability to help monitor the key business parameters for the Customer Service Division.

Responsibilities:

· Interacted with technical, functional, and business audiences across different phases of the project life cycle.

· Acted as a liaison with internal and external customers to research, analyze and propose solutions to technical, operational and test scenarios.

· Developed mappings for fact and dimension tables, using various transformations to extract data from different source databases and files.

· Upgraded Informatica from 10.4.x to 10.5.x on Linux servers for Dev/Test and Prod environments.

· Created workflows using tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, and Assignment; used worklets; and scheduled the workflows.

· Responsible for creating scalable, multi-threaded ETL framework to run batch jobs to move data from staging/landing zone to Target tables.

· Designed and developed complex error- and exception-handling routines at both the application and database level using shell scripts.

· Created Schema objects like Indexes, Views, and Sequences.

· Developed PL/SQL triggers and master tables for automatic creation of primary keys.

· Extensively used BULK COLLECT in PL/SQL objects to improve performance.

· Used bulk collections for better performance and easier data retrieval by reducing context switching between the SQL and PL/SQL engines (a batching sketch follows this list).

· Extensive experience developing stored procedures, functions, packages, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.

· Involved in data loading and building data migration scripts using PL/SQL, SQL*Loader, and UTL packages based on file formats, calling UNIX scripts to download and manipulate files.
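
The bulk-collection bullets above are PL/SQL-specific (BULK COLLECT ... LIMIT with FORALL). As a rough Python analogue of the same batching idea, the sketch below fetches and writes rows in fixed-size arrays instead of one at a time, cutting round trips between client and database. It assumes the python-oracledb driver and hypothetical staging/target tables; it is not the project's actual code.

```python
import oracledb  # assumption: python-oracledb driver is installed

BATCH_SIZE = 1000

def copy_in_batches(dsn: str, user: str, pwd: str) -> None:
    """Move rows from a staging table to a target table in fixed-size
    batches, mirroring the PL/SQL BULK COLLECT ... LIMIT / FORALL pattern."""
    with oracledb.connect(user=user, password=pwd, dsn=dsn) as conn:
        read = conn.cursor()
        write = conn.cursor()
        read.arraysize = BATCH_SIZE  # fetch in arrays, not row by row
        read.execute("SELECT id, name, amount FROM stg_orders")  # hypothetical
        while True:
            rows = read.fetchmany(BATCH_SIZE)
            if not rows:
                break
            # executemany binds the whole batch in a single round trip
            write.executemany(
                "INSERT INTO dw_orders (id, name, amount) VALUES (:1, :2, :3)",
                rows)
        conn.commit()
```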

Bank of the West, San Ramon, CA Feb 2018 - June 2020

Senior Informatica Developer

The main objective of the project was to develop an enterprise data warehouse for reporting and analysis. Matrix extracts data from different source systems, applies business rules, and loads the data into the warehouse.

Responsibilities:

· Actively involved in the Design and development of the STAR schema data model.

· Implemented slowly changing and rapidly changing dimension methodologies; created aggregate fact tables for the creation of ad-hoc reports.

· Created and maintained surrogate keys on the master tables to handle SCD Type 2 changes effectively (a sketch follows this list).

· Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.

· Used various lookup caches like Static, Dynamic, Persistent and Non-Persistent in Lookup transformation.

· Involved in debugging mappings, recovering sessions and developing error-handling methods.

· Successfully migrated objects to the production environment while providing both technical and functional support.

· Used PowerExchange CDC (change data capture) to pull data from Oracle sources.

· Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.

· Installed/configured Teradata Power Connect for Fast Export for Informatica.

· Created jobs in Informatica Data Replication Fast Clone to extract data from Oracle and load it into Teradata.

· Configured the fast export utility of Teradata.

· Used Teradata utilities such as BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD).

· Resolved memory related issues like DTM buffer size, cache size to optimize session runs.

· Performed Loading operation of historical data using full load and incremental load into Enterprise Data Warehouse.
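
The surrogate-key bullet above reduces to "expire the current version, insert a new one with a fresh surrogate key." A minimal, self-contained sketch of that SCD Type 2 bookkeeping, with hypothetical column names and an in-memory stand-in for the dimension table:

```python
from datetime import date
from itertools import count

surrogate_keys = count(1001)  # stand-in for a database sequence

def apply_scd2(dimension: list[dict], incoming: dict,
               business_key: str = "cust_id") -> None:
    """Expire the current row for the business key when a tracked attribute
    changed, then insert a new current row with the next surrogate key."""
    today = date.today()
    current = next((r for r in dimension
                    if r[business_key] == incoming[business_key]
                    and r["is_current"]), None)
    if current is not None:
        if current["address"] == incoming["address"]:
            return  # no change; nothing to version
        current["is_current"] = False
        current["end_date"] = today  # close out the old version
    dimension.append({
        "surrogate_key": next(surrogate_keys),  # fresh key per version
        business_key: incoming[business_key],
        "address": incoming["address"],
        "start_date": today, "end_date": None, "is_current": True,
    })

# Hypothetical usage: the second call expires row 1001 and inserts 1002.
dim: list[dict] = []
apply_scd2(dim, {"cust_id": 7, "address": "12 Main St"})
apply_scd2(dim, {"cust_id": 7, "address": "9 Oak Ave"})
print([(r["surrogate_key"], r["is_current"]) for r in dim])
# [(1001, False), (1002, True)]
```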

Sysco Foods, Houston, TX Oct 2017 – Jan 2018
Senior Informatica IDQ Developer

Sysco is the global leader in selling, marketing and distributing food products to restaurants, healthcare and educational facilities.

IDQ Responsibilities:

  • Created data objects in Informatica Data Quality
  • Created expression rules and ran profiles on the source data.
  • Created scorecards for redundant data and created validation rules using word wrapper in IDQ.
  • Created reference tables in Oracle 19c and used it as database dictionaries in IDQ.
  • Performed data profiling for column-level and cross-table data validation.
  • Parsed the target data in IDQ using parser transformation.
  • Designed, developed, implemented, and maintained Informatica Power Center and IDQ application for matching and merging process.
  • Experienced with DVO, Informatica Data Quality (IDQ) tools for Data Analysis / Data Profiling and Data Governance.
  • Created reference tables, applications, and workflows, and deployed them to the Data Integration Service for workflow execution.
  • Built Informatica Data Quality plans and used pre-built standardized plans to cleanse address data.
  • Exported mapplets from IDQ into Informatica PowerCenter for use in various mappings implementing Address Doctor.
  • Created and verified QualityStage jobs for matching, merging, and de-duplication of data.
  • Identified and eliminated duplicates in datasets using the IDQ Edit Distance, Jaro Distance, and Mixed Field Matcher components (a distance sketch follows this list); this enables a single view of the customer and helps control mailing-list costs by preventing duplicate mail pieces.
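
IDQ's Edit Distance and Jaro Distance matchers are built-in components, but the underlying idea in the last bullet can be shown with a small pure-Python sketch. The threshold and fields are illustrative only:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def find_duplicates(names: list[str], max_distance: int = 2) -> list[tuple[str, str]]:
    """Pair up names whose edit distance is under an (illustrative) threshold."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if edit_distance(a.lower(), b.lower()) <= max_distance:
                pairs.append((a, b))
    return pairs

print(find_duplicates(["Jon Smith", "John Smith", "Jane Doe"]))
# [('Jon Smith', 'John Smith')]
```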

JP MORGAN CHASE BANK, Columbus, OH Oct 2016 - Sep 2017

ETL Informatica Developer

The primary objective of this project is to provide improved levels of service delivery to customers, by effective analysis on data collected from different sources.

Responsibilities:

  • Worked with Team Lead and Data Modeler to design data model using Erwin 7.1.
  • Designed and developed Informatica mappings to populate the dimension, fact, and history tables continually from different source systems.
  • Designed the ETL process for extracting data from heterogeneous source systems, transforming it, and loading it into the data mart.
  • Tuned static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
  • Developed ETL Scripts for processing and transferring of raw data from the legacy systems into the warehouse every week.

ETL Informatica Developer

StaffMethods
02.2015 - 09.2016

Well Point (Anthem), Norfolk, VA

Anthem’s affiliated health plans and companies are empowering consumers and driving innovation to control costs and improve the quality of care everyone receives.

Responsibilities:

· Responsible for Business Analysis and Requirements Collection.

· Followed agile methodology for project implementation, attending daily standup meetings, scrum calls, and working sessions.

· Translated high-level design specifications into simple ETL coding and mapping standards.

· Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

· Automated and scheduled Informatica jobs using Control-M.

· Extracted data from flat files and other RDBMS sources into the staging area and populated the data warehouse.

· Developed mapping parameters and variables to support SQL overrides (a parameter-file sketch follows this list).

· Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
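
The mapping parameters and variables mentioned above are typically fed to workflows through a parameter file at run time. A hedged sketch of a helper that writes one for an incremental extract; the folder, workflow, and session names are placeholders, and the section-header format should be checked against the PowerCenter documentation for the installed version:

```python
from datetime import datetime, timedelta

def write_param_file(path: str, last_run: datetime) -> None:
    """Write a PowerCenter parameter file that feeds $$LastRunDate into a
    Source Qualifier SQL override for an incremental (delta) extract."""
    header = "[EDW_Folder.WF:wf_claims_load.ST:s_m_claims_incr]"  # placeholders
    with open(path, "w") as f:
        f.write(header + "\n")
        f.write(f"$$LastRunDate={last_run:%m/%d/%Y %H:%M:%S}\n")

# Hypothetical usage: pull everything changed since the previous daily run.
write_param_file("wf_claims_load.param", datetime.now() - timedelta(days=1))
```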

Education

Bachelor of Science - Computer Science

Jawaharlal Nehru Technological University
India
06.2009

Skills

Databases: Oracle 19c/21c, SQL Server 2019/2022, Netezza, Snowflake, Azure SQL Database, AWS Redshift, BigQuery (GCP), S3

ETL Tools: Informatica PowerCenter 10.x, Informatica Intelligent Cloud Services (IICS/CDI/CAI/IDMC), Secure Agent, PowerExchange, Informatica IDQ, MDM, and Azure Data Factory

Data Modeling Tools: Erwin, ER/Studio

Programming Skills: Shell Scripting, PL/SQL, T-SQL, Batch, basic Python and Spark, HTML, C#, JavaScript, Perl

Methodologies: Data modeling (logical, physical, and conceptual models); Star and Snowflake schemas; dimensional modeling techniques (Kimball and Inmon)

BI Tools: OBIEE, Business Objects, Cognos, MicroStrategy, Power BI, and Tableau

Tools: ServiceNow, JIRA, CDA, GitHub, Bitbucket, Jenkins, Confluence, Kafka, SQL*Loader, TOAD, WinSCP

ERP and CRM: Salesforce, SAP

Job Scheduling Tools: Autosys, Control-M, Tidal, and Tivoli

Operating Systems: Windows, Unix and Linux

Timeline

Senior ETL Informatica/IICS Developer

Tata Consultancy Services
10.2016 - Current

ETL Informatica Developer

StaffMethods
02.2015 - 09.2016

Bachelor of Science - Computer Science

Jawaharlal Nehru Technological University
06.2009