
Muhammad Ali

Richmond, TX

Summary

ETL developer with over 10 years of experience in data integration and data warehousing, specializing in Informatica Power Center and Informatica Data Quality (IDQ). I have delivered corporate-wide ETL solutions using enterprise data warehousing and business intelligence methodologies, designing and developing complex mappings that extract data from diverse sources: flat files, RDBMS tables, legacy system files, XML files, applications, COBOL sources, and Teradata. I am also experienced with Hadoop, HDFS, Hive, Impala, HQL queries, and Sqoop.

My data warehouse work includes Informatica Power Center and Informatica PowerExchange (CDC) for designing and developing transformations, mappings, and sessions, and IDQ for cleansing and massaging data in the staging area. I implement complex business rules through reusable transformations, workflows/worklets, and mappings/mapplets, and I am experienced in performance tuning of targets, mappings, and sessions. I have also developed shell and Python scripts to handle incremental loads (a minimal sketch of that pattern follows this summary).

I have thorough knowledge of database management concepts, including conceptual, logical, and physical data modeling, and have used data modeling tools such as Erwin for logical and physical database design. I have built user-defined queries and drill-down reports across multiple databases with Tableau 10.4 and Business Objects XIR3/XIR2. My relational database expertise covers Oracle, SQL Server, MS Access, Sybase, and DB2, with development in SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader, and I am highly proficient in writing, testing, and implementing PL/SQL stored procedures. I have scheduled ETL jobs using Control-M, Tivoli, and CA Workload Automation.

Across banking, insurance, healthcare, credit cards, utilities, and pharmaceuticals, I have a proven record of implementing technology-based solutions for business problems.
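
A minimal sketch of the watermark-driven incremental-load pattern referenced above. All table, column, and job names (etl_watermark, src_orders, stg_orders) are hypothetical, and sqlite3 stands in for the actual warehouse driver:

```python
# Minimal sketch of an incremental (delta) load driven by a watermark.
# All object names are hypothetical illustrations.
import sqlite3  # stand-in for the real warehouse driver (Oracle, SQL Server, ...)

def incremental_load(conn, job_name: str) -> int:
    cur = conn.cursor()
    # 1. Read the high-water mark left by the previous run.
    cur.execute("SELECT last_loaded_at FROM etl_watermark WHERE job = ?", (job_name,))
    row = cur.fetchone()
    watermark = row[0] if row else "1900-01-01 00:00:00"
    # 2. Pull only rows changed since the last run.
    cur.execute(
        "INSERT INTO stg_orders SELECT * FROM src_orders WHERE updated_at > ?",
        (watermark,),
    )
    loaded = cur.rowcount
    # 3. Advance the watermark only once the load commits.
    cur.execute(
        "UPDATE etl_watermark SET last_loaded_at = "
        "(SELECT MAX(updated_at) FROM src_orders) WHERE job = ?",
        (job_name,),
    )
    conn.commit()
    return loaded
```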

Overview

13 years of professional experience

Work History

Senior ETL Informatica Developer/DQ Consultant

Morgan Stanley
04.2023 - Current
  • Design, develop, and maintain data masking solutions using Informatica tools, protecting sensitive information by obfuscating it while preserving data integrity for testing, development, and other non-production purposes
  • Also design, develop, and maintain data integration processes using Informatica tools to extract, transform, and load (ETL) data from various sources into a data warehouse
  • Analyze sensitive data across different systems and define appropriate masking strategies based on business requirements and data privacy regulations
  • Utilize Informatica PowerCenter to create data masking rules and transformations, including field-level masking, pattern-based masking, and data type-specific masking techniques
  • Collaborate with stakeholders to establish data masking policies, including which data elements to mask, the level of obfuscation, and appropriate masking patterns
  • Deploy data masking solutions to production environments, monitor masking processes, and address any issues or updates to masking rules as needed
  • Conduct thorough testing of masked data to ensure the quality and usability of masked datasets while maintaining data integrity
  • Deep understanding of Informatica data integration tools, including data masking features and capabilities
  • Strong SQL skills to query and manipulate data within databases for masking purposes
  • Comprehensive knowledge of data masking techniques such as randomization, substitution, and encryption (illustrated in the sketch after this list)
  • Awareness of relevant data privacy laws (e.g., GDPR, HIPAA) and how they apply to data masking practices
  • Ability to analyze data structures and identify sensitive information to effectively implement masking strategies
  • Design logical and physical data models within the data warehouse to optimize data storage and retrieval
  • Create and implement ETL workflows using Informatica tools to extract data from diverse sources, transform it according to business rules, and load it into the data warehouse
  • Develop data quality checks and cleansing processes to ensure data accuracy within the data warehouse
  • Monitor and optimize ETL processes to maintain efficient data loading times
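
For illustration, a minimal sketch of the substitution and randomization techniques named in this list, shown in plain Python rather than Informatica; the field names and formats are hypothetical:

```python
# Illustrative field-level masking: format-preserving randomization for an
# SSN and deterministic substitution for an email. Column names are hypothetical.
import hashlib
import random

def mask_ssn(ssn: str) -> str:
    # Randomization: replace digits but preserve the NNN-NN-NNNN format.
    rng = random.Random(ssn)  # seeded so the same input always masks the same way
    return "-".join("".join(str(rng.randint(0, 9)) for _ in range(n))
                    for n in (3, 2, 4))

def mask_email(email: str) -> str:
    # Substitution: a deterministic pseudonym keeps joins and testing usable.
    local, _, _domain = email.partition("@")
    token = hashlib.sha256(local.encode()).hexdigest()[:10]
    return f"user_{token}@example.com"

row = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@corp.com"}
masked = {**row, "ssn": mask_ssn(row["ssn"]), "email": mask_email(row["email"])}
print(masked)  # same shape as the input row, sensitive fields obfuscated
```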

BSA/Senior ETL Informatica Developer

Honeywell
06.2022 - 04.2023
  • A role focused on analyzing business requirements and designing data integration solutions on the Informatica platform: data mapping, data quality checks, and ETL (Extract, Transform, Load) processes that ensure data accuracy and consistency across the organization's systems
  • Understanding business needs, identifying data sources, and translating them into technical specifications for data integration within the Informatica platform
  • Designing logical and physical data models to structure and organize data within the data warehouse or data lake
  • Creating and managing data extraction, transformation, and loading processes using Informatica PowerCenter or other relevant Informatica tools
  • Implementing data quality checks and cleansing routines to ensure data accuracy and consistency (a minimal example follows this list)
  • Managing Informatica environments, including user access, configurations, and performance monitoring
  • Performing unit, integration, and user acceptance testing to verify data quality and functionality of data integration solutions
  • Creating detailed documentation for data mapping, data transformations, and ETL processes
  • Contributing to the overall SAP implementation project, including timelines, milestones, and budget management
  • Identifying opportunities to optimize SAP usage and business processes based on user feedback
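
A minimal example of the kind of rule-based data quality check described in this list; the fields, rules, and reference values are hypothetical:

```python
# Minimal rule-based data quality check, analogous in spirit to the
# IDQ-style checks described above. Fields and rules are hypothetical.
import re

RULES = {
    "customer_id": lambda v: bool(v),                                  # completeness
    "zip_code":    lambda v: bool(re.fullmatch(r"\d{5}", v or "")),    # validity
    "state":       lambda v: (v or "").upper() in {"TX", "CA", "NY"},  # consistency
}

def validate(record: dict) -> list[str]:
    """Return the names of the fields that violate their rule."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

bad = validate({"customer_id": "C001", "zip_code": "77,406", "state": "tx"})
print(bad)  # ['zip_code'] -- state passes after upper-casing
```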

Senior ETL Informatica Developer/DQ Consultant

Assurant Inc.
03.2021 - 05.2022
  • A technical professional responsible for designing and developing data warehouse structures, using the Informatica platform to extract, transform, and load (ETL) data from diverse sources into a centralized data warehouse while ensuring data quality and accessibility for business intelligence applications
  • Supported development of a next-generation Agent Portal
  • Enabled improved agent/customer services (e.g., agent incentive programs and transparency of new-business processing status)
  • Enabled an increased focus on market intelligence and operational reporting, and the rollout of advanced analytics to support customer cross-sell initiatives
  • Implemented data quality processes, including transliteration, parsing, analysis, standardization, and enrichment, at point of entry and in batch mode; deployed mappings that run in scheduled, batch, or real-time environments
  • Delivered a single view of the customer and a single source of truth for data
  • Improved analytics and reporting capabilities
  • Improved data management through data governance and ongoing update and maintenance
  • Supported the transformation of policy administration, servicing, underwriting, and claims processes
  • Defined the ETL architecture: prepared data flow diagrams and identified major streams such as data conversion, data quality, and reconciliation
  • Created solution architecture and planned the development effort
  • Communication/conducting meetings with various business teams, stakeholders, development teams and management teams
  • Implemented data profiling, created scorecards and reference tables, and documented data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency (a profiling sketch follows this role)
  • Performed validation, standardization, and cleansing of data in the course of implementing the business rules
  • Prepare design for all metadata according to various ETL processes
  • Monitor processes and develop plans to capture and access all metadata
  • Created, optimized, and maintained T-SQL code
  • Tuned T-SQL queries and database performance by improving query execution plans, indexing strategies, and table design
  • Environment: Informatica Power Center 10.4, Informatica Data Quality 10.4, Snowflake, PostgreSQL, Vertica, Oracle 11g, SQL Server 2017, ServiceNow, Toad 10, Autosys
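
For illustration, a small sketch of how the scorecard dimensions named above (completeness, duplication, validity) can be computed over a batch of records; the data and column names are hypothetical:

```python
# Illustrative profiling "scorecard" over a batch of records, computing the
# data quality dimensions named above. Sample data is hypothetical.
import re

def profile(records: list[dict], key: str, pattern: str) -> dict:
    total = len(records)
    values = [r.get(key) for r in records]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "completeness": len(non_null) / total,                    # share populated
        "duplication": 1 - len(set(non_null)) / max(len(non_null), 1),
        "validity": sum(bool(re.fullmatch(pattern, v)) for v in non_null) / total,
    }

rows = [{"phone": "832-555-0101"}, {"phone": "832-555-0101"}, {"phone": ""}]
print(profile(rows, "phone", r"\d{3}-\d{3}-\d{4}"))
# {'completeness': 0.67, 'duplication': 0.5, 'validity': 0.67} (approximately)
```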

Senior Informatica Consultant/Data Engineer

Change HealthCare
04.2020 - 02.2021
  • Change HealthCare has chosen to build a foundation for business transformation based on accurate and trustworthy data
  • Key objectives: create a central repository for data related to CHC, establish a data governance process that ensures trusted data sources, and provide for scheduled and on-demand reporting and data consumption
  • Defined the ETL architecture: prepared data flow diagrams and identified major streams such as data conversion, data quality, and reconciliation
  • Created solution architecture and planned the development effort
  • Communication/conducting meetings with various business teams, stakeholders, development teams and management teams
  • Implemented data profiling, created scorecards and reference tables, and documented data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency
  • Performed validation, standardization, and cleansing of data in the course of implementing the business rules
  • Prepare design for all metadata according to various ETL processes
  • Monitor processes and develop plans to capture and access all metadata
  • Environment: Informatica Data Quality (IDQ), Collibra, Tableau, Informatica Metadata Manager

BI Developer

AAA - Automobile Club of Southern California
09.2019 - 03.2020
  • Worked on the enterprise-level data warehouse (EDW)
  • Worked on data marts and a data lake for different domains, e.g., Automotive, Emergency Roadside Assistance, Insurance, Travel, Membership, and Payments
  • Worked with Informatica Cloud, AWS, Hadoop, Hive, Impala, and HQL queries against different databases
  • Extracted, transformed, and loaded data through Informatica Power Center and Informatica Developer
  • Created the design document for the data load process
  • Parsed the Informatica workflows to a grain of 1:1 workflow:session using Python scripts, and manipulated the workflow XML files to configure the connections (see the sketch after this role)
  • Loaded data from/to Teradata, SQL Server, Oracle, Hadoop, Hive, and APIs
  • Used the B2B Data Exchange option of Informatica to normalize unstructured data such as PDF files, spreadsheets, Word documents, legacy formats, and print streams
  • Used Informatica B2B Data Exchange for structured data such as XML
  • Exported and imported workflows across environments
  • Updated configuration/parameter files and performance-tuned existing Informatica jobs
  • Performed data loading and data conversion, ensured data validation, and loaded error and audit tables
  • Maintained the data retention policy of the organization
  • Scheduled ETL jobs in Control-M
  • Worked in an Agile methodology, implemented in Clarizen and JIRA
  • Environment: Informatica Power Center 10.2, Informatica Developer, Oracle 11g, SQL Server
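
A sketch of the 1:1 workflow:session split described above. The XML layout here is a simplified stand-in for Informatica's actual workflow export schema, and the file name is hypothetical:

```python
# Sketch of splitting a workflow XML export so each output file keeps a
# single session (1:1 workflow:session). Element and attribute names are a
# simplified stand-in for the real Informatica export schema.
import copy
import xml.etree.ElementTree as ET

def split_workflow(path: str) -> None:
    tree = ET.parse(path)
    workflow = tree.getroot()            # assume the root is the workflow element
    for session in workflow.findall("SESSION"):
        clone = copy.deepcopy(workflow)
        # Drop every session except the one this output file should keep.
        for other in clone.findall("SESSION"):
            if other.get("NAME") != session.get("NAME"):
                clone.remove(other)
        out = f"{clone.get('NAME')}_{session.get('NAME')}.xml"
        ET.ElementTree(clone).write(out, encoding="utf-8", xml_declaration=True)

split_workflow("wf_daily_loads.xml")     # hypothetical export file
```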

Informatica Developer

Duke Energy
05.2019 - 09.2019
  • Worked on the Outage Management System (OMS); customer data flows from InService 9.2 (the OMS database) to downstream applications
  • Data from the OMS database is stored in a staging table at the grain of customer account number and premise ID
  • Created the design document for the customer load process
  • Created Informatica mappings, mapplets, worklets, and workflows
  • Loaded data from SQL Server, flat files, and Oracle into an Oracle database
  • Created the SFDC, flat file, and Oracle connections for AWS cloud services
  • Performance-tuned existing Informatica jobs
  • Exported and imported workflows across environments
  • Updated configuration/parameter files already developed for OMS History, Proactive Communication, IFactor (Outage Maps), and Reliability Metrics
  • Performed data loading and data conversion (customer ID and premise ID patterns changed)
  • Created and loaded error and audit tables, and ensured data validation
  • Performed a one-time conversion of transactional tables in the OMS Primary, Archival, and NRT (Near Real Time) databases
  • Ran weekly full loads of customer data through the ETL process
  • Maintained the data retention policy of the organization
  • Scheduled ETL jobs in CA Workload Automation
  • Worked in an Agile methodology, implemented in JIRA
  • Wrote VBScript for data loading and worked on Amazon Redshift cloud data integration
  • Environment: Informatica Power Center 10.2, Oracle 11g, SQL Server 2017

Informatica Developer

Phillips 66
02.2018 - 04.2019
  • Created high-level and low-level design documents and the ETL standards document
  • Extracted data from flat files, mainframes, DB2, and Oracle databases, and applied business logic to load it into the central Oracle database
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite
  • Involved in performance tuning at source, target, mapping and session level
  • Loaded Oracle tables from XML sources
  • Configured Informatica for the SAP Connector
  • Worked on creating the physical layer, business model and mapping (BMM) layer, and presentation layer in OBIEE
  • Retrieved data from SAP using Informatica Power Exchange
  • Supported Integration testing by analyzing and fixing the issues
  • Created Unit Test Cases and documented the Unit Test Results
  • Resolved data skew in Teradata
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information
  • Profiled customer data and identified the various phone number patterns to be included in IDQ plans (a pattern-profiling sketch follows this role)
  • Used Informatica web services to create work requests/work items for the end user
  • Integrated Salesforce data into the target Oracle database using Informatica Cloud
  • Validated the Salesforce target data in the Force.com application
  • Scheduled the workflows to pull data from the source databases at weekly intervals
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows
  • Performance tuning on sources, targets, mappings and database
  • Worked as production support SME to investigate and troubleshoot data issues coming out of Weekly and Monthly Processes
  • Used database level Greenplum partitioning and Informatica hash partitioning
  • Environment: Informatica Power Center 10.2, SAP, Oracle 11g, DB2, SQL Server
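
A sketch of the phone-number pattern profiling mentioned above: each value is reduced to a shape token and the distinct shapes counted, the same idea used to seed IDQ plans. The sample data is hypothetical:

```python
# Pattern profiling for phone numbers: map each value to a shape token
# (digit -> 9, letter -> X, punctuation kept) and count the distinct shapes.
from collections import Counter

def shape(value: str) -> str:
    return "".join("9" if c.isdigit() else "X" if c.isalpha() else c
                   for c in value)

phones = ["(832) 555-0101", "832-555-0102", "8325550103", "555-0104 ext 12"]
print(Counter(shape(p) for p in phones).most_common())
# e.g. [('(999) 999-9999', 1), ('999-999-9999', 1), ('9999999999', 1), ...]
```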

Informatica Developer

JM Family Enterprises Inc
10.2015 - 01.2018
  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and XML Source Qualifier transformations; analyzed, designed, and implemented complex SQL stored procedures, ETL processes, and Informatica mappings
  • Used Tidal scheduler to get the source file from the server using Tidal flat file FTP connection as well as power center FTP connection
  • Worked with the toolbar, perspective toolbar, sub-toolbar, and the design and view tabs of Pentaho Data Integration
  • Created Pentaho ETL jobs and performed performance monitoring and logging
  • Worked on L2 and L3 production support/monitoring of the daily nightly loads of ETL
  • Implemented SCD Type 1 and SCD Type 2 mappings to capture new changes and maintain historic data (an SCD2 sketch follows this role)
  • Providing technical assistance during production phase of project development
  • Configured Informatica for the SAP connector and extracted data from SAP, loading it into Oracle EBS
  • Configured PowerExchange for SAP R/3
  • Retrieved data from SAP R/3
  • Designed and wrote the scripts required to extract, transform, load (ETL), clean, and move data and metadata so it can be loaded into a data warehouse, data mart, or data store
  • Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ)
  • Created CDC (change data capture) sources in Power Exchange and imported that into Power Center
  • Created custom product name discrepancy check plans using IDQ and incorporated the plan as a mapplet into Power Center
  • Configured the Informatica PowerExchange add-on for SAP (PowerConnect)
  • Retrieved data from SAP IDocs using Informatica connector
  • Used the B2B Data Exchange option of Informatica to normalize unstructured data such as PDF files, spreadsheets, Word documents, legacy formats, and print streams
  • Created new mappings and enhancements to the old mappings according to changes or additions to the Business logic
  • Configured Informatica web services hub in administration console
  • Worked with Informatica web services
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans
  • Configured SFDC license in administration console
  • Environment: Informatica Power Center 9.6.1, Oracle, DB2, UNIX
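
An illustrative sketch of the SCD Type 2 logic mentioned above, shown in plain Python (in PowerCenter this is typically built with Lookup and Update Strategy transformations); the dimension layout is hypothetical:

```python
# Illustrative SCD Type 2 merge: expire the current dimension row when a
# tracked attribute changes, then insert a new current row. The table shape
# (natural_key, address, eff_date, end_date, is_current) is hypothetical.
from datetime import date

def scd2_apply(dimension: list[dict], incoming: dict, today: date) -> None:
    current = next((r for r in dimension
                    if r["natural_key"] == incoming["natural_key"]
                    and r["is_current"]), None)
    if current and current["address"] == incoming["address"]:
        return                          # no change: SCD2 no-op
    if current:                         # change detected: close the history row
        current["end_date"], current["is_current"] = today, False
    dimension.append({**incoming, "eff_date": today,
                      "end_date": None, "is_current": True})

dim = [{"natural_key": 1, "address": "Old St", "eff_date": date(2016, 1, 1),
        "end_date": None, "is_current": True}]
scd2_apply(dim, {"natural_key": 1, "address": "New Ave"}, date(2017, 6, 1))
print(dim)  # old row end-dated, new current row appended
```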

Informatica Developer

Texas Children’s Hospital
02.2013 - 10.2015
  • Ran batch cycles involving job triggers from Informatica, query table development in Teradata/Oracle SQL, report generation, and claims-archiving maintenance
  • Involved in creating stored procedures and using them in Informatica
  • Implemented the claims data conversion process to move claims from the Claims Workbench to the CNG Navigator application
  • Providing technical assistance during the production phase of project development
  • Defined and developed technical standards for data movement and transformation and reviewed all designs to ensure those standards were met
  • Worked with the command-line program pmcmd to interact with the server: starting and stopping sessions and batches, stopping the Informatica server, and recovering sessions (see the sketch after this role)
  • Designed workflows that use multiple sessions and command-line objects (used to run the UNIX scripts)
  • Created source and target mappings, transformation logic and processes to reflect the changing business environment over time
  • Provided enterprise data warehousing solutions, including design and development of ETL processes, mappings and workflows using Informatica’s PowerCenter
  • Responsible for migration of the work from dev environment to testing environment
  • Provided guidance and expertise to resolve technical issues related to DW Tools and primarily Informatica
  • Environment: Informatica Power Center 9.0.1, Oracle 10g, Cognos BI 8.3, Relational Junction, DB2, flat files, PL/SQL, SQL*Plus, TOAD, UNIX, Shell Scripting, Control-M, Erwin 4.2
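
A sketch of scripting pmcmd (the PowerCenter command-line client mentioned above) from Python; the service, domain, folder, and workflow names are placeholders, and credentials would normally come from a secured source rather than the script:

```python
# Sketch of driving PowerCenter from a script via pmcmd. All names below
# are placeholders; only the standard pmcmd startworkflow flags are used.
import subprocess

def start_workflow(folder: str, workflow: str) -> int:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC", "-d", "DOMAIN_DEV",   # integration service / domain
        "-u", "etl_user", "-p", "secret",       # placeholder credentials
        "-f", folder, "-wait", workflow,        # -wait blocks until completion
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode                    # 0 => workflow succeeded

if start_workflow("FIN_DW", "wf_claims_daily") != 0:
    raise SystemExit("workflow failed; check the session log")
```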

ETL Developer lead

Wells Fargo
02.2012 - 02.2013
  • Involved in creating design documents for Informatica mappings based on business requirements
  • Extracted data from flat files and loaded it into the EDW
  • Developing Complex Transformations, Mapplets using Informatica to Extract, Transform and Load data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS)
  • Used various transformations, such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure, and Router, to implement complex logic while coding mappings
  • Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor
  • Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor
  • Tested scripts by running workflows and assisted in debugging the failed sessions
  • Implemented client-side validation using JavaScript
  • Analyzed functional requirements provided by Business Analysts for the code changes
  • Wrote UNIX scripts for unit testing the ETL code
  • Executed the test cases for the code changes
  • Extensively participated in functional and technical meetings for designing the architecture of ETL load process
  • Environment: Informatica Power Center 9.0.1 Power Exchange, SQL Server 2008, Oracle 10g

Education

MBA - Marketing

Allama Iqbal Open University
01.2009

MS - Computer Science

Preston University
01.2001

Skills

  • Informatica Power Center
  • Power Exchange
  • IDQ
  • SSIS
  • Cognos
  • Spotfire
  • Cognos IWR
  • OBIEE
  • Business Objects 5.0
  • Business Objects 6.5
  • Erwin 4.0
  • Star-Schema Modeling
  • FACT and Dimension Tables
  • Oracle 11g
  • Oracle 10g
  • Oracle 9i
  • Oracle 8i
  • Microsoft Access
  • SQL Server 2005
  • SQL Server 2008
  • MS Excel
  • Flat Files
  • Teradata V13.0
  • Teradata V12.0
  • Sybase
  • Netezza
  • C
  • Java
  • JavaScript
  • Python
  • SQL
  • PL/SQL
  • T-SQL
  • UNIX
  • Shell Scripting
  • Visual Basic
  • Windows 2008
  • Windows 2003
  • Windows NT
  • Windows XP
  • HP-Unix
  • Linux
  • AIX
  • Object-Oriented Analysis and Design using UML
  • MS Project
  • MS Office Suite
  • Toad

Timeline

Senior ETL Informatica Developer/DQ Consultant

Morgan Stanley
04.2023 - Current

BSA/Senior ETL Informatica Developer

Honeywell
06.2022 - 04.2023

Senior ETL Informatica Developer/DQ Consultant

Assurant Inc.
03.2021 - 05.2022

Senior Informatica Consultant/Data Engineer

Change HealthCare
04.2020 - 02.2021

BI Developer

AAA - Automobile Club of Southern California
09.2019 - 03.2020

Informatica Developer

Duke Energy
05.2019 - 09.2019

Informatica Developer

Phillips 66
02.2018 - 04.2019

Informatica Developer

JM Family Enterprises Inc
10.2015 - 01.2018

Informatica Developer

Texas Children’s Hospital
02.2013 - 10.2015

ETL Developer lead

Wells Fargo
02.2012 - 02.2013

MS - Computer Science

Preston University

MBA - Marketing

Allama Iqbal Open University