Jeevan Reddy

Bridgewater, NJ

Summary

Accomplished IT professional with over 14 years of experience in software development and business intelligence solutions, specializing in data warehousing and decision support systems. Proficient in utilizing a range of Informatica tools, including Cloud IICS/IDMC, Power Exchange, and Data Quality, to design and implement robust data integration solutions across various domains such as healthcare, finance, and telecom. Demonstrated expertise in cloud integrations and real-time analytics platforms, complemented by a strong foundation in data governance and quality management practices. Adept at leveraging advanced technologies like Hadoop and Snowflake to architect scalable data solutions that enhance operational efficiency and drive strategic decision-making.

Overview

15 years of professional experience

Work History

Sr. Informatica IICS/IDMC/CAI/CDGC/ADF/SAP BODS Lead

Johnson Controls Inc.
08.2021 - Current
  • Designed data migration from Oracle Fusion, SAP ECC, and Oracle ERP to SAP S/4HANA using Informatica IICS/IDMC, CDQ, CAI, BODS, and ADF.
  • Designed and implemented data storage solutions using Azure services such as Azure SQL Database and Azure Data Lake Storage (ADLS).
  • Designed and implemented data migration pipelines using Azure Data Factory (ADF) to move data from Oracle Fusion, SAP ECC, and legacy ERP systems to Azure Data Lake and Snowflake.
  • Integrated ADF pipelines with Snowflake for automated ingestion and transformation workflows (see the sketch after this role's environment line).
  • Developed data ingestion and transformation processes and maintained data pipelines using Azure Data Factory (ADF) and Azure Data Lake.
  • Served as interim administrator for Informatica IICS/IDMC, enabling users, creating user groups, and provisioning the appropriate services and connectors.
  • Worked extensively with IICS/IDMC services: Cloud Data Integration (CDI), Cloud Application Integration (CAI), Cloud Data Quality (CDQ), Enterprise Data Catalog (EDC), Data Governance (DG), Data Synchronization (DS), Data Replication (DR), and Mass Ingestion (MI).
  • Managed and performed cleansing, de-duplication, and harmonization of data received from, and consumed by, multiple systems in IICS/IDMC.
  • Developed CAI processes to load data from sources such as SQL Server, JSON, and flat files into cloud applications via REST APIs.
  • Participated in the development and implementation of enterprise metadata standards, guidelines, and processes to ensure quality metadata and support ongoing data governance.
  • Onboarded technical and business metadata into the Informatica Enterprise Data Catalog (EDC) and Axon data governance environments, ensuring population of data lineage and linkage between the technical and business metadata.
  • Built end-to-end processes in Informatica CDQ for data cleansing, standardization, address cleansing, and de-duplication.
  • Created Informatica mass ingestion tasks to land legacy ERP data in Azure Data Lake.
  • Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
  • Wrote new and analyzed existing Azure Databricks scripts for data processing.
  • Designed and developed data integration and application mappings, transformations, sessions, workflows, ETL batch jobs, and shell scripts to load data from source systems into the HANA staging database.
  • Environment: Informatica IICS/IDMC, CDQ, SAP Data Services (BODS), ADF, Snowflake, UNIX, Oracle Fusion, Oracle 12c, flat files, XML, shell scripting, Microsoft Azure Data Lake.
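
Illustrative sketch only (not project code): a minimal shell step of the kind the automated Snowflake ingestion above could use once ADF has staged files, assuming a snowsql client is available; the connection, database, stage, and table names are hypothetical.

    #!/bin/sh
    # Load files staged by an ADF copy activity into Snowflake.
    # "etl_conn" and all object names below are placeholders.
    snowsql -c etl_conn -q "
      COPY INTO ERP_STG.PUBLIC.ORDERS
      FROM @ERP_STG.PUBLIC.ADLS_STAGE/orders/
      FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';"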

Sr. Informatica Power Center/IICS/IDMC/SAP BODS Lead

QVC - Qurate Retail Group
07.2019 - 08.2021
  • Designed data migration from PeopleSoft and mainframes to S/4HANA using Informatica 10.2/IICS/IDMC.
  • Designed and developed mappings, transformations, sessions, workflows, ETL batch jobs, and shell scripts to load data from source systems into the HANA staging database using IICS/IDMC, EDC, Axon, and CDI.
  • Built reusable transformations and mapplets in IICS/IDMC wherever logic was shared.
  • Performed performance tuning in IICS/IDMC at both the mapping and database levels to increase data throughput.
  • Designed data migration from SAP ECC to S/4HANA using SAP SLT and IICS/IDMC.
  • Collaborated with functional SMEs to define requirements and data ETL procedures.
  • Served as technical liaison to functional and business users, producing data mapping and data validation technical documents.
  • Led the effort to optimize loading of large financial data sets (SAP AR, AP, GL, and recurring), including both transparent and cluster tables, from legacy ECC to S/4HANA.
  • Designed and implemented customer master data mapping and reference data implementations in SLT using ABAP include programs.
  • Implemented performance filters and source-side filtering for migrating large data sets.
  • Designed and configured SLT replication (real-time and batch) from SAP ECC sources using HANA data provisioning and BODS.
  • Performed CMC configuration and administration of SAP Data Services repositories.
  • Environment: Informatica Power Center 10.2/IICS/IDMC, SAP Data Services (BODS), data modeling (Erwin), UNIX, PeopleSoft, Oracle 12c, flat files, XML, shell scripting, PuTTY, WinSCP, and Toad.

Sr. Informatica Power Center/Cloud Lead

Honeywell International Inc. (NTT Data)
03.2017 - 07.2019
  • Description: Honeywell Commercial Excellence Analytics set out to build a common platform for all Honeywell sales teams to drive the right dialog with Sales on the right, measurable activities: generating a target number of opportunities, managing and improving forecast accuracy, and tracking sales capacity to continuously improve seller productivity. The deliverable was a Sales Pipeline Dashboard built in Salesforce Wave Analytics, a business intelligence (BI) platform from Salesforce.com, to track Honeywell sales opportunity data in the Ops Center org.
  • Honeywell runs multiple Salesforce instances plus a Siebel system, so there was no single view of the pipeline across businesses. Businesses such as HBT, PMT, AERO, and SPS each operate multiple Salesforce instances and struggled with reporting, so the project consolidated data from all Salesforce instances into one place where executives and SBG leaders (SBG presidents, SBG sales and CE leaders) can review monthly win rates, AOP targets, and the corresponding pipeline coverage, understand the health of the pipeline, and take corrective action as necessary.
  • SPINCO: Worked as data migration lead to migrate all Homes-related data from multiple Honeywell orgs into Resideo.
  • SPS GDM MVP2: Worked as data migration lead to migrate legacy data (ACS, RAE, and SAP) into the GDM org.
  • Responsibilities:
  • Applied in-depth, practical knowledge of Salesforce Sales and Service Cloud modules and features, with broad exposure to project execution: customer-facing work, requirements gathering and analysis, consulting, solution design, documentation, implementation, development, and support of business solutions.
  • Extracted, transformed, and loaded Opportunities, Accounts, Users, Leads, Contacts, Record Type, Opportunity History, Tasks, and Events interaction tables from various source systems into Salesforce.com, plus a reverse data feed from Salesforce for Honeywell CRM.
  • Worked with Informatica Cloud IICS/IDMC and Power Center to create source/target connections and to monitor and synchronize data in SFDC.
  • Automated validation and de-duplication of Salesforce data using Informatica IICS/IDMC Cloud Customer 360 (CC360).
  • Implemented consolidation and integration of data from multiple systems to provide a single view of customer information in Informatica IICS/IDMC CC360.
  • Performed data cleansing and mapped data from source Salesforce.com to target Oracle using IICS/IDMC.
  • Performed data migration from Salesforce.com to Oracle using Informatica Cloud IICS/IDMC, EDC, Axon, and Power Center.
  • Wrote SOQL queries to fetch data via Workbench and Explorer; designed and developed ETL and data quality mappings to load and transform data from sources such as Oracle and SQL Server into the data warehouse using Power Center and Cloud.
  • Performed data profiling and analysis of various Salesforce.com (SFDC) objects and MS Access database tables using IDQ for an in-depth understanding of source entities, attributes, relationships, domains, source data quality, and hidden and potential data issues.
  • Implemented Change Data Capture (CDC) on source data from Salesforce.com using IICS/IDMC.
  • Extracted data from Salesforce.com using Informatica 10.1 with the Salesforce adapter.
  • Created dashboards in Salesforce Wave Analytics and used multi-dimensional analysis (slice-and-dice and drill functions) to organize the data along combinations of dimensions and drill-down hierarchies, letting end users view the data from heterogeneous viewpoints.
  • Customized Salesforce Wave dashboards to track usage, productivity, and performance of business centers and their sales teams.
  • Scheduled the Informatica jobs using Autosys (see the pmcmd sketch after this role's environment line).
  • Environment: Informatica Power Center 10.1, Informatica Cloud/IICS/IDMC, MDM, Informatica Rev, Autosys, data modeling (Erwin), UNIX, Siebel, Oracle 12c, Salesforce.com, Salesforce Wave Analytics, flat files, XML, shell scripting, PuTTY, WinSCP, and Toad.
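
For illustration only: an Autosys job like those above typically calls a small shell wrapper around pmcmd to start a Power Center workflow; the service, domain, folder, and workflow names here are hypothetical.

    #!/bin/sh
    # Wrapper an Autosys job can invoke to start a workflow and wait for completion.
    # Names are placeholders; credentials are read from the environment.
    pmcmd startworkflow -sv INT_SVC_PROD -d Domain_Prod \
      -u "$PM_USER" -p "$PM_PASS" \
      -f SFDC_LOADS -wait wf_load_sfdc_opportunities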

Sr. Informatica Power Exchange/IDQ Developer

Northwestern Mutual, Milwaukee (Life Insurance and Financial Planning)
05.2016 - 02.2017
  • Description: As Northwestern Mutual moved toward Integrated Advisor, planning data had to be incorporated into rewards and recognition but was not yet present in the awards platform. This project moved Personal Planning Analysis (PPA) and Business Planning Analysis (BPA) data to the BIIP platform so the awards platform could access it easily. To determine award achievement, the Rewards & Recognition team needed the number of PPAs and BPAs with two or more modules delivered by a financial representative (FR) for the award timeframe, accumulated through the calendar month end being reported on.
  • Responsibilities:
  • Designed and developed the architecture for all data warehousing components, e.g., tool integration strategy; source system ETL strategy; data staging, movement, and aggregation; information and analytics delivery; and data quality strategy.
  • Designed and developed ETL and data quality mappings to load and transform data from sources such as DB2, Oracle, and Sybase into the data warehouse using Power Center and IDQ/IDE.
  • Extensively used Informatica Data Quality (IDQ) transformations such as Match, Consolidation, Exception, Parser, Standardizer, and Address Validator.
  • Developed IDQ match-and-merge and match-and-consolidation strategies based on customer requirements and the data.
  • Built several reusable components in IDQ/IDE using parsers, standardizers, and reference tables.
  • Performed data profiling and analysis of various Salesforce.com (SFDC) objects and MS Access database tables using IDQ for an in-depth understanding of source entities, attributes, relationships, domains, source data quality, and hidden and potential data issues.
  • Worked with business analysts on IDQ data profiling, validation, standardization, and cleansing for the Oracle 12c data migration to rebuild and enhance business rules.
  • Also worked with business analysts to modify and enhance rules for physical/mailing addresses using the IDQ Address Validator.
  • Developed both one-time and real-time mappings using Power Center 9.6 and Power Exchange.
  • Registered data maps for real-time Changed Data Capture (CDC) in Power Exchange; worked on extraction maps and testing in Power Exchange Navigator.
  • Implemented Change Data Capture (CDC) on source data from Salesforce.com.
  • Extracted data from Salesforce.com using Informatica 9.6 with the Salesforce adapter.
  • Updated Salesforce external IDs and created various objects in Salesforce.
  • Created new and modified existing hierarchies in the universes to meet users' drill-analysis reporting needs, and participated in performance tuning of Business Objects content by creating aggregate tables.
  • Performed integrity testing of the universes (universe structure checking; object, join, and condition parsing; cardinality, loop, and context checking) after any modifications to their structure, classes, or objects.
  • Used multi-dimensional analysis (slice-and-dice and drill functions) to organize the data along combinations of dimensions and drill-down hierarchies, letting end users view the data from heterogeneous viewpoints.
  • Environment: Informatica Power Center 9.6, Informatica Power Exchange 9.6, Informatica IDQ/IDE 9.6, Autosys, Hadoop ecosystem, data modeling (Erwin), UNIX, Windows 7 Professional client, Sybase, Oracle 10g, DB2, SAP Business Objects, flat files, XML, COBOL, shell scripting, PuTTY, WinSCP, and Toad.

Sr. Informatica Developer/IDQ Developer

John Deere World Headquarters, Illinois (Agriculture and Forestry)
09.2013 - 05.2016
  • Description: This project was a joint effort between the newly created Machine Knowledge Center (MKC) and PV&V personnel from across the enterprise. The intent was to develop a process by which PV&V can mine and utilize customer information from the database created by the JDLink product sold on John Deere equipment. Leveraging this data helps PV&V deepen its knowledge of how machines are used by customers, which can be applied to enhance the reliability, durability, and performance of both current and future product offerings. The project also identified the resources needed to deliver this service to the PV&V community and the potential cost of acquiring this information.
  • Hadoop projects:
  • John Deere Customer Product (JDCP) and Load Profile data collected from customers and the source team are loaded into the Hadoop ecosystem, where data cleansing and business transformations are implemented as MapReduce jobs. The final data is provisioned to downstream systems for reporting and dashboarding.
  • Responsibilities:
  • Designed and architected the Integrated Data Warehouse at John Deere on a big data platform.
  • Designed, developed, implemented, and maintained the Informatica Data Quality (IDQ)/MDM application for the matching and merging process.
  • Utilized Informatica IDQ/IDE 9.1 for initial data profiling and for matching and removing duplicate data.
  • Installed and configured content-based data dictionaries for data cleansing, parsing, and standardization to address the completeness, conformity, and consistency issues identified during profiling with IDQ/IDE.
  • Configured the Analyst tool (IDE) and helped data stewards and business owners profile source data, create scorecards, apply built-in DQ rules, and validate results.
  • Worked with Informatica Big Data Edition to read and write HDFS files, Hive tables, and HBase.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Imported and exported data between relational sources and HDFS/Hive using Sqoop (see the sketch after this role's environment line).
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Designed and deployed rich graphical visualizations using Tableau.
  • Generated various dashboards on Tableau Server against data sources such as Netezza and DB2; created report schedules, data connections, projects, and groups.
  • Applied table calculations and complex, compound calculations to large big data sets.
  • Worked closely with business power users to create reports and dashboards using Tableau Desktop.
  • Environment: Informatica Power Center 9.6, Hadoop ecosystem, Informatica DVO, IDQ, MDM, Power Exchange, data modeling (Erwin), Netezza, PL/SQL, DB2, Sybase, Tableau v8, shell scripting, PuTTY, WinSCP, Toad, and Aginity.
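
A minimal sketch of the Sqoop import described above, assuming a DB2 source; the JDBC URL, credentials, and table names are hypothetical.

    #!/bin/sh
    # Import one relational table into HDFS and register it as a Hive table.
    # Connection string and object names are placeholders.
    sqoop import \
      --connect jdbc:db2://db2host:50000/JDCP \
      --username "$DB_USER" --password "$DB_PASS" \
      --table LOAD_PROFILE \
      --hive-import --hive-table jdcp.load_profile \
      --num-mappers 4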

Sr. Informatica Developer/IDQ Developer

EarthLink Corporate Headquarters, Atlanta (Networks and Communications)
10.2012 - 09.2013
  • Responsibilities:
  • Served as an ETL developer/data quality analyst in the deployment of FP (Financial Product) Reporting and PDS (Persistent Data Staging). Primary responsibilities were data quality checks and data integration using Informatica Data Quality and Power Center, UNIX shell scripting, and architecting and developing a custom ETL framework of over 144 processes in Oracle's native PL/SQL, loading the results into the Oracle DWH (see the sqlplus sketch after this role's environment line).
  • Environment: Informatica Power Center 9.1, Informatica IDQ/IDE, Power Exchange, web services, IDQ, UNIX, Windows 2000 Professional client, Oracle 8i/9i Enterprise Edition, PL/SQL, Teradata, SAP BO XI R2/6.5, VSAM files, flat files, XML, COBOL, shell scripting, PuTTY, WinSCP, and Toad.
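
A minimal sketch of how one of the framework's PL/SQL processes might be driven from UNIX via sqlplus; the connect string, package, and procedure names are hypothetical.

    #!/bin/sh
    # Invoke one ETL framework process; connection and object names are placeholders.
    sqlplus -s "$ORA_USER/$ORA_PASS@FPDWH" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    EXEC fp_etl.load_financial_product(p_run_dt => TRUNC(SYSDATE))
    EXIT
    EOF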

Sr. Informatica Developer

J.P. Morgan Chase, New York (Banking and Financial)
11.2011 - 09.2012
  • Responsibilities:
  • Served as a senior ETL/SQL developer enhancing an existing custom ETL framework that collected, cleansed, and integrated the company's performance data (e.g., cash flows, liquidity positions) from various operational source systems. The enhancements were part of an overall initiative to improve an internal, custom-built Liquidity Risk Management System (LRMS) that gave JP Morgan Corporate Treasury executives, senior managers, and business analysts analytical reporting on the company's liquidity position, sources and uses of cash, cash flow forecasting and stress test modeling, funding counterparties, and funding plan development. Major contributions included: designing and developing an ETL component that dynamically constructed, in real time, the SQL to load over 100 source feeds into the Liquidity Position fact table using a metadata strategy; designing and developing an ETL process that mapped custom product/service hierarchical relationships to the company's general ledger products for reporting purposes; developing Sybase objects such as tables, views, indexes, triggers, procedures, and functions to support the ETL metadata rule component; and providing support, guidance, and training in deploying the solution to integration, QA, and production environments.
  • Environment: Informatica Power Center 9.1, web services, SQL Server 2008, Oracle 11g/10g, Teradata, PL/SQL, Power Exchange 9.1, Sybase, SAP Business Objects XI 3.x, TOAD, Windows XP, UNIX, Maestro, Erwin 4.2, Control-M.

Sr. Informatica Developer

Group Health Cooperative Headquarters, WA (Healthcare)
03.2011 - 11.2011
  • Responsibilities:
  • Served as an ETL developer/data analyst in the deployment of Claims, Hospital Events, In-Patient Pharmacy, Hospital Billing, Professional Billing, and related subject areas to the data warehouse to ease access to patient claims data; to provide analytics, trending and comparison, reporting, and export capabilities that support business needs; and, above all, to increase data utilization and improve customer service and productivity. Primary responsibilities used Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support, and UNIX shell scripting, architecting and developing a custom ETL framework of over 144 processes in Oracle's native PL/SQL and loading the results into Teradata (see the BTEQ sketch after this role's environment line).
  • Environment: Informatica Power Center 8.6, Power Exchange 8.6, HIPAA (835/837 Institutional and Professional, inbound/outbound), web services, Business Objects 6.x, Oracle 11g/10g, PL/SQL, flat files, XML, COBOL, Teradata.
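
A minimal BTEQ sketch of the kind of Teradata load step described above; the logon string, databases, and columns are hypothetical.

    #!/bin/sh
    # Run one BTEQ step; tdpid, credentials, and table names are placeholders.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_password;
    INSERT INTO CLAIMS_DW.CLAIM_FACT (claim_id, member_id, service_dt, paid_amt)
    SELECT claim_id, member_id, service_dt, paid_amt
    FROM CLAIMS_STG.CLAIM_LANDING;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF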

Informatica Developer

Adecco Group of North America, FL (IT Staffing)
08.2010 - 03.2011
  • Responsibilities:
  • Served as an ETL Informatica developer: gathered requirements from end users and analyzed source systems, business requirements, and business rules. Designed and implemented Informatica mappings to migrate data from various legacy applications and acquired offices to a centralized application. Responsible for the analysis, design, and implementation of various back-office data marts (financial, payroll, benefits, and HR modules) using data modeling techniques and Informatica 7.6. Tuned mappings and sessions for better performance of the data loads.
  • Environment: Informatica Power Center (Designer 7.6, Repository Manager 7.6, Workflow Manager 7.6), Power Exchange, Business Objects XI/6.x, Oracle 11g/10g, PL/SQL, SQL Server 2008/2005, flat files, XML, TOAD, UNIX, Erwin 4.0, and shell scripting.

Education

Master's degree - Computer Science

University of Houston
Houston, Texas

Skills

  • ETL: Informatica Power Center 10.1/9.6/9.1/8.x/7.x, Informatica Cloud IICS/IDMC, Informatica IDQ/CDQ/CAI, Microsoft ADF, SAP BODS
  • BI Tools: Salesforce Wave Analytics, Business Objects, Tableau 9.x, Power BI
  • Big Data Ecosystems: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, Spring XD
  • Operating Systems: Windows 95/98/2000/2003 Server/NT Server and Workstation 4.0, UNIX
  • Programming: Java, R, Pig, Hive, C, SQL, PL/SQL, HTML, XML, DHTML
  • Other Tools: Eclipse, SQL*Plus, TOAD 8.0, MS Visio, Aginity, CA Workstation ESP
  • Scripting Languages: SQL, PL/SQL, UNIX shell scripting
  • Methodologies: Agile, E-R modeling, star schema, snowflake schema
  • Data Modeling Tool: Erwin 3.5/4.1
  • Databases: Oracle 11g/9i/8i/7.3, MS SQL Server 2008, DB2, Netezza, Sybase, Teradata, Azure Data Lake, Snowflake
