VENKATA BONAM

Summary

Skilled professional with over two decades of experience as a Developer, Lead, and Architect, specializing in the design and implementation of ETL solutions for Data Migration, Data Conversion, and Business Intelligence across Data Warehousing and Decision Support Systems. Proficient in Informatica Cloud (IICS) and multiple versions of Informatica PowerCenter, with deep experience building complex mappings using a wide range of transformations. Proven leader, skilled at guiding teams and fostering collaboration across onsite and offshore delivery models.

Overview

20 years of professional experience
6 years of post-secondary education

Work History

Data Architect

Conoco Phillips
01.2020 - Current
  • Designed a solution to identify cost spending across the Business Units of Alaska, L48, and Canada, and effectively categorized spend data
  • Constructed Informatica interfaces to facilitate the loading of cost settlement data from Teradata Analytics for SAP (TAS), supply chain, and invoice data from SCDW, as well as budget data from TAS services (SAP) into the Global Production Data Warehouse (GPDW)
  • Played a pivotal role as GPDW APP DB in creating target tables within the GPDW Teradata DB, spanning Stage 1, Stage 2, and Core T tables, and additionally built views in Core V for Spotfire access to enable advanced analytics
  • Developed comprehensive end-to-end support and training documentation for the CMS (Cost Management System) project
  • Actively participated in report analysis meetings with various Business Units (BUs), gathering new requirements to enhance decision-making processes
  • Utilized Informatica Intelligent Cloud Services (IICS) Cloud Application Integration (CAI) and Kafka for real-time extraction and integration of data from sensors monitoring heat, pressure, sulfur, and water/carbon levels in Alaska drilling infrastructure (see the sketch after this list)
  • This data was integrated into an application named Cygnet for further analysis and decision-making
  • Formulated migration roadmaps for transitioning Informatica PowerCenter jobs to IICS, leveraging Robotic Process Automation (RPA) extensively
  • Manually developed IICS mappings for PowerCenter mappings that RPA could not convert due to tool limitations, utilizing Stored Procedure, XML, Web Services, Java, and SAP transformations where applicable.
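
Illustrative sketch (not from the project): the real-time sensor feed above was built with IICS CAI and Kafka; the Python snippet below shows what the consuming side of such a feed might look like using the kafka-python client. The topic name, broker address, field names, and threshold are all assumptions.

    import json
    from kafka import KafkaConsumer  # kafka-python client

    # Hypothetical topic and broker; the real project consumed this feed via IICS CAI.
    consumer = KafkaConsumer(
        "drilling.sensor.readings",
        bootstrap_servers=["broker:9092"],
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="latest",
    )

    for message in consumer:
        reading = message.value
        # Flag readings above an assumed pressure threshold before handing the
        # record to the downstream application (Cygnet in the original project).
        if reading.get("pressure_psi", 0) > 5000:
            print("High-pressure alert:", reading)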

Solutions Architect

BP / Infosys
10.2018 - 12.2019
  • Coordinated with business users to gather PL/BS, operational stats, and trial balance reporting requirements as part of Project Phoenix to replace the current WR5 reporting system
  • Analyzed functional specs and identified the tables and relationships required for the new data model
  • Created technical specs and mapping specs in support of Informatica mapping development
  • Migrated SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks, and Azure SQL Data Warehouse; controlled and granted database access; and migrated on-premises databases to Azure Data Lake Store using Azure Data Factory
  • Delivered use-case development, workflow development, and data engineering work: built several new use cases, workflows, and ingestion pipelines, and migrated Informatica workflows and on-premises Netezza data to the Azure Synapse warehouse
  • Created mappings to load data from BHP S/4 HANA system to EDW staging in Azure
  • Created mappings with complex business logic to load data from stage into dimensions and facts in the Azure data lake
  • Deployed customized Python/Spark scripts and automated execution of use cases on Databricks and HDInsight clusters
  • Created reconciliation mappings to validate data loaded into the data lake against BHP data (see the reconciliation sketch after this list)
  • Created test plans and executed test scripts to validate accuracy of data.
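
A minimal PySpark sketch of the kind of reconciliation check mentioned above (the actual work was done as Informatica reconciliation mappings): compare row counts and a simple total between the source extract and the table loaded into the lake. The paths, table name, and amount column are assumptions.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("edw_reconciliation").getOrCreate()

    # Hypothetical locations of the source extract and the loaded stage table.
    source_df = spark.read.parquet("/mnt/raw/source_extract/")
    target_df = spark.table("edw_stage.gl_transactions")

    src = source_df.agg(F.count(F.lit(1)).alias("row_count"),
                        F.sum("amount").alias("total_amount"))
    tgt = target_df.agg(F.count(F.lit(1)).alias("row_count"),
                        F.sum("amount").alias("total_amount"))

    # A mismatch in either figure flags the load for investigation.
    print("source:", src.collect()[0])
    print("target:", tgt.collect()[0])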

Integration Lead/Analyst

Sysco Foods
04.2017 - 09.2018
  • Worked with source system owners in regions including Sweden, France, the UK, and Ireland to gather functional requirements, and was involved in designing high-level and low-level design documents
  • Involved in integration of various relational and non-relational sources such as DB2, Teradata 13.1, Oracle 9i, SFDC, Netezza, SQL Server, COBOL, XML and Flat Files
  • Conducted multiple meetings with source system owners and created functional and technical design documents and performed design documents reviews to ensure all functionality is captured
  • Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks (see the ingestion sketch after this list)
  • Created prototypes to extract the first set of Sales, GL, Assets, and HR data from source systems; thoroughly analyzed source data to identify required dimensions and attributes; and coordinated with data modelers in designing data models for consolidation and planning
  • Created source-to-target field mapping documents, along with a plan for implementing business rules in the various layers of the data flow, in support of the development team
  • Supported UT and UAT testing activities, including development of test scripts, planning of test script execution, and documentation of test results
  • Created and configured the Workday connector (Informatica Cloud Labs) for Workday integration in Informatica Cloud
  • Created mappings to load GL, Accounting and resource management data from Workday to SAP BPC
  • Developed complex mappings and coordinated with offshore team for Informatica code development
  • Actively involved in troubleshooting, testing and code reviews to ensure all development is coded as per the standards and best practices.
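
A minimal, illustrative PySpark sketch of the ingestion pattern noted above: read a regional flat-file extract from the data lake, add audit columns, and write it to a staging area. The storage paths, container names, and columns are assumptions, not project details.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("regional_sales_ingest").getOrCreate()

    # Hypothetical ADLS Gen2 paths for the raw extract and the staging area.
    raw = (spark.read
           .option("header", "true")
           .csv("abfss://raw@datalake.dfs.core.windows.net/sales/uk/"))

    staged = (raw
              .withColumn("load_date", F.current_date())   # audit column
              .withColumn("region", F.lit("UK")))

    staged.write.mode("overwrite").parquet(
        "abfss://stage@datalake.dfs.core.windows.net/sales/uk/")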

Technical Lead

EBay Inc. / PayPal
06.2015 - 04.2017
  • Participated in meetings with various application owners, gathered requirements, and translated business requirements into comprehensive solutions
  • Performed source data analysis to understand relationships and created mapping specs to support Informatica mapping development
  • Worked extensively on shell scripts for running SSIS programs in batch mode on UNIX, and created mappings using pushdown optimization to load data into Netezza with good performance
  • Developed high level functional and technical documents and provided ETL standard guidelines and best practices for development team
  • Created Informatica mappings to load master data from Oracle to SAP BW PSA
  • Created Informatica mappings to load Payment flow, BT cash data, Currency conversion data from SAP HANA DSL layer to SAP HANA DPL layer/DLL (SAP BA layer)
  • Created mappings to extract data from SAP ECC (tables & data sources) for GL, customer & Vendor master data
  • Created replication tasks for all lookup tables in Salesforce and loaded them to an on-premises DB to avoid repeated API calls (see the sketch after this list)
  • Exported Informatica PowerCenter mappings to the cloud environment (ICS), parsed PowerCenter connection properties, and created new connections in Informatica Cloud Services
  • Used Salesforce outbound messaging to trigger synchronization tasks and set the Salesforce batch size
  • Identified bottlenecks, fine-tuned mappings, and implemented pushdown optimization and partitioning for better performance
  • Analyzed BTEQ scripts for the logic used to load Golden Metrics reporting data into the EDW and implemented the same in the Informatica PFIT landscape
  • Managed onsite and offshore teams and coordinated development efforts across the project to meet deadlines
  • Provided guidelines for testing to maintain data quality and created unit test documentation for the objects developed
  • Performed detailed functional, technical, mapping and unit test documentation and deployment activities necessary for project
  • Extensively used PL/SQL to validate source data and identify relationships in support of data modeling
  • Created DDL scripts for DSL layer tables to load data from FETL tables, and developed SQL queries to analyze DSL layer data and identify the dimensions and facts required for dimensional modeling
  • Created complex SQL queries for data validation from the ultimate source to the target
  • Extensively used SQL for overriding Informatica SQ transformation for custom data selection.
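
Illustrative only: the lookup-table replication above was implemented with Informatica replication tasks; the sketch below shows the same idea with the simple-salesforce client, pulling one hypothetical lookup object in a single bulk query and caching it locally so repeated API calls are avoided. Object, field, and credential values are placeholders.

    import csv
    from simple_salesforce import Salesforce

    # Placeholder credentials; the real replication ran as an Informatica task.
    sf = Salesforce(username="user@example.com", password="***", security_token="***")

    # Hypothetical lookup object and fields.
    fields = ["Id", "Name", "Region__c"]
    records = sf.query_all("SELECT Id, Name, Region__c FROM Account")["records"]

    with open("account_lookup.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for rec in records:
            writer.writerow({k: rec.get(k) for k in fields})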

Informatica Lead

Idaho Power
08.2014 - 05.2015
  • Coordinated with data analysts and business users to create source-to-target mapping documents from SAP sources such as ABAP tables, SAP DataSources, SAP functions, and SAP BW (OHD)
  • Created reports for users in Tableau by connecting to various data sources (MS SQL Server, Oracle, MS Excel, Netezza, CSV)
  • Created Informatica mappings using ABAP type extraction to load data from SAP tables, and generated BCI mapping to load data from SAP business content to load to staging area
  • Created SCD type 2 mappings to load data from staging to foundation area using various transformations
  • Created reusable code to generate an MD5 hash to identify changes coming from the source (see the sketch after this list)
  • Created technical designs and unit test scripts and test documents
  • Participated in daily standups to track project/issues status
  • Worked with SAP ADMIN to identify authorization issues for BCI loads
  • Extensively used SQL overrides and lookup overrides in Informatica for custom data selection and applying business rules
  • Extensively used SQL Loader to load data from flat files into SQL Server database tables.
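
A minimal sketch of the MD5 change-detection idea above (the original was reusable Informatica logic): hash the delimited, concatenated values of the tracked columns on each incoming row and compare the result with the hash stored on the current target row. The column list and delimiter are assumptions.

    import hashlib

    TRACKED_COLUMNS = ["customer_id", "name", "address", "status"]  # hypothetical columns

    def row_hash(row: dict) -> str:
        """Return an MD5 hash over the tracked columns, using '|' as the delimiter."""
        payload = "|".join(str(row.get(col, "")) for col in TRACKED_COLUMNS)
        return hashlib.md5(payload.encode("utf-8")).hexdigest()

    def has_changed(incoming: dict, existing_hash: str) -> bool:
        """True when the incoming row differs from the stored version."""
        return row_hash(incoming) != existing_hash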

Tech Lead

Sony Electronics Limited
05.2013 - 07.2014
  • Worked with upstream teams to identify the list of fields that were logically or structurally impacted or required conversion
  • Created new ETL interfaces to load data from retiring applications, such as mainframe and Oracle apps, into the new SAP (DRP) system
  • Created new BAPI/IDoc interfaces to extract data from SAP BAPIs and IDocs
  • Worked closely with data modelers and created standards and guidelines for data modeling and ETL processes
  • Reviewed and approved functional specifications and technical designs for all impacted and new ETLs, and created UAT/SIT documents for them after the required changes were complete
  • Involved in designing the CDW (various levels of staging) from ultimate source to target and created standard guidelines for Informatica mappings
  • Designed and developed mappings to load data from Salesforce.com using Informatica PowerExchange for Salesforce
  • Extensively used Informatica metadata manager tool to identify the data flow using data lineages and created metadata manager services
  • Extensively used Teradata utilities and configured Informatica sessions for Teradata parallel processing
  • Configured sessions for Teradata parallel processing API targets and enabled session recovery for Teradata PT using stream mode
  • Used data lineages from metadata manager for auditing the flow of data from source to target
  • Extensively used Informatica metadata manager and business glossary to identify the impact analysis
  • Communicated to downstream systems about the impact analysis to handle the changes from reporting end
  • Worked extensively with offshore teams and evaluated timely changes to the code as part of the conversion
  • Developed and reviewed Informatica code from team members and approved it per the standards and guidelines document
  • Created SQL scripts to validate data flow between source and target.

Technical Lead

Devon Energy Corporation
08.2012 - 04.2013
  • Created Technical Designs & defined mapping, transformation rules as per the Functional Documents
  • Extensively used Informatica Mappings & Workflows
  • Designed & created complex Mappings using various Transformations like Lookup, Update Strategy, Expression, Aggregator, Normalizer, Union, SAP Functions, ABAP, IDOC interpreter, BAPI & SAP Application Source Qualifier
  • Developed mappings for inbound/outbound interfaces using IDocs and BAPIs, and set up Business Content mappings (i.e., Listener, Send Request, and Processing & Cleanup) to load into the EDW for the upstream Business Objects reporting system
  • Captured CDC from Salesforce objects based on the created date and last modified date fields
  • Configured sessions and workflows to process Salesforce data
  • Implemented an SCD Type 2 design for the CDW (see the sketch after this list)
  • Designed and developed the Landing and ODS framework and ETL using complex Oracle SQL queries and Informatica mappings with various transformations such as Filter, Router, Lookup, and Expression
  • Provided post-go-live support and troubleshooting for production jobs
  • Tuned Informatica, database, and reporting performance
  • Steps used to tune performance included indexes, Informatica and database partitioning, Explain Plan, effective query rewriting, and SQL hints
  • Created mappings to extract data from SAP APO/BI cubes/DSOs using OHS (Open Hub Services) and coordinated with the SAP BW team in configuring the process chain
  • Coordinated with SAP BW team in clearing logs and rerunning the SAP failed process chain
  • Created basic BTEQ (basic Teradata query) to generate keys
  • Created stored procedure transformation to call SQL procedures.
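
A compact PySpark illustration of the SCD Type 2 pattern referenced above (the original was implemented as Informatica mappings): detect changed rows, expire the current dimension versions, and prepare the new versions. Table, key, and attribute names are assumptions.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

    # Hypothetical dimension and staging tables.
    current = spark.table("cdw.dim_well").filter("is_current = 1")
    incoming = spark.table("stage.well")

    # Rows whose tracked attribute differs from the current dimension version.
    changed = (incoming.alias("s")
               .join(current.alias("d"), "well_id")
               .where(F.col("s.operator") != F.col("d.operator"))
               .select("s.*"))

    # Expire the existing versions of the changed rows ...
    expired = (current.join(changed.select("well_id"), "well_id")
               .withColumn("is_current", F.lit(0))
               .withColumn("end_date", F.current_date()))

    # ... and prepare the new current versions; both sets would then be written
    # back to the dimension (the original project did this inside Informatica).
    new_versions = (changed
                    .withColumn("is_current", F.lit(1))
                    .withColumn("start_date", F.current_date())
                    .withColumn("end_date", F.lit(None).cast("date")))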

Informatica BI Lead

Ericsson
03.2011 - 07.2012
  • Worked with Business Analyst to gather the requirements and created functional specifications
  • Coordinated with data modeler in designing the various schemas for Finance and planning and manufacturing modules
  • Worked in an onshore/offshore model and held daily meetings for coordination and planning
  • Handled client enquiries and prepared daily workloads for the team members
  • Created mapping documents and estimates for each mapping
  • Worked with offshore team and conducted daily meetings to get status update
  • Worked with SAP team to identify the tables, Functional modules and data sources
  • Created BCI and ABAP mappings to extract data from various data sources and tables
  • Used RFC/BAPI to load data into SAP
  • Extensively used various transformations such as Aggregator, Expression, connected and unconnected Lookups, and Update Strategy to load data into targets
  • Worked on performance issues and bottlenecks to reduce the load time and monitored the data loads
  • Created mappings to support delta loads into target
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data
  • Performed tuning and optimization of complex SQL queries using Teradata Explain
  • Created several custom tables, views and macros to support reporting and analytic requirements
  • Involved in collecting statistics on important tables so the Teradata Optimizer could produce better plans (see the sketch after this list).
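
Illustrative only: the statistics collection above was run directly against Teradata; the sketch below issues the same kind of statement through the teradatasql Python driver. The host, credentials, table, and column are placeholders.

    import teradatasql

    con = teradatasql.connect(host="td-prod.example.com", user="etl_user", password="***")
    cur = con.cursor()
    # Collect statistics on a frequently joined column so the optimizer builds better plans.
    cur.execute("COLLECT STATISTICS COLUMN (material_id) ON edw.fact_shipments")
    cur.close()
    con.close()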

Informatica Lead / Developer

Thomson Reuters
08.2010 - 01.2011
  • Extensively used Informatica PowerExchange to extract data from SAP ECC 6, SAP APO (SCM) 7.0, SAP BI 7.0, Oracle 10g, and SQL Server
  • Extracted data from standard and customized data sources using BCI Extraction
  • Created Send request/Process request/Listener mappings to load data from SAP data sources
  • Led the BI team and coordinated with the offshore Wipro team; analyzed individual team members' performance and motivated them to perform even better
  • Analyzed the assigned projects and distributed tasks to team members per their areas of expertise
  • Worked with the SAP team to identify the right data sources with respect to the functional design specs
  • Generated ABAP code for extracting data from SAP tables for stream and file modes using Informatica
  • Created SAP BW OHS (open hub services) source definitions in Informatica and configured work flow to run from SAP process chain
  • Extensively worked on debugging process chain failures and fixing them to rerun the process chain and invoke the Informatica workflow that loads the SAP InfoCube data
  • Extensively used a switch-synonym process to load data into load and report tables in the data mart (see the sketch after this list)
  • Created Informatica mappings using various transformations
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Created and managed daily, weekly cycle and monthly data operations, workflows and scheduling processes using Maestro
  • Involved in FDS reviews and designed the TDS and design standards.
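
A minimal sketch of the switch-synonym idea above, assuming an Oracle data mart: after the load table is refreshed, the reporting synonym is repointed so reports pick up the new data without downtime. Connection details and object names are placeholders.

    import cx_Oracle

    con = cx_Oracle.connect("etl_user", "***", "dm-host/DMPROD")  # placeholder DSN
    cur = con.cursor()
    # Repoint the reporting synonym at the freshly loaded table.
    cur.execute("CREATE OR REPLACE SYNONYM sales_report FOR sales_load_b")
    cur.close()
    con.close()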

BI Lead

Johnson & Johnson Health care systems
06.2009 - 07.2010
  • Extensively used Informatica Power exchange to extract data from SAP FICO, SAP SD modules
  • Worked with SAP ABAP team in extracting data from critical SAP tables like KONV using BAPI code
  • Involved in migrating ABAP code from Dev to QA and production
  • Created various transformations to move SAP data into Oracle tables
  • Analyzed complex customer business requirements in the areas of GL Account, Cost Center, and Profit Center Accounting and mapped them to standard SAP processes, solutions, and products within the project scope
  • Used Autosys to schedule workflows in production environment
  • Worked extensively with the data modeler in designing GL Account and Cost Center schemas
  • Developed detailed functional specifications, enhancements and modifications documentation and track development progress partnering with on-site ABAP team specialists
  • Offered solutions to the top management regarding project related queries
  • Participated in testing at all levels (unit, integration, and user testing); designed and drove testing initiatives; developed and executed test plans; provided leadership to others in a support role; and documented processes and trained end users on them.

Sr. Informatica Developer

Disney Studios & Home Entertainment
10.2008 - 04.2009
  • Extensively used Informatica Power Center mappings & workflows
  • Designed & created complex mappings using various transformations like Lookup, Update Strategy, Expression, Aggregator, Normalizer, Union, SAP Functions, ABAP, IDOC interpreter, BAPI & SAP Application Source Qualifier
  • Developed mappings for inbound/outbound interfaces using IDocs and BAPIs, and set up Business Content mappings (i.e., Listener, Send Request, and Processing & Cleanup) to load into the EDW for the upstream Business Objects reporting system
  • Captured CDC from Salesforce objects based on the created date and last modified date fields
  • Configured sessions and workflows to process Salesforce data
  • Extensively worked on Shell scripts for running SSIS programs in batch mode on UNIX
  • Created mappings using pushdown optimization to achieve good performance in loading data into Netezza
  • Developed high level functional and technical documents and provided ETL standard guidelines and best practices for development team
  • Created Informatica mappings to load master data from Oracle to SAP BW PSA
  • Created Informatica mappings to load Payment flow, BT cash data, Currency conversion data from SAP HANA DSL layer to SAP HANA DPL layer/DLL (SAP BA layer).

Informatica Consultant

CARLSON
01.2007 - 09.2008
  • Extensively used Informatica Power Center mappings & workflows
  • Designed & created complex mappings using various transformations like Lookup, Update Strategy, Expression, Aggregator, Normalizer, Union, SAP Functions, ABAP, IDOC interpreter, BAPI & SAP Application Source Qualifier
  • Developed mappings for inbound/outbound interfaces using IDocs and BAPIs, and set up Business Content mappings (i.e., Listener, Send Request, and Processing & Cleanup) to load into the EDW for the upstream Business Objects reporting system
  • Captured CDC from Salesforce objects based on the created date and last modified date fields
  • Configured sessions and workflows to process Salesforce data
  • Extensively worked on Shell scripts for running SSIS programs in batch mode on UNIX
  • Created mappings using pushdown optimization to achieve good performance in loading data into Netezza
  • Developed high level functional and technical documents and provided ETL standard guidelines and best practices for development team
  • Created Informatica mappings to load master data from Oracle to SAP BW PSA.

Informatica developer

Countrywide Insurance
12.2005 - 12.2006
  • Extensively used Informatica Power Center mappings & workflows
  • Designed & created complex mappings using various transformations like Lookup, Update Strategy, Expression, Aggregator, Normalizer, Union, SAP Functions, ABAP, IDOC interpreter, BAPI & SAP Application Source Qualifier
  • Developed mappings for inbound/outbound interfaces using IDocs and BAPIs, and set up Business Content mappings (i.e., Listener, Send Request, and Processing & Cleanup) to load into the EDW for the upstream Business Objects reporting system
  • Captured CDC from Salesforce objects based on the created date and last modified date fields
  • Configured sessions and workflows to process Salesforce data
  • Extensively worked on Shell scripts for running SSIS programs in batch mode on UNIX
  • Created mappings using pushdown optimization to achieve good performance in loading data into Netezza
  • Developed high level functional and technical documents and provided ETL standard guidelines and best practices for development team
  • Created Informatica mappings to load master data from Oracle to SAP BW PSA.

Informatica Consultant

Bear Stearns & Co
06.2004 - 11.2005
  • Extensively used Informatica Power Center mappings & workflows
  • Designed & created complex mappings using various transformations like Lookup, Update Strategy, Expression, Aggregator, Normalizer, Union, SAP Functions, ABAP, IDOC interpreter, BAPI & SAP Application Source Qualifier
  • Developed mappings for inbound/outbound interfaces using IDocs and BAPIs, and set up Business Content mappings (i.e., Listener, Send Request, and Processing & Cleanup) to load into the EDW for the upstream Business Objects reporting system
  • Captured CDC from Salesforce objects based on the created date and last modified date fields
  • Configured sessions and workflows to process Salesforce data
  • Extensively worked on Shell scripts for running SSIS programs in batch mode on UNIX
  • Created mappings using pushdown optimization to achieve good performance in loading data into Netezza
  • Developed high level functional and technical documents and provided ETL standard guidelines and best practices for development team
  • Created Informatica mappings to load master data from Oracle to SAP BW PSA.

Education

Master of Computer Applications (MCA)

Madras University
Chennai, India
05.2000 - 05.2003

Bachelor of Science (B.Sc.)

Andhra University
AP, India
08.1997 - 05.2000

Skills

Developer/Team Lead/Architect roles
