Overall 13 years of extensive experience in the IT industry in the data warehousing stream, covering design, development, implementation, migration and testing of large data warehouse projects.
Around 9 years of experience as an ETL Developer providing Business Intelligence solutions in data warehousing by implementing ETL processes with Ab Initio.
Analysis and development of data warehousing solutions, building graphs for Extraction, Transformation and Loading (ETL) with the help of Ab Initio.
Around 2 years of experience in Snowflake cloud development.
Good experience in core Java and advanced Java concepts. Implemented SCD Type 2 CDC (Change Data Capture) in data integration, as sketched below.
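A minimal sketch of that SCD Type 2 pattern (the staging/dimension tables, columns and connection variables below are hypothetical placeholders, not the actual project schema): expire the current row when a change is detected, then insert the new version.

    #!/bin/sh
    # Hedged SCD Type 2 sketch; CUST_DIM/CUST_STG and all columns are illustrative.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    -- Step 1: close out current rows whose attributes changed in staging
    UPDATE cust_dim d
       SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1 FROM cust_stg s
                    WHERE s.cust_id  = d.cust_id
                      AND s.row_hash <> d.row_hash);
    -- Step 2: insert a new current version for changed and brand-new keys
    INSERT INTO cust_dim
          (cust_id, cust_name, row_hash, eff_start_dt, eff_end_dt, current_flg)
    SELECT s.cust_id, s.cust_name, s.row_hash, TRUNC(SYSDATE), NULL, 'Y'
      FROM cust_stg s
     WHERE NOT EXISTS (SELECT 1 FROM cust_dim d
                        WHERE d.cust_id = s.cust_id AND d.current_flg = 'Y');
    COMMIT;
    EOF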
Expertise with most of the Ab Initio components, including Database, Dataset, Partition, De-partition, Sort and Transform components, as well as fact table code covering aggregated fact table jobs built with Reformat, Rollup, Scan, Join, Partition by Key, Partition by Round-robin, Gather, Merge, Sort, Lookup, etc.
Worked on Ab Initio Conduct>It, which orchestrates Ab Initio graphs using plans and subplans, and on PDL for generic data transformation processes; good experience with Express>It for writing business rules.
Worked on advanced Ab Initio concepts such as metaprogramming and PDL to handle dynamic inputs and create metadata on the fly.
Wrote UNIX shell deployment scripts (sketched below) to deploy Ab Initio graphs to the various testing environments and to production.
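A minimal sketch of such a deployment wrapper, assuming EME-based promotion with air object save/load (repository paths, environment names and the save-file convention are assumptions, and exact air options vary by Co>Op version):

    #!/bin/sh
    # Hedged promotion sketch; all paths are placeholders.
    RPATH=${1:?usage: promote.sh <eme_object_path> <dev|uat|prod>}
    TARGET=${2:?missing target environment}
    SAVE=/tmp/$(basename "$RPATH").save

    air object save "$SAVE" "$RPATH"     # export the object tree from the source EME
    AB_AIR_ROOT=/eme/$TARGET/repo \
        air object load "$SAVE"          # import it into the target environment's EME
    echo "Promoted $RPATH to $TARGET"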
Experienced in designing scalable architectures addressing parallelism (pipeline, component and data parallelism), data integration, ETL, data repositories and analytics using the Ab Initio suite; built Ab Initio applications in the GDE, such as batch graphs to create load files and web service graphs to send responses to SOAP and REST calls.
Built Ab Initio batch graphs to load Oracle tables and create ICFF lookup files.
Creation of Autosys and crontab schedules and jobs for batch processing and application deployments.
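For instance, a typical crontab entry for a nightly batch window looks like the sketch below (the script path and schedule are placeholders; the Autosys equivalent would be an insert_job JIL definition pointing at the same wrapper script):

    # minute hour day-of-month month day-of-week  command
    30 2 * * * /app/etl/bin/run_daily_load.sh >> /app/etl/logs/daily_load.log 2>&1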
Involved in DevOps migration/automation processes for build and deploy systems.
Worked with GitHub Enterprise to manage source code repositories and performed branching, merging and tagging as required.
Utilized UNIX, PL/SQL, shell scripting, air commands and Oracle in application development, and created Autosys jobs to schedule work in UAT and production.
Developed complex database objects such as stored procedures, functions, packages and triggers using SQL and PL/SQL.
Developed materialized views for data replication in distributed environments; created packages and procedures to automatically drop and recreate table indexes, as sketched below.
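A minimal sketch of that index-maintenance pattern, assuming Oracle and hypothetical object names: drop the target table's indexes ahead of a bulk load, then rebuild them afterwards.

    #!/bin/sh
    # Hedged sketch; SALES_FACT and the rebuild DDL are placeholders.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    BEGIN
      FOR ix IN (SELECT index_name FROM user_indexes
                  WHERE table_name = 'SALES_FACT') LOOP
        EXECUTE IMMEDIATE 'DROP INDEX ' || ix.index_name;
      END LOOP;
    END;
    /
    -- ... bulk load runs here, then indexes are recreated ...
    CREATE INDEX sales_fact_dt_ix ON sales_fact (txn_dt);
    EOF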
Created shell scripts to invoke SQL scripts and scheduled them using crontab and Control Center. Analyzed requirements/user stories to come up with technical designs; able to quickly grasp new concepts and apply software methodologies as per business needs.
Worked with Database Administrators, Data Modelers and Business Analysts on solution design and implementation.
Excellent communication skills and experience working on Agile/Scrum teams.
Good working knowledge of Autosys, UNIX shell scripts, Oracle DB, SQL Server DB and PL/SQL.
Worked with Release Management on production implementation planning and supported release activities: ETL code development, release notes, unit test cases and results, and runbooks/user guides.
Also handled deployment guides, code review results, tag requests and approvals, and the requirements traceability matrix; worked with scrum masters on capacity planning.
Overview
13 years of professional experience
Work History
Senior (ETL) Ab-Initio Developer
ConocoPhillips
Houston, USA
- Current
Analyzing business requirements and working closely with the business team and team leads
Developed graphs for extracting, transforming and loading data into data marts/the data warehouse using Ab Initio for the vision application
Used EME for version control via check-in/check-out and performed admin activities
Designed reusable routines and shell scripts for file validations
Reading and writing output files to an AWS S3 bucket
Worked with the Java team to understand the application and its functionality
Creating JSON and XML files as inputs or outputs of the ETL process
Used performance tuning techniques for ETL graphs
Migrated data into Salesforce objects (Accounts, Opportunities, OppLineItems, Users, Products, etc.)
Worked extensively on air commands, parallelism, multifile systems and PDL
Worked extensively on mapping data from one format to different formats
Developed UNIX shell scripts to run pre/post processes, automated the graph-run scripts and cleaned up disk space
Working extensively on enhancements and changes to existing solutions
Evaluated business and technical alternatives, recommended solutions and participated in their implementation
Documenting the ETL process flow for better maintenance and analyzing the process flow
Working on production issues, fixing them and documenting for future reference
Used various Ab Initio components such as Join, Reformat, Merge, Aggregate, Sort, Lookup, etc
Worked extensively on advanced Ab Initio concepts such as metaprogramming and PDL interpretation to create the DML files that read the source files and to handle dynamic inputs, as sketched below
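A minimal sketch of the metadata-on-the-fly idea, assuming a comma-separated header row and hypothetical paths: derive a pipe-delimited DML record format from the feed's header so a PDL-parameterized generic graph can read it.

    #!/bin/sh
    # Hedged sketch: build a .dml record format from a feed's header row.
    head -1 /data/in/feed.csv | tr ',' '\n' | awk '
      BEGIN { print "record" }
      { f[NR] = $0 }
      END {
        for (i = 1; i < NR; i++) printf "  string(\"|\") %s;\n", f[i]
        printf "  string(\"\\n\") %s;\nend;\n", f[NR]
      }' > /data/dml/feed.dml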
Created assets and generic graphs to load data from Teradata to Snowflake as part of the Teradata cloud migration; analyzed table migration reports provided by the business, migrated the tables and recreated Teradata SQL as Snowflake SQL (illustrated below)
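An illustrative (hypothetical) example of the kind of rewrite involved: Teradata-only syntax such as SEL or INDEX() has to become standard Snowflake SQL.

    #!/bin/sh
    # Teradata original (sketch):  SEL cust_id, INDEX(email, '@') FROM edw.cust;
    # Snowflake rewrite, run via snowsql; connection and schema are placeholders.
    snowsql -c edw_conn -q \
      "SELECT cust_id, POSITION('@', email) AS at_pos FROM edw.cust;"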
Used the Ab Initio GDE to build graphs for ETL processes such as migrating static tables to Snowflake
Worked with Data Quality Engineering team to set up the data quality checks to identify flawed data
Developed graphs for the ETL processes using Join, Rollup, Scan, Normalize, Denormalize and Reformat transform components
Backfilling tables in Snowflake after migrating them from Teradata, as sketched below
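A minimal sketch of such a backfill step, assuming the Teradata unloads were staged as pipe-delimited files (connection, stage and table names are placeholders for the real migration assets):

    #!/bin/sh
    # Hedged Snowflake backfill sketch.
    snowsql -c edw_conn -q "
      COPY INTO edw.sales_fact
      FROM @edw.td_unload_stage/sales_fact/
      FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';"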
Supporting production installs and monitoring after the installations
Working in an Agile methodology and an onsite-offshore model
Created load files and web service graphs to send responses to SOAP and REST calls
Generated the authentication token by calling the token API URL with the provided credentials via the Call Web Service component, then used that token against the actual API path to load or unload the data or file (see the curl sketch below)
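Outside the graph, the same two-step pattern looks roughly like this curl sketch (the token endpoint, payload fields and Bearer scheme are assumptions about the service contract, not the actual API):

    #!/bin/sh
    # Hedged token-then-call sketch; all URLs and field names are placeholders.
    TOKEN=$(curl -s -X POST "https://api.example.com/oauth/token" \
              -d "grant_type=client_credentials" \
              -d "client_id=$CLIENT_ID" -d "client_secret=$CLIENT_SECRET" |
            sed -n 's/.*"access_token" *: *"\([^"]*\)".*/\1/p')

    # Use the token against the actual API path that loads/unloads the data
    curl -s -X POST "https://api.example.com/v1/loads" \
         -H "Authorization: Bearer $TOKEN" \
         -H "Content-Type: application/json" \
         --data @payload.json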
Conducted code walkthroughs with the team members and reviewed the unit test results with the business.
Ab Initio Developer
Vodafone
Berkshire, United Kingdom
01.2019 - 03.2023
Analyzed business requirements and worked closely with the business team and team leads
Developed graphs for extracting, transforming and loading data into data marts/the data warehouse using Ab Initio for the vision application
Used EME for version control via check-in/check-out and performed admin activities
Designed reusable routines and shell scripts for file validations
Used performance tuning techniques for ETL graphs
Created load files and web service graphs to send responses to SOAP and REST calls
Generated the authentication token by calling the token API URL with the provided credentials via the Call Web Service component, then used that token against the actual API path to load or unload the data or file
Migrated data into Salesforce objects (Accounts, Opportunities, OppLineItems, Users, Products, etc.)
Worked extensively on air commands, parallelism, multifile systems and PDL
Worked extensively on mapping data from one format to different formats
Developed UNIX shell scripts to run pre/post processes, automated the graph-run scripts and cleaned up disk space
Worked extensively on enhancements and changes to existing solutions
Evaluated business and technical alternatives, recommended solutions and participated in their implementation
Documented the ETL process flow for better maintenance and analyzed the process flow
Worked on production issues, fixing them and documenting the fixes for future reference
Used various Ab Initio components such as Join, Reformat, Merge, Aggregate, Sort, Lookup, etc
Extensively worked with files and Oracle DB
Created assets and generic graphs to load data from Teradata to Snowflake as part of the Teradata cloud migration; analyzed table migration reports provided by the business, migrated the tables and recreated Teradata SQL as Snowflake SQL
Used the Ab Initio GDE to build graphs for ETL processes such as migrating static tables to Snowflake
Worked with Data Quality Engineering team to set up the data quality checks to identify flawed data
Developed graphs for the ETL processes using Join, Rollup, Scan, Normalize, Denormalize and Reformat transform components
Backfilled tables in Snowflake after migrating them from Teradata
Supported production installs and post-install monitoring
Worked in an Agile methodology and an onsite-offshore model
Conducted code walkthroughs with the team members and reviewed the unit test results with the business.
Ab Initio Developer
Vodafone
Pune, India
06.2018 - 12.2018
Cleansed data from the source dataset using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression and Rollup
Worked with Departition components such as Concatenate, Gather and Interleave
Used the multifile system, where data is partitioned for parallel processing
Deployed Ab Initio graphs to production, created jobs and monitored them
Database experience using Teradata
Used IBM DataStage Designer on the FSR ETL application to develop jobs for extracting, cleansing, transforming and loading data into data marts/the data warehouse
Used various DataStage Designer stages such as Join, Merge, Filter, Copy, Aggregator, Transformer, Sort, Data Set, Funnel, Lookup, Remove Duplicates, Modify, Change Capture, Change Apply, Surrogate Key, Column Generator, Row Generator, etc
Worked with Join, Look up (Normal and Sparse) and Merge stages
Worked with sequential file, dataset, file set and lookup file set stages
Created shell scripts to perform validations and run jobs on different instances
Scheduled and monitored DataStage jobs, analyzed the performance of individual stages and ran multiple instances of a job using DataStage Director
Supported production on a 24/7 basis
Actively interacted with BAs to understand the requirement specifications for the ETL process and the end-to-end flow
Wrote SQL queries to optimize data fetching from the different source tables/files
Processed and transformed customer data feeds that arrive on a daily basis
Wrote UNIX shell scripts to archive old customer data files and scheduled the job, as sketched below
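A minimal sketch of such an archive job (retention windows and the directory layout are placeholders):

    #!/bin/sh
    # Hedged archive sketch: compress week-old customer files, then move
    # anything older than 90 days into the archive area.
    SRC=/data/in/customers
    ARCH=/data/archive/customers
    find "$SRC" -name '*.dat'    -mtime +7  -exec gzip {} \;
    find "$SRC" -name '*.dat.gz' -mtime +90 -exec mv {} "$ARCH"/ \;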
Migrated scripts/graphs from DEV to the UAT environment for testing and validation
Worked with the business to translate business requirements into high-level design, detailed-level design and functional code.
Ab Initio Developer
Vodafone
Pune, India
01.2015 - 05.2018
Cleansed data from the source dataset using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression and Rollup
Worked with Departition components such as Concatenate, Gather and Interleave
Used the multifile system, where data is partitioned for parallel processing
Deployed Ab Initio graphs to production, created jobs and monitored them
Database experience using Teradata
Used IBM DataStage Designer on the FSR ETL application to develop jobs for extracting, cleansing, transforming and loading data into data marts/the data warehouse
Used various DataStage Designer stages such as Join, Merge, Filter, Copy, Aggregator, Transformer, Sort, Data Set, Funnel, Lookup, Remove Duplicates, Modify, Change Capture, Change Apply, Surrogate Key, Column Generator, Row Generator, etc
Worked with Join, Look up (Normal and Sparse) and Merge stages
Worked with sequential file, dataset, file set and lookup file set stages
Created shell scripts to perform validations and run jobs on different instances
Scheduled and monitored DataStage jobs, analyzed the performance of individual stages and ran multiple instances of a job using DataStage Director
Supported production on a 24/7 basis
Actively interacted with BAs to understand the requirement specifications for the ETL process and the end-to-end flow
Wrote SQL queries to optimize data fetching from the different source tables/files
Processed and transformed customer data feeds that arrive on a daily basis
Wrote UNIX shell scripts to archive old customer data files and scheduled the job
Migrated scripts/graphs from DEV to the UAT environment for testing and validation
Worked with the business to translate business requirements into high-level design, detailed-level design and functional code.
Ab Initio Developer
CITI
Tampa, USA
09.2013 - 12.2014
Citibank created a consolidated contracts and positions data warehouse project called OPTIMA to support BASEL II reporting and management reporting
Analyzed business requirements and worked closely with the business team and team leads
Developed graphs for extracting, transforming and loading data into data marts/the data warehouse using Ab Initio for the vision application
Used EME for version control via check-in/check-out and performed admin activities
Designed reusable routines and shell scripts for file validations
Used performance tuning techniques for ETL graphs
Migrated data into Oracle objects
Worked extensively on air commands, parallelism, multifile systems and PDL
Worked extensively on mapping data from one format to different formats
Developed UNIX shell scripts to run pre/post processes, automated the graph-run scripts and cleaned up disk space
Worked extensively on enhancements and changes to existing solutions
Evaluated business and technical alternatives, recommended solutions and participated in their implementation
Documented the ETL process flow for better maintenance and analyzed the process flow
Worked on production issues, fixing them and documenting the fixes for future reference
Used various Ab Initio components such as Join, Reformat, Merge, Aggregate, Sort, Lookup, etc
Extensively worked with files and Oracle DB
Created assets and generic graphs to load data from Teradata to Snowflake as part of the Teradata cloud migration; analyzed table migration reports provided by the business, migrated the tables and recreated Teradata SQL as Snowflake SQL
Used the Ab Initio GDE to build graphs for ETL processes such as migrating static tables to Snowflake
Worked with Data Quality Engineering team to set up the data quality checks to identify flawed data
Developed graphs for the ETL processes using Join, Rollup, Scan, Normalize, Denormalize and Reformat transform components
Backfilled tables in Snowflake after migrating them from Teradata
Supported production installs and post-install monitoring
Worked in an Agile methodology and an onsite-offshore model
Conducted code walkthroughs with the team members and reviewed the unit test results with the business.
Ab Initio Developer
Citi
Kolkata, India
07.2010 - 08.2013
Cleansed data from the source dataset using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression and Rollup
Worked with Departition components such as Concatenate, Gather and Interleave
Used the multifile system, where data is partitioned for parallel processing
Deployed Ab Initio graphs to production, created jobs and monitored them
Database experience using Teradata
Used IBM DataStage Designer on the FSR ETL application to develop jobs for extracting, cleansing, transforming and loading data into data marts/the data warehouse
Used various DataStage Designer stages such as Join, Merge, Filter, Copy, Aggregator, Transformer, Sort, Data Set, Funnel, Lookup, Remove Duplicates, Modify, Change Capture, Change Apply, Surrogate Key, Column Generator, Row Generator, etc
Worked with Join, Look up (Normal and Sparse) and Merge stages
Worked with sequential file, dataset, file set and lookup file set stages
Created shell scripts to perform validations and run jobs on different instances
Scheduled and monitored DataStage jobs, analyzed the performance of individual stages and ran multiple instances of a job using DataStage Director
Supported production on a 24/7 basis
Actively interacted with BAs to understand the requirement specifications for the ETL process and the end-to-end flow
Wrote SQL queries to optimize data fetching from the different source tables/files
Processed and transformed customer data feeds that arrive on a daily basis
Wrote UNIX shell scripts to archive old customer data files and scheduled the job
Migrated scripts/graphs from DEV to the UAT environment for testing and validation
Worked with the business to translate business requirements into high-level design, detailed-level design and functional code.
Level-2 and Level-3 Support Analyst
CITI
Tampa, USA
Optima provides a competitive edge to business leaders and government in fulfilling their missions
It acts as a solution partner that adds value in their pursuit of competitive advantage, productivity and profitability
The bank relies on knowledge and skills to meet clients' solution needs
The merchant bank is geared towards providing a full range of investment banking services
The project aims to give intelligent support by providing information about the bank's business
It helps analyze the bank's business by area, i.e., nation-wise, region-wise, city-wise and branch-wise
Responsible for the stability of the environment and resolution of Level 3 Ab Initio issues
Worked with the Ab Initio vendor on any issues that could not be resolved internally
Worked with application teams to help resolve Ab Initio-related issues
Worked on strategic items and developed/performed POCs with the latest technology as per requirements
Worked on Ab Initio code releases and code promotion from the development environment to the test and production environments
Set up projects for new project requirements and created the required data directories
Designed, developed and supported scalable, reusable solutions
Interacted with multiple partners, developing and enforcing standards
Performed Ab Initio server key renewals for the Ab Initio servers; knowledgeable in GDE/server keys
Experience in Ab Initio BIOS key setup for Ab Initio servers and in installing Ab Initio server software
Experience in Ab Initio server OS upgrade activities
Experience in using scripts as well as Ab Initio air commands for the code promotion process
Maintenance of EME technical repository
Knowledge of Ab Initio environment parameters
Experience with UNIX commands, shell scripting, air commands and m commands
Proficient with multifile system techniques; good knowledge of air commands and other EME-related operations (examples below)
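A few hedged examples of the day-to-day commands referred to above (multifile paths, project paths and the tag name are placeholders, and exact options vary by Co>Op version):

    #!/bin/sh
    m_ls /mfs/prod/data                                  # list a multifile directory
    m_cp /mfs/prod/data/cust.dat /mfs/qa/data/cust.dat   # copy across the MFS
    air tag create REL_2024_01 /Projects/dw/mp/daily_load.mp   # tag objects for promotion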
Experience in Ab Initio continuity-of-business testing
Ability to understand business drivers and balance business needs, timelines, costs and best practices
Familiar with Linux shell commands for day-to-day operations; working knowledge of shell scripting
Team player, self-motivated, very resourceful and requiring minimal direction, able to work effectively either independently or highly collaboratively with peers.
Cognos, Java Support Analyst
Dynpro India
India
Videocon Industries Ltd was one of the first Indian companies to make it to the world market
Videocon electricals captured the early Indian electrical market and topped the charts for its products
Designed, implemented and maintained Java application phases
Took part in software and architectural development activities
Conducted software analysis, programming, testing and debugging
Identified production and non-production application issues
Recommended changes to improve the established Java application process
Developed application code for Java programs.
Education
MCA
Osmania University
01.2006
BCA
Osmania University
01.2003
Skills
ETL: Ab-Initio GDE 4033, Ab-Initio Co>Op 4034
Programming skills: UNIX, shell scripting, core and advanced Java
Database and DB tools: Oracle, Teradata, SQL Developer, Snowflake
Cloud CRM product: Salesforce Lightning
IT trackers: BMC Remedy, ServiceNow, SVN
DevOps tools: Jira, GitHub, Confluence, Jenkins, JUnit, SVN