Akarsh Kapoor

Rocky Hill

Summary

  • A seasoned technology program and portfolio management professional with 18+ years of IT experience and technical proficiency in data architecture, data warehousing, and cloud technologies, combined with business/data requirements analysis, architecture and design, data modeling, development, and documentation.
  • Expertise in data architecture principles and standards across systems; leads development strategies for warehouse implementation and data acquisition, and manages enterprise data models, including the data dictionary/metadata registry.
  • Assists clients in their data modernization journeys and strategic needs by providing robust cloud architecture solutions on GCP.
  • Expertise in establishing and managing enterprise data standards and documenting all data architecture design and analysis work.
  • Strong data warehousing knowledge and experience in dimensional data modeling, star and snowflake schema modeling, and DW/BI tools including DataStage, Talend, Ab Initio, and Attunity.
  • Leveraged GCP and DataStage expertise to lead critical data integration projects, demonstrating strong leadership and stakeholder management skills; achieved seamless transitions, enhancing data quality and security across platforms.

Overview

18 years of professional experience
1 certification

Work History

Sr. Data Engineer

CVS Healthcare
11.2023 - Current

The client has initiated a cloud modernization initiative that includes the migration of multiple existing Hadoop applications to GCP. Four applications have been selected for this project and must be migrated within a defined timeframe.

Roles & Responsibilities

· Work with Hadoop application owners to understand the on-prem architecture and landscape.

· Work with solution engineers to create the tenancy to support the application migration.

· Design pipelines on GCS, Dataproc, Composer, and Python.

· Design pipelines to migrate one-time data from Hadoop to BigQuery.

· Lead daily standups, Scrum of Scrums, sprint planning, retrospectives, and ad-hoc meetings using Rally.

· Manage stakeholders, handle escalations, and present to senior leadership.
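The one-time Hadoop-to-BigQuery copy above can be sketched at its simplest as a path-mapping step plus a load command. This is an illustrative sketch only; the bucket, dataset, and table names are hypothetical, not from the actual project.

```python
# Hypothetical sketch of staging HDFS data in GCS and composing the
# BigQuery load step of a one-time migration. All names are illustrative.

def hdfs_to_gcs_uri(hdfs_path: str, bucket: str) -> str:
    """Map an HDFS path to the GCS staging URI used for the copy."""
    return f"gs://{bucket}/{hdfs_path.lstrip('/')}"

def bq_load_command(gcs_uri: str, dataset: str, table: str) -> str:
    """Compose the `bq load` command that loads the staged files."""
    return (f"bq load --source_format=PARQUET "
            f"{dataset}.{table} {gcs_uri}/*.parquet")

uri = hdfs_to_gcs_uri("/data/claims/2023", "staging-bucket")
cmd = bq_load_command(uri, "claims_ds", "claims_2023")
```

In practice such steps would run as Composer (Airflow) tasks on a schedule rather than as a standalone script.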

Technical Lead

CVS Healthcare
11.2017 - 10.2023

The client is migrating multiple Care Manager platforms to a third-party application. The client had multiple Care Manager systems and supporting systems, and this project involved migrating data incrementally by program, plan, state, etc. The biggest challenge was keeping both systems updated and live. The team was responsible for reading data from the source systems and transforming it into a format acceptable to the target system. The team currently handles more than 80 files (input + output) every day.
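The incremental, wave-based slicing described above (by program, plan, and state) can be sketched as a simple record filter. This is a minimal illustration; the field names are assumptions, not the project's actual schema.

```python
# Hypothetical sketch of incremental migration slicing: select only
# the records for the program/plan/state wave currently being cut
# over, so both systems can stay live during the transition.
# Field names are illustrative.

def select_wave(records, program, plan, state):
    """Return the subset of records belonging to one migration wave."""
    return [r for r in records
            if (r["program"], r["plan"], r["state"]) == (program, plan, state)]

records = [
    {"id": 1, "program": "LTC", "plan": "A", "state": "CT"},
    {"id": 2, "program": "LTC", "plan": "A", "state": "NY"},
    {"id": 3, "program": "BH",  "plan": "B", "state": "CT"},
]
wave = select_wave(records, "LTC", "A", "CT")
```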

Roles & Responsibilities

· Analyzed the business requirements, functional specifications, and technical specifications.

· Participated in discussions with architects and the business and provided input for the solution.

· Worked closely with business owners and architects to understand the business requirements and design the solution.

· Analyzed the existing source and target systems.

· Led the team, assigned tasks, and took care of all deliverables.

· Guided peers and reviewed their code.

· Analyzed jobs and tuned them as needed.

· Developed jobs using Parallel Extender v11.7 to achieve better performance and throughput, and developed strategies for extracting, transforming, and loading (ETL) data into the target database.

· Provided support during project implementation.

Technical Lead

UPS
05.2017 - 10.2017

The purpose of RASP is to receive financial data from source systems (DB2, SQL Server, and a Hadoop data lake) and process it for the downstream RA system for G/L entry. As part of this processing, the following operations must be performed:

Tagging

RASP-P must identify transactions per the given set of business rules and tag them as actual or accrual.

Reversal

RASP-P must identify transactions per the set of business rules and check whether a reversal is required to make the correct G/L entry.

Aggregation

RASP-P must aggregate incoming transactions from all sources at the 'trigger' level and send the aggregated data to the RA system.

Spreading

RASP-P must identify when spreading will occur and for which transactions (specific account-level accessorial and unclassified revenue), and supply the calculations that need to take place.

Along with each of the above processes, RASP must perform a balancing process to ensure that no financial data loss occurs, as this data is used to make G/L entries.
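The aggregation and balancing steps above can be sketched as a roll-up at the 'trigger' level followed by a totals check. This is an illustrative sketch under assumed field names (`trigger`, `amount`), not the project's actual implementation.

```python
# Illustrative sketch of the RASP-P aggregation step: roll up
# incoming transactions at the 'trigger' level before sending them
# to the downstream RA system. Field names are assumptions.
from collections import defaultdict

def aggregate_by_trigger(transactions):
    """Sum transaction amounts per trigger."""
    totals = defaultdict(float)
    for txn in transactions:
        totals[txn["trigger"]] += txn["amount"]
    return dict(totals)

txns = [
    {"trigger": "T1", "amount": 100.0},
    {"trigger": "T1", "amount": 50.0},
    {"trigger": "T2", "amount": 25.0},
]
agg = aggregate_by_trigger(txns)

# Balancing check: aggregated totals must equal the incoming sum,
# so no financial data is lost before the G/L entry.
assert sum(agg.values()) == sum(t["amount"] for t in txns)
```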

Roles & Responsibilities

· Analyzed the business requirements, functional specifications, and technical specifications.

· Analyzed the existing source and target systems.

· Developed jobs using Parallel Extender v11.5 to achieve better performance and throughput, and developed strategies for extracting, transforming, and loading (ETL) data into the target database.

· Analyzed jobs and tuned them as needed.

· Tested the developed code, fixed any defects, and retested before delivery to the client manager.

· Guided peers and reviewed their code.

· Provided support during project implementation.

Sr. Developer

UnitedHealthcare
01.2015 - 04.2017

The purpose of the 3Rs project is to meet the needs of the Healthcare Reform (PPACA) legislation, which requires insurers to administer reinsurance, risk adjustment, and risk corridor programs. We needed to allocate, calculate, store, and provide information to HHS and/or the states about these three programs for time periods ranging from three years to permanent, with data storage requirements of not less than 10 years for each program.

Roles & Responsibilities

· Analyzed the business requirements, functional specifications, and technical specifications.

· Analyzed the existing source and target systems.

· Developed jobs using Parallel Extender v8.7 to achieve better performance and throughput, and developed strategies for extracting, transforming, and loading (ETL) data into the target database.

· Analyzed jobs and tuned them as needed.

· Tested the developed code, fixed any defects, and retested before delivery to the client manager.

· Guided peers and reviewed their code.

· Provided support during project implementation.

Sr. Developer

At&T
03.2012 - 12.2014

This project provided a 360-degree, end-to-end view from sales to billing through a new online tool that enables stakeholders to see details for business customers across all strata. The tool provides a holistic view of the services customers have on their account(s) and a dashboard on work in progress, including completed work for the account(s) across provisioning, service assurance (maintenance), and billing functions. Previously, no tool was available to link customer accounts and products in a single view.

Roles & Responsibilities

· Analyzed the business requirements, functional specifications, and technical specifications.

· Analyzed the existing source and target systems.

· Developed jobs using Parallel Extender v8.7 to achieve better performance and throughput, and developed strategies for extracting, transforming, and loading (ETL) data into the target database.

· Guided peers and reviewed their code.

· Tested the developed code, fixed any defects, and retested before delivery to the client manager.

· Provided support during project implementation.

Sr. Developer

HSBC Technology & Services
09.2010 - 01.2012

The purpose of the Adjustments project is to provide functionality to allow sites to pass adjustments on a monthly basis at an optimum level of granularity for inclusion in MI and reports, and to allow users to see the impact of their adjustments in data analysis tools.

Adjustments undergo a validation process to ensure that the adjusted attributes are in line with the data held in GLEAM. The validation process applies when adjustments are uploaded via the Bulk Upload or Manual Input screens, or reworked as part of the rework process.

Roles & Responsibilities

· Analyzed the business requirements, functional specifications, and technical specifications.

· Analyzed the existing source and target systems.

· Developed jobs using Parallel Extender v8.1 to achieve better performance and throughput, and developed strategies for extracting, transforming, and loading (ETL) data into the target database.

· Tested the developed code, fixed any defects, and retested before delivery to the client manager.

· Worked on performance tuning of the jobs.

· Responsible for preparing design documents, test case specifications, performance reviews, and coding.

· Guided peers and reviewed their code.

· Provided support during project implementation.

Developer

UnitedHealth Group
02.2010 - 09.2010

The purpose of the Strategic Payment Program is to bring a strategic-level focus on payment capabilities and objectives by aligning all associated activities under a single program and vision.

Roles & Responsibilities

· Scoping and requirements study.

· Analyzed the business requirements, functional specifications, and technical specifications.

· Analyzed the existing source and target systems.

· Analyzed and developed DataStage jobs and fixed bugs.

· Provided support for ticket handling, including incident management and problem management.

Developer

Aetna
08.2006 - 01.2010

Aetna provides various products, such as Medicare and Medicaid. Medicare products are for people over 65 years of age. To provide this kind of insurance, Aetna must follow rules set by the U.S. government; one of those rules is MIPPA (Medicare Improvements for Patients and Providers Act).

Roles & Responsibilities

· Scoping and requirements study.

· Analyzed the business requirements, functional specifications, and technical specifications.

· Created various documents, including mapping, design, and understanding documents.

· Analyzed the existing source and target systems.

· Analyzed and developed DataStage jobs and fixed bugs.

· Took care of all quality-related and project management work.

Education

Master of Science - Computer

Vellore Institute of Technology
Vellore, Tamil Nadu

Bachelor of Science - Mathematics

M.J.P. Rohilkhand University
Bareilly, Uttar Pradesh

Skills

    DW & Analytics

  • Data Architecture
  • GCP
  • DataStage (7.x, 8.x, 11.x)
  • Oracle, SQL Server
  • DW-BI, DW Architecture
  • Dimensional Data Modeling
  • ETL Architecture
  • UNIX, Shell Scripting
  • Business Intelligence
  • Data Quality
  • Data Security
  • Attunity
  • Portfolio/Program Management

  • Agile/Scrum/SAFe
  • SDLC/Waterfall
  • Leadership
  • Stakeholder Management
  • Strategic Planning
  • Effort Estimation
  • Capability/COE Building
  • Application Management
  • Cloud Migration & Modernization

Additional Information

Sr. Manager Projects - Cognizant Technology Solutions US Corp. (Jan 2015 – Present)

Project Lead - Tech Mahindra Ltd (Feb-2012 – Jan-2015)

Sr. Software Engineer - HSBC Software Development India Pvt. Ltd (Sep-2010 – Feb-2012)

Sr. Software Engineer - United Health Group Ltd (Feb-2010 – Sep-2010)

Program Analyst - Cognizant Technology Solution India Pvt. Ltd (Aug-2006 – Feb-2010)

Technical Skills

ETL Tools: Ascential DataStage 7.5, DataStage 8.1, 8.7, 11.5, 11.7, Attunity Replicate, Talend, Ab Initio

Cloud Stack: GCP fundamentals, Azure fundamentals, Dataproc, BigQuery, GCS buckets, Composer

Databases: DB2, Oracle, MS SQL Server 2005, MS SQL Server 2016, Snowflake

Scripting Languages: Unix Shell, Python

Bigdata: Hadoop, Hive

Scheduling Tools: Zeke, AutoSys, Control-M, Tivoli (TWS)

Operating System: Windows XP, Windows 10, UNIX

Accomplishments

    · Active Member of Cognizant Attunity CoE (2018 – 2019)

    · Active Member of Cognizant IBM CoE (2019-2022)

Certification

· IBM DB2 8.1 Family Fundamentals external certification

· IBM DataStage V11.5.x Developer

· IBM DataStage V11.5.x Partitioning and Collecting

· Google Cloud Digital Leader training

· Azure AZ-900 Fundamentals

· GCP Professional Cloud Architect (Udemy)

· Hive certification (Udemy)

· Ab Initio training (Cognizant Academy)
