Anitha Marri

Lebanon, OH

Summary

ETL QA Analyst/Lead with 15+ years of experience across the Auto, Healthcare, Manufacturing, Retail, State Benefits, and Workers' Compensation domains. Led validation for many data warehouse migration projects, including cloud migrations, and has strong knowledge of RDBMS concepts and data warehousing techniques: ETL processes, data modeling, and Business Intelligence (BI) concepts. Holds strong experience with QA processes in both Agile and Waterfall methodologies, with proven success, excellent attention to detail, and an ability to work meticulously and methodically under pressure.

Overview

18
years of professional experience

Work History

ETL QA Lead (Projects)

Cognizant (Client – Albertson’s)
07.2022 - Current
  • Retail Cockpit is a one-stop analytics solution for Retail Operations. The application provides store users with critical information from many aspects of the operation, enabling them to run successful stores. It consolidates essential metrics, insights, and information into a single location that is both understandable and accessible for stores, allowing store users to track their progress toward their goals.
  • Led all major production implementation projects – testing approach, estimations, bug tracking, post-production smoke testing, and production tickets.
  • Expert in ETL validations across layers: Source (vendor files / OLTP tables / others) → BIM (stage) → Conformed (enterprise layer) → Analytical layer (OLAP Retail BOD); an illustrative reconciliation query appears after this role's highlights.
  • Responsible for validating reporting functionality for business compliance as well as cosmetics across all the Retail apps (Store, Backstage, Associate).
  • Worked hand-in-hand with developers analyzing the ETL code and stored procedures during bug resolution.
  • Hands-on experience with GitHub Copilot for SQL recommendations on the GCP migration project.
  • Created validation reports using Power BI – automated daily data quality checks between the source and the analytical view for Sales reporting.
  • Single-handedly led cloud migration validation of all fact tables, dimension tables, and custom Power BI queries across all the ETL pipelines and report modules:

1. EDW to EDM Snowflake cloud migration.

2. Snowflake to GCP Cloud Migration.

3. Exadata Migration to New Architecture.

  • Worked on many sub-projects within Retail, successfully validating data as well as Power BI reports:

1. Department structure changes to ORCA hierarchy from source.

2. Keeper logic implementation.

3. Modernization of ETL architecture for Sales module.

4. Several new business initiatives – NPS, Produce Scorecard, Health of the Store.

  • Helped reduce the QA cost to 45% by automating all the data validations in the ICDEQ automation tool for migration projects.
  • Orchestrated and ran the automated ICDEQ rules, which helped proactively identify bugs and load errors during maintenance and regression validations.
  • Worked with the admin team in creating SOWs and generating the quarterly budget required for every team.
  • Responsible for documenting, implementing, monitoring, and enforcing QA processes and best practices for quality and timely delivery across engagements.
  • Extensively used JIRA with Scrum methodology for epics, stories, sprint grooming, sprint planning, daily standup calls, and retrospectives.
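
A minimal sketch of the source-to-target reconciliation pattern behind these migration validations – the kind of query automated in ICDEQ. All table and column names here are hypothetical, not the actual Albertson's schema:

    -- Row-count and aggregate reconciliation between the legacy EDW table
    -- and its migrated Snowflake counterpart (illustrative names only).
    SELECT 'EDW'           AS side,
           COUNT(*)        AS row_cnt,
           SUM(sales_amt)  AS total_sales
    FROM   edw.sales_fact
    WHERE  business_date = CURRENT_DATE - 1
    UNION ALL
    SELECT 'SNOWFLAKE',
           COUNT(*),
           SUM(sales_amt)
    FROM   edm.sales_fact
    WHERE  business_date = CURRENT_DATE - 1;
    -- The two result rows should match exactly; any difference is raised as a defect.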

QA Onsite Manager

Client – Kellogg’s
06.2021 - 06.2022
  • Worked as test lead for Kellogg's Kortex program on the PILOT and Master Data application migration tracks, effectively managing day-to-day QA activities and challenges.
  • As an on-site lead, ensured active and seamless communication between the offshore project team, the leadership team, and the customer. Actively participated in project team and client meetings, publishing timely QA project status, highlighting issues, blockers, and challenges encountered in the QA approach, and bringing those problem statements to closure.
  • Collaborated with the project team and customer in presenting the test plan and test approach documents for the application migration track, E2E project demos, the testing approach for bringing Tivoli automation scheduling into the QA environment, and production data copy challenges in the QA environment.
  • Worked extensively with the QA team to develop the testing artifacts (test cases and scripts) required for validating the E2E data lineage across all regions > domains > data sets.
  • Involved in developing the test plan and approach for the application migration track to cover:

Parallel validation between the existing Keystone platform and Kortex applications for PILOT.

Linear validation to ensure data correctness from the multiple source systems to the data lake for Master Data.

  • Performed E2E data validation to ensure data correctness across the various layers of the PILOT Kortex application for the migrated data sets (full load / CDC) – see the sample comparison query after this role's highlights:

1. Validation of data ingestion from multiple source systems to the S3 raw and landing zone.

2. Validation of data from the S3 landing layer to the transformed layer (Redshift data lake).

3. Validation of data from the transformed layer to the consumption/semantic layer.

4. Parallel validation – data correctness of production-migrated data in the AWS consumption layer vs. the existing Keystone SQL Server database.

5. Parallel validation of Tableau reports between the existing Keystone and Kortex applications.

  • Organized & managed the Daily QA team calls for covering the status reporting, QA tasks hurdles & solutions.
  • Assisted in leveraging the TOSCA Automation tool to cover the Data Lineage use cases. All the Test cases for performing E2E data validations have been automated. using TOSCA.
  • Running QA triage meeting with project team to discuss open defects and issues during the execution phase of project. Used Octane as Test Management Tool for Test cases repository, Test Executions, Defect Management & Reports generation.
  • Guided the team in build reusable testing framework templates & streamlined the process of Data Warehouse testing, applying best practices.
  • Supported the UAT phase of the project in provide walkthrough of data validation steps E2E & providing the queries raised by customer.
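
A minimal sketch of the parallel-validation pattern described above, comparing a migrated consumption-layer table against its legacy Keystone counterpart. Table and column names are hypothetical; in practice the two sides live in different databases, so the comparison runs over extracts or linked connections:

    -- Rows present in the legacy table but missing (or different) in the target.
    SELECT customer_id, order_id, order_amt
    FROM   keystone.orders
    EXCEPT
    SELECT customer_id, order_id, order_amt
    FROM   kortex.orders;

    -- Repeat in the opposite direction to catch extra rows in the target.
    SELECT customer_id, order_id, order_amt
    FROM   kortex.orders
    EXCEPT
    SELECT customer_id, order_id, order_amt
    FROM   keystone.orders;
    -- Both queries should return zero rows for a clean migration.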

Sr. ETL Test Lead

AF Group
05.2014 - 06.2021
  • Experienced and knowledgeable in the Workers' Compensation insurance life cycle across Agency, Policy, Claims, Commissions, Dividends, and Audits; involved in data analysis for multiple source and target systems.
  • Worked in an iterative Agile SAFe/Scrum methodology for rapidly changing or emergent requirements, following parallel development and testing through daily standup calls, sprint planning, execution, retrospectives, and sprint reviews.
  • Worked closely with the Product Owner, Test Advisor, and developers in analyzing functional design specifications and the source-to-target mapping document (STTM) to understand the business process and develop the test strategy plan.
  • Developed and implemented an end-to-end testing plan for the Agency life cycle by coordinating with vendors and the different departments that integrate with the Agency data flow: Intake → BDC acceptance → Agency creation → Policy & Claims association → Audit → Profit sharing → EDW loads.
  • Implemented the testing approach and end-to-end system/integration activities for the Agency data migration from the internal application to the VUE vendor application, performing both data and functional validations.
  • Validated major Policy lifecycle functionalities – agency creation, quote processing, policy submissions, changes, cancels, reinstatements, rewrites, and renewals – in TAS/eLink/Point IN and Guidewire PolicyCenter.
  • Hands-on experience with major Claims lifecycle functionalities – FNOL, claims creation, loss details and time loss, exposures, reserves, payments and financials, correspondence and state forms, history and notes, and policy/agency integration – in Guidewire ClaimCenter validations.
  • Worked on the EIM Attunity project for the Policy/Claim/Billing data migration to SQL Server across various layers:
  • Validated source tables against the landing zone to ensure the data replication from Oracle to SQL Server was done correctly, through data verification, count checks, and duplicate checks.
  • Validated the data from the landing to the EDS layer (data integration layer), applying the transformation logic that filters records before loading them into tables.
  • Finally, validated data from the integration to the semantic layer – the final mart tables that business users analyze and use to generate reports and dashboards.
  • Worked extensively with the Service Center team, claim reps, claim payment specialists, and the state regulatory team to resolve claim EDI production issues.
  • Knowledge of and hands-on experience with the different state claim filings such as FROI (00, 01, 02, 04) and SROI (02, 04, UR, PY, RB, RE, CB, FN); ensured claim acceptance/rejection based on state requirements (Edit Matrix, Element Requirement, and Event Requirement documents).
  • Validated the different claim types (incident/indemnity/medical) on the EDI implementation from manual state filings to EDI (3 & 3.1) using Mitchell for specific states.
  • Analyzed the source-to-target mapping document based on master data, translate (lookup) tables, and summary (metrics) tables, validating DDL, counts, not-nulls, duplicates, PK & FK constraints, and the business logic applied to target tables.
  • Tested SCD Type II, full, and delta loads on large data volumes for a couple of interfaces, making sure inserts and updates loaded correctly – a sample SCD II check appears after this role's highlights.
  • Used ALM 12.0 and ServiceNow as test management tools to manage and organize requirements coverage, test cases, test scripts, and defect creation.
  • Mentored the newly onboarded QA team, providing Guidewire insurance knowledge, WC module knowledge, and QA practices/tools, assisting them with blockers and keeping all activities flowing smoothly.
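
A minimal sketch of the SCD Type II integrity checks behind these load validations. The dimension, key, and date column names are hypothetical:

    -- Each business key should have exactly one current row.
    SELECT policy_key
    FROM   dw.policy_dim
    WHERE  current_flag = 'Y'
    GROUP  BY policy_key
    HAVING COUNT(*) <> 1;

    -- Effective-date ranges for the same key should never overlap.
    SELECT a.policy_key
    FROM   dw.policy_dim a
    JOIN   dw.policy_dim b
      ON   a.policy_key = b.policy_key
     AND   a.row_wid <> b.row_wid          -- surrogate key, assumed name
     AND   a.eff_start_dt < b.eff_end_dt
     AND   b.eff_start_dt < a.eff_end_dt;
    -- Both queries should return zero rows after a clean SCD II load.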

ETL Test Lead

Cardinal Health
04.2008 - 12.2012
  • Worked as an ETL QA and on-site test lead on several projects, in both Waterfall and Agile methodologies.
  • Worked on various pharmaceutical ordering interfaces for Cardinal such as PD ordering, Mobile, CSW, EDI, 340B, and Import/Export.
  • Responsible for functional/technical/user documentation review and creation of the test plan, high-level use cases, test cases, test scripts, traceability matrix, and test execution plan.
  • Performed end-to-end data validation from the Distrack box (AS400 mainframe) to the Dotcom ODS by executing SQL queries in AS400 iSeries, SqlDbx, and MS Access based on business rules.
  • Extracted source and target tables from multiple servers (flat files, Oracle, and SQL Server) and linked them to a single platform (MS Access). Created and executed SQL queries in MS Access to validate the ETL transformation rules in accordance with the business logic.
  • Analyzed data processing for data mart applications – mainly transformation processes built with the ETL tool – developing an understanding of ETL workflows and mappings and monitoring the production environment.
  • Created various SQL test queries by analyzing the ETL mapping document (source > stage > target) and its business rules.
  • Performed validation of DDL, counts, duplicates, not-nulls, default values, negative testing, transformation logic, full loads, and delta loads (inserts/updates/deletes) – a sample data-quality battery appears after this role's highlights.
  • Monitored & assisted the Newly Onboarded Onsite & Offshore test team activities & trained on the project tools needed unblocking the QA activities.
  • Everyday knowledge sharing sessions & updates with QA team, assigning QA activities & tracking the teams status and blockers and helping team with smooth flow for all activities.
  • Conducted System, Integration and Regression Test execution plans. Analyze and report the defects to project team following through the SDLC Defect Management Life Cyle.
  • As a Lead responsible for Uses Cases & Test Execution Process Demos for UAT Sign-Off process to the client.
  • Conducted regular meetings and provided updates are made to the Client, Management team and to On-site QA lead on the ongoing QA activities on behalf of overall team, Attended Project Status and Defect triage review meetings.
  • Worked on IBM Rational testing tools such as Requisite Pro for Requirements, Clear Quest for Defects, Rational Manual Testing for uploading & executing test cases and test scripts.
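
A minimal sketch of the data-quality battery listed above – duplicate, not-null, and default-value checks. Table and column names are hypothetical:

    -- Duplicate check on the target's natural key.
    SELECT order_id, line_no, COUNT(*) AS dup_cnt
    FROM   ods.order_detail
    GROUP  BY order_id, line_no
    HAVING COUNT(*) > 1;

    -- Not-null and default-value checks on mandatory columns.
    SELECT COUNT(*) AS bad_rows
    FROM   ods.order_detail
    WHERE  order_id IS NULL
       OR  item_qty IS NULL
       OR  COALESCE(order_status, 'UNKNOWN') = 'UNKNOWN';
    -- Any non-zero result is raised as a defect against the load.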

Functional/ Backend QA Analyst

State of Michigan
01.2012 - 06.2012
  • Worked for the state department (DHS Bridges project) – participated in requirements-gathering walkthroughs and JAD sessions to analyze the test scenarios.
  • Created test scenarios and test scripts for the work requests scheduled for every release, and performed functional testing, interface testing, and regression and smoke testing where required.
  • Involved in interface testing to validate the data sent from Bridges through the batch process and the information received by Bridges from the SSA.
  • Executed test cases using positive and negative data for both the frontend and the backend, using SQL testing to verify data completeness, data transformation, data quality, performance, and scalability for the Bridges application – see the sample backend check after this role's highlights.
  • Involved in testing the Bridges application and the self-service application for different programs (FAP, FIP, SSI, CDC, MA/ME) across counties including Calhoun, Ingham, Mecosta, SSPC West/East, and Shiawassee.
  • Hands-on experience creating paper-based applications and electronic documents for benefit applications, redeterminations, change requests, and verification uploads.
  • Knowledge of interfaces, benefits issuance, correspondence, and management office resources and reports; involved in running batch jobs using OpCon scheduling for testing purposes.
  • Hands-on experience creating environment requests, work requests, queries, and graphical reports for the test execution status report.
  • Tested several Business Objects reports for business needs, including dashboards, drill-down, master-detail, aggregated, and web reports.
  • Hands-on experience providing resolution for non-emergent tickets.
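
A minimal sketch of the kind of backend SQL check used to confirm what the Bridges front end displays. The schema, table, case ID, and column names are all hypothetical:

    -- Verify that the eligibility result shown in the UI is persisted correctly.
    SELECT case_id,
           program_cd,           -- e.g. FAP, FIP, CDC
           eligibility_status,
           benefit_amt
    FROM   bridges.case_eligibility
    WHERE  case_id = 1234567     -- test case entered through the front end
    ORDER  BY program_cd;
    -- The returned status and amounts are compared against the UI screens.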

ETL Informatica Developer

Nationwide Insurance
12.2006 - 02.2008
  • Analyzed, designed, and developed the ETL data acquisition process for the data warehouse, including the initial load, subsequent loads, refreshes, and testing of all modules.
  • Worked with various sources to create ETLs, extracting data from flat files, DB2, and Teradata and loading it into target DB2 and Teradata databases.
  • Used various transformations – Joiner, Aggregator, Expression, Lookup, Filter, Union, and Router – to produce the desired business data output, and fine-tuned the mappings for optimal performance.
  • Coordinated with the DBAs to manage project-related data in databases across different environments: DEV, QA, UAT, and PROD.
  • Extensively worked on complex SQL queries; made use of system variables, mapping parameters, and variables in the transformations as well as in parameter files to initialize parameters.
  • Used Teradata utilities – FastLoad, MultiLoad, FastExport, and TPump – for extracting data and loading it into the respective tables, and loaded consolidated data using DB2 loaders; a sample post-load check appears after this role's highlights.
  • Knowledge of UNIX shell commands and scripting, such as ssh, scp, and mailx.
  • Involved in routine checks of log and trace files to identify problems requiring action and to rectify performance bottlenecks.
  • Extensively worked on folder migrations and code migrations.
  • Worked with end users to validate that the data extracts produced correct results.
  • Involved in redesigning the data model architecture for better performance.
  • Worked on documentation describing program development, logic, coding, testing, changes, and corrections.
  • Provided 24/7 on-call pager production support for the nightly batch process.
  • Familiar with workflow scheduling using Maestro.
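
A minimal sketch of the post-load verification typically run after a MultiLoad job. MultiLoad writes rejected rows to acquisition-phase (ET_) and application-phase (UV_) error tables; the table names below are hypothetical:

    -- Confirm the error tables are empty and the target received its rows.
    SELECT 'acquisition_errors' AS check_name, COUNT(*) AS cnt
    FROM   staging.ET_policy_load
    UNION ALL
    SELECT 'uniqueness_errors', COUNT(*)
    FROM   staging.UV_policy_load
    UNION ALL
    SELECT 'loaded_rows', COUNT(*)
    FROM   dw.policy;
    -- Non-zero error counts, or a loaded-row count that differs from the
    -- extract count, trigger investigation before the batch is marked done.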

Education

Master of Science - Electronics and Communication

University of Louisville
Louisville, KY

Skills

Trainings & Certifications

  • ISTQB Certified Tester Foundation Level (CTFL)
  • Certified Authority on Workers’ Compensation
  • AWS Certified Cloud Practitioner

Career Skills

  • Cloud Tools - Snowflake, Athena, Redshift, BigQuery, GitHub Copilot, AWS S3
  • Traditional Databases - DB2, Teradata, AS400, SQL Server, Oracle, SqlDbx, SQL Server Management Studio
  • Reporting Tools - Tableau, Power BI
  • Informatica 7.1, 8.6, 9.1, 10.2
  • UNIX, PuTTY, WinSCP
  • Scheduling Tools - Maestro, Control-M
  • QA Management Tools - TestDirector, IBM Rational tools, HP QC 9.0, Octane
  • Guidewire Application 8.x (Policy & Claims)
  • Methodologies - Waterfall, Agile (Kanban, Scrum, SAFe)
  • Microsoft Tools - OneNote and Office 365
  • Automation Tools - Katalon Studio, ICDEQ

Timeline

ETL QA Lead (Projects)

Cognizant (Client – Albertson’s)
07.2022 - Current

QA Onsite Manager

Client – Kellogg’s
06.2021 - 06.2022

Sr. ETL Test Lead

AF Group
05.2014 - 06.2021

Functional/ Backend QA Analyst

State of Michigan
01.2012 - 06.2012

ETL Test Lead

Cardinal Health
04.2008 - 12.2012

ETL Informatica Developer

Nationwide Insurance
12.2006 - 02.2008

Master of Science - Electronics and Communication

University of Louisville
University of Louisville