Sruthi Yelamanchili

Summary

  • 15 years of IT experience in ETL Testing (Data Warehousing Testing), Functional Testing, and Automation Testing, with strong domain knowledge in Banking, eWallet, Pharma, eCommerce, and Telecom.
  • Hands-on experience with SQL, SQL Server, Oracle, PostgreSQL, Snowflake, and Db2.
  • Well-versed in ETL processes (extract, transform, load), including capturing, reporting, and managing data from diverse sources.
  • Skilled in Data Warehousing concepts, data models, star/snowflake schemas, and OLAP.
  • Experienced in ETL development and testing using Informatica PowerCenter and in ETL pipeline automation using PySpark.
  • Proficient in Python, Pandas (ETL automation, data validation, data analysis), and Pytest (unit and functional testing); a brief Pandas validation sketch follows this summary.
  • Expertise in validating source-to-target data mappings, data integrity checks, and data consistency analysis.
  • Good experience with Autosys for scheduling and monitoring ETL workflows.
  • Experience with cloud-based ETL and data storage (AWS S3), with working knowledge of Azure and Databricks.
  • Familiar with CI/CD and version control tools, including Jenkins, Git, and GitLab.
  • Strong experience with the Software Development Life Cycle (SDLC) and Agile/Scrum methodologies.
  • Proficient in test planning, test case design, execution, defect reporting, and documentation.
  • Strong interpersonal, written, and verbal communication skills, with the ability to work independently or in a team and deliver quality results on time.
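To illustrate the Pandas-based data validation noted above, the following is a minimal sketch; the column names, key field, and sample rows are hypothetical placeholders rather than project artifacts.

```python
# Minimal sketch of source-to-target validation with Pandas; names and data are hypothetical.
import pandas as pd

def validate_source_to_target(src: pd.DataFrame, tgt: pd.DataFrame, key: str) -> dict:
    """Compare a source extract against a target extract: row counts, duplicate keys,
    and records missing from or extra in the target."""
    merged = src.merge(tgt, on=key, how="outer", suffixes=("_src", "_tgt"), indicator=True)
    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "duplicate_keys_in_target": int(tgt[key].duplicated().sum()),
        "missing_in_target": int((merged["_merge"] == "left_only").sum()),
        "extra_in_target": int((merged["_merge"] == "right_only").sum()),
    }

# Tiny inline example; real runs would load extracts from the source and target systems.
source = pd.DataFrame({"customer_id": [1, 2, 3], "name": ["A", "B", "C"]})
target = pd.DataFrame({"customer_id": [1, 2], "name": ["A", "B"]})
print(validate_source_to_target(source, target, key="customer_id"))
```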

Overview

15 years of professional experience
1 Certification

Work History

Test Lead

Verizon
12.2024 - Current
  • Project: Lift and Shift Data Migration – On-Prem to AWS
  • Summary: Worked on a data migration project to move enterprise databases and ETL processes from on-premises systems to AWS using a lift-and-shift approach. Migrated Oracle (CLFRYTRP, DWPROD), DB2, and Snowflake databases, along with legacy systems such as Kent PagePlus, to the cloud with minimal architecture changes. Used DataStage to run and monitor ETL jobs, and tools such as SQL Developer and DBeaver for data validation and troubleshooting. Utilized the Accelerator AI tool to run SQL queries for validating data accuracy and record counts across source and target systems. Ensured a smooth migration with no data loss and minimal downtime.
  • Environment: DataStage, Python, PySpark, HP ALM, SQL Server, Jira, Oracle, SQL Developer, Unix, Shell Script, Autosys, Snowflake, PostgreSQL, AWS EC2, S3
  • Responsibilities:
  • Define and document the overall test strategy for the data migration project.
  • Prepare detailed Test Plans, covering functional, system integration, regression, and data validation testing.
  • Define Entry and Exit Criteria for all testing phases.
  • Identify key test objectives, scope, testing types, tools, and environments.
  • Create and review detailed test cases and test scenarios aligned with business and technical requirements.
  • Ensure data validation rules are thoroughly covered in test cases for each ETL job.
  • Execute test cases and ensure traceability to business requirements.
  • Run DataStage jobs as part of the ETL testing cycle. Monitor job runs and analyze logs in DataStage Director to identify failures, performance issues, or anomalies. Capture and document validation queries used to verify each DataStage job’s output.
  • Prepare shell scripts to trigger DataStage jobs from EC2; run Unix commands and validate job status.
  • Trigger and monitor Autosys jobs to validate end-to-end automation. Collaborate with scheduling and DevOps teams for job dependencies and execution timelines.
  • Develop and execute Python and PySpark scripts to validate large volumes of migrated data (a representative sketch follows this list).
  • Perform record-level, aggregate-level, and field-level validation between source and target systems, and automate comparisons between on-premises and AWS-hosted data using the Accelerator AI tool.
  • Raise and track access requests for tools, databases, and environments.
  • Coordinate with Infra, Security, and Cloud teams to ensure timely provisioning.
  • Log, track, and manage defects in a centralized tool (e.g., JIRA, ALM).
  • Prepare Defect Reports, including root cause analysis, severity, impact, and resolution timelines.
  • Work with development and ETL teams to triage and resolve issues found in DataStage jobs and data pipelines.
  • Maintain a detailed Test Tracker to record test progress, status, and outcomes.
  • Generate daily/weekly test status reports for project stakeholders.
  • Document test results, validation queries, automation scripts, and evidence for audits.
  • Collaborate with Development, Cloud, DevOps, Database, and Business teams throughout the test lifecycle.
  • Participate in daily stand-ups, status meetings, and defect triages.
  • Identify testing risks and mitigation plans. Ensure test environment readiness and data availability. Enforce quality gates before production movement.
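As referenced in the responsibilities above, the following is a minimal PySpark sketch of record-level and aggregate-level source-to-target validation; the inline sample rows, the order_id key, and the amount column are hypothetical stand-ins for data that would normally come from the on-prem source and the AWS target.

```python
# Minimal PySpark sketch of record- and aggregate-level validation; data and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("migration-validation").getOrCreate()

# Stand-ins for the on-prem source extract and the AWS-hosted target extract.
source_df = spark.createDataFrame([(1, 100.0), (2, 250.0), (3, 75.0)], ["order_id", "amount"])
target_df = spark.createDataFrame([(1, 100.0), (2, 250.0)], ["order_id", "amount"])

# Record-level check: rows present on one side but missing on the other, keyed on order_id.
missing_in_target = source_df.join(target_df, on="order_id", how="left_anti")
extra_in_target = target_df.join(source_df, on="order_id", how="left_anti")

# Aggregate-level check: row counts and a column total on both sides.
print({
    "source_count": source_df.count(),
    "target_count": target_df.count(),
    "missing_in_target": missing_in_target.count(),
    "extra_in_target": extra_in_target.count(),
    "source_amount_sum": source_df.agg(F.sum("amount")).first()[0],
    "target_amount_sum": target_df.agg(F.sum("amount")).first()[0],
})
```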

Sr. ETL QA Tester

Capital One
10.2023 - 11.2024
  • Environment: Informatica 10.5.4, Python, Pytest, HP ALM, SQL Server, Flat Files, Excel, Windows PowerShell, Jira, Oracle, SQL Developer, Unix, Autosys
  • Responsibilities:
  • Developed Python-based ETL automation scripts to validate large datasets for data migration and transformation projects.
  • Collaborated with the ETL development team to ensure data flows were accurate and aligned with business requirements.
  • Created and executed SQL queries for data validation, ensuring accurate source-to-target data mappings.
  • Automated test cases using Pytest to verify ETL processes, reducing manual effort by 50% (a representative Pytest sketch follows this list).
  • Participated in data reconciliation activities, identifying and resolving data mismatches in complex ETL workflows.
  • Improved testing efficiency by introducing a new automation framework, cutting down test execution time by 40%.
  • Worked closely with data engineers to resolve ETL pipeline failures and optimize data processing.
  • Validated the execution order of ETL jobs using Autosys.
  • Measured job execution times under different loads and validated the ability to handle large data volumes without delays.
  • Wrote and executed Unix shell scripts to automate repetitive ETL Testing tasks, such as file validation and data reconciliation.
  • Analyzed Unix system logs to debug and resolve ETL job failures, ensuring smooth pipeline execution.
  • Prepared test cases by studying business requirements, data mapping documents, and technical specifications.
  • Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
  • Wrote and implemented manual and automated test scripts for the Risk Warehouse decision support system using HP ALM.
  • Coordinated with the development team on system integration testing and service-level unit testing.
  • Exported manual test cases from an MS Excel template directly to HP ALM and executed all test cases in ALM with Pass/Fail/Blocked status.
  • Actively participated in creating Requirements Traceability Matrices and test plans.
  • Prepared test requirements, the test plan approach document, and test cases.
  • Performed unit and functional testing for all mappings and sessions.
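As referenced in the responsibilities above, the following is a minimal Pytest sketch of this style of ETL check automation; it uses an in-memory SQLite database as a runnable stand-in for the Oracle/SQL Server connections used on the project, and the table names and rows are hypothetical.

```python
# Minimal Pytest sketch of automated ETL checks; SQLite, tables, and data are hypothetical stand-ins.
import sqlite3
import pytest

@pytest.fixture(scope="module")
def conn():
    """Throwaway in-memory database seeded with a tiny staging/warehouse table pair."""
    con = sqlite3.connect(":memory:")
    con.executescript(
        """
        CREATE TABLE stg_customers (customer_id INTEGER, name TEXT);
        CREATE TABLE dw_customers  (customer_id INTEGER, name TEXT);
        INSERT INTO stg_customers VALUES (1, 'A'), (2, 'B');
        INSERT INTO dw_customers  VALUES (1, 'A'), (2, 'B');
        """
    )
    yield con
    con.close()

def test_row_counts_match(conn):
    src = conn.execute("SELECT COUNT(*) FROM stg_customers").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dw_customers").fetchone()[0]
    assert src == tgt

def test_no_duplicate_keys_in_target(conn):
    dupes = conn.execute(
        "SELECT customer_id FROM dw_customers GROUP BY customer_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert dupes == []
```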

ETL Data Tester

Roche
01.2021 - 10.2023
  • Company Overview: Roche is a pharmaceutical company. This project deals with data flows across various system integrations, from upstream systems to downstream systems. Downstream integration is the flow between AWS Athena tables (the Data Warehouse Publish Layer in AWS) and target files.
  • Environment: Informatica PowerCenter, MS Excel, SQL Developer, Salesforce, AWS S3, PySpark, Athena Framework, GitLab, Windows PowerShell, WinSCP, HP ALM, Jira, DBeaver, Python
  • Responsibilities:
  • Collaborated with the development and migration teams to validate the ETL process for migrating data from a traditional data warehouse to AWS Redshift.
  • Designed and executed test strategies and test plans for cloud-based data migration, ensuring data consistency, accuracy, and performance in AWS environments.
  • Utilized AWS service S3 for the ETL process, working with large datasets and ensuring smooth data flow and integration.
  • Created detailed test scripts, scenarios, and cases to validate ETL pipelines and transformations.
  • Performed source-to-target data validation, reconciliation, and data completeness checks using SQL and AWS tools.
  • Identified data issues and worked with developers to troubleshoot ETL bugs, ensuring data integrity throughout the migration.
  • Conducted performance and load testing to validate the efficiency of the new cloud-based ETL pipelines.
  • Assisted in automating ETL validation processes using Python and AWS Lambda functions.
  • Worked with business analysts and data architects to understand the project’s functional and data requirements, ensuring alignment with business goals.
  • Documented test results, defects, and progress reports for stakeholders.
  • Prepared SQL queries for scenarios such as count tests, data checks, and duplicate checks (a representative sketch follows this list).
  • Uploaded and executed test cases in the HP ALM tool.
  • Logged defects in Jira for tracking.
  • Ran workflows to load data into the target database using Informatica PowerCenter.
  • Verified the data by running Python scripts in GitLab and validated the reports.
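As referenced in the responsibilities above, the following is a minimal sketch of count, duplicate-key, and null-key checks driven from Python; the table name and key column are hypothetical, and sqlite3 is only a runnable stand-in for the Athena and source-database connections used on the project.

```python
# Minimal sketch of count / duplicate / null-key validation queries driven from Python.
# Table and key names are hypothetical; sqlite3 stands in for Athena/source databases.
import sqlite3

def run_basic_checks(conn, table: str, key: str) -> dict:
    """Row count, duplicate business keys, and null business keys for one table."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cur.fetchone()[0]
    cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key} FROM {table} "
        f"GROUP BY {key} HAVING COUNT(*) > 1) dupes"
    )
    duplicate_keys = cur.fetchone()[0]
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key} IS NULL")
    null_keys = cur.fetchone()[0]
    return {"row_count": row_count, "duplicate_keys": duplicate_keys, "null_keys": null_keys}

# Tiny runnable stand-in; in practice the same checks run against the source and the
# Athena publish-layer table, and the two result sets are compared.
demo = sqlite3.connect(":memory:")
demo.executescript(
    "CREATE TABLE publish_orders (order_id INTEGER, status TEXT);"
    "INSERT INTO publish_orders VALUES (1, 'SHIPPED'), (1, 'SHIPPED'), (NULL, 'OPEN');"
)
print(run_basic_checks(demo, "publish_orders", "order_id"))
```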

ETL SQL Tester

AM Bank
08.2019 - 12.2020
  • Client Description: Customer details and account information are saved in Staging. Data is extracted from Staging, transformed into various tables in CM, and also loaded into SDIS tables based on the PARAM_ISS values.
  • Environment: Informatica, HP ALM, Aginity Editor, QlikView, Oracle, SQL Server, WinSCP, Jira
  • Responsibilities:
  • Involved in preparing Test Plan based on User requirement documents.
  • Wrote test cases as per the Functional Specification documents and executed them.
  • Performed Interface/flow, Functionality & Data validation for Modules like Tax Calculation, Customer ID, Recipient, Payment and Confirm Validate Orders.
  • Performed Integration, Regression, Acceptance, System, Usability, etc. testing during different stages of the application development.
  • Documented requirement changes and modified/incorporated test cases into the respective Functional Specifications.
  • Responsible for performing various types of process evaluations during each phase of the software development life cycle, including audits, reviews, walkthroughs, and hands-on system testing.
  • Involved in Functional testing, business requirement testing, data requirement testing, regression testing, system integration testing, data validation, and non-functional testing.
  • Analyzed the EDW mapping documents, Dimension tables and fact tables.
  • Involved in testing the data warehouse for both the initial load and the incremental loads of the target.
  • Tested application, logged found bugs into HP ALM, monitored their progress and verified their fix.
  • Actively participated in enhancement meetings focused on making the product more effective in real-time scenarios.

Software Tester

HP
12.2015 - 06.2019
  • Client Description: This project covered the data flow between upstream and downstream systems for several applications: LH, Polaris, VCP, CPAT, Getpaid, and DSAS. VCI, VCP, CPAT, and Getpaid are part of the C&C system. Scope included the application and data flow of upstream and downstream systems for VCI, the inbound and outbound paths for file transmissions, HPSB conversion of file names, and Tidal job timings.
  • Environment: Toad, Batch Server, VCI System, HP ALM
  • Responsibilities:
  • Participated in full phase of project’s SDLC right from design, development, testing and implementation of application.
  • Reviewed test requirements and created Test Plan and Test Strategy documents; prepared the testing schedule and monitored and tracked testing activities.
  • Prepared and executed SIT test cases/scripts in HP ALM and created functional/regression test cases in HP ALM.
  • Involved in UAT in the UAT environment; prepared UAT test cases and testing estimates.
  • Developed test scripts using Functional Requirement Documents; created test scenarios for positive, negative, and boundary cases; performed all types of testing on workflows and server software.
  • Documented software defects using the bug tracking system and reported defects involving program functionality, output, online screens, and content to developers.
  • Coordinated in administering and managing bugs in the defect tracking tool, and prepared and managed test metrics.
  • Assisted developers in reproducing defects; provided daily status reports, defect metrics, and updates to the lead, and reported daily status on assigned tasks to the Project Manager/Client.

Software Tester/Analyst

Williams Sonoma
08.2010 - 11.2015
  • Client Description: Williams Sonoma is an eCommerce company. The Self-Service Enablement project aimed to provide Amazon-like order visibility, order tracking, and returns functionality on its eCommerce websites, where shipment visibility was minimal and about 30% of calls into the Care Centre related to order/shipment tracking queries. The project also aimed to give customers additional information about the status of Sutter Street direct-ship furniture orders and throughout the furniture delivery process.
  • Environment: Bugzilla, Sterling Commerce, TIBCO, DTC, E-Commerce, DSUI
  • Responsibilities:
  • Participated in full phase of project’s SDLC right from design, development, testing and implementation of application.
  • Reviewed test requirements and created Test Plan and Test Strategy documents.
  • Prepared the testing schedule and monitored and tracked testing activities.
  • Prepared and executed SIT test cases/scripts in HP ALM and created functional/regression test cases in HP ALM.
  • Involved in UAT in the UAT environment; prepared UAT test cases and testing estimates.
  • Developed test scripts using Functional Requirement Documents; created test scenarios for positive, negative, and boundary cases; performed all types of testing on workflows and server software.
  • Documented software defects using the bug tracking system and reported defects involving program functionality, output, online screens, and content to developers.
  • Coordinated in administering and managing bugs in the defect tracking tool, and prepared and managed test metrics.
  • Assisted developers in reproducing defects; provided daily status reports, defect metrics, and updates to the lead, and reported daily status on assigned tasks to the Project Manager/Client.

Education

Bachelor's - Information Technology

Jawaharlal Nehru Technological University
05.2010

Skills

  • Programming & Scripting: SQL, Python, Pandas, Unix
  • ETL Tools: Informatica PowerCenter, Datastage
  • Big Data & Analytics: PySpark, Apache Spark, Databricks
  • Cloud Platforms & Services: AWS (S3, Redshift, Glue, Athena, EC2), Azure
  • Databases: Oracle, SQL Server, PostgreSQL, Snowflake, Db2
  • Database Tools: DBeaver, Toad, SQL Developer
  • Scheduling Tools: Autosys
  • Version Control & CI/CD: Git, GitLab, Jenkins
  • Testing & Bug Tracking Tools: HP ALM, JIRA, Bugzilla
  • API & Integration Tools: Postman
  • File Management: WinSCP

Certification

ISEB-ISTQB

Timeline

Test Lead

Verizon
12.2024 - Current

Sr. ETL QA Tester

Capital One
10.2023 - 11.2024

ETL Data Tester

Roche
01.2021 - 10.2023

ETL SQL Tester

AM Bank
08.2019 - 12.2020

Software Tester

HP
12.2015 - 06.2019

Software Tester/Analyst

Williams Sonoma
08.2010 - 11.2015

Bachelor's - Information Technology

Jawaharlal Nehru Technological University