
SRUJANA NALLAMOTHU

Argyle, TX

Summary

ETL/AWS DATA QUALITY ANALYST / ETL TESTER

PROFESSIONAL SUMMARY:

ETL Data Quality Analyst/SQA with 14+ years of progressive experience ensuring the quality of data systems, with focused experience in Business Intelligence (database, ETL, OLAP, data warehousing and reporting), cloud (AWS), Hadoop and big data technologies, and web-based applications.

HIGHLIGHTS:

  • Conversant with all phases of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC), including requirements gathering, analysis, design, coding, and testing.
  • Experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Mapping, Data Loading, Data Warehousing/ETL/Database/Big data/Cloud-AWS testing.
  • Excellent coordination across client, source-system, and user interactions; experienced in requirements analysis and in mapping business processes to technical requirements.
  • Expertise in writing complex queries in validating data into Data Warehouse/ ETL applications.
  • Experience in ETL methodology supporting data extraction, transformation, and loading processes in a corporate-wide ETL solution using SQL Server BI and Informatica PowerCenter.
  • Expertise in implementing/creating scripts to validate complex business rules as per the mapping documents.
  • Experience in implementation of Type I and Type II Dimension tables and Fact tables.
  • Experience in Metadata and Star schema/Snowflake schema. Analyzed Source Systems, Staging area, Fact and Dimension tables in Target D/W.
  • Extensively performed backend testing to check data integrity throughout the database.
  • Strong experience in preparing Requirement Traceability Matrix (RTM), Agile Testing Document, Test Strategy, Developing Test Plan, Detailed Test Suits, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
  • Professional experience in Integration, Functional, Regression, System Testing, UAT, Black Box and GUI testing.
  • Strong knowledge and understanding of ETL/Data Warehouse/Big Data concepts – HDFS, Hive, HBase. Good working experience writing HiveQL queries to validate data against Hive tables.
  • Good Knowledge of basic UNIX Shell Scripts or Commands.
  • Strong Domain Experience in Mortgage, Banking, Insurance, Health care and government-based applications.
  • Good knowledge of AWS services: S3, EC2, Lambda, Redshift, Athena, CloudWatch, AWS Batch, etc.
  • Expert in problem solving and bug tracking using tools such as HP Quality Center, ALM, ClearQuest, and JIRA.
  • Experience working with software development teams to resolve defects, present defect status reports, and resolve requirement and design inconsistencies.

Overview

13 years of professional experience
1 Certification

Work History

Sr. Software Quality Assurance Analyst 3

Security Finance
04.2021 - 05.2022

Project Description:

Project - Credit Reporting (Business as Usual)

The scope of this project is that SFCS furnishes information monthly to the credit reporting agencies (CRAs) Equifax, TransUnion, and Experian on behalf of subsidiary portfolios of customers and accounts, to facilitate accurate reporting and to comply with SFCS's responsibilities as a data furnisher. Data provided to the CRAs is reported in the Metro 2 format, which enables Security Finance to report business scenarios accurately and comply with the Fair Credit Reporting Act (FCRA) as it relates to the accuracy and integrity of the data being reported.

Responsibilities:

  • Involving in daily scrum meetings and sprint meetings to discuss workflow and project specifications
  • Extensive domain knowledge of how the three credit bureaus (TransUnion, Equifax, Experian) work hand in hand in maintaining the creditworthiness and credit history of the customer
  • Actively collaborating with developers and the product owner to clarify requirements, especially in terms of testability, consistency, and completeness; developing manual and automated test cases, executing them, and ensuring standard QA processes are followed
  • Analyzing system requirements and developing detailed test plans for system testing
  • Converting the user stories (agile methodology requirements documents) and design documentation into test design products: test scenarios, test cases, and test scripts
  • Involved in defect management: identifying bugs, isolating issues, reporting them using JIRA and qTest, tracking defect status, and working with developers and product managers to ensure issues are resolved in a timely manner
  • Validate that data being reported to the CRAs is mapped to the consumer's file with greater consistency and per the Metro 2 guidelines
  • Develop automated tests using Python scripting to parse the Metro 2 files, validate data for each account status, and verify that all Metro 2 fields are reported accurately per FCRA guidelines
  • Validate SSIS jobs to verify that correct data is extracted monthly from the SFC branch systems to the centralized credit reporting database, which is key for reporting accurate data to the CRAs
  • Writing complex queries in Sybase ASE to validate customer-related information and verify that accurate payment-history information from the database is being reported (an illustrative query sketch follows this section)
  • Extensively involved in testing the customer information website, which is built using .NET, and verifying that the customer's personal information, payment history, charges, and Metro 2 information from monthly reporting are accurately reflected
  • Participating proactively in team retrospectives, suggesting and implementing improvements

Environment: JIRA, Agile/Scrum, SharePoint, Sybase ASE, .NET, SSIS, Microsoft SQL Server Management Studio, qTest, Python, Excel
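A minimal sketch of the kind of Sybase ASE validation query described above. The table and column names (credit_report_base, payment_history, days_past_due) are hypothetical, not the actual SFCS schema; the Metro 2 account status codes shown ('11' = current, '71' = 30-59 days past due) follow the published Metro 2 format.

    -- Flag accounts whose reported Metro 2 account status disagrees with the
    -- days-past-due figure derived from payment history (illustrative names only).
    SELECT  b.account_number,
            b.account_status        AS reported_status,
            p.days_past_due
    FROM    credit_report_base b
    JOIN    payment_history    p
            ON  p.account_number  = b.account_number
            AND p.reporting_month = b.reporting_month
    WHERE   b.reporting_month = '2022-04-01'
      AND   (   (b.account_status = '11' AND p.days_past_due > 0)      -- '11' = current
             OR (b.account_status = '71' AND p.days_past_due < 30) )   -- '71' = 30-59 days past due
    ORDER BY b.account_number

Any row returned represents a potential Metro 2 reporting discrepancy to raise as a defect.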

ETL/AWS Data Quality Analyst/Lead

Client: Fannie Mae
01.2020 - 03.2021

Project - MRP BAU (Business as Usual)

Responsibilities:

  • Assisting in root cause analysis, investigating any data errors or anomalies, and assisting in the implementation of solutions to correct data problems
  • Participate in Agile/Scrum methodology and work on JIRA user stories for delivering software solutions
  • Work closely with business analysts and application team to document design recommendations and define quality gates using proven test-driven development techniques to ensure quality delivery of software
  • Decompose high-level information into details, abstract low-level information into a general understanding, and distinguish user requests from the underlying true needs throughout the SDLC
  • Supporting ETL data validation from data sources through each step of extract, transform and load process including final load to target tables
  • Analyze complex functional requirements (source-to-target mapping workflows) and translate them into detailed test suites/scenarios/strategies to automate various functionalities
  • Responsible for running various jobs to validate job orchestration and data loads using AWS batch console
  • Ensure accurate data and reporting in terms of content and infrastructure by identifying data issues and source anomalies
  • Involved in Extensive testing of Type I and Type II Dimension tables and Fact tables
  • Responsible for creating test artifacts (Test Cases, Test Plans, Agile testing document, Test Strategy, Requirements Traceability Matrix (RTM) etc.)
  • Perform Functional Testing, Regression Testing, Integration Testing, End-to-End Testing
  • Develop complex SQL scripts/queries in Redshift to verify proper mapping of data elements, verify data extracts and record counts, and validate business rules (illustrative checks follow this section)
  • Prepare test data to verify the functionality
  • Test reports with filters and metrics in MicroStrategy for business analysis
  • Responsible for reviews of test metrics, documents, reports and schedules with management and business teams

Environment: SQL Workbench, Netezza, UNIX, PuTTY, JIRA, Agile/Scrum, SharePoint, ETL, data warehouse, DB Visualizer, Data Testing Framework (DTF), Java, AWS (Athena, S3, EC2, Step Functions, Lambda, AWS Batch), Zeppelin, Talend, MicroStrategy
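Illustrative Redshift source-to-target checks of the kind described above. The schema, table, and column names (stg.stg_loan, dw.fact_loan, portfolio_dt, upb_amt) and the transformation rules shown are assumptions for illustration, not the actual MRP model.

    -- 1) Record-count reconciliation between the staging extract and the target fact table.
    SELECT 'stg_loan'  AS layer, COUNT(*) AS row_cnt FROM stg.stg_loan  WHERE portfolio_dt = '2020-06-30'
    UNION ALL
    SELECT 'fact_loan' AS layer, COUNT(*) AS row_cnt FROM dw.fact_loan  WHERE portfolio_dt = '2020-06-30';

    -- 2) Column-level mapping check: rows in the target whose transformed attributes
    --    differ from what the mapping document specifies (EXCEPT returns the mismatches).
    SELECT loan_id, upb_amt, loan_status
    FROM   dw.fact_loan
    WHERE  portfolio_dt = '2020-06-30'
    EXCEPT
    SELECT loan_id,
           ROUND(unpaid_balance, 2)    AS upb_amt,      -- expected numeric transformation
           UPPER(TRIM(loan_status_cd)) AS loan_status   -- expected code standardization
    FROM   stg.stg_loan
    WHERE  portfolio_dt = '2020-06-30';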

Sr. ETL/Cloud/ Data Quality Analyst

Client: Fannie Mae
04.2018 - 12.2019

Project Description:

Project - MRP-CMDS/CMBI Cloud Build Out

The scope of this project is migrating data from the current on-prem Netezza database to the AWS cloud. Data from the upstream source (S3 buckets) is pulled into AWS using the ETL tool Talend, transformations are performed in the Prepare layer, the data is loaded into final tables in the Insight layer, and it is then moved to AWS Redshift for reporting purposes. The primary goal of the project is to ensure that the on-prem portfolio data exactly matches the portfolio data in the AWS Redshift tables for a given portfolio date.

Responsibilities:

  • Analyze user requirements and understand business rules in order to build the enterprise-wide data quality environment
  • Identify the different sources from which data is fed into the warehouse
  • Developed High level SQL scripts to validate the data for the applications built up using Talend
  • Designed various Test Scenarios and Test Cases to validate the integrity of the data passing to the target database (Redshift)
  • Analyze any source system issues and finding root cause based on actual business process
  • Writing complex SQL queries to compare data between Netezza (on-prem) and Redshift (cloud); an illustrative reconciliation query follows this section
  • Develop test suites, summary reports and other artifacts based on business requirements documents and mapping documents to ensure that entire source to target mapping flow is as per the business expectation
  • Validate table structure/DDL for all staging tables (Prepare layer) and final tables (Insight layer)
  • Validate Data that is loaded from source files (S3- Buckets) to staging tables and test the transformation logic as per mapping documents
  • Validate on-prem staging tables against Athena staging tables (Prepare zone) for a given portfolio date using the automation tool DTF (Data Testing Framework) and communicate discrepancies to Development
  • Run the jobs to load data from the source and validate that the jobs are designed per the requirements
  • Support Data Quality Managers to develop tools for data governance and master data management
  • Track defects using JIRA and communicate source discrepancies with the upstream source systems

Environment: SQL Developer, Oracle, Netezza, UNIX, PuTTY, JIRA, Agile/Scrum, SharePoint, ETL, data warehouse, DB Visualizer, Data Testing Framework (DTF), Java, AWS (Athena, S3, EC2, Step Functions, Lambda, AWS Batch), SQL Workbench, Zeppelin, Talend.
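A sketch of the reconciliation pattern described above: the same aggregate query is run once on Netezza (on-prem) and once on Redshift (Insight layer) for a given portfolio date, and the two result sets are compared. The table and column names (insight.loan_portfolio, upb_amt, orig_dt) are assumptions for illustration.

    -- Run on both platforms; any difference in the aggregates points to a load or
    -- transformation discrepancy for that portfolio date.
    SELECT portfolio_dt,
           COUNT(*)                AS row_cnt,
           COUNT(DISTINCT loan_id) AS distinct_loans,
           SUM(upb_amt)            AS total_upb,
           MIN(orig_dt)            AS min_orig_dt,
           MAX(orig_dt)            AS max_orig_dt
    FROM   insight.loan_portfolio
    WHERE  portfolio_dt = '2019-09-30'
    GROUP BY portfolio_dt;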

Sr. Database/ETL Data Quality Analyst

Client: Fannie Mae
08.2017 - 04.2018

Project Description:

Project - MRP - EDI Integration

The scope of this project is that MRP will transition data sourcing from FDW to EDI for the following portfolio populations: Single-Family Loans and Loan Commitments, Multifamily Loans, and Liabilities (the Liabilities portfolio includes Debt, Derivatives, and the Other Investment Portfolio (OIP), which can include both assets and liabilities).

  • The primary goal of the project is to make the portfolio data that is sourced from EDI match exactly the portfolio data that would have been sourced from FDW so that the risk analytics processes that MRP performs can continue to function as they are currently configured

Responsibilities:

  • Experience assessing testing processes, creating, implementing testing strategies using Agile-testing methodology
  • Participate in daily stand-up meetings, sprint planning and retrospective meetings, and other required meetings as needed
  • Participate in design, development, testing of Data Warehouse (ETL) applications and ensure that deliverables meet the functional and design specifications
  • Analyzing any source system issues and finding root cause based on actual business process
  • Writing complex SQL queries to compare Netezza/Oracle tables
  • Develop test suites, summary reports and other artifacts based on business requirements documents and mapping documents to ensure that entire ETL source to target mapping flow is as per the expectation
  • Involved in writing complex SQL queries by combining data from multiple tables/sources with various filters
  • Loaded the test data from source systems to target and validated if the attributes are populated as per the business requirements document, mapping document and transformation rules
  • Validated the Source, Stage and Target (End-to-End) based on the test conditions derived from the business requirements
  • Executed test suites, scenarios, scripts, or procedures to reconcile records with source, publish and communicate results and statuses using established processes
  • Reconciled the population of all the attributes from source to target counts along with the data, analyzed the results, and communicated the issues with the upstream source systems
  • Created artifacts like Requirements Traceability Matrix (RTM), Test results, Test cases, Test plan, Test Scenarios, Agile testing document etc
  • Actively involved in meetings, coordinating with upstream and downstream systems and identifying business scenarios and issues
  • Worked to minimize risk by identifying, communicating issues/risks in advance
  • Involved in the ICART deployments to migrate code to higher environments by coordinating with the team

Environment: SQL Developer, TOAD, Netezza, UNIX, Putty, ICART, ORACLE, JIRA, AGILE/SCRUM, HP ALM, SharePoint, IBM Rational Clear Quest, ETL, ELT, Data warehouse, DB Visualizer, Data Testing Framework (DTF), Java, Shell scripting.

Sr. Hadoop Data Analyst /QA

Blue Cross Blue Shield of Texas
05.2016 - 07.2017

Project- Integrated Care Coordination – Data Infrastructure (ICC-DI)

Project Description:

The main objective of the project is to implement a centralized data warehouse to ingest the files from different vendor sources. The information systems for business were customized and utilized the healthcare enterprise data warehouse to collect, organize and store data from all the sources to integrate the historic data for user reporting

Responsibilities:

  • Validate the data model against the BRD and functional specs and identify design issues related to missing columns, column data types/lengths, and table joins
  • Profile and inspect the production source data and validate it against the design and transformation rules; several issues were identified during this process
  • Test the ETL process for each table of the data store, covering all columns of every table across all possible scenarios for the transformation rules
  • Expert in data cleansing for accurate reporting of campaign data
  • Thoroughly analyze data and integrate different data sources to process matching functions
  • Perform Data analysis, Data Validation, data Lineage, data Cleansing, Data Verification on the Source data
  • Participate in daily standup and per project needs to do defect triage and track the status on open issues and to discuss on proceedings of new task assignments
  • Work with the Subject Matter Experts to understand and analyze impacts of various data acquisitions
  • Responsible for reviewing Use cases with the Business and align the test scenarios accordingly
  • Validate base tables that are refreshed/loaded one-time or month-over-month from source flat files or Excel data to check that the right data is loaded into the database in the expected format
  • Write complex SQL queries to validate the data as per mapping rules
  • Work on multiple issues raised by different users of data Warehouse and aid in analyzing and modifying user queries to pull the reports
  • Good experience writing HiveQL queries to validate data against Hive tables (illustrative checks follow this section)
  • Validate ingesting of files from different source systems to Data Lake in the required partitions
  • Validate the ELT pipelines from source to Data Lake and Data Lake to Hive Tables
  • Verify the job flow from HDFS to HIVE tables by running jobs and validated the columns based on the transformation logic
  • Identify Test cases at Unit level, System level, Enterprise level

Environment: Microsoft SQL Server 2008 R2, SSIS, SSRS, Teradata 14.0, Teradata SQL Assistant, flat files, HP ALM, SQL Server procedures and packages, Agile/Scrum methodology, Hortonworks Distribution (Hadoop), Hive, HDFS, HBase, Red Hat Linux.
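Illustrative HiveQL checks of the kind described above, run after an ingest into a partitioned Hive table. The database, table, partition, and column names (datalake.member_claims, warehouse.member_claims, load_dt, member_id, claim_amt) are assumptions, not the actual BCBSTX schema.

    -- 1) Confirm the expected ingest partition landed and capture its row count
    --    (compared offline against the source file record count).
    SELECT load_dt, COUNT(*) AS hive_row_cnt
    FROM   datalake.member_claims
    WHERE  load_dt = '2017-06-01'
    GROUP BY load_dt;

    -- 2) Null and domain checks on key columns after transformation into the Hive target.
    SELECT COUNT(*) AS bad_rows
    FROM   warehouse.member_claims
    WHERE  load_dt = '2017-06-01'
      AND  (member_id IS NULL OR claim_amt < 0);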

Sr. Data Warehouse/Data Analyst

Visual Consultants Inc
07.2014 - 04.2016

Responsibilities:

  • Participate in the project life cycle from analysis to production implementation, with emphasis on system test plans, execution, analysis, and results
  • Participate in walkthroughs of business requirements and design specifications for the projects
  • Detailed understanding of design specifications documents, Load Documents, S to T mappings, Transformation rules documents and Data Models
  • Involved in developing a detailed understanding of Star schema and Snowflake architecture
  • Involve in Extensive testing of Type I and Type II Dimension tables and Fact tables
  • Conduct system testing from Landing Zone (LZ) to Data Warehouse to validate data for table loads, element level transformation rules as specified in design/load documentation and source target mappings
  • Extensively used simple and complex custom SQL queries to test the data and the transformation rules and table load rules from Source to Target tables
  • Performed extensive testing on current tables, history tables, and fact tables
  • Perform data Analysis and data profiling using complex SQL on various sources systems to ensure accuracy of the data between the warehouse and source systems
  • Design test plans, test scripts, test scenarios and test data for UAT testing
  • Develop test reports and participate in testing prioritization and archived test results and user signoff
  • To establish strong working relationships with selected business functions enabling strong communication and expectation management
  • Support Data Quality Managers to develop tools for data governance and master data management
  • Perform or assist with reconciliations of static data between systems and external sources

Environment: Informatica, HP Quality Center, SQL Server, SQL, UNIX, Windows XP, Oracle 10g/11g, Agile, Business Objects, JIRA, Autosys.

ETL Data Quality Analyst

Client: Minnesota Department of Revenue (MNDOR)
05.2013 - 06.2014

Project Description:

Project - Data, Information and Knowledge Management

The main objective of this project is to develop a data warehouse that provides reports for the compliance and audit department to identify anomalies between the tax withheld amounts reported on individual tax returns and the amounts reported on W-2s by employers. The project involves a W-2 matching effort to analyze and test whether W-2s match and to generate reports; these reports help auditors identify fraud and take action.

Responsibilities:

  • Validate data moving from source to ODS to EDW
  • Understanding of Star Schema and Snowflake Schema, relationship between Fact and Dimension tables
  • Perform functional, integration, regression, UAT and end to end testing for this project
  • Extensively used Oracle SQL queries to verify and validate database updates (an illustrative matching query follows this section)
  • Verify complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables
  • Provide the management with weekly documents like test metrics, reports, and schedules
  • Prioritize and report defects using Quality Center to present documents and reports in weekly team meetings
  • Analyze Functional Requirements and Use cases and developed Test Plans, Test cases and Test scripts and Traceability matrix

Environment: Informatica 8.6, HP Quality Center, Oracle10g, SQL, PL/SQL, SQL*Plus, UNIX, shell scripting, JIRA, WinSQL, Business Objects.
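A minimal sketch of the W-2 matching check described above: returns whose claimed withholding differs materially from the total reported on employer W-2s. The table and column names (edw.individual_return, edw.employer_w2, withholding_claimed, state_tax_withheld) and the $25 tolerance are assumptions for illustration.

    -- Oracle: flag returns whose claimed withholding does not reconcile with W-2 totals.
    SELECT r.taxpayer_id,
           r.withholding_claimed,
           NVL(SUM(w.state_tax_withheld), 0)                          AS w2_withholding,
           r.withholding_claimed - NVL(SUM(w.state_tax_withheld), 0)  AS variance_amt
    FROM   edw.individual_return r
    LEFT JOIN edw.employer_w2 w
           ON  w.taxpayer_id = r.taxpayer_id
           AND w.tax_year    = r.tax_year
    WHERE  r.tax_year = 2013
    GROUP BY r.taxpayer_id, r.withholding_claimed
    HAVING ABS(r.withholding_claimed - NVL(SUM(w.state_tax_withheld), 0)) > 25;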

ETL Data Quality Analyst /QA

Client: Deloitte Consulting LLP
06.2012 - 04.2013

Project Description:

Project - HealthCare Benefits Exchange (PA, HBE)

The PA HBE project is about building a Health Benefits Exchange (HBE) to facilitate the Affordable Care Act (Obamacare) mandate. The PA state health department has health data for certain individuals it already serves, which must be merged into the exchange for HBE to be a complete single service going forward. Data from the health department's ACES system is sourced periodically in flat files, converted into an HBE-compatible format, and loaded into an RDBMS (Oracle).

Responsibilities:

  • Validate data moving from source to ODS to EDW
  • Understanding of Star Schema and Snowflake Schema, relationship between Fact and Dimension tables
  • Extensively use Oracle to write SQL Queries to verify and validate the Database Updates
  • Verify complex ETL Mappings based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables
  • Prioritize and report defects using Quality Center to present documents and reports in weekly team meetings
  • Analyze Functional Requirements, Use cases and develop Test Plans, Test cases and Test scripts and Traceability matrix

Environment: Informatica, HP Quality Center, Oracle10g, SQL, PL/SQL, DB2, SQL*Plus, UNIX, shell scripting

Graduate Assistant

Oklahoma Christian University
08.2010 - 04.2012

Project Description:

Project - Student Administration System

The student administration system was designed to make the task of storing student information easier. It stores all information related to the students, such as courses, fees paid, and roll number. The system also ensures the security of the data and easy retrieval whenever necessary. The project was developed as part of a course requirement.

Responsibilities:

  • Follow the algorithms given by the professors and develop tables and database queries
  • Develop proper procedures and functions for the project
  • Develop triggers for proper working of the queries and ensure that proper error messages are generated upon errors
  • Use ORACLE as the backend connectivity and data storage
  • Write SQL and PL/SQL stored procedures to create database tables and to store data into tables
  • Use PL/SQL triggers to identify erroneous data entered by users (an illustrative trigger follows this section).

Environment: Oracle 9i, PL/SQL, Windows 7
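A minimal PL/SQL trigger sketch of the kind described above; the STUDENT table and its columns (fees_paid, roll_number) are assumptions based on the project description, not the actual coursework schema.

    -- Reject obviously erroneous student rows before they are stored.
    CREATE OR REPLACE TRIGGER trg_student_validate
    BEFORE INSERT OR UPDATE ON student
    FOR EACH ROW
    BEGIN
      -- Fees paid must be present and non-negative.
      IF :NEW.fees_paid IS NULL OR :NEW.fees_paid < 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'Fees paid must be a non-negative amount');
      END IF;

      -- Roll numbers are expected to be positive.
      IF :NEW.roll_number IS NULL OR :NEW.roll_number <= 0 THEN
        RAISE_APPLICATION_ERROR(-20002, 'Roll number must be a positive value');
      END IF;
    END;
    /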

Data Analyst

Satyam Computer Services Ltd
05.2009 - 07.2010

Project Description:

Project - Insurance Information System (IIS)

The client for this project (ICICI) is a leading insurance organization in India serving businesses and individuals with a wide range of insurance products and insurance-related services. This project is an Insurance Information System that gives online quotes for a customer's auto and home insurance in different states. The system keeps track of all insurance quotes, customer details, and related products, and consists of modules such as Customer Details, Benefits, and Packages. A provision is also available to apply directly for a policy and make the necessary payments online over the Internet.

Responsibilities:

  • Participate in the project life cycle from analysis to production implementation, with emphasis on system test plans, execution, analysis and results
  • Participate in walkthroughs of business requirements and design specifications for the projects
  • Detailed understanding of design specifications documents, Load Documents, S to T mappings, Transformation rules documents and Data Models
  • Extensively use simple and complex custom SQL queries to test the data and the transformation rules and table load rules from Source to Target tables

Environment: Oracle 9i, PL/SQL, .NET, JavaScript, Windows.

Education

Master of Science - Computer Engineering

Oklahoma Christian University
Edmond, OK
04.2012

Bachelor of Science - Computer Science and Engineering

JNTU
Hyderabad, India
03.2009

Skills

  • Operating Systems: Windows 95/98/00/XP, MS-DOS, UNIX
  • ETL/BI Tools: Informatica 9.1/8.6/8.1, Business Objects, MicroStrategy, Talend, SSIS
  • Amazon Web Services: EC2, S3, CloudWatch, Redshift, Lambda, Athena, etc.
  • Databases/DB Tools: Oracle 8.x/9.x/10.x, SQL Server 2008/2012, MySQL, Teradata, TOAD, SQL*Plus, SQL*Loader, SQL Developer, Netezza, SQL Assistant, SQL Workbench, SAP ASE/Sybase ASE, DB Visualizer
  • Programming: Core Java, .NET technologies (C#), SQL, Python
  • Test Management/Bug Tracking Tools: HP Quality Center, HP ALM, XRAY, qTest
  • Big Data/Hadoop Technologies: Hadoop, HDFS, HIVE, HBASE

Certification

SQL Advanced

SQL Basic

Data Warehouse and SQL

Timeline

Sr. Software Quality Assurance Analyst 3

Security Finance
04.2021 - 05.2022

ETL/AWS Data Quality Analyst/Lead

Client: Fannie Mae
01.2020 - 03.2021

Sr. ETL/Cloud/ Data Quality Analyst

Client: Fannie Mae
04.2018 - 12.2019

Sr. Database/ETL Data Quality Analyst

Client: Fannie Mae
08.2017 - 04.2018

Sr. Hadoop Data Analyst /QA

Blue Cross Blue Shield of Texas
05.2016 - 07.2017

Sr. Data Warehouse/Data Analyst

Visual Consultants Inc
07.2014 - 04.2016

ETL Data Quality Analyst

Client: Minnesota Department of Revenue (MNDOR)
05.2013 - 06.2014

ETL Data Quality Analyst /QA

Client: Deloitte Consulting LLP
06.2012 - 04.2013

Graduate Assistant

Oklahoma Christian University
08.2010 - 04.2012

Data Analyst

Satyam Computer Services Ltd
05.2009 - 07.2010

Master of Science - Computer Engineering

Oklahoma Christian University

Bachelor of Science - Computer Science and Engineering

JNTU