
Karthik Hulkoti

Chandler, AZ

Summary

  • Around 15 years of experience as a QA Engineer, ETL & Big Data Tester, Data Analyst and Business System Analyst working on Big Data and ETL data analytics projects
  • Experience in business application software testing, quality assurance, testing processes and testing methodologies, with strong experience in the STLC and overall testing approaches
  • Experienced in leading teams from project inception through release planning
  • Develop mapping logic for all data elements per project requirements and establish transformation logic from business requirements through to ETL deployment
  • In-depth knowledge of Treasury/ALMT regulatory reporting such as US LCR, FR2052a 5G/6G, KLMIT, ILST and IFRS 9 reporting
  • Experience working on data profiling and data governance projects
  • Experience performing preliminary analysis on projects and presenting data that supports the proposed solution/design
  • Experience regulating data from SORs (sources of record) with valid data quality rules and creating dashboards in OBIEE, Power BI and Tableau
  • Experience working on capital markets, liquidity and risk management projects
  • Good knowledge of DWH concepts, ETL testing and Big Data testing
  • Good knowledge of ETL tools such as Informatica PowerCenter and Ab Initio
  • Experience working with major components of the Big Data ecosystem: Hadoop, Hive, Sqoop, Spark
  • Experience in data management and in testing Big Data applications implemented using Spark and Python
  • 15 years of industry experience across domains including Banking, Insurance, Healthcare and Retail
  • 6+ years of experience in the Agile development life cycle
  • Good work experience in data warehousing/Big Data/Hadoop/business reporting projects using technologies and tools such as Informatica, Teradata, MicroStrategy, Tableau, Business Objects, Mainframe, SSIS packages, SQL Server 2008, Azkaban scheduler, Big Data file concepts, Google Cloud Platform and DevOps CI/CD
  • Expertise in ETL testing, business report testing, Big Data testing and automation testing
  • Very good with SQL scripts, PL/SQL, Unix shell scripting, Python programming and Hive Query Language

Overview

15 years of professional experience

Work History

Sr QA Engineer/Business Data Analyst

BMO Financial Group
05.2019 - Current

Environment:

Data warehouse, HDFS, Hive, Sqoop, Gerrit, Jenkins, Shell, Teradata, Python, Agile/Scrum model, Tidal scheduling tool, PL/SQL, Jira, OBIEE and Tableau reporting tools, data analysis, business system analysis, data profiling, data governance

Responsibilities:

  • Analyze business requirements and rules and work with business analysts and users to design data flow mappings, test plans and business solutions for the ETL logic needed to load heterogeneous data from the bank's various source systems
  • Provide QA estimates for all test deliverables and review business requirement and technical specification documents
  • Work in an Agile development life cycle and follow test-driven iterative development
  • Attend sprint planning sessions and daily stand-ups; prioritize and estimate stories covering QA script preparation, test cases, test execution and Product Owner review
  • Create test strategy, test optimization and effort distribution in an Agile environment; facilitate strategy reviews, secure approvals from stakeholders for all QE activities and analyze data gaps
  • Actively collaborating with developers and business stakeholders to clarify requirements, especially in terms of testability, consistency, and completeness
  • Develop UDFs to encrypt and decrypt customer personal information for data protection
  • Generate and validate reports and data in Tableau, using aggregated data to provide visual representations
  • Develop automated and manual SQL test scripts to test ETL loads, data aggregations, data conversions, data extracts, data warehouse loads and Slowly Changing Dimension (SCD) types
  • Prepare complex SQL and PL/SQL scripts for data validation in Oracle, Teradata and other databases, and automate count and mapping tests for various schemas and tables across the database (see the sketch after this list)
  • Build and maintain a CI/CD automation framework to automate all test cases used to validate data in files and tables
  • Prepare test scenarios/test cases based on acceptance criteria and review with product owner
  • Prepare test data for different test scenarios/business rules
  • Validate regulatory reports such as EU LCR, US LCR, FR2052a, ILST and CORRE, which carry complex business rules, as part of federal regulatory submissions
  • Design automation scripts for sanity, smoke and regression tests using Unix shell scripting and Python, and create Jenkins builds to set up the automation framework
  • Sqoop data from Oracle databases and other source files into Hive and Oracle tables using shell scripts and HDFS commands
  • Record and track all defects identified during the testing phase and cycle back with the core development team for fixes and fine-tuning
  • Develop defect metrics reports for various project stakeholders and teams
  • Perform business system analysis, data analysis, root cause analysis and recovery from major/critical system issues, and provide impact analysis reports to the business and development teams as required
  • Perform validation of reports on OBIEE and Tableau dashboards, data quality checks and KPIs (key performance indicators) using SQL and PL/SQL scripts
  • Identify automation scope within the project and propose a feasible automation framework
  • Develop mapping logic and health-check solutions for data elements to ensure data profiling and data quality rules are not breached
  • Using knowledge of metadata and star and snowflake schemas, set up scripts to create test source tables and data in support of testing loads into staging, fact and dimension tables in the target data warehouse
  • Set up and launch Unix scripts to trigger batch jobs that load target tables through the backend
  • Produce reports that are clear, effective, insightful and traceable, whether for defects discovered during testing or non-compliance issues found during process reviews or audits; utilize existing tracking systems, reporting mechanisms, agreed conventions and metrics when preparing such reports
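A minimal sketch of the kind of source-to-target count, mapping and SCD Type 2 validation queries described above; the stg.STG_CUSTOMER and dw.DIM_CUSTOMER tables and their columns are hypothetical placeholders, not the bank's actual schema.

    -- Count reconciliation between a hypothetical staging table and its target dimension
    SELECT 'STG_CUSTOMER' AS table_name, COUNT(*) AS row_count FROM stg.STG_CUSTOMER
    UNION ALL
    SELECT 'DIM_CUSTOMER', COUNT(*) FROM dw.DIM_CUSTOMER WHERE current_flag = 'Y';

    -- Mapping check: source rows missing or mismatched in the current target rows
    SELECT s.customer_id, s.customer_name, s.country_code
    FROM   stg.STG_CUSTOMER s
    MINUS
    SELECT t.customer_id, t.customer_name, t.country_code
    FROM   dw.DIM_CUSTOMER t
    WHERE  t.current_flag = 'Y';

    -- SCD Type 2 sanity check: each business key should have exactly one open (current) row
    SELECT customer_id, COUNT(*) AS open_rows
    FROM   dw.DIM_CUSTOMER
    WHERE  current_flag = 'Y'
    GROUP  BY customer_id
    HAVING COUNT(*) <> 1;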

Software Engineer/ETL & Big Data QA Lead

Wipro Ltd, Bank of the West
11.2015 - 05.2019

Environment:

Data warehouse, Big Data ecosystem, Google Cloud Platform (GCP), HDFS, Hive, Sqoop, Gerrit, Jenkins, Shell, Teradata, Python, Spark, Agile/Scrum model, Tidal scheduling tool, PL/SQL, Jira, OBIEE and Tableau reporting tools

Responsibilities:

  • Analyzed business requirements and rules and worked with business analysts and business users in preparing the test plan
  • Provide QA estimates for all test deliverables
  • Work in an Agile development life cycle and follow test-driven iterative development
  • Attend sprint planning sessions; prioritize and estimate stories covering QA script preparation, test cases, test execution and Product Owner review
  • Attend daily stand-ups and update the Scrum team on completed tasks and upcoming planned work
  • Create test strategy, test optimization and effort distribution in an Agile environment; facilitate strategy reviews and secure approvals from stakeholders for all QE activities
  • Actively collaborating with developers and business stakeholders to clarify requirements, especially in terms of testability, consistency, and completeness
  • Measuring and reporting test coverage across all applicable coverage dimensions
  • Process demand, verified data and offers using Spark to analyze various loyalty and Kohl's charge programs
  • Implement incremental ingestion models for multiple databases, moving data from MySQL and Teradata to a GCP big data bucket using Sqoop
  • Develop UDFs to encrypt and decrypt customer personal information for data protection
  • Generate and validate reports and data in Tableau, using aggregated data to provide visual representations
  • Migrate and validate aggregated data from Hive to BigQuery for faster query responses for business teams
  • Analyze and fix code and optimize flows to avoid long processing times and wasted cluster resources
  • Perform Tableau report validation as part of the Tableau version upgrade; also created a test automation suite using the BI Validator tool
  • Analyze YARN to understand major blockages
  • Work with GCP DevOps and security developers to fix environment issues and integrate projects into the GCP environment
  • Build and maintain a CI/CD automation framework to automate all test cases used to validate data
  • Prepare test scenarios/test cases based on acceptance criteria and review with product owner
  • Prepare test data for different test scenarios/business rules and manage it in Google Cloud/Teradata database environments
  • Prepare complex SQL and HQL scripts for data validation in Hive, Oracle and other databases (see the sketch after this list)
  • Validate regulatory reports such as EU LCR, US LCR, 5G and CORRE as part of federal regulatory submissions
  • Perform Integration, System, Regression testing, UAT and Deployment Support for implementations of various Business requirements
  • Sqoop data from Oracle databases and other source files into Hive tables
  • Record and track all defects identified during the testing phase and cycle back with the core development team for fixes and fine-tuning
  • Prepare QA artifacts from the Master Test Plan (covering overall testing activities and strategy) through the Test Closure Summary
  • Prepare daily status and defect reports and review them with stakeholders
  • Provide root cause analysis and recovery methods for major/critical system issues
  • Perform business system analysis and provide impact analysis reports to the business and development teams as required
  • Perform validation of reports on OBIEE and Tableau dashboards
  • Perform validation of data quality checks and KPIs (key performance indicators) using SQL and PL/SQL scripts
  • Identify automation scope within the project and propose a feasible automation framework
  • Implement and maintain a CI/CD automation framework using Jenkins for DevOps
  • Maintain Jenkins build job executions and present them on customer request
  • Implement and validate semi-structured data formats such as XML and JSON
  • Maintain code versioning using GitHub, Gerrit and Jenkins
  • Prepare POCs (proofs of concept) to demonstrate the advantages of new technologies and propose more creative and better solutions to business needs
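A minimal sketch of the kind of Hive-to-BigQuery reconciliation and HQL validation described above; the edw.card_transactions and analytics_dw.card_transactions tables and columns are hypothetical placeholders used only for illustration.

    -- HiveQL side: daily aggregate treated as the expected result
    SELECT txn_date, COUNT(*) AS txn_count, SUM(txn_amount) AS total_amount
    FROM   edw.card_transactions
    GROUP  BY txn_date;

    -- BigQuery side: same aggregate on the migrated table; the two result sets
    -- are joined or diffed on txn_date to confirm counts and totals match
    SELECT txn_date, COUNT(*) AS txn_count, SUM(txn_amount) AS total_amount
    FROM   `analytics_dw.card_transactions`
    GROUP  BY txn_date;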

Sr ETL QA Engineer/IT Analyst

Tata Consultancy Services, Chase Bank
08.2008 - 11.2015

Environment:

Teradata SQL Assistant, Informatica, Toad for Oracle, Oracle and Teradata databases, TWS scheduling tool, PL/SQL, Unix shell scripting, Perl programming

Responsibilities:

  • Requirement Gathering and liaising between Business and Project team
  • Actively collaborating with developers and business stakeholders to clarify requirements, especially in terms of testability, consistency, and completeness
  • Providing QA estimates for all Test deliverables
  • Prepare test scenarios/test cases based on acceptance criteria and review with product owner
  • Prepare test data for different test scenarios/business rules and manage it in Google Cloud/Teradata database environments
  • Load data into target tables using the Informatica ETL tool
  • Create complex SQL scripts to be executed in Teradata SQL Assistant and perform source-to-target validation (see the sketch after this list)
  • Analyzed business requirements and rules and worked with business analysts and business users in preparing the test plan and test design
  • Provide test plan, test case and result walkthroughs to all business users, stakeholders and the Product Owner for testing sign-off
  • Execute stored procedures to test PL/SQL code for various stages of data loads into the data warehouse and data marts
  • Involved in extensive data validation using SQL queries and back-end testing
  • Communicate regularly with developers, BAs and the project manager to ensure project deliverables are on track
  • Coordinate onsite and offshore QA teams
  • Manage all QA activities such as test planning, requirement gathering, release management, test execution and defect management in ALM and Quality Center
  • Prepare QA artifacts from the Master Test Plan (covering overall testing activities and strategy) through the Test Closure Summary
  • Extract data from heterogeneous sources and process it into staging, work and target tables
  • Load data into tables using TWS (Tivoli Workload Scheduler) and the Control-M scheduling tool
  • Validate data by applying all business rules and processing it through to the BASE layer
  • Create SQL scripts to be executed in Teradata SQL Assistant for source-to-target validation and to verify CDC (change data capture) logic correctness on target tables
  • Optimize and tune several complex SQL queries for better performance
  • Track defects in HP ALM and QC, working with the dev and BA teams to resolve issues quickly and unblock testing activities
  • Communicate regularly with dev leads, BAs, the project manager and onshore/offshore teams to ensure project deliverables are on track
  • Measuring and reporting test coverage across all applicable coverage dimensions
  • Prepare daily status and defect reports and review them with stakeholders
  • Perform unit, integration, system, regression and end-to-end testing (manual and automated)
  • Automate file validation and file-to-table comparison using shell and Perl scripting
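A minimal sketch of the Teradata-style source-to-target and CDC validation queries described above; the stg_db.txn_stage and base_db.txn_base tables and their columns are hypothetical placeholders, not the actual warehouse schema.

    -- Source-to-target check: staged rows from today's load that never reached the base table
    SELECT s.acct_id, s.txn_id
    FROM   stg_db.txn_stage s
    LEFT JOIN base_db.txn_base b
           ON b.acct_id = s.acct_id AND b.txn_id = s.txn_id
    WHERE  b.txn_id IS NULL
      AND  s.load_date = CURRENT_DATE;

    -- CDC check: changed source values must be reflected in the base table
    SELECT s.acct_id, s.balance AS src_balance, b.balance AS tgt_balance
    FROM   stg_db.txn_stage s
    JOIN   base_db.txn_base b
           ON b.acct_id = s.acct_id AND b.txn_id = s.txn_id
    WHERE  s.balance <> b.balance;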

Education

Bachelor Of Engineering - Computer Science And Engineering

B M S College of Engineering
Bangalore, India
06.2008

Skills

  • Databases: Oracle, Teradata, MS SQL Server
  • DB Client Tools: Teradata SQL Assistant, TOAD for Oracle, SQL Developer
  • Big Data Ecosystem: Hadoop, Hive, Spark, Sqoop, HDFS
  • Scheduling Tools: Azkaban, Airflow, TWS (Tivoli Workload Scheduler), Control-M, Autosys, Tidal
  • Test Management Tools: HP Quality Center, ALM, qTest, Jira
  • ETL Tools: Informatica, Ab Initio, Netezza
  • MS Office: Word, Access, PowerPoint, Excel
  • Test Automation Tools: Selenium with Python, data validation framework using Python
  • Programming Languages: Shell scripting, SQL, PL/SQL, Python
  • Reporting Tools: OBIEE, Microsoft Power BI, Tableau
  • DevOps: CI/CD frameworks using Jenkins, Gerrit
  • Cloud: AWS Data Analytics (Redshift, S3, Glue, EMR, Athena)

Additional Information

  • Experience working with Google Cloud (GCP) Dataproc and BigQuery
  • Good understanding of Hadoop architecture, including YARN and components such as HDFS, Resource Manager, Node Manager, NameNode and DataNode
  • Experience importing and exporting data between RDBMS and HDFS using Sqoop
  • Good knowledge of Teradata SQL Assistant, Toad for Oracle, SQL Developer and SQL Server databases
  • Good knowledge of Python programming and experience preparing automation scripts for data validation in Python
  • Experience with the OBIEE and Tableau reporting tools for validating BI reports used by the business for data analytics
  • Experience in Unix shell scripting and Perl automation, and in maintaining shell scripts for automation purposes
  • Experience working with ETL job schedulers such as Control-M, TWS (Tivoli Workload Scheduler) and Autosys, and with Big Data job schedulers such as Azkaban and Airflow DAGs used to load data into target tables
  • Expert-level knowledge of the entire software development life cycle (SDLC), from business analysis through development, testing, deployment, documentation, maintenance and user training; experience with Agile and Scrum methodologies
  • Experience working with DevOps and with CI/CD automation frameworks using Jenkins, GitHub and Gerrit
  • Worked extensively with Jira to create stories, dashboards and related metrics reports
  • Worked extensively with HP Quality Center/ALM and qTest to map requirements to test cases, create test sets and carry out test runs manually as well as with automated test scripts
  • Strong ETL testing experience along with SQL and PL/SQL testing experience; experience analyzing and testing PL/SQL scripts and executing/troubleshooting stored procedures as part of PL/SQL testing
  • Extract defect status reports, manage defect status in HP ALM and conduct defect status calls
  • Experienced in testing web applications and client/server applications
  • Prepare test status and defect status reports and review them daily
  • Excellent experience preparing test plan documents, including test strategy, and test closure summaries
  • Expertise in creating test data for extraction and transformation processes and resolving critical data issues by following the project's data standards
  • Extensive working knowledge of the Unix environment
  • Strong working experience in preparing test plans and test runs and managing test runs effectively using HP ALM and qTest
  • Involved in end-to-end QA artifact preparation from the Master Test Plan to the Test Closure Summary

Timeline

Sr QA Engineer/Business Data Analyst

BMO Financial Group
05.2019 - Current

Software Engineer/ETL & Big Data QA Lead

Wipro Ltd, Bank of the West
11.2015 - 05.2019

Sr ETL QA Engineer/IT Analyst

Tata Consultancy Services, Chase Bank
08.2008 - 11.2015

Bachelor Of Engineering - Computer Science And Engineering

B M S College of Engineering
06.2008