Executed complex Quality Assurance activities using appropriate test tools and best practices
Identified and thoroughly analyzed defects, issues, and risks.
Hadoop QA Engineer
Bank of America - Tata Consultancy
Jacksonville, FL
10.2015 - 03.2018
Conducted comprehensive testing, including infrastructure, capabilities, load/stress, and performance testing, across various Hadoop platforms using performance-driven methodologies
Managed certification processes for new tool versions and software upgrades (JDK, CDH, Cloudera Manager) in development, testing, and production clusters
Developed scripts to optimize Hive and Impala performance
Automated transactions and routine processes using UNIX Korn shell scripting
Executed proof-of-concepts (POCs) for new data wrangling and ETL tools like Pentaho and Trifacta
Utilized Apache Kafka for handling log messages across multiple systems
Ensured effective testing of cross-application communication.
Hadoop Developer - Deposits
Tata Consultancy
Chennai, India
02.2012 - 09.2015
Production Support Analyst
Tata Consultancy
Developed Informatica sessions and workflows to handle ETL processing
Created DataStage jobs and stages for ETL
Pulled data from multiple sources, including mainframe, UNIX, and Oracle
Monitored systems and resolved job failures quickly
Ensured the stability of Bank of America's Teradata environment
Ensured on-time data availability to meet SLAs
Analyzed code to resolve issues raised by clients and users
Performed annual disaster recovery exercises to ensure data availability in disaster scenarios
Handled Teradata hardware and software upgrades
Reviewed code and standards for new applications entering production
Implemented code fixes and enhancements for recurring abends.
Education
Skills
Domain and Technology: Banking, Communication and Financial Services, Data Warehousing, Insurance
Guidewire Certified Specialist - DataHub and InfoCenter Integration (certified on the Elysian release, upgraded to Innsbruck)
Career Accomplishments
10+ years of experience across Quality Assurance, Data Validation, Application Testing and Development, and Data Analysis teams, with proficiency in Data Warehousing and RDBMS technologies. 5+ years of ETL and Data Integration experience developing ETL mappings and scripts.