Experienced and reliable Quality Assurance professional with over 11 years of experience ensuring the highest-quality outcomes in the Mobile, Health Insurance, Construction & Engineering, and Leave & Disability domains.
Possess excellent skills in Manual Testing, Salesforce Functional Testing and Automation Testing.
Strong experience with Selenium WebDriver using the Java and Python programming languages.
Expertise in hybrid frameworks built on the Page Object Model.
Worked on API automation using Salesforce API services.
Proficient in designing and executing the test cases based on the Business requirements.
Certified AWS Cloud Practitioner.
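As context for the Page Object Model mentioned above, a minimal sketch in Python (the `LoginPage` class and its locators are illustrative, not taken from any actual project):

```python
# Minimal Page Object Model sketch. The page class owns all locators and
# actions for one screen, so tests stay readable and a locator change is
# isolated to a single place. Names here are purely illustrative.

class LoginPage:
    # Locators kept in one place as (strategy, value) pairs.
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("id", "login-btn")

    def __init__(self, driver):
        # Any Selenium-like driver exposing find_element(strategy, value).
        self.driver = driver

    def login(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
```

A test then reads as `LoginPage(driver).login("alice", "secret")`, with no locators in the test body.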
Overview
11 years of professional experience
1 Certification
Work History
Quality Assurance Engineer
Amazon.com Services LLC
10.2022 - Current
Design test automation frameworks and write automation test scripts to improve test automation coverage
Shortlist the manual test cases that provide the most value when automated, reducing manual testing time and effort
Write automated tests that evaluate a specific feature through a series of steps covering end-to-end functionality and real-time scenarios
Define test automation plans that work across multiple platforms and browsers
Design and publish reports after every test run to keep stakeholders updated on testing progress for every sprint
Add test cases to the deployment pipeline so that every new line of code is validated against the team's automated tests
Rerun failed automated test cases and debug the failures, updating the test case when functionality has changed or raising a defect when the failure is due to a bug
Perform thorough regression testing when the bugs are resolved
Monitor the health of the application APIs by running automated test suites after every deployment and as part of regression testing, verifying that all application integrations remain intact after deployment
Review requirements, specifications, and technical design documents to provide timely and meaningful feedback
Create detailed, comprehensive, and well-structured test plans and test cases
Estimate the effort and resources, prioritize the functionality that needs to be tested, plan the timelines, and coordinate testing activities for each production release
Develop and apply testing processes for new and existing products to meet client needs
Work directly with Developers, Product Managers, and Technical Program Managers to ensure product quality
Execute an extensive list of test cases to scrutinize the given feature and discover any unaddressed gaps
Perform the types of testing finalized in the test strategy, such as sanity, regression, API, and performance testing
Prioritize and communicate defect status, collaborating closely with development teams to ensure fast remediation
Facilitate communication between teams to resolve complex issues and ensure product quality
Identify and report issues where the given feature either does not work as expected or throws errors
Retest these issues once they are fixed
Analyze system logs, data, and application performance to identify the root cause of failures, particularly through tools such as AWS CloudWatch, DynamoDB, and Athena
Sign off on a feature once all test cases are executed and no open issues remain, letting stakeholders know that testing is complete and deployment to production can proceed
Organize and drive bug triages to define the criticality of open issues, track bug fixes, and discuss the finalized approach for fixing a particular issue
Draft Go/No-Go documents and drive related discussions with stakeholders and managers from each workstream of the application to review any open issues, defects, or risks before launch
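The rerun-and-triage flow described in this role (rerun failures, then either update the case or raise a defect) can be sketched as a small generic helper; the names and the assertion-based pass/fail convention are illustrative:

```python
# Hypothetical rerun helper: run each test once, rerun failures up to
# `retries` times, and separate cases that stay red (candidate defects)
# from cases that pass on retry (likely flaky or environment-related).

def run_with_reruns(tests, retries=1):
    results = {"passed": [], "flaky": [], "failed": []}
    for name, test in tests.items():
        attempts = 0
        while True:
            attempts += 1
            try:
                test()
                bucket = "passed" if attempts == 1 else "flaky"
                results[bucket].append(name)
                break
            except AssertionError:
                if attempts > retries:
                    results["failed"].append(name)  # raise a defect for these
                    break
    return results
```

Cases landing in `failed` are the ones worth a defect report; `flaky` cases point at timing or environment issues rather than product bugs.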
Salesforce QA Engineer
Amazon Development Centre India Pvt. Ltd.
04.2020 - 10.2022
Design test automation frameworks and write automation test scripts to improve test automation coverage
Shortlist the manual test cases that provide the most value when automated, reducing manual testing time and effort
Write automated tests that evaluate a specific feature through a series of steps covering end-to-end functionality and real-time scenarios
Define test automation plans that work across multiple platforms and browsers
Design and publish reports after every test run to keep stakeholders updated on testing progress for every sprint
Add test cases to the deployment pipeline so that every new line of code is validated against the team's automated tests
Rerun failed automated test cases and debug the failures, updating the test case when functionality has changed or raising a defect when the failure is due to a bug
Perform thorough regression testing when the bugs are resolved
Monitor the health of the application APIs by running automated test suites after every deployment and as part of regression testing, verifying that all application integrations remain intact after deployment
Review requirements, specifications, and technical design documents to provide timely and meaningful feedback
Create detailed, comprehensive, and well-structured test plans and test cases
Estimate the effort and resources, prioritize the functionality that needs to be tested, plan the timelines, and coordinate testing activities for each production release
Develop and apply testing processes for new and existing products to meet client needs
Work directly with Developers, Product Managers, and Technical Program Managers to ensure product quality
Execute an extensive list of test cases to scrutinize the given feature and discover any unaddressed gaps
Perform the types of testing finalized in the test strategy, such as sanity, regression, API, and performance testing
Identify and report issues where the given feature either does not work as expected or throws errors
Retest these issues once they are fixed
Sign off on a feature once all test cases are executed and no open issues remain, letting stakeholders know that testing is complete and deployment to production can proceed
Organize and drive bug triages to define the criticality of open issues, track bug fixes, and discuss the finalized approach for fixing a particular issue
Draft Go/No-Go documents and drive related discussions with stakeholders and managers from each workstream of the application to review any open issues, defects, or risks before launch
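The post-deployment API health monitoring described in this role can be sketched as a small check that walks a list of endpoints; the endpoint paths and the injected `fetch` function are illustrative, not any real service's API:

```python
# Hypothetical post-deployment health check. `fetch` is injected so the
# same check can run against a real HTTP client or a stub; it must take
# a URL and return an integer HTTP status code.

def check_endpoints(urls, fetch):
    failures = [u for u in urls if fetch(u) != 200]
    return {"healthy": not failures, "failures": failures}
```

Wired into a deployment pipeline, a non-empty `failures` list blocks sign-off until the broken integration is investigated.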
Quality Assurance Engineer
Oracle
08.2017 - 03.2020
Implemented an automation framework from scratch using Selenium WebDriver with Java and TestNG, with the Page Object Model as the design pattern
Developed a CI/CD pipeline in Jenkins for scheduled overnight automation test executions
Owned the automation of the N-1 sprint's test cases, executing runs and analyzing failures
Responsible for Smoke testing and Regression testing for every release
Responsible for automation framework enhancements and maintenance of the scripts and framework
Automated manual test cases to reduce the manual effort of regression testing for every release
Debugged and analyzed automation test results and raised functional issues accordingly
Maintained existing automation scripts in line with recent changes in product functionality
Monitored product health for every build deployment using CI/CD pipeline runs and escalated major issues as needed
Published automation test reports to stakeholders and product managers
Defined the automated regression testing process including UI tests and API tests
Senior Software Engineer-QA
ValueLabs
09.2013 - 07.2017
Involved in preparation of Test Plan, Test Cases, Test Scripts, Test Scenarios and QA Handover Document to cover overall quality assurance testing
Performed both Manual and partial Automation testing and verified actual results against expected results
Conducted Functional Testing, Regression Testing, Smoke Testing, User Acceptance Testing
Updated the Traceability Matrix daily and sent status and daily tracking reports to the test manager
Sent weekly status reports to the client summarizing the automation team's efforts
Participated in peer review meetings with the team for test planning and analyzed user stories to write test cases
Worked in an Agile development environment with frequently changing requirements and features
Prepared Requirement Traceability matrix, Test data, Test strategy, Test Coverage Matrix, and Test reports
Provided regression support for weekly builds by executing test cases, evaluating results, and tracking new and fixed bugs
Involved in analyzing Test Plan, Test Scenario, Test Cases, User Guides, Defect Management, Metrics, Release Report, Review, and Status Reports
Coordinated with the development team on bug-fix issues
Performed User Acceptance and Regression Testing with UAT test scenarios
Found and reported defects, then validated the fixes, repeating the process until resolution
Coordinate efforts between development teams and offshore enterprise test team
Responsible for tracking the defect ownership through the SDLC phases
Participated in BRD walkthroughs along with Business System Analysts, development team to understand the requirements
Involved in Agile sprint planning and daily scrums, which included three-week sprints
Participated in daily stand-up meetings where the stakeholder was involved along with Scrum Master and developers
Developed the Test Plan, Test scripts, and Test cases for all the different modules in the system to perform the functional and regression testing
Automated the Test Cases and created the Test Suites locally within Eclipse for Functional, Regression and Browser Compatibility using Selenium WebDriver
Prepared and executed test scripts using Robot Framework and Selenium WebDriver
Performed API testing using a REST client library
Prepared test data in JSON and MS Excel files and read it during test execution
Used version control tools such as SVN; merged files from local branches into the central repository trunk
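The data-driven approach above (test data kept in JSON files and read at execution time) can be sketched as follows; the file layout and record fields are made up for the example:

```python
# Illustrative data-driven step: test cases live in a JSON file and are
# loaded at execution time, so data changes never touch test code.
# The record fields ("a", "b", "expected") are hypothetical.
import json

def load_cases(path):
    with open(path) as f:
        return json.load(f)

def run_case(case):
    # Stand-in check: the expected value must match the computed one.
    return case["expected"] == case["a"] + case["b"]
```

The same pattern extends to Excel-backed data by swapping `load_cases` for a spreadsheet reader while the test logic stays unchanged.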