Led automated testing efforts for multiple projects, collaborating closely with cross-functional teams to ensure high-quality software delivery under aggressive timelines
Developed and maintained automated test scripts using Selenium WebDriver in Java
Developed test automation scripts for APIs using REST Assured
Used Postman/Thunder Client for API validation and ran collections via Testkube for smoke tests and quick regression
Designed and implemented the Page Object Model (POM) to enhance test script reusability and maintainability (see the sketch at the end of this list)
Created and managed test automation frameworks using TestNG, JUnit, and NUnit, and integrated them with Jenkins for CI/CD
Automated regression test suites, reducing manual effort by roughly 60-65% and accelerating release cycles
Collaborated with the development team to identify test scenarios and create comprehensive test plans
Utilized version control systems such as Bitbucket and GitHub for code repository management and version tracking
Reported and tracked defects in JIRA, working closely with developers to ensure timely resolution
Strengthened collaboration with developers, identifying and addressing issues in the early stages of the development process.
Actively participated in agile ceremonies such as sprint planning and retrospective meetings, providing valuable feedback on testing activities.
Increased software reliability, performing thorough regression tests to ensure seamless performance after updates.
Generated detailed test reports and metrics to track test execution, coverage and outcomes
Used Dynatrace to monitor full session replays during and after regression suite execution
Mentored junior QA/QE team members on automation frameworks and best practices.
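A minimal sketch of the Page Object Model and TestNG setup described above, with a REST Assured check in the same style; the class names, locators, credentials, and URLs are illustrative placeholders, not actual project code:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

// Page object: locators and actions live here rather than in tests,
// so a UI change is fixed in one place (the core idea of POM).
class LoginPage {
    private final WebDriver driver;
    private final By user = By.id("username");   // hypothetical locators
    private final By pass = By.id("password");
    private final By submit = By.id("login");

    LoginPage(WebDriver driver) { this.driver = driver; }

    void login(String username, String password) {
        driver.findElement(user).sendKeys(username);
        driver.findElement(pass).sendKeys(password);
        driver.findElement(submit).click();
    }
}

public class LoginTest {
    private WebDriver driver;

    @BeforeClass
    public void setUp() { driver = new ChromeDriver(); }

    @Test
    public void validLoginShowsDashboard() {
        driver.get("https://app.example.com/login"); // placeholder URL
        new LoginPage(driver).login("qa-user", "secret");
        Assert.assertTrue(driver.getTitle().contains("Dashboard"));
    }

    @Test
    public void healthEndpointReturns200() {
        // REST Assured style API check against a placeholder endpoint
        io.restassured.RestAssured.given()
            .when().get("https://app.example.com/api/health")
            .then().statusCode(200);
    }

    @AfterClass
    public void tearDown() { driver.quit(); }
}
```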
Performance
Gathered non-functional requirements by reviewing the NFR documents and inputs from the business and architecture teams
Prepared the test plan and test strategy for performance testing and had them reviewed and approved by the application, architecture, and business teams and other stakeholders
Created workload models, scheduled load tests, and executed them (see the workload sizing sketch at the end of this list)
Created test scripts for the identified application flows using tools such as LoadRunner, JMeter, and k6
Ensured the test environment and test data were set up for the identified scenarios
Modeled load for application scenarios and designed and executed load tests for assigned modules
Executed load tests in Testkube, LoadRunner Enterprise, and JMeter distributed mode
Raised defects for all performance issues found in the application
Performed root cause analysis and worked with the development team to find optimal solutions for performance defects and get fixes deployed
Prepared detailed performance test reports and shared them with the business, Scrum Masters, and the product and architecture teams
For each sprint, tested component-level performance of each microservice and end-to-end application performance for all applications, comparing results with earlier releases before go-live
Worked with the DBA and application teams on any performance defects found in the application
Created and distributed detailed status and test reports during and after each test cycle
Evaluated root causes of issues to recommend and initiate corrective actions
Collaborated with cross-functional teams to develop effective solutions for process improvements
Analyzed data to identify trends in performance and quality
Installed, administered, and maintained the Conduktor application for Kafka operations
Worked with testing tools such as JMeter, k6, LoadRunner, OATS, and NeoLoad to generate load
Worked with monitoring tools such as AppDynamics, Dynatrace, Grafana, Kibana, Splunk, Azure Application Insights, and Conduktor monitoring for Kafka
Delivered enterprise-level IT training and support on performance testing and load-generation tools
Trained junior resources and onboarded them onto performance testing projects to increase test coverage across IT (about 20% of my work)
The remaining 80% of my work was core performance testing and engineering activities.
Reduced production defects by designing and executing comprehensive quality control processes.
Supported quality team members during corrective action updates.
Enhanced automation test coverage by implementing new testing strategies and approaches.
Achieved higher levels of test accuracy by designing and executing data-driven tests for various input scenarios.
Mentored junior team members, sharing knowledge and insights on automation testing tools and techniques to foster continued growth and success within the team.
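As a worked illustration of the workload modeling mentioned in the list above: virtual-user counts can be sized from a target throughput using Little's Law (users = arrival rate x time each user spends per iteration). The figures below are hypothetical, not numbers from any actual engagement:

```java
// Sizing a load test with Little's Law: N = X * (R + Z)
// N = concurrent virtual users, X = target throughput (tx/sec),
// R = expected response time per transaction (sec), Z = think time (sec).
public class WorkloadModel {
    public static void main(String[] args) {
        double targetTps = 50.0;      // hypothetical target: 50 transactions/sec
        double responseTimeSec = 1.2; // expected average response time
        double thinkTimeSec = 3.0;    // scripted user think time

        double users = targetTps * (responseTimeSec + thinkTimeSec); // 210 users
        double pacingSec = users / targetTps; // iteration pacing to hold the rate

        System.out.printf("Virtual users needed: %.0f%n", Math.ceil(users));
        System.out.printf("Iteration pacing per user: %.1f sec%n", pacingSec);
    }
}
```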
GlobalLogic
Senior Performance Engineer
07.2018 - 01.2020
Job overview
Participated in the full software development life cycle (SDLC) from requirements gathering through system implementation
Developed comprehensive performance test plans and strategies for web-based applications
Planned test schedules or strategies in accordance with project scope or delivery dates
Ensured that all changes were tested adequately before being released into production environments
Provided technical support to troubleshoot any production issues related to application performance
Designed and implemented automated tests to measure the performance of applications under varying conditions
Worked closely with cross-functional teams such as developers and QA engineers throughout the entire process
Wrote well-designed, testable code
Created scripts in languages such as JavaScript, Python, and Perl to automate processes
Met deadlines while maintaining high-quality deliverables
Analyzed system log files and metrics from various monitoring tools to debug application performance issues
Performed root cause analysis of complex production issues by correlating data from multiple sources
Utilized advanced techniques such as caching, clustering, load balancing, and sharding to improve application efficiency (a simple caching sketch appears at the end of this list)
Developed and implemented automated performance testing strategies for high-traffic web applications using JMeter
Implemented proactive alerting mechanisms using Prometheus and Grafana dashboards to detect potential problems early on
Analyzed information to determine, recommend, and plan installation of new systems and modification of existing systems
Conducted performance analysis, identified bottlenecks, and provided remedial solutions
Monitored key system components during peak load conditions
Prepared reports or correspondence concerning project specifications, activities or status
Monitored application performance using various tools such as LoadRunner, AppDynamics
Designed, executed, and monitored load tests to identify scalability bottlenecks in distributed systems
Worked closely with DevOps team members to deploy infrastructure changes needed to support performance objectives
Optimized database queries for improved response times under heavy loads
Produced detailed reports on system performance metrics such as throughput, latency, and response times.
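To illustrate the caching technique referenced in the list above, here is a minimal in-memory cache with time-to-live expiry in plain Java; it sketches the general idea only and is not the caching layer actually used on the project:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal TTL cache: repeated lookups inside the TTL window skip the
// expensive backing call, which is the efficiency win caching buys.
public class TtlCache<K, V> {
    private record Entry<T>(T value, long expiresAtMillis) {}

    private final Map<K, Entry<V>> store = new ConcurrentHashMap<>();
    private final long ttlMillis;

    public TtlCache(long ttlMillis) { this.ttlMillis = ttlMillis; }

    public V get(K key, Function<K, V> loader) {
        long now = System.currentTimeMillis();
        Entry<V> cached = store.get(key);
        if (cached != null && now < cached.expiresAtMillis()) {
            return cached.value(); // fresh hit: no backing call
        }
        V value = loader.apply(key); // miss or expired: reload
        store.put(key, new Entry<>(value, now + ttlMillis));
        return value;
    }
}
```

Usage would look like `cache.get(userId, db::loadUser)`: the first call hits the backing store, and repeat calls within the TTL are served from memory.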
Infosys-Walgreens
Test Analyst
11.2017 - 07.2018
Job overview
Technology Health Remediation is a program to conduct performance testing for all the wholesale applications migrated to the new technology stack
Performance
Led an onshore-offshore model as onsite coordinator, assigning work to 16 offshore team members, ensuring on-time deliverables, and helping the team when they were stuck on issues
Gathered non-functional requirements by reviewing the NFR documents and inputs from business analysts
Understood and analyzed requirements for performance testing
Determined the feasibility of the performance test tool for the application under test
Prepared the test plan and test strategy for performance testing and had them reviewed and approved by the application team
Created test scripts for the identified application flows
Scheduled and executed load tests in Performance Center
Ensured the test environment and test data were set up for the identified scenarios
Modeled load for application scenarios and designed and executed load tests for assigned modules
Raised defects for all performance issues found in the application
Performed root cause analysis (RCA) and verified and tracked defects to closure
Prepared detailed performance test reports and shared them with the application team
Prepared and submitted status reports for assigned tasks to project stakeholders
Conducted daily status calls with the team to collect status for each application
Shared daily status updates with the client for each application
Managed, monitored and maintained performance test environments for various applications
Created detailed reports outlining the findings of each performance test conducted
Evaluated new technologies or techniques that may help improve overall application performance
Used a bug tracking system to report defects to software developers
Evaluated and recommended software for testing and bug tracking
Assisted in setting up an efficient environment for measuring the effectiveness of new features and functionalities added into the product line
Produced detailed reports on system performance metrics such as throughput, latency, and response times (see the percentile computation sketch at the end of this list)
Visited beta testing sites to evaluate software performance
Provided technical support to troubleshoot any production issues related to application performance
Developed and implemented scripts to automate the process of performance testing
Collaborated with development teams to resolve performance issues in a timely manner
Worked closely with stakeholders to ensure that their requirements were met within specified timelines
Performed root cause analysis of performance related issues during software development life cycle
Developed comprehensive performance test plans and strategies for web-based applications
Designed test plans, scenarios, scripts, and procedures
Participated in QA meetings with developers and other teams to discuss current projects and upcoming tasks
Documented all steps taken throughout the entire process, including test plans, results, and metrics
Maintained an up-to-date knowledge base of industry trends and best practices related to Performance Testing
Analyzed data from multiple sources to determine trends and develop actionable insights for optimization purposes
Installed and configured recreations of software production environments to allow testing of software performance
Conducted historical analyses of test results
Utilized a variety of tools such as JMeter, LoadRunner, and NeoLoad to perform load testing
Planned test schedules or strategies within project scope and delivery dates.
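A short sketch of how the latency percentiles in those reports can be computed from raw response-time samples (nearest-rank method); the sample values and duration are made up for illustration:

```java
import java.util.Arrays;

// Computes throughput and latency percentiles from raw samples,
// the kind of figures summarized in a performance test report.
public class LatencyReport {
    // Nearest-rank percentile: value at position ceil(p/100 * n) in sorted order.
    static long percentile(long[] sortedMillis, double p) {
        int rank = (int) Math.ceil(p / 100.0 * sortedMillis.length);
        return sortedMillis[Math.max(0, rank - 1)];
    }

    public static void main(String[] args) {
        long[] samples = {120, 95, 410, 230, 180, 150, 890, 140, 160, 210}; // hypothetical ms
        double testDurationSec = 5.0; // hypothetical test window

        Arrays.sort(samples);
        System.out.printf("Throughput: %.1f req/sec%n", samples.length / testDurationSec);
        System.out.println("p50: " + percentile(samples, 50) + " ms");
        System.out.println("p90: " + percentile(samples, 90) + " ms");
        System.out.println("p95: " + percentile(samples, 95) + " ms");
    }
}
```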
Automation
Participated in all release meetings, such as design reviews and test execution timeline discussions, to obtain the necessary technical requirements.
Identified high-level scenarios for test execution against requirements in the current release.
Conducted functional and regression testing using Selenium with data-driven and keyword-driven frameworks.
Created automated test scripts with a data-driven framework to test web applications using Selenium WebDriver with Java.
Executed batch jobs and analyzed the results to provide recommendations.
Participated in sprint planning, daily stand-ups, and retrospective meetings as part of an agile development team.
Developed test code in Java using the Eclipse IDE and the JUnit framework.
Conducted compatibility testing across multiple browsers, devices, and operating systems.
Worked closely with the development, application support, and helpdesk teams to troubleshoot issues in testing environments.
Involved in developing Integration Test Plans, System Test Plans and Performance Test Plans for the applications.
Performed regression testing after each build release of the application, executing and updating the Selenium regression suites and approving scripts for the regression suite.
Handled list boxes, drop-down menus, mouse actions, frames, synchronization, pop-ups, and all types of web elements using Selenium (illustrated in the sketch at the end of this list).
Executed automation scripts across multiple browsers.
Wrote complex XPath expressions using the following and preceding axes and functions such as contains() and not(contains()).
Contributed to the improvement of testing processes and methodologies, including the adoption of new tools and technologies.
Conducted backend testing with SQL queries to validate database data and to retrieve data for executing user-specific test cases.
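A brief sketch of the element handling and XPath patterns mentioned above (synchronization via explicit waits, drop-downs through Select, contains()/not() and axes); all locators and values are hypothetical:

```java
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.Select;
import org.openqa.selenium.support.ui.WebDriverWait;

public class ElementHandlingExamples {
    static void examples(WebDriver driver) {
        // Synchronization: explicit wait instead of fixed sleeps
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));

        // Drop-down menu handled through the Select helper
        WebElement country = wait.until(
            ExpectedConditions.visibilityOfElementLocated(By.id("country"))); // hypothetical id
        new Select(country).selectByVisibleText("United States");

        // XPath with contains() and not(contains()), plus a following axis
        By activeRows = By.xpath(
            "//tr[contains(@class,'row') and not(contains(@class,'disabled'))]");
        By labelAfterInput = By.xpath("//input[@id='email']/following::label[1]");

        driver.findElements(activeRows).forEach(row -> System.out.println(row.getText()));
        System.out.println(driver.findElement(labelAfterInput).getText());
    }
}
```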
Infosys-UBS-Offshore
Test Analyst
08.2015 - 10.2017
Job overview
Diagnosed network latency issues by capturing packet traces using Wireshark or similar tools
Monitored server resource utilization during load testing activities to identify areas of improvement or optimization opportunities
Conducted historical analyses of test results
Investigated existing systems for potential problems before launching new versions into production
Collaborated with development teams to ensure timely resolution of issues found during testing phase
Monitored bug resolution efforts and tracked successes
Participated in product design reviews to provide input on functional requirements, product designs, schedules, and potential problems
Documented key findings from each phase of the performance testing process along with recommendations for remediation steps
Performed load, stress and endurance tests on software applications to identify performance bottlenecks
Developed strategies for simulating user interactions at scale using cloud-based solutions (a minimal concurrency sketch appears at the end of this list)
Instrumented code via Application Performance Monitoring tools such as AppDynamics and Dynatrace
Performed manual and automated testing using various industry standard tools such as HP LoadRunner and JMeter
Documented test cases and created reports outlining findings from various types of tests performed
Planned test schedules or strategies within project scope and delivery dates
Monitored program performance to ensure efficient and problem-free operations
Evaluated existing architecture designs for potential scalability issues or limitations prior to launching into production environments
Developed automated scripts to perform performance testing on web applications using JMeter
Provided feedback and recommendations to developers on software usability and functionality.
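A minimal sketch of simulating many concurrent users with plain Java (java.net.http plus a thread pool); the URL and user count are placeholders, and the real work used the JMeter/LoadRunner tooling noted above:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Fires requests from many simulated users in parallel and prints
// per-request latency: the core loop behind any load generator.
public class MiniLoadGenerator {
    public static void main(String[] args) throws InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://app.example.com/health")) // placeholder URL
            .GET().build();

        int users = 25; // hypothetical concurrency level
        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int i = 0; i < users; i++) {
            pool.submit(() -> {
                long start = System.nanoTime();
                try {
                    HttpResponse<Void> resp =
                        client.send(request, HttpResponse.BodyHandlers.discarding());
                    long ms = (System.nanoTime() - start) / 1_000_000;
                    System.out.println("status=" + resp.statusCode() + " latency=" + ms + "ms");
                } catch (Exception e) {
                    System.out.println("request failed: " + e.getMessage());
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```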
Infosys-UBS
Test Analyst
08.2014 - 08.2015
Job overview
Project Title: Global Timesheet Registry System (GTRS)
Client: UBS, which runs internal projects for timesheet entry and project code allocation within UBS
Tested the performance of the application
Role and Responsibilities: Gathered non-functional requirements by reviewing the NFR documents and inputs from business analysts
Prepared the test plan and test strategy for performance testing and had them reviewed and approved by the application team
Created test scripts for the identified application flows
Ensured the test environment and test data were set up for the identified scenarios
Deployed and bounced the application as needed
Wrote Splunk queries and designed dashboards for the application modules
Monitored system resources with YourKit and generated CPU and memory snapshots for load tests
Modeled load for application scenarios and designed and executed load tests for assigned modules
Raised defects for all performance issues found in the application
Performed root cause analysis (RCA) and verified and tracked defects to closure
Prepared detailed performance test reports and shared them with the application team
Prepared and submitted status reports for assigned tasks to project stakeholders
Utilized SQL queries to analyze large datasets for insights into system performance (see the JDBC sketch at the end of this list)
Developed and implemented performance analysis processes to identify areas of improvement in existing systems
Performed root cause analyses on complex problems related to application or system issues, identifying solutions that improve overall efficiency
Provided training sessions for colleagues regarding proper use of tools used in performance analysis tasks.
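A sketch of the SQL-driven analysis referenced above: a JDBC query aggregating response times per module; the connection string, credentials, table, and column names are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Aggregates per-module average response times from a results table:
// the kind of query used to mine large result sets for slow spots.
public class ResponseTimeAnalysis {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/perfdb"; // hypothetical DSN
        try (Connection conn = DriverManager.getConnection(url, "perf", "secret");
             Statement stmt = conn.createStatement();
             // test_results(module, elapsed_ms) is a hypothetical schema
             ResultSet rs = stmt.executeQuery(
                 "SELECT module, AVG(elapsed_ms) AS avg_ms, COUNT(*) AS samples " +
                 "FROM test_results GROUP BY module ORDER BY avg_ms DESC")) {
            while (rs.next()) {
                System.out.printf("%-20s avg=%.1f ms  n=%d%n",
                    rs.getString("module"), rs.getDouble("avg_ms"), rs.getLong("samples"));
            }
        }
    }
}
```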
Infosys
Test Analyst
10.2013 - 08.2014
Job overview
Utilized SQL queries to analyze large datasets for insights into system performance
Delivered strategic guidance on how companies could better utilize their resources
Implemented best practices for monitoring application health, including setting up alerts when thresholds are exceeded (a threshold-alert sketch appears at the end of this list)
Created dashboards and reports to present results of performance analyses to executive-level management
Monitored server resources regularly, taking proactive steps when utilization levels exceed pre-defined thresholds
Collaborated with developers to improve product quality through performance optimization
Used a bug tracking system to report defects to software developers
Conducted system analysis and identified bottlenecks in application architecture and code
Installed, maintained, and used software testing programs
Developed and implemented scripts to automate the process of performance testing
Performed root cause analysis of complex system problems to identify areas of improvement
Identified areas where automation could be used to increase efficiency in the testing process
Produced detailed reports on system performance metrics such as throughput, latency, and response times
Conducted software compatibility tests with programs, hardware, operating systems, and network environments
Planned test schedules or strategies within project scope and delivery dates
Worked closely with cross-functional teams such as developers and QA engineers throughout the entire process
Performed root cause analysis of performance related issues during software development life cycle
Utilized a variety of tools such as JMeter, LoadRunner, and NeoLoad to perform load testing
Maintained an up-to-date knowledge base of industry trends and best practices related to Performance Testing
Documented test procedures to ensure replicability and compliance with standards
Conducted historical analyses of test results
Designed and developed automated testing tools
Ensured that all systems delivered an optimal user experience by analyzing response times under different loads.
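A minimal sketch of the threshold-based alerting described above: sample a metric on a schedule and raise an alert once it crosses a pre-defined limit; the 85% threshold and 15-second interval are example values (getCpuLoad requires Java 14+):

```java
import java.lang.management.ManagementFactory;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Polls a system metric periodically and logs an alert when the
// pre-defined threshold is exceeded: the shape of a simple health monitor.
public class ThresholdMonitor {
    public static void main(String[] args) {
        double cpuThresholdPct = 85.0; // hypothetical alert threshold
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        scheduler.scheduleAtFixedRate(() -> {
            double cpuPct = sampleCpuPercent();
            if (cpuPct > cpuThresholdPct) {
                // A real setup would page on-call or post to a channel here
                System.err.printf("ALERT: CPU %.1f%% exceeds %.1f%%%n", cpuPct, cpuThresholdPct);
            }
        }, 0, 15, TimeUnit.SECONDS);
    }

    // Stand-in for a real metric source (agent, JMX, or a monitoring API)
    static double sampleCpuPercent() {
        var os = (com.sun.management.OperatingSystemMXBean)
            ManagementFactory.getOperatingSystemMXBean();
        return os.getCpuLoad() * 100.0; // Java 14+; negative if unavailable
    }
}
```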
TCS
Systems Engineer
08.2010 - 10.2013
Job overview
Provided technical support for the development of large-scale distributed systems
Developed and implemented system performance monitoring tools to identify latency issues and optimize server resources
Evaluated system architecture to verify security and scalability
Monitored system logs for any suspicious activity or errors that need further investigation
Provided technical guidance in support of system development and troubleshooting
Monitored performance metrics of existing automation suites and identified areas where improvements could be made
Wrote test cases for new features based on product requirements and user interface design documentation
Created detailed test plans to ensure comprehensive coverage of all requirements
Evaluated experimental data to prepare engineering analysis, conclusions and recommendations
Provided input to engineers to assist in development of test protocols
Developed and implemented test plans, strategies, and procedures to ensure quality of products
Analyzed system logs to troubleshoot errors encountered during the execution of automation scripts
Participated in continuous improvement activities of testing services team
Investigated real-time data and application logs to identify bottlenecks and application issues
Maintained a library of reusable automation components to reduce development time (a sample component appears at the end of this list)
Configured Jenkins jobs for continuous integration and continuous delivery pipelines
Designed and developed automated testing tools
Created quality and safety-related documentation to support test validity.
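A sample of the kind of reusable automation component mentioned above: a generic retry helper any suite can call for flaky steps; the attempt count and delay are arbitrary example values:

```java
import java.util.function.Supplier;

// Generic retry helper kept in a shared library so individual test
// suites don't each re-implement flaky-step handling.
public final class Retry {
    private Retry() {}

    public static <T> T withRetries(Supplier<T> action, int maxAttempts, long delayMillis) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.get();
            } catch (RuntimeException e) {
                last = e;
                System.out.println("Attempt " + attempt + " failed: " + e.getMessage());
                try {
                    Thread.sleep(delayMillis); // back off before the next attempt
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    break;
                }
            }
        }
        throw last != null ? last : new IllegalArgumentException("maxAttempts must be >= 1");
    }
}
```

A test would call it as `Retry.withRetries(() -> page.readBalance(), 3, 500)`, retrying up to three times with a 500 ms pause.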