
Litton Shahreair

Farmington Hills, MI, USA

Summary

Software Performance Test Engineer specializing in cloud-based applications and performance optimization at Cengage. Proficient in JMeter and LoadRunner, providing actionable insights through comprehensive performance analysis. Communicates findings effectively to stakeholders, enhancing system reliability and scalability.

Overview

18 years of professional experience

Work History

Software Performance Test Engineer

Cengage
Farmington Hills, MI
01.2018 - Current
  • Worked on cloud-based enterprise applications (AWS), ensuring performance, scalability, and reliability across distributed systems.
  • Created and maintained performance test scripts in Apache JMeter for web applications, RESTful APIs, and backend services, using parameterization, correlation, and assertions to simulate realistic user behavior.
  • Developed modular and reusable JMeter test plans using CSV Data Set Config, Regular Expression Extractors, and JSON Path Extractors to support dynamic and data-driven testing.
  • Integrated JMeter test execution into CI/CD pipelines using Jenkins, enabling automated regression and performance verification.
  • Utilized Dynatrace to monitor the performance and availability of web, application, and database servers, with focus on JVM health, thread utilization, CPU, memory, and backend errors.
  • Conducted in-depth log analysis using Splunk, Dynatrace, and custom profiling tools to identify system anomalies and bottlenecks.
  • Recommended and implemented JVM tuning, GC optimization, and cache configuration strategies for improved system performance.
  • Monitored and analyzed cloud-native microservices and distributed workloads using Dynatrace, capturing transaction traces across all tiers.
  • Designed and executed benchmark and load tests, validating system capacity and identifying performance degradation under high concurrency.
  • Scheduled and managed load test cycles, monitored middleware, application, and network performance using Datadog, and reported results to stakeholders.
  • Created comprehensive Performance Test Designs and documentation, aligned with functional and technical specifications outlined in test plans.
  • Conducted performance testing using Gatling for web-based applications, ensuring test reliability and repeatability.
  • Designed performance test architectures and defined end-to-end test scenarios covering stress, endurance, and capacity testing.
  • Leveraged Performance Center to manage test assets, organize requirements, and execute structured test campaigns.
  • Utilized LogicMonitor and Grafana for real-time monitoring and tuning of server-side components supporting order management systems.
  • Collaborated with developers, DevOps, QA, and product teams to troubleshoot, isolate, and resolve performance-related defects.
  • Analyzed client and server-side metrics, identified critical bottlenecks, and delivered actionable performance reports to technical and business stakeholders.
  • Environment: Cloud-based (AWS) enterprise applications, Gatling, JMeter, Grafana, Apache Tomcat, Java, LogicMonitor, Dynatrace, Splunk, JIRA, Oracle 12c, MongoDB

Performance Test Engineer

Department of Veterans Affairs
Washington, DC
02.2013 - 12.2017
  • Developed comprehensive Performance Test Plans and test scripts aligned with project requirements, covering key workflows and business-critical functions.
  • Set up and managed performance test environments, including LoadRunner installations on remote machines accessed via Citrix.
  • Designed and implemented workload distribution strategies and goal-oriented scenarios to simulate realistic user load across multiple applications.
  • Performed REST API testing to validate the reliability, responsiveness, and throughput of backend services under load.
  • Utilized IBM Rational Performance Tester (RPT) to script and execute performance/stability tests, incorporating parameterization and correlation to emulate real user scenarios.
  • Configured advanced runtime settings in LoadRunner VuGen and Performance Center (e.g., think time, pacing, browser emulation, and timeout settings) for accurate simulation.
  • Created and executed LoadRunner VUser scripts for performance and load testing of applications under varying conditions.
  • Conducted performance testing on SIT environments using HP Performance Center, ensuring early identification of scalability risks.
  • Analyzed key metrics such as transaction response time, hits per second, transaction rate, and throughput to assess application performance against SLAs.
  • Designed Performance Test Strategies and implemented test scenarios using LoadRunner Controller, including baseline, stress, and endurance tests.
  • Interpreted performance data using LoadRunner Analysis, including Throughput, Transactions per Second, and Response Time graphs, to identify trends and anomalies.
  • Generated and delivered detailed performance reports to stakeholders, highlighting bottlenecks and SLA breaches with clear recommendations for optimization.
  • Conducted web services performance testing using LoadRunner, ensuring reliability and scalability of SOAP and REST-based services.
  • Utilized CA Application Performance Management (Introscope/APM) to monitor server-side metrics, identify performance degradation, and correlate issues across tiers.
  • Supported production teams in performance troubleshooting, identifying root causes for issues related to CPU, memory, network latency, I/O, and database contention.
  • Leveraged HP ALM for test case management, defect tracking, and reporting, ensuring collaboration across QA, development, and business teams.
  • Used Google Analytics to gather and interpret production usage data for more targeted and realistic performance test scenarios.
  • Collaborated closely with Developers, Architects, Business Analysts, Project Managers, and Release Teams to ensure alignment with performance goals and deployment readiness.
  • Assisted with test environment configuration and validation, working hands-on with infrastructure teams to ensure accurate test conditions.
  • Environment: LoadRunner 12.55, Apache Tomcat, WebLogic, Java, Mainframes, CA Introscope, Google Analytics, AppDynamics, JIRA, Oracle, HP ALM

Test Engineer

Market Smith
Plano, TX
02.2011 - 01.2013
  • Collaborated with developers and QA teams to resolve defects and ensure high-quality deliverables.
  • Designed and executed performance and scalability test plans using LoadRunner and Performance Center in SIT environments.
  • Developed test strategies and goal-oriented scenarios in LoadRunner to simulate real-world user behavior and workload distribution.
  • Created and maintained LoadRunner VUser scripts for stress, load, and baseline testing of critical application flows.
  • Analyzed performance using LoadRunner graphs (Throughput, TPS, Rendezvous, etc.) to identify bottlenecks across CPU, I/O, network, and database tiers.
  • Conducted web technology testing with Apache JMeter, including scripting, test execution, and distributed testing.
  • Performed manual correlation in LoadRunner and dynamic data handling in JMeter to ensure script reliability.
  • Validated key performance metrics such as throughput, response time, memory usage, and resource utilization.
  • Created and executed automated scripts using JMeter and Badboy, enhancing test coverage and efficiency.
  • Prepared detailed Performance Test Reports, highlighting key metrics (e.g., CPU %, disk I/O, response time) and presenting findings to stakeholders.
  • Documented business scenario flows and ensured test coverage aligned with functional priorities.
  • Environment: Quality Center, LoadRunner 11.50, JMeter, WebSphere, Java, Mainframes, XML, HTML, MS-Office, SiteScope, Oracle

Performance Test Engineer

Baker Hill Corporation
Carmel, IN
09.2009 - 01.2011
  • Developed comprehensive Performance Test Plans to guide testing efforts and ensure alignment with project requirements.
  • Set up the performance test environment, installed LoadRunner on remote machines, and connected through Citrix for testing.
  • Correlated database VUser scripts to simulate real-world usage and ensure accurate performance testing.
  • Designed and executed performance test scenarios, creating LoadRunner scripts and handling correlation and parameterization using VuGen.
  • Executed performance test scenarios using LoadRunner Controller and analyzed results using LoadRunner Analyzer to assess system behavior under load.
  • Utilized LoadRunner to measure and analyze response times of critical business transactions under various load conditions.
  • Monitored infrastructure performance using SiteScope, Wily Introscope, and Microsoft SCOM to ensure system stability and availability.
  • Created automated load testing scripts with LoadRunner to evaluate the performance of web servers and identify potential bottlenecks.
  • Compiled and prepared executive-level performance test summary reports for stakeholders, highlighting key findings and recommendations.
  • Analyzed performance results using graphs, resource monitors, and LoadRunner’s analysis tools, identifying load patterns and areas for optimization.
  • Created and executed Batch test scripts for stress testing and validating system capacity.
  • Environment: LoadRunner 11.0, Apache, WebLogic, Java, Mainframes, CA Introscope, Google Analytics, AppDynamics, JIRA, Oracle

QA Tester

MetLife
New York, NY
07.2007 - 08.2009
  • Reviewed Business Requirement Documents (BRD) and Technical Specifications to ensure complete understanding of functional and non-functional requirements.
  • Created and executed LoadRunner scripts using VuGen, simulating real-world user interactions and designing scenarios based on business and system specifications.
  • Enhanced LoadRunner scripts with transactions, parameterization, and correlation to improve accuracy and test coverage.
  • Debugged and optimized LoadRunner scripts to ensure error-free execution and consistent performance results.
  • Conducted backend database validation by writing and executing complex SQL queries to ensure data integrity.
  • Generated bug reports and tracked issues using TestDirector, presenting clear and actionable testing results to the project team.
  • Developed and executed automated functional and regression test scripts using QTP, validating application workflows across multiple releases.
  • Performed GUI and data-driven testing in QTP, leveraging checkpoints (Standard, Table, Database) to validate front-end behavior and backend data consistency.
  • Designed scalable data-driven and keyword-driven automation frameworks to increase script reusability and maintainability.
  • Analyzed regression test results using QTP and addressed inconsistencies to maintain performance across builds.
  • Contributed to test planning, execution, and defect triage meetings, ensuring alignment with project timelines and quality standards.
  • Participated in data migration validation post-deployment, verifying successful data transfers and system configuration across environments.
  • Led test kick-off meetings, coordinated testing activities, and facilitated defect resolution sessions with cross-functional teams.
  • Identified and documented key tasks, risks, and assumptions related to the release cycle to mitigate potential issues.
  • Delivered daily, final, and weekly status reports, and conducted walkthroughs with the project and management teams to ensure transparency and progress tracking.
  • Environment: LoadRunner 9.0, TestDirector, QTP, Windows, WebSphere, UNIX, Oracle, Java, MS Office

Education

BSc

National University of Bangladesh

Skills

  • HP ALM and Performance Center
  • Load testing tools
  • Selenium and JMeter
  • IBM Rational Performance Tester
  • SQL profiling and monitoring
  • Application performance management
  • Google Analytics and Splunk
  • Datadog and AppDynamics
  • Gatling and OctoPerf
  • Operating systems: Solaris, UNIX, Windows
  • Database management: Oracle, SQL Server, DB2, MS Access

Clearance Status

VA Public Trust
