Sreekrishna Duddumpudi

Summary

Over a decade of IT experience specializing in Performance Engineering and Testing services. Successfully led the establishment of a Testing Center of Excellence (CoE) in the banking and retail domains. Provides thought leadership in Non-Functional Testing and excels at defining appropriate testing strategies for complex software systems. Effectively communicates QA status in daily Scrum meetings, highlighting risks and delays. Strong experience with performance testing tools and methodologies across various technology platforms, including monitoring Windows/Unix environments, analyzing test results, troubleshooting issues, and leading tuning activities for optimal performance outcomes.

Overview

13 years of professional experience

Work History

Performance Engineer

Southwest Airlines
07.2022 - 03.2025
  • Participated in sprint planning to understand the strategy for performance testing and system reliability testing.
  • Measured the current state of different systems through observability of the data they generate, such as logs, metrics, and traces, using AppDynamics and iTops for alerts and notifications, Grafana to monitor queue messages, and Splunk for message logs.
  • Created an automated regression suite for all service control panel (SCP) pipeline services in the Karate framework and conducted regression testing before and after the CI/CD implementation cycle.
  • Communicated effectively with all stakeholders throughout the project lifecycle and partnered on risk analysis and mitigation planning.
  • Designed and implemented performance test scripts using LoadRunner and JMeter for Web (HTTP/HTML) and Web Services.
  • Designed test scenarios for different load conditions (load, volume, stress, and endurance testing) and conducted load testing in BlazeMeter for JMeter scripts and Performance Center for LoadRunner scripts.
  • Performed test execution; monitored the JVM, threads, and top SQL queries using AppDynamics; and handled result analysis and reporting.
  • Provided recommendations to optimize client-side and server-side performance of the application.
  • Created the final performance impact analysis report in Confluence with detailed observations and recommendations for each release.
  • Acted as the single point of contact for performance testing activities.
  • Environment: LoadRunner 12.62, JMeter, Micro Focus Performance Center 12.62, Grafana, Swagger, Web Services, Web HTTP, AppDynamics, BlazeMeter

Senior Software Engineer

Wells Fargo Bank
04.2020 - 05.2022
  • Company Overview: Wells Fargo & Company is a nationwide, diversified financial services company with $1.6 trillion in assets. Founded in 1852, Wells Fargo provides banking, insurance, investments, mortgage, and consumer and commercial finance through more than 9,000 locations, more than 12,500 ATMs, online (wellsfargo.com), and mobile devices.
  • Developed and implemented the strategy for load, performance, scalability, and resiliency testing to ensure all applications met performance and NFR requirements for each release.
  • Designed the testing framework and monitoring strategy; handled test environment planning and readiness.
  • Performed performance problem analysis and system analysis; reviewed business requirement and functional/non-functional requirement documents.
  • Partnered with developers, business analysts, the operations team, and other stakeholders during the project lifecycle.
  • Communicated effectively with all stakeholders during the project lifecycle and partnered on risk analysis and mitigation planning.
  • Designed and implemented performance test scripts using LoadRunner for Web (HTTP/HTML), Web Services, TruClient, and Citrix protocols.
  • Designed test scenarios for different load conditions (load, volume, stress, and endurance testing).
  • Handled performance test execution, monitoring, result analysis, and reporting.
  • Provided recommendations to optimize client-side and server-side performance of the application.
  • Provided server configuration recommendations for the production environment to ensure continuity of business.
  • Created the final performance impact analysis report with detailed observations and recommendations for each release.
  • Acted as the single point of contact for performance testing activities.
  • Environment: LoadRunner 12.56, Micro Focus Performance Center 12.56, SQL, Java, .NET, SOAP UI, Web Services, Web HTTP, AppDynamics, and Pivotal Cloud Foundry (PCF)

Sr. Performance Lead Engineer

CGI – PNC Bank
07.2018 - 03.2020
  • Created project plans, test plans, and estimations; developed and tracked projects at every phase.
  • Responsible for defining performance goals and objectives based on client requirements and inputs.
  • Used VuGen to generate Vuser scripts for Web (HTTP/HTML), Web Services, and Winsock protocols.
  • Created performance test scripts for the Salesforce application using Web (HTTP/HTML) and API services.
  • Executed multi-user performance tests using online monitors and real-time output messages.
  • Analyzed, interpreted, and summarized relevant results in a complete Performance Test Report.
  • Developed and implemented load and stress tests with Performance Center and presented performance statistics to application teams.
  • Automated test scripts for regression testing.
  • Analyzed the results of script executions using Performance Center.
  • Developed load and stress testing scenarios using the Controller, creating 1,000 to 1,500 virtual users.
  • Analyzed server resources such as available bytes, process bytes, and heap usage to look for performance bottlenecks and memory leaks.
  • Responsible for monitoring and tuning the performance of all testing systems in the landscape.
  • Developed Vuser scripts and enhanced the basic scripts by adding custom code (illustrated in the sketch after this role).
  • Prepared data for parameterization of values in the scripts for multiple scenarios by querying Oracle data.
  • Introduced rendezvous points in the scripts to stress the application on specific transactions.
  • Wrote a comprehensive Performance Test Plan.
  • Environment: .NET application, Linux & WebSphere, JMeter, Dynatrace, Splunk for log analytics.
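
For illustration only, the following is a minimal VuGen-style C sketch of the scripting techniques described above (correlation, parameterization, and rendezvous points). The URL, boundaries, parameter names, and transaction names are hypothetical placeholders and are not drawn from any client application.

  Action()
  {
      /* Correlation: capture the dynamic session token returned by the login response.
         The left/right boundaries are placeholders that would come from a recorded response. */
      web_reg_save_param("SessionToken", "LB=sessionId=", "RB=\"", "Ord=1", LAST);

      lr_start_transaction("T01_Login");
      web_submit_data("login",
          "Action=https://example.com/login",   /* hypothetical URL */
          "Method=POST",
          ITEMDATA,
          /* {UserName} and {Password} are parameterized values fed from a data file */
          "Name=username", "Value={UserName}", ENDITEM,
          "Name=password", "Value={Password}", ENDITEM,
          LAST);
      lr_end_transaction("T01_Login", LR_AUTO);

      lr_think_time(5);

      /* Rendezvous point: Vusers pause here so the next transaction hits the server concurrently. */
      lr_rendezvous("submit_order");

      lr_start_transaction("T02_SubmitOrder");
      web_url("submitOrder",
          "URL=https://example.com/order?session={SessionToken}",   /* correlated value reused */
          "Resource=0",
          LAST);
      lr_end_transaction("T02_SubmitOrder", LR_AUTO);

      return 0;
  }

In a real script the boundaries for web_reg_save_param come from the recorded server response, and the parameter files, pacing, and rendezvous policy are configured in VuGen and the Controller scenario.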

Application System Engineer

Wells Fargo
03.2015 - 05.2018
  • Understood the Functional System Design documents and prepared the test plan for every release.
  • Worked on three applications in parallel involving MQ, ETL, SP, and web services.
  • Collected production data and prepared the volumetrics (workload model).
  • Validated existing test scripts and created new scripts whenever new requirements came in for the application in a release.
  • Designed test scenarios for different load conditions (load, volume, stress, and endurance testing).
  • Prepared test data using SQL queries for each release.
  • Worked on Unix servers and performed environment setup before execution kicked off.
  • Followed up with the environment team for code deployments and server configuration changes whenever required.
  • Monitored servers using Introscope, SiteScope, and JVisualVM, and by running Unix commands on the servers while tests were running.
  • Monitored performance counters such as Java heap memory usage, CPU utilization, and free physical memory using Wily Introscope.
  • Found memory issues in applications by analyzing GC logs, and analyzed application logs to evaluate application errors.
  • Took thread dumps whenever necessary and analyzed them to find blocked/stuck threads and the root cause of application issues.
  • Prepared the Performance Impact Analysis document and shared the analyzed results with stakeholders and other teams for every application in every release, comparing against previous release/test results.
  • Highlighted application performance issues observed during testing to the environment and development teams.
  • Environment: LoadRunner 12, 11.52, Linux, Quality Center/ALM 11.0, SQL, Java, .NET, SOAP UI, Web Services, Web HTTP, batch jobs, OEM, and Dynatrace

Programmer Analyst

Cognizant
08.2012 - 02.2015
  • Worked with business analysts to determine business requirements and set standards for performance evaluation in an Agile methodology.
  • Responsible for performance testing using Performance Center.
  • Developed load test scripts using Performance Center for the entire site and performed parameterization, pacing, and correlation.
  • Responsible for configuring runtime settings in Performance Center.
  • Involved in QA processes and methodologies, including participation in requirements sessions, use case reviews, design meetings, QA task planning, work effort estimation, and risk mitigation.
  • Correlated dynamically created session data in the Vuser scripts in VuGen to synchronize with the application.
  • Performed load and stress testing using LoadRunner.
  • Responsible for performance testing a Java application.
  • Involved in designing load, stress, and failover testing scenarios based on SLAs for various systems and future load projections.
  • Well versed in using LoadRunner's Web/HTTP and Web Services protocols.
  • Coordinated with the legacy team to schedule batch jobs.
  • Responsible for scheduling monitoring scenarios and analyzing results to identify performance bottlenecks.
  • Used the Web Services protocol to transfer SOAP messages between environments.
  • Involved in executing scenarios and monitoring server response times, throughput, hits/sec, and transactions/sec.
  • Identified performance issues related to load balancer configuration and settings, memory leaks, deadlock conditions, database connectivity, and hardware profiling.
  • Acted as coordinator for performance testing activities with the client and the offshore team to provide maximum testing support.
  • Analyzed online monitor graphs such as runtime, transaction, web resource, and system resource graphs.
  • Cooperated with the development group to resolve difficulties encountered during test execution.
  • Participated in audit meetings and took initiative to reach QA testing goals; prepared and presented statuses (daily, weekly, monthly, post-release, etc.).
  • Environment: LoadRunner 11, 11.52, Linux, Quality Center/ALM 11.0, SQL, Java, .NET, SOAP UI, Web Services, Web HTTP, and Dynatrace

Performance Engineer

MetLife Insurance Company
12.2012 - 05.2013
  • Defined performance goals and objectives based on client requirements and inputs.
  • Used the NeoLoad tool to create performance test scripts for Web (HTTP/HTML) and Web Services.
  • Worked extensively with the Web and Web Services protocols in NeoLoad.
  • Ensured compatibility of all application platform components, configurations, and upgrade levels with production, and made necessary changes to the lab environment to match production.
  • Responsible for developing and executing performance and volume tests.
  • Developed test scenarios to properly load and stress the system in the lab environment and to monitor and debug performance and stability problems.
  • Partnered with the software development organization to analyze system components and performance and identify needed changes in the application design.
  • Involved in analyzing, interpreting, and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
  • Met with developers to generate test data.
  • Wrote test plans, test scripts, and scenarios for functional, load, stress, performance, regression, peak, and endurance testing.
  • Good understanding of components such as Controllers, load generator machines, Diagnostics, etc.
  • Performed statistical analysis of test results to report on the quality of the software.
  • Evaluated test results and maintained an issue tracking database to provide feedback to the development team.
  • Strategized, planned, scheduled, and coordinated testing to meet delivery dates and commitments.
  • Environment: JMeter, LoadRunner, SQL, Agile, SOAP UI, XML, C, UNIX, MS Visio, Windows XP.

Performance Test Analyst

ACE Insurance
08.2012 - 12.2012
  • Designed and developed the high-level and detailed test plans using standards.
  • Extensively involved in documenting the complete testing process based on the functional requirements.
  • Developed test scripts using LoadRunner by recording test cases and enhanced scripts by adding checkpoints, parameterization, and correlation.
  • Responsible for maintaining automation scripts and library functions.
  • Used SQL queries and generated reports; ensured all test plans, test cases, and documentation were traceable to original requirements.
  • Worked closely with software developers and engineers to fix defects.
  • Sent daily status reports to all team members to track testing updates.
  • Environment: LoadRunner 9.52, Quality Center 9.2, Oracle 10g, SQL, Agile, SOAP UI, XML, C, UNIX, MS Visio, Windows XP

Education

Master of Technology - Computer Science and Engineering

Vellore Institute of Technology
05.2012

Bachelor of Technology - Computer Science and Engineering

Anna University
05.2010

Skills

  • Web/App Servers: IIS, WebLogic, WebSphere, Apache Server
  • Operating Systems: Oracle Linux & Windows Servers
  • Test Automation Tools: Micro Focus Performance Center, LoadRunner, NeoLoad, JMeter
  • Server Monitoring Tools: AppDynamics, Perfmon, Splunk, SiteScope, New Relic, Wily Introscope, Thread Dump & Heap Dump Analysis, Linux Commands
  • Network Filters: HTTPWatch, Fiddler, Wireshark
  • Databases: SQL & DB2
  • Issue/Bug Trackers: Jira & ALM
  • Cloud Environments: Pivotal Cloud Foundry (PCF), AWS

Key Roles & Responsibilities / Strengths

  • Application Performance Testing: Interact with multiple stakeholders to understand client requirements and the production environment (existing or proposed) and define a suitable strategy for testing application performance. Understand the client's business through interactions with business analysts and project management teams, and identify the business-case scenarios for scoping. Skilled in various automation and monitoring tools (SiteScope, Wily Introscope, AppDynamics, Dynatrace, etc.), with good experience in newer technologies such as open dashboards and AWS CloudWatch monitoring in cloud environments. Automate business scenarios into test scripts using performance automation tools such as LoadRunner and JMeter; well versed in LoadRunner's Web/HTTP, Web Services, TruClient, and Citrix protocols. Execute multiple test types (load, stress, soak, scalability, failover, chaos), monitor the tests, analyze the results, and provide reports with recommendations (if any) for better performance. Experience with web analysis/debugging tools such as Fiddler, HTTPWatch, and Dynatrace.
  • Application Performance Management: Set up monitoring tools (AppDynamics, Splunk) and configure alerts, emails, and snapshots for above-threshold utilization, critical errors, and user-defined system anomalies. Monitor application health and performance and report discrepancies to the concerned stakeholders. Design workload models for smoke, load, volume, and endurance tests using Performance Center, ALM, Controller, and JMeter, and analyze client-side and server-side performance results. Extend support to the project team with root cause analysis of observed issues. Analyze application behavior and project future hardware needs.
  • Analytics: Analyze performance test results, including client-side metrics (user count, server hits, response times, throughput, transactional information) and server-side metrics (CPU, memory, network, disk utilization). Analyze thread dump and heap dump logs from application servers. Provide detailed non-functional analysis reports that help fix memory leaks, time-consuming methods, web service calls, network attributes, CPU utilization, and other performance aspects. Perform root cause analysis. Extract data from application servers and prepare it by mining raw server logs (profiling, message, catalina, access logs) into usable data, then analyze the information. The analysis report covers: application behavior (user load sustenance, response times, request processing capacity); system behavior (CPU, memory, thread, and JVM utilization, DB response times); issues (memory leaks, high CPU utilization, thread utilization, heap utilization, high response times); and errors (HTTP errors, data errors, system errors).

Timeline

Performance Engineer

Southwest Airlines
07.2022 - 03.2025

Senior Software Engineer

Wells Fargo Bank
04.2020 - 05.2022

Sr. Performance Lead Engineer

CGI – PNC Bank
07.2018 - 03.2020

Application System Engineer

Wells Fargo
03.2015 - 05.2018

Performance Engineer

MetLife Insurance Company
12.2012 - 05.2013

Programmer Analyst

Cognizant
08.2012 - 02.2015

Performance Test Analyst

ACE Insurance
08.2012 - 12.2012

Bachelor of Technology - Computer Science and Engineering

Anna University

Master of Technology - Computer Science and Engineering

Vellore Institute of Technology