
Ruchirkumar Trivedi

Edison, NJ

Overview

14 years of professional experience
1 certification

Work History

QA Lead and Engineer/Automation Tester

Eversource Energy
01.2019 - 12.2023
  • Define and implement a comprehensive test strategy that incorporates both Automation and Manual Testing methodologies suitable for the organization's software applications and infrastructure
  • Plan testing phases considering the unique requirements of the energy supply industry
  • Design, develop, and maintain an Automation Framework using tools like Java and Selenium
  • Ensure the framework supports the scalability and modularity required for the various systems in the organization
  • Conduct regular reviews of the automation scripts and framework components developed by the team to ensure quality and adherence to standards
  • Set up and configure the necessary tools and environments for the selected framework
  • For the Cucumber framework, integrate it with tools like Selenium and set up a BDD environment
  • Design a hybrid framework combining keyword-driven and data-driven scripts; in the Cucumber framework, write Gherkin-syntax feature files and step definitions
  • Hands-on experience generating parameterized and dynamic test scripts in TestNG and JUnit
  • Develop and write test cases using JUnit framework to validate each unit of the software performs as designed
  • Implement test cases using JUnit annotations and assertions to validate expected outcomes
  • Create and develop JMeter test scripts to simulate various user behaviors and load scenarios
  • Generate and share automated test reports, highlighting the results, defects, and areas of concern
  • Integrate tests into CI/CD pipelines (like Jenkins or GitLab CI) ensuring tests run automatically with code commits or deployments
  • Work closely with stakeholders to understand big data requirements, including data sources, processing logic, and expected outcomes
  • Develop a comprehensive test plan outlining the scope, objectives, resources, schedule, and testing approach for big data testing
  • Collaborate with stakeholders to define UAT objectives, scope, and criteria
  • Develop a UAT test plan outlining the testing strategy, resources, schedule, and exit criteria
  • Identify and involve key end-users (UAT Testers) who will participate in the testing process
  • Set up a dedicated UAT environment that mirrors the production environment
  • Sign-off on the UAT phase, signaling that the system is ready for production deployment
  • Document the results of UAT, including test cases, test execution logs, and any feedback received
  • Create a UAT summary report to provide an overview of the testing process and outcomes
  • Validate the process of acquiring data from various sources and loading it into the big data ecosystem
  • Validate different configurations of Maximo to ensure it adapts to various environments
  • Test basic functionalities such as asset management, work order management, and inventory management
  • Test Maximo's integration with other systems (e.g., GIS systems)
  • Conduct load testing to assess how Maximo performs under heavy user loads
  • Test existing functionalities in Maximo after updates, patches, or upgrades to ensure that new changes do not negatively impact existing features
  • Validate data exchange between Maximo and external systems
  • Validate functionality of new enhancements of Maximo fields and work orders
  • Verify the accuracy and completeness of geographic data
  • Ensure that spatial data layers align correctly with real-world features
  • Measure the time it takes to render maps, execute spatial queries, and perform data updates
  • Test the accuracy of geocoding (converting addresses to geographic coordinates) and reverse geocoding (finding addresses from coordinates)
  • Test map navigation, zooming, panning, and other user interactions
  • Oversee ETL testing in the Informatica tool to ensure that data extraction, transformation, and loading processes are error-free and data integrity is preserved
  • Work with Business Analysts, Data Architects, and ETL developers (Informatica developers) to understand data models, transformation rules, and business requirements
  • Regularly review test cases, SQL scripts, and results
  • Conduct Database Testing using SQL queries to validate data consistency, integrity, and accuracy in the database systems used by the organization
  • Validate data integrity, especially when Power Apps interacts with external data sources like SharePoint, SQL databases, or Dataverse
  • Utilize Power Platform's analytics to assess app performance and usage
  • Integrate various testing tools with other tools in the DevOps pipeline, ensuring seamless data flow and reporting
  • Test microservices in isolation and in collaboration with other services
  • Ensure data migrated from an older system to a new big data system is done correctly, without any data loss or corruption
  • Ensure data processing and storage doesn't fail in case of a node or component failure
  • Write SQL queries to validate extracted, transformed, and loaded data
  • This includes data checks between source and target systems, transformation-logic validation, aggregation and business-rule validation, and null and data-type checks
  • Identify, log, and report data discrepancies or issues with the ETL process
  • Create automation scripts using Tosca's script-less capabilities, ensuring they are modular and maintainable
  • Review and understand requirements logged in ALM to ensure test cases cover every requirement
  • Assign modules or components of the application under test (AUT) in Tosca to various team members
  • Generate comprehensive reports indicating data validation results, discrepancies, and performance metrics
  • Create and maintain ServiceNow ATF scripts across environments, rerunning them each year for ServiceNow upgrades
  • Create and maintain manual test scripts for all projects in Excel data sheets
  • Validate data when moving from on-premises databases to Azure data services using tools like Azure Database Migration Service (DMS)
  • Test the integration points of various data services; for example, ensure Azure Data Factory pipelines are pulling and pushing data correctly between services
  • Test the integration between Databricks and other systems, tools, or services that the pipeline interacts with (e.g., databases, cloud services, messaging systems)
  • Implement regression testing to ensure that changes or updates to the data pipeline do not introduce new issues or break existing functionality
  • Ensure star schema, snowflake schema, or other data warehouse architectures are correctly implemented
  • Lead API Testing initiatives to validate the functionality, reliability, performance, and security of APIs used in the organization's software systems
  • Familiarize oneself with the API's endpoints, request methods (GET, POST, PUT, DELETE, etc.), request parameters, and expected responses
  • Use tools like JMeter or LoadRunner to check the API's performance under load
  • Test load balancing configurations to ensure that incoming traffic is properly distributed across servers
  • Execute performance tests using JMeter, monitoring system behavior under load
  • Design and execute Performance Testing strategies to ensure that the software applications and systems can handle the expected load, especially during peak energy distribution times
  • Even with a strong emphasis on automation, ensure that critical areas are validated through manual testing, especially scenarios that are complex or less frequent but have a high business impact
  • Mentor and train the QA team on industry best practices, ensuring proficiency in both manual and automated testing tools and techniques
  • Encourage continuous learning, especially in areas like Java, Selenium, and other tools that can enhance the automation capabilities of the team
  • Collaborate with developers, system architects, and business stakeholders to understand system functionalities, updates, and potential challenges
  • Generate clear and comprehensive reports on test progress, defects, and overall software quality
  • Maintain detailed documentation of test plans, test cases, and test results
  • Proactively identify potential risks in the software or systems, especially considering the critical nature of energy supply
  • Propose and implement mitigation strategies for identified risks.
  • Enhanced software quality by developing comprehensive test strategies and plans.
  • Streamlined QA processes for improved efficiency, reducing project completion time.
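The parameterized TestNG/JUnit scripts mentioned above follow a common data-driven pattern: a table of inputs and expected results fed to one test method. A framework-free sketch of that pattern in plain Java, where the `isValidReading` check and its data rows are illustrative placeholders, not taken from the actual projects:

```java
import java.util.List;

public class DataDrivenSketch {
    // System under test (illustrative): validate that a meter reading is in range.
    static boolean isValidReading(int kwh) {
        return kwh >= 0 && kwh <= 100000;
    }

    // Each row plays the role of one @DataProvider / @ParameterizedTest case.
    record Case(int input, boolean expected) {}

    // Run every case against the check; a real framework would report each failing row.
    static int runAll(List<Case> cases) {
        int failures = 0;
        for (Case c : cases) {
            if (isValidReading(c.input()) != c.expected()) {
                failures++;
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        List<Case> cases = List.of(
            new Case(0, true), new Case(99999, true),
            new Case(-1, false), new Case(100001, false));
        System.out.println("failures: " + runAll(cases));  // prints "failures: 0"
    }
}
```

TestNG's `@DataProvider` and JUnit 5's `@ParameterizedTest` supply the rows and reporting automatically; the loop above is the pattern they encapsulate.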

Sr. Automation/Manual Test Engineer

CVS
03.2017 - 12.2018
  • Developed a web-based data mining tool for worldwide factory use to query NFS test data archives and report failure statistics, enabling the Quality Engineering team to detect and analyze failure trends
  • Performed peer code reviews and provided debug support to product test code developers to ensure bug-free software releases to the manufacturing sites
  • Provided technical support and troubleshooting assistance to worldwide regional Test Engineers regarding test code implementation and the setup and configuration of factory test infrastructure
  • Tracked test code enhancements and defect submissions in Microsoft Project and the defect tracking system, and followed all issues through to resolution
  • Demonstrated expertise in planning, designing, and executing comprehensive test strategies following the Waterfall model, ensuring the thorough validation of software requirements and functionalities
  • Proficient in sprint planning, backlog refinement, and executing test cases within Agile Scrum frameworks, emphasizing rapid feedback and adaptation to meet evolving project needs
  • Demonstrated strong willingness to learn and help others
  • Tracked, reviewed, and analyzed defects using Jira
  • Coordinated with developers to resolve testing defects
  • Worked on Selenium Web driver to write automation scripts for functional and regression testing
  • Developed effective, comprehensive automated tests that encompass pre-conditions, data setup, assumptions, and document test steps in a fast-paced agile environment
  • Handled data ingestion, transformation, and processing tasks using Azure Data Lake services
  • Ensured data security and compliance, monitored data lake performance, and worked with data engineering and data science teams to enable effective data-driven decision-making
  • Estimated QA effort for functional and regression testing and assisted in identifying testing environment needs
  • Collaborated with product owners, developers, architects and UX to ensure product enhancements are delivered to Moody quality standards
  • Managed the laboratory computer database tracking system to monitor the testing process from beginning to end of production
  • Tested stored procedure functionality against ETL mappings after the migration of code from PL/SQL to ETL mappings
  • Worked on data validation and testing of SQL and DB2 databases
  • Responsible for applying the Corporation’s Software Configuration Management processes to projects, setting up and maintaining TFS/GIT/GitHub infrastructure and supporting a continuous delivery model by automating software build and package migration processes
  • Integrated the test suites to Jenkins to execute them automatically after every successful deployment
  • Tested all data extracted from the Teradata database as it was transformed and loaded into the target database
  • Organized and delivered training to newcomers on technologies such as SQL and PL/SQL and on team processes
  • Performed smoke testing on migration requests moving changes from development to QA/UAT/production environments
  • Created ETL test data for all the ETL mapping rules
  • Communicated discrepancies determined in testing to impacted areas and monitored resolution.
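Bullets such as the Teradata extract validation above boil down to source-to-target reconciliation. A minimal sketch of that comparison in plain Java, assuming rows keyed by an id column; the ids and status values are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class EtlReconciliationSketch {
    // Compare a source extract against the loaded target, keyed by row id.
    // Returns the ids that are missing from the target or whose values differ.
    static List<String> reconcile(Map<String, String> source, Map<String, String> target) {
        List<String> discrepancies = new ArrayList<>();
        for (Map.Entry<String, String> row : source.entrySet()) {
            String loaded = target.get(row.getKey());
            if (!row.getValue().equals(loaded)) {
                discrepancies.add(row.getKey());  // missing row or value mismatch after transform
            }
        }
        return discrepancies;
    }

    public static void main(String[] args) {
        Map<String, String> source = Map.of("1001", "ACTIVE", "1002", "CLOSED");
        Map<String, String> target = Map.of("1001", "ACTIVE");
        System.out.println(reconcile(source, target));  // prints [1002]: that row was dropped in the load
    }
}
```

In practice the two maps would be populated by SQL queries against the source and target schemas (e.g. a `SELECT id, status` on each side); the comparison logic is the same.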

Quality Assurance Analyst/ETL Tester

EBSCO Information Services
03.2012 - 12.2016
  • Worked as a Quality Assurance (Automation) Engineer on a financial application project that followed the agile model
  • Create test plan, test cases, test scripts to support specific software testing objectives, and assist team in creation, review, and finalization of agile stories and story acceptance criteria
  • Involved in regression test planning, work assignment, setting goals / objectives for the release, work trackers, and involved in regular project level status meetings
  • Prepared Integration Test Case Design Flow for the project
  • Designed the Batch Testing Plan for the project (to validate host and flex job testing)
  • Identified and added critical scenarios which were not covered in the Regression Inventory using GAP analysis
  • Attended daily defect status meetings and weekly team status meetings for reporting testing status and shared ideas for improved testing efforts
  • Performed analysis, assessment, task distribution, scheduling, reporting, and meeting coordination using MS Project, MS PowerPoint, MS Word, and MS Excel
  • Attended weekly status meetings with development and management Teams
  • Participated in the business workshops to understand the functional requirements and to document the assumptions and gaps
  • Reviewed the business requirements with the business teams and worked with IT team on infrastructure setup activities
  • Prepared the technical specification documents, unit test reports
  • Actively involved in all phases of the project, including the SIT and UAT phases
  • Worked on data validation by running the various Autosys jobs
  • Executed the SQL queries to verify the data from the database tables
  • Created ETL packages and implemented new data extracts, transformations, and load routines
  • Worked closely with the development team and Business to develop the test cases and acceptance criteria
  • Worked with the Business teams to review the test cases
  • Used HP ALM tool to create and monitor the defects
  • Worked on testing the new versions of the code and software releases
  • Actively involved in SIT, UAT and Performance Testing
  • Documented the test results and reviewed them with the Business teams
  • Followed the QA Best practices and encouraged the testing standardization across all the objects in the project
  • Recorded test cases, Master test plan in HP ALM and generated defect reports, charts for statistical analysis
  • Worked with the offshore development team for knowledge transition and offered support to resolve the defects efficiently
  • Coordinated identification of the test data needs and test data preparation with the business for testing various ETL load jobs
  • Conducted defect triage meetings to prioritize and resolve the issues with stakeholders
  • Ensured fixed defects were retested to prevent recurrence
  • Worked well with the developers and business users to provide accurate estimates and completed the task on-time
  • Monitored workflows and sessions using the PowerCenter Workflow Monitor.
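The SQL verification work described above often reduces to aggregate reconciliation, e.g. comparing `SELECT SUM(amount)` on the source and target tables. The same rule can be sketched in plain Java; holding amounts in cents is an illustrative choice to avoid floating-point rounding:

```java
import java.util.List;

public class AggregateCheckSketch {
    // After an ETL load, a common SQL check is SUM(amount) on source vs target.
    // The same business rule in plain Java: totals must match to the cent.
    static boolean totalsMatch(List<Long> sourceCents, List<Long> targetCents) {
        long src = sourceCents.stream().mapToLong(Long::longValue).sum();
        long tgt = targetCents.stream().mapToLong(Long::longValue).sum();
        return src == tgt;
    }

    public static void main(String[] args) {
        // Two source rows aggregated into one target row: totals agree.
        System.out.println(totalsMatch(List.of(1050L, 2500L), List.of(3550L)));  // prints true
        // A 50-cent discrepancy is flagged.
        System.out.println(totalsMatch(List.of(1050L, 2500L), List.of(3500L)));  // prints false
    }
}
```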

Data Analyst(Market Mapping)

Tata Consultancy Services
11.2009 - 04.2011
  • Interpret data, analyze results using statistical techniques and provide ongoing reports
  • Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
  • Acquire data from primary or secondary data sources and maintain databases/data systems
  • Identify, analyze, and interpret trends or patterns in complex data sets
  • Filter and clean data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
  • Work with management to prioritize business and information needs
  • Locate and define new process improvement opportunities.

Education

Master’s in Technology Management

Herzing University
Atlanta, Georgia
01.2018

Bachelor’s in Commerce

Maharaja Sayaji Rao University
Gujarat, India
01.2008

Skills

  • ETL Database Testing/Informatica
  • SQL
  • Power BI and Reporting
  • Selenium with core Java
  • TestNG Unit Testing
  • Cucumber BDD Framework
  • Selenium Hybrid Framework/Data Driven
  • TOSCA/QTest Automation Test case development
  • Azure database/Cloud Platform exposure
  • ServiceNow(ATF)
  • JIRA
  • HPQC
  • GitHub
  • Waterfall & Agile Projects exposure
  • Performance Testing
  • API Testing with Rest assured
  • Requirements Analysis
  • Regression Testing
  • Test Case Design
  • Test Automation Expertise
  • Team Leadership
  • Risk Assessment
  • Project Management
  • Defect Management
  • Test Strategy Development
  • Cross-browser Testing

Certification

  • Automation Specialist Level 1
  • QTest Level-1
  • SQL for database science
  • ServiceNow- Automated Test Framework
  • Introduction to Generative AI

Work Experience Summary

Dedicated Test Lead and QA Engineer with 14 years of hands-on experience in manual and automated testing. Expert in creating, leading, and optimizing entire testing lifecycles while ensuring that products meet the highest standards of quality, performance, and dependability. Experienced in creating and executing strategic test plans, writing test scripts on the fly, leading agile testing teams, and working closely with development teams to reduce risks early in the software development lifecycle. Competent in using the newest test automation tools and technologies along with a systematic manual testing approach to validate complex software ecosystems. Skilled at establishing an environment of continuous improvement, implementing QA best practices, and providing thought leadership. Demonstrated capacity to close the communication gap between technical and non-technical stakeholders, ensuring clear communication and aligned project objectives. Communicate test progress, test results, and other relevant information to project stakeholders and management. Results-focused, customer-oriented QA Lead/Engineer with professional expertise in designing test strategies, test planning, estimation, leading testing efforts for large and complex projects, and report generation. Strong understanding of QA principles, QA processes, use cases, and the software development life cycle: Agile, Waterfall, and V-Model. Proven ability to lead complex testing projects from initial conceptualization through implementation in a global delivery model, with testing teams based both onshore and offshore. Extensive working experience in functional testing, black-box testing, smoke testing, system testing, and regression testing. Strong experience working with Oracle, SQL Server, and MS Access. Experienced in designing automation frameworks, whether Data-Driven, Keyword-Driven, Hybrid, or Page Object Model (POM), in a Selenium Java setup.
Periodically review the automation scripts developed by team members to ensure they adhere to coding standards and best practices. Plan and schedule test execution, especially when using tools like Selenium Grid to run tests in parallel across different environments. Hands-on in preparing automation test scripts using TestNG and JUnit. Create comprehensive ETL test cases covering data migration, data integrity, data transformation, and reconciliation processes by leveraging Tosca capabilities. Schedule AutoSys jobs and monitor ETL test execution. Experienced in generating Power BI reports. Implement monitoring and logging mechanisms within Talend jobs to capture performance metrics and diagnose issues in production. Ensure that data sourced from various systems is loaded correctly into the big data system. Validate the processing logic (transformations, business logic, etc.) and ensure data is loaded into the target system correctly. Strong technical skills with Microsoft SQL Server and SSIS. Developed test plans based on the test strategy for a data warehousing application. Created and executed test cases based on the test strategy and on test plans derived from the ETL mapping document. Wrote complex SQL queries against different databases for the data verification process. Prepared technical specifications and source-to-target mappings. Extensively used SQL programming in back-end and front-end functions, procedures, and packages to implement business rules and security. Experience in data-driven testing using Jenkins and Excel. Proficient in the functional testing tool Quick Test Professional (QTP/UFT) and the various frameworks in QTP/UFT. Developed Hybrid and Cucumber frameworks for executing Selenium test scripts. Experience with the continuous integration tool Jenkins, the build tool Maven, and version control / source code management with GitHub for Selenium scripts, with exposure to DevOps.
Good knowledge of Silk Performer, LoadRunner, JMeter, and other load generation tools. Created and verified web service API requests and SOAP protocols. Experience in manual testing, with working experience in functionality (web application), GUI, API, regression, system, integration, ad-hoc, sanity, smoke, and end-to-end testing support. Good knowledge of mobile testing.
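The web service API testing mentioned above would typically use Rest Assured's fluent given()/when()/then() DSL; a dependency-free sketch using the JDK's built-in HttpClient shows the shape of such a check. The endpoint, path, and header here are illustrative placeholders, and the request is built but not sent:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ApiRequestSketch {
    // Build a GET request for an account resource. A real test would send it
    // with HttpClient.send(...) and assert on the response status and body.
    static HttpRequest buildGetAccount(String baseUrl, String accountId) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/accounts/" + accountId))
                .header("Accept", "application/json")
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildGetAccount("https://api.example.com", "42");
        // Verify the request is shaped correctly before any network call is made.
        System.out.println(req.method() + " " + req.uri());  // prints "GET https://api.example.com/accounts/42"
    }
}
```

Separating request construction from sending keeps the URL and header logic unit-testable without a live service, which is the same idea behind stubbed or mocked API tests.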
