
Ruthvij Polasa

Atlanta, USA

Summary

  • Over twenty-two years of experience in design, development, quality assurance, production operations, and analysis of software applications
  • Managed a team of DevOps engineers responsible for deploying and managing solutions across various deployment scenarios, both on-premises and in the Azure cloud
  • Collaborated closely with infrastructure, application, product, and service architects to ensure the viability of solutions for multiple deployment environments
  • Established and maintained build and release pipelines using Azure DevOps Release Management, PowerShell, and bash scripts
  • Administered the Azure DevOps organization to maintain security and reliability for all personnel
  • Managed high-availability systems in remote datacenters using Windows Server, IIS, SQL Server, and Oracle databases
  • Collaborated with IT and development teams to ensure site reliability, performance, and continuous delivery of releases and updates
  • Strong knowledge of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC)
  • Experience creating manual and automated scripts for functional, regression, integration, and system testing using Test Script Language (TSL)
  • Experience leading the Quality Assurance process: test environment setup, test plan sign-off, pre-beta testing, establishing issue tracking and resolution, UAT, deficiency reports, test result summaries, and final sign-off
  • Expertise in reviewing Business Functional Specifications, Functional Design Specifications, and Detailed Design Specifications
  • Interacted with developers to discuss technical problems and report bugs
  • Involved in review meetings with project managers, developers, and business associates for project planning, coordination, and implementation of various QA methodologies
  • Able to direct multiple projects effectively in fast-paced, high-pressure settings
  • A skilled problem-solver and efficient team player; collaborative, self-starting, with excellent oral and written skills and a "try to break the system" aptitude
  • Extensive experience building and executing test plans and test cases according to business, functional, and user requirement specifications
  • Experience testing client/server and web-based applications using automated testing tools
  • Experienced in defining testing methodologies, designing test plans and test cases, verifying and validating web-based e-commerce applications, and documenting per standards for effective QA implementation in all phases of the SDLC
  • Experience using Quality Center 9.2 and IBM Jazz for storing test plans, test cases, and test procedures, and as a bug tracker
  • Coordinated Integrated and Non-Integrated releases (Agile)
  • Excellent scripting skills in TSL, shell, Perl, VBScript, and JavaScript
  • Expertise in SQL, PL/SQL, and Oracle 8i/9i; experience with relational databases such as Oracle and SQL Server
  • Experience working with distributed (offshore) teams
  • Involved in reviewing the functional and non-functional specifications to be included in each project/release

Overview

22 years of professional experience

Work History

Production Operations - Software Development Team Lead/Manager

Amdocs
02.2011 - Current
  • Openet (acquired by Amdocs) is a world-leading provider of scalable processing and transaction management solutions for the telecom industry
  • As a solution provider to AT&T, responsibilities included:
  • Managed team of DevOps Engineers as they worked to deploy and manage the AT&T solution on-premises or in the cloud
  • Prepared and maintained the PRR (Production Readiness Review) tracker to ensure all deliverables were in place before production deployment (Go/No-Go)
  • Ensured all CRs went through a thorough approval process and complied with SOX requirements
  • Worked closely with infrastructure, application, product and service architects on making the solution viable for multiple deployment scenarios
  • Established and maintained build and release pipelines and processes using Azure DevOps Release Management, PowerShell and bash
  • Guided a development team switching from a Windows/IIS environment to a new cloud-targeted application
  • Worked with the team to define the architecture and provided the tools and procedures needed to run the application in test and then production environments: writing Dockerfiles and Helm charts, setting up application and environment configuration and CI/CD pipelines, and finally running and monitoring the application in production
  • Used Docker and Kubernetes to manage, scale and update containers for improved automation possibilities
  • Provided database administration including replication, log shipping, performance tuning, backup and maintenance using database engines such as SQL Server, PostgreSQL
  • Maintained log aggregation and monitoring tools like Microsoft Azure Application Insights, SolarWinds AppOptics, Pingdom
  • Created and executed automation processes, enhancing application scalability and functionality
  • Developed automation scripts using PowerShell, Python and bash
  • Maintained systems and procedures documentation
  • Built effective pipelines and enhanced production infrastructure through targeted database and cluster management
  • Quickly responded to and resolved debugging issues within production systems, minimizing error and risk
  • Documented and supported standards and procedures per organizational and client requirements
  • Openet Policy Controller (OPC) is Openet’s implementation of 5G Core Network (5GCN) Policy Control Function (PCF) & Binding Support Function (BSF) Network Function
  • OPC solution provides 5G PCF, 5G PCF + Policy and Charging Rules Function (PCRF) Service Based Interface (SBI), and 4G Cloud Native PCRF Diameter capabilities
  • OPC is compliant with 3GPP Release 15 Policy and Charging Control (PCC) PCF
  • The Openet Binding Support Function (BSF) NF enables the 5G Core Network architecture to support the binding of Application Function (AF) or Network Exposure Function (NEF) requests to a specific PCF NF instance that maintains Policy and Charging Control (PCC) over the related UE's PDU Session
  • This is especially important in large deployments where multiple and separately addressable PCF instances are deployed
  • Mediation solution to manage MVHS, DATA and Voice services
  • Fusion Works-Balance Manager to manage account balances in real-time, organize subscriber hierarchies and enforce spending limits
  • Worked on various BM enhancement projects and Change Request testing/Certification which includes SBP, Postpaid and MRC customer types
  • Analyzed Business requirements, HLD and ADD to develop Test Plans, Test Cases and Test Scripts for the Functional and Regression testing
  • Prepare Test Package and RTM on Monthly releases for Internal and External review calls with Development and Project Management Team
  • Extensively used Quality Center to maintain Test cases and for tracking Defects
  • Documented POT (proof of testing) on each test execution
  • Automated test scripts using UNIX commands and shell scripts
  • Extensively used VNC/PuTTY to access UNIX boxes for starting servers and executing ATF use cases
  • Verified and validated test results by checking logs
  • Verified the integrity of the database using SQL queries through Oracle SQL Developer
  • Worked on migration projects, such as migrating PreLTE subscribers from GCDR-monitored to Sy-monitored interface (PCRF)
  • Prepared mock test data according to the scenarios to be validated
  • Used WebTrax to create and track work requests
  • Coordinated with the front-end application team to test happy-path scenarios and validate integrity between applications
  • Worked effectively with the business team on SRS documentation updates and corrections for various projects, e.g., the EOD Prime project (2013)
  • Understood ADD/business requirements and helped design use cases/test cases
  • Designed and executed Test cases using Openet’s proprietary testing tools (WMED/ATF)
  • Thorough understanding of applications such as Balance Manager, OEP, and Mediation (MVHS, DATA and Voice)
  • SME for Balance Manager Application
  • Analyzed requirements in detail and raised flags on any contradictions in the business rules (ADD vs. HLD)
  • Worked on device testing (Birdy project)
  • Worked as part of DVT's (BM, OEP)
  • Worked effectively on Mediation Migration (U2L) efforts
  • Worked effectively on Pre parallel testing, Comparison testing and load testing on U2L migration project
  • Worked effectively on enhancement Mediation projects (MVHS, DATA and VOICE) and Willows
  • SME on Willows and Mediation (MVHS/DATA) tracks
  • Minimized complexity and actively sought ways to simplify, e.g., by using automated testing tools to execute and maintain test artifacts and by deploying new workspaces/builds
  • Working as an OPC service delivery (SD) team member, coordinating with the Product Team on OPC releases
  • The Product Team provides fixes in the release location (Confluence pages) as Helm charts, images, or config changes
  • The SD team takes the specific config file, image, or Helm chart
  • The SD team checks new Helm charts into the central repo; images are pulled via proxy
  • The SD team also creates a new release branch in AT&T Git (Code Cloud) with the release-specific changes
  • Involved in IoT testing
  • Developed karate scripts for functional and Regression Test suites
  • Involved in OPC code drops/installations using MOPs in AT&T Labs
  • Performed smoke tests using Jenkins upon successful installation in AT&T Labs (a representative sketch of such a check appears after this list)
  • Supported the AT&T test team and troubleshot any issues they encountered in their test scenarios
  • As team lead, handle all Prod Ops team responsibilities
  • Supported the ATS team with regard to certifying MOPs
  • Environment: Azure DevOps, CI/CD, DevOps, GitOps, OpenStack, Kubernetes, Flux CD, Docker, Helm, Pingdom, Jenkins, Karate, Unix, XML, HTML, Oracle SQL Developer, VNC 4.1.2, JIRA, UML, Quality Center 10.0, Windows NT, WebTrax, ATF.
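Below is a minimal, illustrative Python sketch of the kind of post-installation smoke check referenced in the list above. It assumes kubectl access to the target cluster; the namespace and health-check URL are hypothetical placeholders, not the actual OPC deployment or Jenkins job details.

#!/usr/bin/env python3
"""Minimal post-installation smoke check sketch.

Assumes kubectl is already configured for the target cluster; the namespace
and health URL below are hypothetical placeholders.
"""
import json
import subprocess
import sys
import urllib.request

NAMESPACE = "opc-lab"                           # hypothetical namespace
HEALTH_URL = "http://opc.example.lab/healthz"   # hypothetical health endpoint


def pods_ready(namespace: str) -> bool:
    """Return True if every pod in the namespace reports all containers ready."""
    out = subprocess.run(
        ["kubectl", "get", "pods", "-n", namespace, "-o", "json"],
        check=True, capture_output=True, text=True,
    ).stdout
    for pod in json.loads(out)["items"]:
        statuses = pod.get("status", {}).get("containerStatuses", [])
        if not statuses or not all(s.get("ready") for s in statuses):
            print(f"NOT READY: {pod['metadata']['name']}")
            return False
    return True


def health_ok(url: str) -> bool:
    """Return True if the application health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except OSError as exc:
        print(f"Health check failed: {exc}")
        return False


if __name__ == "__main__":
    ok = pods_ready(NAMESPACE) and health_ok(HEALTH_URL)
    print("SMOKE TEST PASSED" if ok else "SMOKE TEST FAILED")
    sys.exit(0 if ok else 1)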

UAT Team Lead/ Sr. QA Engineer

Cox Communications
Atlanta, GA
04.2010 - 01.2011
  • Cox Communications is one of the most popular cable companies; worked on wireless implementation projects involving both the MVNO network model and Cox's own network model
  • Tested and Certified Siebel CRM application developed for Telesales and Retail channels
  • The application allows Telesales and Retail agents to place orders, activate and provision wireless accounts, manage phone lines, pay bills, upgrade/downgrade phones, manage minutes, and offer many additional VAS services
  • Worked on various projects related to billing, web enhancements, upgraded services, family allowances, the taxes being charged, and any new service being introduced into the market, such as new plans, phones, and services
  • Project involved functional, regression, GUI, smoke, backend testing of the application to meet and validate specified requirements
  • Responsibilities:
  • Worked closely with User groups to determine user requirements and goals
  • Analyzed Business requirements and System specifications to develop Test Strategies, Test Plans, Test Cases and Test Scripts for the Functional, Integration, Regression testing and User Acceptance testing
  • Baseline In-scope and Out of scope test cases and maintain Traceability matrix mapping to requirements
  • Executed test cases and performed functional and regression testing
  • Perform VAS feature testing such as mPortal, rDVR, UCM, WAPS, SMS, MMS and email
  • Ensured the execution of UAT test cases and documentation of test results
  • Provided UAT testing status information to the QA Manager, project manager, and various business stakeholders
  • Executing UAT scripts, in order to ensure operational quality, system integrity and verify system processes meet user needs
  • Create UAT Test Strategy, UAT Test Plans and UAT defect report
  • Documentation of Requirements, Wireframes, Specifications, Design, Quality Test Plan, Integrated Test Plan and Software Turnover Document
  • Used Quality Center to log in defects and for Defect Tracking
  • Executed Production verification, Roaming verification and Porting test cases
  • Performed device testing on white-label phones such as Samsung, HTC, Apple iPhone, Motorola, LG, and Cox-branded Kyocera phones
  • Executed roaming test cases by visiting marketplaces (Oklahoma City, OK)
  • Used Remedy to log production defects and raise CUI or UNO ticket based on the severity and route to production support team
  • Tracked the progress of test case planning, implementation and execution results
  • Assisted project teams to implement and document standards, procedures, artifacts, and plans consistent with QA and Test deliverables for the project team
  • Performed various boundary tests as well as data validation to verify record retrieval
  • Created and executed Manual Test scripts to verify complex system requirements and to validate new functionality
  • Addressed high risk areas early, refined requirements as the project evolved, accommodated change and moved forward even without a firm set of requirements
  • Performed testing iteratively by fragmenting milestones and by testing important scenarios at the end of each Iteration
  • Environment: Siebel CRM, JSP, XML, HTML, MS SQL Server, QTP 8.0, Quality Center, Windows NT, Toad, Remedy.

Test Lead/ Sr.QA Engineer

InterCall
Atlanta, GA
10.2007 - 03.2010
  • InterCall is one of the most popular and rapidly growing conferencing companies
  • As a Quality Assurance Engineer maintained QA standards, procedures and processes (FRD’s, HLD’s and Build Reports)
  • Manually tested different billing applications like CAS Core App, CAS GUI, Billrunframework and e-commerce enabled web products like WebEx, IUM and IWM
  • Ecommerce is an optional feature that InterCall will turn ‘ON’ upon a customer’s request
  • Project involves Designing and execution of test cases for production bug fixes and enhancement tickets
  • Project involved functional, regression, GUI, Load, Smoke, backend testing and upgrades & network support
  • Responsibilities:
  • Involved in writing several test cases for CAS web and core applications and execution based on FRD and HLD
  • Converting Business and System requirements into positive and negative test cases
  • Performed both manual and automated tests to conduct functional and regression tests on the application
  • Performed acceptance, mod, regression, browser, and post-deployment testing
  • Responsible for User Acceptance Test (UAT) planning: designing UAT test cases, selecting the team to execute them, executing test cases, documenting test results and defects for every test cycle, resolving issues/bug fixes, and sign-off
  • Used QTP for functional and regression testing and Developed reusable VB scripts
  • Involved in White Box testing and Black Box Testing
  • Responsibilities included regularly interacting with the project manager and preparing the project plan for Integrated and Non-Integrated releases (Agile)
  • Prepare Code Publication Document (CPD) for the deployment (Integrated & Non-Integrated Release)
  • Participated in releases and Network upgrades for the projects and perform smoke testing
  • Performed testing on audio bridges such as Spectel, Compunetix, IVR, and IICP (WIC) usage collection in the respective databases
  • Validated the various processes (Collector, Converter, Pre-rater, and Extraction) involved in the CAS middleware application manually using SQL queries
  • Created process flow charts for each bridge process
  • Conducted defect management tracking, verification and closure using Quality Center 9.2
  • Mocked test data to validate various scenarios
  • Validated the scalability of the application by configuring the Config.xml file
  • Verified the integrity of the database using SQL queries through Aqua Data Studio (IDE); a representative scripted sketch of such a check appears after this list
  • Extensively used PuTTY to access the server and deploy code (Global.jar & Config.xml files) using UNIX commands for each CM and bug ticket in Integrated and Non-Integrated releases
  • Checked the log files for each core application process
  • Took the initiative in coordinating with various teams to create test data for the Genesys (IUM) -> InterCall usage streaming and billing automation process
  • Worked as the primary QA engineer on a project closely monitored by the leadership team
  • Project involves Web CDR fetching, collecting, web-audio synchronizing (JMS messages), rating and extracting (DI jobs) IUM Per Minute and Multimedia usage, Time zone conversions and Topaz monitoring
  • Share innovative ideas among the team in finding a work around for a scenario
  • Prepared process flow diagrams using MS-Visio
  • Knowledge transfer to new hires, offshore coordination
  • Facilitate the status call with Project Manager
  • Environment: Quality Center 9.2, Java, UNIX, Visual Basic 6.0, ASP.NET 3.0, QTP 8.2, XML, XPath, XQuery, XML Schema, Oracle 9i, JavaScript, HTML, Aqua Data Studio 4.7.2, Data Integrator XI 3.0, JMS, Windows 2000/NT, PuTTY.
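Below is an illustrative sketch of the kind of database integrity check described in the list above. The original validations were run manually as SQL queries in Aqua Data Studio; this Python/cx_Oracle version is only a scripted example, and the connection details, table names, and columns are hypothetical placeholders rather than the actual CAS schema.

#!/usr/bin/env python3
"""Sketch of the kind of SQL integrity check run against a billing database.

Connection string, tables, and columns below are hypothetical placeholders.
"""
import cx_Oracle  # Oracle driver; assumes the Oracle client libraries are installed

# Hypothetical credentials and DSN
conn = cx_Oracle.connect("qa_user", "qa_password", "dbhost/CASDB")

CHECKS = {
    # every collected CDR should have exactly one rated record
    "unrated_cdrs": """
        SELECT COUNT(*) FROM collected_usage c
        WHERE NOT EXISTS (
            SELECT 1 FROM rated_usage r WHERE r.cdr_id = c.cdr_id
        )
    """,
    # rated totals should never be negative
    "negative_charges": "SELECT COUNT(*) FROM rated_usage WHERE charge_amount < 0",
}

cur = conn.cursor()
for name, sql in CHECKS.items():
    cur.execute(sql)
    (count,) = cur.fetchone()
    status = "OK" if count == 0 else f"FAILED ({count} rows)"
    print(f"{name}: {status}")

cur.close()
conn.close()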

Sr.QA Engineer

Siemens Information Processing Services
Chennai, India
11.2005 - 09.2007
  • The application was developed for Siemens Energy Services (SES); the system makes it easy for users to enter data in the appropriate fields
  • The project, titled "Billing System", was developed to maintain the entire billing details of SES, UK
  • The project maintains details of various customers' information, takes meter operation requests through the Appointment Booking application (Siemens Appointment Booking and Reporting Engine GUI app), and handles various transactions such as meter changes, tariff changes, address changes, and spot billing
  • Responsibilities:
  • Involved in all phases of testing life cycle (analysis, design and execution)
  • Created test requirements based on the business specifications and listing of test requirements at very high level
  • Document overall test strategy for unit, integration, system, regression and UAT testing
  • Extensive experience in functional testing, unit testing, integration testing, regression testing, black box testing, GUI testing, back-end testing, browser compatibility testing, load/performance testing in different stages of Software Development Life Cycle (SDLC)
  • Peer review of the documents like Test plan and test scripts developed by the Team
  • Trained users on how to use the tracking system and on the life cycle of bug reports in the tracking system during UAT
  • Used structured manual testing processes and techniques while exploring automation Possibilities
  • Involved in development of Test plan of the testing process and determine the test environment requirements for the application
  • Analyzed system requirements, developed & executed detailed Test plans, Test cases, Test data, TSL scripts for testing the functionality, GUI, security, and usability of the Join Process
  • Responsible for Integration, Front-end and Backend testing
  • Updated QTP scripts for regression testing and run them in different environments for data validation
  • Utilized Test Director for creating use cases/test cases and implementing requirements traceability to demonstrate full test coverage of the applications
  • Used Test Director as the defect-tracking tool to report application defects and enhancement requests
  • All QA documents complied with SOX requirements
  • Performed the Back-End integration testing to ensure data consistency on front-end by writing and executing SQL queries on the Oracle database
  • Evaluate and Report Test Results & reported the overall progress periodically to the Project Management.

QA Engineer

Hewlett-Packard, GeBC
Chennai, India
03.2003 - 09.2005
  • The Configuration Database system is developed to provide the end user with the information required to configure and setup various server systems
  • This system also serves as a knowledge base for any technical issues user encounters
  • This system was developed as Oracle forms application with Oracle reports used as the reporting tool
  • Responsibilities:
  • Involved in back-end testing (Manual) for the full functionality of Web Services
  • Responsible for requirement analysis, identification and documentation of required system and functional testing efforts for all test scenarios (Positive and Negative tests)
  • Derived and developed Requirements, Functional, Regression Test Cases from Use Cases and Test Scenarios
  • Involved in preparing Use Cases and Test Scenarios based on BRD (Business rules documentation) and Technical Specification
  • Document overall test strategy for unit, integration, system, regression and UAT testing
  • Peer review of the documents like Test plan and test scripts developed by the Team
  • Trained users on how to use the tracking system and on the life cycle of bug reports in the tracking system during UAT
  • Used Test director to write Manual Test Cases, organize, and execute the test cases efficiently
  • Executed test cases; tracked and analyzed defects using ClearQuest
  • Reported defects and tracked fixes using the bug-tracking tool ClearQuest
  • Wrote SQL queries against SQL Server to insert, retrieve, and update data at the back-end level (to generate different sets of custom data)
  • Executed SQL scripts/queries for data verification to compare the expected results with database
  • Tested AS/400 applications through user interface for the business process functionality as well as browsed the intermediate files for code updates
  • Responsible for core data conversions and functionality enhancements from mainframe to SQL server
  • Mainframe Testing included but not limited to systems, systems integration, functional and UAT testing
  • Partially involved in front-end application testing for UI and regression testing
  • Evaluated and Reported Test Results & reported the overall progress periodically to the Project Manager
  • Submitted Status Report and filled the Dashboard for QA Manager and Project Manager Weekly
  • Close interaction with software developers to understand application functionality and performance issues
  • Highly Involved in Managerial Meetings and attended various conference calls at different levels from QA, Developers and Clients
  • Worked with business customers, software engineers, QA engineers, and project leads to ensure successful roll out of high-quality application
  • Environment: Microsoft SQL Server, Mainframe, AS/400, XML, IIS 5.0 and 6.0, IE 5.0, Windows XP

Education

MBA - Information Systems

Sikkim Manipal University
India

BSc - Computer Science

Osmania University
India

Skills

  • Microsoft Azure DevOps
  • CI/CD
  • DevOps
  • Docker
  • VMware Client 3.5
  • vSphere Client 4.0
  • RedHat Linux 4
  • RedHat Linux 5
  • RedHat Linux 6
  • RedHat Linux 7
  • Ubuntu
  • CentOS
  • Solaris 8
  • Solaris 9
  • Solaris 10
  • Unix
  • Windows Server 2003
  • Windows Server 2008
  • Windows Server 2008 R2
  • Windows Server 2012
  • Windows 2000
  • Windows XP
  • Windows 7
  • Windows 10
  • SDLC
  • Agile/Scrum
  • Waterfall
  • Chef
  • Puppet
  • Ansible
  • AWS
  • CA Workload Automation (AutoSys) AE
  • Windows Task Scheduler
  • Kubernetes
  • OpenShift
  • JIRA
  • Bitbucket
  • Confluence
  • Jenkins
  • SVN
  • CVS
  • Git
  • GitHub
  • Gitlab
  • Bitbucket
  • Ant
  • Maven
  • Apache HTTPD
  • Apache Tomcat Server
  • WebSphere
  • LDAP
  • JBOSS
  • IIS
  • SMTP
  • SNMP
  • ICMP
  • TCP/IP
  • FTP
  • TELNET
  • DHCP
  • DIG
  • UDP
  • RIP
  • OSPF
  • BGP
  • Eclipse IDE
  • IntelliJ IDEA
  • NetBeans IDE
  • Notepad++
  • Oracle 11g
  • Oracle 10g
  • Oracle 9i
  • SQL Server 2012
  • SQL Server 2008
  • MySQL
  • Teradata
  • IBM Netezza
  • C
  • Java
  • .NET
  • Python
  • Ruby
  • Perl
  • Shell Programming
  • Shell
  • Oracle

Skill Set - Technical Environment

Microsoft Azure DevOps, CI/CD, DevOps, Docker, VMware Client 3.5, vSphere Client 4.0, RedHat Linux 4/5/6/7, Ubuntu, CentOS, Solaris 8/9/10, Unix, Windows Server [2003, 2008, 2008 R2, 2012], Windows 2000/XP/7/10, SDLC, Agile/Scrum, Waterfall, Chef, Puppet, Ansible, AWS, CA Workload Automation (AutoSys) AE, Windows Task Scheduler, Kubernetes, OpenShift, JIRA, Bitbucket, Confluence, Jenkins, SVN, CVS, Git (GitHub, GitLab, Bitbucket), Ant, Maven, Apache HTTPD, Apache Tomcat Server, WebSphere, LDAP, JBoss, IIS, SMTP, SNMP, ICMP, TCP/IP, FTP, TELNET, DHCP, DIG, UDP, RIP, OSPF, BGP, Eclipse IDE, IntelliJ IDEA, NetBeans IDE, Notepad++, Oracle 11g/10g/9i, SQL Server 2012/2008, MySQL, Teradata, IBM Netezza, C, Java, .NET, Python, Ruby, Perl, Shell Programming, Oracle

Timeline

Production Operations - Software Development Team Lead/Manager

Amdocs
02.2011 - Current

UAT Team Lead/ Sr. QA Engineer

Cox Communications
04.2010 - 01.2011

Test Lead/ Sr.QA Engineer

InterCall
10.2007 - 03.2010

Sr.QA Engineer

Siemens Information Processing Services
11.2005 - 09.2007

QA Engineer

Hewlett-Packard, GeBC
03.2003 - 09.2005

MBA - Information Systems

Sikkim Manipal University

BSc - Computer Science

Osmania University