
TERRI M. SAUNDERS-HARLEY

WALDORF, MD

Summary

Process and quality specialist with more than 20 years of experience, coupled with an MBA and a BA in Information Technology.

OBJECTIVE: Seeking a full-time position that offers professional challenges and draws on strong interpersonal, time management and problem-solving skills. Manage each project’s scope and timeline; coordinate sprints, retrospective meetings and daily stand-ups; coach team members in Agile frameworks; and facilitate internal communication and effective collaboration. Serve as the point of contact for external communications (e.g., from customers or stakeholders). Work with product owners to manage backlogs and new requests, resolve conflicts and remove obstacles, and help teams implement changes effectively. Ensure deliverables meet quality standards at the end of each sprint, guide development teams to higher Scrum maturity, and help build a productive environment where team members ‘own’ the product and enjoy working on it.

SUMMARY OF QUALIFICATIONS:

Well-versed in building positive relationships with customers and other stakeholders. Strong requirements gathering, scope development and inventory coordination abilities. Skilled at overseeing complex, high-value technical projects with excellent planning competencies.

Instrumental Senior Systems Engineer bringing 20 years of experience achieving ambitious goals in challenging IT environments. Diligent, forward-thinking and adaptable to dynamic company, customer and project needs. Successful at motivating teams to meet demanding timelines.

Organized and dependable candidate successful at managing multiple priorities with a positive attitude. Willingness to take on added responsibilities to meet team goals.

Hardworking and passionate job seeker with strong organizational skills eager to secure Senior Systems Engineer position. Ready to help team achieve company goals.

Detail-oriented team player with strong organizational skills. Ability to handle multiple projects simultaneously with a high degree of accuracy.

Team-oriented individual with exemplary presentation, project management and risk oversight skills. Demonstrated Scrum/Agile methodology background, communicating effectively with and leading high-performance teams. Considered an expert in prioritizing tasks and optimizing workflows.

Systems engineer with 10+ years of experience in infrastructure development for cloud computing platforms. Practiced at transitioning customers from on-premise solutions to cloud-based deployments. Versed in utilizing cloud infrastructure to alleviate pitfalls inherent to on-site computing. Provider of smooth, well-planned transition architecture. Veteran Systems Engineer versed in developing solutions for mission-critical needs. Specializes in deploying systems within ongoing operations while causing minimal disruptions. Committed to verifying compatibility, required operational thresholds and budgetary compliance of developed products.

Overview

24 years of professional experience
1 Certification

Work History

Systems Engineer

SanCorp Consulting, LLC
06.2021 - Current
  • Serve as a Systems Engineer supporting the Office of the Secretary of Defense (OSD)/Chief Digital & Artificial Intelligence Office (CDAO); design, consolidate, and integrate computer networks and advanced networking solutions, including local area networks (LANs), wide area networks (WANs), service provider networks, enterprise networks, and data centers, to build out a platform and infrastructure that effectively supports processing of AI/ML workloads and delivers scalable AI capabilities
  • Identify high-level architectures and design patterns best suited to a complex multi-vendor, multi-organization environment
  • Identify technical risks and mitigation strategies
  • Initiate and develop methods to assess end-to-end systems performance using operational measurements or other available test beds to ensure end-to-end performance objectives are met
  • Ensure all functional and performance analysis models developed reflect the operational performance requirements
  • Lead the design of extensive multi-tier architectures with various levels of enabling platforms and technologies, as well as the design of API and data standards
  • Assist in the evaluation of new products and services; make recommendations for improvements and assist in the development and documentation of network architecture
  • Work with CFT team members, working groups and committees to provide senior-level technical direction, leadership, and guidance in the accomplishment of challenging tasks, including operational use of AI (both ML and non-ML)
  • Serve as senior technical advisor in formulating technical approaches and selecting tools and diagnostic methods for solving customer problems utilizing artificial intelligence in a timely manner
  • Serve as CDAO representative for agency-wide Integrated Product Teams (IPTs).

Test & Evaluation Engineer

Core4ce
08.2020 - 05.2021
  • Served as a T&E Engineer supporting the United States Navy Yard in Washington, D.C.
  • Provided technical support to various aspects of the T&E planning, execution, and reporting process
  • Supported integration testing events for the United States Navy Electronic Procurement System (ePS), which support the contract award decision and document the overarching plan to coordinate and execute T&E of Navy ePS
  • Supported the Navy ePS T&E Team in managing and providing oversight of all elements of testing, including scope, objectives, plans, reports and schedules, coordinating with T&E Working Integrated Product Team (WIPT) stakeholders: the Navy ePS T&E Lead, vendor(s), DON major commands/claimants (e.g., HCAs), and other applicable test organizations (e.g., COTF, Joint Interoperability Test Command (JITC))
  • Provided additional confidence that Navy ePS will be fit for operations and satisfy the business needs and associated requirements for functional and technical performance
  • Supported Navy ePS CWS life cycle processes to ensure compliance with regulatory, performance, schedule, and budgetary requirements and to facilitate early detection of variances
  • Enhanced management insight into system and performance risks
  • Served as liaison between the contractor and the user community; discussed training requirements and how training should be presented to users
  • Designed, executed, and maintained automated and manual test scripts and generated test data
  • Performed data validation; executed regression, functional, load and system testing; and verified results, producing test summaries and defect reports
  • Identified defects and requirement discrepancies, generated defect change requests, and reported discrepancies
  • Reported test outcomes by collecting, analyzing, interpreting, summarizing, and displaying data; resolved testing problems by modifying testing methods during tests and conferring with management to revise test objectives and plans


System Analyst Expert

Kaimetrix
09.2019 - 08.2020
  • Served as a DISA System Analyst Expert, overseeing the systems development life cycle (SDLC) process, including business needs analysis, requirements gathering, information system architecture, design, integration, implementation, QA, deployment, and production maintenance.
  • Created and maintained project plans and schedules, set milestones, assigned resources, and identified/mitigated risks to ensure successful execution. Coordinated major releases with local and remote deployment support teams and briefed JITC stakeholders on new features and changes. Interfaced with stakeholders to understand business needs and drive effective requirements gathering. Kept JITC stakeholders engaged to ensure that the final work product met customer expectations. Reviewed proposed new features and connectivity to provide feedback on feasibility and potential impact.
  • Coordinated integrated testing activities. Reviewed and evaluated test requirements to ensure completeness of the test program. Performed technical analysis of complete systems and prepared comprehensive system-level evaluations. Implemented software development and maintenance processes and methods.
  • Ensured measures met acceptable reliability standards. Verified that the system under test worked according to the required specification before deployment. Ensured that project and process control documentation was compliant with requirements, objectives and/or the contract. Reviewed software designs, change specifications, and plans against contractual and/or process requirements
  • Performed or directed verification of software requirement allocations, traceability, and testability. Collaborated with upper management to drive strategy and implement new processes. Improved systems with the addition of new features and infrastructure.

Computer Scientist Principal

Jacobs Technology
07.2018 - 09.2019
  • Served as a Computer Scientist Principal, working with the Defense Information Systems Agency (DISA) NBIS Test and Evaluation support team to accomplish developmental, operational, and joint interoperability test, evaluation and certification of the National Background Investigation System (NBIS)
  • NBIS capabilities include both legacy and newly developed applications that encompass an integrated Commercial Off-The-Shelf (COTS) and Government Off-The-Shelf (GOTS) acquisition model for developing, testing and deploying capabilities
  • Provided technical support in four specific areas: planned and executed Developmental Test and Evaluation (DT&E) on NBIS Program Management Office (PMO)-developed components; contributed to the development, maintenance, and implementation of the NBIS Overarching T&E Strategy.
  • Conducted cybersecurity DT&E on NBIS; conducted operational cybersecurity assessments of the re-hosted NBIS components and NBIS PMO-developed components; and supported NBIS Operational Test and Evaluation (OT&E), planning and conducting Initial OT&E (IOT&E) for NBIS once the NBIS PMO-developed systems were integrated, fielded, and fully operational.
  • Supported task management; performed administrative task management duties over the period of performance to support JITC T&E efforts for NBIS. Completed reviews of code, requirements and project plans.
  • Gathered requirements, developed project plans, and worked with stakeholders to develop quarterly roadmaps based on impact, effort and test coordination.

Technical PM/Testing Manager

Software Consortium, LLC D/B/A PrimeSoft
04.2018 - 06.2018
  • Served as a Technical PM/Testing Manager, managing projects through the full lifecycle: initiating, planning, executing, monitoring and controlling, and closing
  • Facilitated meetings and conversations, clearly articulating expected outcomes, issues, and reusable system components
  • Communicated project plans and progress to key stakeholders and project contributors.
  • Supervised a team of 12 test specialists and translated complex program requirements into testable objectives.
  • Analyzed and recommended test cycles, test plans, safety reviews, and detailed test objectives.
  • Managed and worked effectively with diverse organizations and personalities to execute a successful test program and verify quality control.
  • Maintained quality throughout the software lifecycle
  • Developed software quality assurance plans
  • Conducted formal and informal reviews at predetermined points throughout the development lifecycle, covering validation, software testing and integration, and software metrics and their application to software quality assessment.
  • Applied subject matter knowledge to high-level analysis, design, development, modeling, simulation, integration, installation, documentation and implementation.
  • Resolved problems requiring an intimate knowledge of the related technical subject matter.
  • Applied principles and methods of the subject matter to specialized solutions, including but not limited to environmental, scientific, maintenance and repair processes, and logistical support activities.
  • Performed data conversions and data migrations and developed test data
  • In addition, wore several different hats, working with the development team and application SMEs as the liaison to analyze requirements and designs and to develop test scenarios and cases
  • Facilitated and coordinated Joint Application Requirements (JAR) and Joint Application Development (JAD) sessions, applying strong interpersonal and conflict-resolution skills
  • Elicited requirements from business users and stakeholders using interviews, document analysis, and requirements workshops
  • Critically evaluated information gathered from multiple sources, reconciled conflicts and decomposed high-level information into details to deliver required artifacts.
  • Collaborated with business users, technical teams, database administrators and testing teams during kickoff meetings, joint application designing, and planning sessions to validate requirements.

Senior IT Consultant

Horizon Industries Contracting
12.2017 - 03.2018
  • Was called in by the Lead PM for a 90-day engagement to help get the project back on track, working 60-hour weeks
  • Primary responsibilities were supervising a team of test specialists and translating complex program requirements into testable objectives
  • Analyzed and recommended test cycles, test plans, safety reviews, and detailed test objectives
  • Managed and worked effectively with diverse organizations and personalities to execute a successful test program
  • Provided overall testing and documentation of the Test and Evaluation Master Plan (TEMP) in accordance with DODI 5000.2.
  • Performed verification and validation, software testing and integration, and software metrics and their application to software quality assessment.
  • Determined the resources required for information technology quality control
  • Maintained quality throughout the software lifecycle
  • Developed software quality assurance plans
  • Conducted formal and informal reviews at predetermined points throughout the development lifecycle
  • Applied subject matter knowledge to high-level analysis, design, development, modeling, simulation, integration, installation, documentation and implementation
  • Resolved problems requiring an intimate knowledge of the related technical subject matter
  • Applied principles and methods of the subject matter to specialized solutions, including but not limited to environmental, scientific, maintenance and repair processes, and logistical support activities
  • Demonstrated ability to manage projects through the full lifecycle: initiating, planning, executing, monitoring and controlling, and closing
  • Defined the project scope and executed data collection methodologies and plans
  • Created foundational project artifacts such as project charters, schedules, plans, issue and risk logs, and status reports
  • Developed weekly briefings for senior-level executives
  • Built and maintained client relationships
  • Created and delivered project-level documentation, reports, and deliverables
  • Drove the task-level project Integrated Master Schedule (IMS)
  • Ensured the project was delivered on time and on budget
  • Controlled, supervised, and organized technical reviews of technology selections and architectural approaches for assigned projects, with special emphasis on cost, schedule, and risk
  • Communicated expected outcomes, issues, and risks, and the steps needed to resolve them
  • Developed clear specifications for project plans using customer requirements.
  • Contributed ideas and suggestions in team meetings and delivered updates on deadlines, designs, and enhancements.
  • Planned and developed interfaces that simplified overall management and offered ease of use.

Quality Assurance Manager

Technatomy Corporation
10.2016 - 11.2017
  • Served as a Quality Assurance Manager, developing, implementing and maintaining quality assurance systems and activities within the Industrial Base Management Systems (IBMS) program
  • The Industrial Base program supports Defense Logistics Agency (DLA) Headquarters and the Inventory Control Points (ICPs) by providing a one-stop resource center for the DLA Warstopper and Industrial Base Capabilities program, which defines and specifies the implementation of standards, methods, and procedures for inspecting, testing and evaluating the precision, accuracy, and reliability of products and deliverables
  • Participated in the review of engineering designs to contribute quality assurance requirements and considerations
  • Oversaw the validity of results, accuracy, reliability, and conformance to established software QA standards
  • Held overall responsibility for the development of the program/project Software Quality Assurance Plan and the implementation of procedures conforming to the requirements of the contract
  • Managed the definition, implementation, and integration of quality principles into the design and development of system software and IT processes
  • Provided an independent assessment of how the program’s/project’s software development process was being implemented relative to the defined process and recommended methods to optimize the organization’s process
  • Was responsible for all activities involving quality assurance and compliance with applicable statutory/regulatory requirements
  • Conducted audits and reviewed/analyzed data and documentation
  • Worked with Business Analysts and Test Engineers to develop and implement procedures and test plans for assuring quality in a system development environment supporting large databases and applications
  • Provided guidance and subject matter expertise to developers on testing and Quality Assurance (QA)
  • Successfully deployed Sprint 1 and Sprint 2 into production
  • Established and maintained a process for evaluating software and associated documentation
  • Led efforts associated with establishing initial software and hardware configuration baselines for systems entering the test program, including assembling and verifying configuration item documentation and technical data packages
  • Conducted systems engineering critical functions (system architecture) analysis and gap analysis to ensure proposed configuration items (CIs) adequately reflected those elements critical to system performance against the associated requirements to be tested
  • Determined the resources required for information technology quality control
  • Maintained quality throughout the software lifecycle
  • Developed software quality assurance plans
  • Conducted formal and informal reviews at predetermined points throughout the development lifecycle
  • Applied subject matter knowledge to high-level analysis, design, development, modeling, simulation, integration, installation, documentation and implementation
  • Resolved problems requiring an intimate knowledge of the related technical subject matter
  • Applied principles and methods of the subject matter to specialized solutions, including but not limited to environmental, scientific, maintenance and repair processes, and logistical support activities
  • Managed projects through the full lifecycle: initiating, planning, executing, monitoring and controlling, and closing
  • Environment: application server, IIS 7.5, Enterprise Test Center (ETC), virtual machines (VM), SQL Server 2012 R2 relational database, Agile delivery methodologies, Job Access With Speech (JAWS), and Team Foundation Server (TFS)
  • Performed root cause analysis to identify and resolve quality issues and defects.
  • Created and maintained quality management systems to align with industry standards.

Senior System Engineer

SEKON DOD Healthcare Management System Modernization
07.2016 - 10.2016
  • Assisted employees with resolving network problems at remote locations.
  • Worked with stakeholders to determine implementation and integration of system-oriented projects.
  • Documented system configuration, mapping and processes.
  • Proposed technical feasibility solutions for new system designs and suggested options for performance improvement of technical components.
  • Maintained stability, integrity and efficient operation of information systems supporting organizational functions.
  • Suggested system updates or changes after conducting in-depth technical reviews.
  • Identified software issues and handled troubleshooting to resolve quickly.
  • Analyzed security logs to determine and alleviate network threats.
  • Conducted end-user reviews for modified and new systems.
  • Checked for accuracy and functionality during implementation of new systems.
  • Tested and analyzed equipment design and performance feasibility to determine potential ROI.

Senior Business Analyst/QA System Analyst

Project Performance Company (PPC)
12.2015 - 06.2016
  • Served as a Senior Business Analyst/QA System Analyst; based on an urgent business need and a limited budget, was brought in for a short-term assignment with a dual role, supporting the Department of Homeland Security (DHS) and the Project Manager to help facilitate the completion of accurate and compliant contract deliverables.
  • Provided support and expert knowledge of business processes, workflows, systems and technologies to support projects within the federal government sector.
  • Responsible for ensuring IT deliverables met the functional and non-functional needs of customers.
  • Reviewed contract deliverables (Requirements Document, Test Plans, Test Cases, Test Scenarios, Test Execution) and developed User Acceptance Tests (UAT).
  • Guided teams and refined Agile techniques throughout journey and evolution of delivery cycles, establishing documentation framework to support Agile methodology implementation with Waterfall-centric delivery team.
  • Supported leadership team with reporting, analysis, and business presentations to inform divisional strategies.

Senior Quality Assurance Specialist

Zolon Technical, Inc., Environmental Protection Agency
07.2014 - 12.2014
  • Served as a Senior Quality Assurance Specialist, working independently with customers and developers to define business requirements applied to the design, development, implementation, evaluation and management of systems supporting web-based applications for the Environmental Protection Agency (EPA)
  • Responsible for unit, integration, and functional qualification testing (Functional Unified Test, FUT) on the PeopleSoft Payroll all-in-one Human Resources Information System (HRIS), following the Scrum/Agile methodology
  • Performed Independent Verification & Validation (IV&V) reviews and testing activities, which included the development of test plans, procedures, and test/evaluation reports to support various stages of testing (e.g., unit, integration, system, performance, functional qualification, and acceptance) for all initial and updated increments/releases and components.


Senior Technical Consultant

Robbins Gioia (RG)/Wright-Patterson Air Force Base
03.2013 - 06.2014
  • Served as a Senior Technical Consultant; defined the project scope and executed data collection methodologies and plans.
  • Conducted analysis of studies and surveys, analyzed and validated data to determine solutions, and advised on alternative recommendations for Wright-Patterson Air Force Base.
  • Managed multiple simultaneous activities using project management principles, methodologies, and tools.
  • Created project plans for information technology-based projects in cooperation with customers, senior IT personnel, investment review boards or other stakeholders.
  • Developed standardized and detailed project requirements for the organization.
  • Maintained data set-up and backend validation, performed data quality analysis via SQL queries, and formulated a solid test approach/strategy using a variety of sources, ranging from well-documented requirements specifications, to validate requirements.
  • Worked with multiple database interfaces and was familiar with standard web architectures.
  • Supported and tested usability from the initial design stages through user acceptance test.
  • Conducted and documented assessments to ensure that project phases and deadlines were met and that products were built to specified requirements.
  • Recommended technology upgrades to improve client security.
  • Troubleshot systems comprised of security alarms and Internet connectivity.

Quality Assurance Manager

ICF International, Federal Aviation Administration (FAA)
05.2011 - 02.2013
  • Served as a Quality Assurance Manager for verification and validation control of projects to ensure quality control for design, testing, and maintenance of human resources software applications
  • Defined quality management plan and provided leadership in quality planning, quality assurance, and quality control. Prepared test strategies, test plans, and supervised all testing activities throughout software development life cycle.
  • Managed the testing processes for the FAA’s contract information tracking tool, a web-based application that streamlines the management of multi-billion-dollar contracts within the NISC Program. Coordinated all program test activities with a team of 20 Test Engineers. Responsibilities included leading multiple project efforts with adherence to stringent timelines, guiding the testing infrastructure, developing testing frameworks, scheduling day-to-day activities of the Quality Assurance team, and mentoring junior staff.
  • Worked collaboratively with design and development teams on process enhancements to reduce defects and improve time to test, resulting in a more efficient and cost-effective Quality Assurance program.
  • Established quality assurance standard operating procedures applicable to multiple products such as defect tracking, change management, release management reporting, test case development and overall project reporting.
  • Advised senior management of quality trends and prescribed preventive and corrective actions for improvement via status reports, gate reviews, and presentations to management, clients, and end-user representatives.
  • Evaluated technical documentation to determine the most effective testing, inspection, and test automation strategy for projects of varying complexity and size.
  • Established measurable test objectives and ensured all defects uncovered during testing were recorded, analyzed, and utilized in post-project reviews to improve the development and test process. Led efforts associated with establishing initial software and hardware configuration baselines for systems entering the test program, including assembling and verifying configuration item documentation and technical data packages.
  • Developed, analyzed and reviewed User Stories, Test Cases, and Test Scripts using IBM Rational Jazz. Developed Test Summary Reports to determine when user acceptance criteria were met. Analyzed and reviewed technical documentation (Use Cases, Activity Diagrams, Sequence Diagrams, and Class Diagrams) for completeness, to ensure business objectives were adequately stated and subsequently met. Provided input to overall project schedules, with emphasis on defining test schedules, activities, risks, and staffing needs
  • Implemented security measures across the total infrastructure environment. Performed data set-up, backend validation, and data review, and recommended improvements in program policies, procedures, and other complex work products for the initial quality audit.
  • Deployed KITT 1.4, 1.5, and 2.0 releases to production with a 97% quality objective ratio, which led to an 11% budget reduction and delivery 5 days ahead of deadline. Performed root cause analysis to identify and resolve quality issues and defects. Created and maintained quality management systems to align with industry standards. Collaborated with cross-functional teams to develop and implement process and system improvements. Developed and implemented comprehensive quality assurance plans to monitor product quality and adherence to regulatory standards.
  • Implemented new quality assurance and customer service standards. Conducted process and system audits to identify areas of improvement and enforce compliance with industry standards.

Senior Software Tester

Zolon Technical, Inc., Dept. of Energy
11.2010 - 04.2011
  • Served as a Senior Software Tester, developing test cases, test plans, and test scripts using manual and automated testing tools, and produced testing reports for all products developed by the Department of Energy.
  • Analyzed and refined software requirements and translated system requirements into software prototypes.
  • Worked closely with individuals and groups to identify issues, analyze alternatives, and negotiate differences, combining pieces of information to form general rules or conclusions.
  • Performed quality analysis via SQL and formulated a solid test approach/strategy using a variety of sources, ranging from well-documented requirements specifications, to validate requirements.
  • Worked with customers to test software applications. Assured software quality and functionality. Developed and maintained software documentation. Conducted formal and informal meetings involving presentations to decision makers. Applied IT concepts and methods in the design, development, implementation and operational support of IT programs. Planned and oversaw deployment of technology capabilities
  • Coordinated work with various teams to solve problems and improve efficiency for software testing and automation.
  • Worked closely with different departments to develop innovative solutions to functionality issues. Optimized test cases to maximize success of manual software testing. Documented integration issues and vulnerabilities and outlined improvement recommendations.
  • Directed teams completing regression tests to support successful product development stages. Planned and devised cohesive test plans for projects using advanced testing technologies. Checked software beyond testing scripts for interconnected problems not covered by established specifications.
  • Defined, created and controlled testing environments for successful software deliverables. Monitored resolution of bugs, tested fixes and helped developers tackle ongoing problems by providing QA perspective.
  • Documented testing procedures for developers and future testing use. Operated under Agile and Scrum frameworks to complete releases and well-organized sprints. Tested functionality and compatibility of new programs and updates against existing applications. Kept scripts and test cases updated with current requirements. Evaluated function, performance and design compliance of every product against design standards and customer needs.

Senior Test Engineer

TASC, GCSS Marine Corps
05.2009 - 10.2010
  • Supported GCSS Marine Corps and the Joint Interoperability Test Command (JITC) in the warfighter support mission of assuring interoperability of systems and equipment
  • Managed test plans, reports, and metrics for project tracking documents.
  • Certified technical builds for the net-ready key performance parameter (NR-KPP) and supportability aspects of systems, such as hardware or software modifications
  • Verified biometrics interoperability of commercial off-the-shelf products (COTS)
  • Reviewed configurations for final certification
  • Designed testing methods and equipment
  • Created Test Plans, Test Cases and Test Strategies
  • Wrote test case parameters, test scripts and automation guidelines.
  • Researched and acquired extant testing frameworks to speed quality assurance timeframes.
  • Reproduced defects and documented findings.
  • Completed regression tests of new software builds to assess performance and success of bug fixes.
  • Integrated collected data into business process enhancements to address ongoing business goals.
  • Defined and tracked test results, defect counts and performance discrepancies.
  • Identified delivery risks for ongoing projects, developing strategies to avoid delays.
  • Reviewed results and produced daily reports for developers.

Computer System Analyst

Northrop Grumman Corporation
03.2008 - 04.2009
  • Served as a Computer System Analyst, analyzing and refining software requirements for the United States Postal Service (USPS)
  • Provided end-to-end information technology services, including application development, business data management, and enterprise infrastructure support, to the United States Postal Service
  • Conducted tests on computer software programs to ensure proper performance, including performance and reliability testing.
  • Applied IT concepts and methods in the design, development, implementation and operational support of IT programs
  • Performed data set-up, backend validation, and data review; recommended improvements in program policies, procedures, and other complex work products; and worked with responsible, accountable, consulted and informed (RACI) individuals to peer review and baseline documentation for the initial quality audit
  • Performed diagnostics and testing to keep systems working as expected.
  • Assessed current systems and identified ways to add new functionality while boosting processes.
  • Researched latest technologies to add efficiency to organization's existing systems.
  • Communicated complex computer information into easy-to-understand terminology for non-technical individuals.
  • Provided client support on system operation and troubleshooting.
  • Resolved or escalated problem tickets to resolve user issues.
  • Resolved malfunctions with systems and programs through troubleshooting.

Quality Assurance Manager

AT&T Services, Inc
11.1999 - 02.2008
  • Served as a Quality Assurance Manager, developing and managing testing projects using standard SDLC and iterative methods.
  • Assured consistent quality of production by implementing and enforcing automated practice systems.
  • Reviewed and analyzed change requests to determine the scope of work and estimate the level of effort for testing application changes
  • Developed advanced analytical methods to identify business requirements, establish project scope, and evaluate technical and economic feasibility
  • Developed and deployed AT&T U-verse, a home entertainment brand of triple-play telecommunication services, in 22 U.S. states
  • U-verse includes high-speed internet (HSI), IP telephone, and IPTV services, with HBO, Showtime, Cinemax and Starz on demand
  • Documented results of testing activities, including creating and reporting test-discovered defects. Maintained direct knowledge of and applied all applications testing technical processes, procedures, guidelines, and quality standards
  • Supported all SDLC project phases to ensure appropriate system testing activities resulted in quality applications
  • Performed quality process audits (Quality Center and Requisite Pro usage) and quality reviews of selected work products (test plans, test cases, and test summary reports) against defined standards validating adherence to those standards
  • Coordinated integrated testing across multiple applications and development teams
  • Successful project implementations resulted in winning the Vice Chairman’s Award.
  • Defined HP Quality Center setup, utilization, processes, and procedures and integration requirements with Rational Requisite Pro to generate Requirements Traceability Matrices.
  • Evaluated business and technical requirements to develop test plans, cases, and procedures, using automated testing techniques where appropriate to properly validate the target software
  • Determined the feasibility of automated testing and defined the appropriate level of automation for implementation, ensuring scripts could be easily maintained for future iterations
  • Collaborated with the technical team and business subject matter experts (SMEs) to ensure delivered functionality met or exceeded quality goals.
  • Performed root cause analysis to identify and resolve quality issues and defects.
  • Created and maintained quality management systems to align with industry standards.
  • Collaborated with cross-functional teams to develop and implement process and system improvements.
  • Developed and implemented comprehensive quality assurance plans to monitor product quality and adherence to regulatory standards.
  • Implemented new quality assurance and customer service standards.
  • Conducted process and system audits to identify areas of improvement and enforce compliance with industry standards.
  • Monitored staff organization and suggested improvements to daily functionality.
  • Investigated customer complaints and performed corrective actions to resolve quality issues.
  • Assessed product quality by monitoring quality assurance metrics, reports and dashboards.
  • Conducted risk assessments to identify and mitigate potential quality issues.
  • Recorded, analyzed, and distributed statistical information.

Education

Master of Arts - Business Administration

Lindenwood University
St. Charles, MO
2005

Bachelor of Arts - Information Technology/Computer Science

Lindenwood University
St. Charles, MO
2003

Training: Six Sigma Green Belt Certificate, ICF International, 2012; CMMI-DEV Level 3 for Development v1.3

Carnegie Mellon University

Skills

Database

Access (5 years), SQL Server DB development (5 years), Oracle (7 years), J2EE, Java, Oracle 11g or 12c Database RDBMS and SQL (3 years)

Database Tools

Oracle SQL Developer Application (5 years), SQL (7 years), J2EE, Java, Oracle 11g or 12c Database RDBMS (3 years)

Software

Microsoft Office 2010, Microsoft Excel, Microsoft Project, Microsoft PowerPoint, Microsoft Outlook, Microsoft Word, Microsoft Visio and Adobe CS Suite, PeopleSoft Human Resources Information system (HRIS), Jaws Access with Speech for Windows (JAWS) and SharePoint

Development Methodology

Waterfall (5 Years), Agile Scrum (10 Years), RUP (5 years), SDLC (10 Years)

Programming Languages

Python, VBScript, C (2 years), XML (3 years), SQL (5 years), MySQL (3 years), HTML 4 (5 years), WSDL (14 years), ASP.NET (2 years), VB.NET (2 years), PL/SQL (2 years), ColdFusion (2 years), JavaScript (2 years)

Operating Systems

Windows NT, Windows Server, Windows 2000, Windows XP, Windows Server 2008, Linux and UNIX (3 years); Oracle 8, 9i, 10 and 11i, SQL Server, and TOAD

Tools/Methodologies

Erwin 4 (15 years), Erwin r8/r9 (16 years), IT Guru (15 years), Visual Studio 2005–2012, WinRunner 7.0/7.6/8.0/8.2, LoadRunner 7.0/7.2, QuickTest Professional (7 years), Functional Unified Test, TestDirector 7.0/8.0 (5 years), Mercury Quality Center 8.0/8.2, QALoad 5.0, SQL, JIRA, Bugzilla, Rational Clear, TestDirector, HP ALM (5 years), Visual Studio Team Foundation Server (TFS), SmartBear QAComplete and TestComplete, IBM Jazz Platform (Quality Manager, Team Concert, Requirements Composer), Rational ClearQuest, Rational RequisitePro, QuickTest Pro, HP Quality Center, Atlassian (JIRA, Confluence), Rapid SQL, SQL Developer, SQL, UNIX, Java, WebLogic, XML, Microsoft Office Suite (Project, Excel, Visio, PowerPoint), MS Access, Excelsius, HP UFT, WinRunner

Methodologies

Scrum Framework/Agile Delivery Methodologies (5 years, 4 months with Scrum Framework/Agile SDLC), Waterfall (5 years with SDLC), Quality Improvement and Process Definition, Requirements Gathering and Analysis, Staff Management (Matrix, Strategic & Activity Based), Program/Project Management, Team Building, Service Level Agreements, Impact Assessments, ISO 9001:2008, ISO 9001:2015, ITIL Foundation, Internal Audits and Risk Management

Documentation

Test and Evaluation Master Plan (TEMP), Test Plans (10 years), DFDs, ERDs, use cases (10 years), DoDAF at the Department of Defense (DoD), System and Technical View process diagrams [SV, TV, and OV] (7 years), end-user training manuals (5 years), installation manuals (10 years), incident reports, Change Requests (CRs), and test reports (10 years)

Quality Assurance Methodologies

  • Agile Scrum
  • Application Testing
  • Root Cause Analysis
  • Gathering Requirements
  • Technical Analysis
  • Continuous Improvements
  • Testing Procedure Development
  • Integration Readiness Analysis
  • Quality Assurance
  • Regression Test Management
  • Data Analysis
  • System Acceptance
  • Team Meetings
  • Risk Identification
  • Business Requirements Documents (BRDs)
  • Agile Best Practices
  • Atlassian JIRA
  • Scrum Processes
  • Total Quality Management
  • Deliverable Tracking
  • Manage Schedules
  • Sprint Planning
  • Waterfall Methodology
  • Plan Projects
  • Performance Monitor
  • Risk Mitigation Planning
  • Prepare Reports
  • Planning Meetings
  • Lead Teams
  • Software Process Management
  • Change Management Process
  • User Acceptance Testing (UAT)
  • Acceptance Criteria
  • Team Engagement
  • Agility Improvements
  • Quality Deliverables

Certification

Graduate Certificate in Business Administration (2005); Project Management Applications; Microsoft Project 1995–2013

Additional Information

  • Clearance: Active DoD Clearance

Timeline

Systems Engineer

SanCorp Consulting, LLC
06.2021 - Current

Test & Evaluation Engineer

Core4ce
08.2020 - 05.2021

System Analyst Expert

Kaimetrix
09.2019 - 08.2020

Computer Scientist Principal

Jacobs Technology
07.2018 - 09.2019

Technical PM/Testing Manager

Software Consortium, LLC D/B/A PrimeSoft
04.2018 - 06.2018

Senior IT Consultant

Horizon Industries Contracting
12.2017 - 03.2018

Quality Assurance Manager

Technatomy Corporation
10.2016 - 11.2017

Senior System Engineer

SEKON DOD Healthcare Management System Modernization
07.2016 - 10.2016

Senior Business Analyst/QA System Analyst

Project Performance Company (PPC)
12.2015 - 06.2016

Senior Quality Assurance Specialist

Zolon Technical, Inc., Environmental Protection Agency
07.2014 - 12.2014

Senior Technical Consultant

Robbins Gioia (RG)/Wright-Patterson Air Force Base
03.2013 - 06.2014

Quality Assurance Manager

ICF International, Federal Aviation Administration (FAA)
05.2011 - 02.2013

Senior Software Tester

Zolon Technical, Inc., Dept. of Energy
11.2010 - 04.2011

Senior Test Engineer

TASC, GCSS Marine Corps
05.2009 - 10.2010

Computer System Analyst

Northrop Grumman Corporation
03.2008 - 04.2009

Quality Assurance Manager

AT&T Services, Inc
11.1999 - 02.2008

Master of Arts - Business Administration

Lindenwood University

Bachelor of Arts - Information Technology/Computer Science

Lindenwood University

Training: Six Sigma Green Belt Certificate, ICF International, 2012; CMMI-DEV Level 3 for Development v1.3

Carnegie Mellon University
Graduate Certificate in Business Administration (2005); Project Management Applications; Microsoft Project 1995–2013