Test strategy
Slides 3-7
Slide 3 - Test types, levels and techniques
Test Levels:
Integration testing: Manual and automated, by developers & testers
Ensures reliable, secure communication between components such as APIs, interfaces, and databases to allow a smooth flow of data
System testing: Manual and automated, by developers & testers
Ensures the system works properly in the target environment and carries out the required tasks; decreases security risk and promotes reliability
Types of testing:
Performance testing (Automated): Evaluate the stability and responsiveness of the system and network under workload
Security testing (Hybrid): Identify and remediate vulnerabilities
Functional testing (Automated): Ensure the application works properly and meets the requirements; test the basic functionality of the application (see the sketch below)
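To make the functional-testing item concrete, below is a minimal automated functional test sketch using Selenium, the automation tool listed on slide 5. The URL, element IDs, and credentials are hypothetical placeholders, not actual PaperBag details.

# Minimal functional test sketch with Selenium WebDriver (Python).
# The URL, element IDs, and credentials are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://paperbag.example.com/login")
    driver.find_element(By.ID, "username").send_keys("test_user")
    driver.find_element(By.ID, "password").send_keys("test_password")
    driver.find_element(By.ID, "login-button").click()
    # Basic functional check: a successful login should reach the dashboard
    assert "Dashboard" in driver.title
finally:
    driver.quit()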
Preventative Approach
Requirements Review:
Ensure all system, security, and performance requirements are made clear from project initiation, to minimise misunderstandings that could create issues during development of the application
Test Design Techniques
Grey-box Testing
The tester has basic knowledge of the components but not of how they communicate. Helps uncover potential vulnerabilities in the system and provides the benefits of both white-box and black-box testing (see the sketch below).
https://www.imperva.com/learn/application-security/gray-box-testing/#:~:text=Gray%20box%20testing%20(a.k.a%20grey,of%20the%20component%20being%20tested.
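As an illustration, a grey-box test can drive the system through its public API (the black-box view) while using partial knowledge of the schema to verify the stored result directly (the white-box view). The endpoint, table, and connection details below are hypothetical placeholders, not the actual PaperBag interfaces.

import requests
import psycopg2

# Black-box step: call the (hypothetical) Payment API like an external client
resp = requests.post(
    "https://paperbag.example.com/api/payments",
    json={"order_id": 42, "amount": 19.99},
    timeout=10,
)
assert resp.status_code == 201

# White-box step: partial schema knowledge lets us confirm the record landed
conn = psycopg2.connect(host="test-db", dbname="paperbag_test",
                        user="tester", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("SELECT status FROM payments WHERE order_id = %s", (42,))
    row = cur.fetchone()
conn.close()
assert row is not None and row[0] == "COMPLETED"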
Regression Testing: Tests to ensure that changes made to the system, such as bug fixes or upgrades, do not have unwanted side effects or cause regressions in previously working functionality (see the pytest sketch after this item)
https://www.browserstack.com/guide/retesting-vs-regression-testing
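One common way to automate regression testing is to tag existing test cases so the whole suite can be re-run after every change. The sketch below uses pytest markers; calculate_total is a hypothetical function standing in for real PaperBag code.

import pytest

def calculate_total(price, quantity, discount=0.0):
    # Hypothetical function that a bug fix or upgrade might touch
    return round(price * quantity * (1 - discount), 2)

@pytest.mark.regression
def test_total_without_discount():
    assert calculate_total(10.00, 3) == 30.00

@pytest.mark.regression
def test_total_with_discount():
    # Guards against a previously fixed rounding defect reappearing
    assert calculate_total(19.99, 2, discount=0.1) == 35.98

Re-running the tagged suite after each change is then a single command, e.g. pytest -m regression (with the marker registered in pytest.ini).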
Level of independence
Developers: System testing, Integration testing
Testers: System testing, Integration testing, Regression testing
Slide 4- Entry & exit criteria (also covers completion criteria)
Integration Testing
Entry: Components created and ready for test use; test plans and test environment designed and approved; test scenarios designed; testers and developers available for testing
Exit: Integration test cases executed; issues found reviewed; test results documented

System Testing
Entry: Integration testing completed; system environment and scenarios that mirror production created; test plans created and approved; testers and developers available for testing
Exit: System test cases executed; issues resolved; test results documented

Performance Testing
Entry: Performance test scenarios and tools available; system environment and scenarios that mirror production created; required performance standards defined and approved
Exit: Performance meets the earlier approved standards; issues resolved; test reports documented

Security Testing
Entry: Security test scenarios designed and approved; required testing tools and testers available
Exit: System vulnerabilities identified and resolved; test reports documented

Functional Testing
Entry: Testing requirements for functional testing defined and approved
Exit: Test cases executed; issues resolved
Slide 5 - Test environment, data & tooling requirements
Software
Software under test - PaperBag
Integrated systems - Payment system, Inventory management, User profile management, etc.
APIs - Payment APIs, Catalog APIs, User APIs
Databases - PostgreSQL
OS - Windows, macOS, Linux, etc.
Collaboration software - Microsoft Teams, Slack
Hardware
Physical servers - for performance testing
Online servers - for system testing
Client Machines
Network:
Ethernet
Mobile Networks
Browsers and devices
Versions of Chrome, Safari, etc.
Desktops, Laptops, Smartphones
Tools:
Test Management: Jira X-Ray
Test Automation: Selenium
Performance Testing: Apache JMeter
Security Testing: OWASP ZAP (see the scan sketch below)
Collaboration: Slack
Documentation: Confluence
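To show how a tool from this list slots into an automated run, here is a minimal security-scan sketch using OWASP ZAP's Python client (the zapv2 module). It assumes a ZAP instance is already running locally; the target URL and API key are hypothetical placeholders.

import time
from zapv2 import ZAPv2

target = "https://paperbag.example.com"   # hypothetical test URL
zap = ZAPv2(apikey="changeme")            # connects to the local ZAP proxy

zap.urlopen(target)                       # make the target known to ZAP
scan_id = zap.spider.scan(target)         # crawl the application
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

scan_id = zap.ascan.scan(target)          # run the active vulnerability scan
while int(zap.ascan.status(scan_id)) < 100:
    time.sleep(5)

# List alerts so vulnerabilities can be triaged and resolved (exit criterion)
for alert in zap.core.alerts(baseurl=target):
    print(alert["risk"], alert["alert"])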
Data:
Synthetic and production data
Test accounts
Security:
Vulnerability scans before system tests
Secure access for test accounts to the system environment
Login Accounts:
Laptop logins for testers and developers with secure access
Software under test: role-based access
Database server: access to the PostgreSQL databases
Physical Requirements:
Physical office for meetings
Phone for communication
Location: home or office
Slide 6 - Metrics
Planned vs. Actual: Number of tests planned vs. executed within the timeline
Test Status: Number and percentage of executed tests by status: Passed, Failed, Blocked, Unexecuted, Descoped
Requirements Coverage: Percentage of requirements covered by manual and automated tests
Open Defects by Status and Severity
Number of defects categorized as:
Open, In Progress, Resolved, Closed
Critical, High, Medium, Low
Open Risks by Risk Level
Number of open risks categorized as:
Low, Medium, High, Critical
Failure Rate
Defects per test executed
Defects per requirement tested
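As an illustration of how these figures could be derived from raw run data, here is a small Python sketch; all counts are hypothetical examples, not real results.

# Sketch: computing the recommended metrics from hypothetical run data
executed = {"Passed": 120, "Failed": 15, "Blocked": 5,
            "Unexecuted": 8, "Descoped": 2}
planned = 150
defects_found = 22
requirements_total = 40
requirements_covered = 34

total_executed = executed["Passed"] + executed["Failed"]  # tests actually run
print(f"Planned vs. actual: {total_executed}/{planned} "
      f"({total_executed / planned:.0%})")
for status, count in executed.items():
    print(f"{status}: {count} ({count / sum(executed.values()):.0%})")
print(f"Requirements coverage: {requirements_covered / requirements_total:.0%}")
print(f"Failure rate: {defects_found / total_executed:.2f} defects per test")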
Slide 7 - Test deliverables & communication
Test Plans: Outline of the testing scope and objectives
Prepared: Test Manager
Frequency: Once
Sign off: Project Manager
Test Conditions & Test Cases: Documented conditions and scenarios
Prepared: Testers/Developers
Frequency: Iteratively
Sign off: Test Manager
Test Scripts: Scripts for testing
Prepared: Testers/Developers
Frequency: Before and during test cycle
Sign off: Test Manager
Test Execution Results: Pass/fail status
Prepared: Testers/Developers
Frequency: Bi-weekly
Sign off: Test Manager
Defect Reports: List of defects with severity and status
Prepared: Testers/Developers
Frequency: As defects logged
Sign off: Test Manager
Test Progress: Summary of testing progress
Prepared: Test Manager
Frequency: Weekly
Sign off: Project Manager
Test Completion Report: Summary of completed testing and outcomes
Prepared: Test Manager
Frequency: End of testing phase
Sign off: Project Manager
Environment, Tools & Data: Details of the environment, tools, and test data
Prepared: Test Manager
Frequency: Once at the beginning, updated if needed
Sign off: Project Manager