Non-functional Requirement
Performance and Load Test
Document Version: 0.1
Document Date: 2017-05-06
Product: Mantis Bug Tracker
Title:
Authors:
Request date:
Approvers:
Requested delivery date:
Document history
Date | Version | Cause | Comments
2017-05-06 | 0.1 | Document creation | Draft
Mantis – Mantis performance and load test plan 1
© Tester Hanoi
Contents
1 INTRODUCTION
1.1 PURPOSE
1.2 TEST CONDITIONS
1.3 SCOPE
2 TEST REQUIREMENTS
2.1 REQUIREMENTS SPECIFIED BY TESTER HANOI
2.2 REQUIREMENTS TRACEABILITY MATRIX
3 TECHNOLOGY
3.1 ENVIRONMENT
3.2 INFRASTRUCTURE
3.3 TECHNOLOGY
3.4 TECHNOLOGY FOR CONTROLLED ENVIRONMENT
4 TEST APPROACH
4.1 MANTIS APPLICATION TEST DESIGN
4.2 TEST DATABASE
4.3 TEST EXECUTION DATA
4.4 TEST DATA MANAGEMENT
4.5 TEST SCRIPTS
4.5.1 Script 1: User access to Mantis, report issue
4.5.2 Script 2: User accesses Mantis and imports test cases
4.5.3 Script 3: Manual navigation on the system
4.5.4 Script 4: User accesses Mantis and creates a project
4.5.5 Script 5: User accesses Mantis and resolves a bug
4.5.6 Script 6: User accesses Mantis and deletes an issue
4.5.7 Script 7: User accesses Mantis and loads the Summary page
4.6 TEST PREPARATION
4.7 SCENARIOS
4.7.1 Scenario 1: Peak load test – Current activity
4.7.2 Scenario 2: Peak load test – Year 3 activity
5 SCHEDULE
5.1 DATES FOR TEST ACTIVITIES
6 REPORTING
6.1 PUBLISHING RESULTS
1 INTRODUCTION
1.1 PURPOSE
This document describes the test plan for the Mantis web site. The purpose of the testing is to prove that the BU10 infrastructure at Ha Noi and the Mantis application itself can process transactions to the prescribed volumes and performance levels set out below.
We will test to the extent of the infrastructure boundaries. Comparisons with the current Mantis hosted at Ha Noi will be provided as well.
Testing will prove up to 3 years' worth of user take-on and 3 years' worth of data storage, based on 20% yearly volume growth.
1.2 TEST CONDITIONS
Current and anticipated volumes leading up to this time are described below.
Peak transaction times:
80% of daily volume occurs between 3pm and 6pm, split as follows:
- 25% between 3pm and 4pm
- 25% between 4pm and 5pm
- 50% between 5pm and 6pm
Tests will be run over a 3-hour period, simulating the pattern of usage noted above. During this peak-time simulation, automated scripts will simulate requests, and additional manual tests will be run to check the speed of system response. See section 4 for further information on scripts and simulated user profiles.
1.2.1 Current volumes
- 80 000 users
- 12 500 bugs per day (peak = 25 000)
- 20 bugs per user per day
Therefore the peak load to be simulated is:
- Maximum peak usage (5pm-6pm): 40% of a peak day's bugs = 10 000 bugs/hour
- At 20 bugs per user per day, this corresponds to 500 concurrent users
1.2.2 Year 3 volumes
Based on 20% more users each year:
- 2018: 108 000 users
- 2019: 130 000 users
- 2020: 155 000 users
Based on 30% more bugs each year:
- 2018: 16 250 bugs on a normal day – 32 500 bugs on a peak day
- 2019: 21 125 bugs on a normal day – 42 250 bugs on a peak day
- 2020: 27 462 bugs on a normal day – 54 925 bugs on a peak day
To ensure that the test conditions match the worst-case volume of data, and therefore worst-case database performance, 3 years' volume of data will be created in the performance test database.
Therefore the peak load to be simulated against the performance test database will be:
- Maximum peak usage (5pm-6pm): 40% of a peak day's bugs = 22 000 bugs/hour
- At 20 bugs per user per day, this corresponds to 1 100 concurrent users
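The growth arithmetic above can be cross-checked with a short calculation. The sketch below is illustrative and not part of the plan itself: it applies the stated growth rates (20% users, 30% bugs) to the stated current volumes. The bug and peak-hour figures it produces match the projections above; the user figures it prints come out slightly lower, which suggests the user projections above assumed a slightly higher baseline.

```python
# Illustrative sketch: apply the stated yearly growth rates to the
# current volumes (80 000 users, 12 500/25 000 bugs per normal/peak day)
# and derive the year-3 peak-hour load. Not part of the test plan itself.
users = 80_000
bugs_normal = 12_500
bugs_peak = 25_000

for year in (2018, 2019, 2020):
    users = round(users * 1.20)              # 20% more users each year
    bugs_normal = round(bugs_normal * 1.30)  # 30% more bugs each year
    bugs_peak = round(bugs_peak * 1.30)
    print(f"{year}: {users} users, {bugs_normal} bugs normal, {bugs_peak} bugs peak")

# Year-3 peak hour (5pm-6pm): ~40% of a peak day's bugs
peak_hour_bugs = round(bugs_peak * 0.40, -3)
# 20 bugs per user per day -> concurrent users needed in that hour
concurrent_users = peak_hour_bugs // 20
print(f"Peak hour: {peak_hour_bugs} bugs/hour, {concurrent_users} concurrent users")
```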
1.3 SCOPE
We will test to the extent of the infrastructure boundaries.
Automated scripting will be run via the JMeter tool to check response times to requests made during peak operating times and at peak loads.
Simultaneously, manual tests will be run from standard PCs during the automated test period to measure response times for the test case creation activities described in section 4.
2 TEST REQUIREMENTS
2.1 REQUIREMENTS SPECIFIED BY TESTER HANOI
The non-functional requirements below dictate the level of performance which testing will prove.
2.2 REQUIREMENTS TRACEABILITY MATRIX
Req. ID | Requirement Description | Test Scenario
1 | Customers complete their first log in to Mantis within 5 seconds. | Scenario 1 – Script 1
2 | Load testing completes successfully on the application, proving that peak-time processing does not result in any system failure. | Scenario 1 – Script 1
3 | Mantis supports the following volumetric: 25 000 bugs/day. | Scenario 1 – Script 1
4 | 90% of users are able to access any Mantis screen within 3 seconds of logging in to Mantis. | Scenario 1 – Script 1
7 | Within a screen, any field is available instantaneously/within 0.5 seconds of the user clicking on the field. | Scenario 1 – Script 3
8 | Drop-down menus within a screen are available instantaneously/within 0.5 seconds of the user clicking the drop-down list. | Scenario 1 – Script 3
9 | Users can load and save within 1 minute. | Scenario 1 – Script 2
10 | Data committed for processing is completed within 5 seconds. | Scenario 1 – Script 1
11 | 90% of requests have a response time within 3 seconds. | Scripts 4, 5, 6, 7
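During results analysis, the pass/fail decision against these thresholds can be automated. The sketch below is illustrative only: the requirement IDs and limits are taken from the matrix above, but the `check` function and the sample measurements are hypothetical, not part of this plan.

```python
# Illustrative sketch: compare measured response times (seconds) against
# the non-functional thresholds in the traceability matrix. The
# measurements dict is hypothetical sample data.
THRESHOLDS = {
    1: 5.0,    # first login within 5 s
    4: 3.0,    # any screen within 3 s of login
    7: 0.5,    # field available within 0.5 s
    8: 0.5,    # drop-down within 0.5 s
    9: 60.0,   # load and save within 1 minute
    10: 5.0,   # data commit within 5 s
}

def check(measurements):
    """Return {req_id: 'PASS'/'FAIL'} for each measured requirement."""
    return {req: ("PASS" if t <= THRESHOLDS[req] else "FAIL")
            for req, t in measurements.items()}

sample = {1: 4.2, 4: 3.4, 7: 0.3, 10: 5.0}
print(check(sample))   # requirement 4 exceeds its 3 s limit
```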
3 TECHNOLOGY
3.1 ENVIRONMENT
Tester Hanoi will test in the production environment. This environment will simulate the server
configurations required to carry out effective and realistic testing.
Elements of this environment that will be used in the test:
● Web -> SUN X6240: 4 CPUs, 8 cores, 8 GB RAM. There will be no other load at the time of the test; these are usually standby servers for the cyberstation DRP.
● DB -> SUN X4640: 8 CPUs, 6 cores, 128 GB RAM (used by other MySQL customers; it is a standby box for cyberstation).
● A SUN M4000 will be used to run the JMeter scripts: 2 CPUs, 8 cores, 16 GB RAM (used only for this test).
3.2 INFRASTRUCTURE
3.3 TECHNOLOGY
For Script 3, manual navigation will be performed on the system while script 1 is running.
3.4 TECHNOLOGY FOR CONTROLLED ENVIRONMENT
Simulating a user PC for screen refresh and print queuing performance. This environment will be as described in section 3.3 above.
4 TEST APPROACH
4.1 MANTIS APPLICATION TEST DESIGN
Tester Hanoi will initially test to the maximum volumes described in section 1.2. Should any test prove that these volumes cannot be met, the volumes will be reduced to the point at which the system has capacity. In that case, a plan of activities and timescales to scale up to meet the requirements will follow the test reporting.
Automated test scripting will be employed – see section 4.5.
4.2 TEST DATABASE
The MANTIS performance test database will contain the latest available database schema and the
most up to date reference data available at the start of the performance test cycle. Exact versions will
be reported in the performance test report and will be configuration managed by the Mantis release
control board.
The performance test database will contain each scenario’s required number of historical transactional
data volumes as described in section 1.2 Test conditions.
This historic data is spread evenly across users and days, i.e. there will be 20 items per working day attributed to each of the users over the required period.
4.3 TEST EXECUTION DATA
Parameterised test data will be supplied by the JMeter tool.
The tool will simulate logins by unique users up to the concurrent numbers required by the test scenario. For example, if the test requires 700 concurrent users, then 700 unique user accounts (including login names and passwords) will be used during the test.
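In practice, JMeter typically reads such parameterised credentials from a CSV file consumed by its CSV Data Set Config element. The sketch below generates one; the account naming scheme and file name are illustrative assumptions, not values defined in this plan.

```python
# Illustrative sketch: generate a CSV of unique test accounts for
# JMeter's CSV Data Set Config. The naming scheme (perfuser_NNNN) and
# file name are assumptions, not values from this plan.
import csv

def write_user_file(path, count):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i in range(1, count + 1):
            # one unique login/password pair per simulated concurrent user
            writer.writerow([f"perfuser_{i:04d}", f"Secret_{i:04d}"])

write_user_file("mantis_users.csv", 700)   # 700 concurrent users -> 700 accounts
```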
4.4 TEST DATA MANAGEMENT
There will be no backup/restore procedure between tests as the system is designed to be added to on
a daily basis. The exception is the unlikely event whereby a test corrupts or adversely affects the
profile of data required.
4.5 TEST SCRIPTS
4.5.1 Script 1: User access to Mantis, report issue
Software and hardware
Tool: JMeter
Machine: Mantis production environment, see sections 3.1 and 3.2
Basic script steps
This script should take around one hour to execute. It concerns one specific user with its own login and password taken from an Excel file.
Step | Action | JMeter command
1 | Reach the login page | JMeter sends an HTTP request with the Mantis URL only and captures the loading time.
2 | Log in | JMeter sends an HTTP request to Mantis to simulate the click on the [Log in] button, sends a username and password as parameters, and captures the log-in time.
3 | Load the Report Issue screen | JMeter sends an HTTP request to Mantis to simulate the click on the project's Report Issue link, and captures the Report Issue loading time.
4 | Create an issue | JMeter sends an HTTP request to Mantis to simulate entering data for the fields as parameters and clicking the [Submit Report] button, and captures the issue saving time.
5 | Display the saved issue in the View Issues page | JMeter sends an HTTP request to Mantis and captures the current issue list loading time.
6 | Repeat step 4 20 times | JMeter loads the user profile with 20 test cases.
7 | Log out | JMeter sends an HTTP request to Mantis to simulate the click on the [Logout] link and captures the log-out processing time.
Each execution reads sequentially from a file containing a list of 700 different users, so each script execution is performed as a unique user.
Profile of user
Users with role Reporter.
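The per-step time capture performed by JMeter can be illustrated with a small timing wrapper. This is a stand-in sketch only: the step names mirror the table above, but the wrapper and the simulated action are illustrative and issue no real requests to Mantis.

```python
# Illustrative sketch: record per-step elapsed times, mirroring what
# JMeter captures for each HTTP request in script 1. A short sleep
# stands in for the real request to Mantis.
import time

def timed(step_name, action, results):
    """Run one step and store its elapsed time (seconds) under its name."""
    start = time.perf_counter()
    action()
    results[step_name] = time.perf_counter() - start

results = {}
for step in ["reach login page", "log in", "load report issue screen",
             "create issue", "view issues", "log out"]:
    timed(step, lambda: time.sleep(0.01), results)  # stand-in action

for step, elapsed in results.items():
    print(f"{step}: {elapsed:.3f}s")
```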
4.5.2 Script 2: User accesses Mantis and imports test cases
Software and hardware
Tool: JMeter
Machine: Mantis production environment, see sections 3.1 and 3.2
Basic script steps
Step | Action | JMeter command
1 | Reach the login page | N/A (captured in script 1)
2 | Log in | N/A (captured in script 1)
3 | Import 1000 test cases | JMeter sends an HTTP request to Mantis to simulate clicking the project link -> Requirement, selecting a requirement, selecting Action -> Import, and captures the importing time.
4 | Load the imported test cases list | JMeter sends an HTTP request to Mantis and captures the current test cases list loading time.
5 | Log out | N/A (captured in script 1)
Profile of user
Users with role Reporter.
4.5.3 Script 3: Manual navigation on the system
Software and hardware
Tool: Internet Explorer
Machine: PC; for the technical specification, see section 3.4.
Basic script steps
This script should take between 5 and 10 minutes to execute. It concerns one specific user with its own login and password.
Step | Action | Time to capture manually
1 | Reach the login page | N/A (captured in script 1)
2 | Log in | N/A (captured in script 1)
3 | Navigate to the View Issues screen | Time to load the screen.
4 | Click Report Issue | Time to load the Report Issue screen.
5 | Enter the report details | N/A
6 | Click the Submit Report button | Time to save the issue.
7 | Log out of the application | N/A (captured in script 1)
4.5.4 Script 4: User accesses Mantis and creates a project
Software and hardware
Tool: JMeter
Machine: Mantis production environment, see sections 3.1 and 3.2
Basic script steps
Step | Action | JMeter command
1 | Reach the login page | N/A (captured in script 1)
2 | Log in | N/A (captured in script 1)
3 | Load the Manage page | JMeter sends an HTTP request to click the Manage link and captures the loading time.
4 | Load the Manage Project page | JMeter sends an HTTP request to click the Manage Project link and captures the Manage Project page loading time.
5 | Load the Create New Project screen | JMeter sends an HTTP request to click the [Create New Project] button and captures the Create Project page loading time.
6 | Save the project | JMeter sends an HTTP request to Mantis to simulate entering data for the fields as parameters and clicking the [Add Project] button, and captures the project saving time.
7 | Log out | N/A (captured in script 1)
Profile of user
Users with role Administrator.
4.5.5 Script 5: User accesses Mantis and resolves a bug
Software and hardware
Tool: JMeter
Machine: Mantis production environment, see sections 3.1 and 3.2
Basic script steps
Step | Action | JMeter command
1 | Reach the login page | N/A (captured in script 1)
2 | Log in | N/A (captured in script 1)
3 | Load the View Issues page | N/A (captured in script 3)
4 | Load the View Issue Details page | JMeter sends an HTTP request to select a project, clicks the issue id link, and captures the View Issue Details page loading time.
5 | Load the Create Task screen | JMeter sends an HTTP request to click the [Create Task] button and captures the Create Task page loading time.
6 | Load the Resolve Issue page | JMeter sends an HTTP request to select the resolved status, clicks the [Change Status To] button, and captures the Resolve Issue page loading time.
7 | Resolve the issue | JMeter sends an HTTP request to Mantis to simulate entering data for the fields as parameters and clicking the [Resolve Issue] button, and captures the resolve saving time.
8 | Log out | N/A (captured in script 1)
Profile of user
Users with role Tester.
4.5.6 Script 6: User accesses Mantis and deletes an issue
Software and hardware
Tool: JMeter
Machine: Mantis production environment, see sections 3.1 and 3.2
Basic script steps
Step | Action | JMeter command
1 | Reach the login page | N/A (captured in script 1)
2 | Log in | N/A (captured in script 1)
3 | Load the View Issues page | N/A (captured in script 3)
4 | Load the View Issue Details page | N/A (captured in script 5)
5 | Delete the issue | JMeter sends an HTTP request to click the [Delete] button, confirms, and captures the delete issue processing time.
6 | Log out | N/A (captured in script 1)
Profile of user
Users with role Tester.
4.5.7 Script 7: User accesses Mantis and loads the Summary page
Software and hardware
Tool: JMeter
Machine: Mantis production environment, see sections 3.1 and 3.2
Basic script steps
Step | Action | JMeter command
1 | Reach the login page | N/A (captured in script 1)
2 | Log in | N/A (captured in script 1)
3 | Load the Summary page | JMeter sends an HTTP request to click the Summary link and captures the Summary page loading time.
4 | Log out | N/A (captured in script 1)
Profile of user
Users with role Tester.
4.6 TEST PREPARATION
- Ensure the environment is managed according to Mantis release management principles.
- Ensure the infrastructure and test tools scale sufficiently for the tests to be performed to give meaningful results.
- Ensure adequate planning and resourcing.
4.7 SCENARIOS
4.7.1 Scenario 1: Peak load test – Current activity
Objectives
We will run scripts 1, 2 and 3 concurrently. The goal is to demonstrate:
1) How the infrastructure manages the volume of activity in the peak period,
2) Whether large-volume test case imports are still possible and how long they take,
3) How the user interface responds.
Volumes
The number of concurrent users to be simulated in this period is 500, divided as follows:
- 250 users executing script 1 between 3pm and 5pm,
- 250 users executing script 1 between 5pm and 6pm,
- 20 users executing script 2 between 5pm and 6pm,
- 1 user executing script 3 at 2:30pm, 3:30pm, 4:30pm, 5:30pm and 6:30pm (manual).
Scenario schedule for automated scripts
Period | Action | Test cases per user | Max. no. of concurrent users reached | Total test cases
Between 15:00 and 16:00 | Launch script 1 every 3 seconds | 20 | 250 | 5 000
Between 16:00 and 17:00 | Launch script 1 every 3 seconds | 20 | 250 | 5 000
Between 17:00 and 18:00 | Launch script 1 every 1.5 seconds | 20 | 500 | 10 000
Between 17:00 and 18:00 | Launch script 2 every 8 seconds | 1 | 20 | 20
Mantis – Mantis performance and load test plan 15
© Tester Hanoi
At the end of the loading period (3 hours) we will have:
- 1 475 different users that used the system,
- 28 000 bugs processed (excluding the special case of the test case import).
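The schedule totals can be cross-checked with a short calculation (test cases generated per period = maximum concurrent users × test cases per user). This is an illustrative sketch using the figures from the table above; the scripted grand total comes to 20 020, so the 28 000 bugs quoted above presumably includes activity beyond these automated launches.

```python
# Illustrative sketch: cross-check the scenario 1 schedule totals.
# Test cases generated = max concurrent users x test cases per user.
schedule = [
    # (period, script, test_cases_per_user, max_concurrent_users)
    ("15:00-16:00", "script 1", 20, 250),
    ("16:00-17:00", "script 1", 20, 250),
    ("17:00-18:00", "script 1", 20, 500),
    ("17:00-18:00", "script 2", 1, 20),
]
totals = {f"{period} {script}": per_user * users
          for period, script, per_user, users in schedule}
grand_total = sum(totals.values())
print(totals)
print(f"Scripted test cases over 3 hours: {grand_total}")
```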
4.7.2 Scenario 2: Peak load test – Year 3 activity
Objectives
We will run scripts 1, 2 and 3 concurrently. The goal is to demonstrate:
1) How the infrastructure manages the volume of activity in the peak period,
2) Whether large-volume bug imports are still possible and how long they take,
3) How the user interface responds.
Volumes
The number of concurrent users to be simulated in this period is 1 100, divided as follows:
- 550 users executing script 1 between 3pm and 5pm,
- 1 100 users executing script 1 between 5pm and 6pm,
- 20 users executing script 2 between 5pm and 6pm,
- 1 user executing script 3 at 2:30pm, 3:30pm, 4:30pm, 5:30pm and 6:30pm (manual).
Scenario schedule for automated scripts
Period | Action | Test cases per user | Max. no. of concurrent users reached | Total test cases
Between 15:00 and 16:00 | Launch script 1 every 3 seconds | 20 | 550 | 11 000
Between 16:00 and 17:00 | Launch script 1 every 3 seconds | 20 | 550 | 11 000
Between 17:00 and 18:00 | Launch script 1 every 1.5 seconds | 20 | 1 100 | 22 000
Between 17:00 and 18:00 | Launch script 2 every 8 seconds | 1 | 20 | 20
5 SCHEDULE
5.1 DATES FOR TEST ACTIVITIES
The test execution activities are currently planned for the following dates:
6 REPORTING
6.1 PUBLISHING RESULTS
Test results will be analysed and published in a performance test report. This report will contain, as a minimum, the following:
Transaction response times
Automated page response times
Minimum, maximum, average and 90th-percentile response times for all page requests/responses automated by JMeter.
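These summary statistics can be computed directly from the raw sample times exported by JMeter. The sketch below is illustrative: it uses the nearest-rank definition of the 90th percentile and hypothetical sample data, not measurements from this test.

```python
# Illustrative sketch: summarise page response times the way the report
# will (min, max, average, 90th percentile). Sample times are invented.
import statistics

def summarise(times):
    ordered = sorted(times)
    # nearest-rank 90th percentile: smallest value covering 90% of samples
    idx = max(0, -(-len(ordered) * 90 // 100) - 1)   # ceil(0.9*n) - 1
    return {
        "min": ordered[0],
        "max": ordered[-1],
        "avg": statistics.mean(ordered),
        "p90": ordered[idx],
    }

sample = [1.2, 0.8, 2.9, 1.1, 3.4, 0.9, 1.0, 2.2, 1.5, 1.3]
print(summarise(sample))
```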
Manual response times
This will include the end-user response time, which includes the test PC's browser rendering time for each page. The measurements will include:
1. Page response times
2. All 'in page' drop-down selection times
3. Time taken to tab between fields on each page
4. Data commit time
Resource Utilisation
Resource graphs will be created and analysed for all servers described in “Section 3.2 Infrastructure”,
specifically the Mantis web and database servers.
The graphs will include key indicators for CPU, memory and disk. Analysis will be provided to explain whether the graphs show any cause for concern and where there are any issues. For example, if CPU resource utilisation is high or a spike is seen, the process causing this will be determined and an explanation provided.
Example reporting graphs that will form part of the final analysis are below: