JMeter Tutorials
Click on the section name to go straight to the section. Click on the "+" to go to the relevant section
of the detailed section list, where you can select individual subsections.
Section Summary
Changes
+ ... 1. Introduction
+ ... 2. Getting Started
+ ... 3. Building a Test Plan
+ ... 4. Elements of a Test Plan
+ ... 5. Building a Web Test Plan
+ ... 6. Building an Advanced Web Test Plan
+ ... 7. Building a Database Test Plan
+ ... 8. Building an FTP Test Plan
+ ... 9a. Building an LDAP Test Plan
+ ... 9b. Building an Extended LDAP Test Plan
+ ... 10. Building a Webservice Test Plan
+ ... 11. Building a JMS Point to point Test Plan
+ ... 12. Building a JMS Topic Test Plan
+ ... 13. Building a Monitor Test Plan
+ ... 14. Listeners
+ ... 15. Remote Testing
+ ... 16. Best Practices
+ ... 17. Help! My boss wants me to load test our web app!
+ ... 18. Component Reference
+ ... 19. Functions
+ ... 20. Regular Expressions
+ ... 21. Hints and Tips
1. Introduction
Apache JMeter is a 100% pure Java desktop application designed to load test client/server
software (such as a web application). It may be used to test performance both on static and
dynamic resources such as static files, Java Servlets, CGI scripts, Java objects, databases,
FTP servers, and more. JMeter can be used to simulate a heavy load on a server, network or
object to test its strength or to analyze overall performance under different load types.
Additionally, JMeter can help you regression test your application by letting you create test
scripts with assertions to validate that your application is returning the results you expect.
For maximum flexibility, JMeter lets you create these assertions using regular expressions.
1.1 History
Stefano Mazzocchi of the Apache Software Foundation was the original developer of
JMeter. He wrote it primarily to test the performance of Apache JServ (a project that has
since been replaced by the Apache Tomcat project). We redesigned JMeter to enhance the
GUI and to add functional-testing capabilities.
If you want to perform JDBC testing, then you will, of course, need the appropriate JDBC
driver from your vendor. JMeter does not come with any JDBC drivers.
JMeter includes the JMS API jar, but does not include a JMS client implementation. If you
want to run JMS tests, you will need to download the appropriate jars from the JMS
provider.
Next, start JMeter and go through the Building a Test Plan section of the User Guide to
familiarize yourself with JMeter basics (for example, adding and removing elements).
Finally, go through the appropriate section on how to build a specific type of Test Plan. For
example, if you are interested in testing a Web application, then see the section Building a
Web Test Plan. The other specific Test Plan sections are:
Advanced Web Test Plan
JDBC
FTP
JMS Point-to-Point
JMS Topic
LDAP
LDAP Extended
WebServices (SOAP)
Once you are comfortable with building and running JMeter Test Plans, you can look into
the various configuration elements (timers, listeners, assertions, and others) which give you
more control over your Test Plans.
2.1 Requirements
JMeter requires that your computing environment meets some minimum requirements.
2.2 Optional
If you plan on doing JMeter development, then you will need one or more optional packages listed
below.
JMeter HTTP defaults to protocol level TLS. This can be changed by editing the JMeter
property "https.default.protocol" in jmeter.properties or user.properties.
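As a sketch, the override can simply be appended to user.properties in the JMeter bin directory (the protocol value TLSv1.2 below is an assumed example; use whichever level your server requires):

```shell
# Append the property override to user.properties
# (TLSv1.2 is an assumed example value, not a recommendation)
echo "https.default.protocol=TLSv1.2" >> user.properties
```

JMeter reads user.properties at startup, so the change takes effect on the next launch.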
The JMeter HTTP samplers are configured to accept all certificates, whether trusted
or not, regardless of validity periods etc. This is to allow the maximum flexibility in
testing servers.
2.3 Installation
We recommend that most users run the latest release.
To install a release build, simply unzip the zip/tar file into the directory where you want
JMeter to be installed. Provided that you have a JRE/JDK correctly installed and the
JAVA_HOME environment variable set, there is nothing more for you to do.
Note: there can be problems (especially with client-server mode) if the directory path
contains any spaces.
The installation directory structure should look something like this (for version 2.3.1):
jakarta-jmeter-2.3.1
jakarta-jmeter-2.3.1/bin
jakarta-jmeter-2.3.1/docs
jakarta-jmeter-2.3.1/extras
jakarta-jmeter-2.3.1/lib/
jakarta-jmeter-2.3.1/lib/ext
jakarta-jmeter-2.3.1/lib/junit
jakarta-jmeter-2.3.1/printable_docs
You can rename the parent directory (i.e. jakarta-jmeter-2.3.1) if you want, but do not
change any of the sub-directory names.
To run JMeter, run the jmeter.bat (for Windows) or jmeter (for Unix) file. These files are
found in the bin directory. After a short pause, the JMeter GUI should appear.
There are some additional scripts in the bin directory that you may find useful (the
Windows .CMD files require Win2K or later):
Note: the special name LAST can be used with jmeter-n.cmd, jmeter-t.cmd and
jmeter-n-r.cmd, and means the last test plan that was run interactively.
The environment variable JVM_ARGS can be used to override JVM settings in the
jmeter.bat script. For example:
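A sketch of such an override on Windows (the heap sizes are assumed illustrative values, not recommendations):

```shell
set JVM_ARGS=-Xms1024m -Xmx1024m
jmeter -t test.jmx
```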
jmeter - run JMeter (in GUI mode by default); defines some JVM settings which may not work for all JVMs
jmeter-server - start JMeter in server mode (calls the jmeter script with appropriate parameters)
jmeter.sh - very basic JMeter script with no JVM options specified
mirror-server.sh - runs the JMeter Mirror Server in non-GUI mode
shutdown.sh - runs the Shutdown client to stop a non-GUI instance gracefully
stoptest.sh - runs the Shutdown client to stop a non-GUI instance abruptly
It may be necessary to edit the jmeter shell script if some of the JVM options are not
supported by the JVM you are using. The JVM_ARGS environment variable can be used to
override or set additional JVM options, for example:
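For instance, on Unix the variable can be set for a single invocation (the heap settings below are assumed illustrative values):

```shell
JVM_ARGS="-Xms512m -Xmx512m" jmeter -t test.jmx
```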
If you don't want to put JMeter extension jars in the lib/ext directory, then define the
property search_paths in jmeter.properties. Do not use lib/ext for utility jars; it is only
intended for JMeter components.
Other jars (such as JDBC or JMS implementations and any other support libraries needed by
the JMeter code) should be placed in the lib directory - not the lib/ext directory.
You can also install utility Jar files in $JAVA_HOME/jre/lib/ext, or (since 2.1.1) you can
set the property user.classpath in jmeter.properties.
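A hedged sketch of what such property settings might look like in jmeter.properties (the jar paths below are hypothetical examples):

```shell
# jmeter.properties fragment (paths are hypothetical examples)
# Extra location(s) to search for JMeter plugin components:
search_paths=/opt/jmeter-plugins/myplugin.jar
# Utility jars/classes to add to the classpath:
user.classpath=../classes;../lib/utility.jar
```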
Note that setting the CLASSPATH environment variable will have no effect. This is
because JMeter is started with "java -jar", and the java command silently ignores the
CLASSPATH variable, and the -classpath/-cp options when -jar is used. [This occurs with
all Java programs, not just JMeter.]
-r Run the test in the servers specified by the JMeter property "remote_hosts"
-R [list of remote servers] Run the test in the specified remote servers
The script also lets you specify the optional firewall/proxy server information:
If you want the server to exit after a single test has been run, then define the JMeter property
server.exitaftertest=true.
To run the test from the client in non-GUI mode, use the following command:
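A minimal sketch of such an invocation (the test plan and results file names are assumed examples):

```shell
# -n: non-GUI mode; -t: test plan; -r: start the servers in remote_hosts;
# -l: file to log sample results to
jmeter -n -t remote-test.jmx -r -l results.jtl
```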
-G[propertyfile] - defines a file containing JMeter properties to be sent to all remote servers.
The -L flag can also be used without the category name to set the root logging level.
Examples:
jmeter -Duser.dir=/home/mstover/jmeter_stuff \
-Jremote_hosts=127.0.0.1 -Ljmeter.engine=DEBUG
jmeter -LDEBUG
N.B.
The command line properties are processed early in startup, but after the logging
system has been set up. Attempts to use the -J flag to update log_level or log_file
properties will have no effect.
If JMeter detects an error during a test, a message will be written to the log file. The log file
name is defined in the jmeter.properties file (or using the -j option, see below). It defaults to
jmeter.log, and will be found in the directory from which JMeter was launched.
JMeter versions after 2.2 added a new command-line option, -j jmeterlogfile. This is
processed after the initial properties file is read, and before any further properties are
processed. It therefore allows the default of jmeter.log to be overridden. The jmeter scripts
that take a test plan name as a parameter (e.g. jmeter-n.cmd) have been updated to define the
log file using the test plan name, e.g. for the test plan Test27.jmx the log file is set to
Test27.log.
When running on Windows, the file may appear as just jmeter unless you have set
Windows to show file extensions. [Which you should do anyway, to make it easier to detect
viruses and other nasties that pretend to be text files...]
As well as recording errors, the jmeter.log file records some information about the test run.
For example:
The log file can be helpful in determining the cause of an error, as JMeter does not interrupt
a test to display an error dialogue.
-h, --help
print usage information and exit
-v, --version
print the version information and exit
-p, --propfile {argument}
the jmeter property file to use
-q, --addprop {argument}
additional property file(s)
-t, --testfile {argument}
the jmeter test(.jmx) file to run
-j, --jmeterlogfile {argument}
the jmeter log file
-l, --logfile {argument}
the file to log samples to
-n, --nongui
run JMeter in nongui mode
-s, --server
run the JMeter server
-H, --proxyHost {argument}
Set a proxy server for JMeter to use
-P, --proxyPort {argument}
Set proxy server port for JMeter to use
-u, --username {argument}
Set username for proxy server that JMeter is to use
-a, --password {argument}
Set password for proxy server that JMeter is to use
-J, --jmeterproperty {argument}={value}
Define additional JMeter properties
-G, --globalproperty {argument}[={value}]
Define Global properties (sent to servers)
e.g. -Gport=123
or -Gglobal.properties
-D, --systemproperty {argument}={value}
Define additional System properties
-S, --systemPropertyFile {argument}
a property file to be added as System properties
-L, --loglevel {argument}={value}
Define loglevel: [category=]level
e.g. jorphan=INFO or jmeter.util=DEBUG
-r, --runremote (non-GUI only)
Start remote servers (as defined by the jmeter property
remote_hosts)
-R, --remotestart server1,... (non-GUI only)
Start these remote servers (overrides remote_hosts)
-d, --homedir {argument}
the jmeter home directory to use
-X, --remoteexit
Exit the remote servers at end of test (non-GUI)
Note: the JMeter log file name is formatted as a SimpleDateFormat (applied to the current
date) if it contains paired single-quotes, e.g. 'jmeter_'yyyyMMddHHmmss'.log'
If the special name LAST is used for the -t, -j or -l flags, then JMeter takes that to mean the
last test plan that was run in interactive mode.
Parameters
The command line options and properties files are processed in the following order:
-p propfile
jmeter.properties (or the file from the -p option) is then loaded
-j logfile
Logging is initialised
user.properties is loaded
system.properties is loaded
all other command-line options are processed
To save tree elements, right click on an element and choose the "Save Selection As ..."
option. JMeter will save the element selected, plus all child elements beneath it. In this way,
you can save test tree fragments and individual elements for later use.
Versions of JMeter after 2.3.2 allow a Stop to be initiated if Shutdown is taking too long.
Close the Shutdown dialog box and select Run/Stop, or just press Control + '.'.
When running JMeter in non-GUI mode, there is no Menu, and JMeter does not react to
keystrokes such as Control + '.'. So in versions after 2.3.2, JMeter non-GUI mode will listen
for commands on a specific port (default 4445, see the JMeter property
jmeterengine.nongui.port). The commands currently supported are:
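The bundled shutdown and stoptest scripts send these commands to the running instance. A sketch of their use from the bin directory (the default port 4445 is assumed):

```shell
./shutdown.sh    # request a graceful stop: threads stop at the end of any current work
./stoptest.sh    # request an immediate (abrupt) stop
```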
If you are not recording the data to file, this option makes no difference.
You can also use the Configuration button on a listener to decide what fields to save.
4.1 ThreadGroup
Thread group elements are the beginning points of any test plan. All controllers and
samplers must be under a thread group. Other elements, e.g. Listeners, may be placed
directly under the test plan, in which case they will apply to all the thread groups. As the
name implies, the thread group element controls the number of threads JMeter will use to
execute your test. The controls for a thread group allow you to:
Set the number of threads
Set the ramp-up period
Set the number of times to execute the test
Each thread will execute the test plan in its entirety and completely independently of other
test threads. Multiple threads are used to simulate concurrent connections to your server
application.
The ramp-up period tells JMeter how long to take to "ramp-up" to the full number of threads
chosen. If 10 threads are used, and the ramp-up period is 100 seconds, then JMeter will take
100 seconds to get all 10 threads up and running. Each thread will start 10 (100/10) seconds
after the previous thread was begun. If there are 30 threads and a ramp-up period of 120
seconds, then each successive thread will be delayed by 4 seconds.
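The ramp-up arithmetic above can be sketched as a small calculation (whole-second delays assumed):

```shell
# Per-thread start delay = ramp-up period / number of threads
threads=30
rampup=120
delay=$((rampup / threads))
echo "Each of the $threads threads starts ${delay}s after the previous one."
```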
Ramp-up needs to be long enough to avoid too large a work-load at the start of a test, and
short enough that the last threads start running before the first ones finish (unless one wants
that to happen).
By default, the thread group is configured to loop once through its elements.
Version 1.9 introduces a test run scheduler. Click the checkbox at the bottom of the Thread
Group panel to reveal extra fields in which you can enter the start and end times of the run.
When the test is started, JMeter will wait if necessary until the start-time has been reached.
At the end of each cycle, JMeter checks whether the end-time has been reached; if so, the run
is stopped, otherwise the test is allowed to continue until the iteration limit is reached.
Alternatively, one can use the relative delay and duration fields. Note that delay overrides
start-time, and duration overrides end-time.
4.2 Controllers
JMeter has two types of Controllers: Samplers and Logical Controllers. These drive the
processing of a test.
Samplers tell JMeter to send requests to a server. For example, add an HTTP Request
Sampler if you want JMeter to send an HTTP request. You can also customize a request by
adding one or more Configuration Elements to a Sampler. For more information, see
Samplers.
Logical Controllers let you customize the logic that JMeter uses to decide when to send
requests. For example, you can add an Interleave Logic Controller to alternate between two
HTTP Request Samplers. For more information, see Logical Controllers.
4.2.1 Samplers
Samplers tell JMeter to send requests to a server and wait for a response. They are processed
in the order they appear in the tree. Controllers can be used to modify the number of
repetitions of a sampler.
FTP Request
HTTP Request
JDBC Request
Java object request
LDAP Request
SOAP/XML-RPC Request
WebService (SOAP) Request
Each sampler has several properties you can set. You can further customize a sampler by
adding one or more Configuration Elements to the Test Plan.
If you are going to send multiple requests of the same type (for example, HTTP Request) to
the same server, consider using a Defaults Configuration Element. Each controller has one
or more Defaults elements (see below).
Remember to add a Listener to your test plan to view and/or store the results of your
requests to disk.
If you are interested in having JMeter perform basic validation on the response of your
request, add an Assertion to the sampler. For example, in stress testing a web application,
the server may return a successful "HTTP Response" code, but the page may have errors on
it or may be missing sections. You could add assertions to check for certain HTML tags,
common error strings, and so on. JMeter lets you create these assertions using regular
expressions.
To understand the effect of Logic Controllers on a test plan, consider the following test tree:
Test Plan
o Thread Group
    Once Only Controller
        Login Request (an HTTP Request)
    Load Search Page (an HTTP Request)
    Interleave Controller
        Search "A" (an HTTP Request)
        Search "B" (an HTTP Request)
        HTTP Request Defaults (a Configuration Element)
    HTTP Request Defaults (a Configuration Element)
    HTTP Cookie Manager (a Configuration Element)
The first thing to note about this test is that the login request will be executed only the first
time through. Subsequent iterations will skip it. This is due to the effects of the Once Only
Controller.
After the login, the next Sampler loads the search page (imagine a web application where
the user logs in, and then goes to a search page to do a search). This is just a simple request,
not filtered through any Logic Controller.
After loading the search page, we want to do a search. Actually, we want to do two different
searches. However, we want to re-load the search page itself between each search. We could
do this by having 4 simple HTTP request elements (load search, search "A", load search,
search "B"). Instead, we use the Interleave Controller, which passes on one child request
each time through the test. It keeps the ordering (i.e. it doesn't pass one on at random, but
"remembers" its place) of its child elements. Interleaving 2 child requests may be overkill,
but there could easily have been 8 or 20 child requests.
Note the HTTP Request Defaults that belongs to the Interleave Controller. Imagine that
"Search A" and "Search B" share the same PATH info (an HTTP request specification
includes domain, port, method, protocol, path, and arguments, plus other optional items).
This makes sense - both are search requests, hitting the same back-end search engine (a
servlet or cgi-script, let's say). Rather than configure both HTTP Samplers with the same
information in their PATH field, we can abstract that information out to a single
Configuration Element. When the Interleave Controller "passes on" requests from "Search
A" or "Search B", it will fill in the blanks with values from the HTTP default request
Configuration Element. So, we leave the PATH field blank for those requests, and put that
information into the Configuration Element. In this case, this is a minor benefit at best, but it
demonstrates the feature.
The next element in the tree is another HTTP default request, this time added to the Thread
Group itself. The Thread Group has a built-in Logic Controller, and thus, it uses this
Configuration Element exactly as described above. It fills in the blanks of any Request that
passes through. It is extremely useful in web testing to leave the DOMAIN field blank in all
your HTTP Sampler elements, and instead, put that information into an HTTP default
request element, added to the Thread Group. By doing so, you can test your application on a
different server simply by changing one field in your Test Plan. Otherwise, you'd have to
edit each and every Sampler.
The last element is an HTTP Cookie Manager. A Cookie Manager should be added to all
web tests - otherwise JMeter will ignore cookies. By adding it at the Thread Group level, we
ensure that all HTTP requests will share the same cookies.
Logic Controllers can be combined to achieve various results. See the list of built-in Logic
Controllers.
4.3 Listeners
Listeners provide access to the information JMeter gathers about the test cases while JMeter
runs. The Graph Results listener plots the response times on a graph. The "View Results
Tree" Listener shows details of sampler requests and responses, and can display basic
HTML and XML representations of the response. Other listeners provide summary or
aggregation information.
Additionally, listeners can direct the data to a file for later use. Every listener in JMeter
provides a field to indicate the file to store data to. There is also a Configuration button
which can be used to choose which fields to save, and whether to use CSV or XML format.
Note that all Listeners save the same data; the only difference is in the way the data is
presented on the screen.
Listeners can be added anywhere in the test, including directly under the test plan. They will
collect data only from elements at or below their level.
4.4 Timers
By default, a JMeter thread sends requests without pausing between each request. We
recommend that you specify a delay by adding one of the available timers to your Thread
Group. If you do not add a delay, JMeter could overwhelm your server by making too many
requests in a very short amount of time.
The timer will cause JMeter to delay a certain amount of time before each sampler which is
in its scope.
If you choose to add more than one timer to a Thread Group, JMeter takes the sum of the
timers and pauses for that amount of time before executing the samplers to which the timers
apply. Timers can be added as children of samplers or controllers in order to restrict the
samplers to which they are applied.
To provide a pause at a single place in a test plan, one can use the Test Action Sampler.
4.5 Assertions
Assertions allow you to assert facts about responses received from the server being tested.
Using an assertion, you can essentially "test" that your application is returning the results
you expect it to.
For instance, you can assert that the response to a query will contain some particular text.
The text you specify can be a Perl-style regular expression, and you can indicate that the
response is to contain the text, or that it should match the whole response.
You can add an assertion to any Sampler. For example, you can add an assertion to an HTTP
Request that checks for the text "</HTML>". JMeter will then check that the text is present
in the HTTP response. If JMeter cannot find the text, then it will mark this as a failed
request.
Note that an assertion applies to all samplers which are in its scope. To restrict the assertion
to a single sampler, add the assertion as a child of the sampler.
To view the assertion results, add an Assertion Listener to the Thread Group. Failed
assertions will also show up in the Tree View and Table Listeners, and will count towards
the error percentage, for example in the Aggregate and Summary reports.
A configuration element is accessible from only inside the tree branch where you place the
element. For example, if you place an HTTP Cookie Manager inside a Simple Logic
Controller, the Cookie Manager will only be accessible to HTTP Request Controllers you
place inside the Simple Logic Controller (see figure 1). The Cookie Manager is accessible to
the HTTP requests "Web Page 1" and "Web Page 2", but not "Web Page 3".
Also, a configuration element inside a tree branch has higher precedence than the same element in
a "parent" branch. For example, we defined two HTTP Request Defaults elements, "Web Defaults
1" and "Web Defaults 2". Since we placed "Web Defaults 1" inside a Loop Controller, only "Web
Page 2" can access it. The other HTTP requests will use "Web Defaults 2", since we placed it in the
Thread Group (the "parent" of all other branches).
Given the tree:
Controller
o Post-Processor 1
o Sampler 1
o Sampler 2
o Timer 1
o Assertion 1
o Pre-Processor 1
o Timer 2
o Post-Processor 2
the elements are executed in this order:
Pre-Processor 1
Timer 1
Timer 2
Sampler 1
Post-Processor 1
Post-Processor 2
Assertion 1
Pre-Processor 1
Timer 1
Timer 2
Sampler 2
Post-Processor 1
Post-Processor 2
Assertion 1
Some controllers affect the order of their subelements, and you can read about these specific
controllers in the component reference.
Other elements are hierarchical. An Assertion, for instance, is hierarchical in the test tree. If its
parent is a request, then it is applied to that request. If its parent is a Controller, then it affects all
requests that are descendants of that Controller. In the following test tree:
Hierarchy example
Assertion #1 is applied only to Request One, while Assertion #2 is applied to Requests Two
and Three.
complex example
In this example, the requests are named to reflect the order in which they will be executed.
Timer #1 will apply to Requests Two, Three, and Four (notice how order is irrelevant for
hierarchical elements). Assertion #1 will apply only to Request Three. Timer #2 will affect
all the requests.
Hopefully these examples make it clear how configuration (hierarchical) elements are
applied. If you imagine each Request being passed up the tree branches, to its parent, then to
its parent's parent, etc, and each time collecting all the configuration elements of that parent,
then you will see how it works.
Properties are global to JMeter, and are mostly used to define some of the defaults JMeter
uses. For example, the property remote_hosts defines the servers that JMeter will try to run
remotely. Properties can be referenced in test plans - see Functions (read a property) - but
cannot be used for thread-specific values.
JMeter variables are local to each thread. The values may be the same for each thread, or
they may be different.
If a variable is updated by a thread, only that thread's copy of the variable is changed. For
example, the Regular Expression Extractor Post-Processor will set its variables according to
the sample that its thread has read, and these can be used later by the same thread. For
details of how to reference variables and functions, see Functions and Variables.
Note that the values defined by the Test Plan and the User Defined Variables configuration
element are made available to the whole test plan at startup. If the same variable is defined
by multiple UDV elements, then the last one takes effect. Once a thread has started, the
initial set of variables is copied to each thread. Other elements such as the User Parameters
Pre-Processor or Regular Expression Extractor Post-Processor may be used to redefine the
same variables (or create new ones). These redefinitions only apply to the current thread.
The setProperty function can be used to define a JMeter property. These are global to the
test plan, so can be used to pass information between threads - should that be needed.
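As a sketch (the variable and property names below are hypothetical), one thread can publish one of its variables as a property, which any other thread can then read:

```shell
${__setProperty(sharedToken,${token})}   # publish the thread variable 'token' as a global property
${__property(sharedToken)}               # read it back from any thread
```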
When deciding how to structure a Test Plan, make a note of which items are constant for the
run, but which may change between runs. Decide on some variable names for these -
perhaps use a naming convention such as prefixing them with C_ or K_ or using uppercase
only to distinguish them from variables that need to change during the test. Also consider
which items need to be local to a thread - for example counters or values extracted with the
Regular Expression Post-Processor. You may wish to use a different naming convention for
these.
For example, you might define the following on the Test Plan:
HOST www.example.com
THREADS 10
LOOPS 20
You can refer to these in the test plan as ${HOST} ${THREADS} etc. If you later want to
change the host, just change the value of the HOST variable. This works fine for small
numbers of tests, but becomes tedious when testing lots of different combinations. One
solution is to use a property to define the value of the variables, for example:
HOST ${__P(host,www.example.com)}
THREADS ${__P(threads,10)}
LOOPS ${__P(loops,20)}
You can then change some or all of the values on the command-line as follows:
jmeter ... -Jhost=www3.example.org -Jloops=13
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the
element, then "expand" the Test Plan tree by clicking on the Test Plan element.
Next, you need to modify the default properties. Select the Thread Group element in the tree, if you
have not already selected it. You should now see the Thread Group Control Panel in the right
section of the JMeter window (see Figure 5.1 below)
Figure 5.1. Thread Group with Default Values
Start by providing a more descriptive name for our Thread Group. In the name field, enter
Jakarta Users.
In the next field, the Ramp-Up Period, leave the default value of 1 second. This property
tells JMeter how long to delay between starting each user. For example, if you enter a Ramp-
Up Period of 5 seconds, JMeter will finish starting all of your users by the end of the 5
seconds. So, if we have 5 users and a 5-second Ramp-Up Period, then the delay between
starting users would be 1 second (5 users / 5 seconds = 1 user per second). If you set the
value to 0, then JMeter will immediately start all of your users.
Finally enter a value of 2 in the Loop Count field. This property tells JMeter how many
times to repeat your test. If you enter a loop count value of 1, then JMeter will run your test
only once. To have JMeter repeatedly run your Test Plan, select the Forever checkbox.
See Figure 5.2 for the completed Jakarta Users Thread Group.
Figure 5.2. Jakarta Users Thread Group
Begin by selecting the Jakarta Users (Thread Group) element. Click your right mouse button to get
the Add menu, and then select Add --> Config Element --> HTTP Request Defaults. Then, select this
new element to view its Control Panel (see Figure 5.3).
Skip to the next field, which is the Web Server's Server Name/IP. For the Test Plan that you
are building, all HTTP requests will be sent to the same Web server, jakarta.apache.org.
Enter this domain name into the field. This is the only field for which we will specify a
default, so leave the remaining fields with their default values.
See Figure 5.4 for the completed HTTP Request Defaults element.
Start by adding the first HTTP Request to the Jakarta Users element (Add --> Sampler -->
HTTP Request). Then, select the HTTP Request element in the tree and edit the following
properties (see Figure 5.5):
Next, add the second HTTP Request and edit the following properties (see Figure 5.6):
Select the Jakarta Users element and add a Graph Results listener (Add --> Listener --> Graph
Results). Next, you need to specify a directory and filename of the output file. You can either type it
into the filename field, or select the Browse button and browse to a directory and then enter a
filename.
Figure 5.7. Graph Results Listener
To do this in JMeter, add an HTTP Request, and set the method to POST. You'll need to
know the names of the fields used by the form, and the target page. These can be found out
by inspecting the code of the login page. [If this is difficult to do, you can use the JMeter
Proxy Recorder to record the login sequence.] Set the path to the target of the submit button.
Click the Add button twice and enter the username and password details. Sometimes the login form
contains additional hidden fields. These will need to be added as well.
Figure 5.8. Sample HTTP login request
To respond correctly to URL rewriting, JMeter needs to parse the HTML received from the
server and retrieve the unique session ID. Use the appropriate HTTP URL Re-writing
Modifier to accomplish this. Simply enter the name of your session ID parameter into the
modifier, and it will find it and add it to each request. If the request already has a value, it
will be replaced. If "Cache Session Id?" is checked, then the last found session id will be
saved, and will be used if the previous HTTP sample does not contain a session id.
Download this example. Figure 1 shows a test plan using URL rewriting. Note that the URL
Re-writing modifier is added to the SimpleController, ensuring that it will only affect requests
under that SimpleController.
In Figure 2, we see the URL Re-writing modifier GUI, which just has a field for the user to specify the
name of the session ID parameter. There is also a checkbox for indicating that the session ID should
be part of the path (separated by a ";"), rather than a request parameter.
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the
element, then "expand" the Test Plan tree by clicking on the Test Plan element.
Next, you need to modify the default properties. Select the Thread Group element in the tree, if you
have not already selected it. You should now see the Thread Group Control Panel in the right
section of the JMeter window (see Figure 7.1 below)
Start by providing a more descriptive name for our Thread Group. In the name field, enter
JDBC Users.
You will need a valid database, database table, and user-level access to that table. In the
example shown here, the database is 'mydb' and the table name is 'Stocks'.
In the next field, the Ramp-Up Period, leave the default value of 0 seconds. This property
tells JMeter how long to delay between starting each user. For example, if you enter a Ramp-
Up Period of 5 seconds, JMeter will finish starting all of your users by the end of the 5
seconds. So, if we have 5 users and a 5 second Ramp-Up Period, then the delay between
starting users would be 1 second (5 users / 5 seconds = 1 user per second). If you set the
value to 0, then JMeter will immediately start all of your users.
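The ramp-up arithmetic above can be sketched as follows (a minimal illustration of the delay calculation, not JMeter's actual thread-scheduling code):

```java
// Sketch of the ramp-up arithmetic described above; this is an
// illustration, not JMeter's actual scheduler implementation.
public class RampUpDelay {
    // Each thread starts rampUpSeconds / threads after the previous one.
    static double delayBetweenStarts(int threads, double rampUpSeconds) {
        return rampUpSeconds / threads;
    }

    public static void main(String[] args) {
        // 5 users over a 5-second ramp-up: 1 second between starts.
        System.out.println(delayBetweenStarts(5, 5.0)); // prints 1.0
    }
}
```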
Finally, enter a value of 3 in the Loop Count field. This property tells JMeter how many
times to repeat your test. To have JMeter repeatedly run your Test Plan, select the Forever
checkbox.
See Figure 7.2 for the completed JDBC Users Thread Group.
Begin by selecting the JDBC Users element. Click your right mouse button to get the Add
menu, and then select Add --> Config Element --> JDBC Connection Configuration. Then,
select this new element to view its Control Panel (see Figure 7.3).
Set up the following fields (these assume we will be using a local MySQL database called
test):
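For a local MySQL database named test, the fields might be filled in as follows (the variable name, driver class, URL, and credentials shown here are illustrative assumptions; adjust them to your own database):

```
Variable Name:       myDatabase          (the pool name that JDBC Requests will refer to)
Database URL:        jdbc:mysql://localhost:3306/test
JDBC Driver class:   com.mysql.jdbc.Driver
Username:            user
Password:            password
```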
JMeter creates a database connection pool with the configuration settings as specified in the
Control Panel. The pool is referred to in JDBC Requests in the 'Variable Name' field. Several different
JDBC Configuration elements can be used, but they must have unique names. Every JDBC Request
must refer to a JDBC Configuration pool. More than one JDBC Request can refer to the same pool.
Figure 7.3. JDBC Configuration
Select the JDBC Users element again. Click your right mouse button to get the Add menu, and
then select Add --> Sampler --> JDBC Request. Then, select this new element to view its Control
Panel (see Figure 7.4).
In our Test Plan, we will make two JDBC requests. The first one is for Eastman Kodak
stock, and the second is Pfizer stock (obviously you should change these to examples
appropriate for your particular database). These are illustrated below.
Next, add the second JDBC Request and edit the following properties (see Figure 7.6):
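The queries behind the two requests might look like this (the Stocks table and ticker symbols are hypothetical; substitute examples appropriate for your own database):

```sql
SELECT * FROM Stocks WHERE symbol = 'EK';   -- first JDBC Request: Eastman Kodak
SELECT * FROM Stocks WHERE symbol = 'PFE';  -- second JDBC Request: Pfizer
```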
Select the JDBC Users element and add a Graph Results listener (Add --> Listener --> Graph
Results).
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the
element, then "expand" the Test Plan tree by clicking on the Test Plan element.
Next, you need to modify the default properties. Select the Thread Group element in the tree, if you
have not already selected it. You should now see the Thread Group Control Panel in the right
section of the JMeter window (see Figure 8.1 below).
Start by providing a more descriptive name for our Thread Group. In the name field, enter
O'Reilly Users.
Next, increase the number of users to 4.
In the next field, the Ramp-Up Period, leave the default value of 0 seconds. This property
tells JMeter how long to delay between starting each user. For example, if you enter a Ramp-
Up Period of 5 seconds, JMeter will finish starting all of your users by the end of the 5
seconds. So, if we have 5 users and a 5 second Ramp-Up Period, then the delay between
starting users would be 1 second (5 users / 5 seconds = 1 user per second). If you set the
value to 0, then JMeter will immediately start all of your users.
Finally, enter a value of 2 in the Loop Count field. This property tells JMeter how many
times to repeat your test. To have JMeter repeatedly run your Test Plan, select the Forever
checkbox.
See Figure 8.2 for the completed O'Reilly Users Thread Group.
Begin by selecting the O'Reilly Users element. Click your right mouse button to get the Add menu,
and then select Add --> Config Element --> FTP Request Defaults. Then, select this new element to
view its Control Panel (see Figure 8.3).
Like most JMeter elements, the FTP Request Defaults Control Panel has a name field that
you can modify. In this example, leave this field with the default value.
Skip to the next field, which is the FTP Server's Server Name/IP. For the Test Plan that you
are building, all FTP requests will be sent to the same FTP server, ftp.oro.com. Enter this
domain name into the field. This is the only field for which we will specify a default, so leave
the remaining fields with their default values.
See Figure 8.4 for the completed FTP Request Defaults element
Figure 8.4. FTP Defaults for our Test Plan
Start by adding the first FTP Request to the O'Reilly Users element (Add --> Sampler -->
FTP Request). Then, select the FTP Request element in the tree and edit the following
properties (see Figure 8.5):
Next, add the second FTP Request and edit the following properties (see Figure 8.6):
Figure 8.6. FTP Request for O'Reilly mSQL Java tutorial file
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your right
mouse button to get the Add menu, and then select Add-->ThreadGroup. You should now see the
Thread Group element under Test Plan. If you do not see the element, then "expand" the Test Plan
tree by clicking on the Test Plan element.
Figure 9a.1. Thread Group with Default Values
Like most JMeter elements, the Login Config Element Control Panel has a name field that you can
modify. In this example, leave this field with the default value.
Like most JMeter elements, the LDAP Request Defaults Control Panel has a name field that you can
modify. In this example, leave this field with the default value.
JMeter sends requests in the order that you add them to the tree. Start by adding the first
LDAP Request to the Siptech Users element (Add --> Sampler --> LDAP Request). Then,
select the LDAP Request element in the tree and edit the following properties
You do not have to set the Server Name field, port field, Username, Password and DN
because you already specified this value in the Login Config Element and LDAP Request
Defaults.
Next, add the second LDAP Request and edit the following properties
Because the Extended LDAP Sampler is highly configurable, it takes some time to build a
correct test plan. However, you can tune it exactly to your needs.
You will create four users that send requests for four tests on the LDAP server. Also, you will
tell the users to run their tests twice. So, the total number of requests is (4 users) x (4
requests) x (repeat 2 times) = 32 LDAP requests. To construct the Test Plan, you will use the
following elements:
Thread Group ,
This example assumes that the LDAP Server is installed in your Local machine.
For the less experienced LDAP users, I built a small LDAP tutorial which briefly explains
the several LDAP operations that can be used in building a complex test plan.
Take care when using LDAP special characters in the distinguished name. If, for example, you want
to use a '+' sign in a distinguished name, you need to escape the character by adding a '\'
before it. One extra exception: if you want to add a '\' character in a distinguished name (in an
add or rename operation), you need to use four backslashes. Examples:
cn=dolf\+smits to add/search an entry with a name like cn=dolf+smits
cn=dolf\\smits to search an entry with the name cn=dolf\smits
cn=c:\\\\log.txt to add an entry with a name like cn=c:\log.txt
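The '+' case above can be sketched in code (a minimal illustration; the escapePlus helper is invented for this example and is not part of JMeter's API):

```java
// Illustration of the DN escaping rule described above; escapePlus is
// an invented helper for this sketch, not a JMeter API.
public class DnEscape {
    // A literal '+' in an RDN value must be escaped with a backslash.
    static String escapePlus(String rdnValue) {
        return rdnValue.replace("+", "\\+");
    }

    public static void main(String[] args) {
        System.out.println("cn=" + escapePlus("dolf+smits")); // prints cn=dolf\+smits
    }
}
```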
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your right
mouse button to get the Add menu, and then select Add-->ThreadGroup. You should now see the
Thread Group element under Test Plan. If you do not see the element, then "expand" the Test Plan
tree by clicking on the Test Plan element.
Figure 9b.1. Thread Group with Default Values
Like most JMeter elements, the LDAP Extended Request Defaults Control Panel has a name field
that you can modify. In this example, leave this field with the default value.
JMeter sends requests in the order that you add them to the tree.
Add the LDAP Extended Request to the Thread Group element (Add --> Sampler --> LDAP Ext
Request). Then, select the LDAP Ext Request element in the tree and edit the following properties.
Take care that this must be the uppermost shared level for all your requests. For example,
when all information is stored under ou=people, dc=siemens, dc=com, you can use this value
in the baseDN. If you need to search or rename objects in both subtrees, use the common
denominator (dc=siemens,dc=com) as the baseDN.
5. (Optional) Enter the distinguished name of the user you want to use for
authentication. When this field is left empty, an anonymous bind will be
established.
6. (Optional) Enter the password for the user you want to authenticate with; an
empty password will also lead to an anonymous bind.
When left empty, the baseDN is used as the search base. This field is important
if you want to use a "base-entry" or "one-level" search (see below).
3. Enter the search filter. Any decent LDAP search filter will do, but for now,
use something simple, like cn=john doe.
4. (Optional) Enter the scope in the scope field; it has three options:
1. Base level: enter the value 0
When a very large result set is returned from a very fast server over a very
slow line, you may have to wait ages for the search request to complete;
this parameter will not influence that.
7. (Optional) Attributes you want in the search answer. This can be used to
limit the size of the answer, especially when an object has very large
attributes (like jpegPhoto). There are three possibilities:
1. Leave empty (the default setting must also be empty); this will return
all attributes.
2. Put in one empty value (""); it will request a non-existent attribute,
so in reality it returns no attributes.
3. Put in the attributes, separated by a semi-colon; it will return only the
requested attributes.
8. (Optional) Return object; possible values are "true" and "false". True will
return all java-object attributes, adding these to the requested attributes
as specified above.
When you need the same attribute more than once, just add a new line, add the
attribute again, and give it a different value.
All necessary attributes and values must be specified to pass the test; see the picture!
(Sometimes the server adds the attribute "objectClass=top"; this might give a
problem.)
Figure 9b.3.5. Add request example
This will mean that the attribute value (not optional in this case)
will be added to the attribute.
When the attribute does not exist, it will be created and the value
added.
This will overwrite the attribute with the given new value (not
optional here).
When the attribute does not exist, it will be created and the value
added.
When it exists, the old values are removed and the new value is added.
3. delete
When the given value does not exist, the test will fail.
5. (Optional) Add more modifications in the "modify test" table.
All modifications which are specified must succeed for the modification test to
pass. When one modification fails, NO modifications at all will be made and the
entry will remain unchanged.
When you only change the RDN, it will simply rename the entry.
When you also specify a different subtree, e.g. you change from cn=john
doe,ou=people to cn=john doe,ou=users, it will move the entry. You can also move
a complete subtree (if your LDAP server supports this!), e.g.
ou=people,ou=retired to ou=oldusers,ou=users; this will move the complete
subtree, plus all retired people in the subtree, to the new place in the tree.
Figure 9b.3.8. Rename example
In this listener you have three tabs to view, the sampler result, the request and the response
data.
1. The sampler result just contains the response time, the return code and return
message.
2. The request gives a short description of the request that was made; in practice
no relevant information is contained here.
3. The response data contains the full details of the sent request, as well as the full
details of the received answer; this is given in a (self-defined) XML style. The
full description can be found here.
10. Building a WebService Test Plan
In this section, you will learn how to create a Test Plan to test a WebService. You will create
five users that send requests to one page. Also, you will tell the users to run their tests twice.
So, the total number of requests is (5 users) x (1 request) x (repeat 2 times) = 10 HTTP
requests. To construct the Test Plan, you will use the following elements: Thread Group ,
WebService(SOAP) Request (Beta Code) , and Graph Results .
General notes on the webservices sampler. The current implementation uses Apache SOAP
driver, which requires activation.jar and mail.jar from SUN. Due to license restrictions,
JMeter does not include the jar files in the binary distribution.
If the sampler appears to be getting an error from the webservice, double check the SOAP
message and make sure the format is correct. In particular, make sure the xmlns attributes are
exactly the same as the WSDL. If the xml namespace is different, the webservice will likely
return an error. Xmethods contains a list of public webservices for those who want to test
their test plan.
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the
element, then "expand" the Test Plan tree by clicking on the Test Plan element.
Next, you need to modify the default properties. Select the Thread Group element in the tree, if you
have not already selected it. You should now see the Thread Group Control Panel in the right
section of the JMeter window (see Figure 10.1 below).
Start by providing a more descriptive name for our Thread Group. In the name field, enter
Jakarta Users.
Finally, clear the checkbox labeled "Forever", and enter a value of 2 in the Loop Count field.
This property tells JMeter how many times to repeat your test. If you enter a loop count
value of 1, then JMeter will run your test only once. To have JMeter repeatedly run your Test
Plan, select the Forever checkbox.
See Figure 10.2 for the completed Jakarta Users Thread Group.
Start by adding the sampler WebService(SOAP) Request (Beta Code) to the Jakarta Users
element (Add --> Sampler --> WebService(SOAP) Request (Beta Code) ). Then, select the
webservice Request element in the tree and edit the following properties (see Figure 10.5):
Next, select the web method and click "Configure". The sampler should populate the "URL"
and "SOAPAction" text fields. Assuming the WSDL is valid, the correct soap action should
be entered.
The last step is to paste the SOAP message in the "SOAP/XML-RPC Data" text area. You
can optionally save the soap message to a file and browse to the location. For convenience,
there is a third option of using a message folder. The sampler will randomly select files from
a given folder and use the text for the soap message.
If you do not want JMeter to read the response from the SOAP Webservice, uncheck "Read
Soap Responses." If the test plan is intended to stress test a webservice, the box should be
unchecked. If the test plan is a functional test, the box should be checked. When "Read Soap
Responses" is unchecked, no result will be displayed in the View Results Tree or View Results
in Table listeners.
An important note on the sampler. It will automatically use the proxy host and port passed to
JMeter from command line, if those fields in the sampler are left blank. If a sampler has
values in the proxy host and port text field, it will use the ones provided by the user. If no
host or port are provided and JMeter wasn't started with command line options, the sampler
will fail silently. This behavior may not be what users expect.
Note: If you're using Cassini webserver, it does not work correctly and is not a reliable
webserver. Cassini is meant to be a simple example and isn't a full blown webserver like IIS.
Cassini does not close connections correctly, which causes JMeter to hang or not get the
response contents.
Currently, only .NET uses SOAPAction, so it is normal to have a blank SOAPAction for all
other webservices. The list includes JWSDP, Weblogic, Axis, The Mind Electric Glue, and
gSoap.
Select the Jakarta Users element and add a Graph Results listener (Add --> Listener --> Graph
Results). Next, you need to specify a directory and filename of the output file. You can either type it
into the filename field, or select the Browse button and browse to a directory and then enter a
filename.
Figure 10.7. Graph Results Listener
In this section, you will learn how to create a Test Plan to test a JMS Point-to-Point
messaging solution. The setup of the test is 1 threadgroup with 5 threads sending 4 messages
each through a request queue. A fixed reply queue will be used for monitoring the reply
messages. To construct the Test Plan, you will use the following elements: Thread Group ,
JMS Point-to-Point , and Graph Results .
General notes on JMS: There are currently two JMS samplers. One uses JMS topics and the
other uses queues. Topic messages are commonly known as pub/sub messaging. Topic
messaging is generally used in cases where a message is published by a producer and
consumed by multiple subscribers. A JMS sampler needs the JMS implementation jar files;
for example, from Apache ActiveMQ. See here for the list of jars provided by ActiveMQ
3.0.
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the
element, then "expand" the Test Plan tree by clicking on the Test Plan element.
Next, you need to modify the default properties. Select the Thread Group element in the tree, if you
have not already selected it. You should now see the Thread Group Control Panel in the right
section of the JMeter window (see Figure 11.1 below).
Start by providing a more descriptive name for our Thread Group. In the name field, enter
Point-to-Point.
In the next field, the Ramp-Up Period, set the value to 0 seconds. This property tells
JMeter how long to delay between starting each user. For example, if you enter a Ramp-Up
Period of 5 seconds, JMeter will finish starting all of your users by the end of the 5 seconds.
So, if we have 5 users and a 5 second Ramp-Up Period, then the delay between starting users
would be 1 second (5 users / 5 seconds = 1 user per second). If you set the value to 0, then
JMeter will immediately start all of your users.
Clear the checkbox labeled "Forever", and enter a value of 4 in the Loop Count field. This
property tells JMeter how many times to repeat your test. If you enter a loop count value of
1, then JMeter will run your test only once. To have JMeter repeatedly run your Test Plan,
select the Forever checkbox.
Select the Thread Group element and add a Graph Results listener (Add --> Listener --> Graph
Results). Next, you need to specify a directory and filename of the output file. You can either type it
into the filename field, or select the Browse button and browse to a directory and then enter a
filename.
In this section, you will learn how to create a Test Plan to test JMS Providers. You will
create five subscribers and one publisher. You will create 2 thread groups and set each one to
10 iterations. The total number of messages is (6 threads) x (1 message) x (repeat 10 times) = 60
messages. To construct the Test Plan, you will use the following elements: Thread Group ,
JMS Publisher , JMS Subscriber , and Graph Results .
General notes on JMS: There are currently two JMS samplers. One uses JMS topics and the
other uses queues. Topic messages are commonly known as pub/sub messaging. Topic
messaging is generally used in cases where a message is published by a producer and
consumed by multiple subscribers. Queue messaging is generally used for transactions where
the sender expects a response. Messaging systems are quite different from normal HTTP
requests. In HTTP, a single user sends a request and gets a response. Messaging systems can
work in synchronous and asynchronous mode. A JMS sampler needs the JMS implementation
jar files; for example, from Apache ActiveMQ. See here for the list of jars provided by
ActiveMQ 3.0.
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the
element, then "expand" the Test Plan tree by clicking on the Test Plan element.
Next, you need to modify the default properties. Select the Thread Group element in the tree, if you
have not already selected it. You should now see the Thread Group Control Panel in the right
section of the JMeter window (see Figure 12.1 below).
Start by providing a more descriptive name for our Thread Group. In the name field, enter
Subscribers.
In the next field, the Ramp-Up Period, set the value to 0 seconds. This property tells JMeter
how long to delay between starting each user. For example, if you enter a Ramp-Up Period
of 5 seconds, JMeter will finish starting all of your users by the end of the 5 seconds. So, if
we have 5 users and a 5 second Ramp-Up Period, then the delay between starting users
would be 1 second (5 users / 5 seconds = 1 user per second). If you set the value to 0, JMeter
will immediately start all users.
Clear the checkbox labeled "Forever", and enter a value of 10 in the Loop Count field. This
property tells JMeter how many times to repeat your test. If you enter a loop count value of
1, then JMeter will run your test only once. To have JMeter repeatedly run your Test
select the Forever checkbox.
Repeat the process and add another thread group. For the second thread group, enter
"Publisher" in the name field, set the number of threads to 1, and set the iteration to 10.
Start by adding the sampler JMS Subscriber to the Subscribers element (Add --> Sampler -->
JMS Subscriber). Then, select the JMS Subscriber element in the tree and edit the following
properties:
Next, add the sampler JMS Publisher to the Publisher element (Add --> Sampler --> JMS
Publisher). Then, select the JMS Publisher element in the tree and edit the following
properties:
Select the Test Plan element and add a Graph Results listener (Add --> Listener --> Graph
Results). Next, you need to specify a directory and filename of the output file. You can either type it
into the filename field, or select the Browse button and browse to a directory and then enter a
filename.
Go ahead and add the ThreadGroup element by first selecting the Test Plan, clicking your
right mouse button to get the Add menu, and then select Add --> ThreadGroup.
You should now see the Thread Group element under Test Plan. If you do not see the element,
"expand" the Test Plan tree by clicking on the Test Plan element.
By default, the Listener will select the results from the first connector in the sample
response. The Connector prefix field can be used to select a different connector. If specified,
the Listener will choose the first connector which matches the prefix. If no match is found,
then the first connector is selected.
There are two tabs in the monitor results listener. The first is the "Health", which displays the status
of the last sample the monitor received. The second tab is "Performance", which shows a historical
view of the server's performance.
A quick note about how health is calculated. Typically, a server will crash if it runs out of memory or
reaches the maximum number of threads. In the case of Tomcat 5, once the threads are maxed out,
requests are placed in a queue until a thread is available. The relative importance of threads varies
between containers, so the current implementation uses a 50/50 weighting to be conservative. A
container that is more efficient with thread management might not see any performance degradation,
but the memory used will definitely show an impact.
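As a rough sketch, the 50/50 weighting amounts to averaging thread headroom and memory headroom. The formula below is an assumption made for illustration, not the monitor's exact implementation:

```java
// Assumed sketch of the 50/50 health weighting described above;
// not the monitor listener's actual implementation.
public class HealthScore {
    static double health(double usedThreads, double maxThreads,
                         double usedMemory, double maxMemory) {
        double threadHeadroom = 1.0 - usedThreads / maxThreads;
        double memoryHeadroom = 1.0 - usedMemory / maxMemory;
        // Equal (50/50) weighting of threads and memory.
        return 0.5 * threadHeadroom + 0.5 * memoryHeadroom;
    }

    public static void main(String[] args) {
        // 25% of threads and 25% of memory in use: 0.75 headroom overall.
        System.out.println(health(50, 200, 256, 1024)); // prints 0.75
    }
}
```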
The performance graph shows four different lines. The free memory line shows how much
free memory is left in the current allocated block. Tomcat 5 returns the maximum memory,
but it is not graphed. In a well tuned environment, the server should never reach the
maximum memory.
Note the graph has captions on both sides of the graph. On the left is percent and the right is
dead/healthy. If the memory line spikes up and down rapidly, it could indicate memory
thrashing. In those situations, it is a good idea to profile the application with Borland
OptimizeIt or JProbe. What you want to see is a regular pattern for load, memory and
threads. Any erratic behavior usually indicates poor performance or a bug of some sort.
The "Configure" button can be used to specify which fields to write to the file, and whether
to write it as CSV or XML. CSV files are much smaller than XML files, so use CSV if you
are generating lots of samples.
If you only wish to record certain samples, add the Listener as a child of the sampler. Or you
can use a Simple Controller to group a set of samplers, and add the Listener to that. The
same filename can be used by multiple samplers - but make sure they all use the same
configuration!
The output format is controlled by the property jmeter.save.saveservice.output_format.
The information to be saved is configurable. For maximum information, choose "xml" as the
format and specify "Functional Test Mode" on the Test Plan element. If this box is not
checked, the default saved data includes a time stamp (the number of milliseconds since
midnight, January 1, 1970 UTC), the data type, the thread name, the label, the response time,
message, and code, and a success indicator. If checked, all information, including the full
response data will be logged.
The following example indicates how to set properties to get a vertical bar ("|") delimited
format that will output results like:
timeStamp|time|label|responseCode|threadName|dataType|success|
failureMessage
02/06/03 08:21:42|1187|Home|200|Thread Group-1|text|true|
02/06/03 08:21:42|47|Login|200|Thread Group-1|text|false|Test Failed:
expected to contain: password etc.
The corresponding jmeter.properties that need to be set are shown below. One oddity in this
example is that the output_format is set to csv, which typically indicates comma-separated
values. However, the default_delimiter was set to be a vertical bar instead of a comma, so the
csv tag is a misnomer in this case. (Think of CSV as meaning character separated values)
jmeter.save.saveservice.output_format=csv
jmeter.save.saveservice.assertion_results_failure_message=true
jmeter.save.saveservice.default_delimiter=|
The full set of properties that affect result file output is shown below.
#---------------------------------------------------------------------------
# Results file configuration
#---------------------------------------------------------------------------
# legitimate values: xml, csv, db. Only xml and csv are currently supported.
#jmeter.save.saveservice.output_format=xml
# Timestamp format
# legitimate values: none, ms, or a format suitable for SimpleDateFormat
#jmeter.save.saveservice.timestamp_format=ms
#jmeter.save.saveservice.timestamp_format=MM/dd/yy HH:mm:ss
#jmeter.save.saveservice.print_field_names=false
Note that cookies, method and the query string are saved as part of the "Sampler Data"
option.
This feature can be used to specify different data and log files for each test run, for example:
Note that JMeter logging messages are written to the file jmeter.log by default. This file is
recreated each time, so if you want to keep the log files for each run, you will need to rename
it using the -j option as above. The -j option was added in version 2.3.
Versions of JMeter after 2.3.1 support variables in the log file name. If the filename contains
paired single-quotes, then the name is processed as a SimpleDateFormat format applied to
the current date, for example: log_file='jmeter_'yyyyMMddHHmmss'.tmp' . This can be
used to generate a unique name for each test run.
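The quoting works exactly as in java.text.SimpleDateFormat, where single-quoted sections are copied literally and the rest is treated as a date/time pattern:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

// How a log file name like 'jmeter_'yyyyMMddHHmmss'.tmp' expands:
// text inside single quotes is literal; yyyyMMddHHmmss is formatted
// from the current date and time.
public class LogFileName {
    public static void main(String[] args) {
        SimpleDateFormat fmt =
            new SimpleDateFormat("'jmeter_'yyyyMMddHHmmss'.tmp'");
        // e.g. jmeter_20240101123045.tmp (depends on the current time)
        System.out.println(fmt.format(new Date()));
    }
}
```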
The following Listeners no longer need to keep copies of every single sample. Instead,
samples with the same elapsed time are aggregated. Less memory is now needed, especially
if most samples only take a second or two at most.
Aggregate Report
Aggregate Graph
Distribution Graph
To minimise the amount of memory needed, use the Simple Data Writer, and use the CSV
format.
[Example XML results file omitted: a testResults root element containing an httpSample
element (HTTP samples), a non-HTTP sample element, and variables, if specified.]
Note that the sample node name may be either "sample" or "httpSample".
Attribute Content
by Bytes
de Data encoding
dt Data type
ec Error count (0 or 1, unless multiple samples are aggregated)
hn Hostname where the sample was generated
it Idle Time = time not spent sampling (milliseconds) (generally 0)
lb Label
lt Latency = time to initial response (milliseconds) - not all samplers support this
na Number of active threads for all thread groups
ng Number of active threads in this group
rc Response Code (e.g. 200)
rm Response Message (e.g. OK)
s Success flag (true/false)
sc Sample count (1, unless multiple samples are aggregated)
t Elapsed time (milliseconds)
tn Thread Name
ts timeStamp (milliseconds since midnight Jan 1, 1970 UTC)
varname Value of the named variable (versions of JMeter after 2.3.1)
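Putting several of these attributes together, a single result line might look like this (all values are hypothetical):

```xml
<httpSample t="1250" lt="400" ts="1144371014619" s="true" lb="Home"
            rc="200" rm="OK" tn="Thread Group 1-1" dt="text" by="12345"/>
```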
Versions 2.1 and 2.1.1 of JMeter saved the Response Code as "rs", but read it back expecting
to find "rc". This has been corrected so that it is always saved as "rc"; either "rc" or "rs" can
be read.
Results can be read from XML or CSV format files. When reading from CSV results files,
the header (if present) is used to determine which fields were saved. In order to interpret a
header-less CSV file correctly, the appropriate JMeter properties must be set.
The Listeners which generate output as tables can also be saved using Copy/Paste. Select the
desired cells in the table, and use the OS Copy short-cut (normally Control+C). The data will be
saved to the clipboard, from where it can be pasted into another application, e.g. a spreadsheet or
text editor.
However, remote mode does use more resources than running the same number of non-GUI
tests independently. If many server instances are used, the client JMeter can become
overloaded, as can the client network connection.
Note that while you can execute the JMeterEngine on your application server, you need to be
mindful of the fact that this will be adding processing overhead on the application server and
thus your testing results will be somewhat tainted. The recommended approach is to have
one or more machines on the same Ethernet segment as your application server that you
configure to run the JMeter Engine. This will minimize the impact of the network on the test
results without impacting the performance of the application server itself.
Make sure that all the nodes (client and servers) are running exactly the same version of
JMeter. As far as possible, also use the same version of Java on all systems. Using different
versions of Java may work - but is best avoided.
If the test uses any data files, note that these are not sent across by the client so make sure
that these are available in the appropriate directory on each server. If necessary you can
define different values for properties by editing the user.properties or system.properties files
on each server. These properties will be picked up when the server is started and may be
used in the test plan to affect its behaviour (e.g. connecting to a different remote server).
Alternatively use different content in any datafiles used by the test (e.g. if each server must
use unique ids, divide these between the data files).
To run JMeter in remote mode, start the JMeter server component on all machines you wish
to run on by running the JMETER_HOME/bin/jmeter-server (unix) or
JMETER_HOME/bin/jmeter-server.bat (windows) script.
Note that there can only be one JMeter server on each node unless different RMI ports are
used.
Since JMeter 2.3.1, the JMeter server application starts the RMI registry itself; there is no
need to start RMI registry separately. To revert to the previous behaviour, define the JMeter
property server.rmi.create=false on the server host systems.
By default, RMI uses a dynamic port for the JMeter server engine. This can cause problems
for firewalls, so versions of JMeter after 2.3.2 will check for the JMeter property
server.rmi.localport. If this is non-zero, it will be used as the local port number for the
server engine.
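For example, a fixed local port could be set in jmeter.properties on each server host (4000 here is an arbitrary choice; pick a port your firewall allows):

```
# jmeter.properties on the server host
# fix the server engine's local port so the firewall can be opened for it
server.rmi.localport=4000
```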
Edit the properties file on the controlling JMeter machine. In /bin/jmeter.properties, find the
property named "remote_hosts", and add the value of your running JMeter server's IP
address. Multiple such servers can be added, comma-delimited.
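For instance (the IP addresses below are placeholders for your own server machines):

```
# jmeter.properties on the controlling (client) machine
remote_hosts=192.168.0.101,192.168.0.102
```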
Note that you can use the -R command line option instead to specify the remote host(s) to
use. This has the same effect as using -r and -Jremote_hosts={serverlist}. E.g. jmeter
-Rhost1,127.0.0.1,host2
If you define the JMeter property server.exitaftertest=true, then the server will exit after it
runs a single test. See also the -X flag (described below)
Now you are ready to start the controlling JMeter client. For MS-Windows, start the client with the
script "bin/jmeter.bat". For UNIX, use the script "bin/jmeter". You will notice that the Run menu
contains two new sub-menus: "Remote Start" and "Remote Stop" (see figure 1). These menus
list the remote servers that you set in the properties file. Use the remote start and stop instead of the
normal JMeter start and stop menu items.
As an alternative, you can start the remote server(s) from a non-GUI (command-line) client.
The command to do this is:
jmeter -n -t script.jmx -r
or
jmeter -n -t script.jmx -R server1,server2...
The first example will start whatever servers are defined in the JMeter property remote_hosts; the
second example will define remote_hosts from the list of servers and then run the remote servers.
The command-line client will exit when all the remote servers have stopped.
15.1 Doing it Manually
In some cases, the jmeter-server script may not work for you (if you are using an OS
platform not anticipated by the JMeter developers). Here is how to start the JMeter servers
(step 1 above) with a more manual process:
Since JMeter 2.3.1, the RMI registry is started by the JMeter server, so this section does not
apply in the normal case. To revert to the previous behaviour, define the JMeter property
server.rmi.create=false on the server host systems and follow the instructions below.
JMeter uses Remote Method Invocation (RMI) as the remote communication mechanism.
Therefore, you need to run the RMI Registry application (named "rmiregistry") that comes
with the JDK and is located in its "bin" directory. Before running rmiregistry,
make sure that the following jars are in your system classpath:
JMETER_HOME/lib/ext/ApacheJMeter_core.jar
JMETER_HOME/lib/jorphan.jar
JMETER_HOME/lib/logkit-1.2.jar
The rmiregistry application needs access to certain JMeter classes. Run rmiregistry with no
parameters. By default the application listens on port 1099.
Once the RMI Registry application is running, start the JMeter Server. Use the "-s" option
with the jmeter startup script ("jmeter -s").
15.2 Tips
JMeter/RMI requires a connection from the client to the server. This will use the port you
chose, default 1099. JMeter/RMI also requires a reverse connection in order to return
sample results from the server to the client. This will use a high-numbered port. If there are
any firewalls or other network filters between JMeter client and server, you will need to
make sure that they are set up to allow the connections through. If necessary, use monitoring
software to show what traffic is being generated.
If you're running Suse Linux, these tips may help. The default installation may enable the
firewall. In that case, remote testing will not work properly. The following tips were
contributed by Sergey Ten.
If you see connections refused, turn on debugging by passing the following options.
Since JMeter 2.3.1, the RMI registry is started by the server; however the options can still
be passed in from the JMeter command line. For example: "jmeter -s
-Dsun.rmi.loader.logLevel=verbose" (i.e. omit the -J prefixes). Alternatively the properties
can be defined in the system.properties file.
The solution to the problem is to remove the loopbacks 127.0.0.1 and 127.0.0.2 from
/etc/hosts. What happens is that jmeter-server can't connect to rmiregistry if the 127.0.0.2
loopback is not available. Use the following settings to fix the problem.
Replace
With
HOST="-Djava.rmi.server.hostname=[computer_name][computer_domain]
-Djava.security.policy=`dirname $0`/[policy_file]"
`dirname $0`/jmeter $HOST -s "$@"
15.3 Using a different port
Since JMeter 2.1.1, the jmeter-server scripts provide support for changing the port. For
example, assume you want to use port 1664 (perhaps 1099 is already used).
On Unix:
$ SERVER_PORT=1664 jmeter-server [other options]
[N.B. use upper case for the environment variable]
In both cases, the script starts rmiregistry on the specified port, and then starts JMeter in
server mode, having defined the "server_port" property.
The chosen port will be logged in the server jmeter.log file (rmiregistry does not create a log
file).
15.4 Using sample batching
Listeners in the test plan send their results back to the client JMeter, which writes the results
to the specified files. By default, samples are sent back as they are generated. This can place
a large load on the network and the JMeter client. There are some JMeter properties that can
be set to alter this behaviour.
mode - sample sending mode - default is Standard
o Standard - send samples as soon as they are generated
o Hold - hold samples in an array until the end of a run. This may use a
lot of memory on the server.
o Batch - send saved samples when either the count or time exceeds a
threshold
o Statistical - send a summary sample when either the count or time
exceeds a threshold. The samples are summarised by thread group
name and sample label. The following fields are accumulated:
elapsed time
latency
bytes
sample count
error count
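As a sketch, batching could be configured on each server like this (the threshold values are illustrative; num_sample_threshold and time_threshold are the JMeter properties controlling the count and time limits):

```
# user.properties on each JMeter server
mode=Batch
# send accumulated samples after this many samples ...
num_sample_threshold=100
# ... or after this many milliseconds, whichever comes first
time_threshold=60000
```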
The Proxy Server expects to find a ThreadGroup element with a Recording Controller under
it, where it will record HTTP Requests. This conveniently packages all your samples under
one controller, which can be given a name that describes the test case.
Now, go through the steps of a Test Case. If you have no pre-defined test cases, use JMeter
to record your actions to define your test cases. Once you have finished a definite series of
steps, save the entire test case in an appropriately named file. Then, wipe clean and start a
new test case. By doing this, you can quickly record a large number of test case "rough
drafts".
One of the most useful features of the Proxy Server is that you can abstract out certain
common elements from the recorded samples. By defining some user-defined variables at the
Test Plan level or in User Defined Variables elements, you can have JMeter automatically
replace values in your recorded samples. For instance, if you are testing an app on server
"xxx.example.com", then you can define a variable called "server" with the value of
"xxx.example.com", and anyplace that value is found in your recorded samples will be
replaced with "${server}".
If JMeter does not record any samples, check that the browser really is using the proxy. If the
browser works OK even if JMeter is not running, then the browser cannot be using the
proxy. Some browsers ignore proxy settings for localhost or 127.0.0.1; try using the local
hostname or IP instead.
The error "unknown_ca" probably means that you are trying to record HTTPS, and the
browser has not accepted the JMeter Proxy server certificate.
For example:
Create a text file containing the user names and passwords, separated by
commas. Put this in the same directory as your test plan.
Add a CSV DataSet configuration element to the test plan. Name the
variables USER and PASS.
Replace the login name with ${USER} and the password with ${PASS} on
the appropriate samplers
The CSV Data Set element will read a new line for each thread.
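A users.csv file for the steps above might look like this (the names and passwords are of course placeholders):

```
tester1,secret1
tester2,secret2
tester3,secret3
```

With the CSV Data Set configured with Variable Names USER,PASS, the first thread to need a line would log in as tester1, the next as tester2, and so on.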
If your test needs large amounts of data - particularly if it needs to be randomised - create the
test data in a file that can be read with CSV Dataset. This avoids wasting resources at run-time.
beanshell.server.port=9000
beanshell.server.file=../extras/startup.bsh
In the above example, the server will be started, and will listen on ports 9000 and 9001. Port
9000 will be used for http access. Port 9001 will be used for telnet access. The startup.bsh
file will be processed by the server, and can be used to define various functions and set up
variables. The startup file defines methods for setting and printing JMeter and system
properties. This is what you should see in the JMeter console:
As a practical example, assume you have a long-running JMeter test running in non-GUI
mode, and you want to vary the throughput at various times during the test. The test plan
includes a Constant Throughput Timer which is defined in terms of a property, e.g.
${__P(throughput)}. The following BeanShell commands could be used to change the test:
printprop("throughput");
curr=Integer.decode(args[0]); // Start value
inc=Integer.decode(args[1]); // Increment
end=Integer.decode(args[2]); // Final value
secs=Integer.decode(args[3]); // Wait between changes
while(curr <= end){
setprop("throughput",curr.toString()); // Needs to be a string here
Thread.sleep(secs*1000);
curr += inc;
}
printprop("throughput");
The script can be stored in a file (throughput.bsh, say), and sent to the server using
bshclient.jar. For example:
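A possible invocation, assuming the BeanShell server is listening on port 9000 of the local host and the script takes the four arguments described above (start, increment, end, wait):

```shell
# send throughput.bsh to the BeanShell server with its four arguments:
# start=70, increment=5, end=100, wait=60 seconds between changes
java -jar ../lib/bshclient.jar localhost 9000 throughput.bsh 70 5 100 60
```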
Some long-running tests may cause the interpreter to use lots of memory; if this is the case
try using the reset option.
You can test BeanShell scripts outside JMeter by using the command-line interpreter:
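For example (the exact jar file name varies with the BeanShell version bundled in JMeter's lib directory; any extra arguments are available to the script via bsh.args):

```shell
# run a BeanShell script standalone, outside JMeter
java -cp ../lib/bsh-2.0b5.jar bsh.Interpreter testscript.bsh
```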
Scripts can also access JMeter variables using the get() and put() methods of the "vars"
variable, for example: vars.get("HOST"); vars.put("MSG","Successful");. The
get() and put() methods only support variables with String values, but there are also
getObject() and putObject() methods which can be used for arbitrary objects. JMeter
variables are local to a thread, but can be used by all test elements (not just BeanShell).
If you need to share variables between threads, then JMeter properties can be used:
import org.apache.jmeter.util.JMeterUtils;
String value=JMeterUtils.getPropDefault("name","");
JMeterUtils.setProperty("name", "value");
The sample .bshrc files contain sample definitions of getprop() and setprop() methods.
Another possible method of sharing variables is to use the "bsh.shared" shared namespace.
For example:
if (bsh.shared.myObj == void){
// not yet defined, so create it:
myObj=new AnyObject();
}
bsh.shared.myObj.process();
Rather than creating the object in the test element, it can be created in the startup file
defined by the JMeter property "beanshell.init.file". This is only processed once.
Create a simple Test Plan containing the BSF Sampler and Tree View Listener. Code the
script in the sampler script pane, and test it by running the test. If there are any errors, these
will show up in the Tree View. Also the result of running the script will show up as the
response.
Once the script is working properly, it can be stored as a variable on the Test Plan. The script
variable can then be used to create the function call. For example, suppose a BeanShell script
is stored in the variable RANDOM_NAME. The function call can then be coded as
${__BeanShell(${RANDOM_NAME})}. There is no need to escape any commas in the script,
because the function call is parsed before the variable's value is interpolated.
One way to do this is to define a set of variables on the Test Plan, and then use those
variables in the test elements. For example, one could define the variable LOOPS=10, and
refer to that in the Thread Group as ${LOOPS}. To run the test with 20 loops, just change
the value of the LOOPS variable on the Test Plan.
This quickly becomes tedious if you want to run lots of tests in non-GUI mode. One solution
to this is to define the Test Plan variable in terms of a property, for example
LOOPS=${__P(loops,10)}. This uses the value of the property "loops", defaulting to 10 if the
property is not found. The "loops" property can then be defined on the JMeter command
line: jmeter ... -Jloops=12 .... If there are a lot of properties that need to be changed
together, then one way to achieve this is to use a set of property files. The appropriate
property file can be passed in to JMeter using the -q command-line option.
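As a sketch, one such property file and the corresponding command line might look like this (the file name, plan name and values are illustrative):

```shell
# big-test.properties contains, for example:
#   loops=50
#   threads=20
jmeter -n -t testplan.jmx -q big-test.properties
```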
When is a good time to load-test our application (e.g. off-hours or week-ends), bearing in
mind that this may very well crash one or more of our servers?
Does our application have state? If so, how does our application manage it (cookies,
session-rewriting, or some other method)?
17.2 Resources
The following resources will prove very helpful. Bear in mind that if you cannot locate these
resources, you will become these resources. As you already have your work cut out for you, it is
worth knowing who the following people are, so that you can ask them for help if you need it.
17.2.1 Network
Who knows our network topology? If you run into any firewall or proxy issues, this will
become very important. A private testing network (which will therefore have very
low network latency) would also be a very nice thing. Knowing who can set one up for you (if
you feel that this is necessary) will be very useful. If the application doesn't scale as
expected, who can add additional hardware?
17.2.2 Application
Who knows how our application functions? The normal sequence is:
test (low-volume - can we benchmark our application?)
benchmark (the average number of users)
load-test (the maximum number of users)
test destructively (what is our hard limit?)
The test process may progress from black-box testing to white-box testing (the difference is
that the first requires no knowledge of the application [it is treated as a "black box"] while
the second requires some knowledge of the application). It is not uncommon to discover
problems with the application during this process, so be prepared to defend your work.
For Windows, Windows XP Professional should be a minimum (the others do not multi-thread
past 50-60 connections, and you probably anticipate more users than that).
Good free platforms include the linuxes, the BSDs, and Solaris Intel. If you have a little
more money, there are commercial linuxes. If you can justify it, a commercial Unix (Solaris,
etc) is probably the best choice.
Don't forget JMeter batch mode. This can be useful if you have a powerful server that
supports Java but perhaps does not have a fast graphics implementation, or where you need
to login remotely. Batch (non-GUI) mode can reduce the network traffic compared with
using a remote display or client-server mode. The batch log file can then be loaded into
JMeter on a workstation for analysis, or you can use CSV output and import the data into a
spreadsheet.
17.4 Tools
The following tools will all prove useful. It is definitely worthwhile to become familiar with them.
This should include trying them out, and reading the appropriate documentation (man-pages, info-
files, application --help messages, and any supplied documentation).
17.4.1 ping
This can be used to establish whether or not you can reach your target site. Options can be
specified so that 'ping' provides the same type of route reporting as 'traceroute'.
17.4.2 nslookup/dig
While the user will normally use a human-readable internet address, you may wish to avoid
the overhead of DNS lookups when performing benchmarking/load-testing. These can be
used to determine the unique address (dotted quad) of your target site.
17.4.3 traceroute
If you cannot "ping" your target site, this may be used to determine the problem (possibly a
firewall or a proxy). It can also be used to estimate the overall network latency (running
locally should give the lowest possible network latency - remember that your users will be
running over a possibly busy internet). Generally, the fewer hops the better.
17.5.2 HttpUnit
This is worth a look. It is a library (and therefore of more interest to developers) that can be
used to perform HTTP tests/benchmarks. It is intended to be used instead of a web browser
(therefore no GUI) in conjunction with JUnit .
17.5.4 JMeter
If you have non-standard requirements, then this solution offers an open-source community
to provide them (of course, if you are reading this, you are probably already committed to
this one). This product is free to evolve along with your requirements.
Well, Perl might be a very good choice except that the Benchmark package seems to give
fairly fuzzy results. Also, simulating multiple users with Perl is a tricky proposition
(multiple connections can be simulated by forking many processes from a shell script, but
these will not be threads, they will be processes). However, the Perl community is very
large. If you find that someone has already written something that seems useful, this could
be a very good solution.
C, of course, is a very good choice (check out the Apache ab tool). But be prepared to write
all of the custom networking, threading, and state management code that you will need to
benchmark your application.
Java gives you (for free) the custom networking, threading, and state management code that
you will need to benchmark your application. Java is aware of HTTP, FTP, and HTTPS - as
well as RMI, IIOP, and JDBC (not to mention cookies, URL-encoding, and URL-rewriting).
In addition Java gives you automatic garbage-collection, and byte-code level security.
And once Microsoft moves to a CLR (common language run-time) a Windows Java solution
will not be any slower than any other type of solution on the Windows platform.
18.1 Samplers
o FTP Request
o HTTP Request
o JDBC Request
o Java Request
o SOAP/XML-RPC Request
o WebService(SOAP) Request
o LDAP Request
o BeanShell Sampler
o BSF Sampler
o JSR223 Sampler
o TCP Sampler
o JMS Publisher
o JMS Subscriber
o JMS Point-to-Point
o JUnit Request
o Test Action
o SMTP Sampler
18.2 Logic Controllers
o Loop Controller
o Interleave Controller
o Random Controller
o Throughput Controller
o Runtime Controller
o If Controller
o While Controller
o Switch Controller
o ForEach Controller
o Module Controller
o Include Controller
o Transaction Controller
o Recording Controller
18.3 Listeners
o Sample Result Save Configuration
o Graph Results
o Spline Visualizer
o Assertion Results
o Aggregate Report
o Monitor Results
o Aggregate Graph
o Mailer Visualizer
o BeanShell Listener
o Summary Report
o BSF Listener
o JSR223 Listener
18.4 Configuration Elements
o Random Variable
o Counter
18.5 Assertions
o Response Assertion
o Duration Assertion
o Size Assertion
o XML Assertion
o BeanShell Assertion
o MD5Hex Assertion
o HTML Assertion
o XPath Assertion
o BSF Assertion
o JSR223 Assertion
o Compare Assertion
o SMIME Assertion
18.6 Timers
o Constant Timer
o Synchronizing Timer
o BeanShell Timer
o BSF Timer
o JSR223 Timer
18.7 Pre-Processors
o User Parameters
o BeanShell PreProcessor
o BSF PreProcessor
o JSR223 PreProcessor
18.8 Post-Processors
o Regular Expression Extractor
o XPath Extractor
o BeanShell PostProcessor
o BSF PostProcessor
o JSR223 PostProcessor
18.9 Miscellaneous Features
o WorkBench
o SSL Manager
o Property Display
o Debug Sampler
o Debug PostProcessor
18.10 Reports
o Report Plan
o Report Table
o Report Page
o Line Graph
o Bar Chart
18.1 Samplers
Samplers perform the actual work of JMeter. Each sampler (except Test Action) generates one or
more sample results. The sample results have various attributes (success/fail, elapsed time, data
size etc) and can be viewed in the various listeners.
Latency is set to the time it takes to login (versions of JMeter after 2.3.1).
Control Panel
Parameters
See Also:
Assertions
FTP Request Defaults
The default parser is htmlparser. This can be changed by using the property
"htmlparser.classname" - see jmeter.properties for details.
If you are going to send multiple requests to the same web server, consider using an HTTP
Request Defaults Configuration Element so you do not have to enter the same information
for each HTTP Request.
Or, instead of manually adding HTTP Requests, you may want to use JMeter's HTTP Proxy
Server to create them. This can save you time if you have a lot of HTTP requests or requests
with many parameters.
There is no control over how connections are re-used. When a connection is released
by JMeter, it may or may not be re-used by the same thread.
The API is best suited to single-threaded usage - various settings (e.g. proxy) are
defined via system properties, and therefore apply to all connections.
There is a bug in the handling of HTTPS via a Proxy (the CONNECT is not handled
correctly). See Java bugs 6226610 and 6208335.
Note: the FILE protocol is intended for testing purposes only. It is handled by the same code
regardless of which HTTP Sampler is used.
If the request requires server or proxy login authorization (i.e. where a browser would create
a pop-up dialog box), you will also have to add an HTTP Authorization Manager
Configuration Element. For normal logins (i.e. where the user enters login information in a
form), you will need to work out what the form submit button does, and create an HTTP
request with the appropriate method (usually POST) and the appropriate parameters from the
form definition. If the page uses HTTP, you can use the JMeter Proxy to capture the login
sequence.
In versions of JMeter up to 2.2, only a single SSL context was used for all threads and
samplers. This did not generate the proper load for multiple users. A separate SSL context is
now used for each thread. To revert to the original behaviour, set the JMeter property:
https.sessioncontext.shared=true
JMeter defaults to the SSL protocol level TLS. If the server needs a different level, e.g.
SSLv3, change the JMeter property, for example:
https.default.protocol=SSLv3
JMeter also allows one to enable additional protocols, by changing the property
https.socket.protocols .
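For example (the protocol names must match what the JVM and the server support; this line is an illustration, not a recommended setting):

```
https.socket.protocols=SSLv3 TLSv1
```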
If the request uses cookies, then you will also need an HTTP Cookie Manager . You can add
either of these elements to the Thread Group or the HTTP Request. If you have more than
one HTTP Request that needs authorizations or cookies, then add the elements to the Thread
Group. That way, all HTTP Request controllers will share the same Authorization Manager
and Cookie Manager elements.
If the request uses a technique called "URL Rewriting" to maintain sessions, then see section
6.1 Handling User Sessions With URL Rewriting for additional configuration steps.
Control Panel
Parameters
N.B. when using Automatic Redirection, cookies are only sent for the initial URL. This can
cause unexpected behaviour for web-sites that redirect to a local server. E.g. if
www.example.com redirects to www.example.co.uk. In this case the server will probably
return cookies for both URLs, but JMeter will only see the cookies for the last host, i.e.
www.example.co.uk. If the next request in the test plan uses www.example.com, rather than
www.example.co.uk, it will not get the correct cookies. Likewise, Headers are sent for the
initial request, and won't be sent for the redirect. This is generally only a problem for
manually created test plans, as a test plan created using a recorder would continue from the
redirected URL.
Parameter Handling:
For the POST and PUT method, if there is no file to send, and the name(s) of the
parameter(s) are omitted, then the body is created by concatenating all the value(s) of the
parameters. This allows arbitrary bodies to be sent. The values are encoded if the encoding
flag is set (versions of JMeter after 2.3). See also the MIME Type above for how you can
control the content-type request header that is sent.
For other methods, if the name of the parameter is missing, then the parameter is ignored.
This allows the use of optional parameters defined by variables. (versions of JMeter after
2.3)
Method Handling:
The POST and PUT request methods work similarly, except that the PUT method does not
support multipart requests. The PUT method body must be provided as one of the following:
If you define any parameters with a name in either the sampler or HTTP defaults then nothing
is sent. The GET and DELETE request methods work similarly to each other.
Up to and including JMeter 2.1.1, only responses with the content-type "text/html" were
scanned for embedded resources. Other content-types were assumed to be something other
than HTML. JMeter 2.1.2 introduces a new property HTTPResponse.parsers , which is
a list of parser ids, e.g. htmlParser and wmlParser . For each id found, JMeter checks two
further properties:
See the jmeter.properties file for the details of the settings. If the HTTPResponse.parsers
property is not set, JMeter reverts to the previous behaviour, i.e. only text/html responses
will be scanned.
See Also:
Assertion
Building a Web Test Plan
Building an Advanced Web Test Plan
HTTP Authorization Manager
HTTP Cookie Manager
HTTP Header Manager
HTML Link Parser
HTTP Proxy Server
HTTP Request Defaults
Before using this you need to set up a JDBC Connection Configuration element.
If the Variable Names list is provided, then for each row returned by a Select statement, the
variables are set up with the value of the corresponding column (if a variable name is
provided), and the count of rows is also set up. For example, if the Select statement returns 2
rows of 3 columns, and the variable list is A,,C, then the following variables will be set up:
A_#=2 (number of rows)
A_1=column 1, row 1
A_2=column 1, row 2
C_#=2 (number of rows)
C_1=column 3, row 1
C_2=column 3, row 2
If the Select statement returns zero rows, then the A_# and C_# variables would be set to 0,
and no other variables would be set.
Old variables are cleared if necessary - e.g. if the first select retrieves 6 rows and a second
select returns only 3 rows, the additional variables for rows 4, 5 and 6 will be removed.
Control Panel
Parameters
See Also:
The pull-down menu provides the list of all such implementations found by JMeter in its
classpath. The parameters can then be specified in the table below - as defined by your
implementation. Two simple examples (JavaTest and SleepTest) are provided.
The JavaTest example sampler can be useful for checking test plans, because it allows one to
set values in almost all the fields. These can then be used by Assertions, etc. The fields allow
variables to be used, so the values of these can readily be seen.
Control Panel
The Add/Delete buttons don't serve any
purpose at present.
Parameters
SleepTime is in milliseconds
SleepMask is used to add a "random" element to the time:
totalSleepTime = SleepTime + (System.currentTimeMillis() % SleepMask)
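The formula can be checked with a few lines of ordinary Java (this is just an illustration of the arithmetic, not JMeter code; the fixed "current time" value is arbitrary):

```java
public class SleepTimeDemo {
    // mirrors: totalSleepTime = SleepTime + (System.currentTimeMillis() % SleepMask)
    static long totalSleepTime(long sleepTime, long sleepMask, long nowMillis) {
        return sleepTime + (nowMillis % sleepMask);
    }

    public static void main(String[] args) {
        // with SleepTime=1000 and SleepMask=100, the result always lies in [1000, 1100)
        System.out.println(totalSleepTime(1000, 100, 123456789L));
    }
}
```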
Control Panel
Parameters
An important note on this sampler: it will automatically use the proxy host and port passed
to JMeter on the command line if those fields in the sampler are left blank. If a sampler has
values in the proxy host and port text fields, it will use the ones provided by the user. This
behavior may not be what users expect.
Note: If you are using CSVDataSet, do not check "Memory Cache". If memory cache is
checked, it will not iterate to the next value. That means all the requests will use the first
value.
Make sure you use <soap:Envelope rather than <Envelope. For example:
Control Panel
18.1.7 LDAP Request
This Sampler lets you send different LDAP requests (Add, Modify, Delete and Search) to an
LDAP server.
If you are going to send multiple requests to the same LDAP server, consider using an LDAP
Request Defaults Configuration Element so you do not have to enter the same information
for each LDAP Request.
In the same way, the Login Config Element can also be used to supply the login name and password.
Control Panel
There are two ways to create test cases for testing an LDAP Server.
There are four test scenarios of testing LDAP. The tests are given below:
1. Add Test
1. Inbuilt test :
This will add a pre-defined entry in the LDAP Server and calculate the
execution time. After execution of the test, the created entry will be deleted
from the LDAP Server.
2. User defined test :
This will add the entry in the LDAP Server. The user has to enter all the attributes
in the table. The entries are collected from the table to add. The execution time
is calculated. The created entry will not be deleted after the test.
2. Modify Test
1. Inbuilt test :
This will create a pre-defined entry first, then will modify the created entry in
the LDAP Server, and calculate the execution time. After execution of the
test, the created entry will be deleted from the LDAP Server.
2. User defined test :
This will modify the entry in the LDAP Server. The user has to enter all the
attributes in the table. The entries are collected from the table to modify. The
execution time is calculated. The entry will not be deleted from the LDAP
Server.
3. Search Test
1. Inbuilt test :
This will create the entry first, then will search if the attributes are available.
It calculates the execution time of the search query. At the end of the
execution, the created entry will be deleted from the LDAP Server.
2. User defined test :
This will search for the user-defined entry (Search filter) in the Search base
(again, defined by the user). The entries should be available in the LDAP
Server. The execution time is calculated.
4. Delete Test
1. Inbuilt test :
This will create a pre-defined entry first, then it will be deleted from the
LDAP Server. The execution time is calculated.
2. User defined test :
This will delete the user-defined entry in the LDAP Server. The entries should
be available in the LDAP Server. The execution time is calculated.
Parameters
See Also:
If you are going to send multiple requests to the same LDAP server, consider using an LDAP
Extended Request Defaults Configuration Element so you do not have to enter the same
information for each LDAP Request.
Control Panel
There are nine test operations defined. These operations are given below:
1. Thread bind
Any LDAP request is part of an LDAP session, so the first thing that should be done
is starting a session to the LDAP server. For starting this session a thread bind is
used, which is equal to the LDAP "bind" operation. The user is requested to give a
username (Distinguished name) and password, which will be used to initiate a
session. When no password, or the wrong password, is specified, an anonymous
session is started. Take care: omitting the password will not fail this test, but a wrong
password will.
Parameters
2. Thread unbind
This is simply the operation to end a session. It is equal to the LDAP "unbind"
operation.
Parameters
3. Single bind/unbind
This is a combination of the LDAP "bind" and "unbind" operations. It can be used for
an authentication request/password check for any user. It will open a new session,
just to check the validity of the user/password combination, and end the session
again.
Parameters
4. Rename entry
This is the LDAP "moddn" operation. It can be used to rename an entry, but also for
moving an entry or a complete subtree to a different place in the LDAP tree.
Parameters
5. Add test
This is the ldap "add" operation. It can be used to add any kind of object to the LDAP
server.
Parameters
6. Delete test
This is the LDAP "delete" operation; it can be used to delete an object from the
LDAP tree.
Parameters
7. Search test
This is the LDAP "search" operation, and will be used for defining searches.
Parameters
8. Modification test
This is the LDAP "modify" operation. It can be used to modify an object. It can be
used to add, delete or replace values of an attribute.
Parameters
9. Compare
This is the LDAP "compare" operation. It can be used to compare the value of a
given attribute with some already known value. In reality this is mostly used to check
whether a given person is a member of some group. In such a case you can compare
the DN of the user as a given value, with the values in the attribute "member" of an
object of the type groupOfNames. If the compare operation fails, this test fails with
error code 49.
Parameters
See Also:
The current implementation of the parser only looks at the text within the quotes. Everything
else is stripped out and ignored. For example, the response code is completely ignored by the
parser. In the future, it might be nice to filter out entries that do not have a response code of
200. Extending the sampler should be fairly simple. There are two interfaces you have to
implement:
org.apache.jmeter.protocol.http.util.accesslog.LogParser
org.apache.jmeter.protocol.http.util.accesslog.Generator
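The quoted-text extraction the parser performs can be sketched as follows. This is an illustrative Python sketch of the behaviour just described, not JMeter's actual (Java) implementation; the sample log line is made up.

```python
import re

# Illustrative sketch: keep only the text within the quotes of a
# common-log-format line; everything else (IP, timestamp, response
# code, size) is stripped out and ignored, as described above.
CLF_QUOTED = re.compile(r'"([^"]*)"')

def extract_request(log_line: str) -> str:
    """Return the text between the first pair of double quotes."""
    match = CLF_QUOTED.search(log_line)
    return match.group(1) if match else ""

line = '127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
print(extract_request(line))  # GET /index.html HTTP/1.0
```

Note how the response code (200) never reaches the caller, which is why filtering on it would require extending the parser.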
Control Panel
Parameters
The TCLogParser processes the access log independently for each thread. The
SharedTCLogParser and OrderPreservingLogParser share access to the file, i.e. each thread
gets the next entry in the log.
The SessionFilter is intended to handle Cookies across threads. It does not filter out any
entries, but modifies the cookie manager so that the cookies for a given IP are processed by a
single thread at a time. If two threads try to process samples from the same client IP address,
then one will be forced to wait until the other has completed.
The LogFilter is intended to allow access log entries to be filtered by filename and regex, as
well as allowing for the replacement of file extensions. However, it is not currently possible
to configure this via the GUI, so it cannot really be used.
Parameters
If you are going to send multiple requests to the same LDAP server, consider using an LDAP
Request Defaults Configuration Element so you do not have to enter the same information for
each LDAP Request.
In the same way, the Login Config Element can also be used to supply the login name and password.
Control Panel
There are two ways to create test cases for testing an LDAP server: inbuilt tests and
user-defined tests. Four test scenarios are covered. The tests are given below:
1. Add Test
1. Inbuilt test :
This will add a pre-defined entry in the LDAP Server and calculate the
execution time. After execution of the test, the created entry will be deleted
from the LDAP Server.
2. User defined test :
This will add the entry to the LDAP Server. The user has to enter all the attributes
in the table; the entries are collected from the table to add. The execution time
is calculated. The created entry will not be deleted after the test.
2. Modify Test
1. Inbuilt test :
This will create a pre-defined entry first, then modify the created entry in
the LDAP Server, and calculate the execution time. After execution of the test,
the created entry will be deleted from the LDAP Server.
2. User defined test :
This will modify the entry in the LDAP Server. The user has to enter all the
attributes in the table; the entries are collected from the table to modify. The
execution time is calculated. The entry will not be deleted from the LDAP
Server.
3. Search Test
1. Inbuilt test :
This will create the entry first, then search for it to check whether the attributes
are available. It calculates the execution time of the search query. At the end of the
execution, the created entry will be deleted from the LDAP Server.
2. User defined test :
This will search for the user-defined entry (Search filter) in the Search base (again,
defined by the user). The entries should be available in the LDAP Server. The
execution time is calculated.
4. Delete Test
1. Inbuilt test :
This will create a pre-defined entry first, then it will be deleted from the
LDAP Server. The execution time is calculated.
2. User defined test :
This will delete the user-defined entry in the LDAP Server. The entries should
be available in the LDAP Server. The execution time is calculated.
Parameters
See Also:
If you are going to send multiple requests to the same LDAP server, consider using an LDAP
Extended Request Defaults Configuration Element so you do not have to enter the same
information for each LDAP Request.
Control Panel
See Also:
Tomcat uses the common format for access logs. This means any webserver that uses the
common log format can use the AccessLogSampler. Servers that use the common log format
include Tomcat, Resin, Weblogic, and SunOne. A common log format entry looks like this
(illustrative example):
127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
Control Panel
Parameters
Control Panel
Parameters
Download this example (see Figure 6). In this example, we created a Test Plan that sends two Ant
HTTP requests and two Log4J HTTP requests. We grouped the Ant and Log4J requests by placing
them inside Simple Logic Controllers. Remember, the Simple Logic Controller has no effect on how
JMeter processes the controller(s) you add to it. So, in this example, JMeter sends the requests in
the following order: Ant Home Page, Ant News Page, Log4J Home Page, Log4J History Page. Note,
the File Reporter is configured to store the results in a file named "simple-test.dat" in the current
directory.
Figure 6 Simple Controller Example
Control Panel
Parameters
Looping Example
Download this example (see Figure 4). In this example, we created a Test Plan that sends a
particular HTTP Request only once and sends another HTTP Request five times.
We configured the Thread Group for a single thread and a loop count value of one. Instead
of letting the Thread Group control the looping, we used a Loop Controller. You can see
that we added one HTTP Request to the Thread Group and another HTTP Request to a Loop
Controller. We configured the Loop Controller with a loop count value of five.
JMeter will send the requests in the following order: Home Page, News Page, News Page,
News Page, News Page, and News Page. Note, the File Reporter is configured to store the
results in a file named "loop-test.dat" in the current directory.
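The looping logic of this example can be sketched as follows. This is an illustrative simulation of the request order, assuming the sampler names from the plan above; it is not JMeter code.

```python
# Sketch of the order in which the example's requests are issued:
# one sampler sits directly in the Thread Group, the other inside a
# Loop Controller configured with its own loop count.
def request_order(thread_group_loops: int, loop_controller_loops: int):
    order = []
    for _ in range(thread_group_loops):
        order.append("Home Page")            # sampler directly in the Thread Group
        for _ in range(loop_controller_loops):
            order.append("News Page")        # sampler inside the Loop Controller
    return order

print(request_order(1, 5))
```

With one thread-group iteration and a loop count of five, this yields the order listed above: Home Page followed by five News Page requests.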
The Once Only Controller will now always execute during the first iteration of any looping
parent controller. Thus, if the Once Only Controller is placed under a Loop Controller
specified to loop 5 times, then the Once Only Controller will execute only on the first
iteration through the Loop Controller (i.e., once every 5 times). Note this means the Once Only
Controller will still behave as previously expected if put under a Thread Group (runs only
once per test), but now the user has more flexibility in the use of the Once Only Controller.
For testing that requires a login, consider placing the login request in this controller since
each thread only needs to login once to establish a session.
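The Once Only semantics under a looping parent can be sketched like this. The sampler names ("Login", "Page") are illustrative; this simulates the described behaviour, not JMeter internals.

```python
def run_with_once_only(loop_count: int):
    """Simulate a Once Only Controller child under a looping parent:
    the Once Only child fires on the first iteration only, while an
    ordinary sibling sampler runs on every iteration."""
    executed = []
    for iteration in range(1, loop_count + 1):
        if iteration == 1:              # Once Only fires only on iteration 1
            executed.append("Login")
        executed.append("Page")         # ordinary sampler runs every time
    return executed

print(run_with_once_only(5))
```

This is why a login request placed in the controller is sent exactly once per thread, however many times the parent loops.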
Control Panel
Parameters
Download this example (see Figure 5). In this example, we created a Test Plan that has two
threads that send HTTP request. Each thread sends one request to the Home Page, followed by
three requests to the Bug Page. Although we configured the Thread Group to iterate three times,
each JMeter thread only sends one request to the Home Page because this request lives inside a
Once Only Controller.
Each JMeter thread will send the requests in the following order: Home Page, Bug Page,
Bug Page, Bug Page. Note, the File Reporter is configured to store the results in a file
named "loop-test.dat" in the current directory.
Control Panel
Parameters
Download this example (see Figure 1). In this example, we configured the Thread Group to have
two threads and a loop count of five, for a total of ten requests per thread. See the table below for
the sequence JMeter sends the HTTP Requests.
JMeter starts over and sends the first HTTP Request, which is the News Page.
[Table: the per-iteration request sequence (News Page, Log Page, FAQ Page, ...) did not survive the original layout.]
Download another example (see Figure 2). In this example, we configured the Thread Group to
have a single thread and a loop count of eight. Notice that the Test Plan has an outer Interleave
Controller with two Interleave Controllers inside of it.
Figure 2 - Interleave Controller Example 2
The outer Interleave Controller alternates between the two inner ones. Then, each inner Interleave
Controller alternates between each of the HTTP Requests. Each JMeter thread will send the
requests in the following order: Home Page, Interleaved, Bug Page, Interleaved, CVS Page,
Interleaved, and FAQ Page, Interleaved. Note, the File Reporter is configured to store the results in
a file named "interleave-test2.dat" in the current directory.
18.2.5 Random Controller
The Random Logic Controller acts similarly to the Interleave Controller, except that instead
of going in order through its sub-controllers and samplers, it picks one at random at each
pass.
Control Panel
Parameters
Control Panel
Parameters
The Throughput Controller allows the user to control how often it is executed. There are two
modes - percent execution and total executions. Percent executions causes the controller to
execute a certain percentage of the iterations through the test plan. Total executions causes
the controller to stop executing after a certain number of executions have occurred. Like the
Once Only Controller, this setting is reset when a parent Loop Controller restarts.
Control Panel
Parameters
Control Panel
Parameters
18.2.9 If Controller
The If Controller allows the user to control whether the test elements below it (its children)
are run or not.
Prior to JMeter 2.3RC3, the condition was evaluated for every runnable element contained in
the controller. This sometimes caused unexpected behaviour, so 2.3RC3 was changed to
evaluate the condition only once on initial entry. However, the original behaviour is also
useful, so versions of JMeter after 2.3RC4 have an additional option to select the original
behaviour.
Versions of JMeter after 2.3.2 allow the script to be processed as a variable expression,
rather than requiring Javascript. It was always possible to use functions and variables in the
Javascript condition, so long as they evaluated to "true" or "false"; now this can be done
without the overhead of using Javascript as well. For example, previously one could use the
condition: ${__jexl(${VAR} == 23)} and this would be evaluated as true/false, the result
would then be passed to Javascript which would then return true/false. If the Variable
Expression option is selected, then the expression is evaluated and compared with "true",
without needing to use Javascript. Also, variable expressions can return any value, whereas
the Javascript condition must return "true"/"false" or an error is logged.
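The difference between the two evaluation modes can be sketched as follows. This is an illustrative model of the rules just described (compare with "true" in Variable Expression mode; strict "true"/"false" in Javascript mode), not JMeter's implementation.

```python
def if_controller_runs(condition_result: str, variable_expression: bool) -> bool:
    """Decide whether the If Controller runs its children, given the
    string result of evaluating the condition (sketch of the rules above)."""
    if variable_expression:
        # Variable Expression mode: any value is allowed; children run
        # only if the result equals "true".
        return condition_result == "true"
    # Javascript mode: the condition must return exactly "true" or "false",
    # otherwise an error is logged (modelled here as an exception).
    if condition_result not in ("true", "false"):
        raise ValueError('Javascript condition must return "true" or "false"')
    return condition_result == "true"
```

So `${__jexl(${VAR} == 23)}` evaluating to "true" passes either mode, but a condition yielding, say, "23" only makes sense in Variable Expression mode (where it simply does not run the children).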
Control Panel
Parameters
Examples (Javascript):
${COUNT} < 10
"${VAR}" == "abcd"
${JMeterThread.last_sample_ok} (check if last sample succeeded)
If there is an error interpreting the code, the condition is assumed to be false, and a message
is logged in jmeter.log.
Example (Variable Expression):
${RESULT}
18.2.10 While Controller
The While Controller runs its children until the condition is "false".
For example:
${VAR} - where VAR is set to false by some other test element
${__javaScript(${C}==10)}
${__javaScript("${VAR2}"=="abcd")}
${__P(property)} - where property is set to "false" somewhere else
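The While Controller's loop rule can be sketched as follows: run the children repeatedly while the condition string is anything other than "false". This is an illustrative simulation; the condition function here is made up.

```python
def while_controller(children, condition) -> int:
    """Run children while condition() is not the string "false" (sketch
    of the While Controller rule above). Returns the number of passes."""
    passes = 0
    while condition() != "false":
        for child in children:
            child()
        passes += 1
    return passes

# Example condition: a "variable" that becomes "false" after three reads,
# like ${VAR} being set to false by some other test element.
state = {"reads": 0}
def condition():
    state["reads"] += 1
    return "false" if state["reads"] > 3 else "true"

samples = []
print(while_controller([lambda: samples.append("sample")], condition))
```

Three passes run before the condition turns "false" and the loop exits.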
Control Panel
Parameters
Note: In versions of JMeter after 2.3.1, the switch value can also be a name.
If the switch value is out of range, it will run the zeroth element, which therefore acts as the
default for the numeric case. It also runs the zeroth element if the value is the empty string.
If the value is non-numeric (and non-empty), then the Switch Controller looks for the
element with the same name (case is significant). If none of the names match, then the
element named "default" (case not significant) is selected. If there is no default, then no
element is selected, and the controller will not run anything.
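The selection rules above can be sketched in a few lines. This is an illustrative model of the described behaviour (numeric index, out-of-range and empty falling back to the zeroth element, case-sensitive name match, case-insensitive "default"), not JMeter code.

```python
def switch_target(switch_value: str, names: list):
    """Return the index of the child the Switch Controller would run,
    or None if nothing is selected (sketch of the rules above)."""
    if switch_value == "":
        return 0                                   # empty value -> zeroth element
    try:
        n = int(switch_value)
        return n if 0 <= n < len(names) else 0     # out of range -> zeroth element
    except ValueError:
        pass
    if switch_value in names:                      # name match is case-sensitive
        return names.index(switch_value)
    for i, name in enumerate(names):               # fall back to "default" (any case)
        if name.lower() == "default":
            return i
    return None                                    # no default: run nothing

print(switch_target("Second", ["first", "second", "Default"]))  # 2
```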
Control Panel
Parameters
When the return variable is given as "returnVar", the collection of samplers and controllers
under the ForEach controller will be executed 4 consecutive times, with the return variable
having the respective above values, which can then be used in the samplers.
It is especially suited for running with the regular expression post-processor. This can
"create" the necessary input variables out of the result data of a previous request. By omitting
the "_" separator, the ForEach Controller can be used to loop through the groups by using the
input variable refName_g, and can also loop through all the groups in all the matches by
using an input variable of the form refName_${C}_g, where C is a counter variable.
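The iteration over prefixed variables can be sketched as follows. This is an illustrative simulation of the ForEach mechanism described above (the variable names match the later example); a real child sampler would read `${returnVar}` on each pass.

```python
def for_each(variables: dict, input_prefix: str, return_var: str):
    """Run the loop body once per inputPrefix_1, inputPrefix_2, ...
    variable, copying each value into the return variable (sketch)."""
    outputs = []
    index = 1
    while f"{input_prefix}_{index}" in variables:
        variables[return_var] = variables[f"{input_prefix}_{index}"]
        outputs.append(variables[return_var])   # stand-in for the child samplers
        index += 1
    return outputs

jmeter_vars = {"inputVar_1": "a", "inputVar_2": "b", "inputVar_3": "c"}
print(for_each(jmeter_vars, "inputVar", "returnVar"))  # ['a', 'b', 'c']
```

The loop stops at the first missing numbered variable, so a Regular Expression Extractor that found no matches produces zero passes.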
Control Panel
Parameters
ForEach Example
Download this example (see Figure 7). In this example, we created a Test Plan that sends a
particular HTTP Request only once and sends another HTTP Request to every link that can be found
on the page.
Figure 7 - ForEach Controller Example
We configured the Thread Group for a single thread and a loop count value of one. You can
see that we added one HTTP Request to the Thread Group and another HTTP Request to the
ForEach Controller.
After the first HTTP request, a Regular Expression Extractor is added, which extracts all the
HTML links out of the returned page and puts them in the inputVar variable.
In the ForEach loop, an HTTP sampler is added which requests all the links that were
extracted from the first returned HTML page.
ForEach Example
Here is another example you can download. This has two Regular Expressions and ForEach
Controllers. The first RE matches, but the second does not match, so no samples are run by the
second ForEach Controller
The Thread Group has a single thread and a loop count of two.
Sample 1 uses the JavaTest Sampler to return the string "a b c d".
The Regex Extractor uses the expression (\w)\s which matches a letter followed by a space,
and returns the letter (not the space). Any matches are prefixed with the string "inputVar".
The ForEach Controller extracts all variables with the prefix "inputVar_", and executes its
sample, passing the value in the variable "returnVar". In this case it will set the variable to
the values "a" "b" and "c" in turn.
The For 1 Sampler is another Java Sampler which uses the return variable "returnVar" as
part of the sample Label and as the sampler Data.
Sample 2, Regex 2 and For 2 are almost identical, except that the Regex has been changed to
"(\w)\sx", which clearly won't match. Thus the For 2 Sampler will not be run.
A test plan fragment consists of a Controller and all the test elements (samplers etc)
contained in it. The fragment can be located in any Thread Group, or on the WorkBench . If
the fragment is located in a Thread Group, then its Controller can be disabled to prevent the
fragment being run except by the Module Controller. Or you can store the fragments in a
dummy Thread Group, and disable the entire Thread Group.
There can be multiple fragments, each with a different series of samplers under them. The
module controller can then be used to easily switch between these multiple test cases simply
by choosing the appropriate controller in its drop down box. This provides convenience for
running many alternate test plans quickly and easily.
A fragment name is made up of the Controller name and all its parent names. For example:
Any fragments used by the Module Controller must have a unique name , as the name is
used to find the target controller when a test plan is reloaded. For this reason it is best to
ensure that the Controller name is changed from the default - as shown in the example above
- otherwise a duplicate may be accidentally created when new elements are added to the test
plan.
Control Panel
The Module Controller should not be used
with remote testing or non-gui testing in
conjunction with Workbench components
since the Workbench test elements are not
part of test plan .jmx files. Any such test will
fail.
Parameters
If the test uses a Cookie Manager or User Defined Variables, these should be placed in the
top-level test plan, not the included file, otherwise they are not guaranteed to work.
If the file cannot be found at the location given by prefix+filename, then the controller
attempts to open the fileName relative to the JMX launch directory (versions of JMeter after
2.3.4).
Control Panel
Parameters
For JMeter versions after 2.3, there are two modes of operation
The generated sample time includes all the times for the nested samplers, and any timers
etc. Depending on the clock resolution, it may be slightly longer than the sum of the
individual samplers plus timers. The clock might tick after the controller recorded the start
time but before the first sample starts. Similarly at the end.
The generated sample is only regarded as successful if all its sub-samples are successful.
In parent mode, the individual samples can still be seen in the Tree View Listener, but no
longer appear as separate entries in other Listeners. Also, the sub-samples do not appear in
CSV log files, but they can be saved to XML files.
Control Panel
Parameters
Control Panel
Parameters
18.3 Listeners
Most of the listeners perform several roles in addition to "listening" to the test results. They
also provide means to view, save, and read saved test results.
Note that Listeners are processed at the end of the scope in which they are found.
The saving and reading of test results is generic. The various listeners have a panel whereby
one can specify the file to which the results will be written (or read from). By default, the
results are stored as XML files, typically with a ".jtl" extension. Storing as CSV is the most
efficient option, but is less detailed than XML (the other available option).
Listeners do not process sample data in non-GUI mode, but the raw data will be saved
if an output file has been configured. In order to analyse the data generated by a non-GUI
test run, you need to load the file into the appropriate Listener.
Versions of JMeter up to 2.3.2 used to clear any current data before loading the new file.
This is no longer done, thus allowing files to be merged . If the previous behaviour is
required, use the menu item Run/Clear (Ctrl+Shift+E) or Run/Clear All (Ctrl+E) before
loading the file.
Results can be read from XML or CSV format files. When reading from CSV results files,
the header (if present) is used to determine which fields are present. In order to interpret a
header-less CSV file correctly, the appropriate properties must be set in
jmeter.properties.
Listeners can use a lot of memory if there are a lot of samples. Most of the listeners
currently keep a copy of every sample in their scope, apart from the following. These
Listeners no longer need to keep copies of every single sample: instead, samples with
the same elapsed time are aggregated. Less memory is now needed, especially
if most samples only take a second or two at most.
Aggregate Report
Aggregate Graph
Distribution Graph
To minimise the amount of memory needed, use the Simple Data Writer, and use the CSV
format.
The figure below shows an example of the result file configuration panel
Parameters
Note that cookies, method and the query string are saved as part of the "Sampler Data"
option.
Control Panel
Control Panel
18.3.3 Graph Results
The Graph Results listener generates a simple graph that plots all sample times. Along the
bottom of the graph, the current sample (black), the current average of all samples(blue), the
current standard deviation (red), and the current throughput rate (green) are displayed in
milliseconds.
The throughput number represents the actual number of requests/minute the server handled.
This calculation includes any delays you added to your test and JMeter's own internal
processing time. The advantage of doing the calculation like this is that this number
represents something real - your server in fact handled that many requests per minute, and
you can increase the number of threads and/or decrease the delays to discover your server's
maximum throughput. Whereas if you made calculations that factored out delays and
JMeter's processing, it would be unclear what you could conclude from that number.
Control Panel
The following table briefly describes the items on the graph. Further details on the precise
meaning of the statistical terms can be found on the web - e.g. Wikipedia - or by consulting a
book on statistics.
The individual figures at the bottom of the display are the current values. "Latest Sample" is
the current elapsed sample time, shown on the graph as "Data".
The graph is automatically scaled to fit within the window. This needs to be borne in mind
when comparing graphs.
Control Panel
Control Panel
See Also:
Response Assertion
There are several ways to view the response, selectable by a drop-down box at the bottom of
the left hand panel.
HTML
HTML (download embedded resources)
JSON
Regexp Tester
Text
XML
Additional renderers can be created. The class must implement the interface
org.apache.jmeter.visualizers.ResultRenderer and/or extend the abstract class
org.apache.jmeter.visualizers.SamplerResultTab , and the compiled code must be
available to JMeter (e.g. by adding it to the lib/ext directory).
The default "Text" view shows all of the text contained in the response. Note that this will
only work if the response content-type is considered to be text. If the content-type begins
with any of the following, it is considered as binary, otherwise it is considered to be text.
image/
audio/
video/
If there is no content-type provided, then the content will not be displayed in any of the
Response Data panels. You can use Save Responses to a file to save the data in this case.
Note that the response data will still be available in the sample result, so can still be accessed
using Post-Processors.
If the response data is larger than 200K, then it won't be displayed. To change this limit, set
the JMeter property view.results.tree.max_size . You can also save the entire response to
a file using Save Responses to a file .
The HTML view attempts to render the response as HTML. The rendered HTML is likely to
compare poorly to the view one would get in any web browser; however, it does provide a
quick approximation that is helpful for initial result evaluation. No images etc are
downloaded. If the HTML (download embedded resources) option is selected, the renderer
may download images and style-sheets etc referenced by the HTML.
The XML view will show response in tree style. Any DTD nodes or Prolog nodes will not
show up in tree; however, response may contain those nodes.
The JSON view will show the response in tree style (also handles JSON embedded in
JavaScript).
Most of the views also allow the displayed data to be searched; the result of the search will be
high-lighted in the display above. For example the Control panel screenshot below shows one
result of searching for "Java". Note that the search operates on the visible text, so you may
get different results when searching the Text and HTML views.
The "Regexp Tester" view only works for text responses. It shows the plain text in the upper
panel. The "Test" button allows the user to apply the Regular Expression to the upper panel
and the results will be displayed in the lower panel. For example, the RE (JMeter\w*).*
applied to the current JMeter home page gives the following output:
Match count: 26
Match[1][0]=JMeter - Apache JMeter</title>
Match[1][1]=JMeter
Match[2][0]=JMeter" title="JMeter" border="0"/></a>
Match[2][1]=JMeter
Match[3][0]=JMeterCommitters">Contributors</a>
Match[3][1]=JMeterCommitters
... and so on ...
The first number in [] is the match number; the second number is the group. Group [0] is
whatever matched the whole RE. Group [1] is whatever matched the 1st group, i.e.
(JMeter\w*) in this case. See Figure 9b (below).
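The match/group numbering can be reproduced with the same expression. This sketch runs the RE against made-up input text (the output above came from the actual JMeter home page) to show how [0] is the whole-RE match and [1] is the first group.

```python
import re

# Illustrative input; each line containing "JMeter..." yields one match,
# since .* does not cross newlines.
text = "JMeter - Apache JMeter</title>\nsome other line\nJMeterCommitters page"
matches = list(re.finditer(r"(JMeter\w*).*", text))
for i, m in enumerate(matches, start=1):
    print(f"Match[{i}][0]={m.group(0)}")   # group 0: whatever matched the whole RE
    print(f"Match[{i}][1]={m.group(1)}")   # group 1: whatever matched (JMeter\w*)
```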
Control Panel
The Control Panel (above) shows an example of an HTML display. Figure 9 (below) shows an example
of an XML display.
Figure 9 Sample XML display
The throughput is calculated from the point of view of the sampler target (e.g. the remote
server in the case of HTTP samples). JMeter takes into account the total time over which the
requests have been generated. If other samplers and timers are in the same thread, these will
increase the total time, and therefore reduce the throughput value. So two identical samplers
with different names will have half the throughput of two samplers with the same name. It is
important to choose the sampler names correctly to get the best results from the Aggregate
Report.
Calculation of the Median and 90% Line (90th percentile) values requires additional
memory. For JMeter 2.3.4 and earlier, details of each sample were saved separately, which
meant a lot of memory was needed. JMeter now combines samples with the same elapsed
time, so far less memory is used. However, for samples that take more than a few seconds,
the probability is that fewer samples will have identical times, in which case more memory
will be needed. See the Summary Report for a similar Listener that does not store individual
samples and so needs constant memory.
Label - The label of the sample. If "Include group name in label?" is selected, then the
name of the thread group is added as a prefix. This allows identical labels from
different thread groups to be collated separately if required.
# Samples - The number of samples with the same label
Average - The average time of a set of results
Median - The median is the time in the middle of a set of results. 50% of the samples
took no more than this time; the remainder took at least as long.
90% Line - 90% of the samples took no more than this time. The remaining samples
took at least as long as this. (90th percentile)
Min - The shortest time for the samples with the same label
Max - The longest time for the samples with the same label
Error % - Percent of requests with errors
Throughput - the Throughput is measured in requests per second/minute/hour. The
time unit is chosen so that the displayed rate is at least 1.0. When the throughput is
saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is
saved as 0.5.
Kb/sec - The throughput measured in Kilobytes per second
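A few of these columns can be computed directly from a list of sample times. This is an illustrative sketch of the definitions above (the percentile index is one common convention, not necessarily JMeter's exact algorithm); the sample data is made up, chosen so the throughput matches the 30 requests/minute = 0.5 requests/second CSV example.

```python
def aggregate_row(samples_ms, elapsed_seconds):
    """Compute a few Aggregate Report columns for one label (sketch)."""
    ordered = sorted(samples_ms)
    n = len(ordered)
    median = ordered[n // 2]                        # time in the middle of the set
    line90 = ordered[max(0, (9 * n + 9) // 10 - 1)] # 90% took no more than this
    throughput = n / elapsed_seconds                # CSV stores requests/second
    return {"# Samples": n, "Median": median, "90% Line": line90,
            "Throughput/sec": throughput, "Throughput/min": throughput * 60}

row = aggregate_row([100] * 9 + [500], elapsed_seconds=20)
print(row)   # 30 requests/minute is saved to CSV as 0.5 requests/second
```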
Control Panel
The figure below shows an example of selecting the "Include group name" checkbox.
Sample "Include group name" display
Control Panel
18.3.9 Simple Data Writer
This listener can record results to a file but not to the UI. It is meant to provide an efficient
means of recording data by eliminating GUI overhead. When running in non-GUI mode, the
-l flag can be used to create a data file. The fields to save are defined by JMeter properties.
See the jmeter.properties file for details.
Control Panel
Currently, the primary limitation of the monitor is system memory. A quick benchmark of
memory usage indicates a buffer of 1000 data points for 100 servers would take roughly
10 MB of RAM. On a 1.4 GHz Centrino laptop with 1 GB of RAM, the monitor should be able to
handle several hundred servers.
As a general rule, monitoring production systems should take care to set an appropriate
interval. Intervals shorter than 5 seconds are too aggressive and have a potential of impacting
the server. With a buffer of 1000 data points at 5 second intervals, the monitor would check
the server status 12 times a minute or 720 times an hour. This means the buffer shows the
performance history of each machine for the last hour.
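The interval arithmetic above works out as follows. The figures are the document's own; the calculation is just an illustrative check (note that 1000 points at 5-second intervals actually covers a little more than an hour).

```python
# Arithmetic behind the monitoring guidance above (no JMeter code):
interval_seconds = 5
checks_per_minute = 60 // interval_seconds          # 12 status checks a minute
checks_per_hour = checks_per_minute * 60            # 720 checks an hour
buffer_points = 1000
history_minutes = buffer_points * interval_seconds / 60
print(checks_per_minute, checks_per_hour, round(history_minutes, 1))
```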
For a detailed description of how to use the monitor, please refer to Building a Monitor Test
Plan
Control Panel
18.3.11 Distribution Graph (alpha)
The distribution graph will display a bar for every unique response time. Since the granularity
of System.currentTimeMillis() is 10 milliseconds, the 90% threshold should be within the
width of the graph. The graph will draw two threshold lines: 50% and 90%. What this means
is that 50% of the response times finished between 0 and the line. The same is true of the
90% line.
Several tests with Tomcat were performed using 30 threads for 600K requests. The graph was
able to display the distribution without any problems and both the 50% and 90% line were
within the width of the graph. A performant application will generally produce results that
clump together. A poorly written application that has memory leaks may result in wild
fluctuations. In those situations, the threshold lines may be beyond the width of the graph.
The recommended solution to this specific problem is to fix the webapp so it performs well. If
your test plan produces distribution graphs with no apparent clumping or pattern, it may
indicate a memory leak. The only way to know for sure is to use a profiling tool.
Control Panel
18.3.12 Aggregate Graph
The aggregate graph is similar to the aggregate report. The primary difference is the
aggregate graph provides an easy way to generate bar graphs and save the graph as a PNG
file. By default, the aggregate graph will generate a bar chart 450 x 250 pixels.
Control Panel
18.3.13 Mailer Visualizer
The mailer visualizer can be set up to send email if a test run receives too many failed
responses from the server.
Control Panel
Parameters