Apache JMeter is an open-source Java performance-testing tool that can be used to simulate user load on a web application. This guide explains how to use JMeter to implement and run test scripts for Appian applications as part of the Appian Performance Testing Methodology. Note that test script implementation and execution are only two steps in the methodology; a test plan should be defined before implementation begins.
JMeter scripts are created by recording activity in the browser or mobile app using an HTTP(S) Test Script Recorder. Recording and parameterizing test scenarios is the primary task of test implementation.
It is highly recommended to start by installing the JMeter for Appian utility. This is an add-on for JMeter's Test Script Recorder that automatically extracts standard dynamic values that are passed back and forth when a user interacts with Appian. This saves a lot of time when parameterizing your recorded scripts.
The following configuration stores cookies as JMeter variables so that test scripts can reference client cookie values. Ensure it is set in the user.properties file.
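At a minimum, this requires the standard JMeter cookie-saving property shown below; once it is enabled, JMeter exposes each cookie as a variable named COOKIE_<cookie name>. (Check your JMeter version's documentation for any related CookieManager properties.)

CookieManager.save.cookies=true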
Before recording, the JMeter Test Script Recorder must be configured. Several additional configuration elements will make the recorded requests easier to convert into repeatable test scenarios.
Once a client authenticates with Appian, a cookie named __appianCsrfToken is created. Appian requires this token to be passed as the value of the X-APPIAN-CSRF-TOKEN header on subsequent requests. To access the cookie's value in JMeter, set the value of this header to ${COOKIE___appianCsrfToken}. This is handled automatically when leveraging the JMeter for Appian utility. If this token is not passed correctly, Appian returns a 401 Unauthorized response.
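In an HTTP Header Manager, the entry looks like the following (the COOKIE_ variable is available once cookie saving is enabled in user.properties as described above):

X-APPIAN-CSRF-TOKEN: ${COOKIE___appianCsrfToken}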
When scripting user interactions on a SAIL form, certain dynamic values need to be extracted from the previous response and provided as part of the next request.
Below is an example of a request modeling a user clicking a button widget, with all dynamic values parameterized. This parameterization is handled automatically when leveraging the JMeter for Appian utility.
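Because the recorded payload is environment-specific, the sketch below only illustrates the general shape of such a request; the exact SAIL payload differs across Appian versions, and variable names such as ${SAIL_CONTEXT} and ${BUTTON_CID} are placeholders defined by your own extractors, not fixed Appian names.

POST /suite/rest/a/task/latest/${OPAQUE_TASK_ID}/form
X-APPIAN-CSRF-TOKEN: ${COOKIE___appianCsrfToken}

{
  "#t": "UiConfig",
  "context": ${SAIL_CONTEXT},
  "updates": {
    "#t": "SaveRequest?list",
    "#v": [ { "_cId": "${BUTTON_CID}", "value": { "#t": "Text", "#v": "clicked" } } ]
  }
}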
When interacting with an Appian task, the HTTP request path will need to reference its opaque task identifier as outlined below:
/suite/rest/a/task/latest/${OPAQUE_TASK_ID}/form
This can be extracted from the previous HTTP request and parameterized to allow for performance testing playback.
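For example, a Regular Expression Extractor attached to the preceding sampler could be configured along these lines (the expression itself is an assumption; adjust it to match where the identifier actually appears in your recorded response):

Reference Name: OPAQUE_TASK_ID
Regular Expression: rest/a/task/latest/([^/"]+)/form
Template: $1$
Match No.: 1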
The JMeter proxy server does not capture uploaded files from the HTTP request. Instead, it references the file directly from the local filesystem using the path specified in the request. Since most browsers submit just the name of the file (not the full path), the proxy usually can't find the file to send on to the server, and the request fails.
In order to support file uploads when using JMeter, the uploaded file must be located in the directory where JMeter was launched. If JMeter was launched with ApacheJMeter.jar then the uploaded file should be in the same directory as this file. Move or copy files that will be part of the test script to the JMeter launch directory, then select them from that location when choosing the file to upload in the browser.
When recording a file upload, one request will be created to upload the file to the Appian server and another will be created to place the uploaded file in the interface’s appropriate file upload component. The file ID provided in the response of the first request will need to be extracted and referenced in the second request. This is automatically done when leveraging the JMeter for Appian utility.
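A sketch of that extraction, assuming the upload response contains a numeric id field (verify the exact format against your recorded traffic): add a Regular Expression Extractor to the first request, then reference the resulting ${UPLOADED_FILE_ID} variable in the second request's parameters.

Reference Name: UPLOADED_FILE_ID
Regular Expression: "id"\s*:\s*(\d+)
Template: $1$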
More information on handling mobile uploads can be found in the mobile file upload notes later in this guide.
When tests aren't working or produce unexpected results, run a quick smoke test. While GUI mode is not recommended for full-load test runs (due to performance concerns), the Summary Report and View Results Tree listeners are extremely useful visual tools when diagnosing test problems.
The Summary Report listener aggregates all samples for a given step and can be used to identify steps with a high error rate or with abnormal response times.
The View Results Tree listener displays the request and response details for each individual sample which allows you to drill down to the root cause of any errors.
Full performance tests (as opposed to test script development or verification) should be run using Command Line mode (CLI mode).
Parameters are set as program arguments using the syntax -J<name of parameter>=<value>. For example, the following command specifies a duration of 1 hour (3600 seconds), a rampup of 1 minute (60 seconds), and an iteration delay of 1 minute (60000 milliseconds). The test name, user load multiplier, think time delay, and think time deviation will use the values saved in the test script.
c:\apache-jmeter-2.9\bin\jmeter -t "RefApp Test Plan.jmx" -n -JTEST_DURATION=3600 -JRAMPUP_PERIOD=60 -JITERATION_DELAY=60000
To run the same test with 10 times the original user load and rampup over 10 minutes, change the values for RAMPUP_PERIOD and USER_MULTIPLIER:
c:\apache-jmeter-2.9\bin\jmeter -t "RefApp Test Plan.jmx" -n -JTEST_DURATION=3600 -JRAMPUP_PERIOD=600 -JUSER_MULTIPLIER=10 -JITERATION_DELAY=60000
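Inside the test plan, these -J properties are typically read with JMeter's __P function, which substitutes a default value when the property is not supplied on the command line. The field placement below is illustrative; match it to wherever your script reads these values:

Thread Group > Duration (seconds): ${__P(TEST_DURATION,3600)}
Thread Group > Ramp-up period (seconds): ${__P(RAMPUP_PERIOD,60)}
Constant Timer > Thread Delay (milliseconds): ${__P(ITERATION_DELAY,60000)}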
If no activity appears in the View Results Tree listener under the HTTP(S) Test Script Recorder, then requests are not being sent to JMeter. Correct the proxy settings in the browser or on the mobile device.
If requests appear in the View Results Tree listener but are not captured in the Recording Controller, then the URL inclusion regular expression is not matching the target site URL. Correct the regular expression in the URL Patterns to Include settings of the HTTP(S) Test Script Recorder.
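For example, a pattern such as the following captures all Appian /suite/ traffic (adjust it to your server's URL structure):

URL Patterns to Include: .*/suite/.*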
JMeter transactions can fail for several reasons.
Errors may indicate a user-facing defect in the application or a problem with the test design. In other cases, an error may be expected under certain conditions (e.g., empty paging grids return 404 status codes).
The JMeter Summary Report listener and the JMeter Analysis Worksheet both track the error rate for each transaction.
The test design and test parameters should produce a relatively consistent number of requests in any given period of time. If the results deviate significantly from the expected output, it is a strong indication that something went wrong with the test. Double-check the test parameters and consider running a GUI smoke test to find the source of the problem.
The design of the example script is intended to simulate a random but relatively uniform load over the course of the test period. While some periods of higher-than-expected load are part of a realistic test, a continuous series of spikes and lulls is unlikely to produce realistic performance results.
In order to visualize the request distribution, plot requests vs time and look for repeating patterns of high and low load. A longer rampup time may help eliminate uneven distribution.
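If the jmeter-plugins Command-Line Graph Plotting Tool is installed, a distribution chart can be rendered directly from a saved results file; the command below is a sketch using that add-on (results.jtl stands in for your own output file):

JMeterPluginsCMD.bat --generate-png requests-over-time.png --input-jtl results.jtl --plugin-type HitsPerSecond --width 800 --height 600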
The JMeter Analysis Worksheet automatically filters out results from the rampup period, but insufficient rampup time can cause some performance impact to carry over into the test period. The minimum safe rampup time depends on the speed of the client machine, network, and server. In general, make sure that the rampup time (in seconds) is at least as high as the number of simulated users in the test (e.g., a 100-user test should use a rampup time of at least 100 seconds).
Since JMeter simply sends a sequence of scripted requests to the server, its ability to adapt to unexpected responses is limited, which often results in a series of cascading errors. For example, if a form submission response does not contain the activity id of the next chained form because of an error, the next request will also fail.
These errors are usually caused by changes in the application being tested. Response assertions can be used to validate responses and identify these situations more clearly. Depending on how significant the change is, the affected steps or the entire scenario may need to be re-recorded.
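For example, a Response Assertion on a form-load request might be configured as follows; the pattern is a hypothetical label and should be replaced with text that always appears in a correct response:

Field to Test: Text Response
Pattern Matching Rules: Contains
Patterns to Test: Submit Request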
As described earlier, file uploads require special steps when using the JMeter recorder. The workaround for uploading files is slightly more complex when using a mobile device since the file is on a different device and cannot be easily moved into the JMeter launch directory.
Submit the mobile form to retry the failed upload
Remove the second file upload request from the recorded scenario in JMeter (the original request will work when the script is replayed)
The JMeter proxy server uses a self-signed SSL certificate to record HTTPS requests. See the JMeter documentation for details.
Most browsers will show a warning message that can be safely ignored in order to proceed with test recording. However, the Appian mobile app doesn’t have the same warning/override option as a browser and automatically rejects invalid SSL certificates.
In order to record test scripts with HTTPS sites, a custom proxy certificate must be generated and installed on the JMeter proxy server and the mobile device. This is recommended only for users familiar with SSL/HTTPS and network concepts.
These instructions are written for Windows, but the same steps apply on other platforms. JDK 7 or later must be installed.
Generate a custom SSL certificate using keytool
keytool -genkey -keyalg RSA -alias selfsigned -keystore proxyserver.jks -storepass password -validity 999 -keysize 2048 -ext bc
Install the certificate on the JMeter proxy server
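For JMeter versions of this era, one way to point the proxy at the generated keystore is with the standard Java SSL system properties, for example in JMeter's bin/system.properties file (confirm the exact mechanism against your JMeter version's documentation):

javax.net.ssl.keyStore=proxyserver.jks
javax.net.ssl.keyStorePassword=password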
Install the certificate on the mobile device
keytool -exportcert -keystore proxyserver.jks -storepass password -alias selfsigned -file proxyserver.crt
Transfer the generated proxyserver.crt file to the mobile device (for example, by email or USB) and install it through the device's certificate settings.
The Appian app should now be able to connect to an HTTPS site while using the JMeter proxy server.
Note: If the URL of the test environment changes, the generated certificate may no longer be valid. If this happens, a new certificate will need to be generated and installed.