Leverage JMeter in Appian Load Testing

Apache JMeter is an open-source, Java-based performance testing tool that can be used to simulate user load on a web application. This guide explains how to use JMeter to implement and run test scripts for Appian applications as part of the Appian Performance Testing Methodology. It is important to note that test script implementation and execution are only two steps in that methodology; a test plan should be defined before implementation begins.

Recording New Scenarios

JMeter scripts are created by recording activity in the browser or mobile app using an HTTP(S) Test Script Recorder. Recording and parameterizing test scenarios is the primary task of test implementation.

JMeter for Appian

It is highly recommended to start by installing the JMeter for Appian utility. This is an add-on for JMeter's Test Script Recorder that automatically extracts standard dynamic values that are passed back and forth when a user interacts with Appian. This saves a lot of time when parameterizing your recorded scripts.

JMeter Properties

The following properties store cookies as JMeter variables so that test scripts can reference client cookie values. Ensure they are set to the values below in the user.properties file.

  • CookieManager.save.cookies=true
  • CookieManager.allow_variable_cookies=true

Pre-Recording Proxy Configurations

Before recording, the JMeter Test Script Recorder must be configured. Several additional configuration elements will make the recorded requests easier to convert into repeatable test scenarios.

  1. Update the URL inclusion regular expression pattern of the HTTP(S) Test Script Recorder to match the domain name or IP of the performance test environment (eg: .*mysite\.com.* - note the escaped periods using \.)
  2. Update the HTTP Request Defaults to set the fully-qualified domain name or IP of the performance test environment (eg: mysite.com) and the scheme (http or https). If the Server Name is left blank in all subsequently recorded HTTP samplers, they inherit these default values, which makes it easy to point the scripts at a different Appian environment (see the consolidated example after this list).
  3. Start the HTTP(S) Test Script Recorder
  4. Configure the browser or mobile network proxy settings to point to the JMeter proxy machine and port
    • Browser
      • Open up browser settings and search for "Proxy"
    • Mobile devices
      • The mobile device must have access to the machine running the JMeter proxy (eg: same wireless network, no firewall restrictions)
      • Additional steps are needed when using SSL/HTTPS sites with mobile devices
      • iOS: Settings->Wi-Fi->(connected network)->HTTP Proxy->Manual
      • Android: Not recommended due to inconsistent support for proxies and custom SSL certificates (if using SSL/HTTPS)
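
As a consolidated illustration (mysite.com is a placeholder domain and 8888 is JMeter's default recorder port), the settings from the steps above look like this:

    HTTP(S) Test Script Recorder -> URL Patterns to Include:  .*mysite\.com.*
    HTTP(S) Test Script Recorder -> Port:                     8888
    HTTP Request Defaults        -> Server Name or IP:        mysite.com
    HTTP Request Defaults        -> Protocol:                 https
    Browser/mobile proxy         -> host = <JMeter machine>, port = 8888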

Considerations for Authentication

Once a client authenticates with Appian, a cookie named appianCsrfToken is created. Appian requires this token to be passed as the value of the X-APPIAN-CSRF-TOKEN header on subsequent requests. To access the cookie's value, set the value of this header to ${COOKIE___appianCsrfToken} in JMeter. This is automatically done when leveraging the JMeter for Appian utility. If this token is not passed correctly, Appian will return a 401 Unauthorized response.
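
For example, this can be set in an HTTP Header Manager attached to the recorded requests:

    X-APPIAN-CSRF-TOKEN: ${COOKIE___appianCsrfToken}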

Working With SAIL Interfaces

When scripting user interactions on a SAIL form, certain dynamic values need to be extracted from the previous responses and provided as part of the next request:  

  • cID - The cID of the component being interacted with
  • saveInto - The saveInto identifier of the component being interacted with
  • context - An identifier that references the SAIL context stored in the application server
  • uuid - The unique identifier of the interface instance

Below is an example of a request modeling a user clicking a button widget with all dynamic values parameterized. This is automatically done when leveraging the JMeter for Appian utility.
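
As a sketch (the JMeter variable names below are illustrative placeholders; the request body itself is whatever the recorder captured for the interface):

    HTTP Request: POST to the recorded SAIL re-evaluation URL
      Recorded JSON body, with each dynamic value replaced by a variable:
        context  -> ${SAIL_CONTEXT}
        uuid     -> ${SAIL_UUID}
        cID      -> ${BUTTON_CID}
        saveInto -> ${BUTTON_SAVE_INTO}
      Each variable is populated by a post-processor (eg: Regular Expression
      Extractor) attached to the previous response.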

Working With Tasks

When interacting with an Appian task, the HTTP request path will need to reference its opaque task identifier as outlined below:

/suite/rest/a/task/latest/${OPAQUE_TASK_ID}/form

This can be extracted from the previous HTTP request and parameterized to allow for performance testing playback. 
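
As a sketch (the regular expression and variable name are illustrative, and assume the opaque identifier appears in the preceding response):

    Regular Expression Extractor (attached to the preceding sampler)
      Reference Name:      OPAQUE_TASK_ID
      Regular Expression:  /suite/rest/a/task/latest/([^/]+)/form
      Template:            $1$
      Match No.:           1

    Subsequent HTTP Request path:
      /suite/rest/a/task/latest/${OPAQUE_TASK_ID}/form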

Special Handling for File Uploads

The JMeter proxy server does not capture uploaded files from the HTTP request. Instead it references the file directly from the local filesystem using the path specified in the request. Since most browsers submit just the name of the file (not the full path), the proxy usually can’t find the file to send on to the server and the request fails.

In order to support file uploads when using JMeter, the uploaded file must be located in the directory where JMeter was launched. If JMeter was launched with ApacheJMeter.jar then the uploaded file should be in the same directory as this file. Move or copy files that will be part of the test script to the JMeter launch directory, then select them from that location when choosing the file to upload in the browser.

When recording a file upload, one request will be created to upload the file to the Appian server and another will be created to place the uploaded file in the interface’s appropriate file upload component. The file ID provided in the response of the first request will need to be extracted and referenced in the second request. This is automatically done when leveraging the JMeter for Appian utility.
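
As a sketch (assuming the upload response returns the new file ID in its body; JMeter for Appian handles this extraction automatically):

    Request 1: multipart POST of the file to the recorded Appian upload URL
      Post-processor (eg: Regular Expression Extractor) saves the returned
      file ID into ${UPLOADED_FILE_ID}
    Request 2: the recorded request that places the file in the upload
      component, with the hard-coded file ID replaced by ${UPLOADED_FILE_ID}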

More information on handling mobile uploads can be found in the Special Considerations for Mobile section below.

Running Recorded Scripts

Using JMeter GUI

When tests aren’t working or produce unexpected results, run a quick smoke test in GUI mode. While GUI mode is not recommended for full-load test runs (due to performance concerns), the Summary Report and View Results Tree listeners provide extremely useful visual tools for diagnosing test problems.

The Summary Report listener aggregates all samples for a given step and can be used to identify steps with a high error rate or with abnormal response times.

The View Results Tree listener displays the request and response details for each individual sample which allows you to drill down to the root cause of any errors.

Command Line Configuration

Full performance tests (as opposed to test script development or verification) should be run using Command Line mode (CLI mode).

  • Parameters are passed as JMeter properties using the syntax -J<name of parameter>=<value>. For example, the following command specifies a duration of 1 hour (3600 seconds), a rampup of 1 minute (60 seconds), and an iteration delay of 1 minute (60000 milliseconds). The test name, user load multiplier, think time delay, and think time deviation will use the values saved in the test script.

    c:\apache-jmeter-2.9\bin\jmeter -t "RefApp Test Plan.jmx" -n -JTEST_DURATION=3600 -JRAMPUP_PERIOD=60 -JITERATION_DELAY=60000
  • To run the same test with 10 times the original user load and rampup over 10 minutes, change the values for RAMPUP_PERIOD and USER_MULTIPLIER:

    c:\apache-jmeter-2.9\bin\jmeter -t "RefApp Test Plan.jmx" -n -JTEST_DURATION=3600 -JRAMPUP_PERIOD=600 -JUSER_MULTIPLIER=10 -JITERATION_DELAY=60000
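
These -J arguments only take effect if the test plan reads them as JMeter properties; a common convention (assumed here) is to reference them with the __P function, which falls back to a default value when the property is not set, for example in the Thread Group:

    Ramp-up period (seconds):  ${__P(RAMPUP_PERIOD,60)}
    Duration (seconds):        ${__P(TEST_DURATION,3600)}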

Common Problems

New test scenarios not captured

If no activity appears in the View Results Tree listener under the HTTP(S) Test Script Recorder, then requests are not being sent to JMeter. Correct the proxy settings in the browser or mobile device.

If requests appear in the View Results Tree listener but are not captured in the Recording Controller, then the URL inclusion regular expression is not matching the target site URL. Correct the regular expression in the URL Patterns to Include settings of the HTTP(S) Test Script Recorder.

High Error %

JMeter transactions can fail for several reasons:

  • Server returned an error response code (eg: 500 Internal Server Error, 404 Not Found)
  • Server response timed out
  • Response assertion failed (eg: expected “Submit Details” but got “View Details”)

Errors may indicate a user-facing defect in the application or a problem with the test design. In other cases an error may be expected depending on conditions (eg: empty paging grids return 404 status codes).

The JMeter Summary Report listener and the JMeter Analysis Worksheet both track the error rate for each transaction.

Too few/too many requests

The test design and test parameters should define a relatively consistent number of requests in any given period of time. If the results deviate significantly from the expected output it is a good sign that something went wrong with the test. Double check the test parameters and consider running a GUI smoke test to find the source of the problem.

Uneven request distribution over the test duration

The design of the example script is intended to simulate a random but relatively uniform load over the course of the test period. While some periods of higher than expected load are part of a realistic test, a continuous series of spikes and lulls is unlikely to provide realistic performance results.

In order to visualize the request distribution, plot requests vs time and look for repeating patterns of high and low load. A longer rampup time may help eliminate uneven distribution.

(Charts: even request distribution vs. uneven request distribution over the test duration)

Not enough rampup time

The JMeter Analysis Worksheet automatically filters out results from the rampup period but insufficient rampup time can cause some performance impact to carry over into the test period. The minimum safe rampup time will depend on the speed of the client machine, network, and server. In general, make sure that the rampup time (in seconds) is at least as high as the number of simulated users in the test (eg: a 100 user test should use a rampup time of at least 100 seconds).

Cascading errors/unexpected process flow

Since JMeter simply sends a sequence of scripted requests to the server, its ability to adapt to unexpected responses is limited, and a single failure often results in a series of cascading errors. For example, if a form submission response does not contain the activity id of the next chained form because of an error, the next request will also fail.

These errors are usually caused by changes in the application being tested. Response assertions can be used to validate responses and identify these situations more clearly. Depending on how significant the change is, the affected steps or the entire scenario may need to be re-recorded.
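
As a sketch (the pattern text is illustrative), a Response Assertion attached to a form-related sampler can flag the unexpected response before the errors cascade:

    Response Assertion
      Field to Test:           Text Response
      Pattern Matching Rules:  Contains
      Patterns to Test:        Submit Details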

Special Considerations for Mobile

Uploading Files from Mobile Devices

As described earlier, file uploads require special steps when using the JMeter recorder. The workaround for uploading files is slightly more complex when using a mobile device since the file is on a different device and cannot be easily moved into the JMeter launch directory.

  1. Attempt to upload a file from the mobile form (take a photo, choose a photo, etc)
  2. The request will be captured by JMeter, but the file upload field will display an error on the mobile form
  3. In JMeter, note the name of the file in the newly captured request
  4. Copy the file from the mobile device to the JMeter proxy machine, or use a placeholder file if the actual content of the file is not important
  5. Move the file to the JMeter launch directory and rename it to match the name in the captured request
  6. Submit the mobile form to retry the failed upload

    1. The upload will succeed and the form will be submitted
    2. JMeter will record a second file upload request immediately before the form submission request
  7. Remove the second file upload request from the recorded scenario in JMeter (the original request will work when the script is replayed)

Recording SSL/HTTPS Traffic from Mobile Devices

The JMeter proxy server uses a self-signed SSL certificate to record HTTPS requests. See the JMeter documentation for details.

Most browsers will show a warning message that can be safely ignored in order to proceed with test recording. However, the Appian mobile app doesn’t have the same warning/override option as a browser and automatically rejects invalid SSL certificates.

In order to record test scripts with HTTPS sites, a custom proxy certificate must be generated and installed on the JMeter proxy server and the mobile device. This is recommended only for users familiar with SSL/HTTPS and network concepts.

iOS

These instructions are written using Windows conventions but apply to any OS. The Java JDK 7 or later must be installed.

  1. Generate a custom SSL certificate using keytool

    1. Open a command window and switch to the <jmeter>/bin directory
    2. Rename proxyserver.jks to proxyserver.jks.bak
    3. Run the following command:
      keytool -genkey -keyalg RSA -alias selfsigned -keystore proxyserver.jks -storepass password -validity 999 -keysize 2048 -ext bc
      • Note: The -ext parameter was added in Java 7 and is required to create a trusted certificate. This command can be run on any machine with Java 7+ and the resulting .jks file will still work with earlier Java versions.
    4. Instead of providing a first and last name, enter the hostname of the Appian test environment (eg: www.mysite.com)
      • Use a wildcard to match multiple sites (eg: *.mysite.com)
    5. Answer the remaining questions to generate a self-signed certificate in proxyserver.jks
  2. Install the certificate on the JMeter proxy server

    1. The keytool command above creates proxyserver.jks in the <jmeter>/bin directory, which is where the JMeter proxy looks for it by default
    2. Start the JMeter proxy server
    3. Access the test site in a browser that uses the JMeter proxy
    4. Confirm that the browser shows a warning with the new certificate details
  3. Install the certificate on the mobile device

    1. Run the following command:
      keytool -exportcert -keystore proxyserver.jks -alias selfsigned > proxyserver.crt
    2. Email proxyserver.crt to your iOS device (or transfer it via another method)
    3. Open proxyserver.crt and install it as a trusted root profile on the device
    4. Confirm that the profile is listed and trusted in Settings->General->Profiles
  4. The Appian app should now be able to connect to an HTTPS site while using the JMeter proxy server

Note: If the URL of the test environment changes, the generated certificate may no longer be valid. If this happens, a new certificate will be needed.