Set up Locust telemetry via CSV

Locust (https://locust.io/) is a popular Python-based load-testing tool. If you run your load tests with Locust, you can follow these guidelines to import its metrics (throughput, errors, and response time) into Akamas using the CSV File telemetry provider.

Export test results to CSV

Locust can export the results of a test in a variety of formats, including CSV files.

To generate CSV files from Locust, add the --csv results/results argument to the command line used to invoke the test, as in this example:

locust --headless --users 2 -t 3m --spawn-rate 1 -H http://my-site.io --csv results/results -f test.py

This makes Locust generate several CSV files in the results folder; we are interested in the file named results_stats_history.csv, which contains time series with the core performance metrics.
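For reference, the stats-history file is a plain CSV whose exact columns depend on the Locust version; the excerpt below is purely illustrative (column names and values are examples, not actual output). Note that the Timestamp is a Unix epoch and that early rows may contain N/A values before any request completes:

```
Timestamp,User Count,Type,Name,Requests/s,Failures/s,50%,90%
1700000000,2,,Aggregated,N/A,N/A,N/A,N/A
1700000030,2,,Aggregated,12.5,0.0,110,320
```

The percentile columns (50%, 90%) and the Requests/s and Failures/s columns are the ones mapped to Akamas metrics in the telemetry instance below.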

We also suggest adding the following lines at the beginning of your locustfile to reduce the sampling interval reported in the CSV from the default of 1 second to 30 seconds, as described in the Locust documentation.

import locust.stats
locust.stats.CSV_STATS_INTERVAL_SEC = 30

Preprocess the CSV file

To import the CSV into Akamas we still need a bit of pre-processing to:

  • Convert the timestamp to a more readable format

  • Add a column with the name of the Akamas component

This can be done by running the following script. Make sure to replace application on line 10 with the name of your Web Application component.

#!/bin/bash
cd "$(dirname "$0")"
test_csv=results/results_stats_history.csv

echo 'Formatting locust test'

tr -d '\r' < "$test_csv" > temp && mv temp "$test_csv"
sed -i '/,N\/A/d' "$test_csv"                   # remove lines without metrics
sed -i '1s/$/,COMPONENT/' "$test_csv"           # add component header
sed -i '2,$s/$/,application/' "$test_csv"       # add component value
awk -F, 'NR>1 { $1=strftime("%Y-%m-%d %H:%M:%S", $1); print } NR==1 { print }' OFS=, "$test_csv" > temp && mv temp "$test_csv" # format timestamp

echo 'Locust test formatted'
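If GNU sed and awk are not available on the host running the test, the same pre-processing can be sketched in Python. This is a minimal, hypothetical equivalent of the script above; the component name defaults to application, just like in the bash version:

```python
import csv
import io
import time

def preprocess(raw_csv: str, component: str = "application") -> str:
    """Mirror the bash script: drop rows without metrics (N/A),
    append a COMPONENT column, and convert the epoch timestamp
    to a readable format."""
    rows = list(csv.reader(io.StringIO(raw_csv)))
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow(rows[0] + ["COMPONENT"])          # add component header
    for row in rows[1:]:
        if "N/A" in row:                              # line without metrics
            continue
        # format the epoch timestamp, as the awk strftime call does
        row[0] = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(int(row[0])))
        writer.writerow(row + [component])            # add component value
    return out.getvalue()
```

You would read results_stats_history.csv, pass its contents to preprocess, and write the result back in place.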

You can easily add this as an operator to the Akamas workflow so that it gets executed at the end of every test run, or integrate it into the script that launches your Locust test.
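As an illustration, the script could be invoked from a workflow task using an Executor operator. The sketch below is an assumption: the task name, script path, and host details are placeholders, so check the operator reference for your Akamas version before using it:

```yaml
# Illustrative sketch only — path and host details are placeholders
- name: format-locust-results
  operator: Executor
  arguments:
    command: bash /work/format_results.sh
    host:
      hostname: toolbox
      username: akamas
      key: ./toolbox.key
```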

Set up the telemetry instance

Now you can create a telemetry instance such as the following one to import the metrics.

Save this snippet in a YAML file, editing the following sections:

  • The host, username, and authentication details used to connect to the instance where the CSV file is hosted (lines 7-11)

  • remoteFilePattern, with the path of the CSV file to load on that instance

kind: telemetry-instance
system: system
name: csv
provider: CSV File                                             # this is an instance of the CSV provider
config:
  logLevel: DETAILED                                           # the level of logging
  address: toolbox                                             # the address of the host with the CSV files
  port: 22                                                     # the port used to connect
  authType: key                                                # the authentication method
  username: akamas                                             # the username used to connect
  auth: ./toolbox.key                                          # the authentication credential
  protocol: scp                                                # the protocol used to retrieve the file
  fieldSeparator: ","                                          # the character used as field separator in the CSV files
  remoteFilePattern: /work/results/results_stats_history.csv   # the path of the CSV file to import
  componentColumn: COMPONENT                                   # the header of the column with component names
  timestampColumn: Timestamp                                   # the header of the column with the timestamp
  timestampFormat: yyyy-MM-dd HH:mm:ss                         # the format of the timestamp
metrics:
  - metric: transactions_throughput
    datasourceMetric: Requests/s
  - metric: transactions_response_time_p90
    datasourceMetric: 90%
  - metric: transactions_response_time
    datasourceMetric: 50%
  - metric: transactions_error_throughput
    datasourceMetric: Failures/s

Explore the results

Now you can use the imported metrics in your study goal and constraints and explore them from the UI.

Appendix

Here you can find a collection of sample artifacts to set up a workflow that runs the test and prepares the CSV file, using the toolbox as the target host.
