The focus of this guide is how to integrate Akamas with Tricentis NeoLoad in order to leverage NeoLoad as a performance testing tool in an Akamas optimization.
To execute a test from NeoLoad and collect the NeoLoad metrics you will need:
NeoLoad 7.0+;
a valid NeoLoad license;
a working NeoLoad test script;
a fully working NeoLoad farm composed of:
NeoLoadWeb (SaaS or on-premises);
a NeoLoad "zone" composed of 1 controller and at least 1 load generator;
the URL (and port, if not the default one) of the NeoLoadWeb server;
whitelisted connections between:
the Akamas server and the NeoLoadWeb server over ports 8080 and 8081 (if NeoLoadWeb is deployed on-premises);
the Akamas server and the internet, if NeoLoadWeb is used as a SaaS platform;
A NeoLoadWeb user with a "tester" role (the "guest" role cannot be used due to limitations in triggering test executions). For compatibility reasons, the user related to the generated token must belong to the default workspace.
A NeoLoadWeb API token created with the above user, so that it inherits the same rights.
At the component level, the NeoLoad integration is straightforward and only requires specifying a single NeoLoad property on the Web Application component. This property is used during the telemetry phase to map the NeoLoad metrics (e.g. transaction response times, error rates, etc.) to the right Akamas component.
The overall configuration is described on the Integrating NeoLoad provider page.
The example below shows a component definition with the appropriate NeoLoadWeb property:
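The snippet is a hedged sketch: the property name and values are illustrative placeholders, and the exact schema is documented on the Integrating NeoLoad provider page.

```yaml
# Hedged sketch: the "neoload" property name and all values are illustrative,
# check the Integrating NeoLoad provider page for the exact schema.
name: webapp
description: The web application exercised by the NeoLoad test
componentType: Web Application
properties:
  neoload: {}
```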
At the telemetry level, the NeoLoad integration relies on the NeoLoadWeb telemetry provider.
In case the NeoLoadWeb telemetry provider is not already installed on the Akamas server, please follow the instructions on the Setup NeoLoad telemetry provider page. After installing the telemetry provider, a NeoLoadWeb telemetry instance can be created by following the instructions on the Create NeoLoadWeb telemetry instance page.
The following is an example of a NeoLoad telemetry instance:
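This is an illustrative sketch, assuming an on-premises NeoLoadWeb endpoint; the field names are not authoritative and should be checked against the Create NeoLoadWeb telemetry instance page.

```yaml
# Hedged sketch: field names and values are illustrative.
provider: NeoLoadWeb
config:
  address: https://neoload-web.mycompany.com:8080   # NeoLoadWeb API endpoint (on-premises example)
  accountToken: <NEOLOADWEB_API_TOKEN>              # token generated by the "tester" user
```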
At the workflow level, the NeoLoad integration requires implementing a dedicated task based on the NeoLoadWeb operator.
The operator configurations required by NeoLoad are described on the NeoLoad operator page.
The workflow configuration changes depending on your NeoLoadWeb deployment, which can be either SaaS or on-premises.
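The following is a hedged sketch of such a task; property names and values are illustrative, and the authoritative schema (including the SaaS vs on-premises differences) is documented on the NeoLoad operator page.

```yaml
# Hedged sketch: property names and values are illustrative.
name: neoload-workflow
tasks:
  - name: Run NeoLoad scenario
    operator: NeoLoadWeb
    arguments:
      accountToken: <NEOLOADWEB_API_TOKEN>   # API token of the "tester" user
      scenarioName: MyScenario               # scenario defined in the NeoLoad project
      controllerZoneId: defaultzone          # zone hosting the controller
      lgZones: defaultzone:1                 # zone plus the number of load generators
```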
For compatibility reasons, the user related to the generated token must belong to the default workspace.
Some of these properties can be retrieved from the NeoLoad application or from NeoLoadWeb, as described below.
scenarioName
open the project in NeoLoad
go to the Runtime tab
pick a scenario from the "scenarios" multi-select
accountToken
access your NeoLoad Web platform
go to profile
hit "generate access token" or retrieve an existing one
Notice: for compatibility reasons, the user related to the generated token must belong to the default workspace.
You need a controller and at least one load generator in place in the zone you have configured in the workflow step.
lgZones
controllerZoneId
access your NeoLoad Web platform
go to the Resources tab
pick the Zone id of an existing zone or create a new one
only for lgZones: append ":" as a suffix, followed by the number of load generators you are going to use during the test (see the sketch below)
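As an illustration, assuming a zone whose ID is defaultzone (a placeholder value) and two load generators, the two properties would look as follows:

```yaml
# Illustrative values only: "defaultzone" is a placeholder zone ID
controllerZoneId: defaultzone
lgZones: defaultzone:2   # zone ID, ":", and the number of load generators to use
```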
You might want to use this Docker container, which can be useful to quickly troubleshoot your NeoLoad integration instead of building and running a full study on Akamas.
Assuming that the NeoLoad scripts are hosted on your instance (i.e. you did not upload them to NeoLoad Web), the following command will run the load test scripts deployed in the folder neoload-project on your NeoLoad farm:
where neoload-project is the name of the mount point that the container is expecting. Please do not change it: Docker will mount your project folder in an internal folder named neoload-project.
The project folder can contain:
A NeoLoad project folder including .nlp, config.zip ...
A single zip file containing the NeoLoad project
A single YAML file containing the NeoLoad project as code
Problem
Test file upload on NeoLoadWeb fails with the following error:
Solution
The user related to the token you are using must belong to the default workspace.
Akamas supports the integration with virtually any load testing tool.
This section describes how to set up the integration with some of the most common options:
The focus of this guide is how to integrate Akamas with MicroFocus LoadRunner in order to leverage LoadRunner as a performance testing tool in an Akamas optimization.
A working LoadRunner >=12.60 or LoadRunner Professional 2020 installation including a Controller, at least one Load Generator, and a valid license (see LoadRunner section below)
The server hosting the LoadRunner Controller should have:
Microsoft Windows Server 2016 or 2019
Powershell version 5.1 or greater
Winrm (see "Install Winrm" section below)
A Windows user (see "Create Windows users" section below):
with read capability on the LoadRunner results folder
with read capability on the LoadRunner test files (VuGen scripts and folders, lrs files, etc.) and their parent folders
and member of the following groups:
Remote Desktop Users
Remote Management Users
Users
The LoadRunner result folder must be configured either as a shared folder on the Controller itself or as a remote share, mounted on the Controller. (see "Create Shared drive" section below)
A working LoadRunner scenario (lrs file)
The server hosting the LoadRunner Controller must be reachable from the Akamas server on the following ports (see the schema below):
5985/TCP or 5986/TCP (see "Install Winrm" section below)
445/TCP (see "Create Shared drive" section below)
139/UDP (see "Create Shared drive" section below)
Windows Remote Management (WinRM) is the Microsoft implementation of WS-Management Protocol, a standard Simple Object Access Protocol (SOAP)-based, firewall-friendly protocol that allows hardware and operating systems from different vendors to interoperate. Akamas leverages WinRM as a general communication mechanism with Windows hosts.
In the integration with LoadRunner, WinRM is used both to invoke the performance test execution and to collect the resulting data. Therefore, the WinRM protocol must be enabled and configured on the Windows host where the LoadRunner Controller resides.
By default, Akamas communicates with WinRM on port 5985 over HTTP and port 5986 over HTTPS (the latter is the recommended option). Both the protocol and port fields need to be set accordingly in the Akamas operator configuration.
Notice that while WinRM is already installed out-of-the-box on every Windows deployment, additional configuration may be required for the Akamas integration to work.
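As an illustration, the corresponding protocol and port settings in the operator's controller section look like the following sketch, where the hostname is a placeholder; the full configuration is shown later in this guide.

```yaml
# Illustrative excerpt of the operator's controller settings
controller:
  hostname: lr_hostname
  protocol: https   # recommended; use http together with port 5985 otherwise
  port: 5986
```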
You can verify the WinRM listeners' communication protocols and ports on the controller server by running the following command in a PowerShell console:
Then check whether a listener with the communication protocol (Transport) and port of your interest is present (both HTTP and HTTPS examples are shown below; the output may vary):
If you have SSH access to the Akamas server, you can also verify the connectivity from Akamas to the controller server on the desired port by running the following commands from the Akamas CLI:
If the output looks like the following one no additional configuration is required:
Otherwise, see below for a complete set of instructions on how to configure WinRM to work with Akamas.
To configure WinRM on the controller server, open a PowerShell console (as Administrator) and run the following command:
Then press "y" to accept the basic configuration as per the following example:
Additional useful documentation is available on Microsoft KB.
Enable HTTPS
The Akamas default and recommended approach is to connect to the LoadRunner Controller using the HTTPS protocol (default port: 5986).
Since the WinRM out-of-the-box configuration does not include an HTTPS listener, you need to set it up as follows:
create a valid certificate (or a self-signed one);
create the listener and bind it to the certificate;
add a Windows firewall inbound rule on port 5986.
To create the certificate on the controller server, open a PowerShell console (as Administrator) and get the host HOST_DNS_NAME by running hostname. Take note of the HOST_DNS_NAME and replace it in the following command line to generate a self-signed certificate:
To get the CERTIFICATE_THUMBPRINT from the certificate manager, run the command certlm.msc. The Certificate Manager will open, then:
open Certificates (Local Computer) → Personal → Certificates
open the certificate issued to HOST_DNS_NAME
copy the CERTIFICATE_THUMBPRINT from the Details tab.
Take note of the CERTIFICATE_THUMBPRINT, substitute it and the HOST_DNS_NAME in the following command line, and execute the resulting command line to create the WinRM HTTPS listener and bind it to the certificate.
To check the newly created WinRM HTTPS listener, run:
Then check if in the output a listener is present with Transport set to HTTPS and Port set to 5986, as in the following example:
To allow inbound connections on port 5986 in the Windows firewall, run:
If you have SSH access to the Akamas server, you can also verify the connectivity from Akamas to the controller server on port 5986 by running the following command from the Akamas CLI:
The output should look like the following one:
Additional useful documentation about How to configure WINRM for HTTPS is available on Microsoft KB.
Akamas requires a Windows user to start the LoadRunner test execution and retrieve the test data. The user account can be a local or a domain one.
All LoadRunner test files (VuGen scripts and folders, lrs files, etc.) and their parent folders must be readable and writable by the user account used by Akamas.
Please take note of the username and password, since you will need to add them to the operator configuration.
User groups
The user must be a member of the following groups:
Remote Desktop Users
Remote Management Users
Users
To add the user to the groups, follow these steps:
Right-click on Start in the Windows taskbar
Open "Computer Management"
Expand "System Tools", then "Local Users and Groups", finally select "Users"
Select the appropriate user, right-click on it, and then select "Properties"
In the dialog window, select the tab "Member Of" and use the "Add" button to add the user to the required groups
A user requires specific permissions to run a program through a WinRm session. To grant them, follow these steps on the controller instance:
Open a PowerShell terminal;
Run the command winrm configSDDL default. An ACL dialog window will open;
Use the "Add" button to add the appropriate user and then select "Read", "Write" and "Execute" permissions.
Close the dialog with the "OK" button.
The results generated by LoadRunner are imported into Akamas using the LoadRunner Telemetry Provider.
In order to upload the data, the results must be available on a CIFS network share which must be configured on the LoadRunner Controller instance. This can either be a shared Controller folder or a network share mounted on the server. Please notice that this target folder should never be the default LoadRunner result folder (usually LoadRunner_home_folder/results).
Notice: the user created in the previous section should have read access only to the shared folder.
Please note the share name you have chosen, since you will need to add it to the operator configuration.
Right-click on the folder, then select Properties
Go to Sharing tab, then select Advanced Sharing
In the opened window, enable Share this folder
In the "Share name" textbox type the name of the share. This is the name of the share over the network
Then click on Permissions, then Add
In the textbox type the name of the user or the group (with the domain if required) that you want to grant access to the share, then click OK
Select the added user (or group) and grant the required permissions
Click OK, OK, and then Close
Open "This PC" from the Start menu, then click on "Map network drive"
In the "Map Network Drive" window, select a suitable drive letter name and enter the remote folder path of the network share provided to you by your storage admin (it should be something with the format \mycompanyshareserver.mycompany\foldername)
Make sure to check "Reconnect at sign-in" and "Connect using different credentials".
Click Finish
In the "Windows Security" window enter the username (with the domain, if required) and the password for the network share and check "Remember my credentials".
Click Ok
According to the standard Akamas implementation process, the following three steps are required to complete the integration with LoadRunner:
create one or more component(s): this requires setting specific LoadRunner properties for each web-application component modeling the service layer;
create one or more telemetry instance(s): this requires information on how to connect to the LoadRunner instance;
create the workflow task: this requires using the LoadRunner operator in a workflow task so that the LoadRunner test execution (via WinRM) is run as part of the workflow.
At the component level, only a single empty property "LoadRunner" needs to be specified for the reference component type "Web Application". For more details about the component, see Integrating LoadRunner Professional provider.
The following represents a working example of a LoadRunner Web Application component:
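The snippet below is a hedged sketch rather than a verbatim example: it assumes the single empty "LoadRunner" property described above, while the name and description values are placeholders.

```yaml
# Hedged sketch: assumes the single empty "LoadRunner" property described above.
name: webapp
description: The web application tested through LoadRunner
componentType: Web Application
properties:
  LoadRunner: {}
```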
Akamas provides a dedicated LoadRunner telemetry provider which can be deployed as a telemetry instance. If the telemetry provider is not available on your Akamas installation (please check the telemetry provider section of the UI), you can install it by following the instructions on the Setup LoadRunner Professional provider page.
After installing the telemetry provider, a LoadRunner telemetry instance can be created by following the instructions on the Create LoadRunner Professional telemetry instance page.
The following is an example of a LoadRunner telemetry instance definition:
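This is an illustrative sketch: the field names and values (host, credentials, share) are placeholders that follow the share and resultFolder conventions discussed in this guide, and the authoritative schema is on the Create LoadRunner Professional telemetry instance page.

```yaml
# Hedged sketch: field names and values are illustrative.
provider: LoadRunner
config:
  hostname: lr_hostname       # Controller host exposing the shared results folder
  username: akamas            # Windows user created in the previous section
  password: mypassword
  shareName: akamas           # network share containing the LoadRunner results
  resultFolder: "{study}"     # must mirror the resultFolder structure used in the workflow
```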
For full details about the telemetry instance please refer to this section in the Akamas documentation.
Check LoadRunner metrics
Please notice that the Akamas integration with LoadRunner has been designed to only collect a subset of the metrics available on the LoadRunner Controller (e.g. transactions_response_time_p99 or other percentile values are not returned). The full list of the supported LoadRunner metrics is described on the LoadRunner Professional telemetry metrics mapping page.
Please verify the availability of a metric before defining your goal, constraints, and returned metrics.
A dedicated task based on the LoadRunner operator needs to be created in a workflow to test the integration. The LoadRunner operator is documented on the LoadRunner Operator page.
The following is an excerpt of a workflow definition featuring a LoadRunner operator configuration:
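The following sketch illustrates the shape of such a task: the property names match the notes below, while host names, credentials, and paths are placeholders; refer to the LoadRunner Operator page for the authoritative schema.

```yaml
# Hedged sketch: host names, credentials, and paths are placeholders.
name: Run LoadRunner scenario
operator: LoadRunner
arguments:
  controller:
    hostname: lr_hostname
    protocol: https                 # or http, together with port 5985
    port: 5986
    username: akamas
    password: mypassword
  scenarioFile: 'C:\\Users\\Administrator\\Desktop\\lr\\scenario.lrs'
  resultFolder: 'C:\\Users\\Administrator\\Desktop\\lr\\results\\{study}\\{exp}\\{trial}'
```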
Please notice that in this excerpt:
All backslashes in the scenarioFile and resultFolder paths should be escaped (\\).
Regarding the resultFolder:
the placeholders between braces ..\{study}\{exp}\{trial} are replaced by Akamas at runtime in order to save the LoadRunner results of each study, experiment, and trial in a separate folder. Since this could lead to huge disk usage in case of long-lasting studies, before starting a study please be sure to:
adopt all the LoadRunner best practices to reduce the size of the results (e.g. logs, snapshots, etc.);
have enough disk space on the LoadRunner server (the space required by a study = expected size of the LoadRunner results * max number of planned experiments);
the full path, up to the ..\results folder, must exist;
the ..\results folder has to be shared, and the share name has to match the shareName provided in the telemetry instance configuration; in our examples, "C:\Users\Administrator\Desktop\lr\results" has to be shared as "akamas" and be available as \\lr_hostname\akamas;
the path \{study}\{exp}\{trial} can be modified or completely removed to save the LoadRunner results over and over in the same folder and save disk space; in any case, this change must be reflected in the resultFolder of the telemetry instance (in our example, where the workflow resultFolder is set to 'C:\\Users\\Administrator\\Desktop\\lr\\results\\{study}', the telemetry instance resultFolder needs to be set to "{study}").
In the controller section, if you set HTTP as the protocol, then port 5985 needs to be used to communicate with the WinRM daemon on the LoadRunner instance. Notice that if only the port is specified, then Akamas will default to the HTTPS protocol to communicate with the LoadRunner instance.
The integration relies on InfluxDB acting as an external analysis server for LoadRunner Enterprise (LRE).
The following schema illustrates the components and networking connections you need to configure to setup the environment:
connection granted between the Akamas server and the LRE server (the one exposing the LRE APIs) on port 443 - this connection is used by Akamas to invoke the LRE APIs over HTTPS;
bi-directional connection granted between the LRE server (the one exposing the LRE APIs) and InfluxDB on port 8086 - this connection is used by LRE to store analysis data into InfluxDB;
connection granted between the Akamas server and InfluxDB on port 8086 - this connection is used by Akamas to collect the LRE analysis data from InfluxDB.
The following assumes that you have already deployed your InfluxDB instance. For more information on how to deploy an InfluxDB instance, please see here for a native deployment or here for a containerized deployment.
Once you have an InfluxDB deployment, you can configure it by running the following commands:
Since Akamas starts importing the LRE analysis data immediately once the execution has ended, there is no need to store data for longer than 1 day, which is the retention value set in the last command.
Please take note of the admin user credentials (akamasinfluxadmin | password in the example above) as you will need them later in order to configure the external analysis server on LRE.
It is recommended to create a dedicated LRE project to store the scripts and tests that you want to run using Akamas. It is also good practice to create a dedicated domain.
This can be done by accessing the administration panel on your LRE installation, whose URL should either look like the following:
or, in a multitenancy-enabled environment, like:
First, navigate to the Projects menu:
then click on the Manage domains button, add a domain and then fill in the required information:
Second, click on the Add project button and add a project:
and fill in the required information (make sure to select the correct domain):
Access the administration panel of your LRE installation and navigate to the Analysis Servers menu:
and then click on the plus button to create a new Analysis Server by filling in the required information:
Make sure that the linked projects section lists the dedicated project you have created in the previous step. This step is required to let LoadRunner publish the performance test metrics to InfluxDB for the selected projects.
Notice: you may want to test that the connection to InfluxDB is working correctly by clicking on the Test connection button.
It is recommended to reserve a dedicated user for executing performance tests that you want to run using Akamas.
To create this user please access the administration panel of your LRE installation, then navigate to the Users menu:
then click on the plus button to add a user by filling in the required information:
Please notice that:
this user must be associated with the project created before and it must have the Performance tester role
this user does not need to have any special admin privilege
As a final step on the LRE environment, you need to retrieve the Test Identifier (ID) and the Test Set associated with the performance tests that will be executed by Akamas.
In the following, it is assumed that you already have a test scenario defined in your LRE environment that Akamas will execute as part of an optimization study.
The test ID and test set can be retrieved from the LRE Loadtest panel, which you can access through a link that looks similar to:
or, for a multitenancy-enabled environment:
You can retrieve the ID by selecting the Test Management menu:
and then by clicking on the test that you want to execute: the ID is displayed next to the test name:
You can also retrieve the test set from the test details page: the test set is displayed in the upper right corner of the screen:
At this point, your LoadRunner Enterprise is ready to be integrated with Akamas.
To leverage the integration with LoadRunner Enterprise via InfluxDB, a telemetry instance needs to be created on the Akamas side.
First of all, check whether the telemetry provider for LoadRunnerEnterprise is installed:
Then, create a telemetry instance as follows:
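The snippet below is an illustrative sketch whose values are placeholders to adapt to your environment; the meaning of each field is described right after it.

```yaml
# Hedged sketch: values are placeholders to adapt to your environment.
provider: LoadRunnerEnterprise
config:
  address: influxdb.mycompany.com   # FQDN of the server hosting InfluxDB
  port: 8086
  username: akamasinfluxadmin       # InfluxDB credentials created earlier
  password: password
  database: akamas                  # hypothetical name of the InfluxDB database created earlier
```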
where:
address: the FQDN of the server hosting your InfluxDB instance
port: the port where InfluxDB is running
username and password: the credentials of the InfluxDB schema created in the previous steps
database: the name of the InfluxDB database schema created in the previous steps.
A workflow needs to be created for your specific offline optimization study by leveraging the LoadRunnerEnterprise operator to trigger the execution of a performance test.
The following only represents an example of a simple workflow that you can use to test your LRE integration. It contains just one task that triggers the execution of the specified performance test on LoadRunner Enterprise:
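The sketch below illustrates the expected structure; all values are placeholders (except the default tenantID mentioned below), and the parameters are described right after it.

```yaml
# Hedged sketch: all values are placeholders except the default tenantID mentioned below.
name: Run LRE test
operator: LoadRunnerEnterprise
arguments:
  address: https://lre.mycompany.com     # base address of the LRE farm, without tenant or path
  username: akamas
  password: mypassword
  domain: AKAMAS                         # domain created in the LRE admin panel
  project: akamas-project                # project created in the LRE admin panel
  tenantID: fa128c06-5436-413d-9cfa-9f04bb738df3   # default tenant; skip if multitenancy is not enabled
  testId: 1                              # test ID retrieved in the previous steps
  testSet: akamas-test-set               # test set retrieved in the previous steps
  timeSlot: 30m                          # illustrative format; must cover the test duration
  verifySSL: false                       # keep SSL validation on unless LRE uses a self-signed certificate
```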
where:
address: the base address of your LRE farm, where the tenant and any other URL or path parameters have been removed
username and password: the credentials that you have previously created in the LRE admin panel
domain and project: the domain and the project you have previously created in the LRE admin panel
tenantID: the ID of the tenant your project and user belong to - in case multitenancy is not enabled on your LRE environment, you can skip this parameter or set it to the default value, i.e. fa128c06-5436-413d-9cfa-9f04bb738df3
testId: the test ID of the test that will be executed by Akamas (you should have already identified it in the previous steps)
testSet: the test set related to the test specified by testId (you should have already identified it in the previous steps)
timeSlot: the amount of time that LRE will reserve for running your test; it must be greater than or equal to the test duration
verifySSL: a flag to enable or ignore SSL validation when connecting to the LRE APIs - this flag is especially useful if your LRE environment exposes APIs over HTTPS with a self-signed certificate.