The FileConfigurator operator allows configuring systems tuned by Akamas by interpolating configuration parameters into files on remote machines.
The operator performs the following operations:
It reads an input file from a remote machine containing templates for interpolating the configuration parameters generated by Akamas
It replaces the values of configuration parameters in the input file
It writes the file with replaced configuration parameters on a specified path on another remote machine
Access to remote machines is performed using SFTP (SSH).
The FileConfigurator allows writing templates for configuration parameters in two ways:
specify that a parameter should be interpolated directly:
specify that all parameters of a component should be interpolated:
It is possible to add a prefix or suffix to interpolated configuration parameters by acting at the component-type level:
Notice that any parameter that does not list the FileConfigurator element in its operators attribute is ignored and not written.
In the example above, the parameter x1 will be interpolated with the prefix PREFIX and the suffix SUFFIX; ${value} will be replaced with the actual value of the parameter at each experiment.
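As a concrete illustration, a component-type parameter carrying such a prefix and suffix might be declared as follows. This is a hedged sketch: the `operators`/`FileConfigurator`/`arg` attribute names are assumptions, not verbatim from the reference.

```yaml
# Hypothetical ComponentType fragment; attribute names are illustrative
name: MyComponentType
parameters:
  - name: x1
    operators:
      FileConfigurator:
        # ${value} is replaced with the parameter value at each experiment
        arg: PREFIX${value}SUFFIX
```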
Let's assume we want to apply the following configuration:
where component1 is of type MyComponentType, and MyComponentType is defined as follows:
A template file to interpolate only parameter component1.param1 and all parameters from component2 would look like this:
The file after the configuration parameters are interpolated would look like this:
Note that the file in this example contains a bash command whose arguments are constructed by interpolating configuration parameters. This represents a typical use case for the File Configurator: to construct the right bash commands that will configure a system with the new configuration parameters computed by Akamas.
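To make this concrete, here is a hypothetical template and its interpolated result; the `${component.parameter}` token syntax and the parameter values are illustrative assumptions:

```
# Template file read from the source machine:
java ${component1.param1} ${component2} -jar app.jar

# File written to the target machine, assuming component1.param1
# resolves to -Xms512m and component2 contributes -Xmx2048m -XX:+UseG1GC:
java -Xms512m -Xmx2048m -XX:+UseG1GC -jar app.jar
```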
source and target structures and arguments

Here follows the structure of either the source or target operator argument:
component

The component argument can be used to refer to a component by name and use its properties as the arguments of the operator. If the mapped arguments are already provided to the operator, there is no override. In this case, the operator replaces in the template file only the tokens referring to the specified component; a parameter bound to any other component will cause the substitution to fail.
where the apache-server-1 component is defined as:
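As a hedged sketch of this setup (hostnames, paths, and values are placeholders), the component could expose the properties that map to `source->path` and `target->path` via `sourcePath` and `targetPath`, and the task would then only need the component name:

```yaml
# Hypothetical component definition; values are placeholders
name: apache-server-1
properties:
  hostname: apache1.example.com
  username: ubuntu
  key: /home/user/.ssh/id_rsa
  sourcePath: /templates/httpd.conf.templ
  targetPath: /etc/httpd/conf/httpd.conf
---
# Workflow task referencing the component by name
name: configure-apache
operator: FileConfigurator
arguments:
  component: apache-server-1
```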
The following table describes the arguments of the operator.

Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
source | Object | It should have a structure like the one defined in the next section | No, if the Component whose name is defined in component has properties that map to the ones defined within source | | Information relative to the source/input file to be used to interpolate optimal configuration parameters discovered by Akamas |
target | Object | It should have a structure like the one defined in the next section | No, if the Component whose name is defined in component has properties that map to the ones defined within target | | Information relative to the target/output file to be used to interpolate optimal configuration parameters discovered by Akamas |
component | String | It should match the name of an existing Component of the System under test | No | | The name of the Component whose properties can be used as arguments of the operator |
ignoreUnsubstitutedTokens | Boolean | | No | False | Behavior of the operator regarding leftover tokens in the target file. When False, the FileConfigurator fails; when True, it succeeds regardless of leftover tokens |

The source and target arguments have the following structure.

Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
hostname | String | It should be a valid SSH host address | Yes | | SSH endpoint |
username | String | | Yes | | SSH login username |
password | String | Cannot be set if key is already set | No | | SSH login password |
sshPort | Number | 1 ≤ sshPort ≤ 65532 | No | 22 | SSH port |
key | String | Cannot be set if password is already set | No | | SSH login key, provided either directly as its value or as the path of the file to import it from. The operator supports RSA and DSA keys |
path | String | It should be a valid path | Yes | | The path of the file to be used as either the source or the target of the activity of applying Akamas-computed configuration parameters using files |

The following table reports the mapping between component properties and operator arguments.

Component property | Operator argument |
---|---|
hostname | source->hostname, target->hostname |
username | source->username, target->username |
sshPort | source->sshPort, target->sshPort |
password | source->password, target->password |
key | source->key, target->key |
sourcePath | source->path |
targetPath | target->path |
This page introduces the OracleConfigurator operator, a workflow operator that allows configuring the optimized parameters of an Oracle instance.
This section provides the minimum requirements that you should meet in order to use the OracleConfigurator operator.
Oracle 12c or later
The Oracle operator must be able to connect to the Oracle URL or IP address and port (default port: 1521).
The user used to log into the database must have ALTER SYSTEM privileges.
In order to configure the tuned parameters, the OracleConfigurator operator must be bound to a component with one of the following types:
Oracle Database 12c
Oracle Database 18c
Oracle Database 19c
Databases hosted on Amazon RDS are not supported.
When you define an OracleConfigurator task in the workflow you should specify some configuration information to allow the operator to connect to the Oracle instance.
You can specify configuration information within the config
part of the YAML of the instance definition. The operator can also inherit some specific arguments from the properties
of a bound component when not specified in the task.
The following table describes all the properties for the definition of a task using the OracleConfigurator operator.
In the following example, the workflow leverages the OracleConfigurator operator to update the database parameters before triggering the execution of the load test for a component oracledb
:
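A minimal sketch of such a workflow; task names, the load-test command, and the component names are illustrative:

```yaml
name: oracle-tuning-workflow
tasks:
  - name: configure-oracle
    operator: OracleConfigurator
    arguments:
      component: oracledb   # connection settings inherited from the bound component
  - name: run-load-test
    operator: Executor
    arguments:
      command: bash /opt/load/run_test.sh   # placeholder load-test command
      component: loadgen
```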
This page introduces the OracleExecutor operator, a workflow operator that allows executing custom queries on an Oracle instance.
This section provides the minimum requirements that you should meet in order to use the Oracle Executor operator.
Oracle 12c or later
The OracleExecutor operator must be able to connect to the Oracle URL or IP address and port (default port is 1521)
The user used to log into the database must have enough privilege to perform the required queries
When you define a task that uses the Oracle Executor operator you should specify some configuration information to allow the operator to connect to the Oracle instance and execute queries.
The operator inherits the connection arguments from the properties of the component when it is referenced in the task definition. The Akamas user can also override the properties of the component, or not reference it at all, by defining the connection fields directly in the configuration of the task.
The following table provides the list of all properties required to define a task that uses the OracleExecutor operator.
Notice: as a good practice, define only queries that update the state of the database. It is not possible to use SELECT queries to extract data from the database.
In the following example, the operator performs a cleanup action on a table of the database:
In the following example, the operator leverages its templating features to update a table:
The referenced oracledb component contains properties that specify how to connect to the Oracle database instance:
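A hedged sketch of such a component; the exact property names under `connection` (host, port, service, user, password) are assumptions:

```yaml
name: oracledb
properties:
  connection:
    host: oracle.example.com   # placeholder address
    port: 1521
    service: ORCLPDB1
    user: system
    password: mypassword
```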
The WindowsExecutor operator executes a command on a target Windows machine using WinRM.
The command can be anything that runs on a Windows Command Prompt.
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
host structure and arguments

Here follows the structure of the host argument, with its arguments:
component
The component
argument can refer to a Component by name and use its properties as the arguments of the operator. In case the mapped arguments are already provided to the operator, there is no override.
Here is an example of a component that provides the host and the command arguments:
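A hedged sketch of such a component; the values are placeholders, and the `host` property names follow the `host->` mapping described for this operator:

```yaml
name: windows-server-1
properties:
  host:
    hostname: win01.example.com
    username: DOMAIN\Administrator
    password: mypassword
  # used as the command when the task does not provide one
  command: net stop MyService && net start MyService
```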
All operators accept some common, optional arguments that allow you to control how the operator is executed within your workflow.
The following table reports all the arguments that can be used with any operator.
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
The LinuxConfigurator operator allows configuring systems tuned by Akamas by applying parameters related to the Linux kernel using different strategies.
The operator can configure provided Components or can configure every Component which has parameters related to the Linux kernel.
The parameters are applied via SSH protocol.
In the most basic use of the Operator, it is sufficient to add a task of type LinuxConfigurator in the workflow.
The operator makes use of properties specified in the component to identify which instance should be configured, how to access it, and any other information required to apply the configuration.
If no component is provided, this operator will try to configure every parameter defined for the Components of the System under test.
The following table highlights the properties that can be specified on components and are used by this operator.
The properties blockDevices and networkDevices allow specifying which parameters to apply to each block/network device associated with the Component, as well as which block/network devices should be left untouched by the LinuxConfigurator. If these properties are omitted, then all block/network devices associated with the Component will be configured with all the available related parameters.
All block-devices called loopN (where N is an integer greater than or equal to 0) are automatically excluded from the Component's block-devices.
The properties blockDevices and networkDevices are lists of objects with the following structure:
In this example, only the parameters os_StorageReadAhead and os_StorageQueueScheduler are applied to all the devices that match the regex "xvd[a-z]" (i.e. xvda, xvdb, …, xvdz).
In this example, only the parameter os_StorageMaxSectorKb is applied to the block devices xvdb and loop0. Note that the parameter is applied to loop0 even though loopN devices are excluded by default by the Linux Optimization Pack: explicitly specifying the device in the name filter overrides that default behavior.
In this example, no parameters are applied to the wlp4s0 network device, which is therefore excluded from the optimization.
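The device filters above can be sketched as component properties like the following; hostnames, key paths, and the exact parameter lists are illustrative:

```yaml
properties:
  hostname: server1.example.com
  username: ubuntu
  key: /home/user/.ssh/id_rsa
  blockDevices:
    - name: xvd[a-z]              # regex matching xvda ... xvdz
      parameters:
        - os_StorageReadAhead
        - os_StorageQueueScheduler
  networkDevices:
    - name: wlp4s0
      parameters: []              # empty list: no parameters applied to this device
```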
Some configuration parameters related to the Linux kernel may be applied using the strategies supported by this operator, while others may be applied with different strategies (e.g., using a file written on a remote machine). To support this scenario, it is necessary to specify, at the ComponentType level, which parameters should be applied with the LinuxConfigurator and which strategy should be used to configure each parameter. This information is already embedded in the Linux Optimization Pack and, usually, no customization is required.
With this strategy, a parameter is configured by leveraging the sysctl utility. The sysctl variable to map to the parameter that needs to be configured is specified using the key
argument.
With this strategy, a parameter is configured by echoing and piping its value into a provided file. The path of the file is specified using the file
argument.
With this strategy, each possible value of a parameter is mapped to a command to be executed on the machine the LinuxConfigurator operates on (this is especially useful for categorical parameters).
With this strategy, a parameter is configured by executing a command into which the parameter value is interpolated.
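As a hedged sketch (the schema below is an assumption, not the verbatim Linux Optimization Pack definition), three of these strategies might be bound to parameters at the ComponentType level like this:

```yaml
parameters:
  - name: os_MemorySwappiness
    operators:
      LinuxConfigurator:
        sysctl:
          key: vm.swappiness          # sysctl strategy
  - name: os_CPUSchedMigrationCost
    operators:
      LinuxConfigurator:
        echo:
          file: /proc/sys/kernel/sched_migration_cost_ns   # echo-to-file strategy
  - name: os_StorageReadAhead
    operators:
      LinuxConfigurator:
        command: blockdev --setra ${value} /dev/${device}  # command with interpolated value
```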
The Executor Operator can be used to execute a shell command on a target machine using SSH.
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
host structure and arguments

Here follows the structure of the host argument, with its arguments:
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
component
The component
argument can refer to a component by name and use its properties as the arguments of the operator (see mapping here below). In case the mapped arguments are already provided to the operator, there is no override.
Let's assume you want to run a script on a remote host: you expect it to complete successfully within 30 seconds, but it might fail occasionally.
Launch a script, wait for its completion, and in case of failures or timeout retry 3 times by waiting 10 seconds between retries:
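Such a task can be sketched with the common `retries`, `retry_delay`, and `timeout` arguments; the command and component names are placeholders:

```yaml
- name: run-script
  operator: Executor
  arguments:
    command: bash /home/ubuntu/run_test.sh
    component: server1
  retries: 3          # retry up to 3 times on failure or timeout
  retry_delay: 10s    # wait 10 seconds between retries
  timeout: 30s        # each attempt may run for at most 30 seconds
```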
Execute a uname command with explicit host information (explicit SSH key)
Execute a uname command with explicit host information (imported SSH key)
Execute a uname command with host information taken from a Component
Start a load-testing script and keep it running in the background during the workflow
Due to the stderr configuration, invoking a bash script on a server may produce a different result than running the same script from the Akamas Executor operator. This is quite common with Tomcat startup scripts like $HOME/tomcat/apache-tomcat_1299/bin/startup.sh.
To avoid this issue, simply create a wrapper bash file on the target server adding the set -m instruction before the sh command, e.g.:
and then configure the Executor Operator to run the wrapper script like:
You can run the following to emulate the same behavior of Akamas running scripts over SSH:
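A minimal sketch of such a wrapper; the paths are illustrative and the trivial command stands in for the real startup script:

```shell
# Create a hypothetical wrapper script: "set -m" enables job control,
# so processes started by the wrapped script end up in their own
# process group and behave as they would in an interactive shell.
cat > /tmp/startup_wrapper.sh <<'EOF'
#!/bin/bash
set -m
sh "$@"
EOF
chmod +x /tmp/startup_wrapper.sh

# Sanity check: run a trivial command through the wrapper
/tmp/startup_wrapper.sh -c 'echo wrapper-ok'
```

The Executor task can then point its `command` argument at the wrapper. To emulate how Akamas runs the script over SSH without an interactive tty, invoke the same wrapper command through `ssh user@host '...'` against the target server.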
There are cases in which you would like to keep a script running for the whole duration of the test. Some examples could be:
A script applying load to your system for the duration of the workflow
The manual start of an application to be tested
The setup of a listener that gathers logs, metrics, or data
In all the instances where you need to keep a task running beyond the task that started it, you must use the detach: true
property.
Note that a detached executor task returns immediately, so you should run only the background task in detached mode.
Remember to keep all tasks requiring synchronous (standard) behavior out of the detached task.
Example:
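A sketch of such a workflow; commands and component names are placeholders:

```yaml
tasks:
  - name: start-load
    operator: Executor
    arguments:
      command: bash /opt/load/apply_load.sh   # keeps running in the background
      component: loadgen
      detach: true      # the task returns immediately
  - name: run-benchmark
    operator: Executor
    arguments:
      command: bash /opt/bench/measure.sh     # standard synchronous task
      component: server1
```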
Library references
The library used to execute scripts remotely is Fabric, a high-level Python library designed to execute shell commands remotely over SSH, yielding useful Python objects in return.
The Fabric library uses a connection object to execute scripts remotely (see connection — Fabric documentation). The dedicated detach mode is implemented on top of the more robust disown property of the Invoke Runner underlying the Connection (see runners — Invoke documentation). This is why you should rely on detach whenever possible instead of running background processes straight from the script.
In the Frequently Asked/Answered Questions (FAQ) — Fabric documentation you can find further information about typical problems and solutions related to hanging background processes.
The WindowsFileConfigurator operator allows configuring systems tuned by Akamas by interpolating configuration parameters into files on remote Windows machines.
The operator performs the following operations:
It reads an input file from a remote machine containing templates for interpolating the configuration parameters generated by Akamas
It replaces the values of configuration parameters in the input file
It writes the file with replaced configuration parameters on a specified path on another remote machine
Access to remote machines is performed using WinRM.
The Windows File Configurator allows writing templates for configuration parameters in two ways:
specify that a single parameter should be interpolated:
specify that all parameters of a component should be interpolated:
It is possible to add a prefix or suffix to interpolated configuration parameters by acting at the component-type level:
In the example above, the parameter x1 will be interpolated with the prefix PREFIX and the suffix SUFFIX; ${value} will be replaced with the actual value of the parameter at each experiment.
Suppose we have the configuration of the following parameters for experiment 1 of a study:
where component1 is of type MyComponentType, defined as follows:
A template file to interpolate only parameter component1.param1 and all parameters from component2 would look like this:
The file after the configuration parameters are interpolated would look like this:
Note that the file in this example contains a bash command whose arguments are constructed by interpolating configuration parameters. This represents a typical use case for the WindowsFileConfigurator: to construct the right bash commands that configure a system with the new configuration parameters computed by Akamas.
source and target structure and arguments

Here follows the structure of either the source or target operator argument:
component

The component argument can be used to refer to a Component by name and use its properties as the arguments of the operator. If the mapped arguments are already provided to the operator, there is no override. Notice that in this case the operator replaces in the template file only the tokens referring to the specified component; a parameter bound to any other component causes the substitution to fail.
Where the apache-server-1 component is defined as:
The SparkLivy operator uses Livy to run Spark applications on a Spark instance.
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
The operator fetches the following parameters from the current Experiment to apply them to the System under test.
Name | Description | Restrictions |
---|---|---|
The SSHSparkSubmit operator connects to a Spark instance invoking a spark-submit on a machine reachable via SSH.
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
component
This operator automatically maps some properties of its component to some arguments. If the mapped arguments are already provided to the operator, there is no override.
The NeoLoadWeb operator allows piloting performance tests on a target system by leveraging the Tricentis NeoLoad Web solution.
Once triggered, this operator configures and starts the execution of a NeoLoad test run on the remote endpoint. If the test is unable to run, the operator blocks the Akamas workflow by issuing an error.
This operator requires five pieces of information to successfully pilot performance tests within Akamas:
The location of a .zip archive (project file) containing the definition of the performance test. This location can be a URL accessible via HTTP/HTTPS or a file path accessible via SFTP. Otherwise, the unique identifier of a previously uploaded project must be provided.
The name of the scenario to be used for the test
The URL of the NeoLoad Web API (either on-premise or SaaS)
The URL of the NeoLoad Web API for uploading project files
The account token used to access the NeoLoad Web APIs
When a projectFile is specified, the Operator uploads the provided project to NeoLoad and launches the specified scenario. After the execution of the scenario, the project is deleted from NeoLoad. When a projectId is specified, the Operator expects the project to be already available on NeoLoad. Please refer to the relevant documentation on how to upload a project and obtain a project ID.
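A hedged sketch of a NeoLoadWeb task using a previously uploaded project; the argument names (`scenario`, `projectId`, `accountToken`, `apiUrl`) are assumptions and may differ from the actual operator schema:

```yaml
- name: run-performance-test
  operator: NeoLoadWeb
  arguments:
    scenario: ramp-up-scenario
    projectId: a1b2c3d4                      # ID of a project already on NeoLoad Web
    accountToken: my-neoload-account-token   # placeholder token
    apiUrl: https://neoload-api.saas.neotys.com
```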
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
projectFile structure and arguments

The projectFile argument needs to be specified differently depending on the protocol used to get the specification of the performance test:
HTTP/HTTPS
SSH (SFTP)
Here follows the structure of the projectFile argument in the case in which HTTP/HTTPS is used to get the specification of the performance test:
with its arguments:
Here follows the structure of the projectFile argument in the case in which SFTP is used to get the specification of the performance test, with its arguments:
component structure and arguments

The component argument can be used to refer to a component by name and use its properties as the arguments of the operator.
Field | Type | Description | Default Value | Restrictions | Required | Source |
---|---|---|---|---|---|---|
Field | Type | Description | Default Value | Restrictions | Required | Source |
---|---|---|---|---|---|---|
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
Component Property | Operator Argument |
---|---|
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|
Component property | Operator argument |
---|---|
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
command | String | | Yes | | The command to be executed on the remote machine |
host | Object | It should have a structure like the one described below | No | | Information relative to the target machine onto which the command has to be executed |
component | String | It should match the name of an existing Component of the System under test | No | | The name of the Component whose properties can be used as arguments of the operator |
Name | Type | Value Restrictions | Required | Default | Description |
---|---|---|---|---|---|
protocol | String | https, http | Yes, if the Component whose name is defined in component doesn't have a property named host->protocol | https | The protocol to use to connect to the Windows machine with WinRM |
hostname | String | Valid FQDN or IP address | Yes, if the Component whose name is defined in component doesn't have a property named host->hostname | - | Windows machine's hostname |
port | Number | 1 ≤ port ≤ 65532 | Yes, if the Component whose name is defined in component doesn't have a property named host->port | 5863 | WinRM port |
path | String | - | Yes, if the Component whose name is defined in component doesn't have a property named host->path | /wsman | The path where WinRM is listening |
username | String | username, domain\username, username@domain | Yes, if the Component whose name is defined in component doesn't have a property named host->username | - | User login (domain or local) |
password | String | - | Yes, if the Component whose name is defined in component doesn't have a property named host->password | - | Login password |
authType | String | ntlm, ssl | Yes, if the Component whose name is defined in component doesn't have a property named host->authType | ntlm | The authentication method to use against the Windows machine |
validateCertificate | Boolean | true, false | Yes, if the Component whose name is defined in component doesn't have a property named host->validateCertificate | False | Whether or not to validate the server certificate |
ca | String | A valid CA certificate | Yes, if the Component whose name is defined in component doesn't have a property named host->ca | - | The CA required to validate the server certificate |
operationTimeoutSec | Integer | Must be greater than 0 | No | | The amount in seconds after which the execution of the command is considered failed. Notice that the output of the command doesn't reset the timeout |
readTimeoutSec | Integer | Must be greater than operationTimeoutSec | No | | The number of seconds to wait before an HTTP connect/read times out |
Executes a shell command on a machine using SSH
Interpolates configuration parameters values into a file with templates and saves this file on a machine using SSH
Configures Linux kernel parameters using different strategies
Executes a command on a target Windows machine using WinRM
Interpolates configuration parameters into files on remote Windows machines
Pauses the execution of the workflow for a certain time
Executes custom queries on Oracle database instances
Configures Oracle database instances
Executes a Spark application using spark-submit on a machine using SSH
Executes a Spark application using spark-submit locally
Executes a Spark application using the Livy web service
Triggers the execution of performance tests using NeoLoad Web
Runs a performance test with LoadRunner Professional
Runs a performance test with LoadRunner Enterprise
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
retries | integer | - | no | 1 | How many times a task can be re-executed in case of failures. If a task reaches the maximum number of retries and still fails, the entire workflow execution is aborted and the trial is considered failed |
retry_delay | string | string (supporting seconds, minutes and hours) or int (seconds only) | no | 5m | How much time to wait before retrying a failed task |
timeout | string | string (supporting seconds, minutes and hours) or int (seconds only) | no | Infinite | The maximum time a task can run before being considered failed. If the timeout is exceeded, the task is considered failed |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
component | String | It should match the name of an existing Component of the System under test | No | | The name of the Component for which available Linux kernel parameters will be configured |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
hostname | String | It should be a valid SSH host address | Yes | | SSH host address |
sshPort | Integer | 1 ≤ sshPort ≤ 65532 | Yes | 22 | SSH port |
username | String | | Yes | | SSH login username |
key | Multiline string | Either key or password is required | | | SSH login key, provided either directly as its value or as the path of the file to import it from. The operator supports RSA and DSA keys |
password | String | Either key or password is required | | | SSH login password |
blockDevices | List of objects | It should have a structure like the one described in the next section | No | | Allows the user to restrict and specify the block devices to which block-device-related parameters apply |
networkDevices | List of objects | It should have a structure like the one described in the next section | No | | Allows the user to restrict and specify the network devices to which network-device-related parameters apply |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
name | String | It should be a valid regular expression to match block/network devices | Yes | | A regular expression that matches the block/network devices to configure with the related parameters of the Component |
parameters | List of strings | It should contain the names of matching parameters of the Component | No | | The list of parameters to be configured for the specified block/network devices. If the list is empty, then no parameter will be applied to the block/network devices matched by name |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
command | String | | yes | | The shell command to be executed on the remote machine |
host | Object | See structure documented below | no | | Information relative to the target machine onto which the command has to be executed using SSH |
component | String | It should match the name of an existing Component of the System under test | no | | The name of the Component whose properties can be used as arguments of the operator |
detach | Boolean | | no | False | The execution mode of the shell command: default (False) execution is synchronous, detached (True) execution is asynchronous and returns immediately |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
hostname | String | It should be a valid SSH host address | No, if the Component whose name is defined in component has a property named hostname | | SSH endpoint |
username | String | | No, if the Component whose name is defined in component has a property named username | | SSH login username |
password | String | Cannot be set if key is already set | No, if the Component whose name is defined in component has a property named password | | SSH login password |
sshPort | Number | 1 ≤ sshPort ≤ 65532 | No | 22 | SSH port |
key | String | Cannot be set if password is already set | No, if the Component whose name is defined in component has a property named key | | SSH login key. Either provide the key value directly or specify the path of the file (local to the CLI executing the create command) to read the key from. The operator supports RSA and DSA keys |
Component property | Operator argument |
---|---|
hostname | host->hostname |
username | host->username |
sshPort | host->sshPort |
password | host->password |
key | host->key |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
seconds | Number (integer) | seconds > 0 | Yes | | The number of seconds for which to pause the workflow |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
source | Object | It should have a structure like the one defined in the next section | No, if the Component whose name is defined in component has properties that map to the ones defined within source | | Information relative to the source/input file to be used to interpolate optimal configuration parameters discovered by Akamas |
target | Object | It should have a structure like the one defined in the next section | No, if the Component whose name is defined in component has properties that map to the ones defined within target | | Information relative to the target/output file to be used to interpolate optimal configuration parameters discovered by Akamas |
component | String | It should match the name of an existing Component of the System under test | No | | The name of the Component whose properties can be used as arguments of the operator |

The source and target arguments have the following structure.

Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
hostname | String | It should be a valid host address | Yes | | Windows host |
username | String | | Yes | | Login username |
password | String | Windows password for the specified user | Yes | | Login password |
path | String | It should be a valid path | Yes | | The path of the file to be used as either the source or target of the activity of applying Akamas-computed configuration parameters using files |
Component property | Operator argument |
---|---|
hostname | source->hostname, target->hostname |
username | source->username, target->username |
password | source->password, target->password |
sourcePath | source->path |
targetPath | target->path |
| String | It should be a path to a valid java or python spark application file | Yes | Spark application to submit (jar or python file) |
| List of Strings, Numbers or Booleans | Yes | Additional application arguments |
| String | No. Required for java applications. | The entry point of the java application. |
| String | No | Name of the task. When submitted the id of the study, experiment and trial will be appended. |
| String | No | The name of the YARN queue to which submit a Spark application |
| List of Strings | Each item of the list should be a path that matches an existing python file | No | A list of python scripts to be added to the PYTHONPATH |
| String | No | The user to be used to launch Spark applications |
| Number |
| No | 10 | The number of seconds to wait before checking if a launched Spark application has finished |
| String | It should match the name of an existing Component of the System under test | Yes | The name of the component whose properties can be used as arguments of the operator |
| Memory for the driver |
| Memory per executor |
| Total cores used by the application | Spark standalone and Mesos only |
| Cores per executor | Spark standalone and YARN only |
| The number of executors | YARN only |
| String | DSN or EasyConnect string | Is possible to define only one of the following sets of configurations:
| task, component |
| String | Address of the database instance | task, component |
| Integer | listening port of the database instance | 1521 | task, component |
| String | Database service name | task, component |
| String | Database SID | task, component |
| String | User name | Yes | task, component |
| String | User password | Yes | task, component |
| String | Connection mode |
| task, component |
| String | Name of the component to fetch properties and parameters from | Yes | task |
Name | Type | Value restrictions | Required | Default | Description |
---|---|---|---|---|---|
| String | It should be a path to a valid java or python spark application file | Yes | Spark application to submit (jar or python file) |
| List of Strings, Numbers or Booleans | Yes | Additional application arguments |
| String | It should be a valid supported Master URL | Yes | The master URL for the Spark cluster |
|
| No |
| Whether to launch the driver locally ( |
| String | No | The entry point of the java application. Required for java applications. |
| String | No | Name of the task. When submitted the id of the study, experiment and trial will be appended. |
| List of Strings | Each item of the list should be a path that matches an existing jar file | No | A list of jars to be added in the classpath. |
| List of Strings | Each item of the list should be a path that matches an existing python file | No | A list of python scripts to be added to the PYTHONPATH |
| List of Strings | Each item of the list should be a path that matches an existing file | No | A list of files to be added to the context of the spark-submit command |
| Object (key-value pairs) | No | Mapping containing additional Spark configurations. See Spark documentation. |
| Object (key-value pairs) | No | Env variables when running the spark-submit command |
| Boolean | No | true | Whether additional debugging output should be displayed |
| String | It should be a path that matches an existing executable | No | The default for the Spark installation | The path of the spark-submit executable command |
| String | It should be a path that matches an existing directory | No | The default for the Spark installation | The path of the SPARK_HOME |
| String | No | The user to be used to execute Spark applications |
| String | It should be a valid SSH host address | No, if the Component whose name is defined in | SSH host address |
| String | No, if the Component whose name is defined in | SSH login username |
| Number | 1≤ | No | 22 | SSH port |
| String | Cannot be set if | No, if the Component whose name is defined in | SSH login password |
| String | Cannot be set if | No, if the Component whose name is defined in | SSH login key, provided either directly as its value or as the path of the file to import it from. The operator supports RSA and DSA keys. |
| String | It should match the name of an existing Component of the System under test | Yes | The name of the Component whose properties can be used as arguments of the operator |
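The table above can be put into practice with a task definition along these lines. This is only a sketch: the operator name (SSHSparkSubmit) and every argument key below are illustrative, inferred from the descriptions above, and all values are placeholders.

```yaml
name: workflow1
tasks:
  - name: submit-spark-app             # illustrative task name
    operator: SSHSparkSubmit           # illustrative operator name
    arguments:
      component: spark1                # component providing SSH and Spark defaults
      file: /opt/jobs/app.jar          # the Spark application to submit (illustrative path)
      className: com.example.Main      # entry point, required for java applications
      master: yarn                     # the master URL for the Spark cluster
      args:                            # additional application arguments
        - "--input"
        - "/data/in"
      conf:                            # additional Spark configurations
        spark.executor.memory: 4g
```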
| String | It should be a valid URL or IP | Yes | The URL of the project file |
| Boolean | No | true | Whether the HTTPS connection should be verified using the certificates available on the machine on which the operator is running |
| String | It should be a valid SSH host address | Yes | SSH host address |
| String | Yes | SSH login username |
| String | No. Either | SSH login password |
| Number (integer) | 1≤ | 22 | SSH port |
| String | No, Either | SSH login key, provided either directly as its value or as the path of the file to import it from. The operator supports RSA and DSA keys. |
| String | It should be a valid path on the SSH host machine | Yes | The path of the project file |
| String | The DSN or EasyConnect string | It is possible to define only one of the following sets of configurations | task, component |
| String | The address of the database instance | task, component |
| Integer | The listening port of the database instance | 1521 | task, component |
| String | The database service name | task, component |
| String | The database SID | task, component |
| String | The user name | Yes | task, component |
| String | The user password | Yes | task, component |
| String | The connection mode | task, component |
| List[String] | The list of queries to update the database status before or after the workload execution. Queries can be templatized, containing tokens referencing parameters of any component in the system. | Yes | task |
| Boolean | A flag to enable the auto-commit feature | False | No | task |
| String | The name of the component to fetch properties from | No | task |
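As a sketch of how the fields above could come together in a task: the operator name (OracleExecutor), the argument keys, and the token syntax ${component.parameter} for templatized queries are all illustrative assumptions inferred from the descriptions above.

```yaml
tasks:
  - name: update-db-config
    operator: OracleExecutor             # illustrative operator name
    arguments:
      address: oracledb.example.com      # illustrative database address
      port: 1521
      user: system                       # illustrative user name
      password: secret                   # illustrative password
      autocommit: false
      queries:                           # templatized queries referencing component parameters
        - ALTER SYSTEM SET open_cursors = ${oracledb.open_cursors} SCOPE=BOTH
```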
| String | It should be a path to a valid java or python spark application file | Yes | Spark application to submit (jar or python file) |
| List of Strings, Numbers or Booleans | Yes | Additional application arguments |
| String | It should be a valid supported Master URL | Yes | The master URL for the Spark cluster |
| String | | No | Whether to launch the driver locally (client) or on one of the machines inside the cluster (cluster) |
| String | No | The entry point of the java application. Required for java applications. |
| String | No | Name of the task. When submitted the id of the study, experiment and trial will be appended. |
| List of Strings | Each item of the list should be a path that matches an existing jar file | No | A list of jars to be added in the classpath. |
| List of Strings | Each item of the list should be a path that matches an existing python file | No | A list of python scripts to be added to the PYTHONPATH |
| List of Strings | Each item of the list should be a path that matches an existing file | No | A list of files to be added to the context of the spark-submit command |
| Object (key-value pairs) | No | Mapping containing additional Spark configurations. See Spark documentation. |
| Object (key-value pairs) | No | Env variables when running the spark-submit command |
| String | It should be a path that matches an existing executable | No | The default for the Spark installation | The path of the spark-submit executable command |
| String | It should be a path that matches an existing directory | No | The default for the Spark installation | The path of the SPARK_HOME |
| String | No | The user to be used to execute Spark applications |
| Boolean | No | true | Whether additional debugging output should be displayed |
| String | It should match the name of an existing Component of the System under test | Yes | The name of the component whose properties can be used as arguments of the operator |
| String | It should match an existing scenario in the project file. Can be retrieved from the "runtime" section of your NeoLoad Controller. | No, if the component whose name is defined in | The name of the scenario to be used for the performance test piloted by Akamas |
| String | It should be a valid UUID | No, if a | The identifier of a previously uploaded project file. Has precedence over |
| Object | It should have a structure like the one described here below | No, if a | The specification of the strategy to be used to get the archive containing the specification of the performance test to be piloted by Akamas. When defined |
| String | It should be a valid URL or IP | No | The address of the API to be used to upload project files to NeoLoad Web |
| String | It should be a valid URL or IP | No | The address of the Neotys' NeoLoad Web API |
| String | Comma-separated list of zones and number of LGs | No | The list of LG zone ids with the number of LGs, e.g. "ZoneId1:10,ZoneId2:5". If empty, the default zone will be used with one LG. |
| String | A controller zone Id | No | The controller zone Id. If empty, the default zone will be used. |
| String | It should match the name of an existing component of the System under test | No | The name of the component whose properties can be used as arguments of the operator. |
| String | It should match an existing access token registered with NeoLoad Web | No, if specified in the component. See example below | The token to be used to authenticate requests against the NeoLoad Web APIs |
This page introduces the LoadRunner operator, a workflow operator that allows piloting performance tests on a target system by leveraging Micro Focus LoadRunner. This page assumes you are familiar with the definition of a workflow and its tasks. If this is not the case, then check Creating automation workflows.
This section provides the minimum requirements that you should meet to use this operator.
Micro Focus LoadRunner 12.60 or 2020
Microsoft Windows Server 2016 or 2019
Powershell version 5.1 or greater
To configure WinRM to allow Akamas to launch tests, please read the Integrating LoadRunner Professional page.
All LoadRunner test files (VuGen scripts and folders, lrs files) and their parent folders must be readable and writable by the user account used by Akamas.
When you define a task that uses the LoadRunner operator you should specify some configuration information to allow the operator to connect to the LoadRunner controller and execute a provided test scenario.
You can specify configuration information within the arguments that are part of a task in the YAML definition of a workflow.
You can avoid specifying each piece of configuration information at the task level by including a component property with the name of a component; in this way, the operator takes any configuration information from the properties of the referenced component.
controller
- a set of pieces of information useful for connecting to the LoadRunner controller
scenarioFile
- the path to the scenario file within the LoadRunner controller to execute the performance test
resultFolder
- the path to the performance test results folder within the LoadRunner controller
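A minimal sketch of the component-based style, assuming a component named loadrunner1 whose properties carry the controller, scenarioFile, and resultFolder settings:

```yaml
tasks:
  - name: run-loadrunner-test
    operator: LoadRunner
    arguments:
      # controller, scenarioFile and resultFolder are read from
      # the properties of the referenced component
      component: loadrunner1
```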
To make it possible for the operator to connect to a LoadRunner controller to execute a performance test you can use the controller
property within the workflow task definition:
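A sketch of such a task definition follows. The controller sub-field names (host, username, password) are illustrative placeholders, since the exact names are given in the controller arguments table; the paths use the four-backslash escaping described in the notice below.

```yaml
tasks:
  - name: run-loadrunner-test
    operator: LoadRunner
    arguments:
      controller:
        host: lr-controller.example.com   # illustrative field name and value
        username: akamas                  # illustrative
        password: secret                  # illustrative
      scenarioFile: C:\\\\Users\\\\akamas\\\\scenario1.lrs
      resultFolder: C:\\\\Users\\\\akamas\\\\results\\\\{study}\\\\{exp}\\\\{trial}
      timeout: 2h
      checkFrequency: 1m
```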
This table reports the configuration reference for the arguments
section.
Important notice: remember to escape your path with four backslashes (e.g. C:\\\\Users\\\\...
)
Controller arguments

This table reports the configuration reference for the controller section, which is an object with the following fields:
Important notice: remember to escape your path with four backslashes (e.g. C:\\\\Users\\\\...
)
This page introduces the LoadRunnerEnterprise operator, a workflow operator that allows piloting performance tests on a target system by leveraging Micro Focus LoadRunner Enterprise (formerly known as Performance Center).
This section provides the minimum requirements that you should meet to use this operator.
Micro Focus Performance Center 12.60 or 12.63
LoadRunner Enterprise 2020 SP3
When you define a task that uses the LoadRunnerEnterprise operator you should specify some configuration information to allow the operator to connect to LoadRunner Enterprise and execute a provided test scenario.
You can specify configuration information within the arguments that are part of a task in the YAML definition of a workflow.
You can avoid specifying each piece of configuration information at the task level by including a component property with the name of a component; in this way, the operator takes any configuration information from the properties of the referenced component.
This table reports the configuration reference for the arguments section.
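As a sketch, a task using the documented arguments could look like this (all values are illustrative placeholders):

```yaml
tasks:
  - name: run-lre-test
    operator: LoadRunnerEnterprise
    arguments:
      address: http://loadrunner-enterprise.yourdomain.com
      username: akamas            # illustrative
      password: secret            # illustrative
      domain: DEFAULT             # illustrative domain of the load test project
      project: MyProject          # illustrative project name
      testId: 42                  # illustrative; see how to retrieve it below
      testSet: nightly-tests      # illustrative TestSet name
      timeSlot: 1h30m             # must be a multiple of 15m and greater than 30m
```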
testId value

The following screenshot from Performance Center shows the testId value highlighted.
testSet value

The following screenshot from Performance Center shows the testSet name highlighted.
How to retrieve the testId value from LoadRunner Enterprise
Open the URL http://<LRE address>/Loadtest/, then select Test Management from the main menu.
| Field | Type | Value restrictions | Required | Default | Description |
|---|---|---|---|---|---|
| controller | Object | | Yes | | The information required to connect to the LoadRunner controller machine |
| component | String | | No | | The name of the component from which the operator will take its configuration options |
| scenarioFile | String | Matches an existing file within the LoadRunner controller | Yes | | The LoadRunner scenario file to execute the performance test |
| resultFolder | String | | Yes | | The folder, on the controller, where LoadRunner will put the results of a performance test. You can use the placeholders {study}, {exp}, {trial} to generate a path that is unique for the running Akamas trial. It can be a local path on the controller or on a network share |
| loadrunnerResOverride | String | A valid name for a Windows folder | No | res | The folder name where LoadRunner saves the analysis results. The default value can be changed in the LoadRunner controller |
| timeout | String | A numeric value followed by a suffix (s, m, h, d) | No | 2h | The timeout for the LoadRunner scenario. If LoadRunner doesn't finish the scenario within the specified amount of time, Akamas considers the workflow as failed |
| checkFrequency | String | A numeric value followed by a suffix (s, m, h, d) | No | 1m | The interval at which Akamas checks the status of the LoadRunner scenario |
| executable | String | A valid Windows path | No | C:\Program Files (x86)\Micro Focus\LoadRunner\bin\Wlrun.exe | The LoadRunner executable path |
| Field | Type | Value restrictions | Required | Default | Description |
|---|---|---|---|---|---|
| address | String | A valid URL, e.g. http://loadrunner-enterprise.yourdomain.com | Yes | - | The address used to connect to LoadRunner Enterprise |
| username | String | - | Yes | - | The username used to connect to LoadRunner Enterprise |
| password | String | - | Yes | - | The password for the specified user |
| tenantID | String | - | No | - | The id of the tenant (only for LR2020) |
| domain | String | - | Yes | - | The domain of your load test projects |
| project | String | - | Yes | - | The project name of your load test projects |
| testId | Number | - | Yes | - | The id of the load test. See above how to retrieve this from LoadRunner |
| testSet | String | - | Yes | - | The name of the TestSet. See above how to retrieve this from LoadRunner |
| timeSlot | String | A number followed by a time unit (m: minutes, h: hours). Values must be a multiple of 15m and greater than 30m | Yes | - | The reserved time slot for the test. Examples: 1h, 45m, 1h30m |
| component | String | A valid component name | No | - | The name of the component from which the operator will take its configuration options |
| pollingInterval | Number | A positive integer | No | 30 | The frequency (in seconds) at which Akamas checks the load test status |
| verifySSL | String | True, False | No | True | Whether to validate the certificate provided by the LRE server when using an HTTPS connection |