Executor Operator
The Executor Operator can be used to execute a shell command on a target machine using SSH. It allows configuring systems tuned by Akamas by interpolating configuration parameters and study information into commands on remote machines.
Templates for configuration parameters
The Executor allows writing templates for configuration parameters in two ways:
specify that a parameter should be interpolated directly:
${component_name.parameter_name}
specify that all parameters of a component should be interpolated:
${component_name.*}
Suffix or prefix for interpolated parameters
It is possible to add a prefix or suffix to interpolated configuration parameters by acting at the component-type level:
Notice that any parameter whose definition does not include the FileConfigurator element in the operators attribute is ignored and not written.
```yaml
name: Component Type 1
description: My Component type
parameters:
  - name: x1
    domain:
      type: real
      domain: [-5.0, 10.0]
    defaultValue: -5.0
    # Under this section, the operator used to configure the parameters is defined
    operators:
      FileConfigurator:
        # this OPTIONAL confTemplate property makes it possible to interpolate
        # the parameter value with a prefix and a suffix
        confTemplate: "PREFIX${value}SUFFIX"
```

In the example above, the parameter x1 will be interpolated with the prefix PREFIX and the suffix SUFFIX; ${value} will be replaced with the actual value of the parameter at each experiment.
Example
Let's assume we want to apply the following configuration:
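For instance, a configuration assigning values to two components (component names, parameter names, and values are all illustrative):

```yaml
component1:
  param1: 10
component2:
  param2: 5
  param3: 64
```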
where component1 is of type MyComponentType and MyComponentType is defined as follows:
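A minimal sketch of such a component-type definition (the parameter domain and the confTemplate are illustrative):

```yaml
name: MyComponentType
description: An example component type
parameters:
  - name: param1
    domain:
      type: integer
      domain: [1, 100]
    operators:
      FileConfigurator:
        confTemplate: "${value}"
```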
A template command to interpolate only parameter component1.param1 and all parameters from component2 would look like this:
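A sketch of such an Executor task (the task name and script path are illustrative):

```yaml
name: Run my script
operator: Executor
arguments:
  command: bash /opt/scripts/run.sh ${component1.param1} ${component2.*}
  component: component1
```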
The command after the configuration parameters are interpolated would look like this:
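Assuming, for illustration, that param1 resolves to 10 and component2 defines param2=5 and param3=64, the resolved command might read as follows (the exact formatting of each value depends on its confTemplate):

```shell
bash /opt/scripts/run.sh 10 5 64
```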
Note that the command in this example is a bash command whose arguments are constructed by interpolating configuration parameters.
Configuration Template Override
Some commands may require formatting a parameter value using a template different from the default one defined in the componentType. For this reason, it is possible to override the configuration template of each parameter at the workflow task level; the override applies to all components of a given componentType.
Example
In this example, the param1 parameter for all components of type MyComponentType will be configured using the ${value} template.
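A sketch of the corresponding workflow task (task name and command are illustrative; the confTemplate map keys follow the componentType.parameter format):

```yaml
name: Run my script
operator: Executor
arguments:
  command: bash /opt/scripts/run.sh ${component1.param1}
  confTemplate:
    MyComponentType.param1: "${value}"
```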
Templates for study information
The Executor Operator supports the interpolation of study-related information into commands executed on remote machines. This feature allows you to dynamically include metadata about the study, experiment, and trial within your commands or scripts.
You can access the following study information using the corresponding placeholders:
Study Name: ${ak-study.name}
Study ID: ${ak-study.id}
Step Name: ${ak-step.name}
Experiment ID: ${ak-experiment.id}
Trial ID: ${ak-trial.id}
Trial Start Time: ${ak-trial.start_time}
Task Try Number: ${ak-task.retry}
Task Start Time: ${ak-task.start_time}
By including these placeholders in your command templates, the Executor Operator will replace them with the actual values when the command is executed.
Example
Saving Study Information as Environment Variables
A common use case is to pass study information to scripts executed in subsequent tasks. This can be achieved by exporting the study information as environment variables in one task; scripts in later tasks can then access them. In the example below, target-machine is a customer machine that contains a bash script named script.sh.
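A sketch of such a task (the environment-variable names, username, and key path are illustrative):

```yaml
name: Export study info and run script
operator: Executor
arguments:
  command: |
    export AK_STUDY_NAME="${ak-study.name}"
    export AK_EXPERIMENT_ID="${ak-experiment.id}"
    export AK_TRIAL_ID="${ak-trial.id}"
    bash ~/script.sh
  host:
    hostname: target-machine
    username: myuser
    key: ~/.ssh/id_rsa
```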
Passing Study Information to Scripts
Or the same study information can, instead, be passed directly to scripts as arguments:
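For instance (argument order and host details are illustrative):

```yaml
name: Run script with study info as arguments
operator: Executor
arguments:
  command: bash ~/script.sh "${ak-study.name}" "${ak-experiment.id}" "${ak-trial.id}"
  host:
    hostname: target-machine
    username: myuser
    key: ~/.ssh/id_rsa
```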
Operator arguments
| Argument | Type | Value restrictions | Required | Default | Description |
| --- | --- | --- | --- | --- | --- |
| command | String | If the template ${componentType.param} is present in the command, the component type and the specified parameter must exist | yes | | The shell command to be executed on the remote machine |
| host | Object | See structure documented below | no | | Information relative to the target machine onto which the command has to be executed using SSH |
| component | String | Should match the name of an existing Component of the System under test | no | | The name of the Component whose properties can be used as arguments of the operator |
| detach | Boolean | | no | False | The execution mode of the shell command: default (False) execution is synchronous, while detached (True) execution is asynchronous and returns immediately |
| replaceTemplate | Boolean | | no | True | Enable/disable template replacement ${componentType.parameter} |
| confTemplate | Object | A map where the keys are in the format componentType.parameter and the values are the corresponding new templates | no | | The values of new parameter templates |
Host structure and arguments
Here follows the structure of the host argument:
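A sketch of the host structure (all values are illustrative):

```yaml
host:
  hostname: target-machine
  username: myuser
  sshPort: 22
  key: ~/.ssh/id_rsa
```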
with its arguments:
| Argument | Type | Value restrictions | Required | Default | Description |
| --- | --- | --- | --- | --- | --- |
| hostname | String | Should be a valid SSH host address | No, if the Component whose name is defined in component has a property named hostname | | SSH endpoint |
| username | String | | No, if the Component whose name is defined in component has a property named username | | SSH login username |
| password | String | Cannot be set if key is already set | No, if the Component whose name is defined in component has a property named password | | SSH login password |
| sshPort | Number | 1 ≤ sshPort ≤ 65532 | no | 22 | SSH port |
| key | String | Cannot be set if password is already set | No, if the Component whose name is defined in component has a property named key | | SSH login key. Either provide the key value directly or specify the path of the file (local to the CLI executing the create command) to read the key from. The operator supports RSA and DSA keys. |
Get operator arguments from component
The component argument can refer to a component by name and use its properties as the arguments of the operator (see the mapping here below). In case the mapped arguments are already provided to the operator, there is no override.
Component property to operator argument mapping

| Component property | Operator argument |
| --- | --- |
| hostname | host->hostname |
| username | host->username |
| sshPort | host->sshPort |
| password | host->password |
| key | host->key |
Workflow examples
Let's assume the user wants to run a script on a remote host; the script is expected to complete successfully within 30 seconds but might fail occasionally.
The code below launches a script, waits for its completion and, in failure/timeout cases, retries 3 times by waiting 10 seconds between retries:
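A sketch of such a task (the script path and component name are illustrative, and the retry-related property names may differ in your Akamas version; check the workflow reference for the exact syntax):

```yaml
name: Run script with retries
operator: Executor
arguments:
  command: bash ~/script.sh
  component: component1
  timeout: 30s
  retries: 3
  retry_delay: 10s
```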
This snippet executes a uname command with explicit host information (explicit SSH key)
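For example (hostname, username, and the inlined key are placeholders):

```yaml
name: Get system info
operator: Executor
arguments:
  command: uname -a
  host:
    hostname: 192.168.1.10
    username: ubuntu
    key: |
      -----BEGIN RSA PRIVATE KEY-----
      <key body here>
      -----END RSA PRIVATE KEY-----
```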
The next piece of code executes a uname command with explicit host information (imported SSH key)
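For example (the key is read from a file local to the CLI; the path is a placeholder):

```yaml
name: Get system info
operator: Executor
arguments:
  command: uname -a
  host:
    hostname: 192.168.1.10
    username: ubuntu
    key: ~/.ssh/id_rsa
```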
This executes a uname command with host information taken from a Component
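For example (component1 is assumed to define hostname, username, and key properties):

```yaml
name: Get system info
operator: Executor
arguments:
  command: uname -a
  component: component1
```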
This starts a load-testing script and keeps it running in the background during the workflow
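For example (the script path and component name are illustrative):

```yaml
name: Start load test
operator: Executor
arguments:
  command: bash ~/load_test.sh
  detach: true
  component: component1
```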
Logging with command scripts
When executing a time-consuming script on a remote SSH machine, it may be helpful to let it provide user feedback during its execution. Adding logging lines in the script allows you to inspect its current status while it runs. For example, in the case of an offline study, if you're using the Akamas UI, you can view these logging lines in a dedicated drawer that pops up in the offline study detail page. This drawer is accessible by expanding the trial detail and selecting the desired workflow task (it will be indicated by a blue dot if the study is currently running that script). See the example image below:
By clicking on the blue dot in the Tasks column, a drawer will pop up showing the workflow task log details while the task is running. See the example image:

Everything between the line Start of script and the bottom line is the console log of the executed script. By default the Executor logs everything to the standard output (black text) or to the standard error stream (red text).
To display the output in real-time in the UI task log, ensure that every text line sent to the console ends with a line break. The implementation depends on the language you're using. For example:
Use methods that automatically append a newline, such as println instead of print.
When the only available print command doesn't append a newline, ensure strings explicitly end with one (e.g., in Golang, add a trailing \n to messages).
In most languages like Bash, Node.js, or Golang, this approach is sufficient. However, Python scripts executed through the Executor may present the following undesired effects:
All output appears in red (typically caused by logging to an unspecified stream, when the default is standard error).
Logs are displayed only at the end of the script (due to unflushed text buffers).
Solving Python-Specific Issues
To address these issues in Python, you can use one of the following solutions:
1. Explicitly flush each print statement:
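A minimal sketch in Python (the message text is illustrative):

```python
# Each print call flushes immediately, so the Executor's task log
# in the Akamas UI updates in real time
print("Step 1 completed", flush=True)
print("Step 2 completed", flush=True)
```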
2. Globally enable auto-flushing: Add these two lines at the beginning of your script:
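One common two-line approach is to rebind print so it always flushes (this is a general Python idiom, not something specific to the Executor):

```python
import functools
print = functools.partial(print, flush=True)  # every later print(...) now flushes
```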
This ensures all print statements automatically flush without modifying the rest of the script.
3. Use the logging library: Replace print with logging.info() and configure the logger to use the standard output stream:
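A minimal sketch (the format string and message are illustrative):

```python
import logging
import sys

# Route all log records to standard output so they render as regular
# (black) text in the task log
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

logging.info("Script started")  # replaces a bare print(...)
```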
NOTE: This logging.basicConfig approach requires only a few configuration lines, but it logs every message (errors included) to the standard output, so all lines appear as black text.
Advanced Logging Example
Below is an advanced example demonstrating how to log messages with different levels (INFO, WARNING, DEBUG, ERROR) and display them in two different colors:
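A sketch of such a script (logger name, format, and messages are illustrative): it routes DEBUG, INFO, and WARNING records to standard output (black text) and ERROR records to standard error (red text):

```python
import logging
import sys

class MaxLevelFilter(logging.Filter):
    """Accept only records at or below a maximum level."""
    def __init__(self, max_level):
        super().__init__()
        self.max_level = max_level
    def filter(self, record):
        return record.levelno <= self.max_level

logger = logging.getLogger("my_script")
logger.setLevel(logging.DEBUG)

# DEBUG, INFO, and WARNING go to stdout (rendered in black)
stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.setLevel(logging.DEBUG)
stdout_handler.addFilter(MaxLevelFilter(logging.WARNING))

# ERROR and above go to stderr (rendered in red)
stderr_handler = logging.StreamHandler(sys.stderr)
stderr_handler.setLevel(logging.ERROR)

formatter = logging.Formatter("%(asctime)s [%(levelname)s] %(message)s")
stdout_handler.setFormatter(formatter)
stderr_handler.setFormatter(formatter)
logger.addHandler(stdout_handler)
logger.addHandler(stderr_handler)

logger.debug("debug message")
logger.info("info message")
logger.warning("warning message")
logger.error("error message")
```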
This script correctly handles real-time output and is compatible with the Executor's logging system.
Troubleshooting
Troubles in running sh scripts remotely
Due to how the standard error stream is configured, invoking a bash script directly on a server may yield a different result than running the same script through the Akamas Executor Operator. This is quite common with Tomcat startup scripts such as $HOME/tomcat/apache-tomcat_1299/bin/startup.sh.
To avoid this issue, simply create a wrapper bash file on the target server, adding the set -m instruction before the sh command, e.g.:
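For example (the Tomcat path comes from the scenario above; the wrapper file name is illustrative):

```shell
#!/bin/bash
# wrapper.sh -- placed on the target server; enables job control
# before invoking the original startup script
set -m
sh $HOME/tomcat/apache-tomcat_1299/bin/startup.sh
```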
and then configure the Executor Operator to run the wrapper script like:
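For example (task name, wrapper path, and component name are illustrative):

```yaml
name: Start Tomcat
operator: Executor
arguments:
  command: bash ~/wrapper.sh
  component: component1
```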
You can run the following to emulate the same behavior of Akamas running scripts over SSH:
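A sketch of such a check (username, hostname, and script path are placeholders); like the Executor, it runs the command over a non-interactive SSH session without allocating a terminal:

```shell
ssh myuser@target-machine 'sh ~/wrapper.sh'
```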
Troubles in keeping a script running in the background
There are cases in which you would like to keep a script running for the whole duration of the test. Some examples could be:
A script applying load to your system for the duration of the workflow
The manual start of an application to be tested
The setup of a listener that gathers logs, metrics, or data
In all instances where you need to keep a process running beyond the end of the task that started it, you must use the detach: true property.
Note that a detached executor task returns immediately, so you should run only the background task in detached mode.
Remember to keep all tasks requiring synchronous (standard) behavior out of the detached task.
Example:
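A sketch of such a workflow fragment (script paths and component name are illustrative); only the background task is detached, while the benchmark task keeps the standard synchronous behavior:

```yaml
tasks:
  - name: Start load test in background
    operator: Executor
    arguments:
      command: bash ~/load_test.sh
      detach: true
      component: component1
  - name: Run benchmark
    operator: Executor
    arguments:
      command: bash ~/benchmark.sh
      component: component1
```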
Library references
The library used to execute scripts remotely is Fabric, a high-level Python library designed to execute shell commands remotely over SSH, yielding useful Python objects in return.
The Fabric library uses a connection object to execute scripts remotely (see connection — Fabric documentation). The option of a dedicated detach mode comes from implementing the more robust disown property from the Invoke Runner underlying the Connection (see runners — Invoke documentation). This is the reason you should rely on detach whenever possible instead of running the background processes straight into the script.
The Frequently Asked/Answered Questions (FAQ) — Fabric documentation provides further information about typical hanging problems with background processes and their solutions.