Resource management commands

This page describes all the commands used to manage Akamas resources, together with their options (see also the common options available for all commands).

The CLI also lets you print the available aliases for Akamas resources (see Akamas aliases below).

| Command | Description |
| --- | --- |
| build | Build a resource from a file or directory |
| create | Create a resource from a file |
| delete | Delete a resource |
| list | List a set of resources |
| describe | Describe a resource |
| update | Update a resource |
| install | Install a resource from a file |
| uninstall | Uninstall a resource |
| start | Start a study |
| finish | Terminate a study or experiment |
| resume | Resume a study |
| export | Export a study |
| import | Import a study |

General information

Common options

The following table describes the common options available for all commands:

| Option | Short option | Type | Description |
| --- | --- | --- | --- |
| --debug | -d | Flag | Print detailed information in case of errors |
| --workspace | -w | String | Overrides the workspace defined in the configuration file when interacting with resources such as systems, workflows, and studies |
| --help | | Flag | Print the command-line help |

Akamas aliases

The Akamas CLI allows using a set of aliases or shortcuts for many resources.

Any resource can be specified using either the singular or the plural form. Furthermore, dedicated shortcuts are available for several resources (for example, ws for workspace).

You can print the list of available aliases with the following command
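A sketch of the invocation (the subcommand name here is an assumption, not confirmed by this page):

```
# Print the alias table; the subcommand name is a guess
akamas aliases
```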

Here are a few examples demonstrating how aliases work in Akamas CLI:

  • akamas list study is equivalent to akamas list studies

  • akamas log is equivalent to akamas logs

  • akamas delete workspace is equivalent to akamas delete ws

Build command

This command builds either a new optimization pack or a new scaffolding hierarchy.

Build optimization pack

In this case, you supply a folder with a specific hierarchy: it must contain the folders metrics, component-types, and parameters, each holding a set of YAML resource files describing the supported resources. The command akamas build optimization-pack FOLDER_NAME then creates a single JSON file containing the full optimization pack.

Build scaffolding

In this case, you supply two folders: one for variables, named variables, and one for templates, named templates. The variables folder must contain one YAML file for each desired output set, while the templates folder should hold all the generic templates, which can reference the variables specified in the variables files.

For example, the variables folder could contain two files named test.yaml and prod.yaml. The contents of these files are:
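A minimal sketch of such files, assuming the variables are simple key-value pairs consumed by the templates (all names and values below are illustrative, not the real Akamas variable schema):

```yaml
# variables/test.yaml (illustrative)
system_name: webapp-test
trials: 1
```

```yaml
# variables/prod.yaml (illustrative)
system_name: webapp-prod
trials: 3
```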

Then suppose that the templates folder contains some YAML files, one of which is the following file, named template-study.yaml:
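A sketch of what such a template could look like; the {{...}} placeholder syntax and all field names are assumptions for illustration:

```yaml
# templates/template-study.yaml (illustrative)
name: tuning-{{system_name}}
system: {{system_name}}
numberOfTrials: {{trials}}
```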

When launching the command akamas build scaffold SCAFFOLDING_DIR_NAME/, a new folder output is created inside SCAFFOLDING_DIR_NAME, along with two sub-folders, test and prod. Each sub-folder contains the templates rendered with the values set in the corresponding variables file. The create by folder command then makes it easy to create these entities in bulk.
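Conceptually, the scaffold build renders every template once per variables file and writes the result into one sub-folder per file. A minimal runnable sketch of that idea, using Python's $-style placeholders instead of the real Akamas template syntax (which this page does not show):

```python
# Sketch of the "render each template once per variables file" idea.
# Names, values, and placeholder syntax are illustrative assumptions.
from string import Template

# One dict per file in the variables/ folder (e.g. test.yaml, prod.yaml)
variable_sets = {
    "test": {"system_name": "webapp-test", "trials": "1"},
    "prod": {"system_name": "webapp-prod", "trials": "3"},
}

# A generic template, standing in for templates/template-study.yaml
study_template = Template(
    "name: tuning-$system_name\n"
    "system: $system_name\n"
    "numberOfTrials: $trials\n"
)

def render_all(template, variable_sets):
    """Return one rendered document per output sub-folder (test, prod)."""
    return {env: template.substitute(values) for env, values in variable_sets.items()}

outputs = render_all(study_template, variable_sets)
print(outputs["test"])
```

The real command also walks whole folders of templates and writes the rendered files to disk; the sketch only captures the render-per-variable-set mechanism.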

Create command

Create the Akamas resource described in the provided YAML file.

Create by file/folder

You can also omit the resource type from the create command and use the -f flag instead to create most of the resources with a YAML file in a single command. The supported resources are component, system, optimization-pack, study, workflow, telemetry-instance, and telemetry-provider. To use this feature, you must add the kind key inside each YAML file. Furthermore, the system key must be added when the resource has to be attached to a system (this applies to telemetry instances and system components).

For example, to create a new telemetry instance, you should add the following to your YAML file:
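Following the requirements above (the kind key always, plus the system key for resources attached to a system), the additions would look like this, where SYSTEM_NAME stands for your system's name:

```yaml
kind: telemetry-instance
system: SYSTEM_NAME
# ...rest of the telemetry instance definition
```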

Then you can use the akamas create -f <filename.yaml> command instead of akamas create telemetry-instance <filename.yaml> SYSTEM_NAME.

Similarly, to create a new telemetry-provider (which does not need the system attribute), you just need to specify the kind in your YAML file:
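A sketch of the addition:

```yaml
kind: telemetry-provider
# ...rest of the telemetry provider definition
```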

This also works for optimization packs. For standard optimization packs provided by Akamas, you need to write a YAML file such as:
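A sketch of such a file; the kind key is the part required by this mechanism, while the name field and its value are assumptions for illustration:

```yaml
kind: optimization-pack
name: Web-Application   # illustrative pack identifier
```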

If you want to install a custom optimization pack, you can also supply a JSON file. In this case, there is no need to specify the kind attribute.

Finally, if you supply a folder to the command akamas create, it will process all files inside this folder and create all the requested resources. You can, for example, use one of the output folders created by the command akamas build scaffold. Let's assume we have a folder named scaffold that contains the following files:
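For instance (file names are illustrative):

```
scaffold/
├── system.yaml
├── component.yaml
├── workflow.yaml
└── study.yaml
```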

the following command will process all of the files above and (if they are correct) create all the resources described inside them:
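Assuming the files each carry their kind key (and system key where needed), the bulk creation boils down to:

```shell
akamas create -f scaffold
```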

Delete command

Delete an Akamas resource, identified by UUID or name.

with the following options:

| Option | Short option | Type | Description |
| --- | --- | --- | --- |
| --force | -f | Flag | Force the deletion of the resource(s) |

Delete by file/folder command

Similarly to the create command, you can use the flag -f to delete the supplied resources. See the section Create by file/folder for the supported resources and the additional required fields.

All resources created with the command akamas create -f <folder> can also be deleted with the opposite command, akamas delete -f <folder>. The only difference is that akamas delete -f supports an additional --complete flag: when supplied, all supported objects are deleted, including optimization packs and telemetry providers; without it, optimization packs and telemetry providers are left in place.
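For example, to delete everything previously created from a folder, including optimization packs and telemetry providers:

```shell
akamas delete -f scaffold --complete
```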

List command

List the resources for the selected type with their id, name, and description. Additional resource-specific fields can be shown.

with the following options:

| Option | Short option | Type | Values | Default | Description |
| --- | --- | --- | --- | --- | --- |
| --no-pagination | -no-pag | Flag | | | Show all resources without pagination |
| --use-seconds | -u-s | Flag | | | Output durations in seconds |
| --sort-asc, --sort-desc | -s-asc, -s-desc | Flag | | | Sort items by creation time, ascending or descending |
| --output | -o | Choice | table, json, yaml | table | Switch the output to table (default), json, or yaml |

List experiments

| Option | Short option | Type | Description |
| --- | --- | --- | --- |
| --bookmarked | -b | Flag | List only bookmarked experiments |

List trials

If experiment-id is omitted, trials from all experiments of the study are listed.
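The implied invocation shape (argument names are placeholders, and the exact positional syntax is an assumption):

```shell
akamas list trials STUDY_ID EXPERIMENT_ID   # trials of a single experiment
akamas list trials STUDY_ID                 # trials of every experiment in the study
```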

Describe command

Describe an Akamas resource with all its fields.

with the following options:

| Option | Short option | Type | Values | Default | Description |
| --- | --- | --- | --- | --- | --- |
| --output | -o | Choice | table, json, yaml | table | Switch the output to table (default), json, or yaml |

Notice that this command does not support the resource type System.

Update command

Update an Akamas resource, identified by UUID or name.

with the following options:

| Option | Short option | Type | Values | Default | Description |
| --- | --- | --- | --- | --- | --- |
| --output | -o | Choice | table, json, yaml | table | Switch the output to table (default), json, or yaml |

Update experiment command

Update an experiment, identified by the ID of the study and the experiment.

with the following options:

| Option | Type | Values | Description |
| --- | --- | --- | --- |
| --approve-configuration | Flag | | Approve a waiting experiment |
| --parameter | String | List of key-value pairs | Update the experiment's configuration with the values provided in the key-value pairs |
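For example (IDs and the parameter name are placeholders, and the exact key=value syntax for --parameter is an assumption):

```shell
akamas update experiment STUDY_ID EXPERIMENT_ID --approve-configuration
akamas update experiment STUDY_ID EXPERIMENT_ID --parameter jvm.heap_size=1024
```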

Update study command

Update the properties of a Study. Optionally, a YAML file can be supplied to update the study goal and constraints without re-running experiments.

with the following options:

| Option | Type | Values | Description |
| --- | --- | --- | --- |
| --exploration-factor | | 0–1 or FULL-EXPLORATION | Set the exploration factor: 0 = no exploration, 1 = full space exploration for non-categorical parameters |
| --safety-factor | Float | 0–1 | Set the safety factor: 0 = least safe, 1 = safest |
| --engine-version | String | | Set the optimizer engine image version |
| --approval | Choice | automatic, manual | Set the approval mode for recommended configurations |
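For example (the study name is a placeholder):

```shell
akamas update study my-study --safety-factor 0.5 --approval manual
```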

Update trial command

Update the state of a trial, identified by the study, experiment, and trial.

with the following options:

| Option | Type | Description |
| --- | --- | --- |
| --fail | Flag | Mark the trial as failed |
| --finished | Flag | Mark the trial as finished |
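For example (IDs are placeholders, and the positional syntax is an assumption):

```shell
akamas update trial STUDY_ID EXPERIMENT_ID TRIAL_ID --fail
```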

Install command

Install a License or an Optimization Pack

with the following options:

| Option | Short option | Type | Description |
| --- | --- | --- | --- |
| --force | -f | Flag | Force the installation of the resource |

Uninstall command

Uninstall a License or an Optimization Pack

with the following options:

| Option | Short option | Type | Description |
| --- | --- | --- | --- |
| --force | -f | Flag | Force the uninstall of the resource |

Start command

Start the execution of a Study.

Finish command

Terminate the execution of a Study or a specific Experiment. Finished studies can be resumed.

Resume command

Resume the execution of a stopped Study.

The resume process can restart the study in three ways:

  • by creating a new experiment and running it (with option -m NEW)

  • by deleting the last failed experiments, then creating a new experiment and running it (with option -m DEL)

  • by deleting all failed trials of the last experiment and then resuming from the current experiment, if applicable (with option -m KEEP). This is the default behavior. The experiment is resumed if it is multi-trial (e.g. 24 trials per experiment) and there are still trials to be processed (e.g. only 10 successful trials so far). If the experiment is single-trial and already has a valid trial, resuming the study creates a new experiment.
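The three modes above map to the -m option; for example (the study name is a placeholder and the positional syntax is an assumption):

```shell
akamas resume study my-study          # default: -m KEEP
akamas resume study my-study -m NEW
akamas resume study my-study -m DEL
```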

Export command

To export a study from the command line, either the study name or the study UUID can be used.

An optional filename can be specified, with a relative or absolute path:
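Putting the two statements above together (the subcommand shape and file name are illustrative):

```shell
akamas export study my-study
akamas export study my-study /tmp/my-study.tar.gz
```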

with the following options:

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| --show-secrets | Flag | | Export without masking protected values |
| --timeout / -t | Integer | 600 | Maximum allowed time in seconds for the export |

The exported information will be saved in tar.gz format.

The following entities are exported:

  • The Study

  • The Steps of the Study

  • The Experiments of the Study

  • The Trials of the Study

  • The Workflow to which the Study refers

  • The Timeseries collected during the study run

  • The System to which the Study refers

  • The Component related to the Study's System

  • The ComponentType of each Component

  • The Metrics definitions of each ComponentType

  • The Parameters definitions of each ComponentType

  • The Logs of the Study

Note: this operation can take a long time, depending on the quantity of data to be collected. During this time, the CLI waits for Akamas to send the exported package. Do not interrupt the CLI during this phase; otherwise, the process will need to restart from the beginning.

Be mindful that there may be external timeouts imposed by network components such as load balancers or ingress controllers. These external timeouts are beyond Akamas's control and may cause the connection to be terminated prematurely if the export operation exceeds them.

Import command

Before starting the import, please ensure that you have installed the latest versions of the optimization packs. This ensures that the import procedure will bind the studies to the latest optimization pack versions (i.e. the installed ones) instead of importing potentially outdated ones from the source system.

Use the following command to import a study into an existing Akamas instance:

Where FILENAME refers to the file of a previously exported study.
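For example (the file name is a placeholder, and the subcommand shape is an assumption):

```shell
akamas import study my-study.tar.gz
```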

with the following options:

| Option | Short option | Type | Default | Description |
| --- | --- | --- | --- | --- |
| --force | -f | Flag | | Forcibly replace an already existing study (if present) |
| --timeout | -t | Integer | 600 | Maximum allowed time in seconds for the import |

When imported, the following entities will have a new UUID:

  • Study

  • Workflow

  • System

  • Component

  • ComponentType

  • Metrics

  • Parameters

In case a resource that is being imported has the same name as an existing one, the existing entity will not be deleted. The existing entity (with its UUID) will be used instead of the imported one.

All steps, experiments, and trials will maintain the same id and, therefore, the same execution order as the original exported study.

Note: this operation can require a long time. If the CLI shows a timeout error or if the operation is interrupted, the import will continue on the Akamas server. However, the CLI will not show the completion status, so you may need to check the server logs or interface to confirm that the import has finished successfully.

Be aware of external timeouts from network components like load balancers or ingress controllers. If the import operation exceeds these external timeouts, it might be interrupted prematurely.
