Audit logs

Akamas audit logs

Akamas stores all its logs in an internal Elasticsearch instance: some of these logs are shown to the user in the GUI to ease the monitoring of workflow executions, while others are only accessible via the CLI and are mostly used to provide additional context and information for support requests.

Audit logs can be extracted using the CLI, covering UI access, API access, service, and authentication logs. For instance, to extract the audit logs from the last hour for the current workspace, use the following commands:

  • UI Logs

akamas logs --audit --no-pagination -S akamas-ui -f -1h
  • API Logs

akamas logs --audit --no-pagination -S kong -f -1h
  • Service Logs

# for studies, experiments, trials and steps
akamas logs --audit --no-pagination -S campaign -f -1h
# for licenses and optimization packs
akamas logs --audit --no-pagination -S license -f -1h
# for components, metrics, parameters and systems
akamas logs --audit --no-pagination -S system -f -1h
# for telemetry providers and telemetry provider instances
akamas logs --audit --no-pagination -S telemetry -f -1h
# for workflows
akamas logs --audit --no-pagination -S orchestrator -f -1h
# for users and workspaces
akamas logs --audit --no-pagination -S users -f -1h
  • Authentication logs

  • All audit logs
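Following the same -S pattern, the authentication logs and the full audit stream can presumably be extracted as sketched below (the keycloak service name is inferred from the log types described later on this page, and omitting -S to retrieve logs from all services is an assumption):

```shell
# Authentication logs (assuming they are emitted by the keycloak service)
akamas logs --audit --no-pagination -S keycloak -f -1h
# All audit logs (assuming that omitting -S returns logs from every service)
akamas logs --audit --no-pagination -f -1h
```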

Notice: to view the system logs unrelated to the execution of workflows bound to workspaces, you need an account with administrative privileges.

Remark: If you need to list logs for all workspaces, just add --platform to the commands above.

Storing audit logs into files

Akamas can also be configured to store specific access and usage logs in files, to ease the integration with external logging systems. When this feature is enabled, every time a user interacts with the UI or the API, Akamas stores access and usage logs both in its internal database and in dedicated files in a log folder. Every day Akamas creates new files following the pattern %{logtype}-%{+YYYY-MM-dd}.log, enforcing automatic daily log rotation.

For instance, a log file named campaign-service-2025-01-31.log will hold logs from campaign-service (see below) for January 31st 2025. Here's a breakdown of all available log types:

ui-access: will list Akamas UI accesses

gateway-access: will list Akamas API accesses

keycloak: will list timestamps and userIds performing logins and logouts

campaign-service: will list timestamps and userIds performing creations and deletions of studies, experiments, trials and steps

license-service: will list timestamps and userIds performing installations and uninstallations of licenses and optimization packs

orchestrator-service: will list timestamps and userIds performing creations and deletions of workflows

system-service: will list timestamps and userIds performing creations and deletions of components, metrics, parameters and systems

telemetry-service: will list timestamps and userIds performing installations and uninstallations of telemetry providers and the creations and deletions of telemetry provider instances

users-service: will list timestamps and userIds performing creations and deletions of users and workspaces

access: will list other logs not falling into the above categories, usually logs from the Akamas initializer and Elasticsearch
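As a sketch of how the daily rotation pattern can be used, the following builds today's file name for the campaign-service log type; the relative logs path is an assumption based on the Docker setup described in the next section:

```shell
# Build today's file name following the %{logtype}-%{+YYYY-MM-dd}.log pattern
TODAY_LOG="campaign-service-$(date +%Y-%m-%d).log"
echo "$TODAY_LOG"
# To follow it live (logs/ path is an assumption): tail -f "logs/$TODAY_LOG"
```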

Docker version

To enable this feature you should:

  1. Create a folder named logs in the same directory as the Akamas docker-compose.yml file

  2. Edit the docker-compose.yml file and change the line FILE_LOG: "false" to FILE_LOG: "true"

  3. If Akamas is already running, issue the following command:

Otherwise, start Akamas first.
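With a standard Docker Compose setup, the restart would typically look like the sketch below (the exact command may differ for your installation):

```shell
# Recreate the containers so the new FILE_LOG value takes effect;
# run from the directory containing docker-compose.yml
docker compose up -d
```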

This way, all logs will be saved to the logs folder created in step 1.

Kubernetes version

To enable this feature you should go to your Akamas chart folder, edit your values file (typically values-files/my-values.yaml), and add the following section (if a logstash: section is already present, add the new values to it):

then perform an installation/update as explained in Start the installation.

In this specific case, the logs will be stored in a dedicated volume attached to the logstash pod, under the folder /akamas/logs/.

To list them you can use the command:

To read a log file you can use the command (replace LOGFILENAME.log with the actual name):

To copy them to your local machine you can use:
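Assuming a standard kubectl setup, the three operations above can be sketched as follows; the akamas namespace and the app=logstash label selector are placeholders to adjust to your installation:

```shell
# Find the logstash pod (namespace and label selector are assumptions)
POD=$(kubectl -n akamas get pods -l app=logstash -o jsonpath='{.items[0].metadata.name}')

# List the audit log files
kubectl -n akamas exec "$POD" -- ls /akamas/logs/

# Read a log file (replace LOGFILENAME.log with the actual name)
kubectl -n akamas exec "$POD" -- cat /akamas/logs/LOGFILENAME.log

# Copy a log file to the local machine
kubectl cp "akamas/$POD:/akamas/logs/LOGFILENAME.log" ./LOGFILENAME.log
```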
