
Install CSV provider

To install the CSV File provider, create a YAML file (called provider.yml in this example) with the specification of the provider:

# CSV File Telemetry Provider
name: CSV File
description: Telemetry Provider that enables the import of metrics from a remote CSV file
dockerImage: 485790562880.dkr.ecr.us-east-2.amazonaws.com/akamas/telemetry-providers/csv-file-provider:3.1.0

Then you can install the provider with the Akamas CLI:

akamas install telemetry-provider provider.yml

CSV provider

The CSV provider collects metrics from CSV files and makes them available to Akamas. It offers a very versatile way to integrate custom data sources.

Prerequisites

This section provides the minimum requirements that you should meet before using the CSV File telemetry provider.

Network requirements

The following requirements should be met to enable the provider to gather CSV files from remote hosts:

  • Port 22 (or a custom one) should be open from the Akamas installation to the host where the files reside.

  • The host where the files reside should support SCP or SFTP protocols.
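A quick way to verify the first requirement is to test TCP reachability of the SSH port from the Akamas host. Below is a minimal Python sketch; the host name host1.example.com is a placeholder for the host where your CSV files reside:

```python
import socket

def can_reach(host: str, port: int = 22, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failures, connection refusals, and timeouts
        return False

# "host1.example.com" is a placeholder for the host where the CSV files reside
print(can_reach("host1.example.com", 22))
```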

Permissions

  • Read access to the CSV files that are the target of the integration

Akamas supported version

  • Versions < 2.0.0 are compatible with Akamas up to version 1.8.0

  • Versions >= 2.0.0 are compatible with Akamas from version 1.9.0 onward

Supported component types

The CSV File provider is generic and allows integration with any data source; therefore, it does not come with support for a specific component type.

Setup the data source

To operate properly, the CSV file provider expects the presence of four fields in each processed CSV file:

  • A timestamp field used to identify the point in time a certain sample refers to.

  • A component field used to identify the Akamas entity.

  • A metric field used to identify the name of the metric.

  • A value field used to store the actual value of the metric.

These fields can have custom names in the CSV file; you can specify the names to use in the provider configuration.
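As a concrete illustration of this layout, the following Python sketch writes a small CSV file in a shape the provider can process, with a timestamp column (TS), a component column (COMPONENT), and one column per metric. The file name metrics.csv and the values are made up for this example:

```python
import csv

# Hypothetical sample: TS and COMPONENT are the provider's default headers;
# the cpu_util values travel in a metric column named "user%"
rows = [
    {"TS": "2020-04-17T09:46:30", "COMPONENT": "host", "user%": 20},
    {"TS": "2020-04-17T09:46:35", "COMPONENT": "host", "user%": 23},
]

with open("metrics.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["TS", "COMPONENT", "user%"])
    writer.writeheader()
    writer.writerows(rows)
```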

This page describes how to get this telemetry provider installed. Once installed, the provider is shared with all users of your Akamas installation and can be used to monitor many different systems by configuring appropriate telemetry provider instances, as described below.


Create CSV telemetry instances

To create an instance of the CSV provider, build a YAML file (instance.yml in this example) with the definition of the instance:

# CSV Telemetry Provider Instance
provider: CSV File
config:
  address: host1.example.com
  authType: password
  username: akamas
  auth: akamas
  remoteFilePattern: /monitoring/result-*.csv
  componentColumn: COMPONENT
  timestampColumn: TS
  timestampFormat: YYYY-MM-dd'T'HH:mm:ss

Then you can create the instance for the system using the Akamas CLI:

akamas create telemetry-instance instance.yml system

timestampFormat format

Notice that the week-year format YYYY is compliant with the ISO-8601 specification, but you should replace it with the year-of-era format yyyy whenever you specify a timestampFormat different from the ISO one. For example:

  • Correct: yyyy-MM-dd HH:mm:ss

  • Wrong: YYYY-MM-dd HH:mm:ss

You can find detailed information on timestamp patterns in the Patterns for Formatting and Parsing section of the DateTimeFormatter (Java Platform SE 8) documentation.
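The week-year pitfall only bites near year boundaries, which makes it easy to miss in testing. A quick Python check shows why the two formats diverge: the ISO week-based year of a late-December date can differ from its calendar year.

```python
from datetime import date

d = date(2019, 12, 30)
iso_year, iso_week, iso_weekday = d.isocalendar()

print(d.year)     # calendar year: 2019
print(iso_year)   # week-based year: 2020, because this day falls in ISO week 1 of 2020
```

A timestamp from this day formatted with the week-year pattern YYYY would therefore carry the year 2020, even though the date belongs to 2019.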

Configuration options

When you create an instance of the CSV provider, you should specify some configuration information to allow the provider to correctly extract and process metrics from your CSV files.

You can specify this configuration information within the config section of the instance definition YAML.

Required properties

  • address - a URL or IP identifying the address of the host where the CSV files reside

  • username - the username used when connecting to the host

  • authType - the type of authentication to use when connecting to the file host; either password or key

  • auth - the authentication credential; either a password or a key, according to authType. When using keys, the value can either be the value of the key itself or the path of the key file to import

  • remoteFilePattern - a list of remote files to be imported; GLOB patterns are supported

Optional properties

  • protocol - the protocol to use to retrieve files; either scp or sftp. Default is scp

  • port - the port to connect to in order to retrieve the files. Default is 22

  • fieldSeparator - the character used as a field separator in the CSV files. Default is ,

  • componentColumn - the header of the column containing the name of the component. Default is COMPONENT

  • timestampColumn - the header of the column containing the timestamp. Default is TS

  • timestampFormat - the format of the timestamp (e.g. yyyy-MM-dd HH:mm:ss zzz). Default is YYYY-MM-ddTHH:mm:ss

You should also specify the mapping between the metrics available in your CSV files and those provided by Akamas. This mapping is defined in the metrics section of the telemetry instance configuration. To map a custom metric, specify at least the following properties:

  • metric - the name of the metric in Akamas

  • datasourceMetric - the header of the column that contains the metric in the CSV file

The provider ignores any column that is not listed as a datasourceMetric in this section.

For example, an instance that maps the Akamas metric cpu_util to the column user% would import the metric from CSV files formatted as in the example below:

TS,                   COMPONENT,  user%
2020-04-17T09:46:30,  host,       20
2020-04-17T09:46:35,  host,       23
2020-04-17T09:46:40,  host,       32
2020-04-17T09:46:45,  host,       21
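To make the mapping concrete, here is a rough Python sketch of this processing logic (an illustration only, not the provider's actual implementation): each row yields one sample per mapped metric, and columns that are not listed as a datasourceMetric are ignored.

```python
import csv
import io

# An in-memory CSV with one mapped column ("user%") and one unmapped one ("extra")
csv_text = """TS,COMPONENT,user%,extra
2020-04-17T09:46:30,host,20,999
2020-04-17T09:46:35,host,23,999
"""

# metrics section of the instance: Akamas metric -> CSV column header
mapping = {"cpu_util": "user%"}

samples = []
for row in csv.DictReader(io.StringIO(csv_text)):
    for metric, column in mapping.items():
        samples.append({
            "component": row["COMPONENT"],  # componentColumn
            "timestamp": row["TS"],         # timestampColumn
            "metric": metric,
            "value": float(row[column]),
        })

print(samples[0])  # the "extra" column never reaches the samples
```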

Telemetry instance reference

The following represents the complete configuration reference for the telemetry provider instance:

provider: CSV File             # this is an instance of the CSV provider
config:
  address: host1.example.com   # the address of the host with the CSV files
  port: 22                     # the port used to connect
  authType: password           # the authentication method
  username: akamas             # the username used to connect
  auth: akamas                 # the authentication credential
  protocol: scp                # the protocol used to retrieve the files
  fieldSeparator: ","          # the character used as field separator in the CSV files
  remoteFilePattern: /monitoring/result-*.csv    # the path of the CSV files to import
  componentColumn: COMPONENT                     # the header of the column with component names
  timestampColumn: TS                            # the header of the column with the timestamps
  timestampFormat: YYYY-MM-dd'T'HH:mm:ss         # the format of the timestamp
metrics:
  - metric: cpu_util                             # the name of the Akamas metric
    datasourceMetric: user%                      # the header of the column with the original metric
    staticLabels:
      mode: user                                 # (optional) additional labels to add to the metric

The following table reports the configuration reference for the config section:

| Field | Type | Description | Default Value | Restrictions | Required |
|-------|------|-------------|---------------|--------------|----------|
| address | String | The address of the machine where the CSV files reside | - | A valid URL or IP | Yes |
| port | Number (integer) | The port to connect to in order to retrieve the files | 22 | 1 ≤ port ≤ 65536 | No |
| username | String | The username to use in order to connect to the remote machine | - | - | Yes |
| protocol | String | The protocol used to connect to the remote machine | scp | scp, sftp | No |
| authType | String | The method used to authenticate against the remote machine: password uses the value of auth as a password, key uses the value of auth as a private key (RSA and DSA formats are supported) | - | password, key | Yes |
| auth | String | A password or an RSA/DSA key (as a YAML multi-line string, keeping new lines) | - | - | Yes |
| remoteFilePattern | String | The path of the remote file(s) to be analyzed; the path can contain GLOB expressions | - | A list of valid Linux paths | Yes |
| componentColumn | String | The CSV column containing the name of the component; the column's values must match (case-sensitive) the name of a component specified in the System | COMPONENT | The column must exist in the CSV file | Yes |
| timestampColumn | String | The CSV column containing the timestamps of the samples | TS | The column must exist in the CSV file | No |
| timestampFormat | String | The format of the timestamps | YYYY-MM-ddTHH:mm:ss | Must be specified using Java syntax | No |
| fieldSeparator | String | The field separator of the CSV | , | , or ; | No |

The following table reports the configuration reference for the metrics section:

| Field | Type | Description | Restrictions | Required |
|-------|------|-------------|--------------|----------|
| metric | String | The name of the metric in Akamas | An existing Akamas metric | Yes |
| datasourceMetric | String | The name (header) of the column that contains the specific metric | An existing column in the CSV file | Yes |
| scale | Decimal number | The scale factor to apply when importing the metric | - | No |
| staticLabels | List of key-value pairs | A list of key-value pairs that will be attached to the specific metric samples | - | No |
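The GLOB expressions accepted by remoteFilePattern behave like shell globbing. Python's fnmatch can be used to sanity-check a pattern before putting it into an instance definition (this only mimics the matching semantics; it is not the provider's code):

```python
from fnmatch import fnmatch

pattern = "/monitoring/result-*.csv"

print(fnmatch("/monitoring/result-2020-04-17.csv", pattern))  # True
print(fnmatch("/monitoring/result-1.txt", pattern))           # False
```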

Use cases

Here you can find common use cases addressed by this provider.

Linux SAR

In this use case, you are going to import some metrics coming from SAR, a popular UNIX tool to monitor system resources. SAR can export CSV files in the following format:

hostname,  interval,  timestamp,                %user,  %system,  %memory
machine1,  600,       2018-08-07 06:45:01 UTC,  30.01,  20.77,    96.21
machine1,  600,       2018-08-07 06:55:01 UTC,  40.07,  13.00,    84.55
machine1,  600,       2018-08-07 07:05:01 UTC,  5.00,   90.55,    89.23

Note that the metrics are percentages (between 0 and 100), while Akamas accepts percentages as values between 0 and 1; therefore each metric in this configuration has a scale factor of 0.01.

You can import the two CPU metrics and the memory metric from a SAR log using the following telemetry instance configuration, which collects three metrics (two with the same name but different labels, and one with a different name):

  • cpu_util: in the CSV file it is in the column %user; the label mode with value user is attached to its samples.

  • cpu_util: in the CSV file it is in the column %system; the label mode with value system is attached to its samples.

  • mem_util: in the CSV file it is in the column %memory.

provider: CSV File
config:
  remoteFilePattern: /csv/sar.csv
  address: 127.0.0.1
  port: 22
  username: user123
  auth: password123
  authType: password
  protocol: scp
  componentColumn: hostname
  timestampColumn: timestamp
  timestampFormat: yyyy-MM-dd HH:mm:ss zzz
metrics:
  - metric: cpu_util
    datasourceMetric: "%user"
    scale: 0.01
    staticLabels:
      mode: user
  - metric: cpu_util
    datasourceMetric: "%system"
    scale: 0.01
    staticLabels:
      mode: system
  - metric: mem_util
    scale: 0.01
    datasourceMetric: "%memory"

Using the configured instance, the CSV File provider will perform the following operations to import the metrics:

  1. Retrieve the file /csv/sar.csv from the server 127.0.0.1 using the SCP protocol, authenticating with the provided password.

  2. Use the column hostname to look up components by name.

  3. Use the column timestamp to find the timestamps of the samples (which are expected to be in the format specified by timestampFormat).
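As a sanity check of the timestamp pattern outside Akamas: the Java pattern yyyy-MM-dd HH:mm:ss zzz used above corresponds roughly to the Python strptime directives below. The mapping between Java and Python format codes is approximate, and Python's %Z only parses a few zone names such as UTC and GMT:

```python
from datetime import datetime

# Parse one of the SAR sample timestamps from the CSV above
dt = datetime.strptime("2018-08-07 06:45:01 UTC", "%Y-%m-%d %H:%M:%S %Z")
print(dt.isoformat())
```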
