Open KPI Store

The Open KPI Store can store any time series of numeric values, together with a set of identifiers, in one generic table and display the result in a chart in System Analysis. The sender of these values can be anything from an APM tool (like Dynatrace or Introscope) to a script running on a host, a weather station, or a coffee machine. The sender just needs to send the data in a given JSON format (see Data Format below).

Concepts

Metric / KPI (Key Performance Indicator)

A metric represents one time-based series of numeric values. In most cases it is based on a continuous stream of time points with a fixed frequency (for example one point per minute). One point in time can hold only one metric value (a number with up to 25 integer digits and 6 decimal digits). Uniqueness has to be guaranteed by the sender and is not checked.

A metric can have a unit (20 characters) and a preferred aggregation method (3 characters: MIN, MAX, SUM, AVG, CNT), which is used for any non-time-based aggregation. The metric description can be up to 128 characters long.

A sender can have an unlimited number of metrics.

Example: file size in MB with the dimension file name. The metric aggregation is SUM, so that the metric value is aggregated as a sum over all maintained file names.
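For illustration, this example could be modeled as follows. This is a sketch only; the sender GUID, type, and values are invented, and the structure follows the Data Format section described below.

# Illustration only: the file-size metric from the example above, expressed as a
# Python dict that mirrors the Open KPI Store JSON data format (see Data Format below).
file_size_sender = {
    "guid": "11112222333344445555666677778888",   # self-assigned 32 character sender GUID
    "meta": {
        "description": "File size collector",
        "type": "FileMonitor",
        "dataEncoding": "STRING",
        "metrics": [{"description": "File size", "unit": "MB", "aggregation": "SUM"}],
        "dimensions": [{"description": "File name"}],
    },
    # one data line per file name; with SUM aggregation the values add up per point in time
    "data": [
        ["20190606133000", "12.5", "/var/log/app1.log"],
        ["20190606133000", "7.25", "/var/log/app2.log"],
    ],
}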

Dimension

In addition to the time dependency, a metric can have up to 18 character-based dimensions. The metric value for one point in time and one dimension value tuple must be unique. This uniqueness has to be guaranteed by the sender and is not checked.

The dimension description can be up to 128 characters long.

Dimensions are unique to the sender, not to one particular metric: if a sender has multiple metrics, they all share the same dimensions. Dimensions are also independent of each other.

The value of the first dimension can be up to 512 characters long; the values of all other dimensions can be up to 128 characters long.

Sender

A sender represents one generic data storage within the Open KPI Store. It can be interpreted as one monitored object that has exactly one sender type; compared to system monitoring, the sender would then correspond to one particular technical system like ABCADM~ABAP. The sender can also be interpreted as a group of similar metrics with similar dimensions; in that case the monitored objects can be modeled as the first dimension. For example, multiple hosts could send their data to the same sender, each using a different value (for example its host name) as the first dimension.

A sender has a self-assigned 32 character GUID (globally unique identifier). It is highly recommended to use a randomized alphanumeric string for this, so that it is unlikely that multiple senders feed the same data storage by accident. The sender also has a 128 character description and a 40 character sender type. The sender type groups similar senders that share common attributes. A sender can have up to 5 attributes (128 characters each). The labels of these attributes can be maintained per sender type (using the configuration menu within the System Analysis UI).
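Since the GUID is self-assigned, the sender has to generate it itself. A minimal sketch of generating a randomized 32 character alphanumeric identifier (any other collision-safe scheme works as well):

import uuid

# Generate a random 32 character hexadecimal string to be used as the sender GUID.
sender_guid = uuid.uuid4().hex   # e.g. 'e98ea98bcce2dcd78f4dbc65a83d7189'
print(sender_guid)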

Reasonable attributes might describe the location (data center, geographic location like country, region, city) or the technical details of a device (brand, manufacturer, model name).

When comparing to system monitoring, the sender type can be mapped to a technical system type like "ABAP", "JAVA", or "HANA" to categorize the monitored objects. A sender can then be interpreted as an instance of that sender type.

It is recommended to model a sender type in such a way that the metrics and dimensions of all senders within this sender type are the same or at least comparable. If different senders deliver metrics with the same name but different semantics, the end user has to take care of this.

Housekeeping

The Open KPI Store is covered by the RCA housekeeping job SAP_FRN_RCA_HOUSEKEEPER, which partitions the data and log database tables. It also deletes any data entries or logs that are older than 30 days. This can be customized using the SM30 view rca_hkconfig.

Sender meta data will not be cleaned up automatically.

Data Format

The sender must send the data in the following format.

  • A POST request body for the Open KPI Store consists of an array of JSON objects. Each JSON object corresponds to exactly one sender and has three parts: 
    • guid: Each sender must send a unique identifier. This can be a 32 character GUID or a 32 character hash of fields that uniquely identify the sender, for example a hash of the concatenated host name and further values. This value is used as the key for the sender and is not displayed in the UI.
    • meta: the meta data describing the sender and the metric data. After the initial creation, the meta data part is optional. Any change to the meta data can result in inconsistent data!
    • data: The numerical time series data points

The meta sub object contains the following properties:

  • description: human readable description or name of the sender, e.g. a host name
  • type: classification of the sender that can be used for logical grouping and filtering, e.g. "Host"
  • dataEncoding: encoding for the dimension values. One out of STRING (encoded as specified in the request content type), UTF-8, or BASE64 (see the encoding sketch after this list).
  • attributes: optional set of attribute values of a sender, e.g. IP address, geographic location properties. These attributes can be used for filtering senders in the scope selection. The attribute names can be maintained separately in the Open KPI Store settings of the System Analysis application.
  • metrics: enumeration of all metrics that this sender provides. For each metric the following properties are specified:
    • description: human readable metric name. Should be unique across all senders of the same type.
    • unit: The unit of measure to be displayed in the UI (max 20 chars).
    • aggregation: How to aggregate metrics cross sender and time. Possible values: AVG, SUM, MIN, MAX, CNT. Default is SUM if omitted.
  • dimensions: list of dimensions that apply to the metrics. In the case of host-related data this could be e.g. file system name or network interface name.
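If dimension values can contain characters that would break the JSON payload (see the constraints below), they can be sent encoded, as announced by the dataEncoding property. A minimal sketch, assuming the sender declares "dataEncoding": "BASE64" and that the store decodes the dimension values accordingly; the value itself is invented:

import base64

# Encode one dimension value that contains JSON control characters.
raw_dimension_value = 'disk "C:" {system}'
encoded_value = base64.b64encode(raw_dimension_value.encode("utf-8")).decode("ascii")
# encoded_value can now be placed in the data array instead of the raw string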

The data property contains a nested array of the actual data points: Each data point is an array of the following values:

  • timestamp in the format YYYYMMDDHHMMSS
  • One to n metric values corresponding to the metrics specified in the meta.metrics section. The number of metric values must exactly match the number of metrics specified in meta.metrics. The metric values must be ordered in the same way as in the meta data, i.e. the metrics are identified by position.
  • Zero to n dimension values corresponding to the dimensions specified in the meta.dimensions section. The number of dimension values must exactly match the number of dimensions specified in meta.dimensions.

The following constraints apply:

  • Similar senders should use the same type. This allows a better selection in the UI later.
  • If the meta data is missing and the GUID is unknown the request will return an error message!
  • Since the POST request body uses JSON format, JSON control characters (",:{}[]''=) are not permitted in descriptions and values. If the data can contain such characters, it has to be sent encoded. Possible values for the data encoding are UTF-8, BASE64, and STRING.
  • Metric and dimension descriptions are position based. The same applies to the attributes.
  • Data for multiple senders can be pushed in a single call: The outermost JSON array then contains one JSON object for each sender.
  • The data is list/array based. One data line is a list of the values for the time stamp, followed by the metric values, followed by the dimension values. The value for a metric can be empty, specified as "". In that case it is not interpreted. An initial metric value should be "0". If for a specific time stamp and combination of dimension values a metric value cannot be given, or the dimension values are not applicable to a metric, then the position of this metric value must be filled with "". The position of this metric must not be skipped completely. For example, if you specified 20 metrics in the meta section but in the current request only want to report the value for metric 17, the data point would look like the following (a short sketch that builds such a line follows this list):
    ["timestamp1", "", …, "", "val17", "", …, "", "dim1", "dim2", "dim3"]
  • If a data line cannot be interpreted (for example the metric value is not a number) it will be ignored.
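The following is a minimal sketch (illustrative only; the metric count, index, values, and time zone handling are assumptions) of how such a sparse data line can be built, including a timestamp in the required YYYYMMDDHHMMSS format:

from datetime import datetime

NUM_METRICS = 20   # number of metrics declared in meta.metrics (illustrative)

# Timestamp in the YYYYMMDDHHMMSS format; which time zone to use depends on your setup.
timestamp = datetime.now().strftime("%Y%m%d%H%M%S")

# "" means: no value for this metric at this point in time (the position must not be skipped)
metric_values = [""] * NUM_METRICS
metric_values[16] = "42.5"          # metric 17 is identified by position (index 16)

dimension_values = ["HOST1", "DISK1", ""]   # dimension values are position based as well

data_line = [timestamp] + metric_values + dimension_values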

POST Request Structure

[
 { "guid":"12345678901234567890123456789012",
   "meta":
       { "description":"Sender description",
         "type":"Sender type",
         "dataEncoding":"STRING",
         "attributes":[{"value":"Attribute1 value"}, {"value":"Attribute2 value"}],
         "metrics":[{"description":"Metric1","unit":"MB","aggregation":"SUM"},
                    {"description":"Metric2","unit":"°C","aggregation":"AVG"}],
         "dimensions":[{"description":"Dimension1"},
                       {"description":"Dimension2"}]
       },
   "data":[
       ["timestamp1","Metric1 value","Metric2 value","Dimension1 value1","Dimension2 value1"],
       ["timestamp1","Metric1 value","Metric2 value","Dimension1 value1","Dimension2 value2"],
       ["timestamp1","Metric1 value","Metric2 value","Dimension1 value2","Dimension2 value1"],
       ["timestamp1","Metric1 value","Metric2 value","Dimension1 value2","Dimension2 value2"],
       ...
       ["timestamp2","Metric1 value","Metric2 value","Dimension1 value1","Dimension2 value1"],
       ...
   ]
 }
]

  • The data is list based. One data line is a list of the values for the time stamp, followed by the metric values, followed by the dimension values. A metric value can be empty (""); in that case it is not interpreted. An initial metric value should be "0".
  • The timestamp format is YYYYMMDDHHMMSS.
  • If a data line cannot be interpreted (for example a metric value is not a number), it is ignored.

Once the sender is known to the Open KPI Store, the meta part can be omitted in subsequent requests:

[
 { "guid":"12345678901234567890123456789012",
   "data":[
       ["timestamp1","Metric1 value","Metric2 value","Dimension1 value","Dimension2 value"],
       ["timestamp1",...],
       ...
       ["timestamp2","Metric1 value","Metric2 value","Dimension1 value","Dimension2 value"],
       ...
   ]
 }
]

Format for n metrics:

If you only want to provide data for metric 17, the values for all other metrics should be "".

[
  ["timestamp1", "", …, "", "val17", "", …, "", "dim1", "dim2", "dim3"],
  ["timestamp2", "", …, "", "val17", "", …, "", "dim1", "dim2", "dim3"]
]

If you want to provide data for metric 17 for multiple dimension value combinations at the same time stamp, it would look like this:

[
  ["timestamp1", "val1", …, "", "val17", "", …, "", "HOST1", "DISK1", ""],
  ["timestamp1", "val1", …, "", "val17", "", …, "", "HOST1", "DISK2", ""],
  ["timestamp1", "val1", …, "", "val17", "", …, "", "HOST1", "DISK3", ""],
  ["timestamp1", "val1", …, "", "val17", "", …, "", "HOST2", "DISK1", ""],
  ["timestamp1", "val1", …, "", "val17", "", …, "", "HOST2", "DISK2", ""],
  ["timestamp1", "val1", …, "", "val17", "", …, "", "HOST2", "DISK3", ""],
  ["timestamp2", "val1", …, "", "val17", "", …, "", "dim1", "dim2", "dim3"]
]

Examples

Example 1: POST request for host-related data sent by an Introscope agent. Note that the pseudo-comments used in the examples are just for illustration and would not be valid in an actual JSON payload.

[{
 "guid": "e98ea98bcce2dcd78f4dbc65a83d7189",
 "meta": {
  "description": "wdflbmd16609|SAP Netweaver|JC3_J00_server0",
  "type": "Introscope Agent",
  "dataEncoding": "STRING",
  "attributes": [{
    "value": "wdflbmd16609"
   }, {
    "value": "SAP Netweaver"
   }, {
    "value": "JC3_J00_server0"
   }, {
    "value": "SuperDomain"
   }, {
    "value": "mo-902eb6b3a.mo.sap.corp"
   }
  ],
  "metrics": [{
    "description": "Value",
    "unit": ""
   }
  ],
  "dimensions": [{
    "description": "Resource"
   }, {
    "description": "Metric"
   }
  ]
 },
 "data": [
  ["20190116080745", "425813104", "VM|Memory|Non Heap", "Used"],
  ["20190116080745", "691528056", "VM|Memory", "Java Heap Memory Consumption"],
  ["20190116080745", "0", "VM|Memory Pool|Par Eden Space", "Used After GC"],
  ["20190116080745", "0", "VM|OS", "System Time (%)"]
 ]
}]

Example 2: POST request for Dynatrace host data

[{
 "guid": "57b03fd8345531de708430494a9e1629",
 "meta": {
      "description": "myhost1",
      "type": "Host",
      "dataEncoding": "STRING",
       "attributes": [{
             "value":"10.20.30.40" // IP address
        },{
             "value":"Linux" // Operating System
        },{
             "value":"dc-eu-1" // data center
       }],
       "metrics": [{
           "description": "CPU Usage","unit": "%"
        },{
           "description": "Disk space available","unit": "%"
        }],
       "dimensions": [{
             "description": "File System"
        }]
 },
 "data": [
  ["20190116071900", "55.75", "", ""], // metric value 1 = CPU Usage, metric value 2 empty, dimension empty since not applicable for CPU
  ["20190116071900", "", "95.0", "/usr"], // metric value 1 empty, metric value 2 for file system /usr
  ["20190116071900", "", "95.0", "/var"], // metric value 1 empty, metric value 2 for file system /var
  ["20190116072000", "80.34", "", ""] // next time period
]
}]
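To push such a payload, the sender issues an HTTP POST request with the JSON body shown above. A minimal sketch using Python and the requests library; the endpoint URL, authentication, and the concrete metric values are placeholders and must be replaced with the values of your installation (the inbound endpoint itself is not described in this section):

import json
import requests

# Placeholder URL - the real Open KPI Store inbound endpoint depends on your installation.
OPEN_KPI_URL = "https://<your-host>/<open-kpi-inbound-path>"

# Follow-up request for the sender of Example 2: once the sender is known to the store,
# the meta part can be omitted and only new data points are sent (values invented).
payload = [{
    "guid": "57b03fd8345531de708430494a9e1629",
    "data": [
        ["20190116072100", "61.2", "", ""],
        ["20190116072100", "", "94.8", "/usr"],
    ],
}]

response = requests.post(
    OPEN_KPI_URL,
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
    auth=("<user>", "<password>"),   # placeholder authentication
    timeout=30,
)
response.raise_for_status()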

Example 3: POST request for data from weather stations. Here it was decided to also add the country code and station name as dimensions of the metrics, in addition to keeping them as sender attributes. This is not necessary; the designer of the data collector can decide which modeling fits best.

[{
"guid": "265FBF6CABF91ED2F3B7DCFEC3AE42BC",
"meta": {
    "description": "Frankfurt",
    "type": "Weather",
    "dataEncoding": "STRING",
    "attributes": [{
        "value": "DE"
    }, {
        "value": "Frankfurt"
    }],
    "metrics": [{
        "description": "Temperature","unit": "°C","aggregation": "AVG"
    }, {
        "description": "Humidity","unit": "%","aggregation": "AVG"
    }, {
        "description": "WindSpeed","unit": "km/h","aggregation": "AVG"
    }, {
        "description": "Barometric Pressure","unit": "hPa","aggregation": "AVG"
    }],
    "dimensions": [{
        "description": "Country Code"
    }, {
        "description": "Station Name"
    }]
    },
    "data": [
        ["20190606133000","18","59","04","1013","DE","Frankfurt"]
    ]
}]

ABAP Reports

Sender Deletion

The ABAP report RCA_GS_SENDER_DELETION enables the user to delete senders from the Open KPI Store. This includes meta information, metric data, and logs. The deletion is irreversible.

  1. Use transaction SE38 to execute the program RCA_GS_SENDER_DELETION.
  2. Enter the unique sender GUID you want to delete. The F4 help offers an easy way to select one or multiple senders from the system.
  3. Hit F8 to start the deletion.

Please make sure that the collector of the sender is disabled as well. As long as the collector is sending data and meta information to the Open KPI inbound processing, the sender will reappear in the scope selection.


Dynatrace Collector

The ABAP report RCA_GS_DYNATRACE enables the user to pull Dynatrace data into the Open KPI Store without creating their own collector. It is mainly meant to be executed as a background job.


After entering the Dynatrace URL, the environment UUID, the environment name, and the access token, the user can click the "Load Metric Definition" button. This triggers the retrieval of the metric catalog from the Dynatrace environment, which is also used for the Dynatrace metric selection F4-Help.

In addition to the required fields, the user can also provide a tag defined in Dynatrace to narrow down the object selection on the Dynatrace side. Dynatrace tags can be used, for example, to map hosts and processes to Cloud Foundry spaces.

After loading the metric catalog, the user can use the F4-Help to select each metric they want to pull from the given Dynatrace environment.


Besides the available metric ID and description the F4-Help also shows the sender type this metric will be put into.

Attention: After selecting a subset of metrics here and executing the program as a background job, it is not possible to change the selection for this environment, tag, and sender type anymore. Any change to the metric selection will result in data inconsistency! We recommend deleting all senders of the sender type in question before changing the metric subset.

If the user wants to maintain multiple environments and/or tags, they need to create one background job per environment and tag combination.

Dynatrace collector results:

  • For every maintained metric set a new sender type will appear (for example "Dynatrace - Host" for above selection)
  • Based on the data that is coming from the Dynatrace environment a dynamic number of senders will be created in the scope selection (one for each host in our example)
  • For easier selection each sender will have the environment name and tag as an attribute value attached to it
  • Senders in the scope selection are unique for the environment UUID, tag and sender type combination to ensure data consistency.
  • Each sender within a Dynatrace sender type features the same metrics and dimensions.