Please note: This functionality was replaced by the new Aggregation Framework in SAP Focused Run 3.0 FP02.
System Analysis Aggregation allows you to retain selected metric data from the System Monitoring (UDM) store beyond the standard retention time of that store. The metric data to be retained is selected via variants. Based on these variants, a daily task in Expert Scheduling Management collects and aggregates data from the System Monitoring store and persists it in a dedicated store.
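Conceptually, the daily task rolls raw metric readings up to the target granularity configured in the variant. The following is a minimal sketch of such a roll-up; the sample readings, the averaging, and the hourly granularity are illustrative assumptions, not the actual UDM schema or aggregation logic:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw metric readings as (epoch_seconds, value) pairs, standing
# in for data from the System Monitoring (UDM) store. Not the real schema.
raw = [(60, 10.0), (120, 20.0), (3630, 40.0)]

# Roll the readings up into fixed buckets of the target granularity (1 hour).
granularity = 3600
buckets = defaultdict(list)
for ts, value in raw:
    buckets[ts // granularity * granularity].append(value)

# One aggregated value per bucket; the real store keeps more statistics.
aggregated = {start: mean(values) for start, values in buckets.items()}
print(aggregated)  # → {0: 15.0, 3600: 40.0}
```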
System Analysis Aggregation supports two different use cases out of the box:
Note: “System Analysis Configuration” and “System Analysis Aggregation” are used interchangeably in this document; both refer to the same application.
Simple-to-use variant configuration, which allows users to maintain the systems, metrics, target granularity, and aggregation store for the target execution.
Aggregation functionality that runs once daily according to the configured variants.
The target groups are service providers and customers who need to retain aggregated data beyond the standard retention time of the System Monitoring store. This data is intended for reporting and analysis.
General Availability with SAP Focused Run 1.0 FP01. The Release Notes are available in the SAP Help Portal of SAP Focused Run.
The Expert Scheduling Management application is set up. Refer to the “Expert Scheduling Management Cockpit” application under the SAP Focused Run Expert Portal.
Refer to the following sections in the master guide of SAP Focused Run to set up System Analysis Aggregation:
Note: The below configuration steps are shown from SAP Focused Run 2.0 FP02.
It is possible to configure aggregation of System Monitoring data from System Analysis Configuration application available under Advanced Root Cause Analysis.
If the task has not yet been created in Expert Scheduling Management, the interface to maintain the task pops up by default when you launch the System Analysis Configuration application. The log store and the collection timestamp can also be maintained there.
Note: The screenshots are taken from SAP Focused Run 2.0 FP02.
You need to maintain the following attributes for a variant:
3 Overlap Period / Lookback time (in days)
4 Retention Period (in days): After this period, data is deleted from the aggregate store
5 Aggregation Granularity
a. 5 minutes
b. 15 minutes
6 Customer Network (If maintained empty, all customer networks are considered)
7 Data Store (at least one is mandatory): “Standard Aggregate Table Store” is the only option supported for consumption in the System Analysis application.
8 Technical System (at least one is mandatory)
a. Data Center
b. System Type
c. IT Admin Role
d. Lifecycle Status
e. Extended System ID
9 Metric (at least one is mandatory)
a. Metric Category
b. Metric Name
You can perform the following actions on variants in the System Analysis Configuration application.
Create: You can create a variant by providing the relevant information needed for the variant. By default, the variant is active.
Edit: You can edit a variant by providing the relevant information needed for the variant. It is not possible to change the name of the variant. The variant retains the active/inactive status.
Delete: You can delete the variant from the configuration.
Calculate duplicates: You can check for duplicates across variants. The check finds out whether Extended System IDs and Metrics are duplicated across variants.
Activate: You can change the state of the variant to active. Only active variants are considered for aggregation during the task run.
Deactivate: You can change the state of the variant to inactive. Inactive variants are not considered for aggregation during the task run.
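The “Calculate duplicates” check above can be pictured as a pairwise intersection of the (Extended System ID, Metric) combinations covered by each variant. A minimal sketch; the variant names and data here are made up for illustration:

```python
from itertools import combinations

# Hypothetical variants, each mapped to the (extended system ID, metric)
# pairs it covers; the real check works on the configured variant data.
variants = {
    "PROD_ABAP": {("ABC~ABAP", "Dialog Response Time"),
                  ("DEF~ABAP", "Dialog Response Time")},
    "ALL_PERF":  {("ABC~ABAP", "Dialog Response Time"),
                  ("XYZ~HANA", "CPU Utilization")},
}

# Collect every pair of variants that shares at least one combination.
duplicates = []
for (name_a, set_a), (name_b, set_b) in combinations(variants.items(), 2):
    overlap = set_a & set_b
    if overlap:
        duplicates.append((name_a, name_b, sorted(overlap)))

print(duplicates)
```

A combination appearing in two active variants would be aggregated twice, which is why the check is offered before activation.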
Example configuration: You want to aggregate all performance metrics of all customer networks for productive ABAP systems and retain the data for one year. The granularity of the aggregated data should be one hour, and the overlap between System Monitoring raw (UDM) data and aggregated data should be minimal.
Notes: The overlap period (between the UDM store and the aggregate store) of one day indicates that the oldest data of the UDM store should be aggregated. For a default UDM store retention time of 28 days this means that, e.g., on September 9 the data of August 11 will be aggregated. Since the aggregate store is configured to cover a period of 365 days, both stores in combination cover:
(UDM store retention) + (aggregate store retention) – (overlap) = 28 + 365 – 1 = 392 days
In System Analysis, you can therefore see 392 days of data in this example (data from both the System Monitoring and the System Analysis Aggregation store).
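The arithmetic above can be sketched as follows; the values are those of the example configuration, not product defaults:

```python
# Combined visible period when UDM raw data and the aggregate store overlap.
# All values come from the example configuration above, not from defaults.
udm_retention_days = 28         # System Monitoring (UDM) store retention
aggregate_retention_days = 365  # variant's Retention Period
overlap_days = 1                # variant's Overlap Period / Lookback time

combined_days = udm_retention_days + aggregate_retention_days - overlap_days
print(combined_days)  # → 392
```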
Example configuration: You want to export all exception metrics for HANA systems to CSV files daily for the preceding business day.
The destination folder for the generated CSV files can be configured via variable SYSANA_REPORT_DIR in transaction SFILE. The user executing the export (typically FRN_BTC_SRA) needs the role “SAP_FRN_AAD_SYA_ALL” to write to the file system.
In the above example, the initial run covers 28 days of data aggregation, which is then exported to CSV. To avoid performance issues when the number of systems is large, it is recommended to start with a small metric set for the initial aggregation and to increase the number of metrics in the variant for subsequent task runs.
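In outline, such a daily export writes one CSV file per day into the configured destination folder. The file name, column layout, and sample row below are assumptions for illustration; the actual export is performed by the task using the SYSANA_REPORT_DIR destination:

```python
import csv
from datetime import date

# Hypothetical aggregated exception-metric rows for the preceding business
# day; the real export uses the aggregate store's data, not this layout.
rows = [
    {"system": "HDB~HANA", "metric": "Backup Exceptions",
     "timestamp": "20190630120000", "value": 2},
]

# One file per day; this naming scheme is an assumption, not the product's.
report_day = date(2019, 6, 30)
path = f"sysana_export_{report_day:%Y%m%d}.csv"
with open(path, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["system", "metric", "timestamp", "value"])
    writer.writeheader()
    writer.writerows(rows)
```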
The status of the aggregation task needs to be monitored regularly in the “Expert Scheduling Management cockpit” application under “Infrastructure Administration”. Select the “System Analysis” task to see its status.
It is also possible to jump to the task directly from the System Analysis Configuration application.
When the overlap period and the number of systems are large, you may want to run the report program manually to aggregate the data in chunks. In the first run, there is a chance that the task runs for a long time or is canceled due to performance constraints. The manual runs need to cover the overlap period, and it is suggested to run them daily in the background.
Example: If the overlap period is 28 days, it is recommended to run the report program in the background 28 times, each run covering one day of the raw-data time range in the System Monitoring store.
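The per-day timestamp ranges for those 28 background runs can be sketched as follows; the start date is an assumed example, and the YYYYMMDDhhmmss format matches the selection parameters of the SYSANA_AGGR_EXECUTION report described below:

```python
from datetime import date, datetime, timedelta

# Sketch: generate one (start, end) timestamp pair per day of the overlap
# period, in the YYYYMMDDhhmmss format used by the report's selection screen.
# The 28-day period and the start date are example values, not defaults.
def daily_ranges(first_day: date, days: int):
    for offset in range(days):
        day = first_day + timedelta(days=offset)
        start = datetime(day.year, day.month, day.day)
        end = start + timedelta(days=1) - timedelta(seconds=1)
        yield start.strftime("%Y%m%d%H%M%S"), end.strftime("%Y%m%d%H%M%S")

for start_ts, end_ts in daily_ranges(date(2019, 6, 3), 28):
    print(start_ts, end_ts)  # one background run per printed range
```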
1. Expert Scheduling Management is already configured.
2. A System Analysis Configuration task is created, and at least one variant is created through “System Analysis Configuration”.
1. In the Home page, navigate to “Expert Scheduling Management cockpit”.
2. Select the “System Analysis” task, click on “Manage” and “Deactivate”.
3. On the column “Active Status”, confirm that the status of the “System Analysis” task is “Task Deactivated”.
Run the report
1. Execute the program “SYSANA_AGGR_EXECUTION”
i. Variant Name (assumption: there are no duplicates maintained in System Analysis Configuration)
ii. Start Time stamp (example: 20190630000000; YYYYMMDDhhmmss)
iii. End Time stamp (example: 20190630235959; YYYYMMDDhhmmss)
i. Records aggregated
3. Run the report in the background as many times as there are days in the overlap period. If there is no data for a specific day in the System Monitoring store, no aggregation is done for that day.
You can check the table “SYSANA_AGGR_DATA” for aggregated data.
1. In the Home page, navigate to “Expert Scheduling Management cockpit”.
2. Select the “System Analysis” task, click on “Manage” and “Activate”.
3. On the column “Active Status”, confirm that the status of the “System Analysis” task is “Task Activated”.
How to find the oldest timestamp in System Monitoring data store?
Option 1: You can check the report program “MAI_UDM_STORE_PARTITIONING” for the maintained partition settings and calculate the oldest timestamp in the System Monitoring data store.
Option 2: You can use the class method “CL_SYA_MAI_UTILS->GET_UDM_STORE_MIN_RAW” in transaction SE24 to find the oldest timestamp in the System Monitoring data store.
You need to use the following programs to manage personal data.
SYSANA_PERS_DATA_DELETE – Report program to delete a user if the user exists in the System Analysis Aggregation application.
SYSANA_PERS_DATA_USAGE – Report program to check whether a user ID is maintained in the System Analysis Aggregation application.