Log File Content

Log files are widely used by software components to record execution details. As an operator, you want to identify any critical situation reported in these log files across your managed landscape.

Using SAP Focused Run, you can centrally monitor such error and warning log messages persisted in text files. However, a set of preconditions applies to enable this capability:

  • On the respective hosts, make sure the Simple Diagnostics Agent has the necessary access to the relevant log file(s). Otherwise, consider extending the allow list.
  • The syntax of the log lines in scope must be described by creating and uploading a Log File Content Definition. This enables SAP Focused Run to identify the text portions representing the log timestamp, log severity, and log message. Additional context information can be extracted along with any such log lines.

Implement the latest version of SAP Note 3399091 (Corrections for EXM 3rd party Log File Monitoring).

Integration & Exception Monitoring Setup

To enable the monitoring of relevant log entries, proceed as follows.


Enable the log file collection – Part I: Make sure the log file is accessible

  1. Open the 'File System Browser' application tile in transaction FRUN.
  2. In the scope selection, specify the host of the log file to be monitored.
  3. Navigate to the location of the relevant log file.
    Remark: If the root folder of the log file is not listed, consider extending the directory allow list, as described in KBA 2632477.
  4. Try to open the log file by clicking on the log file name.
    Remark: If you get a permission error while opening the log file in scope, consider extending the file extension allow list, as described in KBA 2603369.


Enable the log file collection – Part II: Create a collection filter of type Log File Content

  1. Having cross-checked that the log file(s) in scope are reachable, open the 'Integration & Exception Monitoring' application. You can start it from the associated tile in transaction FRUN.
  2. Click on the configuration icon in the upper right corner of the 'Integration & Exception Monitoring' application.
  3. In the configuration panel, expand the 'Technical Systems' tray and click on the pen icon in the upper right corner. This opens the 'Integration & Exception Monitoring - Systems' view.
  4. If your system is not on the list yet, click the 'Add' button to add it. If it is on the list, simply click on the System ID of the system.
  5. In the next step, you see all monitoring categories available for the system. Select the monitoring category 'Log File Content'.
  6. Pressing 'Next', you see a filter overview table listing the default or already defined collection filters for that technical system.
  7. In the table header, press the 'Add New Filter' icon to specify which log file you want to monitor.
    Remark:

In the context of Exception Monitoring, you can specify only one filter value per filter parameter within a given collection filter. For instance, you must define three collection filters if you would like to collect exceptions that fulfil any of the following criteria:

  • Having a message containing the string “Program error xyz” and with username “Harry” → Filter 1
  • or having as message “Unhandled error abc” → Filter 2
  • or having as username “Granger” → Filter 3

Likewise, if you would like to monitor several different log file contents, you must define one collection filter per type of log file to be monitored.

Note: With SAP Focused Run 4.0 FP02 you can monitor different types of log files for a given system, under the condition that they are located on different hosts.
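The rule above — each collection filter combines its parameter values with AND, while separate filters are combined with OR — can be sketched as follows. This is purely an illustration of the filter semantics; the dictionaries and helper functions are hypothetical and not part of any SAP Focused Run API, and matching is simplified here to substring containment:

```python
# Each collection filter is an AND over its parameter values;
# separate filters are combined with OR.
def entry_matches(entry, flt):
    """True if the entry satisfies every parameter of one filter (AND)."""
    return all(value in entry.get(param, "") for param, value in flt.items())

def collected(entry, filters):
    """True if any of the defined collection filters matches (OR)."""
    return any(entry_matches(entry, flt) for flt in filters)

# The three filters from the example above:
filters = [
    {"message": "Program error xyz", "username": "Harry"},  # Filter 1
    {"message": "Unhandled error abc"},                     # Filter 2
    {"username": "Granger"},                                # Filter 3
]

entry = {"message": "Unhandled error abc occurred", "username": "Ron"}
print(collected(entry, filters))  # True: Filter 2 matches
```

A single filter cannot express "message contains X OR username is Y"; that disjunction is exactly why three separate filters are needed in the example.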


Enable the log file collection – Part III: Define and maintain the necessary File Content Definitions

  1. Define and maintain, per Customer Network, the necessary File Content Definitions using the Exception Monitoring Administration tool. This tool is accessible via the Customize link.
    Note: You find this navigation link next to the File Content Definition input field.
  2. In the section Custom Log File Content Definitions, choose the Customer Network corresponding to the managed system for which you would like to define a supported log file content.
  3. A Log File Content Definition JSON typically has the following structure:

Log File Content Definition

{
    "logModels": [{
            "id": "<ID of 1st log content definition>",
            "name": "<Name of 1st log content definition>",
            "version": "<Version number placeholder for documentation purposes, like '1.0'>",
            "fileSet": {
                "type": "<Timestamp or Numbering>",
                "format": "yyyy-MM-dd or Ascending",
                "intro": "",
                "trailer": ""
            },
            "logEntry": {
                "utcDiff": "",
                "lines": "1"
            },
            "lineRegExPatterns": [
                {
                    "regEx": "<Regular expression describing the text portion groups in the log line. Note: Backslash characters must be doubled (\\)>",
                    "fields": [{
                            "fieldName": "Log.TimeStamp",
                            "position": "<Reg expr group number, where the log timestamp is positioned>",
                            "timestampFormats": [ "<Array of expected time format strings>" ]
                        },
                        {
                            "fieldName": "Log.Severity",
                            "position": "<Reg expr group number, where the log severity is positioned>",
                            "severityMapping": {
                                "Error": [ "<Array of Strings designating error logs, like 'SEVERE','ERR', etc.>" ],
                                "Warning": [ "<Array of Strings designating warning logs, like 'WARNING','WARN', etc.>" ]
                            }
                        },
                        {
                            "fieldName": "Log.Message",
                            "position": "<Reg expr group number, where the log message is positioned>"
                        },
                        {
                            "fieldName": "<Any relevant additional log context information>",
                            "position": "<Reg expr group number, where this additional parameter is positioned>"
                        },
                        {
                            "fieldName": "Location",
                            "position": "6"
                        }
                    ]
                }
            ]
        }
    ]
}
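To illustrate what such a definition describes, here is a small sketch of how a lineRegExPatterns entry maps regular-expression groups of a log line to the Log.TimeStamp, Log.Severity, and Log.Message fields. The log format, regular expression, and parsing code below are invented for illustration only; SAP's collector is not implemented this way:

```python
import re

# Hypothetical single-line log format:
#   2024-05-01 12:00:03 ERR Connection to backend lost
pattern = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (\w+) (.*)$")

# Mirrors the "fields" section: one regex group number per field,
# plus a severityMapping for the severity group.
fields = {"Log.TimeStamp": 1, "Log.Severity": 2, "Log.Message": 3}
severity_mapping = {"Error": ["SEVERE", "ERR"], "Warning": ["WARNING", "WARN"]}

def parse_line(line):
    m = pattern.match(line)
    if not m:
        return None  # line does not match this content definition
    entry = {name: m.group(pos) for name, pos in fields.items()}
    # Translate the raw severity token via the severityMapping.
    for severity, tokens in severity_mapping.items():
        if entry["Log.Severity"] in tokens:
            entry["Log.Severity"] = severity
    return entry

print(parse_line("2024-05-01 12:00:03 ERR Connection to backend lost"))
# {'Log.TimeStamp': '2024-05-01 12:00:03', 'Log.Severity': 'Error',
#  'Log.Message': 'Connection to backend lost'}
```

The severityMapping translates raw tokens such as 'ERR' into the normalized severities 'Error' and 'Warning', which is what makes the entries ratable by the monitoring.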


Here is a sample, as delivered by SAP:

Sample, as delivered by SAP

{
    "logModels": [{
            "id": "EXM_SAP_WebDispatcher_FileSet",
            "name": "SAP WEBDISPATCHER LOG",
            "version": "1.2",
            "fileSet": {
                "type": "Numbering",
                "format": "Ascending",
                "intro": "",
                "trailer": ""
            },
            "logEntry": {
                "utcDiff": "",
                "lines": "1"
            },
            "lineRegExPatterns": [{
                    "regEx": "^\\[([^\\]]*)\\] (.*) (.*) - \".*(GET|POST|HEAD|PUT|UPDATE) (.*) HTTP\\/[\\d]\\.[\\d]\" ([\\d]{3}) ([\\d]+) ([\\d]+) epp\\[([^\\]]*)\\] ref\\[(.*(siteId=(.*?))?)\\]",
                    "fields": [{
                            "fieldName": "Log.Timestamp",
                            "position": "1",
                            "timestampFormats": [{
                                    "format": "dd/MMM/yyyy:hh:mm:ss Z",
                                    "lang": "en",
                                    "country": "US"
                                }
                            ]
                        }, {
                            "fieldName": "Location",
                            "position": "2"
                        }, {
                            "fieldName": "User",
                            "position": "3"
                        }, {
                            "fieldName": "Method",
                            "position": "4"
                        }, {
                            "fieldName": "URL",
                            "position": "5"
                        }, {
                            "fieldName": "Log.Severity",
                            "position": "6",
                            "severityMapping": {
                                "Error": ["500", "501", "502", "503"],
                                "Warning": ["401", "402", "403", "404", "405"]
                            }
                        }, {
                            "fieldName": "RespSize",
                            "position": "7"
                        }, {
                            "fieldName": "RespTime",
                            "position": "8"
                        }, {
                            "fieldName": "Log.Message",
                            "position": "12"
                        }
                    ]
                }
            ]
        },
        [...]
    ]
}
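The regular expression from this sample can be tried out directly. The log line below is a made-up example in the expected shape, not an actual SAP Web Dispatcher log record; note that the backslashes are doubled inside JSON but written singly in a Python raw string:

```python
import re

# The lineRegExPatterns regex from the SAP sample above:
regex = re.compile(
    r'^\[([^\]]*)\] (.*) (.*) - ".*(GET|POST|HEAD|PUT|UPDATE) (.*) '
    r'HTTP\/[\d]\.[\d]" ([\d]{3}) ([\d]+) ([\d]+) '
    r'epp\[([^\]]*)\] ref\[(.*(siteId=(.*?))?)\]'
)

# Hypothetical log line in that shape:
line = ('[14/Mar/2024:10:15:32 +0000] host1 alice - '
        '"GET /sap/bc/ui HTTP/1.1" 500 1234 56 epp[] ref[-]')

m = regex.match(line)
print(m.group(1))  # 14/Mar/2024:10:15:32 +0000 -> Log.Timestamp (position 1)
print(m.group(4))  # GET                        -> Method (position 4)
print(m.group(6))  # 500                        -> Log.Severity (position 6)
```

Because the HTTP status code 500 appears in the "Error" list of the severityMapping, a line like this one would be rated as an error.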


  4. Click on the Upload JSON link to upload a definition.

CAUTION: Any custom definition you may have already uploaded for the selected Customer Network will be overwritten by this operation. If needed, download the currently used definition before proceeding with a new upload.


Enable the log file collection – Part IV: Specify the log file collection filter

  1. Back in the Integration & Exception Monitoring configuration screen, complete the input fields in the lower panel as described below:

a  Filter Name:
Choose a meaningful name for the collection filter, typically designating the type of log file

b  Host name(s):
Choose the host(s) where that type of log file is present (in the context of the selected system)

c  File Name Pattern:
Enter the name of the log file including the file extension. You can use wildcards to collect a set of files.
Example: catalina.*.log

d  Folder Path:
Specify the absolute path to the folder where the log file(s) are located.
Example:
For Windows hosts: F:\SAP BusinessObjects\tomcat\logs\
For Unix hosts: /usr/sap/BO4/ccw/sap_bobj/tomcat/logs/

e  File Content Definition:
Select the log file content definition corresponding to the log format. The drop-down list contains a few SAP predefined sample definitions. However, as already explained, you can upload additional custom definitions using the Customize link.
Remark: You will notice that the list of Filter Parameters displayed in the table at the bottom depends on the File Content Definition that you selected.

f  Filter Parameters:
Specify the filter values that shall apply to the data collection.
Remark: The values specified in the 'Monitoring' step apply to the forthcoming data being collected from the managed system. If you specify no value at all for the listed filter parameters, all forthcoming data available within the managed system is collected. Please consider the implications for resource consumption.
Note: As described in the following section, you usually define filter groups in the 'Alerting' step, to alert on subsets of the overall collected data for that category.

  2. Press 'Save'.
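As a side note on the File Name Pattern field (item c above): assuming glob-style wildcard semantics, as the catalina.*.log example suggests (the exact matching rules are defined by SAP Focused Run), Python's fnmatch module illustrates which file names such a pattern would cover:

```python
import fnmatch

pattern = "catalina.*.log"
files = ["catalina.2024-05-01.log", "catalina.out", "localhost.2024-05-01.log"]

# Glob-style matching: '*' matches any run of characters.
matched = [f for f in files if fnmatch.fnmatch(f, pattern)]
print(matched)  # ['catalina.2024-05-01.log']
```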


Enable the log file collection – Part V: Restart the Simple Diagnostics Agent (with SAP Focused Run 4.0 FP02)

  1. With SAP Focused Run 4.0 FP02, each time you define or modify such a Log File Content category filter, remember to also restart the impacted Simple Diagnostics Agent(s). Otherwise, the settings will not be taken into account.

Configure Alerts

The setup of the collection filters only ensures that the relevant data is collected. However, alerts are not created automatically. To create alerts and notifications, you have to create alert filters.

  1. Go to the 'Alerting' view.
  2. Click on the '+' button in the upper right corner of the alerts table. A new panel appears below the table for you to enter the alert information.
  3. Select the monitoring category.
  4. Select the metric name. The available metrics depend on the monitoring category.
  5. Enter a name for the alert.

In the sub-panels, you can maintain the alert filter. For most monitoring categories the available filter fields will be the same as for the 'Monitoring' configuration described above. For some metrics, you have additional filter fields.

You can check the collected data in SAP Focused Run to determine which filter values to use for alerting. Most of the fields can be found in the Collection Context of the collected data.

Please consider that on the 'Alerting' level, the filters only apply to data already collected and stored in SAP Focused Run. They do not influence the data collection itself.

  1. Enter a name for the alert filter
  2. Maintain the necessary filter fields

In the last sub-panel, you have to activate the alert and can change other alert attributes:

  1. Check the check box next to 'Active'
  2. You can adjust the calculation frequency and the severity
  3. Available threshold types depend on the selected metric. If the threshold type allows it, you can adjust the threshold value which triggers the alert.
  4. Select the notification variant and the outbound connector variant from the drop-down list.
  5. Enter a description for the alert. You can use the supported placeholders in the alert description.

For single exceptions, the threshold type is always 'Already Rated'. This means that, depending on the calculation frequency, the number of newly collected exceptions is checked, and an alert is created if this number is greater than 0. If you want to reduce the number of alerts for these metrics, you can increase the calculation frequency value to increase the time between alert instance creations.
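The 'Already Rated' behaviour and the effect of the calculation frequency can be sketched as a simple counting rule (the helper below is hypothetical, not an SAP Focused Run API):

```python
def rate_cycle(new_exception_count):
    """'Already Rated' threshold: an alert is raised whenever at least one
    new exception was collected during the calculation cycle."""
    return new_exception_count > 0

# One exception collected every minute, checked at a 1-minute frequency:
per_minute = [1, 1, 1, 1]
print(sum(rate_cycle(n) for n in per_minute))       # 4 alert instances

# The same exceptions, checked at a 2-minute frequency:
per_two_minutes = [2, 2]
print(sum(rate_cycle(n) for n in per_two_minutes))  # 2 alert instances
```

Lengthening the cycle does not suppress any exception, it only batches them into fewer alert instances, which is the trade-off described above.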

Available Metrics

For Log File Content the following metrics are collected:

Log File Content

  • The collector retrieves all metrics declared in the associated File Content Definition. Refer to 'Enable the log file collection – Part III: Define and maintain the necessary File Content Definitions'.