Log File Content
Log files are widely used by software components to provide execution details. As an operator, you want to identify any critical situation reported in such log files within your managed landscape.
Using SAP Focused Run, you can centrally monitor such error and warning log messages persisted in text files. However, a set of preconditions applies to enable this capability:
Implement the latest version of SAP Note 3399091 Corrections for EXM 3rd party Log File Monitoring.
To enable the monitoring of relevant log entries, proceed as follows.
Enable the log file collection – Part I: Make sure the log file is accessible
Enable the log file collection – Part II: Create a collection filter of type Log File Content
In the context of Exception Monitoring, you can specify only one filter value for a given filter parameter. For instance, you must define three collection filters if you would like to collect exceptions that fulfil the following criteria:
Likewise, if you would like to monitor several different types of log file content, you must define one collection filter per type of log file to be monitored.
Note: With SAP Focused Run 4.0 FP02, you can monitor different types of log files for a given system, provided that they are located on different hosts.
Enable the log file collection – Part III: Define and maintain the necessary File Content Definitions
{
"logModels": [{
"id": "<ID of 1st log content definition>",
"name": "<Name of 1st log content definition>",
"version": "<Version number placeholder for documentation purposes, like '1.0'>",
"fileSet": {
"type": "<Timestamp or Numbering>",
"format": "yyyy-MM-dd or Ascending",
"intro": "",
"trailer": ""
},
"logEntry": {
"utcDiff": "",
"lines": "1"
},
"lineRegExPatterns": [
{
"regEx": "<Regular expression describing the text portion groups in the log line. Note: Backslash characters must be doubled (\\)>",
"fields": [{
"fieldName": "Log.TimeStamp",
"position": "<Reg expr group number, where the log timestamp is positioned>",
"timestampFormats": [ "<Array of expected time format strings>" ]
},
{
"fieldName": "Log.Severity",
"position": "<Reg expr group number, where the log severity is positioned>",
"severityMapping": {
"Error": [ "<Array of Strings designating error logs, like 'SEVERE','ERR', etc.>" ],
"Warning": [ "<Array of Strings designating warning logs, like 'WARNING','WARN', etc.>" ]
}
},
{
"fieldName": "Log.Message",
"position": "<Reg expr group number, where the log message is positioned>"
},
{
"fieldName": "<Any relevant additional log context information>",
"position": "<Reg expr group number, where this additional parameter is positioned>"
},
{
"fieldName": "Location",
"position": "6"
}
]
}
]
}
]
}
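The backslash-doubling rule mentioned in the template is easy to get wrong. The following minimal Python sketch (not part of the SAP delivery; the regular expression and the log line are invented for illustration) shows how the doubled backslashes in the JSON decode back into the regular expression that is actually applied to each log line:

import json
import re

# Minimal sketch: the regex and the log line below are invented examples.
# JSON uses the backslash as its own escape character, so every backslash
# of the regular expression must be doubled inside the JSON definition.
definition_json = r'{"regEx": "^\\[(\\d+)\\] (\\w+): (.*)$"}'

# After JSON parsing, the doubled backslashes collapse back to single ones.
reg_ex = json.loads(definition_json)["regEx"]
print(reg_ex)  # ^\[(\d+)\] (\w+): (.*)$

match = re.match(reg_ex, "[1234] ERROR: connection refused")
print(match.groups())  # ('1234', 'ERROR', 'connection refused')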
Here is a sample, as delivered by SAP:
{
"logModels": [
{
"id": "EXM_SAP_WebDispatcher_FileSet",
"name": "SAP WEBDISPATCHER LOG",
"version": "1.2",
"fileSet": {
"type": "Numbering",
"format": "Ascending",
"intro": "",
"trailer": ""
},
"logEntry": {
"utcDiff": "",
"lines": "1"
},
"lineRegExPatterns": [{
"regEx": "^\\[([^\\]]*)\\] (.*) (.*) - \".*(GET|POST|HEAD|PUT|UPDATE) (.*) HTTP\\/[\\d]\\.[\\d]\" ([\\d]{3}) ([\\d]+) ([\\d]+) epp\\[([^\\]]*)\\] ref\\[(.*(siteId=(.*?))?)\\]",
"fields": [{
"fieldName": "Log.Timestamp",
"position": "1",
"timestampFormats": [{
"format": "dd/MMM/yyyy:hh:mm:ss Z",
"lang": "en",
"country": "US"
}
]
}, {
"fieldName": "Location",
"position": "2"
}, {
"fieldName": "User",
"position": "3"
}, {
"fieldName": "Method",
"position": "4"
}, {
"fieldName": "URL",
"position": "5"
}, {
"fieldName": "Log.Severity",
"position": "6",
"severityMapping": {
"Error": ["500", "501", "502", "503"],
"Warning": ["401", "402", "403", "404", "405"]
}
}, {
"fieldName": "RespSize",
"position": "7"
}, {
"fieldName": "RespTime",
"position": "8"
}, {
"fieldName": "Log.Message",
"position": "12"
}
]
}
]
},
[...]
]
}
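To illustrate how the position values map to regular-expression groups, here is a small Python sketch (not part of the SAP delivery) that applies the regular expression from the sample definition above to a hypothetical Web Dispatcher access-log line; the log line is invented for demonstration purposes:

import json
import re

# Regular expression copied from the SAP-delivered sample definition above,
# embedded in a minimal JSON fragment exactly as it appears in the file.
sample_json = r'{"regEx": "^\\[([^\\]]*)\\] (.*) (.*) - \".*(GET|POST|HEAD|PUT|UPDATE) (.*) HTTP\\/[\\d]\\.[\\d]\" ([\\d]{3}) ([\\d]+) ([\\d]+) epp\\[([^\\]]*)\\] ref\\[(.*(siteId=(.*?))?)\\]"}'
reg_ex = json.loads(sample_json)["regEx"]

# Hypothetical access-log line, invented for this illustration.
line = ('[09/Mar/2024:01:45:12 +0100] host42 jdoe - '
        '"GET /sap/bc/ping HTTP/1.1" 500 1534 87 epp[-] ref[-]')

# Field names and group positions as declared in the sample definition.
fields = {"Log.Timestamp": 1, "Location": 2, "User": 3, "Method": 4,
          "URL": 5, "Log.Severity": 6, "RespSize": 7, "RespTime": 8,
          "Log.Message": 12}

match = re.match(reg_ex, line)
if match:
    for name, pos in fields.items():
        # Position 12 (the optional siteId capture) is empty for this line.
        print(f"{name:15} -> {match.group(pos) or '<empty>'}")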
4 Click on the Upload JSON link to upload a definition.
CAUTION: Any custom definition you have already uploaded for the selected Customer Network will be overwritten by this operation. If needed, you can download the currently used definition before proceeding with a new upload.
Enable the log file collection – Part IV: Specify the log file collection filter
a Filter Name:
Choose a meaningful name for the collection filter, typically designating the type of log file.
b Host name(s):
Choose the host(s) where that type of log file is present (in the context of the selected system).
c File Name Pattern:
Enter the name of the log file, including the file extension. You can use wildcards to collect a set of files (a short wildcard illustration follows this procedure).
Example: catalina.*.log
d Folder Path:
Specify the absolute path to the folder where the log file(s) are located.
Example:
For Windows hosts: F:\SAP BusinessObjects\tomcat\logs\
For Unix hosts: /usr/sap/BO4/ccw/sap_bobj/tomcat/logs/
e File Content Definition:
Select the log file content definition corresponding to the log format. The drop-down list contains a few SAP predefined sample definitions. However, as already explained, you can upload additional custom definitions using the Customize link.
Remark: You will notice that the list of Filter Parameters displayed in the table at the bottom depends on the File Content Definition you selected.
f Filter Parameters:
Specify the filter values that shall apply to the data collection.
Remark: The values specified in the 'Monitoring' step apply to the forthcoming data collected from the managed system. If you specify no value at all for the listed filter parameters, all forthcoming data available within the managed system is collected. Please consider the implications for resource consumption.
Note: As described in the following section, you usually define filter groups in the 'Alerting' step, to alert on subsets of the overall collected data for that category.
2 Press 'Save'
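Regarding the File Name Pattern from step c: assuming the wildcard behaves like a shell-style pattern (an assumption made for this illustration, not an SAP specification), the following Python sketch shows which hypothetical file names a pattern such as catalina.*.log would select:

from fnmatch import fnmatch

# Hypothetical file names as they might appear in a Tomcat log folder.
candidates = [
    "catalina.2024-03-09.log",
    "catalina.2024-03-10.log",
    "catalina.out",
    "localhost.2024-03-09.log",
]

pattern = "catalina.*.log"  # example File Name Pattern from step c
for name in candidates:
    print(f"{name:30} matched: {fnmatch(name, pattern)}")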
Enable the log file collection – Part V: Restart the Simple Diagnostics Agent (with SAP Focused Run 4.0 FP02)
The setup of the collection filters only ensures that the relevant data is collected. However, alerts are not created automatically. To create alerts and notifications you have to create alert filters.
In the sub-panels, you can maintain the alert filter. For most monitoring categories, the available filter fields are the same as for the 'Monitoring' configuration described above. For some metrics, additional filter fields are available.
You can check the collected data in SAP Focused Run to determine which filter values to use for alerting. Most of the fields can be found in the Collection Context of the collected data.
Please consider that at the 'Alerting' level, the filters only apply to data already collected and stored in SAP Focused Run. They do not influence the data collection itself.
In the last sub-panel, you have to activate the alert and can change other alert attributes:
For single exceptions, the threshold type is always 'Already Rated'. This means that, at each calculation cycle (determined by the calculation frequency), the number of newly collected exceptions is checked: an alert is created if this number is higher than 0. If you want to reduce the number of alerts for these metrics, you can increase the calculation frequency value to lengthen the interval between alert instance creations.
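As a purely conceptual illustration of the 'Already Rated' behaviour (this is not SAP code; the actual evaluation happens inside SAP Focused Run), the check performed at each calculation cycle can be sketched as follows:

# Conceptual sketch only: mirrors the rule described above.
def evaluate_cycle(newly_collected_exceptions: int) -> bool:
    """Return True if an alert instance should be created for this cycle."""
    # Threshold type 'Already Rated': alert whenever the number of newly
    # collected exceptions since the last calculation cycle is greater than 0.
    return newly_collected_exceptions > 0

# A longer calculation frequency means fewer cycles and therefore fewer
# possible alert instances per hour.
print(evaluate_cycle(0))  # False -> no alert for this cycle
print(evaluate_cycle(3))  # True  -> one alert instance for this cycle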
For Log File Content the following metrics are collected:
Log File Content