Streaming key performance indicators for CICS Transaction Server for z/OS monitoring

IBM® Z Common Data Provider uses SMF_110_1_KPI to collect key performance indicators for CICS® Transaction Server for z/OS® monitoring. In addition to the fields in SMF_110_1_KPI, you can use DEFINE TEMPLATE to stream more data fields.

Before you begin

For more information about the content of the SMF_110_1_KPI data stream, see SMF_110_1_KPI data stream content. For more information about the fields in SMF_110_1_KPI, see Table 1.

About this task

You can create a DEFINE TEMPLATE statement to select more fields from SMF_110_1_KPI records and customize a data stream that streams those fields. After that, create or update the policy in the Configuration Tool to include the data stream.

Procedure

  1. If you do not already have one, create a partitioned data set (PDS) that is used as the user concatenation library for the custom template definition.
    For more information about how to create the partitioned data set, see step 1.a in Creating a System Data Engine data stream definition.
  2. Copy the sample update and template definitions from the member HBOUUKPI of the SMP/E target data set hlq.SHBODEFS to the user concatenation library, and edit the definitions based on your requirements.
    The following example shows how to define an update definition and a template definition that filter more fields of SMF_110_1_KPI records, based on the sample template definition.
    SET IBM_FILE = 'SMF110xx';
    
    DEFINE UPDATE SMF_110_1_CUST
      VERSION 'CDP.210'
      FROM SMF_CICS_T
      TO &IBM_UPDATE_TARGET
      AS &IBM_FILE_FORMAT SET(ALL);
    
    DEFINE TEMPLATE SMF_110_1_CUST FOR SMF_110_1_CUST
      ORDER
      (SMFMNTME,
       SMFMNDTE,
       fld1,
       fld2,
       ......
       fldn)
      AS &IBM_FILE_FORMAT;
    
    SET
    The SET statement is needed only when the target of the data stream is a file, that is, when the variable IBM_UPDATE_TARGET is set to FILE &IBM_FILE.
    DEFINE UPDATE
    The custom update definition name must be unique among update definitions. For the language reference of the DEFINE UPDATE statement, see DEFINE UPDATE statement.
    DEFINE UPDATE SMF_110_1_CUST
    You can change the value of CUST in SMF_110_1_CUST according to your needs.
    DEFINE TEMPLATE
    To filter more fields of SMF_110_1_KPI records, add a DEFINE TEMPLATE statement for the update definition in the same data set member as that update definition. The template definition name must be the same as the update definition name so that it replaces the default template definition, which streams all fields for the update definition.

    For versions earlier than the 4Q2019 PTF, you must include the date (SMFMNDTE) and time (SMFMNTME) fields from the SMF record header of SMF_CICS_T in the template definition. These fields are required to resolve timestamps when you ingest data into your analytics platform.

    fld1, fld2, fldn
    This section defines the fields in the SMF_110_1_KPI record, separated by commas. You can select any of the fields that are listed in Fields for SMF_110_1_CUST data stream.

    For the language reference of the DEFINE TEMPLATE statement, see DEFINE TEMPLATE statement.
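    For illustration, a filled-in version of the skeleton might look like the following example. The field names TRAN and USRCPUT are hypothetical placeholders, not a recommendation; select your actual fields from those listed in Fields for SMF_110_1_CUST data stream.

    ```
    SET IBM_FILE = 'SMF110xx';

    DEFINE UPDATE SMF_110_1_CUST
      VERSION 'CDP.210'
      FROM SMF_CICS_T
      TO &IBM_UPDATE_TARGET
      AS &IBM_FILE_FORMAT SET(ALL);

    DEFINE TEMPLATE SMF_110_1_CUST FOR SMF_110_1_CUST
      ORDER
      (SMFMNTME,
       SMFMNDTE,
       TRAN,
       USRCPUT)
      AS &IBM_FILE_FORMAT;
    ```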

  3. Validate the syntax of the custom update and template definitions.
    Use the following example job to verify the member that contains the custom definitions.
    //HBOJBCOL JOB (),'DUMMY',MSGCLASS=X,MSGLEVEL=(,0),
    //         CLASS=A,NOTIFY=&SYSUID                  
    //*                                                            
    //HBOSMFCB EXEC PGM=HBOPDE,REGION=0M,PARM='SHOWINPUT=YES'      
    //STEPLIB  DD   DISP=SHR,DSN=hlq.SHBOLOAD                   
    //HBOOUT   DD   SYSOUT=*                                       
    //HBODUMP  DD   SYSOUT=*                                       
    //HBOIN    DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOCCSV)          
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOCCORY)         
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOLLSMF)
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOTCIFI)  
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBORS110)         
    //         DD   DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)        
    //         DD   *                                              
    COLLECT SMF                                                    
    WITH STATISTICS                                                
    BUFFER SIZE 1 M;                                               
    //*                                                            
    //HBOLOG   DD   DUMMY 
    
    hlq
    Change the hlq to the high-level qualifier for the IBM Z Common Data Provider SMP/E target data set.
    // DD DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)
    Specifies the data set member for the custom definitions. USERID.LOCAL.DEFS is the user concatenation library. HBOUUKPI is the member that contains the update and template definitions. Replace the values based on your configuration.
    Important: Ensure that the definitions are error-free by running the validation job before you create the custom data stream.
    Messages are in the output file that is defined by HBOOUT.
    If there is no syntax error, you see the following messages.
    HBO0201I Update SMF_110_1_CUST was successfully defined.
    HBO0500I Template SMF_110_1_CUST was successfully defined.
    

    If there are syntax errors, correct the errors according to the messages in the output file.
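    As a convenience, you can check the HBOOUT output programmatically after saving it to a text file. The following is a minimal sketch in Python; it assumes that error messages follow the common IBM convention of message IDs that end in E, which you should confirm against the messages reference for your release.

    ```python
    import re

    # Success messages that the validation job writes when the custom
    # update and template definitions compile cleanly.
    SUCCESS = ("HBO0201I", "HBO0500I")

    def summarize(hboout_text):
        """Return the success messages found and any HBOnnnnE error IDs."""
        found = [msg for msg in SUCCESS if msg in hboout_text]
        # Assumption: error message IDs use an E suffix (IBM convention).
        errors = re.findall(r"\bHBO\d{4}E\b", hboout_text)
        return found, errors

    # Illustrative HBOOUT content, matching the messages shown above.
    sample = (
        "HBO0201I Update SMF_110_1_CUST was successfully defined.\n"
        "HBO0500I Template SMF_110_1_CUST was successfully defined.\n"
    )
    found, errors = summarize(sample)
    print(found, errors)
    ```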

  4. Validate the data collection with the custom update and template definitions.

    Collect data from an SMF data set that contains SMF type 110 subtype 1 records by using a batch System Data Engine job, and validate the data by reviewing the output data set.

    Use the following example job to verify the data that is collected with the custom definitions.
    //HBOJBCOL JOB (),'DUMMY',MSGCLASS=X,MSGLEVEL=(,0),
    //         CLASS=A,NOTIFY=&SYSUID                  
    //*                                                            
    //HBOSMFCB EXEC PGM=HBOPDE,REGION=0M,PARM='ALLHDRS=YES'
    //STEPLIB  DD   DISP=SHR,DSN=hlq.SHBOLOAD                   
    //HBOOUT   DD   SYSOUT=*                                       
    //HBODUMP  DD   SYSOUT=*                                       
    //HBOIN    DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOCCSV)          
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOCCORY)        
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOLLSMF)         
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBOTCIFI)  
    //         DD   DISP=SHR,DSN=hlq.SHBODEFS(HBORS110)         
    //         DD   DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)        
    //         DD   *                                              
    COLLECT SMF                                                    
    WITH STATISTICS                                                
    BUFFER SIZE 1 M;
    /*
    //HBOLOG   DD DISP=SHR,DSN=HLQ.LOCAL.SMFLOGS                                               
    //*                                                            
    //SMF110xx DD   DSN=USERID.SMF110xx.CSV,                          
    //         DISP=(NEW,CATLG,DELETE),SPACE=(CYL,(10,10)),        
    //         DCB=(RECFM=V,LRECL=32756)
    
    hlq
    Change hlq to the high-level qualifier for the IBM Z Common Data Provider SMP/E target data set.
    // DD DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)
    Specifies the data set member for the custom definitions. USERID.LOCAL.DEFS is the user concatenation library. HBOUUKPI is the member that contains the update and template definitions. Replace the values based on your configuration. Ensure that the record definition member is included before the update definition member.
    //HBOLOG DD DSN=
    Specifies the SMF data set that contains your SMF records.
    //SMF110xx DD DSN=
    Specifies the data set that stores the output data. Ensure that this value is the same as the value of the SET IBM_FILE = statement in the corresponding update definition. The output data set is a CSV file, which you can download and open with a spreadsheet application for validation.
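    To spot-check the downloaded CSV output without a spreadsheet, a small script can verify that every record has one column per field in your template definition. The following is a sketch only; the field list and sample records are hypothetical placeholders, not real SMF data.

    ```python
    import csv
    import io

    # Fields from the custom template definition, plus any leading columns.
    # These names are hypothetical placeholders for your actual field list.
    EXPECTED_FIELDS = ["SMFMNTME", "SMFMNDTE", "fld1", "fld2"]

    def check_columns(text, expected):
        """Return the parsed rows and 1-based indexes of malformed rows."""
        rows = list(csv.reader(io.StringIO(text)))
        bad = [i for i, row in enumerate(rows, 1) if len(row) != expected]
        return rows, bad

    # Illustrative records, not taken from a real output data set.
    sample = "12:00:00:00,2019-11-01,PAYR,0.012\n12:00:01:50,2019-11-01,INQY,0.004\n"
    rows, bad = check_columns(sample, len(EXPECTED_FIELDS))
    print(f"{len(rows)} records checked, {len(bad)} malformed")
    ```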
  5. Create a custom System Data Engine data stream named SMF_110_1_CUST in the Configuration tool.
    For more information about how to create the custom System Data Engine data stream, see Creating a System Data Engine data stream definition.

    Verify that the data stream name, the custom update definition name, and the custom template definition name are the same.

    Fill in the SHBODEFS data set members field as:
    HBOLLSMF
    HBORS110
    HBOTCIFI
    HBOUUKPI
  6. Update your analytics platform so that it can process the new data stream.
    • If you are ingesting SMF_110_1_CUST data to the Elastic Stack, for each data stream, create a field name annotation configuration file and a timestamp resolution configuration file in the Logstash configuration directory.
      Field name annotation configuration file
      The file is named H_SMF_110_1_CUST.lsh. Here is an example of the file:
      # CDPz ELK Ingestion
      #
      # Field Annotation for stream zOS-SMF_110_1_CUST
      #
      
      filter {
          if [sourceType] == "zOS-SMF_110_1_CUST" {
      
            csv{ columns => [ "Correlator", "SMFMNTME", "SMFMNDTE", "fld1", "fld2", "fldn" ]
               separator => "," }
         }
      }
      
      Make sure the value of CUST in SMF_110_1_CUST is the same as the value that is specified for the update definition name.
       sourceType
       The value of sourceType must match the data source type of the data stream. The naming convention is zOS-SMF_110_1_CUST, as in the following comparison:
       if [sourceType] == "zOS-SMF_110_1_CUST"
       fld1, fld2, and fldn
       Replace fld1, fld2, and fldn with the fields, in the same order, from your custom template definition. Keep Correlator as the first column in the list.
      Timestamp resolution configuration file
      The file is named N_SMF_110_1_CUST.lsh. Here is an example of the file:
      # CDPz ELK Ingestion
      #
      # Timestamp Extraction for stream zOS-SMF_110_1_CUST
      #
      
      filter {
         if [sourceType] == "zOS-SMF_110_1_CUST" {
            mutate{ add_field => {
               "[@metadata][timestamp]" => "%{SMFMNDTE} %{SMFMNTME}"
              }}
            date{ match => [
                   "[@metadata][timestamp]", "yyyy-MM-dd HH:mm:ss:SS"
              ]}
         }
      }
      
      Make sure the value of CUST in SMF_110_1_CUST is the same as the value that is specified for the update definition name.
       sourceType
       The value of sourceType must match the data source type of the data stream. The naming convention is zOS-SMF_110_1_CUST, as in the following comparison:
       if [sourceType] == "zOS-SMF_110_1_CUST"
      Restart Logstash after you create the files for the new data stream. Refer to Logstash documentation for more information about the configuration files.
    • If you are ingesting SMF_110_1_CUST data to Splunk, define the layout of the data stream to the Splunk server by creating the props.conf file in the Splunk_Home/etc/apps/ibm_cdpz_buffer/local directory on the Splunk server. If the props.conf file exists, append the following content to the file.
      #
      # SMF_110_1_CUST
      #
      
      [zOS-SMF_110_1_CUST]
      TIMESTAMP_FIELDS = SMFMNDTE, SMFMNTME, timezone
      TIME_FORMAT= %F %H:%M:%S:%2Q %z
       FIELD_NAMES = "sysplex","system","hostname","","","sourcename","timezone","Correlator","SMFMNTME","SMFMNDTE","fld1","fld2","fldn"
      INDEXED_EXTRACTIONS = csv
      KV_MODE = none
      NO_BINARY_CHECK = true
      SHOULD_LINEMERGE = false
      category = Structured
      disabled = false
      pulldown_type = true
      
      Make sure the value of CUST in SMF_110_1_CUST is the same as the value that is specified for the update definition name.
      [zOS-SMF_110_1_CUST]
      You must specify the data source name of the data stream. The naming convention is zOS-SMF_110_1_CUST.
      FIELD_NAMES
       Replace fld1, fld2, and fldn with the fields, in the same order, from your custom template definition. If the Correlator column exists, do not remove it.
       In the Splunk user interface, you must also configure the file to data source type mapping for the new data stream. The file that the Data Receiver saves is named CDP-zOS-data_stream_name-*.cdp. For example, for the data stream SMF_110_1_CUST, the file is named CDP-zOS-SMF_110_1_CUST-*.cdp.

      Restart the Splunk server after you make the changes. Refer to Splunk documentation for more information.
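      The Logstash pattern yyyy-MM-dd HH:mm:ss:SS and the Splunk TIME_FORMAT %F %H:%M:%S:%2Q both describe the same layout: SMFMNDTE carries the date and SMFMNTME the time with hundredths of a second. The following is a minimal Python sketch of how such a combined value parses; the sample value is illustrative, not taken from a real record.

      ```python
      from datetime import datetime

      # SMFMNDTE (date) and SMFMNTME (time, with hundredths of a second),
      # combined the same way the Logstash filter builds [@metadata][timestamp].
      stamp = "2019-11-01 12:34:56:78"

      # Python's %f accepts 1 to 6 fractional digits and pads on the right,
      # so ":78" parses as 0.78 seconds.
      parsed = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S:%f")
      print(parsed.isoformat())
      ```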

  7. Create or update the policy to add the new System Data Engine data stream SMF_110_1_CUST.
    1. In the Configuration Tool primary window, create a new policy or select the policy that you want to update.
    2. Click the Add Data Stream icon in the Policy Profile Edit window.
    3. Find and select the new data stream from the list in the Select Data Stream window.
    4. Assign a subscriber for each new data stream.
    5. In the Policy Profile Edit window, click SYSTEM DATA ENGINE to ensure that values are provided for the USER Concatenation and CDP Concatenation fields, and click OK. Fill in the USER Concatenation field with the data set name of your user concatenation library.
    6. Click Save to save the policy.
    Important: Each time that the associated update definition or template definition is changed, you must edit and save the policy in the Configuration Tool so that the changes are reflected in the policy.
    For more information on how to update a policy, see Updating a policy.
  8. Restart the Data Streamer and the System Data Engine.