Telemetry ONAP Integration

In the first two parts of this series we covered 1) Juniper Telemetry and 2) a brief introduction to ONAP. In this part, let's dive into telemetry integration in ONAP.

In many deployments, customers are already running some kind of data collection/analysis infrastructure and would ideally like the device to push data into that system, rather than building yet another collector to receive data from Junos and integrate it into the rest of their tooling.

Fig 1: Typical telemetry deployment.

The NTF (Network Telemetry Framework) agent is designed to help solve this problem. The NTF agent provides the ability to push JTI data in different formats to different types of endpoints. It will be available as part of a Junos 19.x release, which can be downloaded from the public website. The NTF agent provides a plugin architecture for writing southbound (output) Python plugins that push data to different endpoints.
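
To give a feel for what such a plugin looks like, below is a minimal, hypothetical skeleton in Python. The hook names (plugin_init, plugin_process, plugin_flush) and the parameter handling are assumptions for illustration only; the trace log later in this post merely shows that the plugin goes through INIT and FLUSH phases.

# Hypothetical NTF agent output plugin skeleton (illustrative only).
# The hook names and signatures are assumptions, not the actual NTF agent API.
import json
import logging

log = logging.getLogger("py_ves")


def plugin_init(params):
    # Called once when the agent loads the plugin; 'params' would carry the
    # knobs configured under the outputs stanza (e.g. ves_domain).
    log.info("Entering INIT")
    return {"ves_domain": params.get("ves_domain", "syslog")}


def plugin_process(context, records):
    # Receive decoded JTI records from the input plugin and convert each one
    # into the target format before it is pushed southbound.
    for record in records:
        payload = {"event": {"commonEventHeader": {}, "syslogFields": {}}}
        # ... populate the VES structure from 'record' here ...
        log.info(json.dumps(payload, indent=4))


def plugin_flush(context):
    # Called when buffered data should be pushed to the remote endpoint.
    log.info("Entering FLUSH")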

Fig 2: Workflow of the NTF agent.

In the case of ECOMP/ONAP, the streamed data needs to be in the VES data format. Before we jump into how the NTF agent can help us integrate Junos telemetry into ONAP, let's have a quick look at VES itself.

VES

The Virtual Event Streaming (VES) Collector is a RESTful collector for processing JSON messages into DCAE. The collector verifies the source (when authentication is enabled) and validates the events against the VES schema before distributing them to DMaaP MR topics for downstream systems to subscribe to.
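
As a concrete illustration, a single VES event can be pushed to the collector with a plain HTTP POST. The sketch below reuses the collector address and the syslog event seen in the captures later in this post; the /eventListener/v5 path is an assumption based on the "VESversion: v5" line in the collector log.

# Minimal sketch: push one VES syslog event to the DCAE VES collector.
# The endpoint path and address are assumptions taken from the captures
# later in this post, not a definitive integration recipe.
import requests

VES_COLLECTOR = "http://10.13.82.192:8081/eventListener/v5"  # assumed endpoint

event = {
    "event": {
        "commonEventHeader": {
            "domain": "syslog",
            "eventId": "UI_COMMIT_PROGRESS",
            "eventName": "syslog",
            "priority": "Normal",
            "reportingEntityName": "vmx5",
            "sourceName": "vmx5",
            "sourceId": "65535",
            "sequence": 4,
            "startEpochMicrosec": 657858,
            "lastEpochMicrosec": 657858,
            "version": 3.0,
        },
        "syslogFields": {
            "eventSourceHost": "vmx5",
            "eventSourceType": "router",
            "syslogFieldsVersion": 3,
            "syslogMsg": "UI_COMMIT_PROGRESS: Commit operation in progress",
            "syslogTag": "NIL",
        },
    }
}

resp = requests.post(VES_COLLECTOR, json=event, timeout=5)
print(resp.status_code, resp.text)  # collector replies with an accept/reject message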

Why VES?

Telemetry data formats, semantics, and the actions expected from management systems vary widely. For fault events, vendors use SNMP, 3GPP CORBA, MTOSI, OSS/J, etc., and semantics can differ (e.g. critical severity expressed as "1" or "5"). For measurement events (KPI/KCI), vendors deliver CSV- or XML-based files with varying internal data formats. This variance results in substantial development and maintenance costs for integrating devices into management systems; typically 3-6 months of development is needed. VES streaming significantly reduces the effort to integrate telemetry into automated device management systems by promoting convergence on a common event streaming format and collection system.

VES Collector

The VES collector is part of the collection framework within the DCAE project. The collection framework provides collectors that gather information from the infrastructure; the infrastructure elements can be either physical or virtual. As part of lifecycle management and data movement, the streamed information is loaded onto particular DMaaP topics. Various ONAP components, for example the Event Processor and Policy Management, subscribe to these DMaaP topics and take the necessary actions, or process the data further to perform closed-loop automation.
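
The hand-off from the collector to DMaaP is also plain HTTP. The sketch below shows how a validated event could be published onto a Message Router topic; the Message Router address, the default port 3904, and the /events/{topic} path are assumptions here, while the topic name itself appears in the Kafka listing later in this post.

# Rough illustration of publishing a validated VES event onto a DMaaP
# Message Router topic. Host, port and path are assumptions for a default
# ONAP install; only the topic name is taken from the captures below.
import requests

MR_HOST = "http://10.13.82.183:3904"          # assumed Message Router address
TOPIC = "unauthenticated.VES_SYSLOG_OUTPUT"   # topic seen in the Kafka listing later


def publish_to_dmaap(event: dict) -> None:
    resp = requests.post(f"{MR_HOST}/events/{TOPIC}", json=event, timeout=5)
    resp.raise_for_status()                   # Message Router acks with a small JSON body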

The format of the telemetry data received by management systems today varies widely. The VES format provides a common event header which can be consumed by all the other components within ONAP. A VES agent collects the data from the VNF and delivers it to the VES collector within DCAE. The VES collector then validates the formatted data against a well-defined schema. The schema is a simple JSON file which describes all the fields carrying information from the xNF (VNF/PNF) for processing/consuming.
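
The validation step boils down to checking the incoming JSON against that schema. The sketch below shows the idea using the Python jsonschema package and the CommonEventFormat_28.4.1.json file named in the collector log later in this post; the local file path is an assumption.

# Sketch of VES schema validation: check an incoming event against the
# published CommonEventFormat schema. The schema file name comes from the
# collector log shown later; the local path is an assumption.
import json

from jsonschema import ValidationError, validate

with open("CommonEventFormat_28.4.1.json") as fh:
    ves_schema = json.load(fh)


def is_valid_ves(event: dict) -> bool:
    try:
        validate(instance=event, schema=ves_schema)
        return True
    except ValidationError as err:
        print("VES validation failed:", err.message)
        return False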

The following steps show how data moves through DCAE.

Data flow:

  1. VNFs use REST calls to push measurement data into the DCAE VES collector.
  2. The VES collector validates and filters the data received from the VNF and publishes it onto a DMaaP topic.
  3. The analytics application receives the measurement data on the specific measurement topic of DMaaP.
  4. The analytics application analyzes the data on the measurement topic and, if any alert condition defined by DCAE is met, publishes an alert message on the event topic of DMaaP.
  5. Other components of ONAP that are subscribed to the event topic receive the data accordingly (a polling sketch follows this list).
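
For step 5, subscribing to a DMaaP topic is again a simple HTTP call. The sketch below polls the syslog topic over the Message Router REST API; the host, port, path, and consumer group/ID are assumptions for illustration.

# Rough sketch of a downstream component polling a DMaaP Message Router
# topic over HTTP. Host, port, path and consumer identifiers are
# assumptions; the topic name appears in the Kafka listing later.
import requests

MR_HOST = "http://10.13.82.183:3904"          # assumed Message Router address
TOPIC = "unauthenticated.VES_SYSLOG_OUTPUT"

url = f"{MR_HOST}/events/{TOPIC}/my-consumer-group/consumer-1"
resp = requests.get(url, params={"timeout": 15000}, timeout=20)
for message in resp.json():                   # Message Router returns a JSON array of messages
    print(message)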

Junos does support streaming telemetry data as OpenConfig key/value pairs, but VES has a different data structure than vanilla key/value pairs. The device needs to convert this data into valid VES format and then send it to the DCAE collector so that it passes schema validation. This is where the NTF agent comes into play: it allows the user to change the data format as needed using a Python output plugin.
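
To make the conversion concrete, here is a hypothetical helper showing how flat key/value pairs from a JTI sensor could be folded into the VES otherFields name/value-array layout that appears in the plugin log later in this post. The field names are taken from that capture; the function itself is illustrative, not NTF agent code.

# Illustrative conversion of flat OpenConfig key/value pairs into the VES
# "otherFields" layout seen in the plugin log later in this post. This is
# a hypothetical helper, not NTF agent source code.
import time


def kv_to_ves(component, kv_pairs, sequence=0):
    now = int(time.time() * 1e6)  # microseconds since epoch
    return {
        "event": {
            "commonEventHeader": {
                "domain": "syslog",
                "eventId": str(sequence),
                "eventName": "/components/component/",
                "priority": "Normal",
                "reportingEntityName": "vmx5",
                "sourceName": "vmx5",
                "sourceId": "0",
                "sequence": sequence,
                "startEpochMicrosec": now,
                "lastEpochMicrosec": now,
                "version": 3.0,
            },
            "otherFields": {
                "otherFieldsVersion": 1.1,
                "hashOfNameValuePairArrays": [
                    {
                        "name": component,
                        "arrayOfFields": [
                            {"name": k, "value": str(v)} for k, v in kv_pairs.items()
                        ],
                    }
                ],
            },
        }
    }


# Example: kv_to_ves("FPC0:CPU0", {"mem-util-kernel-size": 2147479528})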

Fig 3: Workflow of the NTF agent as a VES output plugin.

Fig 4: OpenConfig-based output parsed into VES format using the NTF agent.

With user-written NTF agent output plugins, we can stream Juniper telemetry data directly in VES format, out of the box. The output plugin can be written in Python.

Below is the configuration on a vMX that uses the Python output plugin to stream data in VES format to the VES collector within DCAE.

Sample configuration on the Juniper device

For the syslog sensor:
[edit services]
analytics {
    agent {
        service-agents py_ves {
            inputs {
                analytics {
                    parameters {
                        sample-frequency 0;
                        sensors /junos/events;
                        generate-tags;
                    }
                }
            }

            outputs py_ves {
                parameters {
                    ves_domain syslog;
                }
            }
        }
    }
}
For the linecard sensor:

[edit services]
analytics {
    agent {
        service-agents py_ves {
            inputs {
                analytics {
                    parameters {
                        sample-frequency 0;
                        sensors /junos/system/linecard/cpu/memory/;
                        generate-tags;
                    }
                }
            }
            outputs py_ves {
                parameters {
                    ves_domain interfaces;
                }
            }
        }
    }
}

Data captures on the Juniper device and the VES collector

Output plugin info

root@vmx5# run show services analytics agent
Agent ID        Output Plugins    Input Plugins    Process ID
py_ves          1                 1                30073
root@vmx5# run show services analytics agent detail
Analytics agent:
Process ID          : 17043
Configuration File  : /var/etc/ntf-agent.conf
Log File            : /var/log/ntf_trace.log
Service Agent Count : 1
Analytics service agent(s):
  Agent Name        : py_ves
  Input Plugin/s    : grpc
  Output Plugin/s   : py_ves
  Process ID        : 30073
Output plugin VES format

root@vmx5:~ # more /var/tmp/py_ves.log
py_ves - INFO - Entering INIT
py_ves - INFO - Entering FLUSH
py_ves - INFO - {

    "event": {
        "commonEventHeader": {
            "domain": "syslog",
            "eventId": "UI_COMMIT_PROGRESS",
            "eventName": "syslog",
            "lastEpochMicrosec": 657858,
            "priority": "Normal",
            "reportingEntityName": "vmx5",
            "sequence": 4,
            "sourceId": "65535",
            "sourceName": "vmx5",
            "startEpochMicrosec": 657858,
            "version": 3.0
        },
        "syslogFields": {
            "eventSourceHost": "vmx5",
            "eventSourceType": "router",
            "syslogFacility": 23,
            "syslogFieldsVersion": 3,
            "syslogMsg": "UI_COMMIT_PROGRESS: Commit operation in progress: signaling 'agent for all the telemetry sensors', pid 6002, signal 1, status 0 with notification errors enabled",
            "syslogPri": 6,
            "syslogTag": "NIL "
        }
    }
}
requests.packages.urllib3.connectionpool - INFO - Starting new HTTP connection (1): 10.13.82.192
py_ves - INFO - POSTED TO DMAAP
py_ves - INFO - Message Accepted
py_ves - INFO - Entering INIT
py_ves - INFO - Entering FLUSH
py_ves - INFO - {

"event": {
    "otherFields": {
      "otherFieldsVersion": 1.1000000000000001,
      "hashOfNameValuePairArrays": [
        {
          "arrayOfFields": [
            {
              "name": "mem-util-kernel-size",
              "value": "2147479528"
            },
            {

              "name": "mem-util-kernel-bytes-allocated",
              "value": "173348352"
            },
              < ------ Trimmed ----------------------- >
              {
              "name": "mem-util-kernel-mdi-allocations-failed",
              "value": "0"
            }
          ],
          "name": "FPC0:CPU0"
        }
      ]
    },
    "commonEventHeader": {
      "eventId": "1",
      "reportingEntityName": "vmx5",
      "domain": "syslog",
      "lastEpochMicrosec": 4117135757,
      "sequence": 0,
      "sourceId": "0",
      "eventName": "/components/component/",
      "sourceName": "vmx5",
      "version": 3.0,
      "startEpochMicrosec": 4117135757,
      "priority": "Normal"
    }
  }
}

requests.packages.urllib3.connectionpool - INFO - Starting new HTTP connection (1): 10.13.82.192
py_ves - INFO - POSTED TO DMAAP
py_ves - INFO - Message Accepted

DCAE VES collector log

root@onap-dcae:~# docker ps  | grep ves
cefabcd07446        nexus3.onap.org:10001/onap/org.onap.dcaegen2.collectors.ves.vescollector:1.2.0      "/opt/app/docker-e..."   5 weeks ago         Up 5 weeks          8443/tcp, 0.0.0.0:8081->8080/tcp                                                                       mvp-dcaegen2-collectors-ves 
root@mvp-dcaegen2-collectors-ves:~# tail -f logs/collector.log
Input Messsage: {"event":{"commonEventHeader":{"startEpochMicrosec":1501922333000,"sourceId":"65535","sequence":6,"eventId":"1","domain":"syslog","lastEpochMicrosec":1501922333000,"eventName":"syslog","sourceName":"vmx5","priority":"Normal","version":3,"reportingEntityName":"vmx5"},"syslogFields":{"syslogTag":"Tag123","syslogFieldsVersion":3,"eventSourceType":"router","syslogMsg":"can't exec getty 'none' for port /dev/console: No such file or directory","eventSourceHost":"vmx5"}}}
[2018-12-04 21:07:46,886][INFO ][http-nio-8080-exec-9][org.onap.dcae.restapi.endpoints.EventReceipt]VESversion: v5 Schema File:./etc/CommonEventFormat_28.4.1.json
[2018-12-04 21:07:46,905][INFO ][http-nio-8080-exec-9][org.onap.dcae.restapi.endpoints.EventReceipt]Validation successful
[2018-12-04 21:07:46,905][INFO ][http-nio-8080-exec-9][org.onap.dcae.restapi.endpoints.EventReceipt]MessageAccepted and k200_ok to be sent

DMaaP Logs:

root@onap-message-router:~# docker exec -it f1cb81f888a9 bash
bash-4.4# cd /opt/kafka/bin
bash-4.4# ./kafka-topics.sh --list --zookeeper 10.13.82.183:2181
AAI-EVENT
telegraf
unauthenticated.SEC_EVENT_OUTPUT
unauthenticated.SEC_FAULT_OUTPUT
unauthenticated.SEC_MEASUREMENT_OUTPUT
unauthenticated.SEC_SYSLOG_OUTPUT
unauthenticated.TCA_EVENT_OUTPUT
unauthenticated.VES_MEASUREMENT_OUTPUT
unauthenticated.VES_SYSLOG_OUTPUT <
unauthenticated.VES_TELEMETRY_OUTPUT <


bash-4.4# ./kafka-console-consumer.sh --bootstrap-server 10.13.82.183:9092 --topic unauthenticated.VES_SYSLOG_OUTPUT
{
    "event": {    <<<<<<<<<<<<<<<<<<<<<<<<<<<<<< Sensor : /junos/event
        "commonEventHeader": {
            "domain": "syslog",
            "eventId": "UI_COMMIT_PROGRESS",
            "eventName": "syslog",
            "lastEpochMicrosec": 657858,
            "priority": "Normal",
            "reportingEntityName": "vmx5",
            "sequence": 4,
            "sourceId": "65535",
            "sourceName": "vmx5",
            "startEpochMicrosec": 657858,
            "version": 3.0
        },
        "syslogFields": {
            "eventSourceHost": "vmx5",
            "eventSourceType": "router",
            "syslogFacility": 23,
            "syslogFieldsVersion": 3,
            "syslogMsg": "UI_COMMIT_PROGRESS: Commit operation in progress: signaling 'agent for all the telemetry sensors', pid 6002, signal 1, status 0 with notification errors enabled",
            "syslogPri": 6,
            "syslogTag": "NIL "
        }
    }
}

"event": {  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<< Sensor : /interface
    "otherFields": {
      "otherFieldsVersion": 1.1000000000000001,
      "hashOfNameValuePairArrays": [
        {
          "arrayOfFields": [
            {
              "name": "mem-util-kernel-size",
              "value": "2147479528"
            },
            {
              "name": "mem-util-kernel-bytes-allocated",
              "value": "173348352"
            },
              < ------ Trimmed ----------------------- >
              {
              "name": "mem-util-kernel-mdi-allocations-failed",
              "value": "0"
            }
          ],
          "name": "FPC0:CPU0"
        }
      ]
    },
    "commonEventHeader": {
      "eventId": "1",
      "reportingEntityName": "vmx5",
      "domain": "syslog",
      "lastEpochMicrosec": 4117135757,
      "sequence": 0,
      "sourceId": "0",
      "eventName": "/components/component/",
      "sourceName": "vmx5",
      "version": 3.0,
      "startEpochMicrosec": 4117135757,
      "priority": "Normal"
    }
  }
}
