To start the data flow for VMware Telco Cloud Automation Pipeline reports, you must configure VMware Telco Cloud Service Assurance with one of the following Kafka sources: Kafka brokers configured in Airflow, Edge Kafka, or an external Kafka.

For information about how to view the VMware Telco Cloud Automation reports, see the View Pipeline Reports topic in the VMware Telco Cloud Service Assurance User Guide.

Prerequisites

Ensure that you configure one of the following for ingesting data into VMware Telco Cloud Service Assurance: Kafka brokers in Airflow, Edge Kafka, or an external Kafka.
  • If you are using Kafka brokers in Airflow for ingesting data into VMware Telco Cloud Service Assurance, perform the following steps:
    1. For Kafka brokers, navigate to Airflow > Variables and configure the kafka_brokers variable.
      Note: Airflow is a third-party platform.
      The following is an example of the kafka_brokers configuration in Airflow:
      [{
          "bootstrap_servers": ["kafka-edge:32092 "],
          "topic": "tca-stages",
          "security_protocol": "SASL_SSL",
          "additional_properties" : {
              "sasl_mechanism": "SCRAM-SHA-512",
              "sasl_plain_username": "kafka-scram-sha-512-client-credentials",
              "sasl_plain_password" : "RDFYek9vSTJCNGRYCg=="
          },
          "password": "Y2EucGFzc3dvcmQK",
          "ssl_cafile": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUZMVENDQXhXZ0F3SUJBZ0lVVlcrT2lEYVlUM3QyeWh6T3lidHJEelFrcXB3d0RRWUpLb1pJaHZjTkFRRU4KQlFBd0xURVRNQkVHQTFVRUNnd0thVzh1YzNSeWFXMTZhVEVXTUJRR0ExVUVBd3dOWTJ4MWMzUmxjaTFqWVNCMgpNREFlRncweU1qQTVNVE14TWpJMU5UVmFGdzB5TXpBNU1UTXhNakkxTlRWYU1DMHhFekFSQmdOVkJBb01DbWx2CkxuTjBjbWx0ZW1reEZqQVVCZ05WQkFNTURXTnNkWE4wWlhJdFkyRWdkakF3Z2dJaU1BMEdDU3FHU0liM0RRRUIKQVFVQUE0SUNEd0F3Z2dJS0FvSUNBUUMyR2l6cTl0aEs2aHhqaS9PeDRlNHprVVdDK3hWQzVNTW5ybGlzSWUrWApyMXFVWllnNTJxYWxJc0NaeGlER1FiM3BMb3R4bTJGNEhOQm10NlJuVWQ3QkJwaHFkcys3MjFSanpRZ1BZbmYyCkVtU0doOHMrai8wWlMzMmQ0ZUtKKzMwMVlSa0ljSkNQa3dTOWN5SkNVMU95OWoyTG9FZ21PNnlSVmplVm9YaFEKYjJ3ZmpqY2d1bTFkYmNYanNSY1R1cHRxclk4MlJVbWVoL0pPbkh1d016TUFFUElBYU9WZEpvVytrRW5oTFF0Zwp3cEUvMHltNncxelhWZm5iSjlXZUNkcnUwaEdJZ2ZkaWZ1MUdJTnM4UDVCcmU2TGZhcjRvZmlYMXJLNENHMHZWCkxKTTJoS29DYm1zRGhpNXc3azgxZ1V4b3hicDBKMTZWQWs5UWtCdnZwVG5zZnBhTDh5ZWMwMTBRS1ZtS0VUQWgKMHFVZFhxRy9XaUE5bVZlaU1lY1owSlp1RTViY2hsZnVrdTFoT0dBRlVvdVVDaDJuV1lYcHg5azdueW01VlN5ZQpEb3puSU8rSkpLMWREVlZwbUVJaUJSbzRrZTBySVhBZ0ZMc3F1U2tLNStlWEZBUkJLVEdISUlEWWYzeFZKdVJ5CkVpWkQ4NGdrK0RpSC9kTThGWkFrTXpjRGdCaUdKTjA0QysralZNYXZqd0pNRzlVV2FhaGgzV3UraHBlY2g3VnMKT0VtaXRHRGxTWUszWHdMNktYSnUrRjMwZkNyYlBGc2xIaTFmTjR5OVRGWWo0UmFOSmREUG0rQmRtMDEvR0taUQpjakhNbWU0VzhTa3J4TlhxLzhwWTJhdVE4cEVYWmlWdlNPRHhDRmMwbEQzVisxUXo0WVVrZXhQanBuNTVUQ1RtCjd3SURBUUFCbzBVd1F6QWRCZ05WSFE0RUZnUVVsR0traFpZTTNUN0FyQmM1SGRLT2w1SWJlSDh3RWdZRFZSMFQKQVFIL0JBZ3dCZ0VCL3dJQkFEQU9CZ05WSFE4QkFmOEVCQU1DQVFZd0RRWUpLb1pJaHZjTkFRRU5CUUFEZ2dJQgpBSG9MM2I5V1JWZUc5dElPQzROcCtZanVWclBiaEtoaTY0UU8vVEw0OVdJVjNRYkxsV0tlRkhacUtQNkJKV29qCi9EYkFtUVNaeWxmbkNzejZubll0RDNJcnFjN2w5WENGNjAwZEZaRCtGNkp1dnpCdlFxR3VJYlAza0czOHlDdU4KQWdyWHBiaXlRQXhRMzhXSnRFSlc4UHFzVzZtdnFSSEY2YUt4VXg4eDF4dWZKSHY4aWcxcG9PT1VqNWZ1Y044QwpQUWY1NGEzWm5YajA2K0QxM1Q2dnh6dzRlMVQyVllYWEtCSEFDL0hIZ095akZpbUhJeDg3eU9Dczduc2Q3c0NwCnU4RE9aT21URWRlUEkxcW95R2w5VmNBdjdVdERjdHI2dmNmdHFHZ2dMNUtEaTVFd09odDdjL1VuNEd0OFQwQTUKNDRWejJvdnZvcDFkU1JuNWplVHFocmxxUTRCVENrb0tSUFIxeEduMmFid1oxOWJFMEVzOXhjMkhWVEk4cDBDYwpJT085SW1mcVptVlBqQjc3cmZSZVpNWmRtTmpqOVVmeHBJMGZvcXZKN21kTTFYNzR1S2tCbTVkbG1sdUFRRE5rCk9BUFdhMnVvVXJEOGhkSUgrT0MrTzFueE02SW9Odm85aU5sdDFUaGl1RC9Mb2EvMUw0YWljQk9pK2dKbjk0UWEKclpQVlo5d2RxNjYvT0VoNlpyZmtadU1iMmVaRUZmZDJzN1dPYW54V0s1YXlNTTZKbmJLUUhjQ25iUWJZUG1lVQpNenF5c090SWxqeWQ1VnVQbVlzLyt1RUd5S1V6U2RIN1NQYTloeXFGWWJKUmJTY09yVFdhNTlTazlCZ1dwREtqClJ0ZkdHdzRieUZsd2FNbW5kYk11L25hU0M2V005eVVsSWJFdmJzWk9hTDUxCi0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K"
      }]
      Get the passwords and the certificate by running the following commands on the VMware Telco Cloud Service Assurance deployment VM:
      ssl_cafile:
      export CLUSTER_NAME=edge
      EDGENS=kafka-edge
      kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.crt}'
      
      password:
      kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.password}' > ca.password
      
      sasl_plain_password:
      export USER_NAME=kafka-scram-sha-512-client-credentials
      export SCRAM_PASSWORD_FILE_PATH=user-scram.password
      kubectl get secret -n $EDGENS $USER_NAME -o jsonpath='{.data.password}' > $SCRAM_PASSWORD_FILE_PATH
      export SCRAM_PASSWORD=`cat $SCRAM_PASSWORD_FILE_PATH`
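      The secret values that kubectl prints with jsonpath are base64 encoded, which is the format shown in the kafka_brokers example above. As an optional sanity check, you can decode the saved values locally; the following sketch assumes the files created by the preceding commands:
      # Decode the stored CA password and SCRAM password (prints them in plain text).
      base64 --decode ca.password && echo
      base64 --decode $SCRAM_PASSWORD_FILE_PATH && echo
      # The decoded CA certificate must begin with -----BEGIN CERTIFICATE-----.
      kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.crt}' | base64 --decode | head -1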
    2. Verify that the Airflow StatefulSets are running:
      kubectl get statefulset -n airflow
      NAME                 READY   AGE
      airflow-postgresql   1/1     78d
      airflow-redis        1/1     78d
      airflow-worker       3/3     78d
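      If you prefer the command line to the Airflow UI, you can also set the variable from inside one of the Airflow pods listed above. This is a sketch rather than the documented procedure; it assumes that the JSON example is saved locally as kafka_brokers.json and that a worker pod named airflow-worker-0 exists:
      kubectl cp kafka_brokers.json airflow/airflow-worker-0:/tmp/kafka_brokers.json
      kubectl exec -n airflow airflow-worker-0 -- bash -c 'airflow variables set kafka_brokers "$(cat /tmp/kafka_brokers.json)"'
      The variable then appears under Airflow > Variables, where you can confirm the value.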
  • If you are using Edge Kafka for ingesting data into VMware Telco Cloud Service Assurance, perform the following steps from the deployment host:
    1. Install Kafka.
    2. Set the environment variables, and then retrieve the cluster certificate and credentials:
      export KUBECONFIG=<KUBECONFIG-file-location>
      export CLUSTER_NAME=edge
      EDGENS=kafka-edge
      kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.crt}' | base64 --decode > ca.crt
      kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.password}' | base64 --decode > ca.password
      export CERT_FILE_PATH=ca.crt
      export CERT_PASSWORD_FILE_PATH=ca.password
      export KEYSTORE_LOCATION=cacerts
      export PASSWORD=`cat $CERT_PASSWORD_FILE_PATH`
      export CA_CERT_ALIAS=strimzi-kafka-cert
      keytool -noprompt -importcert -alias $CA_CERT_ALIAS -file $CERT_FILE_PATH -keystore $KEYSTORE_LOCATION -keypass $PASSWORD -storepass $PASSWORD
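      # Optional sanity check (assumes the alias and keystore created above):
      # confirm that the CA certificate landed in the truststore.
      keytool -list -alias $CA_CERT_ALIAS -keystore $KEYSTORE_LOCATION -storepass $PASSWORD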
      export USER_NAME=kafka-scram-sha-512-client-credentials
      export SCRAM_PASSWORD_FILE_PATH=user-scram.password
      kubectl get secret -n $EDGENS $USER_NAME -o jsonpath='{.data.password}' | base64 --decode > $SCRAM_PASSWORD_FILE_PATH
      export SCRAM_PASSWORD=`cat $SCRAM_PASSWORD_FILE_PATH`
      <<KAFKALOCATION>>/bin/kafka-console-producer.sh --broker-list kafka-edge:32092 \
        --producer-property security.protocol=SASL_SSL \
        --producer-property sasl.mechanism=SCRAM-SHA-512 \
        --producer-property ssl.truststore.password=$PASSWORD \
        --producer-property ssl.truststore.location=$PWD/cacerts \
        --producer-property sasl.jaas.config="org.apache.kafka.common.security.scram.ScramLoginModule required username=\"$USER_NAME\" password=\"$SCRAM_PASSWORD\";" \
        --topic metrics < ${5Gdatadump}
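      To confirm that the messages reached the topic, you can read a few of them back with the console consumer. This is an optional sketch that reuses the security properties from the producer command above:
      <<KAFKALOCATION>>/bin/kafka-console-consumer.sh --bootstrap-server kafka-edge:32092 \
        --consumer-property security.protocol=SASL_SSL \
        --consumer-property sasl.mechanism=SCRAM-SHA-512 \
        --consumer-property ssl.truststore.password=$PASSWORD \
        --consumer-property ssl.truststore.location=$PWD/cacerts \
        --consumer-property sasl.jaas.config="org.apache.kafka.common.security.scram.ScramLoginModule required username=\"$USER_NAME\" password=\"$SCRAM_PASSWORD\";" \
        --topic metrics --from-beginning --max-messages 5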
  • If you are using an external Kafka for ingesting data into VMware Telco Cloud Service Assurance, perform the following steps:
    1. Install Kafka on any RHEL host.
    2. Start ZooKeeper.
      ${KafkaInstallLocation}/bin/zookeeper-server-start.sh -daemon ${KafkaInstallLocation}/config/zookeeper.properties
    3. Start the Kafka server.
      ${KafkaInstallLocation}/bin/kafka-server-start.sh -daemon ${KafkaInstallLocation}/config/server.properties
    4. Start the Kafka producer.
      ${KafkaInstallLocation}/bin/kafka-console-producer.sh --bootstrap-server ${kafkahost}:${kafkaport} --topic ${KafkaTopicname} < ${5Gdatadump}
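      Note: The console producer can create the topic implicitly only when automatic topic creation is enabled on the broker. If it is disabled, create the topic before starting the producer; for example:
      ${KafkaInstallLocation}/bin/kafka-topics.sh --create --bootstrap-server ${kafkahost}:${kafkaport} --topic ${KafkaTopicname} --partitions 1 --replication-factor 1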

Procedure

  1. Use the default tca-pipeline Kafka Mapper that is configured for the VMware Telco Cloud Automation Pipeline reports.

    The following screenshot shows the default tca-pipeline Kafka Mapper for the VMware Telco Cloud Automation Pipeline reports.

  2. Configure the Kafka Collector. For more information, see the Configuring the Kafka Collector topic.