To start the data flow for VMware Telco Cloud Automation Pipeline reports, you must configure VMware Telco Cloud Service Assurance with Edge Kafka or an external Kafka.

For information about how to view the VMware Telco Cloud Automation reports, see View Pipeline Reports through VMware Telco Cloud Automation Workflow Hub in the VMware Telco Cloud Service Assurance User Guide.

Prerequisites

Ensure that you configure Edge Kafka or an external Kafka for ingesting data into VMware Telco Cloud Service Assurance. A verification sketch for both ingestion paths follows this list.
  • If you are using Edge Kafka to ingest the data directly into VMware Telco Cloud Service Assurance, export the KUBECONFIG file from the deployment host and run the following commands:
    export KUBECONFIG=<KUBECONFIG-file-location>
    EDGENS=kafka-edge
    # Set CLUSTER_NAME to the name of your Edge Kafka cluster; it is used
    # to look up the <cluster>-cluster-ca-cert secret in the commands below.
    CLUSTER_NAME=<edge-kafka-cluster-name>
    kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.crt}' | base64 --decode > ca.crt
    kubectl get secret -n $EDGENS $CLUSTER_NAME-cluster-ca-cert -o jsonpath='{.data.ca\.password}' | base64 --decode > ca.password
    export CERT_FILE_PATH=ca.crt
    export CERT_PASSWORD_FILE_PATH=ca.password
    export KEYSTORE_LOCATION=cacerts
    export PASSWORD=`cat $CERT_PASSWORD_FILE_PATH`
    export CA_CERT_ALIAS=strimzi-kafka-cert
    keytool -noprompt -importcert -alias $CA_CERT_ALIAS -file $CERT_FILE_PATH -keystore $KEYSTORE_LOCATION -keypass $PASSWORD -storepass $PASSWORD
    export USER_NAME=kafka-scram-sha-512-client-credentials
    export SCRAM_PASSWORD_FILE_PATH=user-scram.password
    kubectl get secret -n $EDGENS $USER_NAME -o jsonpath='{.data.password}' | base64 --decode > $SCRAM_PASSWORD_FILE_PATH
    export SCRAM_PASSWORD=`cat $SCRAM_PASSWORD_FILE_PATH`
    <<KAFKALOCATION>>/bin/kafka-console-producer.sh --broker-list kafka-edge:32092 \
      --producer-property security.protocol=SASL_SSL \
      --producer-property sasl.mechanism=SCRAM-SHA-512 \
      --producer-property ssl.truststore.password=$PASSWORD \
      --producer-property ssl.truststore.location=$PWD/cacerts \
      --producer-property sasl.jaas.config="org.apache.kafka.common.security.scram.ScramLoginModule required username=\"$USER_NAME\" password=\"$SCRAM_PASSWORD\";" \
      --topic metrics < ${5Gdatadump}
  • If you are using an external Kafka to ingest data into VMware Telco Cloud Service Assurance, perform the following steps:
    1. Install Kafka on any RHEL host.
    2. Start the ZooKeeper server.
      ${KafkaInstallLocation}/bin/zookeeper-server-start.sh -daemon ${KafkaInstallLocation}/config/zookeeper.properties
    3. Start the Kafka server.
      ${KafkaInstallLocation}/bin/kafka-server-start.sh -daemon ${KafkaInstallLocation}/config/server.properties
    4. Start the Kafka producer to ingest the data dump.
      ${KafkaInstallLocation}/bin/kafka-console-producer.sh --bootstrap-server ${kafkahost}:${kafkaport} --topic ${KafkaTopicname} < ${5Gdatadump}
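
Before you proceed, you can optionally confirm that the records reached the topic. The following commands are a minimal verification sketch using the standard Kafka console tools; adapt the hosts, ports, and file locations to your environment.

  # Edge Kafka: read a few records back over SASL_SSL, reusing the
  # truststore and SCRAM credentials exported in the steps above.
  <<KAFKALOCATION>>/bin/kafka-console-consumer.sh --bootstrap-server kafka-edge:32092 \
    --consumer-property security.protocol=SASL_SSL \
    --consumer-property sasl.mechanism=SCRAM-SHA-512 \
    --consumer-property ssl.truststore.password=$PASSWORD \
    --consumer-property ssl.truststore.location=$PWD/cacerts \
    --consumer-property sasl.jaas.config="org.apache.kafka.common.security.scram.ScramLoginModule required username=\"$USER_NAME\" password=\"$SCRAM_PASSWORD\";" \
    --topic metrics --from-beginning --max-messages 5

  # External Kafka: confirm the topic exists, then read a few records back.
  ${KafkaInstallLocation}/bin/kafka-topics.sh --bootstrap-server ${kafkahost}:${kafkaport} --describe --topic ${KafkaTopicname}
  ${KafkaInstallLocation}/bin/kafka-console-consumer.sh --bootstrap-server ${kafkahost}:${kafkaport} \
    --topic ${KafkaTopicname} --from-beginning --max-messages 5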

Procedure

  1. Use the default tca-workflow-hub Kafka Mapper for the Pipeline reports through the VMware Telco Cloud Automation Workflow Hub.
    (Screenshot: the default tca-workflow-hub Kafka Mapper for the Pipeline reports in the VMware Telco Cloud Automation Workflow Hub.)
  2. Configure the Kafka Collector. For more information, see the Configuring the Kafka Collector topic. A smoke test for the end-to-end flow is sketched after these steps.
    (Screenshots: Kafka Collector configuration workflow.)
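
After the Kafka Collector is configured, you can smoke-test the end-to-end flow by checking the collector's consumer group lag on the broker. This is a sketch only: the collector's consumer group name is environment-specific, so list the groups first and substitute the real name for the placeholder.

  # List consumer groups to find the one used by the Kafka Collector.
  ${KafkaInstallLocation}/bin/kafka-consumer-groups.sh --bootstrap-server ${kafkahost}:${kafkaport} --list

  # Describe the group: LAG close to 0 on the ingestion topic indicates
  # that the collector is consuming the produced records.
  ${KafkaInstallLocation}/bin/kafka-consumer-groups.sh --bootstrap-server ${kafkahost}:${kafkaport} \
    --describe --group <collector-consumer-group>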