You can configure the Kafka Collector to collect data from various data sources.
Procedure
- To configure the Kafka Collector, navigate to Administration > Configuration > Collectors and Connectors.
- In the Collectors section, click Add.
- On the Collector Selection page, select kafka-collector.
The kafka-collector page is displayed.
You must provide the source Kafka topic from which the data comes, and the mapping that is used to process incoming records.
The following table lists the parameters used to configure the Kafka Collector.
- Name of the collector: Provide a name for the collector. Default: NA
- Type: Choose one of the following:
  - Metrics
  - Events
  Default: Metrics
- Data Centre: Select the location of the collector data. Default: NA
- Source Kafka Configuration
  - Topic: Provide the source Kafka topic from which the data comes. Default: metrics if the collector type is Metrics; events if the collector type is Events.
  - Bootstrap servers: Provide the source Kafka broker details. Default: edge-kafka-bootstrap:9093
  - Transport Layer Security (TLS): Select the Transport Layer Security (TLS) toggle button, and click Upload TLS certificate to upload the mandatory security certificate from your local machine.
  - Authentication: Select the Authentication toggle button to enable authentication. Select the authentication type from the following and provide the respective details:
    - Plain: Provide the user name and password.
    - SCRAM-SHA-512: Provide the user name and password.
    - TLS: Upload the TLS Certificate and TLS Key from your local machine.
    - GSSAPI: Provides an alternative security protocol mechanism (Kerberos) for both client and server authentication. The mandatory fields are:
      - Principal: Enter the Kerberos principal, a unique identity to which Kerberos can assign tickets.
      - Realm: Enter the realm name.
      - KDC: Enter the Key Distribution Center address.
      - Kerberos Service Name: Enter the identifier of the service instance.
      - Keytab: Upload the keytab file from your local machine.
      Note: The Kerberos service DNS must be reachable from all the VMware Telco Cloud Service Assurance nodes.
    - OAuth: Depending on the selected OAuth type, provide the following mandatory fields:
      - Client Secret: Provide the Client ID and Client Secret.
      - Refresh Token: Provide the Client ID and Refresh Token.
      - Access Token: Provide the Access Token.
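For orientation, the Source Kafka Configuration fields correspond to standard Kafka client security properties. The sketch below is illustrative only, assuming SCRAM-SHA-512 over TLS; the broker address is the documented default, while the property names follow common Kafka client conventions and all credential and file-path values are placeholders, not values from this product:

```python
# Illustrative sketch: map the collector's Source Kafka Configuration
# fields onto a Kafka-client-style property dictionary.
# All credentials and paths are placeholders.
def source_kafka_config(topic, use_tls, auth_type):
    """Build a client-style property map from the collector form fields."""
    config = {
        "bootstrap.servers": "edge-kafka-bootstrap:9093",  # Bootstrap servers field
        "topic": topic,                                    # Topic field
    }
    if use_tls:
        # TLS toggle: the uploaded certificate acts as the trusted CA.
        config["security.protocol"] = "SSL"
        config["ssl.ca.location"] = "/path/to/uploaded-cert.pem"  # placeholder
    if auth_type in ("Plain", "SCRAM-SHA-512"):
        # Plain and SCRAM-SHA-512 both take a user name and password.
        config["security.protocol"] = "SASL_SSL" if use_tls else "SASL_PLAINTEXT"
        config["sasl.mechanism"] = "PLAIN" if auth_type == "Plain" else "SCRAM-SHA-512"
        config["sasl.username"] = "collector-user"  # placeholder
        config["sasl.password"] = "collector-pass"  # placeholder
    return config

cfg = source_kafka_config("metrics", use_tls=True, auth_type="SCRAM-SHA-512")
print(cfg["security.protocol"])  # SASL_SSL
```

Note that enabling both TLS and SASL authentication resolves to a single combined protocol (SASL_SSL), which is why the TLS certificate remains mandatory when the toggle is on.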
- Avro Schema Support: Select the Avro Schema Support toggle button to enable Avro schema support. Note: Only the Confluent schema registry is supported. Default: NA
  - Schema Registry URL: Provide the registry URL of the schema. Note: Only the Confluent schema registry is supported.
  - Schema Registry Certificate: Select the Schema Registry Certificate toggle button to upload the TLS certificate for the schema registry. Note: This selection is mandatory only when the Schema Registry URL is HTTPS.
  - Client mTLS: Select the Client mTLS toggle button to upload the TLS Certificate and TLS Key for the client.
  - Authentication: Select the Authentication toggle button to enable authentication. Select the authentication type from the following and provide the respective details:
    - Plain: Provide the user name and password.
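The conditional rule in the Avro fields above (the registry certificate is required only for HTTPS URLs, client mTLS needs both a certificate and a key, and Plain authentication supplies user credentials) can be sketched as follows. This is a hypothetical illustration: the property names imitate common schema-registry client settings, and every URL, path, and credential is a placeholder:

```python
# Illustrative sketch of the schema-registry settings implied by the
# Avro Schema Support fields. All paths and credentials are placeholders.
def schema_registry_config(url, client_mtls, plain_auth):
    """Build a registry-client-style property map from the collector fields."""
    config = {"schema.registry.url": url}
    if url.startswith("https://"):
        # The Schema Registry Certificate is mandatory only for HTTPS URLs.
        config["schema.registry.ssl.ca.location"] = "/path/to/registry-ca.pem"
    if client_mtls:
        # Client mTLS needs both the certificate and its private key.
        config["schema.registry.ssl.certificate.location"] = "/path/to/client-cert.pem"
        config["schema.registry.ssl.key.location"] = "/path/to/client-key.pem"
    if plain_auth:
        # Plain authentication: user name and password.
        config["basic.auth.user.info"] = "user:password"  # placeholder
    return config

cfg = schema_registry_config("https://registry.example.com:8081",
                             client_mtls=True, plain_auth=True)
print(sorted(cfg))
```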
- Destination Kafka Configuration (available only if the collector type is Events)
  - Topic: Provide the Kafka topic to which the processed event data must be written. Default: vsa_events_raw
  - Bootstrap servers: Provide the destination Kafka broker details. Default: edge-kafka-bootstrap:9093
  - Transport Layer Security (TLS): Select the Transport Layer Security (TLS) toggle button, and click Upload TLS certificate to upload the mandatory security certificate from your local machine.
  - Authentication: Select the Authentication toggle button to enable authentication. Select the authentication type from the following and provide the respective details:
    - Plain: Provide the user name and password.
    - SCRAM-SHA-512: Provide the user name and password.
    - TLS: Upload the TLS Certificate and TLS Key from your local machine.
    - GSSAPI: Provides an alternative security protocol mechanism (Kerberos) for both client and server authentication. The mandatory fields are:
      - Principal: Enter the Kerberos principal, a unique identity to which Kerberos can assign tickets.
      - Realm: Enter the realm name.
      - KDC: Enter the Key Distribution Center address.
      - Kerberos Service Name: Enter the identifier of the service instance.
      - Keytab: Upload the keytab file from your local machine.
      Note: The Kerberos service DNS must be reachable from all the VMware Telco Cloud Service Assurance nodes.
    - OAuth: Depending on the selected OAuth type, provide the following mandatory fields:
      - Client Secret: Provide the Client ID and Client Secret.
      - Refresh Token: Provide the Client ID and Refresh Token.
      - Access Token: Provide the Access Token.
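The three OAuth types above (used by both the source and destination configurations) each require a different set of fields. The small sketch below restates that mapping as a validation table; the field names are illustrative labels for the form inputs, not API identifiers from this product:

```python
# Mandatory fields per OAuth type, as listed in the table above.
# The keys are illustrative labels for the form inputs.
OAUTH_REQUIRED_FIELDS = {
    "Client Secret": ["client_id", "client_secret"],
    "Refresh Token": ["client_id", "refresh_token"],
    "Access Token": ["access_token"],
}

def validate_oauth(oauth_type, provided):
    """Return the mandatory fields still missing for the chosen OAuth type."""
    required = OAUTH_REQUIRED_FIELDS[oauth_type]
    return [field for field in required if not provided.get(field)]

missing = validate_oauth("Refresh Token", {"client_id": "abc"})
print(missing)  # ['refresh_token']
```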
- Advanced Configuration
  - Application Id: Provide an identifier for the stream processing application. It must be unique within the Kafka cluster.
  - Auto Offset Reset: Possible values are:
    - Earliest: Automatically reset the offset to the earliest offset.
    - Latest: Automatically reset the offset to the latest offset.
    - None: Throw an exception to the consumer if no previous offset is found for the consumer's group.
    - Anything else: Throw an exception to the consumer.
    Default: Latest
  - Group ID: Provide a unique string that identifies the Connect cluster group this worker belongs to.
- Mapper
  - Kafka Mapper: Name of the Kafka Mapper to be used with this collector. A drop-down list of all configured Kafka Mappers is shown, based on the type selected earlier. Refer to the Kafka Mapper section for more information. Default: First entry of the drop-down list.
  - Mapping Definition Preview: Preview the selected mapping schema definition as configured in the Kafka Mapper. Default: NA
- Click Create Collector.
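The Auto Offset Reset values in the Advanced Configuration follow standard Kafka consumer semantics: they only matter when no committed offset exists for the consumer group. The sketch below is a simplified model of that decision, not the collector's actual implementation:

```python
# Simplified model of Kafka's auto-offset-reset decision.
# A committed offset, when present, always takes precedence.
def resolve_start_offset(policy, committed, earliest, latest):
    """Pick the first offset to read for a consumer group."""
    if committed is not None:
        return committed              # previously committed offset wins
    if policy == "Earliest":
        return earliest               # replay the topic from the beginning
    if policy == "Latest":
        return latest                 # read only new records (the default)
    if policy == "None":
        raise LookupError("no previous offset found for the consumer's group")
    raise ValueError("unknown auto offset reset policy: " + policy)

print(resolve_start_offset("Latest", committed=None, earliest=0, latest=42))  # 42
```

In practice this means a brand-new collector with the default Latest setting skips records already sitting in the source topic and processes only records produced after it starts.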