Feed distribution enables you to distribute customized subsets of the Uhana by VMware feed to your configured external consumers.
The Data Ingestion and Distribution Engine (DIDE) distributes a subset of the incoming feed that is ingested and processed by the Uhana by VMware platform to external consumers. You can use feed distribution to reduce compute and network load on critical network elements, such as eNodeBs and MMEs: those elements stream trace data to a single endpoint (DIDE), which ingests the data and distributes it to other endpoints.
You can access the Feed distribution page from the System menu to view, edit, and delete existing feed distributions, and to configure new ones.
The following topics describe this functionality.
This section describes the different data types that can be distributed by DIDE to external consumers.
This section describes how an external consumer can consume messages from a feed distribution configured in the Uhana by VMware platform. DIDE allows you to configure and consume feed distributions with either a Kafka consumer or a TCP client. The following topics describe the consumption options.
A Kafka feed distribution uses a Kafka consumer to consume a configured feed distribution. This distribution is supported for all the data types described in the Feed distribution data types section.
Use this distribution to replicate subsets of the data to different users who might need to analyze the data for a group of eNodeBs. It is also useful for feeding the data into an operator's data lake for offline storage and analysis.
Kafka distributions are secure and admins can configure the users, consumer groups, and hosts that are authenticated and authorized to consume these distributions. See Kafka feed distribution configuration details.
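A Kafka consumer for a distribution can be sketched with the kafka-python client. This is a minimal sketch, not the platform's prescribed client: the broker address is a placeholder, SASL/PLAIN over an unencrypted channel is an assumption about the deployment, and the topic name follows from the rule that a distribution's Name is also its Kafka topic.

```python
def kafka_consumer_settings(username, password, consumer_group, brokers):
    """Map the distribution's Kafka fields to kafka-python consumer settings.

    SASL/PLAIN over SASL_PLAINTEXT is an assumption; use the security
    protocol and mechanism your deployment actually exposes.
    """
    return {
        "bootstrap_servers": brokers,
        "group_id": consumer_group,  # must be an authorized consumer group
        "security_protocol": "SASL_PLAINTEXT",
        "sasl_mechanism": "PLAIN",
        "sasl_plain_username": username,
        "sasl_plain_password": password,
        "auto_offset_reset": "latest",
    }


def consume(topic, settings):
    """Consume a feed distribution; the topic name is the distribution's Name."""
    from kafka import KafkaConsumer  # third-party: pip install kafka-python

    consumer = KafkaConsumer(topic, **settings)
    for message in consumer:
        handle(message.value)  # raw bytes of one distributed record


def handle(payload: bytes) -> None:
    print(len(payload))  # placeholder: decode or store the record here
```

The consumer must connect from one of the configured Allow Hosts (if any) and use the configured username, password, and consumer group, or the broker rejects it.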
A TCP feed distribution uses a TCP client to consume TCP streams from a specified IP address and port. This distribution is supported for RAW_CTR and RAW_CTUM data types.
Use this distribution for external legacy systems that have been designed to ingest data directly using TCP connections initiated by eNodeBs and MMEs. DIDE replicates the trace data streaming functionality of eNodeBs and MMEs by initiating and maintaining TCP connections for every eNodeB's CTR data and for every MME's CTUM data.
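Because DIDE initiates and maintains the TCP connections, the external consumer listens at the IPv4 address and port configured in the distribution. A minimal stdlib sketch of such a listener, assuming placeholder host and port values and leaving record parsing as a stub:

```python
import socket


def receive_stream(listener: socket.socket, chunk_size: int = 4096):
    """Accept one connection from DIDE and yield raw bytes until it closes."""
    conn, _addr = listener.accept()
    with conn:
        while True:
            data = conn.recv(chunk_size)
            if not data:
                break
            yield data


def run(host="0.0.0.0", port=5000):  # hypothetical address and port
    """Listen at the configured endpoint and process the streamed feed."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as listener:
        listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        listener.bind((host, port))
        listener.listen()
        for chunk in receive_stream(listener):
            pass  # parse RAW_CTR or RAW_CTUM records here
```

A production listener would accept connections in a loop (DIDE maintains one connection per eNodeB's CTR stream or per MME's CTUM stream) and reassemble records from the byte stream.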
This topic describes the various components of the Feed distribution page and how to configure a feed distribution.
The default view of the Feed distribution page lists the feed distributions that are currently configured. The following actions are available in this view.
The following table specifies the user inputs required to configure a feed distribution.
| Field | Description |
|---|---|
| Name | A unique name for the distribution. For a Kafka distribution, this is also the name of the Kafka topic to which messages are distributed. |
| Feed Type | Type of distributed messages: one of RAW_CTR, RAW_CTUM, DECODED_CTR, or TAGGED_CTR. |
| Anonymize | Remove sensitive user information such as IMSI and IMEI. This is applicable only when the feed type is RAW_CTUM or TAGGED_CTR, because only these feeds contain sensitive user information. |
| Events | List of CTR events to distribute. This is applicable only when the feed type is RAW_CTR, DECODED_CTR, or TAGGED_CTR. |
| eNodeBs | List of eNodeBs for which the feed is distributed. This is applicable to all feed types. Select eNodeBs by choosing from a list, by drawing a polygon on a map, or by uploading a file containing the list of eNodeBs. |
| Output | Output type of the feed distribution: Kafka or TCP. |
Depending on the output type, additional fields are required to configure the distribution. These are specified in the following tables.
The following fields apply to a Kafka output.

| Field | Description |
|---|---|
| Authentication | Create a new Kafka user or use an existing Kafka user for this distribution. |
| Username | Create a new Kafka user name or select an existing one. |
| Password | Specify a password when creating a new Kafka user. |
| Consumer Group | Specify the name of the Kafka consumer group that is authorized to consume this feed distribution. |
| Allow Hosts | Optionally specify a list of host IP addresses from which authorized Kafka consumers can consume this feed distribution. |
The following fields apply to a TCP output.

| Field | Description |
|---|---|
| IPv4 | IPv4 address to which DIDE establishes TCP connections for streaming the configured feed. |
| Port | Port number at the specified IPv4 address on which the external consumer listens for incoming connections. |
After you save the configuration, the feed distribution starts with the updated configuration within one minute.