Data telemetry, the neural pathways of the broadband network
Intelligent, automated networks are not possible without data-centric architectures. For networks to function effectively, data needs to be easily accessible via open APIs. This brings us to an essential element: telemetry.
Data telemetry is the remote collection, processing, storage and consumption of network data. You can liken telemetry to the central nervous system (CNS), where the neurons are responsible for receiving, transforming and relaying sensory input from all over our bodies. The connections form neural pathways that generate our perception of the world and determine our behavior.
Capturing the stimuli of the body with the utmost precision and transmitting the sensory information almost instantaneously to the brain: that is exactly what good telemetry does for the network.
In a traditional (i.e., non-SDN) network, data from network nodes is collected only every 5 to 15 minutes, with most of the heavy lifting of data aggregation and metric computation taking place in the node itself. In contrast, in SDN-based networks, data is pushed from the nodes (push-based telemetry streaming) into a central data lake in the cloud. This means an SDN controller can monitor a much larger number of network devices and process many more data points – up to 20x more counters, in fact. Data can be pushed at high frequency, e.g., at 5-second intervals: not quite the speed of a real CNS, but far better than legacy systems, and sufficient to enable real-time monitoring, AI/ML insights and closed-loop automation.
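As a rough illustration, push-based streaming amounts to each node publishing counter snapshots on a timer instead of waiting to be polled. The sketch below is purely illustrative: the node name, the counter names and the in-memory queue standing in for the central data lake are all hypothetical.

```python
import json
import time
from queue import Queue

def push_samples(node_id, counters, interval_s, cycles, sink):
    """Push one timestamped snapshot of all counters per cycle into the collector sink."""
    for _ in range(cycles):
        sample = {
            "node": node_id,
            "ts": time.time(),
            "counters": dict(counters),  # snapshot of the current counter values
        }
        sink.put(json.dumps(sample))     # the node pushes; nobody polls it
        time.sleep(interval_s)

collector = Queue()  # stands in for the ingress of the central data lake
push_samples("olt-1", {"rx_bytes": 1024, "tx_bytes": 2048}, 0.01, 3, collector)
print(collector.qsize())  # 3 samples streamed
```

In a real deployment the interval would be seconds rather than milliseconds, and the sink would be a streaming pipeline rather than a local queue; the point is that the node, not the collector, drives the cadence.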
Good data governance is fundamental for automation; the SDN controller needs to understand what it is processing, so data needs to be structured and prepared to ensure its suitability. Modern network data telemetry provides this structure and precision, capturing complex data such as configurations, logs, alarms, and counters in the right format. It offers centralized storage to avoid fragmented and incomplete views of distributed data, ensuring consistency and high-quality data in both real-time and historical contexts.
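One simple way to picture that governance requirement is a single, validated record structure that every data source must conform to before ingestion. The field names and record kinds below are illustrative assumptions, not taken from any standard data model.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass(frozen=True)
class TelemetryRecord:
    """One structured telemetry record; every source must fit this shape."""
    node_id: str
    kind: str                           # "config" | "log" | "alarm" | "counter"
    timestamp: float
    payload: Dict[str, object] = field(default_factory=dict)

    def __post_init__(self):
        # Reject records the controller would not know how to process.
        allowed = {"config", "log", "alarm", "counter"}
        if self.kind not in allowed:
            raise ValueError(f"unknown record kind: {self.kind!r}")

rec = TelemetryRecord("olt-1", "counter", 1700000000.0, {"rx_bytes": 1024})
print(rec.kind)  # counter
```

Validating at the edge like this keeps the central store consistent: malformed data is rejected before it can fragment the historical view.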
Another similarity with neural pathways is the flexible nature of telemetry: humans can suppress or sharply focus on signals we receive from our body. Similarly, adaptive telemetry techniques support dynamic adjustments in case of changing application needs. This ensures efficient monitoring and analysis, with options for both steady-state scanning and intensive collection modes. Operators have flexibility to pick and choose the amount, complexity, and frequency of the data they want to work with.
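The switch between steady-state scanning and intensive collection can be sketched as a feedback rule: the latest observation decides the next collection interval. The thresholds and intervals below are illustrative assumptions, not recommended values.

```python
STEADY_INTERVAL_S = 300   # relaxed steady-state scanning
INTENSIVE_INTERVAL_S = 5  # high-frequency collection while investigating

def next_interval(metric_value, alarm_threshold):
    """Pick the next collection interval from the latest observation."""
    if metric_value >= alarm_threshold:
        return INTENSIVE_INTERVAL_S  # zoom in on the anomaly
    return STEADY_INTERVAL_S         # relax back to steady-state scanning

print(next_interval(0.2, 0.8))   # 300
print(next_interval(0.95, 0.8))  # 5
```

A production system would adjust more than the interval (which counters, how much aggregation), but the principle is the same: the telemetry focus sharpens or relaxes in response to what the network is doing.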
Standardization is also essential in telemetry: using non-proprietary data formats ensures compatibility with machine learning algorithms. Two standards in particular – Broadband Forum TR-436 and ETSI Zero-touch network and Service Management (ZSM) – are worth investigating, as both treat telemetry as an essential component of the architectures they address: automated intelligent management, and analytics and machine learning scenarios.
A modern network data telemetry system helps operators broaden the network behavior that can be captured by:
- Handling larger volumes of data.
- Increasing the data sampling frequency.
- Supporting adaptive data telemetry.
- Enabling data input for AI/ML models.
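To make the last point concrete, a stream of high-frequency counters is exactly the kind of input a simple statistical model can work on. The sketch below flags samples that deviate strongly from the rest of a series using a z-score; the series values and the threshold are illustrative assumptions.

```python
import statistics

def zscore_anomalies(series, threshold=2.0):
    """Return indices of samples more than `threshold` std devs from the mean."""
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # a flat series has no outliers
    return [i for i, v in enumerate(series) if abs(v - mean) / stdev > threshold]

# A counter series with one spike, e.g. rx errors per interval.
print(zscore_anomalies([10, 11, 9, 10, 12, 10, 95]))  # [6]
```

Real AI/ML pipelines go far beyond a z-score, but they depend on the same prerequisite the list above describes: clean, frequent, structured data to learn from.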
And what it unlocks is a wealth of new use cases across network monitoring, assurance, engineering, and planning, through to customer care, and even marketing opportunities.
You can check out real-world examples in our white paper, Modern broadband network telemetry.