“The explosion of unstructured machine-generated data from logs, sensors, networks and devices over the past few years has led to an exponential increase in data volume. Furthermore, the increase has been happening in parallel with a thirst for real-time Big Data applications. Enterprises simply want to extract greater value from their real-time Big Data asset.
However, applications based on the traditional store-first, process-second data management architectures are unable to scale for real-time Big Data applications. Even Hadoop-based systems are unable to offer the combination of latency and throughput requirements for real-time applications in industries such as telecoms, Internet of Things and cybersecurity.”
Big data undoubtedly has many applications in the field of OSS/BSS. Unfortunately, many of the most important use cases that require the processing of large amounts of data are not well suited to traditional big data architectures.
For CSPs, it’s the real-time processing of events / records that holds much of the potential, in examples such as:
- Real-time call rating
- Quality of service
- Fraud detection
- Event correlation (alarms, performance, security, telemetry, etc)
- Resource / capacity allocation for elastic, virtualised environments
- Real-time marketing (e.g. location systems, special offer processing, etc)
- Network performance and traffic engineering
These types of analyses can’t afford to be performed after the event, so they need stream processing that evaluates data as it arrives rather than time-insensitive, store-first big data processing.
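To make the distinction concrete, here’s a minimal, illustrative sketch (not any particular vendor’s engine) of one of the use cases above: fraud detection applied to a stream of call events as they arrive, rather than after the fact. The window length, threshold and event format are all assumptions chosen for the example.

```python
from collections import defaultdict, deque

# Illustrative sketch only: flag subscribers whose call rate exceeds a
# threshold within a sliding time window, evaluated per event as it
# arrives (stream processing) rather than store-first, process-second.

WINDOW_SECONDS = 60          # sliding window length (assumed value)
MAX_CALLS_IN_WINDOW = 5      # alert threshold (assumed value)

def detect_fraud(events):
    """Yield (timestamp, subscriber_id) alerts as events stream in.

    Each event is a (timestamp, subscriber_id) tuple, assumed to
    arrive in timestamp order.
    """
    recent = defaultdict(deque)  # subscriber_id -> timestamps in window
    for ts, sub in events:
        window = recent[sub]
        window.append(ts)
        # Evict timestamps that have fallen out of the sliding window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) > MAX_CALLS_IN_WINDOW:
            yield (ts, sub)

# Usage: a sixth call from one subscriber inside a minute raises an alert.
events = [(t, "sub-A") for t in range(0, 60, 10)] + [(59, "sub-A")]
alerts = list(detect_fraud(events))  # alerts at ts=50 and ts=59
```

The key property is that the decision is made inline, per event, with only a bounded window of state held in memory; a production engine adds distribution, fault tolerance and out-of-order handling on top of this same pattern.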
The link above provides examples of one particular vendor’s stream processing engine and the mechanics of achieving real-time performance.
Note: I’ve had no involvement with this vendor or its products, so I am not promoting or recommending them in any way.