When people talk about the value of OSS data, the discussion invariably turns to privacy and some of the unconscionable things already done with our personal data. Off the top of my head, I can’t remember any telcos blatantly misusing the highly privileged information they have access to.
This is possibly a double-edged sword. It hints at the trust that many telco brands have earned with our personal data over decades. But it also suggests that telcos haven’t looked to monetise the data they have access to as aggressively as some other organisations have.
Telcos have generally monetised through subscriptions to services rather than by offering free services that are (possibly) surreptitiously funded by advertisers et al. Vastly different business models. Telcos also seem to have been more cautious and stringent about de-personalising any data they collect and use (although maybe that’s just me being on the outside looking in on many of these organisations).
A recent discussion with an expert from an Internet of Things (IoT) dashboard / visualisation provider gave me an insight into the possibility for telcos to use data that is not just de-personalised, but was never even personalised to begin with.
His company connects to all sorts of sensors for smart-city projects: sensors that gather data on temperature, humidity, noise, air quality, parking bay occupancy, lighting, people / vehicle movement, the number of mobile devices connecting to access points and much more. He told me he can tell when storms are brewing in the cities they monitor because he can see a degradation in air quality long before anything appears on weather radars.
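To make that idea a little more concrete, here’s a minimal sketch (with field names, sample rates and thresholds that are purely my own illustrative assumptions, not the provider’s actual system) of how a sustained degradation in air-quality readings might be flagged well before a storm shows up elsewhere:

```python
from statistics import mean

def degradation_alert(readings, baseline_window=288, recent_window=12, threshold=1.5):
    """Flag a sustained worsening trend: compare a short recent window of
    pollutant readings (e.g. PM2.5 in ug/m3, 5-minute samples) against a
    longer baseline. Windows and threshold are illustrative assumptions."""
    if len(readings) < baseline_window + recent_window:
        return False
    baseline = mean(readings[-(baseline_window + recent_window):-recent_window])
    recent = mean(readings[-recent_window:])
    return recent > baseline * threshold

# Example: a day of stable readings followed by a sharp, sustained rise.
samples = [10.0] * 288 + [22.0] * 12
print(degradation_alert(samples))  # True
```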
Telcos already have towers in close proximity to large swathes of the world’s population. Having these types of sensors (and more) mounted on every tower could provide additional sources from which to unlock really valuable insights and streaming decision support. Combine that with the wealth of knowledge available about our networks: the number of people connected, the number of services active, the volume of data being consumed, the geo-location of the crowds, etc., none of which carries individual personal identifiers.
Imagine if the GPS in our cars not only routed us around traffic snarls, but also around areas where air quality is poor or noise levels are dangerously high. Or perhaps it guided us to the optimum trade-off between an available car parking space and the distance to walk to a sporting event we’re attending.
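As a rough illustration only (the route fields, weights and numbers below are my own assumptions, not any navigation vendor’s API), a route scorer that blends travel time with air-quality and noise exposure might look something like this:

```python
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    travel_minutes: float
    avg_pm25: float      # mean PM2.5 (ug/m3) along the route
    avg_noise_db: float  # mean noise level (dB) along the route

def route_cost(route, w_time=1.0, w_air=0.2, w_noise=0.1):
    """Lower is better: a weighted blend of time, pollution and noise exposure."""
    return (w_time * route.travel_minutes
            + w_air * route.avg_pm25
            + w_noise * route.avg_noise_db)

routes = [
    Route("motorway", travel_minutes=18, avg_pm25=35, avg_noise_db=75),
    Route("back streets", travel_minutes=22, avg_pm25=12, avg_noise_db=55),
]
best = min(routes, key=route_cost)
print(best.name)  # "back streets" with these illustrative weights
```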
But it’s not only the patterns we can imagine that are exciting. The thing I love even more about having access to these diverse streams of data is the potential to unlock insights for civilisation, enterprise and more that we could never have imagined. Early predictions of storms and who knows what else?
Watch this space too, as we’re about to start on some experiments using a new data tool that can visualise:
- Tens/hundreds of millions of data points
- Entities (e.g. devices, vehicles, etc.) mapped across telemetry, space and time (temporal-spatial)
- Interactive layers that can be readily overlaid, turned on/off and interacted with to highlight or de-emphasise certain data trends
I’m really excited about the type of experiments we’re hoping to do with it.
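To give a flavour of what I mean by temporal-spatial entities and toggleable layers, here’s a minimal, purely illustrative sketch (all class, layer and field names are my own assumptions, not the actual tool we’ll be experimenting with):

```python
from dataclasses import dataclass

@dataclass
class Observation:
    entity_id: str    # e.g. an anonymised device or vehicle identifier
    timestamp: float  # epoch seconds
    lat: float
    lon: float
    telemetry: dict   # e.g. {"pm25": 18.0} or {"speed_kmh": 42.0}

class LayeredView:
    """Holds observations per named layer and yields only the visible ones."""
    def __init__(self):
        self.layers = {}   # layer name -> list[Observation]
        self.visible = set()

    def add(self, layer, obs):
        self.layers.setdefault(layer, []).append(obs)
        self.visible.add(layer)

    def toggle(self, layer, on):
        (self.visible.add if on else self.visible.discard)(layer)

    def points(self):
        for layer in self.visible:
            yield from self.layers.get(layer, [])

view = LayeredView()
view.add("air_quality", Observation("sensor-17", 1700000000, -33.86, 151.21, {"pm25": 18.0}))
view.add("vehicles", Observation("veh-042", 1700000005, -33.87, 151.20, {"speed_kmh": 42.0}))
view.toggle("vehicles", on=False)   # de-emphasise a layer
print(len(list(view.points())))     # 1 (only the air-quality layer remains visible)
```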