The flow of data is happening, thanks to IoT, but the challenge is making sense of it all!
Sensors are cheap and getting cheaper, and they are turning up everywhere: inside and on machines, tools, vehicles, household goods, buildings, plants and animals, including human bodies. Sensors that measure vibration, sound, the presence of chemicals, minute electrical currents, capacitance, radiation and light, fluid movement, velocity, acceleration, height, distance, magnetism, impact forces, pressure, heat, proximity and life.
We have been constructing sensors for a long time, but something has changed: now sensors can talk. They have been fused with communications devices and can be remotely interrogated via the internet. Their data can be extracted, analyzed, reformulated, visualized and geographically located using the latest computer technologies.
Even more importantly, the data extracted from individual sensors can be aggregated and aligned with data from other types of sensors. It is becoming possible to triangulate the data – to confirm that the sensor information is reliable.
To define a location exactly, you need at least three perspectives; likewise, to build an accurate understanding of what is real, you need multiple perspectives.
Sensor networks are growing in scale and complexity, providing data from many different types of sensors at different times, in different places and spaces. Multiple perspectives are merging to provide mutual corroboration.
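The "three perspectives" idea can be made concrete with a minimal sketch of 2D trilateration. This is an illustrative example, not drawn from the article: the function name, anchor positions and distances are all hypothetical, and it assumes three fixed sensors with known positions reporting distance readings to the same object.

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Estimate a 2D position from three anchors and their measured distances.

    Subtracting pairs of circle equations cancels the quadratic terms,
    leaving a 2x2 linear system in (x, y).
    """
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Coefficients of the linearized system.
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2 * (x3 - x2), 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; no unique fix")
    return (c * e - b * f) / det, (a * f - c * d) / det

# Three anchors observing a point that really sits at (1, 2):
x, y = trilaterate((0, 0), math.sqrt(5), (4, 0), math.sqrt(13), (0, 4), math.sqrt(5))
```

With only one or two anchors the system is underdetermined – which is precisely why a single sensor's report cannot be corroborated on its own.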
The perfect data storm is forming: sensor data is communicated, collected, collated and analyzed using the internet, computer processing, data storage systems and sophisticated analysis techniques, all coming together in a growing typhoon of information generation.
There is an inevitability about this progression in our ability to measure and collect data from these devices. The age of evidence-based decision making is on the horizon. Imagine governments making informed decisions based on accurate, reliable collations of factual data.
Of course, having measured data and generated information does not guarantee good decisions – political imperatives are quite capable of ignoring or suppressing evidence. But such information resources can enable better decision-making processes.
Is this information-rich, evidence-based future realistic? What challenges does this new 'internet of things' data-rich environment face? Can we really expect massive changes in the way we think and work? Is a 'smart city', 'smart home', 'smart grid', 'intelligent transportation' environment possible?
Well, the sensors are there! The communications systems are available and the flow of data is happening, but the challenge is making sense of it all. Too much data can be as problematic as too little: with too much, the analysis, processing and understanding of the data can become a serious barrier.
The search is on for the meaning in the data. It is no easy task. Constructing the algorithm that will automatically bring together streams of data from multiple sensors and extract meaning is a real challenge.
However, defining the problem is the key. If we know what we want solved and why, we can be sure the algorithm is on its way!
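A toy sketch of what "bringing together streams of data from multiple sensors" can look like in practice – merging time-stamped readings into one ordered stream, then checking whether co-located sensors corroborate each other. The sensor names, readings and tolerance are invented for illustration.

```python
import heapq

def merge_streams(*streams):
    """Merge time-ordered (timestamp, sensor_id, value) streams into one stream.

    Assumes each input stream is already sorted by timestamp.
    """
    return heapq.merge(*streams)

def corroborated(readings, tolerance):
    """Check a batch of readings taken at (roughly) the same instant.

    Returns (mean, agreed): 'agreed' is True when every sensor falls within
    `tolerance` of the mean, i.e. the sensors mutually corroborate.
    """
    values = [v for _, _, v in readings]
    mean = sum(values) / len(values)
    return mean, all(abs(v - mean) <= tolerance for v in values)

# Two temperature sensors in the same room; sensor "b" later drifts.
temp_a = [(0, "a", 21.0), (60, "a", 21.4)]
temp_b = [(0, "b", 21.2), (60, "b", 27.9)]
merged = list(merge_streams(temp_a, temp_b))
mean_0, ok_0 = corroborated(merged[0:2], tolerance=0.5)    # readings at t=0
mean_60, ok_60 = corroborated(merged[2:4], tolerance=0.5)  # readings at t=60
```

The hard part in real deployments is everything this sketch assumes away: clock synchronization, differing sample rates, unit conversion and deciding which sensors are genuinely independent witnesses of the same phenomenon.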
There are great examples of successful applications. Medical imaging is one: sensor information is used to provide visual representations that allow comparisons between healthy and unhealthy tissues – radiography, MRI, nuclear medicine, ultrasound and thermography.
Managing public transport systems is another obvious application. We have sensors that count the wheels going into and out of a section of rail and provide appropriate alerts and warnings.
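The wheel-counting idea reduces to a simple invariant: every axle counted into a section of track must be counted out before the section is declared clear. A minimal sketch, with a hypothetical section name and train length:

```python
class BlockSection:
    """Track occupancy of a rail section by counting axles in and out."""

    def __init__(self, name):
        self.name = name
        self.axles = 0
        self.alerts = []

    def axle_in(self):
        self.axles += 1

    def axle_out(self):
        self.axles -= 1
        if self.axles < 0:
            # More axles left than entered: a miscount, so raise a warning.
            self.alerts.append(f"{self.name}: count mismatch, possible sensor fault")
            self.axles = 0

    def is_clear(self):
        """A following train may enter only when the section is clear."""
        return self.axles == 0

section = BlockSection("junction-7")
for _ in range(8):
    section.axle_in()   # an eight-axle train enters the section...
for _ in range(8):
    section.axle_out()  # ...and every axle leaves again
```

Real axle-counter systems add fail-safe behavior on top of this invariant – on any mismatch the section is treated as occupied, not clear – but the core bookkeeping is this simple.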
Managing energy consumption and optimizing sustainable energy practices, and monitoring for both workplace and public safety – all are application areas for the benefit of our citizens.
Mobile phones, medical bracelets and wearable technology are all sensors measuring human behavior.
Algorithms are being developed to track population movement, monitor the use of spaces, analyze foot traffic and predict congestion.
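One of the simplest forms such an algorithm can take is a moving-average check on foot-traffic counts against a capacity threshold. The counts, window size and capacity below are invented for illustration:

```python
from collections import deque

def congestion_alerts(counts, capacity, window=3):
    """Flag intervals where the moving average of foot traffic exceeds capacity.

    `counts` is a sequence of people counted per interval at one gate;
    returns the indices of intervals that trip the threshold.
    """
    recent = deque(maxlen=window)  # sliding window of the latest counts
    alerts = []
    for i, n in enumerate(counts):
        recent.append(n)
        if sum(recent) / len(recent) > capacity:
            alerts.append(i)
    return alerts

# Hourly counts at a single gate, against an assumed capacity of 100 per hour.
alerts = congestion_alerts([40, 60, 90, 130, 150, 80, 50], capacity=100)
```

Prediction, as opposed to detection, would extrapolate the trend forward – but even this crude smoothing shows how raw sensor counts become an actionable signal.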
The commercial sector is driving applications that use web-based sensors of human behavior to energize marketing and target sales opportunities. In education, web page clicks in learning management systems are being used to try to understand whether learning outcomes are being achieved and where conceptual learning difficulties might be overcome.
The applications are growing, and the commercial and marketing examples are getting real focus, but developing and finding the right applications is not easy. We are seeing silly one-off applications that are perhaps fashionable but not deeply valuable. Finding sensible, meaningful information in oceans of single-sensor data is hard. The powerful applications will emerge as triangulated data sources are used to generate meaningful information aimed at specific problems.
The ethical issues around personal privacy have to become better understood. The easiest way to raise awareness of this issue is to ask: would you be comfortable if every movement, action and interaction of your daily life could be observed and recorded? Would you be happy to have multiple sensors mounted on or in your body to facilitate such data collection?
The reality is that the multiplicity of sensors we currently use is making that possible right now. We live with our phones and wear exercise sensors and health monitors, connecting them with Bluetooth communications technology.
The primary protection we have is the belief that the sheer volume of data will allow our personal lives to be 'hidden' within the data lake. We rely on the 'school of fish' mentality – we are safe because we are one of many. But we still get targeted, personal-preference advertising!
The real protection is that the data lake is hard to create. The data lake is forming but it is not complete.
There is some comfort in knowing that the algorithms necessary to bring together a holistic view of a person are difficult to construct. But given a defined purpose or problem, the algorithms will emerge if the data lake is able and allowed to be constructed.
But the only real point of control is managing the construction of the data lake and determining the constraints around its use.
The challenge for our governments is to put protections in place against the misuse of such data collection and to tread the thin line between utility and the invasion of individual privacy.
This is part of our weekly OpenGov expert opinion series, where we invite public sector technology experts to share their opinions on the latest trends and happenings.