Already it's clear that one of the hot tech topics of 2012 will be "The Internet of Things" – the idea that even the most mundane objects will be hooked up to the Net and communicating over it. So far, pundits have concentrated on trivial applications like being able to check your fridge's contents from a browser, but potentially it could be much more than that if the "things" are groups of sensors whose data can be usefully aggregated.
Just what might be possible is hinted at in this fascinating post by Andrew Fisher, entitled "Towards a sensor commons":
For me the Sensor Commons is a future state whereby we have data available to us, in real time, from a multitude of sensors that are relatively similar in design and method of data acquisition and that data is freely available whether as a data set or by API to use in whatever fashion they like.
What this boils down to, then, is trends in freely-available real-time data from multiple sensors: it's about watching the world change across some geographical area of interest – even a small one – and drawing conclusions from those changes. That's clearly a huge step up from checking what's in your fridge, and potentially has major political ramifications (unlike the contents of your fridge).
My definition is not just about “lots of data from lots of sensors” – there is a subtlety to it implied by the “relatively similar in design and method of data acquisition” statement.
In order to be useful, we need to ensure we can compare data relatively faithfully across multiple sensors. This doesn’t need to be perfect, nor do they all need to be calibrated together, we simply need to ensure that they are “more or less” recording the same thing with similar levels of precision and consistency. Ultimately in a lot of instances we care about trended data rather than individual points so this isn’t a big problem so long as an individual sensor is relatively consistent and there isn’t ridiculous variation between sensors if they were put in the same conditions.
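Fisher's point – that trends matter more than individual readings, so long as each sensor is internally consistent – can be sketched in a few lines of Python. The sensor names and readings below are invented for illustration; the idea is simply that sensors need not agree on absolute values for their trends to tell a consistent story.

```python
from statistics import mean

# Hypothetical readings: per-sensor time series (e.g. periodic water-quality
# samples from a creek). Names and numbers are made up for illustration.
readings = {
    "creek-north": [7.1, 7.0, 6.8, 6.5, 6.2],
    "creek-south": [7.3, 7.2, 7.0, 6.7, 6.4],
    "creek-east":  [7.0, 6.9, 6.8, 6.6, 6.3],
}

def trend(series):
    """Slope of a least-squares line: the per-interval rate of change."""
    n = len(series)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(series)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, series))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# The sensors disagree slightly on absolute values (no joint calibration),
# but all three show the same downward trend for the area.
trends = {name: trend(series) for name, series in readings.items()}
consistent_decline = all(t < 0 for t in trends.values())
print(trends, consistent_decline)
```

Even this toy version shows why "ridiculous variation" between sensors is the real enemy: a badly drifting sensor would change the sign of its slope and break the shared picture, whereas a constant offset would not.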
The bulk of the post explores what Fisher sees as the key requirements for a sensor commons, which must:
- Gain trust
- Become dispersible
- Be highly visible
- Be entirely open
- Be upgradeable

Each of these is explored at some length, with always interesting and sometimes surprising insights and comments – I urge you to read the whole thing.
Fisher concludes as follows:
The access we are getting to cheap, reliable, malleable technologies such as Arduino [open hardware boards] and Xbee [wireless modules] coupled with ubiquitous networks whether WiFi or Cellular is creating an opportunity for us to be able to understand our local environments better. Going are the days where we needed to petition councillors to do some water testing in our creeks and waterways or measure the quality of the air that we are breathing.
The deployment of these community oriented technologies will create the Sensor Commons; providing us with data that becomes available and accessible to anyone with an interest. Policy creation and stewardship will pass back to the local communities – as it should be – who will have the data to back up their decisions and create strong actions as a result.

As that final "strong actions" hints, this is not your parents' Internet of Things.
Follow me @glynmoody on Twitter or identi.ca, and on Google+