Move over Internet of Things – Here comes Space Time Robotics

The Internet of Things (IoT) is a hot topic these days. All the cool kids are doing it. Every company is adopting it as their battle flag. While the idea has been around since 1999, it has only come into its own in the past few years. And, in 2014, it seemed that nearly every tech writer was compelled to fan the flames of interest in the subject. Going back to its beginning, I always liked the idea of IoT, but also found it vaguely unsatisfying. Maybe it’s because I was never an EE guy focused on the physical networking layer. I guess I always assumed the Internet’s existence, and focused on the Web layer. After all, I left my small town for the big city to go to college just as the World Wide Web was emerging. So, for me, it was all Web, all the time.

At virtually the same time that IoT was coined, I met Dr. Mike Botts at a Technical Committee meeting of the Open Geospatial Consortium (www.opengeospatial.org) in Atlanta. It was the first TC for both of us. I was a new sponsor of the OGC, in my role as Chief Strategic Officer of In-Q-Tel, the CIA’s venture fund. He was a principal research scientist at the University of Alabama in Huntsville, having done a stint at NASA headquarters, trying to figure out a strategy and implementation that might help achieve interoperability amongst NASA’s many sensors, predominantly space-based ones. But Mike came into the OGC process with a rich vision for sensor interoperability in which his “SensorML” could serve as an abstraction layer enabling an architecture for constructing massively distributed, heterogeneous SensorWebs comprised of space-based, airborne, mobile, in situ, and terrestrial remote sensors. Not a network (e.g., the Internet), but a web (e.g., the WWW). Sensor Web Enablement (SWE) is the term that emerged within the OGC process; SensorWeb was the shorthand. And, within the OGC, all SensorWebs were rigorously geospatially enabled and location aware. So the whole time everyone was talking more and more about the Internet of Things, all I could think about was the Location-Enabled Web of Things. Too bad “LEWoT” is a horrible acronym. Otherwise, given the tech world’s fetish for snappy acronyms, regardless of the irrelevance of the content backing them up, we could have been off to the races with something well beyond what IoT had to offer. But, alas…

Interestingly, the OGC SWE architecture continued to evolve at a healthy pace, and it became widely adopted on a global scale. I like to say that it succeeded at becoming a globally adopted architecture, but not a ubiquitous one. By 2009 or so, people were using SWE to task remote sensing satellites, task UAVs and their sensors, publish ocean buoy networks, enable webcams as location-aware services, demonstrate terrestrial remote sensors such as Doppler radars, and publish a variety of mobile and in situ sensors as interoperable OGC services. The EU even developed a program called “Sensors Anywhere” (SANY), from which a book was authored. As a member of the Board of Directors of the Open Geospatial Consortium, I took pride in this evolution. OGC had midwifed a global architecture that could deal with everything from the simplest to the most exotic sensors on the planet, as well as in near space.

Somewhere along this evolution, things got curiouser and curiouser. I remember asking Mike, “So, SWE isn’t just dealing with sensors?”, observing that applications of this architecture were doing things like tasking the movement of platforms such as UAVs. It was then that Mike blew my mind: “Well, SensorML and SWE support sensors, actuators and processes.” To a non-engineer, this is the kind of succinct statement that I remember my best professors leaving me to chew on for years. Over many beers and many conversations (yes, I am slow), I came to ask things like, “So, could you interface with a constellation of semi-autonomous robots using SWE?” To which Mike would say things like, “Well, they are simply combinations of sensors, actuators and processes. So, yes!” A constellation of geographically enabled, location-aware, semi-autonomous robots orchestrated and managed by SensorML and SWE. Now that would be crazy.
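Mike’s framing — sensors, actuators, and processes as the composable building blocks of a robot — can be sketched as a toy model. To be clear, this is purely illustrative: none of the class or field names below come from SensorML or SWE (they are my own assumptions for the sketch), and the “process” is a trivial threshold rule standing in for real orchestration logic.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Observation:
    name: str
    value: float
    position: Tuple[float, float]  # (lat, lon) — every observation is location aware
    timestamp: float               # seconds since some epoch — and time aware

@dataclass
class Sensor:
    name: str
    read: Callable[[], float]      # produces a measurement

@dataclass
class Actuator:
    name: str
    act: Callable[[float], None]   # consumes a command

@dataclass
class Robot:
    """A 'robot' as simply a located bundle of sensors, actuators, and processes."""
    position: Tuple[float, float]
    sensors: List[Sensor] = field(default_factory=list)
    actuators: List[Actuator] = field(default_factory=list)

    def observe(self, now: float) -> List[Observation]:
        # Stamp every reading with where and when it was taken.
        return [Observation(s.name, s.read(), self.position, now)
                for s in self.sensors]

    def process(self, obs: List[Observation]) -> None:
        # Trivial stand-in process: a hot temperature reading triggers the actuators.
        for o in obs:
            if o.name == "temperature" and o.value > 30.0:
                for a in self.actuators:
                    a.act(o.value)

commands: List[float] = []
bot = Robot(position=(34.73, -86.59))  # Huntsville, AL (illustrative coordinates)
bot.sensors.append(Sensor("temperature", lambda: 35.2))
bot.actuators.append(Actuator("fan", commands.append))

readings = bot.observe(now=0.0)
bot.process(readings)
print(commands)  # the fan actuator received the hot reading
```

The point of the sketch is the composition: nothing in `Robot` is robot-specific. Add more sensors, actuators, and processes, give each a position and a clock, and a “constellation” is just a list of these objects sharing the same space and time.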

This was the moment when I became unsatisfied not only with the term IoT, but also with Sensor Web Enablement, or SensorWebs. They are perfectly good terms, so don’t get me wrong. In particular, I would definitely self-identify as a “sensor freak”. And I am super thrilled with the latest evolution of Mike’s OGC SWE vision: the launch of his team’s license-free, open source software platform for geospatial (FOSS4G) sensors, called OpenSensorHub (www.opensensorhub.org). But both of these terms fail to grasp what I consider to be the key elements of the emerging future. What platforms like OpenSensorHub will enable is what I have taken to calling “Space Time Robotics”. After all, what do you get when you integrate sensors, actuators and processes? Robots. That’s what. And not just anthropomorphic robots like Twiki from Buck Rogers or “Robot” (full name: B-9, Class M-3 General Utility Non-Theorizing Environmental Control Robot) from Lost in Space (did I just date myself?). Robots will manifest in a variety of forms, including massively distributed networks of sensors, actuators and processes that shape how we experience the landscape in which we live. And when they are geographically enabled and location aware… or, more specifically, spatio-temporally aware… what will happen? Their existence, observations, assessments and actions will span time and space: the same time and space in which we as humans exist. So, sure. The Internet of Things is great. But the era of Space Time Robotics will arrive sooner than we imagine, and the term IoT will feel insufficiently descriptive when it does.
