Tuesday, January 13, 2015

Striking the New Work Balance Between Man & Machine in the IoT

A challenge faces us for the rest of this decade and the next: how will we balance work between man and machines, and how will they cooperate? Depending on whose numbers you believe, there will be tens of billions of new things to deal with in the growing machine world. This raises a number of interesting questions. How are we going to manage them, or will they manage us? If we give more work to things and machines, what happens to the people and to the people / machine interface when they share work? I don't pretend to have the answers, but I see a big role for processes, policies and constraints working together in the future. I'd like to start the discussion by identifying a number of new things for processes to interact with over the next few years. I have classified them into two areas to help me understand and differentiate the IoT: one for Things / Bots and the other for Learning Machines / Systems.

Things / Bots:

The basic difference between things and bots is the amount of planned logic that is embedded in them. This is where humans have added logic for predictable responses and have pre-programmed them to take appropriate actions based on precursor data or situations. I've listed them below in order of increasing, though still static and planned, intelligence, each with an example. For the examples I will use the blind spot detection system found on many modern vehicles; a short illustrative sketch of this escalation flow follows the list.

Sensors / Sensor Clusters: Sensors aimed at the blind spots
Pattern Detectors (planned): Determine if another vehicle is close enough to be a threat
Communication Controllers: Fire a set of warning devices and alarms
Agent Hubs: Determine if the driver's eyes are aimed at the visual warning
Robots: Vibrate the driver's seat and advise a monitoring organization
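
To make the planned flavor of intelligence concrete, here is a minimal sketch of how such an escalation pipeline could be wired up. It is illustrative only; the names, thresholds and actions are my own assumptions, not the logic of any real blind spot product.

```python
# Hypothetical escalation pipeline for a blind-spot detection system,
# mirroring the planned-logic layers listed above. All names, thresholds
# and actions are invented for illustration.

from dataclasses import dataclass

@dataclass
class SensorReading:
    distance_m: float         # distance to the nearest object in the blind spot
    closing_speed_mps: float  # how fast that gap is shrinking

# Pre-programmed (planned) thresholds set by humans ahead of time
THREAT_DISTANCE_M = 3.0
THREAT_CLOSING_MPS = 1.5

def detect_threat(reading: SensorReading) -> bool:
    """Pattern detector (planned): a fixed rule decides if another vehicle is a threat."""
    return (reading.distance_m < THREAT_DISTANCE_M
            and reading.closing_speed_mps > THREAT_CLOSING_MPS)

def fire_warnings(visual_warning_seen: bool) -> list[str]:
    """Communication controller and agent hub: escalate based on driver attention."""
    actions = ["visual_warning", "audible_alarm"]    # always fired on a threat
    if not visual_warning_seen:                      # agent hub checks the driver's gaze
        actions.append("vibrate_seat")               # robot-level physical action
        actions.append("notify_monitoring_service")  # advise a monitoring organization
    return actions

if __name__ == "__main__":
    reading = SensorReading(distance_m=2.1, closing_speed_mps=2.0)
    if detect_threat(reading):
        print(fire_warnings(visual_warning_seen=False))
```

The point of the sketch is that every branch was decided in advance by a person; the machine only replays that planned logic faster and more reliably than we could.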


Learning Machines / Systems:

The basic difference from things and bots is that the intelligence applied is determined in real time and may not have been planned (it is emerging). This is where conditions and actions can be determined as they occur and rely on the interaction of machines and people together. For the examples I will use a human stock trader who is supplied with a workbench of active features, software and intelligence; a short illustrative sketch of such a workbench loop follows the list.

Business Intelligence: Report market trends in light of portfolio and yield goals
Big Data Poly-Analytics: Run in-flight analysis of stocks in the context of yield and mix
Pattern Detectors (emerging): Sense interesting patterns that might need the trader's attention
Machine Learning: Note the reports, analytics and patterns of interest to the trader
Personal Assistants / Agents: Buy and sell assistants give advice on trade options
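
By contrast, here is a minimal sketch of how such a workbench could let "interesting" patterns and advice emerge from what the trader actually reacts to, rather than from fixed rules. Again, every name, threshold and data value below is an invented assumption for illustration.

```python
# Hypothetical trader-workbench loop mirroring the layers listed above.
# Interest is learned from the trader's reactions rather than pre-programmed.

from collections import Counter

interest_model = Counter()   # machine learning layer: learns what the trader notes

def analyze(tick: dict) -> dict:
    """Analytic layer: derive a simple in-flight feature from a price tick."""
    return {"symbol": tick["symbol"],
            "move_pct": (tick["price"] - tick["prev"]) / tick["prev"]}

def emerging_patterns(features: dict) -> list[str]:
    """Pattern detector (emerging): flag anything unusual for the trader's attention."""
    flags = []
    if abs(features["move_pct"]) > 0.02:
        flags.append(f"large_move:{features['symbol']}")
    return flags

def assistant_advice(flags: list[str]) -> list[str]:
    """Personal assistant layer: rank flags by learned trader interest and suggest a review."""
    ranked = sorted(flags, key=lambda f: -interest_model[f.split(":")[1]])
    return [f"review {f}" for f in ranked]

def trader_reacts(symbol: str) -> None:
    """When the trader acts on a suggestion, the interest model learns from it."""
    interest_model[symbol] += 1

if __name__ == "__main__":
    for tick in [{"symbol": "ABC", "price": 103.0, "prev": 100.0},
                 {"symbol": "XYZ", "price": 50.4, "prev": 50.0}]:
        advice = assistant_advice(emerging_patterns(analyze(tick)))
        print(advice)
        if advice:
            trader_reacts("ABC")   # simulated trader reaction feeds the learning layer
```

Here the machine's behavior shifts over time with the trader's attention, which is exactly the kind of emerging, shared intelligence that planned things and bots do not have.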


Net; Net:

Our challenge over the next few years is how to manage all these new capabilities (or let some of them self-manage) and how to get people comfortable interfacing with them and building trust; real trust that they can assist us and handle a portion of the management by themselves. In some cases they will be taking on work that does not appeal to us or that we are not capable of doing in real time ourselves. This is an evolutionary revolution that we will all be learning to deal with in the workplace, in our cars, in our houses, with our appliances and all together. I believe that process, policy, decision, agent and constraint technologies will have to work together in new ways to reach advanced levels of comfort, productivity and trust.





