Fortunately, researchers have worked steadily for decades to make computers and communication devices nearly vanish by embedding them in our surroundings and networking them so they can sense the environment and interact with us in ways that integrate better with our living and working situations.

Early research groups developed scenarios to demonstrate the utility of these systems, including independence support for the elderly, meeting facilitation, augmented driving, and enhanced social interaction.


History: The promise of Smart Ecosystems and Ambient Intelligence

In 2008, the Ambient Computing working group of the European technology consortium InterLink published a paper summarizing the state of the art in Ambient Computing, focusing on the social awareness of systems and privacy concerns. Included in this paper are a history of projects and use-case scenarios, and a description of the original vision of ubiquitous, pervasive networks based on many "invisible" small computing devices embedded into the environment. These "smart ecosystems" of devices were to provide an intuitive user experience, enabling new types of interaction, communication, and collaboration:

"...the degree of diffusion of smart devices...will result in smart ecosystems that might parallel other ecosystems in the not too far future. 
... a major challenge will be to orchestrate the large number of individual elements and their relationships, connect and combine them via different types of communication networks to higher level, aggregated entities and investigate their emerging behaviour."

Similarly, an MIT project called Oxygen, active around 2000-2003, focused on Intelligent Spaces, which would sense the presence of people, their tasks, and even their attention, and react appropriately; and on Mobile Devices, which presaged today's smartphones by creating connections to the physical world through a cluster of technologies: cameras, sensors, networking, accelerometers, microphones, speakers, phone, GPS, and so on.


Ambient Computing Scenarios

Scenarios from the MIT Oxygen project included a business conference in which people in different countries coordinate a meeting in London across different languages, with automated scheduling and travel planning as well as navigational and data assistance once they arrive; and a "guardian angel" that supports aging in place by providing memory and safety support to elderly people living independently.

Over the last decade, many of these scenarios have been at least partially realized, often through device configurations not precisely foreseen by the early ambient computing thinkers, such as smartphones, GPS, RFID, and Bluetooth, as the market determines the dominant technologies. These configurations now emphasize embedded devices less and portable devices more, but the end result is the same.

Smartphones bristling with data inputs from GPS, accelerometers, networking, video cameras, microphones, keyboards, and multi-touch screens have accelerated the evolution of pervasive computing by allowing mashups of all of these technologies, resulting in what could be called augmented intelligence for daily living.


Embedded and Pervasive Performance Support Systems

Embedded technologies would be a natural fit in performance support systems, particularly where use of a keyboard or mouse is inconvenient or impossible. Embedded performance support systems have been portrayed for years in movies and science fiction, although not always favorably, usually when they stop being support and start taking over. Intelligent computers portrayed in movies and TV shows like 2001: A Space Odyssey and Star Trek inhabit the entire environment and answer questions using natural language. Somehow they always seem to understand the questioner's intent perfectly (how many search engines can do that?) and gather their own data through sensors as well as direct input.

HAL 9000 in 2001: A Space Odyssey.

We are still far from that paradigm, but the closer we get, the more we can use electronic assistants to help us with tasks that currently don't seem good candidates for performance support with existing systems. 

How do you know when it is NOT appropriate to use performance support or job aids? Here are some suggested criteria:

  1. When use of a job aid would damage credibility 
  2. When speedy performance is a priority
  3. When novel and unpredictable situations are involved
  4. When smooth and fluid performance is a top priority
  5. When the employee lacks sufficient reading, listening, or reference skills
  6. When the employee is not motivated

--from Job Aids and Performance Support by Rossett & Schafer

As time goes on we may find that situations that fit these criteria in the past no longer do so as interfaces and system intelligence improve. An example of an improved interface that made performance support possible in a high-performance situation is the GPS navigation system for cars. It used to be difficult and dangerous for drivers to check directions on a map while driving, but GPS units have made it relatively safe and very easy. The support system talks to the driver, and if the driver makes a mistake, it automatically recalculates the route from the new position. So it is possible to navigate an unknown route with a GPS, while driving, without ever looking at it once it is started.
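The recovery behavior described above can be sketched in a few lines. This is an illustrative toy, not how commercial navigators actually work: positions are nodes in a small road graph (real units map-match GPS coordinates to road segments), and routing is plain Dijkstra. The point is the control flow: as long as the reported position is on the planned route, follow it; the moment it is not, silently replan from wherever the driver actually is.

```python
import heapq

def shortest_path(graph, start, goal):
    # Dijkstra's algorithm over a weighted adjacency dict:
    # graph[node] -> {neighbor: edge_cost, ...}
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None  # goal unreachable

def navigate(graph, route, position, goal):
    # On route: keep following the remainder of the plan.
    if position in route:
        return route[route.index(position):]
    # Off route: recalculate from the new position, no user action required.
    return shortest_path(graph, position, goal)

# Toy road network: A -> B -> D is the cheap route; C is a wrong turn.
graph = {"A": {"B": 1, "C": 4}, "B": {"D": 2}, "C": {"D": 1}, "D": {}}
route = shortest_path(graph, "A", "D")       # -> ["A", "B", "D"]
rerouted = navigate(graph, route, "C", "D")  # driver strayed to C -> ["C", "D"]
```

The driver never has to acknowledge the mistake; the system simply treats the current position as the new start, which is exactly what makes the interaction safe while driving.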

This is one case where the interface and intelligence of the supporting system now actually improve performance instead of interfering with it, so the situation has changed fundamentally. Similarly, an article in Health Informatics details the development of a gesture-controlled medical image display system for use in operating rooms. The images are used for reference during neurosurgery by gloved surgeons who cannot touch anything but the sterile field and instruments with their hands. Currently, nursing personnel perform this task, holding bound volumes of images up for the surgeons to review during procedures.

The Minority Report gestural interface


Tom Cruise manipulates an advanced visual display system in Minority Report.
Image © 2002 DreamWorks LLC and Twentieth Century Fox.


Video of the Minority Report interface in action


Interfaces of tomorrow

In a pervasive computing scenario, devices sense or communicate with people in order to provide contextually appropriate services. The key here is the interface between the device and the human, since the goal is to make interaction with the devices as natural and intuitive as possible. An astonishing amount of creativity and innovation is now being directed at the problem of interface design, and the results will be life- and work-transforming.

Some of the more interesting modes of human-computer interaction that have been tried include directed attention, voice, gesture tracking, motion (body motion through space, or motion of the device, as with the Wiimote or the Siftables shown below), haptic feedback-enhanced touch, and even brain-computer interfaces. Some of the most exciting of these innovative interfaces are described below.

Interactive Floor Projection Screens

These systems project images on the floor and use a camera to track body motion across the surface. You may be familiar with these displays from malls and theaters, but they could also be used to select files, control other devices, and so on.


This video shows an interactive floor installation at a Japanese art gallery, showing its use as an interface for retrieving information about works of art. For more information about building this type of interface, see the Natural User Interface Group for setup instructions and code.
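The core loop of such a floor installation is easy to sketch: compare consecutive camera frames and treat the region that changed as the visitor's position, which the projector can then respond to. The following is a minimal, illustrative frame-differencing sketch in pure Python on toy grayscale frames; a real installation would use a computer-vision library plus calibration between camera and projector coordinates, as the Natural User Interface Group's materials describe.

```python
def motion_centroid(prev_frame, frame, threshold=30):
    # Frame differencing: pixels whose grayscale value changed by more
    # than `threshold` between consecutive camera frames count as motion.
    changed = [
        (x, y)
        for y, (prev_row, row) in enumerate(zip(prev_frame, frame))
        for x, (p, c) in enumerate(zip(prev_row, row))
        if abs(c - p) > threshold
    ]
    if not changed:
        return None  # nobody moved on the floor
    # The centroid of the changed pixels approximates the visitor's
    # position; the installation maps this to a projected hotspot.
    cx = sum(x for x, _ in changed) / len(changed)
    cy = sum(y for _, y in changed) / len(changed)
    return cx, cy

# Two tiny 4x4 "camera frames": a visitor steps onto cell (x=2, y=1).
prev = [[0] * 4 for _ in range(4)]
cur = [row[:] for row in prev]
cur[1][2] = 255
print(motion_centroid(prev, cur))  # -> (2.0, 1.0)
```

Running this per frame against the live camera feed, and translating the centroid into the projector's coordinate space, is essentially all the interaction logic such a display needs.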

   


Multitouch interfaces (like the one on the iPhone) are used in walls, tables, and smartphones. Microsoft Surface, a multi-touch table that allows multi-user, fine-grained control of on-screen objects, also interacts with real objects and other devices using cameras and Wi-Fi. This is a great example of a "smart ecosystem" of intelligent devices that sense the status of the environment and lower barriers between people and their information. One demonstration shows an image transferred from a camera to the Surface table to a smartphone simply by laying the camera and smartphone on the table.


Touch Wall

An intelligent whiteboard that uses cameras to track hand motion across a vertical display, allowing zoomable, panning navigation through information in a non-linear format.


  • In February 2009, Pattie Maes of MIT demonstrated "SixthSense," a project led by Pranav Mistry that uses an inexpensive wearable camera and projector to enable remarkable interactions between the real world and the world of data.


  • Interactive Wall from Clarity Nearlife

    Interactive Window





References:
A Real-Time Gesture Interface for Hands-Free Control of Electronic Medical Records: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1839426
Garmin GPS product page: https://buy.garmin.com/shop/shop.do?pID=37631&ra=true
RedLaser iPhone app review: http://www.appvee.com/t/iphone-app-review-redlaser
http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?id=321048135&mt=8
http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?id=312720263&mt=8
http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewArtist?id=284973757
http://pervasivegaming.ning.com/profile/Sergio
Top 10 augmented reality demos: http://gamesalfresco.com/2008/03/03/top-10-augmented-reality-demos-that-will-revolutionize-video-games/
Invisible Media (MIT Media Lab): http://web.media.mit.edu/~dmerrill/invisible_media.html
10 Futuristic User Interfaces (Smashing Magazine): http://www.smashingmagazine.com/2008/08/17/10-futuristic-user-interfaces/
Future of Internet Search, mobile version: http://petitinvention.wordpress.com/2008/02/10/future-of-internet-search-mobile-version/

Being locked to a keyboard, mouse, and monitor restricts movement and limits interaction with people and the environment. In many job situations people simply do not have any hands free to interact with a computer or device.

These new interfaces and device clusters have great potential in educational contexts and for performance support.

"It seems like a paradox but it will soon become reality: The rate at which computers disappear will be matched by the rate at which computer/information technology will increasingly permeate our environments and determine our lives" (Streitz & Nixon, 2005).


     


Picture the scenario of a nurse or doctor caring for a patient with a complex set of conditions in a hospital. Performance support might consist of a mashup of the patient's medical record, data from monitors, specific handoff instructions from the last shift, information about whatever is unusual in the care plan, and contextual guidance to help prevent errors. Each of these technologies currently exists, but separately and not well integrated. Perhaps augmented-reality technologies could even be used to project displays based on imaging data right onto the patient.


The first criterion, "When use of a job aid would damage credibility," is a case in point. An amusing illustration of a credibility-destroying performance support tool can be seen in the film Idiocracy, in the McDonald's-style triage keyboard used by the receptionist in the hospital scene. Her fingers hover over the buttons as the patient describes his symptoms, while she tries to decide on the closest match to his problem.



Number 3, "When novel and unpredictable situations are involved," suggests a solution like HAL in 2001: A Space Odyssey. HAL was built to adapt to unforeseen situations, and actually did, a little too well.