Computer, Read This

In 2002, Tom Cruise waved his arms in swooping fashion to control his Pre-Crime fighting computer in Minority Report, and that was just the tip of the iceberg when it comes to consumer interest in moving beyond traditional keyboards, trackpads, and mice to control our technology.

For example, there are the Nintendo Wii and Microsoft Kinect in the gaming arena, where we control the technology with our physical motions rather than hand-held devices. And consumers seem to really like having a controller-free gaming system. The Kinect sold so quickly–at a rate of roughly 133,000 units per day during its first three months–that it earned the Guinness World Record for fastest-selling consumer device. (Mashable, 9 March 2011)

Interacting with technology in varied and natural ways–outside the box–is not limited to gestures; there are many other approaches, including voice recognition, haptics, eye movements, and even telepathy.

Gesture-driven–These are referred to as “spatial operating environments,” where cameras and sensors read our gestures and translate them into computer commands. Companies like Oblong Industries are developing a universal gesture-based language, so that we can communicate across computing platforms–“where you can walk up to any screen, anywhere in the world, gesture to it, and take control.” (Popular Science, August 2011)

Voice recognition–This is perhaps the most mature of the alternative control interfaces, and products like Dragon NaturallySpeaking have become not only standard on many desktops but are also embedded in many smartphones, giving you the ability to do dictation, voice-to-text messaging, and more.

Haptics–This includes touchscreens with tactile sensations. For example, Tactus Technology is “developing keyboards and game controller knobs [that actually] grow out of touchscreens as needed and then fade away,” and another company, Senseg, is making technology that lets users feel vibrations, clicks, and textures for enhanced touchscreen control of their computers. (BusinessWeek, 20-26 June 2011)

Eye-tracking–For example, new Lenovo computers are using eye-tracking software by Tobii to control the browser and desktop applications, including email and documents. (CNET, 1 March 2011)

Telepathy–Tiny chips implanted in the brain, “the telepathy chip,” are being used to sense electrical activity in nerve cells and thereby “control a cursor on a computer screen, operate electronic gadgets [e.g. television, light switch, etc.], or steer an electronic wheelchair.” (UK DailyMail, 3 Sept. 2009)

Clearly, consumers are not content to type away at keyboards and roll their mice…they want to interact with technology the way they do with other people.

We still seem a little way off from computers understanding us as we really are and communicating with us in kind. For example, can a computer read non-verbal cues, which communication experts say make up something like 70% of our communication? Obviously, this hasn’t happened yet. But when the computer can read what I am really trying to say in all the ways that I am saying it, we will definitely have a much more interesting conversation going on.

The Eyes Have It

In the last couple of weeks, a new innovation by Tobii–eye-tracking technology built into the lid of a laptop computer–has been featured on CNN, in the New York Times (March 27, 2011), and in Bloomberg Businessweek (March 28-April 3, 2011).

Tobii allows users to “control their computers just by looking at them.”

The eye tracker uses infrared lights (like those used in a TV’s remote control) to illuminate the pupils, and optical sensors on the computer screen capture the reflection. Tobii can determine the point of gaze and movement of the eyes to within 2 millimeters.

So forget the mouse–“just look at a particular location on the screen, and the cursor goes there immediately.”

This is a natural user interface that is fast and intuitive, “generally halving the time needed for many chores.”
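To see why "the cursor goes there immediately" is harder than it sounds, consider that raw gaze estimates jitter even during a steady fixation. A minimal sketch of one common, generic fix–exponentially smoothing the gaze samples before moving the cursor–is below. This is purely illustrative: the function name, the sample values, and the smoothing constant are assumptions of mine, not anything from Tobii's actual software.

```python
# Illustrative sketch: smooth a noisy stream of (x, y) gaze samples
# with an exponential moving average before driving the cursor.
# Not Tobii's algorithm--just a common generic approach.

def smooth_gaze(samples, alpha=0.4):
    """Exponentially smooth (x, y) gaze samples.

    alpha near 1 tracks the raw gaze tightly (more jitter);
    alpha near 0 is steadier but lags behind eye movements.
    """
    sx, sy = samples[0]
    smoothed = []
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed

# A jittery fixation around (200, 150): the smoothed track stays
# close to the fixation center instead of bouncing sample-to-sample.
raw = [(200, 150), (204, 148), (198, 152), (201, 149)]
for point in smooth_gaze(raw):
    print(point)
```

The trade-off in `alpha` is the same one any pointing filter faces: too much smoothing and the cursor feels sluggish; too little and it shakes.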

Eye tracking is being tested and planned by Tobii and others for the following uses:

– Read text down the screen and it automatically scrolls.
– Look at a window or folder to choose it.
– Use a map by eyeing a location and then touching it to zoom.
– Activate controls by holding a gaze for a quarter to half a second.
– Play video games by moving through with your eyes.
– Gaze at a character and they will stare back at you.
– Leave your TV and it pauses until you return.
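The fourth item above–activating a control by holding a gaze for a quarter to half a second–is usually called dwell activation. Here is a small sketch of the idea, assuming nothing about any real eye-tracking API: the class name, thresholds, and sample stream are all hypothetical, chosen only to match the 0.25-0.5 second window the article mentions.

```python
import math

# Hypothetical dwell-activation sketch: a control "clicks" when the
# gaze stays within a small radius for roughly a quarter to half a
# second. All names and numbers here are illustrative assumptions.

DWELL_SECONDS = 0.35   # inside the 0.25-0.5 s range mentioned above
DWELL_RADIUS_PX = 30   # how far the gaze may wander and still "hold"

class DwellDetector:
    def __init__(self, dwell_seconds=DWELL_SECONDS, radius_px=DWELL_RADIUS_PX):
        self.dwell_seconds = dwell_seconds
        self.radius_px = radius_px
        self._anchor = None       # (x, y) where the current hold started
        self._anchor_time = None  # timestamp of that first sample

    def update(self, x, y, timestamp):
        """Feed one gaze sample; return True when a dwell completes."""
        if self._anchor is None:
            self._anchor, self._anchor_time = (x, y), timestamp
            return False
        ax, ay = self._anchor
        if math.hypot(x - ax, y - ay) > self.radius_px:
            # Gaze moved away: restart the hold at the new point.
            self._anchor, self._anchor_time = (x, y), timestamp
            return False
        if timestamp - self._anchor_time >= self.dwell_seconds:
            self._anchor = None   # reset so the next dwell starts fresh
            return True
        return False

# Simulated samples at ~60 Hz, jittering slightly around (400, 300):
# about 0.5 s of steady gazing is enough to trigger one activation.
detector = DwellDetector()
fired = False
for i in range(30):
    t = i / 60.0
    fired = detector.update(400 + (i % 3), 300 - (i % 2), t) or fired
print(fired)  # → True
```

The same skeleton covers the map-zoom and window-selection items: only the action taken when `update` returns True changes.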

This technology has the potential to help disabled people (who cannot use a traditional mouse) as well as prevent strains and injuries by reducing some repetitive stress movement.

Within a couple of years, the cost of eye-tracking technology is expected to come down from tens of thousands of dollars to a couple of hundred dollars for a laptop clip-on device–or even less for trackers built right in.

I think another important use for eye tracking is with augmented reality technology: as people navigate and look around their environment, sensors will activate and provide them with all sorts of useful information about what they are seeing.

Ultimately, where this is all going is the addition of a virtual 4th dimension to our vision–where information is overlaid and scrolling on everything around us that we look at, as desired.

This will provide us with an information-rich environment where we can understand more of what we see and experience than ever before. Terminator, here we come!