Hey, Gesture Like This!

This new gesture-recognition technology from Leap Motion is amazing: “For the first time, you can control a computer in three dimensions with your natural hand and finger movements.”

It is the closest yet to getting us to the vision in the movie Minority Report.

“Leap is more accurate than a mouse, as reliable as a keyboard, and more sensitive than a touchscreen.”

Scroll, pinpoint, pan, play, shoot, design, compose, fly: just about everything you do onscreen, but more in sync with how we naturally interact with our environment and each other.

I like when the guy in the video reaches forward and the hands on the screen reach right back at him!

I’d be interested to see whether this can replace a keyboard for typing, or whether it will be augmented by really good voice recognition and natural language processing; then we would have an integration of verbal and non-verbal communication cues.

In the future, add in the ability to read our facial expressions, as a robot might, and then we may have some real interaction going on mentally and perhaps, dare I say it, even emotionally.

According to Bloomberg BusinessWeek (24 May 2012), the Leap is the size of a “cigarette lighter that contains three tiny cameras inside” and costs just $70, “about half the price of a Kinect.”

The Leap is so sophisticated that it can “track all 10 of a user’s fingers and detect movements of less than one-hundredth of a millimeter.”
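To make that concrete, here is a minimal sketch of the kind of logic a gesture recognizer layers on top of raw fingertip positions. The coordinates, thresholds, and function name are all invented for illustration; the real Leap SDK exposes much richer frame, hand, and finger data.

```python
# Hypothetical sketch: detecting a horizontal swipe from a stream of
# fingertip positions (x, y, z in millimeters). The actual Leap Motion
# SDK differs; this only illustrates the idea of turning precise
# position samples into a gesture.

def detect_swipe(positions, min_distance=80.0, max_drift=20.0):
    """Return 'left'/'right' if the fingertip path looks like a swipe, else None."""
    if len(positions) < 2:
        return None
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    dx = xs[-1] - xs[0]                # net horizontal travel
    drift = max(ys) - min(ys)          # vertical wobble along the path
    if abs(dx) >= min_distance and drift <= max_drift:
        return "right" if dx > 0 else "left"
    return None
```

With sub-millimeter tracking, even tight thresholds like these become practical, which is what separates this from coarser motion sensors.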

At their site, I see you can even preorder one now, with estimated shipping at the end of the year.

I think I’ll put this on my holiday gift list. 😉

Seeing Is Believing

This robotic seeing eye dog from Japanese company NSK is an incredible display of how technology can help the blind and was profiled in PopSci on 9 November 2011.
While there are reports of many advances in returning sight to the blind through such breakthroughs as stem cell molecular regeneration and camera-like retinal implants, there will unfortunately be medical cases that cannot be readily cured and herein lies the promise for robotic guide dogs.
These dogs do not provide the same companionship that real dogs do, but they also don’t require the same care and feeding, which can be taxing, especially, I would imagine, on someone with a disability.
The Robotic Seeing Eye Dog can roll on flat surfaces and can climb stairs or over other obstacles.
It is activated by a person holding and putting pressure on its “collar” handlebar.
The robotic dog can also speak, alerting its handler to specific environmental conditions and potential obstacles, obviously better than a traditional dog’s bark could.
The dog is outfitted with Microsoft Kinect technology for sensing and navigating the world.
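As a rough illustration of the Kinect-style sensing idea, here is a tiny hypothetical sketch of how a row of depth readings could be turned into a spoken-style obstacle alert. The function, thresholds, and alert wording are all my assumptions; NSK’s actual software is not public.

```python
# Hypothetical sketch: raising an alert from one row of depth readings
# (in meters), the kind a Kinect-style depth camera might return.
# Thresholds and wording are invented for illustration.

def obstacle_alert(depths, warn_at=1.0):
    """Return an alert string if anything in the depth row is closer than warn_at."""
    nearest = min(depths)
    if nearest < warn_at:
        # Left half of the readings -> obstacle on the left, else right.
        side = "left" if depths.index(nearest) < len(depths) // 2 else "right"
        return f"Obstacle {nearest:.1f} meters ahead on the {side}"
    return None
```

A real system would obviously fuse many frames and full depth images, but the principle, depth in, spoken guidance out, is the same.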
It is amazing to me how gaming technology here ends up helping the blind. But every technological advance has the potential to spur unintended uses and benefits in other areas of our life.
Recently, I saw an advertisement for MetLife insurance that proclaimed “for the ifs in life,” and given all the uncertainties that can befall us at virtually any time, I feel grateful to G-d for the innovation and technology that He bestows on people for helping us handle these. Sometimes the advances are direct, as with Apple’s laser-like focus on user-centric design for numerous commercial technologies; other times they are more indirect, as with the Kinect being used to help the blind, or even the Internet itself, originally developed by the military’s DARPA.
I imagine the technology cures and advances that we achieve are almost like a race against the clock, where people come up with counters to the ifs and threats out there, adapting and adopting from the latest and greatest technology advances available.
Advances such as the Kinect, and the robotic seeing eye dog it enabled, bring us a little closer, step by step, each time incrementally, to handling the next challenge that calls.
This week, I was reminded again, with the massive asteroid YU55 speeding past us at 29,000 mph and within only 202,000 miles of a potential Earth collision (within the Moon’s orbit!), that there are many more ifs to come, and I wonder whether we will be ready, whether we really can be, and whether we will handle these through direct or indirect discoveries.

Display It Everywhere


We are getting closer to the day when mobile computing will truly be just a computer interaction anywhere–on any surface or even on no surface.
In this video we see the OmniTouch, developed by Microsoft in conjunction with Carnegie Mellon University, display your computer interface on everyday objects: yourself, a table, a wall, and so on.
This takes the Kinect gaming technology to a whole new level in that the OmniTouch doesn’t just detect and sense your motions and gestures; it meshes them with the way people typically interact with computers.
Using a wearable pico projector and a depth camera, the OmniTouch creates a human-computer interface with a full QWERTY keyboard and touch, pan, zoom, and scroll capabilities.
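As a loose illustration of the projected-keyboard idea, here is a sketch of mapping a detected touch point back to a QWERTY key once the keyboard’s projected grid is known. The layout constants and function are invented; OmniTouch’s real pipeline also does depth segmentation and calibration to the surface.

```python
# Hypothetical sketch: which key sits under a touch point on a projected
# QWERTY keyboard. Row layout and key sizes are assumptions for
# illustration; a real system would first warp camera coordinates onto
# the projected surface.

ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_W, KEY_H = 40, 40  # projected key size in pixels (assumed)

def key_at(x, y):
    """Return the character under projected pixel (x, y), or None if off-keyboard."""
    row = int(y // KEY_H)
    col = int(x // KEY_W)
    if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
        return ROWS[row][col]
    return None
```

The interesting part of OmniTouch is everything before this step, deciding from depth data that a finger actually touched the surface; once that is known, the lookup itself is simple.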
It’s amazing to see the person demonstrating the interaction with the computer practically in thin air. Oh boy, Minority Report, here we come. 😉
Of course, to become a viable consumer solution, the shoulder-mounted contraption has got to get really small, no bigger than a quarter maybe, and able to be mounted, with processors and connectivity, unobtrusively in clothing, furniture, or right into the construction of your home or office.
At that point, and it hurts to say it given how much I love my iPhone, computers will no longer be device-driven; rather, the application will take center stage.

And the ability to project and click anywhere, anytime helps us reach a new level of mobility and convenience that almost boggles the senses.

Fitting Every Consumer Like A VIP

Check out this new augmented reality virtual fitting room technology, called the Virtual Interactive Podium (VIPodium), by the Russian company Fitting Reality.
Using Kinect motion-sensing technology, you use simple gestures to:
– Select, mix and match, and try on clothes in 3-D
– Twirl around and see yourself in 360 degrees
– Take pictures, email them, or share them on Facebook
– Get outfit information, including sizing
– Place clothing in wish lists for future consideration or into the online shopping cart for purchase
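Under the hood, a gesture list like the one above boils down to an event-to-action mapping. Here is a minimal hypothetical sketch of that dispatch; the gesture names and actions are invented, since Fitting Reality’s actual API is not public.

```python
# Hypothetical sketch: routing recognized fitting-room gestures to the
# kinds of actions VIPodium demonstrates. All names are invented.

ACTIONS = {
    "swipe_left": "next outfit",
    "swipe_right": "previous outfit",
    "raise_hand": "take photo",
    "hold": "add to wish list",
}

def handle(gesture):
    """Return the shop action for a recognized gesture, or 'ignored'."""
    return ACTIONS.get(gesture, "ignored")
```

The hard part, as with all the systems in this post, is the recognition step that produces the gesture names in the first place.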
Here is another video of a very cool implementation of this technology at the men’s and women’s TopShop store in Moscow.
The people are visibly engaged and excited, shopping and trying on clothes using the latest here for “tech-savvy fashionistas.”
Honestly though, I see this more as an augmentation to physical fitting rooms, in terms of helping select clothes, rather than a replacement, since really seeing how something fits means actually putting the outfit on.
VIPodium beats simply holding a new outfit up to yourself and looking in the mirror, but it doesn’t come close to seeing how it really feels when you put it on.
However, add in the interactive social media features, the available information, and the ability to shop online, and I think you’ve got something that makes every consumer feel like a VIP.
Happy shopping!

Computer, Read This


In 2002, Tom Cruise waved his arms in swooping fashion to control his Pre-Crime fighting computer in Minority Report, and this was just the tip of the iceberg when it comes to consumer interest in moving beyond the traditional keyboard, trackpad, and mouse to control our technology.

For example, there are the Nintendo Wii and Microsoft Kinect in the gaming arena, where we control the technology with our physical motions rather than hand-held devices. And consumers seem to really like having a controller-free gaming system. The Kinect sold so quickly, at the rate of roughly 133,000 per day during its first three months, that it earned the Guinness World Record for fastest-selling consumer device. (Mashable, 9 March 2011)

Interacting with technology in varied and natural ways, outside the box, is not limited to just gestures; there are many others, such as voice recognition, haptics, eye movements, and even telepathy.

Gesture-driven–These are referred to as “spatial operating environments,” where cameras and sensors read our gestures and translate them into computer commands. Companies like Oblong Industries are developing a universal gesture-based language, so that we can communicate across computing platforms, “where you can walk up to any screen, anywhere in the world, gesture to it, and take control.” (Popular Science, August 2011)

Voice recognition–This is perhaps the most mature of the alternative control interfaces, and products like Dragon NaturallySpeaking have become not only standard on many desktops but also embedded in many smartphones, giving you the ability to do dictation, voice-to-text messaging, and more.

Haptics–This includes touchscreens with tactile sensations. For example, Tactus Technology is “developing keyboards and game controller knobs [that actually] grow out of touchscreens as needed and then fade away,” and another company, Senseg, is making technology that lets users feel vibrations, clicks, and textures for enhanced touchscreen control of their computers. (BusinessWeek, 20-26 June 2011)

Eye-tracking–For example, new Lenovo computers are using eye-tracking software by Tobii to control the browser and desktop applications, including email and documents. (CNET, 1 March 2011)

Telepathy–Tiny chips implanted in the brain, “the telepathy chip,” are being used to sense electrical activity in the nerve cells and thereby “control a cursor on a computer screen, operate electronic gadgets [e.g. television, light switch, etc.], or steer an electronic wheelchair.” (UK DailyMail, 3 Sept. 2009)
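One way to see how these modalities relate: each ultimately produces events that software maps to the same familiar commands. This is a purely illustrative sketch; the modality names, event names, and mappings are all invented.

```python
# Illustrative sketch: gestures, voice, and gaze all reduce to
# (modality, payload) events that get mapped to UI commands.
# Every name here is an assumption for illustration.

def to_command(event):
    """Map a (modality, payload) event to a UI command string."""
    modality, payload = event
    if modality == "gesture" and payload == "pinch":
        return "zoom"
    if modality == "voice":
        return payload.lower().strip()   # spoken command passes through
    if modality == "gaze" and payload == "dwell":
        return "click"
    return "noop"
```

Seen this way, the keyboard and mouse were never the point; they were just one event source among many.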

Clearly, consumers are not content to type away at keyboards and roll their mice…they want to interact with technology the way they do with other people.

It still seems a little ways off for computers to understand us the way we really communicate. For example, can a computer read non-verbal cues, which communication experts say make up something like 70% of our communication? Obviously, this hasn’t happened yet. But when the computer can read what I am really trying to say in all the ways that I am saying it, we will definitely have a much more interesting conversation going on.
