Touch Free, Just Use Your Head

Israel Innovation News is reporting a very simple but cool new technology for the disabled.

It enables them to “read, play games, search the web, and make calls without the need for touch.”

Sesame Reader, from the Google Play Store, “tracks your face and allows you to turn [eReader] pages with the movement of your head.”

You can also dial a number or type on a keyboard by using head movements to control the cursor and hovering over a button to “click” it.
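This is not Sesame’s actual code, but the hover-to-click idea can be sketched in a few lines: given a generic stream of (timestamp, x, y) cursor samples (say, derived from head tracking), a dwell timer fires a “click” once the cursor stays put long enough. All names and thresholds here are hypothetical.

```python
import math

DWELL_TIME = 1.0   # seconds the cursor must hover before a "click" fires
RADIUS = 20.0      # pixels the cursor may drift and still count as hovering

def dwell_clicks(samples, dwell=DWELL_TIME, radius=RADIUS):
    """Given (timestamp, x, y) cursor samples, return the (x, y) points
    where a dwell 'click' should fire (at most once per hover)."""
    clicks = []
    anchor = None          # (t, x, y) where the current hover started
    fired = False
    for t, x, y in samples:
        if anchor is None or math.hypot(x - anchor[1], y - anchor[2]) > radius:
            anchor, fired = (t, x, y), False   # cursor moved: restart the timer
        elif not fired and t - anchor[0] >= dwell:
            clicks.append((x, y))              # hovered long enough: click once
            fired = True
    return clicks

# Cursor hovers near (100, 100) for over a second, then moves away.
samples = [(0.0, 100, 100), (0.5, 104, 98), (1.1, 101, 103), (1.5, 300, 200)]
print(dwell_clicks(samples))  # one click, near (100, 100)
```

The same dwell logic underlies many hands-free and eye-tracking interfaces: movement resets the timer, stillness selects.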

This helps people to function in a digital world, when otherwise they couldn’t.

Hence, the name Sesame from Ali Baba’s magic phrase “Open Sesame.”

Now people can read, write, and interact with others online–even when they don’t have use of their limbs because of neurological, muscular, or other structural defects, or if they simply want hands-free use.

Touchscreens, keyboards, and keypads are now accessible to anyone with a simple turn of the head–up, down, left, and right is all it takes to navigate, touchless. 😉

Picture Frames For The Computer Geek In U

Found these two interesting picture frames.

They seem the perfect gift for that computer geek in your life.

The one on the left looks like a memory module or circuit board from a computer.

While the one on the right is made from a recycled keyboard.

I like that they are both made from recycled material and of course, the computer theme.

While I’m not exactly a computer geek, I think they are very cool, indeed. 😉

(Source Photo: Andy Blumenthal)

They Aren’t Smartwatches…they Are Dumbwatches

The Wall Street Journal asks “Is it Time for Smartwatches?”

With the arrival of the first generation of smartwatches–Samsung Galaxy Gear, Pebble, and Sony Smartwatch–we have hit rock bottom in innovation and design thinking.

These watches look cheap–flimsy plastic, ultra-thin aluminum, or even stainless steel doesn’t cut it as a fashion statement when larger and more substantial is in.

The screens are too small to be user-centric–let alone leave any room for a physical or soft keyboard.

You can’t really read on it and you can’t type on it (any significant email or texting)–except by voice command. Ah, let me talk into my wrist, no!

Also, for videos or gaming, the small rectangular screens aren’t of any useful function–how much of Madonna’s new wild getup can you see or how far can you fling that angry bird on your wrist?

Downloading music on the Gear, uh, also no.

Taking photos with the 1.9-megapixel camera on the Galaxy Gear–at a time when even the 8 megapixels on the iPhone are running way short–is good for maybe a James Bond, but not anyone else.

Plus for smartwatches like the Gear, you still need to pair it with a companion smartphone for it to work, so you now have added expense (between about $150 for the Pebble and $299 for the Gear smartwatch) with no significant added benefit.

For the Gear, you also have a separate charger because the watch only has a battery life of about a day, while the Pebble and Sony Smartwatch 2 get between half a week and a week.

And believe it or not, the Galaxy Gear is not compatible with Samsung’s own Galaxy S4 smartphone–oh, so very smart.

My 16-year-old daughter said, “If they had this 10 years ago maybe, but now, who needs it!”

No, Google Glass has it right–concept yes, fashion still to be worked out–and the smartwatches for now, have it wrong, wrong, wrong.

If you buy it, you’ve bought yourself a very dumb watch.

Maybe the iWatch can save the day? 😉

(Source Photo: here with attribution to Nathan Chantrell)

Magic Computer Displays

This is some awesome technology from Tactus Technology.

It is called a dynamic tactile touchscreen.

Here’s how it works:

When you want to type on a tablet or other touchscreen display, not only do you see a QWERTY keyboard, but the buttons actually rise out of the flatscreen display–for a tactile typing experience.

Using microfluidics, the fluids in the screen actually change shape–and form buttons.

When you’re done typing, the keyboard buttons melt away back down into the screen.

It all happens in a split second and has negligible impact on power consumption (i.e. less than 1%).

This type of tactile experience with computer displays can be used for tablets, smartphones, gaming devices, and I would imagine even SCADA devices (e.g., for turning a dial, pulling a lever, etc., all virtually on a monitor).

Goodbye physical controls and hello magic touchscreen–presto chango. 😉

Hey, Gesture Like This!

This new gesture-recognition technology from Leap Motion is amazing. “For the first time, you can control a computer in three dimensions with your natural hand and finger movements.”

It’s the closest yet to the vision in the movie Minority Report.

“Leap is more accurate than a mouse, as reliable as a keyboard, and more sensitive than a touchscreen.”

Scroll, pinpoint, pan, play, shoot, design, compose, fly–just about everything you do onscreen, but more in sync with how we generally interact with our environment and each other.

I like when the guy in the video reaches forward and the hands on the screen reach right back at him!

I’d be interested to see how this can be used to replace a keyboard for typing, or whether it will be augmented by really good voice recognition and natural language processing–then we would have an integration of verbal and non-verbal communication cues.

In the future, add in the ability for it to read our facial expressions, like a robot would, and then we may have some real interaction going on mentally and perhaps, dare I say it, even emotionally.

According to Bloomberg BusinessWeek (24 May 2012), the Leap is just the size of a “cigarette lighter that contains three tiny cameras inside” and costs just $70–“about half the price of a Kinect.”

The Leap is so sophisticated that it can “track all 10 of a user’s fingers and detect movements of less than one-hundredth of a millimeter.”
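Leap’s real SDK isn’t shown in the post, but as an illustration of how per-finger tracking data might be turned into commands, here is a toy “pinch” detector over per-frame fingertip positions. The frame format, names, and threshold are my assumptions for the sketch, not Leap’s actual API.

```python
import math

PINCH_THRESHOLD = 15.0  # mm between thumb and index tips to count as a pinch

def detect_pinches(frames, threshold=PINCH_THRESHOLD):
    """frames: list of dicts with 3-D 'thumb' and 'index' tip positions in mm.
    Returns the frame indices where a pinch begins (edge-triggered, so a held
    pinch fires only once)."""
    events, pinching = [], False
    for i, frame in enumerate(frames):
        d = math.dist(frame["thumb"], frame["index"])
        if d < threshold and not pinching:
            events.append(i)       # fingers just came together: pinch starts
            pinching = True
        elif d >= threshold:
            pinching = False       # fingers apart again: ready for next pinch
    return events

frames = [
    {"thumb": (0, 0, 0), "index": (40, 0, 0)},   # hand open
    {"thumb": (0, 0, 0), "index": (10, 0, 0)},   # pinch
    {"thumb": (0, 0, 0), "index": (12, 0, 0)},   # still pinched (no new event)
    {"thumb": (0, 0, 0), "index": (50, 0, 0)},   # released
]
print(detect_pinches(frames))  # → [1]
```

The edge-triggering is the important design choice: a gesture interface maps continuous positions into discrete events, so each pinch, swipe, or grab should fire once, not on every frame.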

At their site, I see you can even preorder these now, with estimated shipping at the end of the year.

I think I’ll put this on my holiday gift list. 😉

Computer, Read This

In 2002, Tom Cruise waved his arms in swooping fashion to control his Pre-Crime fighting computer in Minority Report, and this was just the tip of the iceberg when it comes to consumer interest in moving beyond traditional keyboards, trackpads, and mice to control our technology.

For example, there are the Nintendo Wii and Microsoft Kinect in the gaming arena, where we control the technology with our physical motions rather than hand-held devices. And consumers seem to really like having a controller-free gaming system. The Kinect sold so quickly–at a rate of roughly 133,000 units per day during its first three months–that it earned the Guinness World Record for fastest-selling consumer device. (Mashable, 9 March 2011)

Interacting with technology in varied and natural ways–outside the box–is not limited to gestures; there are many others, such as voice recognition, haptics, eye movements, and even telepathy.

Gesture-driven–This is referred to as “spatial operating environments”–where cameras and sensors read our gestures and translate them into computer commands. Companies like Oblong Industries are developing a universal gesture-based language, so that we can communicate across computing platforms–“where you can walk up to any screen, anywhere in the world, gesture to it, and take control.” (Popular Science, August 2011)

Voice recognition–This is perhaps the most mature of the alternative control interfaces, and products like Dragon NaturallySpeaking have become not only standard on many desktops but are also embedded in many smartphones, giving you the ability to do dictation, voice-to-text messaging, etc.

Haptics–This includes touchscreens with tactile sensations. For example, Tactus Technology is “developing keyboards and game controller knobs [that actually] grow out of touchscreens as needed and then fade away,” and another company, Senseg, is making technology that produces sensations, so users can feel vibrations, clicks, and textures for enhanced touchscreen control of their computers. (BusinessWeek, 20-26 June 2011)

Eye-tracking–For example, new Lenovo computers are using eye-tracking software by Tobii to control the browser and desktop applications, including email and documents. (CNET, 1 March 2011)

Telepathy–Tiny chips implanted in the brain, “the telepathy chip,” are being used to sense electrical activity in nerve cells and thereby “control a cursor on a computer screen, operate electronic gadgets [e.g. television, light switch, etc.], or steer an electronic wheelchair.” (UK DailyMail, 3 Sept. 2009)

Clearly, consumers are not content to type away at keyboards and roll their mice…they want to interact with technology the way they do with other people.

It still seems a little way off for computers to understand us the way we really are and to communicate accordingly. For example, can a computer read non-verbal cues, which communication experts say make up something like 70% of our communication? Obviously, this hasn’t happened yet. But when the computer can read what I am really trying to say in all the ways that I am saying it, we will definitely have a much more interesting conversation going on.

(Source Photo: here)