You Can’t Hide Your Feelings


You can try to hide your feelings, but it won’t work…



Your emotions are now an open book to anyone with facial-recognition software, such as from Emotient, Affectiva, and Eyeris.



A video from Emotient shows Dr. Marian Bartlett demonstrating how the system picks up on her expressions of joy, sadness, surprise, anger, fear, disgust, and contempt.



From broad displays of emotion, to subtle, spontaneous natural expressions, to fast, involuntary micro-expressions, the system detects and clearly labels them all.



As described in the Wall Street Journal, the software uses “algorithms to analyze people’s faces” in real time. It is based on the work of Dr. Paul Ekman, who pioneered the study of facial expressions, creating a catalog in the 1970s with “more than 5,000 muscle movements” linked to the emotions they reveal.



A single frame of a person’s face can be used to extract 90,000 data points from “abstract patterns of light to tiny muscle movements, which get sorted by emotional categories.”
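The sorting step this describes can be sketched at a toy level: features extracted from one frame are scored against each of Ekman’s seven categories, and a softmax turns the scores into probabilities. Everything below, including the feature count, the weights, and the `classify_frame` helper, is an illustrative assumption, not Emotient’s actual model.

```python
import math
import random

EMOTIONS = ["joy", "sadness", "surprise", "anger", "fear", "disgust", "contempt"]

def classify_frame(features, weights):
    """Score each emotion as a weighted sum of facial features,
    then convert the scores to probabilities with a softmax."""
    scores = [sum(w * f for w, f in zip(weights[e], features)) for e in EMOTIONS]
    peak = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    probs = {e: x / total for e, x in zip(EMOTIONS, exps)}
    return max(probs, key=probs.get), probs

# Illustrative setup: 4 made-up features (e.g., lip-corner pull, brow raise,
# nose wrinkle, eyelid tightening) with random small weights, except that
# "joy" is deliberately tied strongly to the first feature.
random.seed(0)
weights = {e: [random.uniform(-0.5, 0.5) for _ in range(4)] for e in EMOTIONS}
weights["joy"] = [3.0, 0.0, 0.0, 0.0]

label, probs = classify_frame([1.0, 0.1, 0.0, 0.2], weights)
print(label)  # a strong lip-corner pull reads as "joy" in this toy setup
```

A real system would, of course, learn its weights from millions of labeled examples rather than hand-code them, but the shape of the final step is the same: many numeric features in, one emotion category out.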



With databases of billions of expressions from millions of faces in scores of countries around the world, the software works across ethnically diverse groups. 

Emotion detection has myriad applications: from national-security surveillance and interrogation, to in-store product marketing and gauging advertising effectiveness, to helping professionals (teachers, motivational speakers, executives, even politicians) hold people’s attention and improve their messaging.



Then imagine very personal uses, such as the software being used to evaluate job applicants or to tell whether a spouse is lying about an affair…where does it end?



Of course, reading people’s faces without their knowledge or consent raises serious privacy issues, as well as the possibility of false positives, where people’s feelings are wrongly pegged or misinterpreted.



In the end, unless you wear a physical mask or can spiritually transcend it all, we can see you, and soon we will know not just what you are feeling, but what you are thinking as well…it’s coming. 😉

>Balancing Freedom and Security


There is a new vision for security technology that blends high-tech with behavioral psychology, so that we can seemingly read people’s minds for their intent to do harm.

There was a fascinating article (8 January 2010) by the AP, via Fox News, titled “Mind-Reading Systems Could Change Air Security.”

One Israel-based company, WeCU (read as “we see you”) Technologies, “projects images onto airport screens, such as symbols associated with a certain terrorist group or some other image only a would-be terrorist would recognize.”

Then hidden cameras and sensors monitoring the airport pick up on human reactions such as “darting eyes, increased heartbeats, nervous twitches, faster breathing,” or rising body temperature.

According to the article, a more subtle version of this technology, called Future Attribute Screening Technology (FAST), is being tested by the Department of Homeland Security. Travelers can either be passively scanned as they walk through security or, when pulled aside for additional screening, be subjected to “a battery of tests, including scans of facial movements and pupil dilation, for signs of deception. Small platforms similar to balancing boards…would help detect fidgeting.”

The new security technology combined with behavioral psychology aims to detect those who harbor ill will through the “display of involuntary physiological reactions that others—such as those stressed out for ordinary reasons, such as being late for a plane—don’t.”
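The screening logic described above can be caricatured as simple anomaly detection: compare a traveler’s involuntary readings against baseline ranges and flag only those travelers whose signals are jointly elevated, so that ordinary stress does not trip the alarm. The signal names come from the article; the ranges, thresholds, and rule below are illustrative assumptions, not FAST’s actual criteria.

```python
# Illustrative baseline ranges for the involuntary signals the article
# mentions. The numbers and the flagging rule are made up for demonstration;
# the real criteria and sensor fusion are not public.
BASELINES = {
    "heart_rate_bpm":  (60, 100),
    "breaths_per_min": (12, 20),
    "body_temp_c":     (36.1, 37.2),
    "eye_fixation_ms": (200, 600),   # very short fixations ~ "darting eyes"
}

def elevated_signals(readings):
    """Return the names of signals that fall outside their baseline range."""
    flags = []
    for name, (low, high) in BASELINES.items():
        value = readings.get(name)
        if value is not None and not (low <= value <= high):
            flags.append(name)
    return flags

def needs_secondary_screening(readings, min_flags=2):
    """Require multiple jointly elevated signals, so that a traveler who is
    merely late for a plane (one elevated reading) is not flagged."""
    return len(elevated_signals(readings)) >= min_flags

late_for_plane = {"heart_rate_bpm": 110, "breaths_per_min": 18,
                  "body_temp_c": 36.8, "eye_fixation_ms": 420}
print(needs_secondary_screening(late_for_plane))  # False: only one signal is off
```

The `min_flags` requirement is the toy version of the article’s key distinction: stress from ordinary causes tends to show up in one signal, while the pattern of interest is several involuntary reactions at once.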

While the technology married to psychology is potentially a potent mix for detecting terrorists or criminals, this trend raises various concerns, such as:

1) Becoming Big Brother—As we tighten up the monitoring of people, are we becoming an Orwellian society, where surveillance is ubiquitous?

2) Targeting “Precrimes”—Are we moving toward a future like the movie Minority Report, where people come under fire for merely thinking about breaking the law?

3) Profiling—How do we protect against discriminatory profiling while still ensuring reasonable scanning?

4) Hardships—Will additional security scanning, searches, and interrogations cause delays and inconvenience to travelers?

5) Privacy—At what point are we infringing on people’s privacy and being overly intrusive?

As a society, we are learning to balance the need for security with safeguarding our freedoms and fundamental rights. Certainly, we don’t want to trade our democratic ideals and the value we place on our core humanity for a totalitarian state with rigid social controls. Yet, at the same time, we want to live in peace and security, and must commit to stopping those with bad intentions from doing us harm.

The duality of security and freedom that we value and desire for ourselves and our children will no doubt arouse continued angst as we must balance the two. However, with high-technology solutions supported by sound behavioral psychology and maybe most importantly, good common sense, we can continue to advance our ability to live in a free and secure world—where “we have our cake and eat it too.”

>Architecting Politics With Mind Reading


Enterprise architecture is good when it helps the organization improve mission execution and achieve desired operational results. However, what happens when it is used to influence politics and elections, such as through the application of brain-scanning technology to get votes?

The Wall Street Journal (14 December 2007) reports that neuroscience and mind reading are being used for “monitoring voters’ brains, pupils and pulses,” which “may be more effective than listening to what they have to say.”

How is this monitoring taking place?

Technology is being applied to support the information requirements of the mission, which in this case is to see which candidate messages provoke positive reactions and voting tendencies in the populace. For example, recently, “volunteers watched the debate while wearing electrode-studded headsets that track electrical activity in the brain.”

How widespread is this?

“Neuromarketing firms across the country are pitching their services to presidential campaigns.” One example is “a biofeedback program that tracks brain waves, pupil dilation, perspiration, and facial-muscle movements.”
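As a toy illustration of how such a biofeedback program might fuse its four channels, the sketch below normalizes each reading against a resting value and averages the relative changes into a single “reaction intensity” score. The channel names come from the article; the resting values, units, and equal weighting are assumptions for demonstration only.

```python
# Toy fusion of the four biofeedback channels the article lists.
# Resting values and equal weighting are illustrative assumptions.
RESTING = {
    "brain_wave_uv":       10.0,   # EEG amplitude, microvolts
    "pupil_diameter_mm":    3.0,
    "skin_conductance_us":  2.0,   # perspiration proxy, microsiemens
    "facial_emg_uv":        5.0,   # facial-muscle activity
}

def reaction_intensity(readings):
    """Average the relative change of each channel from its resting
    value; 0.0 means no reaction, 1.0 means +100% on average."""
    changes = [(readings[ch] - rest) / rest for ch, rest in RESTING.items()]
    return sum(changes) / len(changes)

calm  = {"brain_wave_uv": 10.0, "pupil_diameter_mm": 3.0,
         "skin_conductance_us": 2.0, "facial_emg_uv": 5.0}
moved = {"brain_wave_uv": 15.0, "pupil_diameter_mm": 4.5,
         "skin_conductance_us": 3.0, "facial_emg_uv": 7.5}
print(reaction_intensity(calm))   # 0.0
print(reaction_intensity(moved))  # 0.5
```

A campaign could then compare these scores, moment by moment, against what the candidate was saying, which is precisely what makes the technique both powerful and unsettling.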

One Stanford political scientist, who works with the American National Election Studies, said that “we need a tricky way to get into people’s minds and find out who they’re going to vote for instead of asking directly.” Another research director stated that “traditional methods of polling voters are sometimes inaccurate — ‘people may say one thing in a focus group and do another thing in the voting booth.’”

How long has this been going on?

“In the 1980s, focus groups became popular, as did ‘dial groups’ where participants register their reactions to candidates with electronic dials. The most cited innovation in 2004 was microtargeting, a strategy borrowed from corporate marketing firms that involves tailoring specific messages to individual households based on their consumer profiles…in recent years, advances in brain scanning technology have allowed researchers to identify areas of the brain involved in political beliefs.”

The neuroethics program director at the University of Pennsylvania stated that “taken to its logical limit, it’s a kind of mind reading,” while a Democratic strategist stated that “the wide adoption of these techniques is inevitable.”

From a User-centric EA perspective, one has to wonder whether this type of “mind reading” and architecting of politics using the latest in brain-scanning technologies is really appropriate. Why are we “tailoring the message” to voters, based on the technical scanning of their brains, instead of being honest and forthright about political means and ends? Perhaps we are getting too slick with the technology and applying it a little too liberally to an area that should be quite personal and important: people’s politics and voting behaviors.