Emotient, which specializes in facial expression analysis, and iMotions, an eye-tracking and biometric software platform company, have announced that Procter & Gamble, the United States Air Force, and Yale University are the first customers for a newly integrated platform that combines facial expression recognition and analysis, eye-tracking, EEG, and GSR technologies.
According to the companies, the new combined solution is designed for usability research, market research, neurogaming, and academic and scientific research.
Google has filed a patent suggesting users stick out their tongue or wrinkle their nose in place of a password.
The filing suggests that requiring specific gestures could prevent the existing Face Unlock facility from being fooled by photos.
…and then think about Google Glass (or something similar offered by another brand) and the things that become knowable as these technologies are combined and others are added. Iris and face recognition for rear-facing and front-facing identification; knowing precisely what (or whom) someone is looking at the moment a certain change in neurological activity is noted. Or precise targeting of weaponry controlled by eye movement, paired with detailed observation of the neurological states of combatants.
Right now, all of it seems like a long way off, and it is. Significant scientific, technological, and organizational barriers remain. The technology of measurement, the science of interpretation, the fact that many small players own small pieces of the puzzle, and the work of integrating those pieces: each presents a significant challenge. But…
“Most people overestimate what they can do in one year and underestimate what they can do in ten years.”
Stay tuned. Ubiquitous multi-modal sensors, combined with the real-time ability to interpret and act on the data they collect, would have profound effects.