Augmented Reality Glasses

An article1 describes the development of glasses that allow the wearer to read the emotions of the person being viewed. They are the result of research done by Rosalind Picard of the Massachusetts Institute of Technology's Media Lab. The glasses use a vision algorithm to analyze 24 points on the face of the person being viewed. Head gestures and facial expressions (e.g., head tilt, lip part, pucker, smile, frown) are integrated over time to identify emotional states (e.g., confused, agreeing, disagreeing, thinking, concentrating, interested). These analyses are rolled up and portrayed as a traffic-light system: red = negative, amber = neutral, green = positive. A summary of how the person you are talking to is responding is delivered through the glasses via an earpiece and LED traffic lights. Eventually a full range of information could be displayed graphically, although such a display would be challenging.
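To make the traffic-light rollup concrete, here is a minimal sketch of the idea: per-frame expression scores are averaged over time and mapped to red, amber, or green. The valence scale, thresholds, and function names are my own illustrative assumptions, not the actual MIT Media Lab algorithm.

```python
# Hypothetical sketch of the traffic-light rollup described above.
# The valence scale and thresholds are illustrative assumptions,
# not the actual Media Lab algorithm.

def traffic_light(valence: float) -> str:
    """Map a valence score in [-1, 1] to a traffic-light color."""
    if valence > 0.3:
        return "green"   # positive: e.g., interested, agreeing
    if valence < -0.3:
        return "red"     # negative: e.g., confused, disagreeing
    return "amber"       # neutral / mixed signals

def rollup(frame_scores: list[float]) -> str:
    """Integrate per-frame expression scores over time (simple mean)."""
    return traffic_light(sum(frame_scores) / len(frame_scores))

print(rollup([0.8, 0.5, 0.6]))   # mostly positive frames -> green
```

A real system would weight recent frames more heavily and combine many expression channels, but the summarize-then-signal structure is the point.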

Unfortunately, no data were provided regarding the performance of these glasses. Did they miss or misread cues? Perfect performance strains credulity, mine at least, so I would like to have seen some data. However, it does seem clear that the augmented glasses improved upon our normal unaugmented performance. The researchers also used auditory inputs based on variations in the pitch, tone, clip, and volume of the voice. These auditory inputs were recorded using a small electronic badge that hangs around the neck, called the "jerk-o-meter." It provided good feedback to users regarding whether they were being obnoxious or too self-effacing. It also provided good feedback on group performance, indicating who was talking too much and who was being ignored.
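The group-feedback idea can be sketched simply: from per-person speaking times, flag who dominates the conversation and who is being ignored. The thresholds and names below are illustrative assumptions, not the badge's actual algorithm.

```python
# Illustrative sketch of the group-feedback idea from the text:
# flag who is talking too much and who is being ignored.
# The 50% / 10% thresholds are assumptions for illustration only.

def speaking_shares(seconds_by_person: dict[str, float]) -> dict[str, float]:
    """Convert raw speaking times into fractions of total talk time."""
    total = sum(seconds_by_person.values())
    return {person: t / total for person, t in seconds_by_person.items()}

def flag_participation(seconds_by_person, high=0.5, low=0.1):
    """Return who exceeds the dominance threshold and who falls below it."""
    shares = speaking_shares(seconds_by_person)
    return {
        "dominating": [p for p, s in shares.items() if s > high],
        "ignored": [p for p, s in shares.items() if s < low],
    }

result = flag_participation({"Ann": 300, "Bob": 40, "Cam": 20})
print(result)   # Ann spoke over half the time; Cam barely at all
```

The real badge also analyzed vocal features such as pitch and volume; speaking share alone is just the simplest of the signals described.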

The commercial world has expressed substantial interest in these devices. Some companies were interested in identifying the units of speech that make a person sound more persuasive, so that these could be taught to sales representatives. Research has also indicated that wearers retain some ability to read emotions after removing the glasses.

Although the business case for this technology is clear, there are questions that should be raised regarding its general use. In our normal unaugmented state we can misread facial expressions, and these misreadings can lead to problems in personal interactions. Would these augmentations increase our accuracy and enhance personal interactions, or would we become so sensitive that more tiffs broke out? Sometimes we do need to suppress the expression of our feelings to avoid offending people or precipitating an argument. These augmentations would make that suppression more difficult. There is much here for careful consideration and discussion.

1Adee, S. (2011). Your Seventh Sense. New Scientist, 2 July, 32-36.

© Douglas Griffith, 2011. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith with appropriate and specific direction to the original content.


One Response to “Augmented Reality Glasses”

  1. Michelle Says:

    The article says that “The software, by contrast [to people] correctly identifies 64 per cent of the expressions…on real, non-acted faces”
