While humans are known for their ability to recognize emotions on people's faces, computers are only beginning to identify the emotions communicated by facial expressions in an image. Some call this face tracking, and it will help Microsoft target ads more effectively, not only in display advertising but in search ads as well.
Machine-learning tools improve as they ingest and process data. Advances in these tools have allowed computer scientists to create smarter apps that can identify sounds, words, images, and now facial expressions. Accordingly, Microsoft is making available separate application programming interfaces (APIs) for vision, speech, and language.
Last week, the Microsoft Project Oxford team released a new tool that analyzes eight emotional states: anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. Each emotion is identified through what Microsoft engineers call "universal facial expressions." The tool was trained on a set of facial images portraying different human emotions and can categorize the emotions of anyone visible in an image.
These APIs are designed to show off Microsoft's growing AI and machine-learning capabilities for developers who want to incorporate them into their projects. "The Emotion API takes an image as an input, and returns the confidence across a set of emotions for each face in the image, as well as bounding box for the face, using the Face API," according to Microsoft. "If a user has already called the Face API, they can submit the face rectangle as an optional input."
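Based on Microsoft's description above, the response is one entry per detected face, with a bounding box and a confidence score for each of the eight emotions. The sketch below shows how a developer might pick the highest-confidence emotion per face; the exact field names (`faceRectangle`, `scores`) and the sample values are assumptions for illustration, not the official schema.

```python
import json

# Hypothetical response shaped like Microsoft's quoted description:
# one entry per detected face, a bounding box, and a confidence score
# for each of the eight emotions (field names are assumptions).
sample_response = json.dumps([
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
        "scores": {
            "anger": 0.01, "contempt": 0.002, "disgust": 0.003,
            "fear": 0.001, "happiness": 0.95, "neutral": 0.02,
            "sadness": 0.01, "surprise": 0.004,
        },
    }
])

def dominant_emotions(response_text):
    """Return the highest-confidence emotion for each face in the response."""
    faces = json.loads(response_text)
    return [max(face["scores"], key=face["scores"].get) for face in faces]

print(dominant_emotions(sample_response))  # ['happiness']
```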
The technology can also be used as a search tool, grouping collections of photos based on the emotions on the faces in the images.
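The grouping use case above could be sketched as follows, assuming each photo has already been reduced to its dominant emotion via the Emotion API; the filenames and labels here are made up for illustration.

```python
from collections import defaultdict

# Hypothetical per-photo results: each photo maps to the dominant
# emotion already extracted from an API response (names are invented).
photo_emotions = {
    "beach.jpg": "happiness",
    "queue.jpg": "neutral",
    "party.jpg": "happiness",
    "exam.jpg": "fear",
}

def group_by_emotion(photo_emotions):
    """Group photo filenames under their dominant emotion label."""
    groups = defaultdict(list)
    for photo, emotion in photo_emotions.items():
        groups[emotion].append(photo)
    return dict(groups)

print(group_by_emotion(photo_emotions))
# {'happiness': ['beach.jpg', 'party.jpg'], 'neutral': ['queue.jpg'], 'fear': ['exam.jpg']}
```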
Microsoft released the emotion tool to developers as a public beta, with plans to move the other tools into public beta by the end of the year. The forthcoming tools include spell check, video, speaker recognition, and custom recognition intelligent services.