From an Amazon Echo-killer to a VR headset, Google’s I/O conference was oozing with big news this week. At least in the short term, however, some of the more modest developments may be of
greater interest to mobile developers.
Take Google’s new Awareness API, which was designed to help developers build apps that intelligently respond to users’ real-world
situations.
The API encompasses seven types of context, including location, weather, user activity, and nearby beacons.
What makes things really interesting is that developers are encouraged to combine context signals, make inferences about a user's present situation, and then use that information to provide customized experiences.
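The combination idea is easiest to see in miniature. The sketch below does not use the real Awareness API classes; `Snapshot` and the three predicates are hypothetical stand-ins that show how independent context signals can be composed into a single inference.

```java
import java.util.function.Predicate;

public class ContextDemo {
    // Hypothetical stand-in for a snapshot of the user's current context
    // (not a real Awareness API type).
    record Snapshot(String activity, boolean headphonesPluggedIn, int hourOfDay) {}

    public static void main(String[] args) {
        // Each raw signal is expressed as a predicate over the snapshot.
        Predicate<Snapshot> running    = s -> s.activity().equals("RUNNING");
        Predicate<Snapshot> headphones = Snapshot::headphonesPluggedIn;
        Predicate<Snapshot> evening    = s -> s.hourOfDay() >= 18;

        // Combined inference: the user is out for an evening run with
        // headphones in, so an app might queue up a livelier playlist.
        Predicate<Snapshot> suggestUpbeatMusic = running.and(headphones).and(evening);

        Snapshot now = new Snapshot("RUNNING", true, 19);
        System.out.println(suggestUpbeatMusic.test(now)); // prints "true"
    }
}
```

The point is that no single signal is very informative on its own; the value comes from the conjunction.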
The possibilities seem endless.
Imagine Pandora suggesting a mellow track when you plug in your headphones on a weeknight, but something livelier as you’re approaching a favorite running trail.
Or imagine Uber asking if you’d like a ride home from a friend’s house because it’s getting late and temperatures in your area have dropped to dangerous levels.
The new API actually comprises two distinct APIs that apps can use to read context signals and determine a user’s current situation. The Fence API notifies an app whenever a specified combination of context conditions is met, letting it react as the user’s situation changes. The Snapshot API, by contrast, lets an app request information about the user’s current context on demand.
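The distinction is push versus pull, and a toy model makes it concrete. The sketch below is not the Play Services API; `Context`, `Fence`, `update`, and `snapshot` are hypothetical stand-ins that illustrate a fence firing a callback when its conditions become true, alongside a snapshot-style query of the current state.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

public class FenceVsSnapshotDemo {
    // Hypothetical stand-ins, not real Awareness API types.
    record Context(String activity, boolean headphonesIn) {}

    // A toy "fence": a condition plus a callback fired when it becomes true.
    record Fence(Predicate<Context> condition, Consumer<Context> onTrigger) {}

    static final List<Fence> fences = new ArrayList<>();
    static Context current = new Context("STILL", false);

    // Snapshot-style access: pull the current context on demand.
    static Context snapshot() { return current; }

    // Fence-style access: push a notification when the condition
    // transitions from false to true as the context changes.
    static void update(Context next) {
        for (Fence f : fences) {
            if (!f.condition().test(current) && f.condition().test(next)) {
                f.onTrigger().accept(next);
            }
        }
        current = next;
    }

    public static void main(String[] args) {
        fences.add(new Fence(
                c -> c.headphonesIn() && c.activity().equals("WALKING"),
                c -> System.out.println("fence fired: suggest a podcast")));

        update(new Context("WALKING", false)); // condition not yet met, nothing fires
        update(new Context("WALKING", true));  // prints "fence fired: suggest a podcast"

        System.out.println(snapshot().activity()); // prints "WALKING"
    }
}
```

An app would typically use fences for background reactions and snapshots for decisions made at the moment of user interaction.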
Needless to say, Google is relying on some pretty advanced algorithms to determine user activity with new levels of accuracy.