As part of iOS 9, Apple has announced an application programming interface (API) that lets developers deep-link to content within their apps, making that content searchable through a proactive assistant. The assistant leverages Siri and data on the user's phone to surface time- and location-sensitive content, and it also makes suggestions based on content and app usage.
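As a rough illustration of how an app might make its content searchable under this model, here is a minimal Swift sketch using the Core Spotlight framework Apple introduced alongside iOS 9. The identifiers, titles, and domain names below are hypothetical placeholders, not part of any real app:

```swift
import CoreSpotlight
import MobileCoreServices

// Describe a piece of in-app content (a hypothetical photo album)
// so the system search index can surface it.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeImage as String)
attributes.title = "Wyoming Trip"                      // hypothetical album title
attributes.contentDescription = "Photos from last August"

// The unique identifier is handed back to the app when the user taps
// the search result, allowing a deep link to the matching screen.
let item = CSSearchableItem(uniqueIdentifier: "album-42",   // hypothetical ID
                            domainIdentifier: "albums",     // hypothetical grouping
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

On a result tap, the app receives the `uniqueIdentifier` in its restoration handler and navigates to the corresponding content, which is the deep-linking half of the feature.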
These intelligence features make a big difference in iOS 9, Craig Federighi, Apple's senior vice president of software engineering, said during the WWDC keynote. The company plans to bring the proactive technology across its network of apps and services.
At the Worldwide Developers Conference on Monday, Apple unveiled several key changes to its voice assistant, positioning it against Google Now and Microsoft Cortana. Among them: Siri can now detect appointments and other events mentioned in emails or text messages and remind the user of them without being told explicitly.
By holding down the iPhone's Home button, users can issue voice commands to set alarms, launch apps, search the web, and more. The voice assistant now plays a more powerful role in search, including searching within video apps.
Siri lets users find photos taken on a specific day, in a specific place, or during a specific event by saying, for example, "Show me photos from Wyoming last August." When a call comes in from a number the user doesn't recognize, Siri will search the user's email for clues to the caller's identity and display the likely name on screen. Siri can also search the user's iTunes library and play a requested song.
Microsoft, for its part, plans to make Cortana available to iOS users later this year. Like Siri, Cortana reminds users of meetings, flights, and other activities already on their schedules.