Commentary

Will AI Do For Search What It's Doing For Robotics?

Geoffrey Hinton, known as the godfather of deep learning (the technology behind Google DeepMind's AlphaGo, which beat a master Go player), said the most powerful machines are about a million times smarter than the human brain and are becoming more sophisticated each year.

Hinton, who splits his time between Google and the University of Toronto, earned a PhD in AI from the University of Edinburgh in 1978.

While Google uses AI in its search engine to learn how to return smarter query results, Hinton predicts it will still take more than five years before machines possess human-level abilities. The expected path: start with search engines, then move the technology into android-like robots.

Those robots rely on many of the same technologies required for smarter search queries, such as artificial intelligence and natural language processing.

On Wednesday at Google's GCP Next 2016 cloud conference, the company demonstrated its Cloud Vision API, which recognizes objects, labeling and categorizing them. It also detects faces and languages, identifies landmarks, and returns their latitude and longitude. The API uses the same vision learning technology found in Google Photos.
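To make the capabilities above concrete, here is a minimal sketch of the request body a client sends to the Cloud Vision API's public `v1/images:annotate` REST endpoint. The feature type names (`LABEL_DETECTION`, `FACE_DETECTION`, `LANDMARK_DETECTION`) come from the published API reference; the image bytes below are a placeholder, and no network call or credentials are involved in this sketch.

```python
import base64
import json

# Public REST endpoint for batch image annotation (API key or OAuth
# credentials would be attached when actually sending the request).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_annotate_request(image_bytes: bytes) -> dict:
    """Build a request body asking for labels, faces, and landmarks."""
    return {
        "requests": [
            {
                # Images are sent inline as base64-encoded content.
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": 10},    # objects/categories
                    {"type": "FACE_DETECTION", "maxResults": 5},      # faces
                    {"type": "LANDMARK_DETECTION", "maxResults": 5},  # landmarks; the
                    # response carries each landmark's latitude/longitude in a latLng field
                ],
            }
        ]
    }

# Placeholder bytes stand in for a real photo.
body = build_annotate_request(b"placeholder-image-bytes")
print(json.dumps(body, indent=2))
```

POSTing this body to the endpoint returns one annotation response per request entry, with label, face, and landmark results in separate fields.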

That convergence, improving AI for search as engineers improve AI for robotics, could come sooner rather than later. David Hanson, founder of Hanson Robotics, explained at the 2009 TED conference in Long Beach how robots will learn to have empathy, not just sentience. The key word is "learn": the robots gain knowledge by interacting with humans, much as a search engine gains knowledge from queries.

Hanson, who worked as an Imagineer at Disney, built a series of small robots intended to facilitate research, serve as companions, and entertain, among other functions. At a 2012 TED conference in Taipei, where he urged an open-source movement to rally developers, Hanson said robots would achieve "human-level brilliance" between 2027 and 2032.

At South By Southwest, Hanson unveiled Sophia, a lifelike robot, first activated in April 2015, that is capable of 62 facial expressions. The robot, both creepy and remarkable, uses a combination of Alphabet's Google Chrome voice-recognition technology and other software that helps Sophia process speech, hold a conversation, remember interactions, and become smarter over time.

Hanson also works with Intel and IBM to explore using some of their respective technologies.

It turns out that human-looking robots may not be best for society. A study published in the International Journal of Social Robotics found that when robots look like humans, the similarity blurs boundaries and undermines the sense of human uniqueness.

Androids raised the highest concerns about potential damage to humans, followed by humanoids and then mechanical robots, per the study. How will this technology affect the future of search?
