Tapping into All the Smartphone Sensors

For the longest time, only a portion of the power of smartphones has been tapped for commerce.

There are various reasons for this.

Part of it is that the processing power and technical capabilities of smartphones have been outpacing consumer adoption.

Another is that many consumers don’t realize how much their smartphones actually can do.

For example, in-store product scanning to check for the best price still is not mainstream, even though a quick scan could routinely save consumers money on every shopping trip.

Yet another reason for not utilizing the inherent capabilities of smartphones has been the sheer complexity app developers face in tapping into the many sensors being built into them.

Sensors in various smartphones can include GPS, accelerometer, gyroscope, ambient light sensor, magnetometer, barometer, fingerprint reader, camera and microphone.

“I saw more and more sensors being added to phones, but developers weren’t using a lot of them,” said Eli Portnoy, CEO and co-founder of Sense360, which officially takes the wraps off its new mobile sensor tracking platform today.

Portnoy is something of a mobile location pioneer: as the CEO of ThinkNear, he identified that much of the location information used to deliver targeted advertising was way off the mark, which I wrote about here at the time (54% of Location-Targeted Mobile Ads Off By More Than Half a Mile).

Since leaving ThinkNear last year, Portnoy has been focused on what he saw as yet another mobile industry shortcoming: the inability to easily leverage the increasing number of sensors being added to smartphones.

Obstacles include issues around privacy, battery optimization, turning raw data into meaning and the different technical requirements of tapping into each sensor, according to Portnoy.

“It’s really, really hard,” said Portnoy, who has been spending the months since leaving ThinkNear figuring out how to do it.

The end product is a platform and set of tools that let app developers more easily use phone sensors as they create and modify apps. The sensors keep working even when the app is closed, a pretty nifty aside.

This strikes me as an obvious asset for retailers, though Portnoy says at the moment the company is only targeting app developers.

Hypothetically, a retailer could tell if a shopper is driving or walking, whether the smartphone is in a pocket or purse, what floor the shopper is on in a large department store and which way they are heading.
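Distinctions like driving versus walking are typically inferred from accelerometer patterns: walking produces a bouncy, high-variance signal, while a phone resting in a moving car reads comparatively smooth. As a rough illustration only, and not Sense360's actual method, here is a minimal Python sketch that classifies motion from the variance of accelerometer magnitude (the thresholds are invented for the example, not calibrated values):

```python
import math

def classify_motion(samples, still_var=0.05, walk_var=2.0):
    """Toy motion classifier from raw accelerometer readings.

    samples: list of (x, y, z) accelerations in m/s^2.
    The variance thresholds are illustrative placeholders.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < still_var:
        return "still"            # phone at rest (or gliding very smoothly)
    if var < walk_var:
        return "driving"          # moderate vibration, no step rhythm
    return "walking"              # strong periodic spikes from footsteps
```

A production system would layer on GPS speed, gyroscope data and step-frequency analysis, but the basic idea of thresholding signal variance is the same.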

Tied into beacon-triggered information, phone sensor data aggregation could add an entirely new dimension to the mobile shopping experience.

Some developers already have been working with the Sense360 platform.

For example, Bankrate subsidiary Wallaby Financial has been using the platform to notify users as soon as they drive into a gas station, telling them which of their credit cards would earn the highest reward that day at that particular station, according to Matthew Gordman, CEO of Wallaby Financial.
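Once the sensors have established that the user is at a gas station, the card-recommendation step itself is straightforward. A minimal sketch of that logic in Python, where the card names and reward rates are entirely made up and do not come from Wallaby:

```python
def best_card(cards, category):
    """Return the card with the highest reward rate for a purchase category.

    cards: list of dicts like {"name": ..., "rewards": {category: rate}}.
    Falls back to each card's "default" rate when the category is absent.
    """
    def rate(card):
        return card["rewards"].get(category, card["rewards"].get("default", 0.0))
    return max(cards, key=rate)

# Hypothetical wallet for illustration only.
wallet = [
    {"name": "Card A", "rewards": {"gas": 0.03, "default": 0.01}},
    {"name": "Card B", "rewards": {"grocery": 0.06, "default": 0.02}},
]
```

Here `best_card(wallet, "gas")` would pick Card A; the hard part in practice is not this lookup but reliably detecting the gas-station visit in the first place.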

Portnoy himself also uses the platform, though for more of a personal reason.

Since his work hours are long and hardly predictable, an automatic text message is sent home when he leaves work, letting his wife know he's on the way. The sensors in his phone establish his location relative to the office, along with his speed and direction toward home, showing that he is on the way.
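In essence, that trigger is a geofence-exit check: fire once the phone's reported position moves outside a radius around the office. A hypothetical sketch, assuming haversine great-circle distance and a made-up geofence radius (none of this is Sense360's implementation):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def left_office(position, office, radius_m=150.0):
    """True once the phone's position falls outside the office geofence."""
    lat, lon = position
    olat, olon = office
    return haversine_m(lat, lon, olat, olon) > radius_m
```

A real implementation would also debounce GPS jitter and confirm the heading points toward home before sending the text, but the geofence test is the core of it.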

Various sensors have long been included in smartphones, though the challenge has been how to efficiently leverage all of them.

Time will tell whether this new platform can turn phone sensor data into something that makes sense for the market.



