Commentary

Kaleidoscope Eyes

LOOKING UP: MOBILE FROM 2010-2015

Sept. 23, 2015 - It's been an interesting five years, but let's put on our AR glasses and take a look back at the not-so-distant past. The evolution of capabilities, functionality, and user behavior in mobile has been nothing short of revolutionary. It's hard to believe that just a few years ago, mobile wasn't the primary computing platform, broadband penetration was below 90 percent, and AR glasses were barely a concept, far from the hot-ticket item they are today.

2010
The changes in 2010 really set the foundation for the years to come. There were three major events: other smartphones caught up to the iPhone in capability, utility, and popularity; true 4G networks finally began appearing; and the FCC opened up the cellular provider market.

Every golden age must end - even the iPhone's. The end of exclusivity with AT&T and the release of a 4G model on Verizon's network catapulted sales - but Apple met serious competition. Android devices went from rare to a dime a dozen, greatly increasing the appeal for third-party developers. Palm expanded the carriers for its Pre and released another handful of devices running WebOS. And Microsoft finally released Windows Phone 7, closing the gap in the smartphone market overnight. These changes created a truly competitive landscape, where innovation took the forefront.

While the level playing field for handsets was important, the most crucial change in 2010 was the release of 4G networks. Verizon rolled out its technology, called Long Term Evolution (LTE), to several major population areas. This release put the United States in the lead for wireless network speed, providing data rates that exceeded DSL and rivaled most home broadband speeds. Networks using this technology were able to perform data-heavy tasks, such as streaming 1080p HD video. Up to this point, the bottleneck for the mobile phone evolution had been the network. The path to personal computing dominance was now paved.

An often-overlooked but critical precursor to today's mobile capabilities was an FCC ruling in 2010. The FCC, investigating carrier exclusivity on handsets as well as Early Termination Fees (ETFs), ruled that carrier exclusivity was an anti-consumer practice: it locked users into a carrier based on the hardware they owned or chose to purchase, and it was unfairly punitive to smaller cellular providers. This opened up the market for cutting-edge phones to all mobile subscribers. However, the FCC decided to support the practice of ETFs, the considerable fee charged when a consumer ends a contract early. The FCC determined that these fees were a necessary evil, as they allow providers to sell handsets at subsidized prices, which accelerates handset turnover and speeds the adoption of new technology.

2011
As is often the case, the true impact of the changes of 2010 wasn't really felt until the aftershocks hit. Cable providers began regional rollouts of 4G networks, non-phone devices hopped on the 4G bandwagon, and the nature of information-seeking began to shift fundamentally.

Cable and broadband providers were in a rut. Consumers canceled plans in droves, opting for broadband-to-TV solutions. Broadband subscriptions began to decline as users jumped to 4G networks for their home data connections. A number of cable providers began to roll out their own 4G networks in regions not yet covered by cellular providers. Because of the FCC ruling in 2010, these cable companies became competitive overnight, with unhindered access to best-of-breed devices. These efforts expanded 4G offerings much faster than the wireless telecoms would have managed alone.

As 4G speeds quickly became a baseline for all devices, from widget-TVs to tablet PCs, connection ubiquity was on the horizon.

This ubiquity ran in parallel with a larger trend: the way consumers sought information, and how the information itself was categorized, was redefined. Location-based services had become standard by this time, and online information was being restructured to add a layer of location relevance. Across devices, everything from display ads to searches became location-aware. Users began searching for information using photos of products or landmarks. Touch was taking over as a standard method of interaction. After two years of touch PCs, from Windows 7 devices to Apple tablets, consumers embraced a more visceral method of seeking information. Keyboards still had their uses from time to time, but mice were starting to go the way of the typewriter.

2012
The year 2012 marked the death of the "phone" in "mobile phone." This was perhaps appropriate, as it was also the year the mobile device first became viable as the primary computing platform, thanks to a new standard specification.

Google killed the "phone." It actually laid the trap well before 2010. The first step was Google's involvement in the wireless spectrum auction for the signal space made available by the end of analog TV. By participating in that auction, Google forced the spectrum to be open-access regardless of the type of service. So when cell providers rolled out their 4G networks using this same spectrum, those networks were fair game for VoIP services. The second step was Google's adoption of number portability for its Google Voice offering in late 2009. This removed certain numbers from the carriers' domain, and when Google finally linked Google Voice with the voice services in its Google chat application, the death of the phone was ensured. It took a few years, but in 2012, "phone" finally became a legacy term.

A new standard made mobile devices fit for much more than talking anyway. The Digital Living Network Alliance (DLNA) standard was originally designed to let content move between multiple devices. Download video on a mobile phone and watch it on a TV? DLNA made that easy. The standard was baked into nearly every consumer electronics device by the start of 2012. So when the alliance saw the multitude of input devices and methods consumers were using to seek information, it extended the standard to cover input as well as output.

Mobile devices were now about four times as fast as in 2009, and with the new DLNA standards, a long-standing trend in computer purchasing habits finally hit mobile: the preference for ever-smaller devices. Around 2006, consumers by and large stopped buying faster computers and started buying smaller ones. Between the increase in computing power and the investment in the cloud, laptops became as capable as desktops. Then netbooks came on the scene, providing most of the capabilities of laptops in a much more portable frame. Finally, in 2012, consumers extended the trend to all mobile devices. Thanks to touch monitors and keyboards that connected to the mobile device using the new DLNA standards, the transition was seamless.

2013
Another aftershock year, 2013 was all about the market adapting to a new computing experience. Understanding a consumer who was not only always connected but seamlessly connected took some effort for businesses.

An example should illustrate the shift. In 2012, a user walking down the street who saw another person wearing a cool shirt could take a photo of the outfit with an image-recognition program, find the distributor, and buy the item from the mobile device or find the nearest retailer. In 2013, the user still had the options of in-app purchase and finding nearby stores. But that user now had a third option: upon returning home, the user could easily connect the mobile device to the TV and its associated infrared-camera control, and create an image of themselves trying on the item in a sort of "magic mirror" experience. The user could then browse the retailer's catalog from a home-based touchscreen that offered a shopping experience much more rewarding than a small-form mobile device. And when the user saw other items they wanted to try on virtually, those could be queued up for the TV experience at any moment.

2014
One change stands out above all the others in 2014: the introduction of Augmented Reality (AR) glasses. These glasses enabled real-life experiences previously limited to movies. Almost instantly, these devices seemed as necessary within the culture as wristwatches were 50 years before.

The first conceptual promise of AR glasses came from a project at MIT known as "SixthSense." The system used a camera hooked up to a phone that output augmented reality to a projector. The utility was limited to a tech demo at first. But improvements in display technology resulted in high-quality transparent screens capable of being built into glasses through which people could see the illusions of AR. These made much more sense than a projector, as they solved both energy concerns and some of the issues a projector would have raised in settings like a museum. Attaching small cameras to the sides allowed for optic-based interaction with depth perception. Since the DLNA standard had solved the issue of incorporating input and output with mobile devices, it was relatively easy to get these glasses working with mobile phones. As users were already used to gathering information through touch and snapping photos, the gesture-based controls came naturally.

The response has been tremendous, and once again it's simply a matter of time until we see the technology reach its as yet unimagined potential.
