Google has been somewhat less-than-forthcoming about its Google Glass project, aside from some high-flying demonstrations. But as the augmented-reality glasses evolve from drawing-board concept to reality, the search-engine giant is revealing a bit more—at least for developers.
Earlier in February, Google hosted a pair of “Glass Foundry” summits in New York City and San Francisco, inviting small groups of developers to explore how to build applications for Google Glass. A new posting on Google Developers’ Google Plus page offers a bit more detail on what went on at the events. (Hat tip to Ars Technica for posting the link to Google’s media.)
“In San Francisco and New York, the selected Glass Explorers were able to spend two days with Glass and the API we’ve developed,” reads the Developers’ posting. “They formed teams and built over 80 new ways to use Glass.” Eight teams won free Glass Explorer Edition sets, an early version of the Google Glass hardware; at last summer’s Google I/O conference, those devices carried a sticker price of $1,500.
While the actual APIs remain confidential for the time being, that could quickly change: Google is also planning a Google Glass event for next month’s South by Southwest (SXSW) show in Austin, Texas. “We’ll look at Glass in people’s lives with emphasis on how to use the cloud API to build new experiences and bring people closer together,” reads the official summary of that session. Sounds like the company could have something big in the works.
At least in theory, augmented-reality devices could open up a whole new world to developers, while presenting new challenges. Building an app for smartphones or tablets is one thing, but delivering data in a concise and readable way to a tiny screen embedded in the right lens of a pair of eyeglasses is the sort of conundrum that keeps even the most brilliant coders awake late into the night. That being said, if devices such as Google Glass hit the market and prove a sizable hit, the profits from augmented-reality apps could turn out to be enormous: who wouldn’t want a navigation app that overlaid directions across the user’s field of view?