Beyond Google Glass: The Evolution of Augmented Reality

The wearable revolution is heading beyond Google Glass, fitness tracking and health monitoring. The future is wearables that conjure up a digital layer in real space to “augment” reality.

SANTA CLARA, Calif. — Reality isn’t what it used to be. With increasingly powerful technologies, the human universe is being reimagined way beyond Google Glass’ photo-tapping and info cards floating in space above your eye. The future is fashionable eyewear, contact lenses or even bionic eyes with immersive 3D displays, conjuring up a digital layer to “augment” reality, enabling entire new classes of applications and user experiences.

Like most technologies that eventually reach a mass market, augmented reality, or AR, has been gestating in university labs, as well as small companies focused on gaming and vertical applications, for nearly half a century. Emerging products like Google Glass and the Oculus Rift 3D virtual reality headset for immersive gaming are drawing attention to what could now be termed the “wearable revolution,” but they barely scratch the surface of what’s to come.

The Sword of Damocles head-mounted display. “The ultimate display would, of course, be a room within which the computer can control the existence of matter,” Sutherland wrote in his 1965 essay. (Credit: Ivan Sutherland, “The Ultimate Display”)

The wearable revolution can be traced back to Ivan Sutherland, a ground-breaking computer scientist at the University of Utah who in 1965 first described a head-mounted display with half-silvered mirrors that let the wearer see a virtual world superimposed on the real world. In 1968 he was able to demonstrate the concept, which was dubbed “The Sword of Damocles.”

Steven Feiner of Columbia University and Steve Mann of the University of Toronto at the Augmented World Expo in Santa Clara, Calif., June 4, 2013. Both are now involved in the augmented reality startup Meta. (Credit: Dan Farber)

His work was followed up and advanced decades later by researchers including the University of Toronto’s Steve Mann and Columbia University’s Steven Feiner. In the second decade of the 21st century, the technology is finally catching up with their concepts.

The necessary apparatus of cameras, computers, sensors and connectivity is coming down in cost and size and increasing in speed, accuracy and resolution to the point that wearable computers will be viewed as a cool accessory, mediating our interactions with the analog and digital worlds.

Augmented Reality past and future

“You need to have technology that is sufficiently comfortable and usable, and a set of potential adopters who would be comfortable wearing the technology,” said Feiner at the gathering of the fledgling AR industry at the Augmented Reality Expo here Wednesday. “It would be like moving from big headphones to earbuds. When they are very small and comfortable, you don’t feel weird, but cool.” He added that glasses with a “sexy lump of bump” with electronics and display could also be cool to the early adopters, especially the younger generation that has grown up digital. However, he didn’t have any prediction for when wearable computers would reach a mass market.

In the last decade, AR has been primarily focused on immersive gaming that teleports users to another world and on vertical applications, such as tethered, interactive 3D training simulations.

Augmented reality can help in training, such as learning how to weld aided by a 3D environment that tracks user movements precisely. Seabery Augmented Training’s Soldamatic application, pictured here, could be used for medical training, bomb disposal and other industry verticals. (Credit: Dan Farber)

But now augmented reality is about to break out into free space. “AR will be the interface for the Internet of things,” said Greg Kipper, author of “Augmented Reality: An Emerging Technologies Guide to AR.”

“It is a transition time, like from the command line to the graphical user interface,” he said. “Imagine trying to do Photoshop in a command-line interface. Augmented reality will bring to the world things beyond the graphical user interface. With sensors, computational power, storage and bandwidth, we’ll see the world in a new way and make it very personal.”

Will Wright, the man behind The Sims, speaking at the Augmented Reality Expo on June 4, 2013. (Credit: Dan Farber)

Will Wright, creator of the popular The Sims family games, likened AR to having super-sensory abilities, like flipping a switch to see what is underground, beneath your feet. “It’s not about bookmarks or restaurant reviews…it’s something that maps to my intuition.” He hoped that instead of augmenting reality, the technology could “decimate” reality, filtering out even more information than the brain already does to engage reality with less cacophony.

Steve Mann, who is rarely seen without one of his wearable computing rigs and is considered the father of AR, views the wearable revolution as a benefit to society. Quality of life can be improved with overlays of information, adding and subtracting it to facilitate improved “eyesight,” he said. “The first purpose is to help people see better,” he said during his keynote at the expo.

Just as the smartphone is compressing the functions of antecedent computing devices into a single product, wearable computing will eventually make the handheld smartphone irrelevant.

“The value proposition of digital eyewear is having all devices in one, with a camera for each eye representing full body 3D, and the ability to interact with an infinite screen. We are architecting the future of interaction,” said Meron Gribetz of Meta, a Y Combinator startup working on a new operating system and hardware interface for augmented reality computing.

“There is no other future of computing other than this technology, which can display information from the real world and control objects with your fingers at low latency and high dexterity. It’s the keyboard and mouse of the future,” he claimed.

Meta can project a 3D image on a wall, and users interact with it using their hands. (Credit: Meta)

Atheer, a Mountain View, Calif.-based AR startup, is developing a platform that will work with existing mobile operating systems, such as Google’s Android. “We are the first mobile 3D platform delivering the human interface. We are taking the touch experience on smart devices, getting the Internet out of these monitors and putting it everywhere in the physical world around you,” said CEO Sulieman Itani. “In 3D, you can paint in the physical world. For example, you could leave a note to a friend in the air at a restaurant, and when the friend walks into the restaurant, only they can see it.”
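What Itani describes is, in essence, a message anchored to a point in physical space and scoped to a single viewer. As a rough illustration only (the class, fields and distance threshold below are hypothetical and do not reflect Atheer's actual platform), such a note might be modeled like this:

from dataclasses import dataclass

# Hypothetical sketch of a location-anchored AR note: a message pinned to a
# point in the physical world, rendered only for its intended recipient.
@dataclass
class ARNote:
    author: str
    recipient: str        # only this user's eyewear renders the note
    text: str
    latitude: float       # anchor position in the physical world
    longitude: float
    altitude_m: float     # height of the note, e.g. eye level at the table

    def visible_to(self, viewer: str, viewer_lat: float, viewer_lon: float,
                   radius_deg: float = 0.0005) -> bool:
        """Show the note only to the intended recipient standing nearby."""
        nearby = (abs(viewer_lat - self.latitude) < radius_deg and
                  abs(viewer_lon - self.longitude) < radius_deg)
        return viewer == self.recipient and nearby

# Example: a note left in the air at a restaurant for one friend.
note = ARNote("alice", "bob", "Grabbed us the corner table!", 37.3541, -121.9552, 1.5)
print(note.visible_to("bob", 37.3541, -121.9553))    # True: Bob sees it on arrival
print(note.visible_to("carol", 37.3541, -121.9553))  # False: hidden from everyone else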

The company plans to seed its technology to developers this year and have its technology embedded in stylish, lightweight glasses with cameras next year.

The transition to touch and gesture interfaces doesn’t mean that the old modes of human-computer interaction go away. Just as TV didn’t replace radio, augmented reality won’t obliterate previous interfaces. The keyboard might still be the best interface for writing a book, and waving your hands in front of your face all day isn’t a good interface either.

“Holding your hands out in front of yourself as the primary interface is the ‘gorilla arm’ effect,” said Noah Zerkin, who is developing a full-body inertial motion-capture system for head-mounted displays. “You get tired. We need to have alternative interfaces. If not thought-based, it needs to be subtle gestures that don’t require you to wave your hands around in front of your face.”

3Gear Systems is working on technology that allows 3D cameras mounted above a keyboard, like a lamp, to detect smaller gestures just above the keyboard, such as pinching to rotate an object on a screen, and can use input from all 10 fingers with millimeter-level accuracy.
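A minimal sketch of the kind of logic involved, assuming a hypothetical tracker that reports 3D fingertip positions in millimeters (the threshold and function names below are illustrative, not 3Gear's actual API):

import math

# Illustrative pinch detection from tracked fingertip coordinates. A real system
# would run this per frame on data from depth cameras mounted above the keyboard.
PINCH_THRESHOLD_MM = 15.0  # thumb and index tips closer than this count as a pinch

def distance_mm(a, b):
    """Euclidean distance between two (x, y, z) fingertip positions in mm."""
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

def is_pinching(thumb_tip, index_tip):
    """Return True when the thumb and index fingertips are pinched together."""
    return distance_mm(thumb_tip, index_tip) < PINCH_THRESHOLD_MM

# Example frame from the (hypothetical) tracker: thumb and index almost touching.
thumb = (102.0, 48.5, 310.0)
index = (110.5, 52.0, 308.0)
if is_pinching(thumb, index):
    print("Pinch detected: start rotating the on-screen object")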

Some companies are taking less radical approaches, focusing on inserting a layer of digital information into scenes via smartphones. Par Works’ image recognition technology, for example, makes it possible to overlay digital imagery on real-world data, such as photos and videos, with precision. A person looking for an apartment takes a picture of a building with a smartphone and the app overlays information on the image, or a shopper sees coupons or other information for various products on a shelf in a drug store.
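A hedged sketch of that overlay step, with a made-up building database and a stand-in for the recognition itself (none of these names reflect Par Works' actual API):

# Once an image recognizer matches a photographed building to a known record,
# the app attaches listing data to the image for display. All identifiers here
# are hypothetical.
BUILDING_DB = {
    "bldg_0412": {"name": "Maple Court", "vacancies": 3, "rent_from": "$1,850/mo"},
}

def recognize_building(photo_bytes: bytes) -> str:
    """Stand-in for the recognition step; a real system would match visual
    features in the photo against a database of known buildings."""
    return "bldg_0412"

def overlay_for(photo_bytes: bytes) -> dict:
    """Return the digital layer to draw on top of the photo."""
    info = BUILDING_DB[recognize_building(photo_bytes)]
    return {"label": info["name"],
            "caption": f'{info["vacancies"]} units from {info["rent_from"]}'}

print(overlay_for(b"...jpeg data..."))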

Brands are adopting AR technology to increase the performance of ads and sales. Several companies provide ways to turn a print ad into an interactive experience just by pointing the camera at the paper or at an object with a marker. Blippar, for instance, recognizes images when a phone camera is pointed at ads or objects bearing its marker and inserts virtual layers of content.

The future of augmented reality

And where is all this heading over the next few years? It’s beginning to look like a real business, just as mobile did nearly a decade ago. Mobile analyst Tomi Ahonen expects AR to be adopted by a billion users by 2020. Intel is betting that AR will be big. The chip maker is investing $100 million over the next 2 to 3 years to fund companies developing “perceptual computing” software and apps, focusing on next-generation, natural user interfaces such as touch, gesture, voice, emotion sensing, biometrics, and image recognition.

Apple isn’t in the AR game yet, but the company has been awarded a U.S. patent, “Synchronized, interactive augmented reality displays for multifunction devices,” for overlaying video on live video feeds.

AR is looking like it might be the eighth mass market to evolve, following print, recordings, cinema, radio, TV, the Internet and mobile, according to mobile industry analyst Tomi Ahonen. (Credit: Tomi Ahonen)

Eyewear will evolve over the next year, with comfortable, stylish glasses carrying powerful embedded technology. They will range from Google Glass-style glance-at displays that also replace the phone to stereoscopic 3D-viewing wearables for everyday use.

“You’ll get 20/20, perfectly augmented vision by 2020, with movie-quality special effects blended seamlessly into the world around you,” said Dave Lorenzini, founder of AugmentedRealityCompany.com and former director at Keyhole.com, now known as Google Earth. “The effects will look so real, you’ll have to lift your display to see what’s really there. There’s more of the world than meets the eye, and that’s what’s coming.”

He cautioned that the growth of the AR industry could be slowed by a lack of standards to connect disparate players and their formats for bringing a 3D digital layer to life. “The AR industry has to get together to power the hallucination of what’s to come,” Lorenzini said. He added that a key turning point will be the availability of WYSIWYG (What You See Is What You Get) real-world markup tools needed to bring this digital layer to life.

When the AR industry does take off, Lorenzini envisions a trillion dollar market for animated content, services and special effects layered into the real world. “Imagine people tagging friends with visual effects like a 3D halo and wings, or paying for a face recognition service to scan and add a floating name tag over the head of everyone in a room,” he said. “AR will grow from specific vertical uses to mass market appeal, driven by young, early adopters.

“Anyone reviewing devices like Google Glass needs to take it to their kids’ school before they pass judgment,” Lorenzini added. “This is not a device from our time, it’s from theirs. They love it, use it effortlessly, and are totally unfazed by ad targeting or privacy concerns. It will be a natural part of who they are, how they learn, connect and play.”

Eventually, wearable technology will become more integrated with the human body. With advances in miniaturization and nanotechnology, eyewear will be replaced with contact lenses or even bionic eyes that record everything, make phone calls and allow you to use parts of your body, or even your thoughts, to navigate the world.

“Contact lenses are difficult now but the bionic eye will become commonplace and AR will just be a feature,” Kipper said. “Some may choose to have eyes in back of their heads, and some won’t. Some will want to be cyborgs. We will always use tools as advanced as they can be to help ourselves.”

Brian Mullins, CEO of Daqri, a developer of custom augmented reality solutions, went even further in melding humans and technology. “Thinking is the future of AR,” he said. Mullins talked about measuring “thought intensity” with EEG machines and focusing the mind to manipulate objects during a panel discussion at the Augmented Reality Expo.

Of course, the technical challenges are accompanied by issues of social etiquette and privacy. Smartphones are now a well-accepted part of daily life in most countries, but issues around data ownership and access to the data abound. The subtlety and potentially always-on capacity of wearable technologies will create more privacy concerns and challenges to acceptance.

Feiner acknowledged that it’s “scary” in terms of the information available, especially when billions of people with cameras and microphones can capture anything in public. “There are no laws against it,” he noted.

He gave Google some compliments for not overloading Glass with features. “It’s not suffering from doing too much too soon,” he said. Whether Google Glass is the tip of the spear for the mass adoption of far more powerful AR is uncertain, but it is doing a good job of surfacing the issues around the introduction of a disruptive, new way of computing.

Nicola Liberati, a Ph.D. student in philosophy at the University of Pisa studying the intersection of humans and technology, suggested another line of thinking about AR in his presentation at the expo. “We should not focus our attention only on what we can do with such technology, but also on what we become by using it.”

So, when you are strolling down the street wearing the latest digital eyewear from Google, Apple or some as yet unformed or now early-stage company, with your continuous partial attention on the 3D holographic screen feeding you all kinds of personalized information about the environment around you, zeroing in on the people and places in your field of view or piped in remotely from around the real and virtual worlds, and spaces in between, think about what we have become.

It all depends on your perspective.

Written by: Dan Farber, CNET (via Presence)
Posted by: Situated Research
