Google may have debuted its first smart glasses in 2012, but the technology is only now starting to gain momentum. Microsoft, Apple, and Facebook are all developing their own products. But what are smart glasses, and will they be coming to an event near you?
Like our iPhone and Android devices, smart glasses incorporate computer-like features. But instead of looking at a screen, wearers look at images projected onto one or both sides of their eyeglasses — at least with the higher-end models. (Some smart glasses are designed just for interacting with AI voice assistants, while others focus on GoPro-like video recording functionality.)
For now, smart glasses are mostly being marketed as enterprise products, meaning that they are meant to assist with specialized work processes like surgery, construction, and package sorting in shipping centers. A surgeon, for example, might see an MRI scan projected over a patient in the operating room. As the technology improves and prices come down, it may be only a matter of time before it’s marketed as a mass consumer product.
With their mix of corporate backing and social networking, business events could be one of the earliest testing grounds for more widespread use. Would attendees welcome the novel technology or begrudge it as a distraction — or even push to ban it altogether out of privacy concerns?
Where the Technology Is Now
When Alex Kipman, the lead developer of Microsoft’s HoloLens smart glasses, spoke to tech reporter Dieter Bohn in 2019, he said that the technology was not yet good enough for mass consumer use. He also claimed, however, that smart glasses would usher in the “third era of computing” — with just as much impact as personal computers and smartphones.
Further, it’s clear that Microsoft is thinking about applying this technology at business events and meetings. Speaking to Microsoft’s ambitions for product improvements, Bohn’s 2019 article noted that “Kipman talks a lot about distinguishing between identically boring conference rooms that are in the same spot on different floors.” This kind of scanning ability would allow the technology to trigger customized augmented reality (AR) effects in response to the environment, such as a sponsor’s logo floating above their exhibitor booth. (To be clear, the technology was already able to create these kinds of responsive and interactive AR effects in 2019 and has even improved since then, but some of the finer points are still a work in progress.)
Since that 2019 interview with Kipman, the technology has only grown in popularity. Google’s 2012 prototype may have been widely panned as a failure, but fast-forward less than a decade and the technology may finally be delivering on its initial promise.
MARKET DEMAND FOR SMART GLASSES
Among a bevy of new competitors, Google released an updated model of its pioneering product in 2019. While the new-and-improved Google Glass device was initially sold exclusively through partner companies, 2020’s upgraded product is now available for direct purchase. In a similar sign of greater accessibility, Microsoft is now sending its latest HoloLens release directly to reviewers rather than insisting they come to Microsoft headquarters for time-limited product trials.
For those questioning whether smart glasses will have an “iPhone moment,” it’s worth noting that Apple is also in on the game. The internet is abuzz with rumors around the company’s much-anticipated Apple Glass product, which has yet to be released.
And major investors are watching. Magic Leap, a company that specializes in smart glasses, recently announced an additional $500 million in funding. Do current sales justify this level of investor confidence? Judging by earnings reports from market competitors, the answer is yes. Vuzix, one of the most established players in the business, has enjoyed 150 percent growth in sales over the past year.
Additionally, the International Data Corp. (IDC) forecasts that virtual and augmented reality (VR and AR) headsets will soon be one of the fastest growing tech markets. Further, IDC executive Tom Mainelli predicts that AR smart glasses will be available for consumer use within the next four years.
WHAT SMART GLASSES CAN ACTUALLY DO
It’s understandable that surgeons might want to keep body scans in their direct line of vision while operating, and that someone in a package sorting plant might want automated visual cues that signal which box to pick — research suggests this function makes workers 30 percent more efficient. But what would the rest of us want with mini overlays interrupting our literal view of the world?
One major unanswered question is whether AR smart glasses will really be something we eventually wear most of the time, or simply remain a device we put on for specialized work.
How it applies to events
Another possibility is that we will wear them exclusively to enhance special experiences like a keynote presentation at an event.
When Julius Solaris wrote a post on AR technology for EventMB in late 2019, his biggest objection to using the technology at a live event was that users “needed a phone to experience it” and “touch the screen to interact with objects”. AR smart glasses can solve this problem, since users can see the effects without shifting their field of view to a tiny phone screen.
Microsoft's HoloLens creates such an interactive experience that the company insists on calling its technology “mixed reality” instead of “augmented reality”. It tracks the wearer’s hand movements, allowing users to manipulate 3D holograms and other AR effects, and offers a whole new level of coordination between the real physical world and the virtual objects projected onto it. CNET’s editor at large likened the experience to having magical powers:
To control far-off things, I open my hand and cast a beam like I'm Vision. There's a feeling of having supernatural powers that flows through the HoloLens interface.
Scott Stein, editor at large, CNET
Finally, the product can also use artificial intelligence (AI) to correctly identify commonplace objects in the environment, such as couches and desks. It’s worth noting, however, that most other smart glasses on the market lack this level of functionality, and the HoloLens currently retails for $3,500.
How it applies to events
It’s conceivable that some trade show exhibitors and conference presenters might want to take advantage of this technology. Buyers could meaningfully interact with holographic product demos, and audience members could explore 3D charts and graphs in greater detail.
In time, it may be possible to integrate facial recognition with the technology. At B2B events, this could allow for a number of networking benefits. Users could, for example, voluntarily agree to have their contact information displayed to specific groups of fellow attendees when they pass each other in person. This kind of function could even be integrated with LinkedIn so a user’s full professional history would be easily accessible. No one would ever have to worry about awkwardly forgetting a contact’s name at the wrong moment!
Apple’s yet-to-be-released smart glasses will also have AR capabilities. But instead of capturing hand movements with sensors, Apple Glass is expected to come with patented rings that track finger movements, allowing for tactile interaction with 3D holograms. It is also expected to include a “color keying” function, meaning that it will be able to replace specific blocks of color with screen overlays. Think of it like changing a Zoom background, only for objects in the real world. Apple is even filing patents for technology that could replace prescription lenses (presumably by enhancing the wearer’s view with a digital display) and improve night vision with sensors that gauge depth.
The product is also much more affordable at a projected cost of $499, but unlike Microsoft’s self-contained HoloLens device, Apple Glass will have to be paired with a smartphone to function.
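For readers curious about the mechanics, “color keying” is essentially the chroma-key technique long used in film and TV: pixels close to a designated key color are swapped for pixels from another image. The sketch below is a minimal, illustrative Python version (not Apple’s actual implementation); the `chroma_key` function, its key color, and its tolerance threshold are all assumptions for demonstration.

```python
def chroma_key(frame, overlay, key=(0, 255, 0), tolerance=60):
    """Replace pixels close to the key color with the overlay pixel.

    frame and overlay are equal-sized 2D grids of (r, g, b) tuples.
    A pixel is "keyed out" when its Euclidean distance from the key
    color falls within the tolerance.
    """
    out = []
    for frame_row, overlay_row in zip(frame, overlay):
        row = []
        for px, ov in zip(frame_row, overlay_row):
            dist = sum((a - b) ** 2 for a, b in zip(px, key)) ** 0.5
            row.append(ov if dist <= tolerance else px)
        out.append(row)
    return out


# A 1x2 "frame": one pure-green pixel (keyed out) and one dark pixel (kept).
frame = [[(0, 255, 0), (10, 10, 10)]]
overlay = [[(1, 2, 3), (4, 5, 6)]]
print(chroma_key(frame, overlay))  # [[(1, 2, 3), (10, 10, 10)]]
```

A real device would run this kind of substitution on a GPU for every video frame, and would key on tracked surfaces rather than a single fixed color, but the underlying idea is the same.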
How it applies to events
While it may still be in development, technology that allows for this level of visual manipulation could have many uses in the world of events. In a keynote presentation, for example, the wall behind the stage could be keyed out, potentially allowing the audience to switch between different close-up shots of panel presenters at will with a simple swipe of their hands in the air.
It’s also conceivable that users could choose to zoom in on specific objects within their view. This could be an advantage for audience members who want a closer view of a keynote speaker, although it’s unclear whether presenters at a conference would want the pores of their skin subject to this kind of scrutiny.
Current Mid-Range Smart Glasses
While the HoloLens and the anticipated Apple Glass are true AR glasses, there are a number of mid-range products on the market that offer less advanced visual effects. All of them need to be paired with smartphones to work.
The Google Glass ($999) and Vuzix products (starting at $799) mentioned earlier fall into this category. They offer a feature called “heads-up displays”, or 2D projections that appear in the user’s view. Vuzix, for example, has a line of swim goggle attachments that display information like GPS coordinates, letting competitive open-water swimmers locate their position without having to look up.
How it applies to events
At conferences, this kind of technology could conceivably be used to display real-time subtitles or closed-captioning right in the wearer’s field of vision. It could also be used for more intuitive wayfinding. Rather than following a path on a map, attendees could follow arrows that light up right in front of them. Similarly, important alerts could be displayed right on their lenses.
Not all smart glasses on the market are focused on projecting digital displays in the wearer’s field of vision.
Amazon’s Echo Frames ($225) simply include a “smart” speaker and microphone, allowing wearers to give Alexa voice commands. Facebook’s “Ray-Ban Stories” smart glasses ($299) are designed primarily for recording a user’s surroundings for the purpose of sharing “stories” on social media. Notably, like Vuzix’s “Blade” line of smart glasses, both Amazon’s and Facebook’s glasses are designed to look as much like regular eyewear as possible.
How it applies to events
Facebook’s “Stories” glasses could potentially be useful for organic marketing purposes if attendees are encouraged to share short clips from the event. On the flipside, they could also present the risk of attendees illegally recording protected intellectual property — or of secretly filming fellow attendees without their consent.
There are a number of factors beyond price that could limit the potential for mass adoption of smart glasses. One is battery life — most current models can run for about three hours before needing to be recharged.
The biggest objections raised so far center on privacy concerns. Russia’s federal security service, the FSB, has even suggested that Facebook’s smart glasses are a spy device.
Ray-Ban Stories, like most other smart glasses on the market, are designed to flash a light whenever someone is recording, but it’s not clear how easy it will be to disable this safeguard. Notably, Apple has filed patents for special safety mechanisms designed to make it impossible to film when the light is turned off, as well as another patent for a removable camera.
But perhaps even more to the point, any AR applications that react to the surrounding environment will require some kind of scanning technology. While lasers and other types of sensors might be sufficient for some applications, many will presumably require a video camera to analyze the environment. Further, true AR glasses require so much processing power that they either have to connect to a smartphone over Wi-Fi or connect directly to the cloud (as is the case with Microsoft’s HoloLens). In other words, this technology involves connecting all sorts of information about the user’s immediate surroundings directly to the internet.
Just how secure this information will be remains to be seen. Attendees at a conference might be willing to accept the privacy risks that presumably come with this kind of technology for short periods — for example during a presentation or product demo. Whether they would want to wear them during private business negotiations may be another matter.
The final objection that some have raised is simply a matter of comfort. Will the glasses be too heavy to wear for long periods of time? Will they look awkward? When Mark Zuckerberg recently spoke to The Verge about the idea of a “metaverse” where people can meet together in virtual or augmented reality, he raised these very issues. Zuckerberg theorized that short-duration VR sessions may one day move towards all-day AR experiences, but that the technology would first have to get lighter and smaller:
[...] I think to get AR glasses that we wear around throughout the day, they have to be normal-looking glasses, right? So you’re basically cramming all of these materials to build what we would’ve thought of as a supercomputer 10 years ago into the frame of glasses that are about five millimeters thick [...]
Mark Zuckerberg, CEO, Facebook
Microsoft’s HoloLens gets around the comfort issue by wrapping around the wearer’s head, but the device is far from normal-looking. On the other end of the spectrum, there are Amazon’s semi-normal-looking Echo Frames, which have limited functionality but are so heavy that tech reviewer Jess Grey described her ears as “feeling about as oppressed as an Amazon warehouse worker”.
Once again, attendees at a conference or business meeting might be open to wearing a futuristic-looking device for short periods, but it may be a while yet before they come to an event sporting devices of their own.
With any new technology comes the potential for misuse. The possibility of recording videos without consent not only raises privacy concerns but also presents the risk of compromising intellectual property rights. Attendees could, for example, easily decide to record and distribute speaker presentations. Google Glass has already been banned by a number of theaters and venues for this reason.
Other kinds of misuse might be even harder to track. Building on AR’s most commonplace use on apps like Snapchat and Instagram, it’s conceivable that wearers could decide to apply face filters — or other kinds of image overlays — to everyone in their view. Alternatively, someone could be sitting in a meeting, looking straight at a presenter, and secretly watching a favorite TV show. Although these issues are likely to be a bigger concern for high school teachers than for event organizers, they are still worth considering.
The signs are clear: Smart glasses are finally proving to have real-world benefits. What’s less certain is how far their utility will go. Will they show up at an event near you? Alternatively, will attendees show up wearing them — or even be banned from entering with them on?
All of the above may come to pass in different scenarios at different times. Smartphones are now so ubiquitous that we actually have a term for the sense of feeling “naked” without one: nomophobia. A recent Australian survey found that 99 percent of smartphone users experience this fear. Will smart glasses one day have the same hold over us? Business events may be one of the first places where the technology’s potential for mass adoption is tested.