Numerous pilot and feasibility trials – along with apps – have demonstrated the utility of Google Glass in the emergency department, laboratory medicine, medical education, and even for day-to-day use for those with hearing impairments.

However, one company is thinking beyond a single screen. Atheer’s device packs a full Android system into dual-screen glasses. What’s more, its 3D gesture recognition allows the user to scroll and push buttons displayed in front of them, akin to Minority Report.

Although the device is not yet available to consumers, the company is targeting healthcare professionals, researchers, and other enterprise customers.

We interviewed Sina Fateh, Executive Vice President of Atheer Labs (@atheerlabs), along with Theo Goguely (@theogoguely), Senior Product Manager.

Augmented reality and glasses have been around for some time. Why are heads-up displays such a hot item now in healthcare?

Goguely: They’ve been stagnant for a long time because there has not been much R&D in the space, but since Google Glass and Oculus came out, smart glass manufacturers, micro-optics companies, all of the guys who’ve been in this space [for years] are now improving their products to meet the increased demand.

Fateh: I moved 20 years ago [to the United States, from France]. I came to build a heads-up display (HUD) and people said there were no applications. This [concept of a HUD] has been around for many years. We’re going to change that and make a HUD useful, especially in healthcare. It’s not just about having cool technology, but really, it’s a product trying to solve a problem.

Goguely: These have actually been used in military applications for quite a long time now. It was a tool they had to use… and had to wear for several hours a day, even if it made them sick. Now, we’re trying to make it easier, more comfortable, and more ergonomic to use, so people will want to use it on their own.

What makes your particular product unique, especially from Google Glass?

Goguely: Two things: first, it’s a completely immersive mobile experience. It runs Android on a Snapdragon processor, and provides you with a very large stereoscopic 3D display in your natural field of view. It runs on a battery, so it’s a bit bulkier than a smartphone, but you can take it with you, around the hospital, in the OR, in an engineering field, on an oil rig, or wherever your work is.

The second is the gesture interaction. A lot of people have Google Glass, Vuzix, and you see a pretty display through the glasses, but the interaction is still done in a 2D manner with a trackpad on the side of the glasses or on the controller box. There’s a disconnect between what is displayed and the interaction. We do all of our interactions via gestures, using a 3D camera that sees your hands and fingertips extremely precisely and accurately. So, if you see a virtual button in front of you, you can just reach out and touch it, and that is a very intuitive experience.

With Google Glass, people are now realizing it’s a bit limited in terms of what they can do with it. [Our device has] a larger field of view, binocular optics, gestural interaction, and as a result, allows for richer use cases and user experience, even if that means our developer kit is a bit bigger today.

And it runs Android?

Goguely: It’s all based on Android, even though our core technology is platform agnostic. It’s also immediately backwards-compatible with [all] of the Android applications already created and published out there. Partners and developers won’t have to learn a completely new environment; everything is built on standard Android APIs.

How do you see this working for doctors?

Fateh: I think there are so many applications and ways to improve the workflow. I think the primary value-add is ease of access to data when you want it, wherever you are. Today, the tablet is a very good tool, but in healthcare, contamination is a huge problem. Just imagine you’re in the operating room, and you’re a technician or surgeon: with our system, you can keep your gloves on and still access and manipulate all the data you need. One [particular] application is that you can keep the glasses on, wear them for hours, and access the 3D image you want with just a virtual click. You don’t have to go outside the OR. [With our device, users] don’t touch anything. They just “touch” a space that’s virtual, so [there’s] no risk of contamination.

I’d imagine a lot of academic centers and universities would want to work with this. What kinds of things are researchers interested in with your device?

Goguely: We’re in talks with a lot of universities and researchers in the 3D and UI/UX domain. There’s been all this 2D research over the past 20+ years, which has resulted in highly optimized button and UI layouts for 2D GUIs. Now we need the equivalent for 3D displays and interaction. All these guys in academia know this is the next research frontier and want to tackle it, but there’s never been widely accessible hardware [available to them].

Fateh: We also see surgeons and cancer [centers] approaching us because there are so many applications.

So who can buy this now?

Goguely: Primarily, [we’re targeting] enterprise [customers] right now, [and for] UI and UX research. For consumers, the experience is [not yet ready]. Consumers today expect an iPad-like experience from their devices, and that will take a while to develop and perfect. So we’re talking to medical specialists, field engineers, and people who are willing to invest time and effort to build the applications that best take advantage of the platform of the future, Atheer SmartGlasses.

Want to learn more? Watch an interview with Atheer Labs’ Sina Fateh, along with a demonstration of the device, at The Doctor Weighs In. For more info, visit Atheer Labs’ website.