To better understand the future of Google Glass, I recently interviewed Dave Lorenzini, a pioneer in the geospatial and augmented reality industries. In 2000, he helped start Keyhole.com, the first online global 3D visualization system. Today it is known as Google Earth and serves over a billion users worldwide. Now, as founder of The Glassware Foundry, he helps companies create software for smart glasses such as Google Glass and prepare for “this revolutionary new way to look at the world.”
One of the many industries Lorenzini thinks will see real benefits from head-mounted displays (HMDs) is medicine, especially in the surgical setting. I asked him how Google Glass could work in surgery.
“There’s a practical app for these guys—surgeons, neurosurgeons, heart surgeons—they don’t want to be constantly turning their head to look at another monitor during a procedure. So they really like the idea of being able to reference an X-ray or an MRI or reference image while they’re working on a patient.”
But he also recognizes that Google Glass has limitations that are barriers to adoption.
“Real time stats and simple image display are easy, but we’re still working out things like how to pan and zoom around large images while a doctor is working because the current resolution is just so limited.”
Always a visionary, Lorenzini imagines a not-too-distant world where we will have to take off our “glasses” to know what is real and what is not, calling it “perfect (augmented) vision by 20/20.” With new 360° perspective cameras that can deliver contextual information, working alongside other sensor-based technology, our interaction with the virtual will become more fluid and blend seamlessly with reality.
“What’s coming in the next couple of years makes the stuff we’re doing now look like a joke, because the ability to push movie quality images to somebody’s view for purposeful visualization is just amazing. This literally changes everything, and people, for the most part, have no concept of what’s going to happen.”
He is also working with Google’s Project Tango device, a cross between a Kinect-style depth sensor and a phone, which senses your environment in real time. He says, “This is the first phone that has a sense of what’s around it, where the walls, floor and objects are.” The resulting “space maps” allow virtual objects to be placed in precise locations and react to real-world surroundings, letting developers create amazing games set in the world around you.
He’s suggesting the time is fast approaching when our virtual and real worlds will merge in ways we have never seen before.
“Virtual gaming will be a huge driver. Our kids will be playing things like Call of Duty in their living rooms and public spaces. When they put on next-generation head-mounted displays, it could look like there are virtual zombies trying to bust into the room they’re in.” He believes that virtual characters of all types will soon come to share the space we’re in.
Just think about how something like this could benefit patient care within our medical space! Imagine healthcare professionals learning to treat virtual patients within their actual hospital walls, using their real-life hospital equipment. In emergency settings, this kind of training could be invaluable. Hospital staff, simulated patients and their simulated caregivers could connect effortlessly in a high-stress augmented environment, with millions of scenarios programmed for “on-the-job” learning at minimal risk.
While these are exciting possibilities, the technology is still in its infancy. “Clearly, the hardware has to grow up. Right now if you hand somebody Glass, there’s a good chance they’ll hit the touch pad by accident and close out of whatever you wanted to show them,” he said. “It’s early, but everything’s improving rapidly, with new purpose-built hardware that will be better suited for medical and other environments, so stay tuned.”
For all its limitations, I’m inspired by innovators like Lorenzini who are constantly thinking about where Glass and other AR wearable technology will take us in the future. Advancing the performance of these devices for “critical needs” environments like patient care is, as Lorenzini remarks, “a slam dunk for medical.”
This blog was originally posted by Kristi in Razing Standards for LehmanMillet.
When I met IMC PR professor Dr. Karen Freberg at Integrate 2014, she brought her Google Glass to the event and I had an opportunity to check it out. It’s very cool technology, and I think it can be incorporated into a lot of different professions. The biggest challenge will be getting the price of this technology down for a larger market.