Pioneering medical diagnostics on Google Glass, behind the scenes at UCLA Ozcan Lab

UCLA engineering researchers have developed and validated a Google Glass-based point-of-care diagnostic platform, demonstrated with an in-home HIV test and a prostate-specific antigen (PSA) test.

They stated that this has the potential to improve laboratory testing in rural and low-resource settings such as disaster-relief areas and quarantine zones.

We recently covered the study, published in ACS Nano, in iMedicalApps: "Researchers develop Google Glass app that delivers instant analysis of point-of-care diagnostic tests."

This new platform can simplify the processing of rapid diagnostic tests (RDT), lower training costs, and increase the accessibility and availability of such tests.

We interviewed Aydogan Ozcan, Ph.D., principal investigator and the Chancellor’s Professor of Electrical Engineering and Bioengineering at UCLA and associate director of UCLA’s California NanoSystems Institute, along with Steve Feng, MS, first author of the published paper and research lab manager.

Where did the Google Glass idea come from?

Steve Feng, MS: We previously published research on smartphone-based diagnostics (Editor’s note: see the ACS Nano publication “Detection and Spatial Mapping of Mercury Contamination in Water Samples Using a Smart-Phone”), and we’ve commercialized that through Holomic LLC.

After that, we were trying to explore different ways to make things easier to use. The various [rapid diagnostic test (RDT)] readers out there are very sensitive, but they require a lot of training, and they require both of your hands. You have less of an intuitive feel. So we thought that wearable computers and hands-free modalities like Google Glass or other types of hands-free devices would provide a great way to do this kind of work, [and they’re] especially [useful in] any work where you can free up the hands and provide an interactive interface.

It’s one of those platforms we’ll probably adopt in the future to do a variety of things that would previously require a benchtop system or even a mobile device, and it requires less training. You just hold up the strip [containing the test material]. You don’t have to train [healthcare workers] on the RDT [reader], turning on the illumination, the whole process. [Just say] “OK Glass, image an RDT.”

Hold it up, [and] once it’s fitted, take the [photo] by tapping, voice command, or whatever… the server will automatically process the image, store it, and [you’ll] see the result on the Glass. It removes any need for complicated procedures.
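
To make the workflow Feng describes concrete, here is a minimal sketch of the capture-and-upload loop, written in Python rather than the Glass SDK: the wearable captures an RDT photo, sends it to a remote server, and displays the parsed result. The endpoint URL, field names, and response format below are hypothetical illustrations, not the lab's actual API.

```python
# Minimal sketch (not the authors' code) of the Glass-to-server flow:
# capture -> upload -> server-side processing -> result shown on the display.
import requests

SERVER_URL = "https://example-rdt-server.org/api/process"  # hypothetical endpoint

def submit_rdt_photo(image_path: str, device_id: str) -> dict:
    """Upload an RDT photo and return the server's parsed result."""
    with open(image_path, "rb") as f:
        response = requests.post(
            SERVER_URL,
            files={"image": f},             # the raw camera capture
            data={"device_id": device_id},  # identifies the wearable unit
            timeout=30,
        )
    response.raise_for_status()
    # Hypothetical response shape, e.g. {"test": "HIV", "result": "negative"}
    return response.json()

if __name__ == "__main__":
    result = submit_rdt_photo("rdt_capture.jpg", device_id="glass-001")
    print(f"{result['test']}: {result['result']}")  # rendered on the Glass display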


How did Ozcan Lab create the device?

Steve Feng, MS: We first prototyped it with rapid diagnostic tests, and it could be used for any kind of RDT as long as you put in the algorithms. Because it’s on the server, you don’t even have to change the application itself. You just take the RDT [algorithm]…put it on the server, and people don’t have to change anything on the app side. The server automatically recognizes [the test] by its QR code, processes it, and returns the results. To add more tests, you just do the image processing and put it on the server.
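
The server-side dispatch Feng describes, where the QR code on the cassette identifies the test type and adding a new test only means registering a new processing routine, might look like the following sketch. This is an assumption-laden illustration, not the lab's implementation: the decoder library (pyzbar), the QR payload strings, and the placeholder analysis functions are all invented for clarity.

```python
# Sketch of server-side QR-based dispatch (assumptions, not the lab's code):
# the QR code names the RDT type; the server routes the image to the matching
# analysis routine. New tests are added by registering a routine here, so the
# Glass app itself never changes.
from pyzbar.pyzbar import decode  # common open-source QR/barcode decoder
from PIL import Image

def analyze_hiv(img):
    # Placeholder: real code would locate and read the test/control lines.
    return {"test": "HIV", "result": "negative"}

def analyze_psa(img):
    # Placeholder: real code would quantify line intensity for a readout.
    return {"test": "PSA", "result": "2.1 ng/mL"}

# Registry of per-test routines, keyed by hypothetical QR payloads.
RDT_ALGORITHMS = {"HIV_RDT_V1": analyze_hiv, "PSA_RDT_V1": analyze_psa}

def process_rdt_image(image_path: str) -> dict:
    """Identify the RDT type from its QR code and run the matching analysis."""
    img = Image.open(image_path)
    codes = decode(img)
    if not codes:
        raise ValueError("No QR code found; cannot identify the RDT type")
    test_id = codes[0].data.decode("utf-8")
    if test_id not in RDT_ALGORITHMS:
        raise ValueError(f"No analysis routine registered for {test_id!r}")
    return RDT_ALGORITHMS[test_id](img)
```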

We demonstrated a lot of [RDT-based] technology [innovations] like mercury detection, detecting parasites in water, RBC imaging, lab chips, microfluidics, microscope replacements, and we’re expanding and trying new things. I’m hoping that one day this lab [puts] this all into one platform.

How can others get in touch with Ozcan Lab [if they have an idea]?

Steve Feng: Most people e-mail Professor Ozcan, or they meet at conferences. We’re very open to discussion of anything at any time. We discuss internally to see if it’s a good opportunity. People can reach out through e-mail or social media (Facebook page).

What’s in the works next for Ozcan Lab?

Aydogan Ozcan, PhD: Some of the themes that we would like to further explore as a group include, but are not limited to, nano-imaging, high-throughput microscopy, computational imaging and sensing, point-of-care diagnostics, mobile health and telemedicine, and wearable sensors. I always look for exciting research ideas and enjoy building interdisciplinary teams and working with the most talented students, postdocs, and trainees to solve the hardest problems in biomedical applied sciences.

Steve Feng: We’re always working on expanding our microscopy platform and imaging with our devices. We’re looking for more interesting opportunities to take advantage of new technologies.

For instance, [take] the mercury-testing application done under Qingshan Wei. We’re trying to expand that out in a variety of different formats: distributing it to other platforms, or moving into different modalities. We’re interested in Google Glass and in pursuing other apps with wearable computers, and we hope to see other apps come out of the lab in the next few months.

Where do you think the field of mobile health is headed?

Aydogan Ozcan, PhD: The massive volume of mobile phone users, which has now reached ~7 billion, drives rapid improvements in the hardware, software, and high-end imaging and sensing technologies embedded in our phones. This is transforming the mobile phone into a cost-effective yet extremely powerful platform for running, for example, biomedical tests and scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving trend will help us transform how medicine, engineering, and the sciences are practiced and taught globally.

Steve Feng: In my personal opinion, mobile health is being pushed heavily in the United States, but the target applications will probably develop in third-world countries, for things like the microscopy platform or other such devices. To push for adoption, especially inside developing countries, you have to start from the ground up. I think these kinds of technologies have to start from below [i.e., consumers] and from the top [longtime leaders in the medical fields] and meet in the middle.

Once you get enough applications out there, I think you’ll start seeing adoption trickle down to the general consumer. There’s a lot of exciting work moving healthcare into the home. Theranos is moving hospital tests into Walgreens and drugstores at a fraction of the price and the time, and there’s a wide variety of mobile technologies that do lab-on-a-chip work: microfluidics on a chip, on paper, all kinds of interesting applications. As these technologies evolve, I predict we’ll see them in clinics, hospitals, then schools, and eventually migrating into homes.

Steven Chan, M.D., M.B.A., is a resident physician at the University of California, Davis Health System, researching mobile technology, psychiatry & human behavior. Steve previously worked as a software and web engineer as well as creative designer at Microsoft & UC Berkeley. Visit him at www.stevenchanMD.com and @StevenChanMD.

