Machine learning has the potential to fuel major technological developments in countless fields, with Alphabet’s X division already investigating uses in agriculture and food production. A team inside Google is now applying it to cancer research and detection with a prototype microscope.

A team at Google Brain — the company’s AI research division — today presented an in-review paper about detecting cancer using machine learning and a modified microscope that features an augmented reality display.

The company has done previous work using neural networks to detect breast cancer, with accuracy rates comparable to a trained pathologist. At the moment, cancer diagnosis is primarily achieved with compound light microscopes. Newer deep learning techniques, however, require a digital representation of microscopic tissue.

Google’s augmented reality microscope (ARM) combines both methods, with AI head Jeff Dean noting that it “blend[s] the expertise of automated machine learning systems with human expertise.”

The platform consists of a modified light microscope that enables real-time image analysis and presentation of the results of machine learning algorithms directly into the field of view.

Google notes that it is “bringing AI directly to the user” with a camera feeding the microscope’s view through algorithms that then highlight possible tumors in a green outline for the viewer to focus on and confirm:

This digital projection is visually superimposed on the original (analog) image of the specimen to assist the viewer in localizing or quantifying features of interest. Importantly, the computation and visual feedback updates quickly — our present implementation runs at approximately 10 frames per second, so the model output updates seamlessly as the user scans the tissue by moving the slide and/or changing magnification.
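Based on that description — a camera feed, per-frame inference, and a green outline superimposed on the analog view — the real-time loop might be sketched roughly as below. This is a toy illustration, not Google’s implementation: the paper describes deep neural networks running on the live feed, which the simple thresholding “model” here only stands in for, and all function names are hypothetical.

```python
import numpy as np

def toy_tumor_model(gray_frame):
    """Stand-in for the real classifier (hypothetical): flags bright
    regions as 'tumor'. The actual ARM runs a deep neural network."""
    return gray_frame > 0.8  # boolean mask of suspected regions

def outline_mask(mask):
    """Reduce a filled binary mask to its one-pixel-wide boundary by
    keeping only pixels with at least one neighbor outside the mask."""
    padded = np.pad(mask, 1)  # zero-pad so edges are handled uniformly
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior

def overlay_green(frame_rgb, outline):
    """Superimpose the outline in green on a copy of the RGB frame,
    mimicking the digital projection over the original analog image."""
    out = frame_rgb.copy()
    out[outline] = [0, 255, 0]
    return out

def process_frame(gray_frame):
    """One iteration of the real-time loop (~10 fps in the paper):
    run inference, outline the detections, overlay for the viewer."""
    rgb = np.stack([np.uint8(gray_frame * 255)] * 3, axis=-1)
    mask = toy_tumor_model(gray_frame)
    return overlay_green(rgb, outline_mask(mask))
```

In the real system this loop would run continuously on the camera feed, so the overlay updates as the pathologist moves the slide or changes magnification.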

Researchers were able to retrofit existing light microscopes already found in hospitals and clinics using low-cost, readily available components.

At the moment, ARM has been trained to detect breast and prostate cancer, but it is capable of “running many types of machine learning algorithms aimed at solving different problems such as object detection, quantification, or classification.” Meanwhile, visual feedback can be updated to include “text, arrows, contours, heatmaps, or animations.”

Google believes that “ARM has potential for a large impact on global health, particularly for the diagnosis of infectious diseases, including tuberculosis and malaria, in developing countries.” It can be used in conjunction with existing digital pathology workflows, and could expand beyond cancer diagnosis into other areas of healthcare, life sciences research, and materials science.


You’re reading 9to5Google — experts who break news about Google and its surrounding ecosystem, day after day.

About the Author

Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet. Tips/talk: