A new Guardian report shows where AI is headed next: a joint venture between DeepMind and the British National Health Service…
DeepMind, the British team behind Google’s AI efforts, is partnering with the UK’s National Health Service and London’s Moorfields Eye Hospital to build a machine learning system capable of recognising potentially sight-threatening conditions from a digital scan of the eye. At the core of the research, about a million eye scans (all from anonymised patients) will be analysed by computer, and DeepMind researchers will use them to train a specialised algorithm.
The algorithm should then allow the machine to spot early signs of eye conditions such as wet age-related macular degeneration and diabetic retinopathy; people with diabetes are, according to DeepMind co-founder Mustafa Suleyman, “25 times more likely to go blind”.
“If we can detect this, and get in there as early as possible, then 98% of the most severe visual loss might be prevented,” Suleyman said. Letting a computer do most of the hard work would greatly increase both the speed and the accuracy of diagnosis, potentially saving the sight of thousands.
The collaboration came about through a request from Pearse Keane, a consultant ophthalmologist at Moorfields, who was impressed by DeepMind’s work in image recognition and reckoned the technology could be of great use in medicine as well.
He said: “I had the brainwave that deep learning could be really good at looking at the images of the eye. Optical Coherence Tomography is my area, and we have the largest depository of OCT images in the world. Within a couple of days I got in touch with Mustafa, and he replied.”