New DeepMind Project Aims To Improve Eye Disease Understanding

Artificial intelligence is advancing rapidly, and people are finding new uses for the technology just as quickly. Thanks in large part to the rise of neural networks and machine learning, AI can be applied in just about any field; from business to technology and even medicine, researchers are finding new tasks for it, many of which it can perform better than humans. Case in point: Google-owned DeepMind's state-of-the-art AI is being used in a partnership with Moorfields Eye Hospital, a renowned ophthalmic institute in London, to detect and identify eye problems and diseases as early as possible using machine learning. The plan is to teach the AI what a normal digital eye scan, or OCT, should look like, and have it search for any abnormalities.
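In broad strokes, learning what a normal scan looks like and then flagging deviations is a form of anomaly detection. The toy sketch below is not DeepMind's actual system; the feature values and threshold are invented purely to illustrate the idea with simple per-feature statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: feature vectors (e.g. retinal layer
# thicknesses) extracted from scans of healthy eyes. Values are made up.
normal_scans = rng.normal(loc=100.0, scale=5.0, size=(200, 4))

# "Learn" what normal looks like: per-feature mean and spread.
mean = normal_scans.mean(axis=0)
std = normal_scans.std(axis=0)

def anomaly_score(scan):
    """Largest per-feature z-score: how far the scan sits from normal."""
    return float(np.max(np.abs((scan - mean) / std)))

def is_abnormal(scan, threshold=4.0):
    """Flag scans that deviate strongly from the learned normal profile."""
    return anomaly_score(scan) > threshold

healthy = np.array([101.0, 98.0, 102.0, 99.0])
swollen = np.array([101.0, 98.0, 160.0, 99.0])  # one "layer" far out of range

print(is_abnormal(healthy))  # False
print(is_abnormal(swollen))  # True
```

A real system would work on the raw scan images with deep neural networks rather than a handful of summary statistics, but the underlying logic is the same: model "normal" first, then measure deviation from it.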

A traditional OCT, or Optical Coherence Tomography, scan shows detailed geometry and other data for a given part of the eye. Although tools exist to analyze these scans for abnormalities, they are often unable to comb through every detail or spot issues in the scans, leaving the bulk of the task to human doctors. Given the scans' complexity, a human going over one of them takes quite some time, which in some cases could be the difference between catching an eye disease early enough to stem its development and merely catching it in time to figure out how best to help a patient cope and adapt. Faster analysis could also enable more frequent scanning, allowing patients with ongoing or degenerative conditions to be monitored more closely.

On top of faster analysis, the project is meant to let the AI catch things that a human might miss when analyzing an OCT scan. Professor Sir Peng Tee Khaw of Moorfields Eye Hospital says the research could well "revolutionise the way professionals carry out eye tests", citing predictions that the number of people with vision loss will double by 2050, which makes it vital to explore any and all avenues to improve current standards of care. Of the thousands of scans taken daily at Moorfields, DeepMind will be fed a set number and, where applicable, told what is wrong with a scan during the learning process, such as in the case of patients with known abnormalities. Once the AI learns to identify a wide range of common issues, the next step will be to let it analyze new scans on its own.
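Training a system on scans alongside their known diagnoses is classic supervised learning. A minimal illustrative sketch, again with entirely made-up numbers and no connection to DeepMind's real models, is a nearest-centroid classifier: average the labeled examples per diagnosis, then assign a new scan to whichever average it sits closest to:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical labeled training set: feature vectors from scans whose
# diagnosis is already known. All numbers are illustrative, not clinical.
healthy_scans = rng.normal(100.0, 5.0, size=(100, 3))
abnormal_scans = rng.normal(130.0, 5.0, size=(100, 3))

X = np.vstack([healthy_scans, abnormal_scans])
y = np.array(["healthy"] * 100 + ["abnormal"] * 100)

# Learning step: one centroid (average feature vector) per known label.
centroids = {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(scan):
    """Assign the label whose centroid is closest to the new scan."""
    return min(centroids, key=lambda lbl: np.linalg.norm(scan - centroids[lbl]))

print(classify(np.array([99.0, 102.0, 101.0])))   # "healthy"
print(classify(np.array([128.0, 133.0, 129.0])))  # "abnormal"
```

DeepMind's approach would use far richer models trained on the scan images themselves, but the workflow the article describes, labeled examples in, learned diagnostic categories out, follows this same pattern.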

About the Author

Daniel Fuller

Senior Staff Writer
Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site. These include the likes of machine learning, voice assistants, AI technology development, and hot gaming news in the Android world. Contact him at [email protected]