The risk factors include the person's gender, smoking status, blood pressure, and age, which the system estimated to within about four years of the patient's actual age.
In this case, Google's Verily is using eye scans to predict an individual's age, blood pressure, and whether or not they smoke. The study's results show that, so far, the AI's predictions do not outperform specialized medical diagnostic methods such as blood tests, but the approach could still be a useful screening tool for doctors. That's impressive, though far from ideal. The company published its findings in the online journal Nature Biomedical Engineering on Monday. The researchers added that the system should not be seen as a replacement for diagnostic tests and clinical judgement, but as an analyser that extends the diagnostic armamentarium.
Google and Verily's scientists used machine learning to analyze a medical dataset of nearly 300,000 patients, which served as the training data for the algorithm, as per the report.
The new technology could enable doctors to screen their patients more easily and give them appropriate treatment, and could also enable patients to assess their health risks by themselves. From the information gathered, the researchers analysed the data and checked how well it predicted heart conditions.
By comparing fundus images from two patients, one of whom suffered a heart attack within the following five years, Google's algorithm learned to identify the at-risk patient with 70 percent accuracy. That is roughly on par with the common method that requires blood tests, which is accurate 72 percent of the time, according to The Verge.
Experts consider the study credible, mainly because the science behind the hypothesis is sound: by studying the appearance of the retina, algorithms can predict heart disease.
With technological advances in the medical and health industry, several breakthrough techniques now make it possible to monitor one's health more seamlessly than today's standard methods, and in future, larger data sets could provide deeper insight. The program taught itself how to analyze eyes by using machine learning techniques to pore over more than 284,000 retinal images; while training, the AI used what UPI describes as a visual "heatmap" to learn which parts of the eye's anatomy contained predictive factors. Researchers are trying to address these limitations with new versions.
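The "heatmap" described above is, in general terms, a saliency map: for each pixel, how strongly does changing that pixel change the model's prediction? A minimal sketch of the idea follows, using a toy linear-plus-sigmoid "model" and random data purely for illustration; the model, image size, and the region it "looks at" are assumptions, not Verily's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 8, 8                      # toy "retinal image" resolution
image = rng.random((H, W))       # stand-in for a fundus photograph
weights = np.zeros((H, W))
weights[2:5, 2:5] = 1.0          # pretend the model learned to focus here

def predict_risk(img, w):
    """Logistic score in (0, 1) from a linear model over pixels."""
    z = np.sum(img * w)
    return 1.0 / (1.0 + np.exp(-z))

def saliency(img, w):
    """|d score / d pixel|: how much each pixel moves the prediction."""
    p = predict_risk(img, w)
    grad = p * (1.0 - p) * w     # analytic gradient of the sigmoid score
    return np.abs(grad)

heat = saliency(image, weights)
# The heatmap is nonzero only where the model actually "looks",
# which is what lets researchers see which anatomy drives a prediction.
print(heat[3, 3] > 0, heat[0, 0] == 0)
```

In a real deep network the gradient is obtained by backpropagation rather than this closed form, but the interpretation is the same: bright heatmap regions mark the anatomy that drives the risk score.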