How retinal scans can help in diagnosing cardiovascular risk factors

Nemish Kanwar
Oct 3, 2019 · 4 min read

Cardiovascular disease has become one of the leading causes of death around the globe. Early identification is the first step in bringing that number down.
Existing techniques involve risk calculators like the Pooled Cohort Equations¹, Framingham² and Systematic Coronary Risk Evaluation (SCORE)³. These calculators take a set of attributes into account to estimate risk: age, gender, smoking status, blood pressure, BMI, glucose and cholesterol levels.
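To make the contrast with an image-based approach concrete, here is a minimal sketch of the kind of tabular inputs these calculators need (the names and the stub function are mine, and no real coefficients are included):

```python
from dataclasses import dataclass

@dataclass
class RiskFactors:
    """Typical inputs for calculators like the Pooled Cohort Equations,
    Framingham or SCORE."""
    age: int
    gender: str               # "male" / "female"
    smoker: bool
    systolic_bp: float        # mmHg
    bmi: float                # kg/m^2
    glucose: float            # mg/dL
    total_cholesterol: float  # mg/dL -- often missing, needs a fasting blood draw

def traditional_risk_score(patient: RiskFactors) -> float:
    """Placeholder only: real calculators plug these values into published
    regression coefficients. The point here is just the inputs required."""
    raise NotImplementedError("Use a validated calculator (PCE, Framingham, SCORE)")
```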
The challenge with these techniques is that some of the data is often unavailable; cholesterol levels, for instance, are missing in about 30% of cases because they require a fasting blood draw. This is where an alternative like a retinal scan is helpful, since it is both feasible and non-invasive.
The objective of this analysis is to create an estimator that identifies the risk of a major adverse cardiovascular event (MACE) within five years, so that preventive care can be provided at the right time.

The following are my notes on a paper published by Google as part of their AI for Good initiative. Follow the link for a detailed understanding of the topic. This article doesn’t discuss the model and its implementation in machine-learning terms, but rather how an idea became a theory and then a research area. The study is not yet complete; there are many areas that still need to be researched to make it more precise.

Exploration of retinal fundus photographs to identify cardiovascular risk

The idea was developed from the fact that conditions such as hypertensive retinopathy⁴ and cholesterol emboli⁵ often manifest in the eyes, where they can be visualised non-invasively using specialised fundus cameras that capture fundus photographs. The same principle has been applied to diagnose other diseases, like melanoma⁶ and diabetic retinopathy⁷, from medical images.

About Data

The data was collected from the UK Biobank and from EyePACS in the US. The EyePACS data is freely available on Kaggle.

Table describing the dataset

The EyePACS population consisted mostly of diabetic patients; HbA1c was available for only about 60% of cases, and its mean was well above the normal range. The UK Biobank data was drawn from the UK general population, but several factors were unavailable for the study.
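For anyone who wants to poke at the public data, here is a rough sketch of loading the Kaggle EyePACS release, assuming the usual layout of a folder of JPEGs plus a trainLabels.csv with image/level columns (the local paths are hypothetical):

```python
from pathlib import Path
import pandas as pd

# Hypothetical local path to the Kaggle "Diabetic Retinopathy Detection" download
DATA_DIR = Path("eyepacs")
labels = pd.read_csv(DATA_DIR / "trainLabels.csv")   # columns: image, level

# Attach full image paths and keep only rows whose files actually exist
labels["path"] = labels["image"].map(lambda name: DATA_DIR / "train" / f"{name}.jpeg")
labels = labels[labels["path"].map(Path.exists)]

print(f"{len(labels)} labelled fundus images")
print(labels["level"].value_counts())   # diabetic retinopathy grades 0-4
```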

Methodology

The model was trained to predict a variety of cardiovascular risk factors in order to be able to predict MACE within 5 years. The retinal fundus images were used as the training data, and an individual model was trained for each factor.
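As a minimal sketch of what one such per-factor model could look like, assuming a standard Keras setup with an Inception-v3 backbone; the function name and hyperparameters here are mine, not the paper’s:

```python
import tensorflow as tf

def build_risk_factor_model(image_size=(299, 299)):
    """One regression model per risk factor (e.g. age, SBP, BMI, HbA1c),
    trained directly on retinal fundus photographs."""
    backbone = tf.keras.applications.InceptionV3(
        include_top=False, weights="imagenet",
        input_shape=(*image_size, 3), pooling="avg")
    output = tf.keras.layers.Dense(1, name="risk_factor")(backbone.output)
    model = tf.keras.Model(backbone.input, output)
    model.compile(optimizer="adam", loss="mse",
                  metrics=[tf.keras.metrics.MeanAbsoluteError()])
    return model

# e.g. age_model = build_risk_factor_model(); age_model.fit(images, ages, epochs=10)
```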

For continuous variables, the baseline was taken as simply predicting the mean, against which the model’s ability was tested.
Predicted age showed a major improvement over the baseline MAE, with a good R². The algorithm also predicted systolic blood pressure, BMI and HbA1c better than baseline, though with lower accuracy.

Performance of the models for various cardiovascular risk factors; age is predicted with high precision
The algorithm was able to reduce MAE; for SBP, predictions tracked actual values up to about 150 mmHg but levelled off beyond that
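Concretely, the baseline for a continuous variable just predicts the training mean; here is a quick sketch of how that comparison can be computed (the arrays are made-up toy values, not the study’s data):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, r2_score

# y_true: actual values (e.g. age in years); y_pred: model predictions (toy numbers)
y_true = np.array([54.0, 61.0, 47.0, 70.0, 58.0])
y_pred = np.array([52.0, 63.0, 50.0, 66.0, 57.0])

baseline = np.full_like(y_true, y_true.mean())   # baseline: always predict the mean

print("baseline MAE:", mean_absolute_error(y_true, baseline))
print("model MAE:   ", mean_absolute_error(y_true, y_pred))
print("model R^2:   ", r2_score(y_true, y_pred))
```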

Further optimisation was done to minimise MAE directly, which proved to be significantly better than baseline for age, SBP, DBP and BMI. The hypothesis that the predictions were merely a correlate of diabetic retinopathy was also ruled out.

Optimising to reduce MAE directly proved the better approach and led to even lower MAE for many factors
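If that optimisation amounts to swapping the training objective, in Keras terms it is essentially a one-line change; a sketch reusing the hypothetical model builder from the earlier snippet:

```python
import tensorflow as tf

# Recompile the same per-factor architecture with an L1 objective so the
# network minimises mean absolute error directly instead of squared error.
model = build_risk_factor_model()          # defined in the earlier sketch
model.compile(optimizer="adam", loss="mean_absolute_error",
              metrics=[tf.keras.metrics.MeanAbsoluteError()])
```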

Results

Since retinal fundus images proved to be a good indicator of cardiovascular risk factors, they were hypothesised to be predictive of the onset of MACE as well; the data for this was available only in the UK Biobank, for 631 patients.
The algorithm achieved an AUC of 0.70 against 0.72 for SCORE, which is a good figure given that it took only the retinal image into consideration.

The algorithm alone was nearly competitive with the SCORE benchmark; combining the algorithm with SCORE outperforms SCORE alone
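As a sketch of how such a comparison could be computed, along with one simple way of combining the two risk scores (the arrays below are random toy stand-ins, not the study’s data, and the paper’s combination method may differ):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# mace: 1 if a major adverse cardiovascular event occurred within 5 years.
# algo_risk / score_risk: risk estimates from the fundus model and from SCORE.
rng = np.random.default_rng(0)
mace = rng.integers(0, 2, size=631)
algo_risk = 0.30 * mace + rng.normal(0.5, 0.2, size=631)    # toy stand-in
score_risk = 0.35 * mace + rng.normal(0.5, 0.2, size=631)   # toy stand-in

print("algorithm AUC:", roc_auc_score(mace, algo_risk))
print("SCORE AUC:    ", roc_auc_score(mace, score_risk))

# One simple way to combine the two: a logistic model over both risk scores.
features = np.column_stack([algo_risk, score_risk])
combined = LogisticRegression().fit(features, mace)
print("combined AUC: ", roc_auc_score(mace, combined.predict_proba(features)[:, 1]))
```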

Visualising the risk factors

Next, the anatomical regions the model relies on were identified using soft attention; 100 samples of these attention maps were assessed by ophthalmologists.

A sample visualisation of retinal fundus attention maps for cardiovascular risk factors that the model predicted well

Blood vessels were highlighted for predicting age, smoking status and SBP. For HbA1c the perivascular surroundings were highlighted, and for gender the optic disc. DBP and BMI, by contrast, produced more diffuse highlights distributed throughout the image.
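The paper uses a soft-attention mechanism built into the model; as a rough stand-in for readers who want to experiment, a plain gradient-saliency overlay (not the paper’s exact technique) can be sketched like this:

```python
import tensorflow as tf

def saliency_map(model, image):
    """Gradient of the predicted value w.r.t. the input pixels -- a simple
    proxy for the paper's soft-attention heatmaps, not the same technique."""
    x = tf.convert_to_tensor(image[None, ...], dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(x)
        prediction = model(x)
    grads = tape.gradient(prediction, x)[0]          # shape (H, W, 3)
    heat = tf.reduce_max(tf.abs(grads), axis=-1)     # collapse colour channels
    heat /= tf.reduce_max(heat) + 1e-8               # normalise to [0, 1]
    return heat.numpy()

# Usage idea: overlay saliency_map(age_model, fundus_image) on the photograph,
# e.g. with matplotlib's imshow(..., alpha=0.5), to see which vessels light up.
```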

Conclusion

The study represents a remarkable achievement in the field of medicine. Its output could have an immense impact on how we currently measure the risk of MACE, cutting down the cost and time of measurement as well as the need for multiple separate tests.

A corollary of this study is the use of fundus images for other purposes. If fundus photography becomes feasible on mobile devices⁸, all of the above testing could be done with a smartphone itself: for example, determining whether a person is a smoker, or estimating haemoglobin (HbA1c), BMI and blood pressure without a visit to the doctor.

It is well said that “the eye is the mirror of the soul.”

Cheerio!!


Nemish Kanwar

Senior Data Scientist @Draup, specialising in Natural Language Processing