Abstract
A wealth of information is contained in images obtained by whole-body magnetic resonance imaging (MRI). Studying the link between the imaged anatomy and properties known from outside sources has the potential to give new insights into the underlying factors that manifest themselves in individual human morphology. In this work we investigate how age-related changes are expressed in the whole-body image. A large dataset of about 32,000 subjects from the UK Biobank study, scanned from neck to knee and aged 44–82 years, was used for a machine-based analysis. We trained a convolutional neural network based on the VGG16 architecture to predict the age of a given subject from the image data of these scans. In 10-fold cross-validation on 23,000 of these images the network reached a mean absolute error (MAE) of 2.49 years (R² = 0.83) and showed consistent performance on a separate test set of another 8,000 images. On a second test set of 100 images the network outperformed the averaged estimates of three experienced radiologists, which reached an MAE of 5.58 years (R² = 0.08), by more than three years on average. To explain these findings, we employed saliency analysis, which opens up the image-based criteria used by the automated method to human interpretation. We aggregated the saliency maps into a single anatomical visualization, which clearly highlights structures in the aortic arch and knee as primary indicators of age.
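For reference, the two evaluation metrics quoted above (MAE and the coefficient of determination R²) can be sketched in a few lines of Python. The sample ages below are hypothetical and serve only to illustrate the calculation; they are not taken from the study.

```python
def mean_absolute_error(y_true, y_pred):
    """Average of the absolute prediction errors, in the unit of the target (here: years)."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)


def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 minus the ratio of residual to total variance."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot


# Hypothetical chronological ages and model predictions for four subjects.
ages = [48.0, 55.0, 63.0, 71.0]
predicted = [50.0, 54.0, 60.0, 73.0]

print(mean_absolute_error(ages, predicted))  # mean of |error| over subjects
print(r_squared(ages, predicted))
```

An R² near 1 indicates that predictions track the true ages closely, while a value near 0 (as for the radiologists' estimates) means the predictions explain little of the age variance.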