There was a lot of media hype recently about a paper published online March 9, 2014 in Nature Medicine that found that a set of 10 blood molecules predicted who in their sample would go on to develop amnestic mild cognitive impairment (MCI) or Alzheimer disease (AD) within three years. In many ways, such a blood test could be immensely valuable. Attempts so far to treat dementia once it is diagnosed have failed, and, as is the case for any disease, prevention is the best medicine. A risk marker such as this would help identify who would benefit most from preventive medicine (once available) or other interventions that reduce the risk of getting MCI or progressing to dementia. Moreover, a blood test is far cheaper and less invasive than other methods that are fairly good predictors of dementia, such as measuring tau and amyloid-β levels in cerebrospinal fluid.
Nevertheless, there are reasons to be cautious about this recent finding. The blood test identified who would develop amnestic MCI with 90% sensitivity (meaning that 90% of those who developed amnestic MCI tested positive) and 90% specificity (meaning that 90% of those who did not develop amnestic MCI tested negative). These measures of test “accuracy” sound very impressive, but a little bit of arithmetic with their sample size numbers reveals that of the people who were both cognitively normal when the study started and tested positive, only 37% went on to develop amnestic MCI. It might be the case that some of the other 63% of people in this category would develop amnestic MCI or dementia if followed for longer, but surely we want a test with better predictive power. As neuropsychologists, we were also disappointed that the definition of memory impairment in this study was based on only a single verbal memory test (albeit on three measures from that test), as we know from other studies that the diagnostic accuracy of MCI is better when it is based on impairments across multiple tests. Finally, no such blood test for dementia will become common doctor’s office practice until these results are replicated and confirmed in independent and larger studies.
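The 37% figure is the test's positive predictive value, which follows from Bayes' rule once you account for how rare conversion to amnestic MCI is. A minimal sketch of that arithmetic is below; the roughly 6% three-year conversion rate used here is an illustrative assumption chosen so the numbers come out near the paper's, not a figure quoted from the paper:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive test result is a true positive (Bayes' rule)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# With 90% sensitivity and 90% specificity, a low base rate of conversion
# (assumed here to be about 6% over three years) drags the PPV down to ~37%:
# most positives come from the large pool of people who never convert.
ppv = positive_predictive_value(0.90, 0.90, 0.062)
print(f"PPV: {ppv:.0%}")  # prints "PPV: 37%"
```

This is the general lesson behind the disappointing 37%: even a test with high sensitivity and specificity has weak predictive power when the condition it screens for is uncommon in the tested population.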
A big question that has been floating around the media and online landscape is whether people would or should get a blood (or any other) test to tell them if they are likely to develop dementia. Any test predicting the typical, late-onset (a.k.a. “sporadic”) form of AD is likely to be only probabilistic, defining the probability that one will develop dementia, and not the certainty of it. Is it ethical for the medical community to be using a measure of “odds” when we’re dealing with such a devastating disease? It is one thing to offer genetic testing for Huntington’s disease, as offspring of a parent with this disease have a 50/50 chance of inheriting it and genetic testing can tell them with near certainty if they did or did not inherit the mutation. But a surprisingly small minority of adult children in this situation choose to be tested. In all likelihood, people would be even less willing to be tested for dementia if the precision of the test is imperfect, and receiving a “highly likely” test result could have a profound psychological impact on people. There are also possible repercussions for medical insurance and other legal matters such as wills and power of attorney.
At the same time, knowing that one has a high chance of developing MCI or dementia may motivate people to fill their brain reserve ‘stores’. As future posts will detail, research has shown that people can enjoy better brain health and reduced risk of dementia by keeping cardiovascular conditions under control (such as hypertension, high cholesterol, and diabetes) and by following a healthy lifestyle: getting aerobic exercise, eating a healthy diet, and staying cognitively and socially active. While these activities may not prevent or reverse the buildup of the brain pathology caused by AD and other neurodegenerative diseases, they will improve our level of functioning so that we enjoy more time in good cognitive health.