Feature - Improving Alzheimer’s research, a million scans at a time
As you read this, your brain is busily working. In a complex but unconscious process, it scans the pixels on your screen, analyzes the images and turns them into meaningful information.
This week, a similarly ambitious effort got underway: neuGRID, a project that might help keep our minds whirring wonderfully as we (alas) age.
This massive scanning project will feed 6,300 magnetic resonance (MR) scans from more than 700 patients — about 200 images per scan, making for an impressive total of 1,260,000 images — through an automated series of calculations. This “pipeline” will analyze the cortical thickness of the brain (a measurement aligned with brain health) and its deterioration over time. The images come from the Alzheimer’s Disease Neuroimaging Initiative in the US, the largest public database of MR scans of patients with Alzheimer’s Disease and a lesser condition termed Mild Cognitive Impairment.
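The scale of the data set follows directly from the figures above; a quick back-of-envelope check (using only the numbers quoted in this article):

```python
# Sanity check on the neuGRID data-set figures quoted above.
scans = 6_300          # MR scans in the ADNI data set
images_per_scan = 200  # roughly 200 slice images per scan
total_images = scans * images_per_scan
print(total_images)    # → 1260000, i.e. the 1,260,000 images cited
```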
Feeding this entire database through these steps will allow neurological researchers to compare questions across populations, such as: What areas of the brain degenerate first? How relevant and accurate are the algorithms we use to model this disease? What morphological changes are disease markers? What drugs might halt or slow this disease?
Alzheimer’s is the second most feared disease associated with aging, following cancer, according to the Alzheimer Society of Canada. Patients with early symptoms have trouble remembering recent events, solving simple math problems, and remembering names of people and places. As the brain degenerates, patients in advanced stages of the disease lose mental and physical functions and need 24-hour care. Doctors do not know what causes Alzheimer’s. Although it affects about half of all people aged 85 and older, it is not considered part of the normal aging process.
neuGRID, a project funded by the European Commission, is analyzing the Alzheimer’s Disease Neuroimaging Initiative data set — squeezing the equivalent of five years of processing time on a single desktop into two weeks — to lay the foundation for much of its future work. The project is building a grid to support neuroscience research and hopes to become the flagship e-infrastructure for the community.
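The quoted speed-up can also be sanity-checked. This is a rough, illustrative calculation only — the article gives no per-core benchmark figures, so the implied grid speed-up is all we can derive:

```python
# Rough speed-up implied by the figures above: five years of
# single-desktop processing compressed into two weeks.
single_desktop_weeks = 5 * 52  # ~260 weeks of serial compute
grid_weeks = 2
speedup = single_desktop_weeks / grid_weeks
print(round(speedup))          # → 130, i.e. ~130x one desktop
```

A ~130x speed-up is plausible for the roughly 500 processing cores the article mentions, once scheduling and data-transfer overheads are accounted for.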
“Twenty years ago when I started in this field,” says Giovanni Frisoni, neuGRID partner and a doctor who splits his time between research and patient care, “Alzheimer’s was equated with aging and regarded as a hopeless disease. Now we are starting to have the first medical treatments. This is an extremely exciting time to be in Alzheimer’s research.”
“This community has great computing needs,” says David Manset of Maat G, neuGRID services area leader. “These needs involve large data storage, fast access and massive processing for developing and testing new markers of disease — which grid technology can significantly help to address.”
The ability to evaluate disease markers — indicators of disease progression — will allow companies to evaluate new drugs, speeding up the availability of new treatments to patients.
Computing clusters at three hospitals currently form the nodes of neuGRID’s infrastructure: Fatebenefratelli in Brescia, Italy, Karolinska Institutet in Stockholm, Sweden, and VU University Medical Center in Amsterdam, The Netherlands. Combined, their resources equal about 500 processing cores and 12 terabytes of storage capacity. Early next year, as the project moves into production, neuGRID hopes to access far more resources by joining with the Enabling Grids for E-sciencE “Biomed VO,” or biomedical virtual organization.
“I hope that in a few years we’ll be able to treat Alzheimer’s patients like we can treat hypertension patients,” says Frisoni. “With the correct drugs and care they can be essentially asymptomatic — showing no evidence of the disease.”
For many patients and their families, this would be a dream.
neuGRID will be showcasing this technology at EGEE’09. Don’t miss the live demo on Tuesday, 22 September at 4:30pm, which will demonstrate the real-time functionality of the infrastructure — including data acquisition, algorithm execution and advanced visualization output — accessible from an off-site desktop.