
Supercomputers Help Accelerate Alzheimer’s Research

‘Comet’ and ‘Stampede2’ Assist Study of Nearly 50,000 Brain Scans

On the left, the transentorhinal cortex is shown as a pair of triangulated surfaces; the curved white lines represent cortical columns, which are used to accurately estimate thickness. On the right side, regions are shown where differences in atrophy patterns are observed between those with mild cognitive impairment (MCI) and those without MCI. Credit: Daniel Tward, UCLA.


  • Kimberly Mann Bruch, SDSC Communications


Since 2009, Daniel Tward and his collaborators at UCLA and Johns Hopkins University have analyzed more than 47,000 images of human brains via MRI Cloud—a gateway created to collect and share quantitative information from human brain images, including subtle changes in shape and cortical thickness. The latter was the topic of a recently published study in the journal Neuroimage: Clinical by Tward and his team.

Entitled Cortical Thickness Atrophy in the Transentorhinal Cortex in Mild Cognitive Impairment, the study detailed new findings on how this region of the brain thins during the early stages of Alzheimer’s disease and how that thinning relates to mild cognitive impairment.

“Until now, we haven’t been able to measure these changes in living people,” said Tward, assistant professor of computational medicine and neurology at the University of California, Los Angeles. “By using supercomputers like Comet at the San Diego Supercomputer Center at UC San Diego and Stampede2 at the Texas Advanced Computing Center, we were able to study a large cohort of patient images over time.”

Specifically, Tward said he and his team used the supercomputers to observe and quantify thinning in the transentorhinal cortex, in a pattern that agrees with autopsy results. Located in the temporal lobe, the transentorhinal cortex is believed to be the first area affected by Alzheimer’s disease; until now, this damage could be confirmed only at autopsy.
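The core measurement in studies like this is cortical thickness. As a rough illustration only, the sketch below computes per-vertex thickness as the straight-line distance between corresponding points on a pair of triangulated surfaces; this vertex pairing and the toy data are assumptions for clarity, not the study's actual method, which traces curved cortical columns between the surfaces.

```python
import numpy as np

def cortical_thickness(inner_vertices, outer_vertices):
    """Per-vertex thickness (mm) between paired surface meshes.

    Simplifying assumption: vertex i on the inner surface corresponds
    to vertex i on the outer surface, and thickness is the straight-line
    distance between them (the real pipeline follows curved columns).
    """
    inner = np.asarray(inner_vertices, dtype=float)
    outer = np.asarray(outer_vertices, dtype=float)
    return np.linalg.norm(outer - inner, axis=1)

# Toy example: three paired vertices, each 2.5 mm apart along z.
inner = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
outer = inner + np.array([0.0, 0.0, 2.5])
print(cortical_thickness(inner, outer))  # → [2.5 2.5 2.5]
```

Comparing such thickness maps across groups (here, subjects with and without mild cognitive impairment) is what reveals regional atrophy patterns.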

He said that this technology is helping clinicians confirm that the thinning of the transentorhinal cortex is caused by Alzheimer’s, which could help provide patients with an earlier diagnosis.

Additionally, the discovery could lead to shorter, less expensive clinical trials, accelerating the search for treatments for those suffering from Alzheimer’s disease.

Tward explained how he and his colleagues used Comet and Stampede2 in conjunction with MRI Cloud to analyze hundreds of large imaging volumes of human brains, with a focus on the transentorhinal cortex.

“Reducing computation time from months to days allowed this complex neuroimaging project to be feasible,” said Tward. “XSEDE provided us with a platform to exceed our expectations as we conducted a study with significant results for both academic researchers and clinicians working on Alzheimer’s disease diagnoses and treatment.”
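The months-to-days speedup Tward describes comes from the fact that each scan can be processed independently, so the workload fans out naturally across many compute nodes. A minimal sketch of that pattern follows; the `process_scan` function is a hypothetical placeholder for the real per-image pipeline, and local worker threads stand in for cluster nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def process_scan(scan_id):
    # Placeholder per-scan analysis; in the real workflow this would be
    # the full registration and thickness-estimation pipeline.
    return scan_id, f"thickness-map-{scan_id}"

def process_cohort(scan_ids, max_workers=4):
    # Fan the independent scans out to workers and collect the results.
    # On a supercomputer, each "worker" would be a separate node.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(process_scan, scan_ids))

results = process_cohort(range(8))
print(len(results))  # → 8
```

Because there are no dependencies between scans, total wall-clock time shrinks roughly in proportion to the number of workers, which is what made a cohort of tens of thousands of images tractable.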

This work relied on allocations from the National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by the NSF (ACI-1548562). The research was supported by the National Institutes of Health (P41-EB015909, R01-AG048349, R01-DC016784, and R01-EB020062).

About SDSC

The San Diego Supercomputer Center (SDSC) is a leader and pioneer in high-performance and data-intensive computing, providing cyberinfrastructure resources, services, and expertise to the national research community, academia, and industry. Located on the UC San Diego campus, SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from astrophysics and earth sciences to disease research and drug discovery. In December 2020, SDSC’s newest National Science Foundation-funded supercomputer, Expanse, entered production. At over twice the performance of Comet, Expanse supports SDSC’s theme of ‘Computing without Boundaries’ with a data-centric architecture, public cloud integration, and state-of-the-art GPUs for incorporating experimental facilities and edge computing.
