Argonne's Aurora supercomputer to drive the construction of a brain map


Newswise – Argonne researchers are mapping the intricate tangle of brain connections – a connectome – by developing computing applications that will hit their stride with the arrival of exascale computing.

The US Department of Energy’s (DOE) Argonne National Laboratory will house one of the nation’s first exascale supercomputers when Aurora arrives in 2022. To prepare codes for the architecture and scale of the system, 15 research teams are taking part in the Aurora Early Science Program through the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility. With access to pre-production hardware and software, these researchers are among the first in the world to use exascale technologies for science.

Humans have poked and prodded the brain for millennia to understand its anatomy and function. But even after untold advances in our understanding of the brain, many questions remain.

Using imaging techniques far more advanced than those of their predecessors, researchers at DOE’s Argonne National Laboratory are working to develop a brain connectome – an accurate map that shows every connection between every neuron and the precise location of the associated dendrites, axons and synapses that help form a brain’s communication or signaling pathways.

“If we don’t improve on today’s technology, the computing time for an entire mouse brain would be something like 1,000,000 days of work on current supercomputers. Using all of Aurora, if everything worked well, it might take 1,000 days.” – Nicola Ferrier, Argonne senior computer scientist

Such a map will allow researchers to answer questions such as: how is the structure of the brain affected by learning or degenerative diseases, and how does the brain age?

Led by Nicola Ferrier, senior computer scientist at Argonne, the project, “Enabling Connectomics at Exascale to Facilitate Discoveries in Neuroscience,” is a wide-ranging collaboration among computer scientists and neuroscientists, and academic and corporate research institutions, including Google and the Kasthuri Lab at the University of Chicago.

It is part of a small group of projects supported by the ALCF’s Aurora Early Science Program (ESP) working to prepare codes for the architecture and scale of the facility’s future exascale supercomputer, Aurora.

And this is the kind of research that was virtually impossible until the advent of ultra-high-resolution imaging techniques and more powerful supercomputing resources. These technologies enable finer resolution of microscopic anatomy and the ability to wrangle the sheer size of the data, respectively.

Only the computing power of an Aurora, an exascale machine capable of performing a billion billion calculations per second, will meet the near-term challenges of brain mapping.

Without that power, Ferrier and her team are currently working on smaller brain samples, some of them just one cubic millimeter. Even that small mass of neurological matter can generate a petabyte of data, equivalent to roughly one-twentieth of the information stored in the Library of Congress, by one estimate.

And in an attempt to someday map a mouse’s entire brain, roughly one cubic centimeter in volume, the amount of data at a reasonable resolution would increase a thousand-fold, Ferrier noted.
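
To make that scaling concrete, here is a back-of-the-envelope calculation in Python based only on the figures quoted above; it is an illustration, not the team’s own estimate:

```python
# Illustrative arithmetic only: actual data volumes depend on imaging
# resolution, staining and compression.
sample_volume_mm3 = 1.0           # one cubic millimeter sample
sample_data_pb = 1.0              # ~1 petabyte of image data per mm^3

mouse_brain_volume_mm3 = 1_000.0  # ~1 cubic centimeter = 1,000 mm^3

scale = mouse_brain_volume_mm3 / sample_volume_mm3
mouse_brain_data_pb = sample_data_pb * scale

print(f"{scale:.0f}x more data")               # 1000x
print(f"~{mouse_brain_data_pb:.0f} PB total")  # ~1,000 PB, about an exabyte
```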

“If we don’t improve on today’s technology, the computing time for an entire mouse brain would be something like 1,000,000 days of work on current supercomputers,” she said. “Using all of Aurora, if everything worked well, it might take 1,000 days.”

“Thus, the problem of reconstructing a brain connectome demands exascale resources and beyond,” she added.

Working primarily with mouse brain samples, Ferrier’s ESP team is developing a computational pipeline to analyze the data obtained from a complicated process of staining, slicing and imaging.

The process begins with samples of brain tissue that are stained with heavy metals to provide visual contrast and then sliced extremely thin with a precision cutting instrument called an ultramicrotome. Those slices are mounted for imaging with Argonne’s massive-data-producing electron microscope, generating a collection of smaller images, or tiles.

“The resulting tiles must be digitally reassembled, or stitched together, to reconstruct the slice. And each of those slices must be stacked and aligned properly to reproduce the 3D volume. At that point, the neurons are traced through the 3D volume by a process known as segmentation to identify neuron shape and synaptic connectivity,” Ferrier explained.
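
As a rough sketch of what that stitch-stack-align stage might look like, the following Python uses NumPy with deliberately simplified logic; the function names and the plain cross-correlation alignment are illustrative assumptions, not the team’s actual code:

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Reassemble a slice from a row-major grid of microscope tiles.

    Hypothetical simplification: real stitching estimates tile overlap
    and blends seams; here the tiles are simply abutted on a grid.
    """
    rows, cols = grid_shape
    tile_h, tile_w = tiles[0].shape
    slice_img = np.zeros((rows * tile_h, cols * tile_w), dtype=tiles[0].dtype)
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)
        slice_img[r*tile_h:(r+1)*tile_h, c*tile_w:(c+1)*tile_w] = tile
    return slice_img

def align_pair(fixed, moving):
    """Coarse translational alignment via FFT cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(fixed) * np.conj(np.fft.fft2(moving)))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    return np.roll(moving, shift=(dy, dx), axis=(0, 1))

def build_volume(slices):
    """Stack aligned slices into a 3D volume ready for segmentation."""
    aligned = [slices[0]]
    for s in slices[1:]:
        aligned.append(align_pair(aligned[-1], s))
    return np.stack(aligned, axis=0)
```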

This segmentation step is based on an artificial intelligence technique called a convolutional neural network; in this case, a type of network developed by Google for the reconstruction of neural circuits from electron microscopy images of the brain. Although it has demonstrated better performance than previous approaches, the technique also comes with a high computational cost when applied to large volumes.
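
For a sense of the general shape of such a model, here is a toy 3D convolutional network in PyTorch. It is purely illustrative: Google’s actual connectomics approach, flood-filling networks, uses recurrent, seeded inference, and the layer sizes below are arbitrary assumptions:

```python
import torch
import torch.nn as nn

class ToySegmentationNet(nn.Module):
    """Toy 3D CNN mapping an EM image volume to per-voxel class scores.

    Illustrative only; not the flood-filling network architecture.
    """
    def __init__(self, in_channels=1, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # 1x1x1 convolution produces a class score per voxel
        self.classifier = nn.Conv3d(32, num_classes, kernel_size=1)

    def forward(self, x):
        return self.classifier(self.features(x))

# A single-channel 64^3 sub-volume, as might be cropped from a slice stack
volume = torch.randn(1, 1, 64, 64, 64)
logits = ToySegmentationNet()(volume)   # shape: (1, 2, 64, 64, 64)
```

The computational cost the article mentions follows from exactly this structure: every voxel of a petabyte-scale volume must pass through many convolutions, so the work grows with the volume imaged.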

“With the larger samples expected over the next decade, such as the mouse brain, it is essential that we prepare all of the computational tasks for the Aurora architecture and be able to scale them efficiently across its many nodes. This is a key part of the work we are undertaking in the ESP project,” said Tom Uram, an ALCF computer scientist working with Ferrier.

The team has already scaled parts of this process to thousands of nodes on the ALCF’s Theta supercomputer.

“Using supercomputers for this work demands efficiency at every scale, from distributing large data sets across the compute nodes, to running algorithms on the individual nodes with high-bandwidth communication, to writing the final results to the parallel file system,” Ferrier said.
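
A minimal mpi4py sketch can convey that distribute-compute-collect pattern; the tile paths and the process_subvolume function below are hypothetical stand-ins, not the project’s pipeline:

```python
# Sketch of scatter-process-gather across MPI ranks; tile_paths and
# process_subvolume are hypothetical stand-ins for the real data and
# algorithms.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank claims a round-robin share of the input tiles
tile_paths = [f"tiles/slice_{i:05d}.dat" for i in range(100_000)]
my_tiles = tile_paths[rank::size]

def process_subvolume(path):
    # Placeholder for stitching/alignment/segmentation on one tile
    return len(path)

local_results = [process_subvolume(p) for p in my_tiles]

# Gather lightweight summaries on rank 0; bulk outputs would instead be
# written collectively to the parallel file system to avoid a bottleneck
all_results = comm.gather(local_results, root=0)
if rank == 0:
    total = sum(len(r) for r in all_results)
    print(f"Processed {total} tiles across {size} ranks")
```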

At that point, she added, the large-scale analysis of the results can truly begin to probe questions about what emerges from the neurons and their connectivity.

Ferrier also believes that her team’s preparations for exascale will benefit other users of the system. For example, the algorithms they are developing for their electron microscopy data will find application with X-ray data, especially with the upcoming upgrade of Argonne’s Advanced Photon Source (APS), a DOE Office of Science user facility.

“We have evaluated these algorithms on X-ray data and found early success. And the APS upgrade will allow us to see finer structure,” Ferrier noted. “So, I foresee that some of the methods we have developed will be useful beyond this specific project.”

With the right tools in place and exascale computing at hand, the development and analysis of large-scale, precision connectomes will help researchers fill the gaps in some age-old questions.

The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines. Supported by the US Department of Energy’s (DOE) Office of Science, Advanced Scientific Computing Research (ASCR) program, the ALCF is one of two DOE Leadership Computing Facilities in the nation dedicated to open science.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance US scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC, for the US Department of Energy’s Office of Science.

The Office of Science of the US Department of Energy is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.
