Demand Outpacing Supply for Supercomputers

By Yasmin Tadjdeh

There is a growing demand for the computational power that supercomputers offer researchers in the United States, but access to them is being stifled by a number of factors, according to a new report by the Center for Data Innovation.

Supercomputers are a subset of high-performance computing, or HPC, which refers to systems that can solve difficult computational problems, according to the report, “How the United States Can Increase Access to Supercomputing.” They can be harnessed for a number of different research areas and are particularly important in the development of artificial intelligence systems.

Japan’s Fugaku system is considered the No. 1 supercomputer in the world, and U.S.-based Oak Ridge National Laboratory’s Summit the No. 2.

Demand for the platforms in the United States is growing rapidly, but the government is not investing enough money in the technology, said Hodan Omaar, a policy analyst at the Center for Data Innovation and author of the report.

“There isn’t enough funding in HPC to support the acquisition of systems and software that can support AI researchers,” she said during an online event in December. This inhibits the ability of AI researchers to develop new products that are vital to maintaining U.S. competitiveness, the report said.

In the United States, funding for high-performance computing comes from both the Department of Energy — which oversees 17 national labs across the country — and the National Science Foundation, she noted.

The Energy Department typically invests in the most powerful systems available, but those often only support a small number of researchers, Omaar said. The NSF is responsible for systems that are not quite as powerful but are used by the majority of researchers.

The Energy Department has increased its investment in large-scale HPC resources over the last decade by about 90 percent, from $277 million in 2010 to $538 million in 2019 in constant 2010 dollars, according to the report. Meanwhile, the NSF has decreased its funding by 50 percent, from $325 million in 2010 to $167 million in 2019.
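The two trends can be verified with quick arithmetic from the dollar figures above (a back-of-the-envelope check on the article's numbers, not a calculation from the report itself):

```python
# Funding figures cited in the article, in millions of constant 2010 dollars.
doe_2010, doe_2019 = 277, 538
nsf_2010, nsf_2019 = 325, 167

doe_change = (doe_2019 - doe_2010) / doe_2010 * 100  # percent change for DOE
nsf_change = (nsf_2019 - nsf_2010) / nsf_2010 * 100  # percent change for NSF

print(f"DOE change: {doe_change:+.0f}%")  # consistent with the ~90% increase cited
print(f"NSF change: {nsf_change:+.0f}%")  # consistent with the ~50% decrease cited
```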

“This discrepancy has led to a U.S. HPC portfolio weighted toward very powerful systems that can only support a smaller number of researchers,” the report said. “However, both funding sources fail to meet current demand.”

Both the Energy Department and the National Science Foundation are meeting only about a third of the demand for HPC systems, Omaar said. The report measured demand in what it called service units, which are essentially units of compute time.
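The service-unit measure can be illustrated with a hypothetical example. The quantities below are invented for illustration, and a service unit is treated as one core-hour, a common (but not universal) convention at U.S. computing centers:

```python
# Hypothetical service-unit (compute-time) accounting, for illustration only.
requested_sus = 900_000_000  # SUs researchers request in a year (invented figure)
allocated_sus = 300_000_000  # SUs the agencies can actually grant (invented figure)

fulfillment = allocated_sus / requested_sus
unmet_sus = requested_sus - allocated_sus

print(f"Share of demand met: {fulfillment:.0%}")  # about a third, per the report
print(f"Unmet demand: {unmet_sus:,} SUs")
```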

This gap is “likely to keep growing unless something is done,” she said.

The study calls for Congress to increase funding for supercomputing to $10 billion over the next five years by increasing total National Science Foundation funding in HPC infrastructure to at least $500 million per year and Energy Department funding to $1.5 billion per year.
Omaar noted that if policymakers balk at spending such a sum, they need only look at the investments that other nations are making. For example, the European Union announced in September that it would invest about 8 billion euros — or $9.7 billion — in supercomputing in the coming years.

Countries such as China and Japan are also making significant investments, she noted.

However, increasing funding is only one piece of the puzzle, Omaar said. Another critical need is determining which states and institutions should receive funding.

“Some states already have more access to HPC than others,” she said. To figure out where resources are needed most, policymakers should measure how states are currently using the computers for AI research.

According to the report, the government should be investing in states where there are low levels of HPC availability but evidence that institutions within them are conducting high levels of AI research.

“This will allow the government to address instances wherein the gap between demand and supply is greatest,” the report said.

Such states include Alabama, Indiana, Utah, Georgia, Florida, Texas and Hawaii, Omaar said.

However, even if the government gets the funding and locations right, there is still another large obstacle to overcome in increasing access to HPC: who gets to use it, she said. Some groups are underrepresented, meaning not everyone has an equal opportunity to become part of the next generation of AI researchers.

For example, women make up only 17 percent of the HPC workforce, she noted. Black, Hispanic and Indigenous people are also underrepresented in high-performance computing.

“A significant portion of these researchers are at minority-serving institutions and yet these institutions lack access to resources,” Omaar said.

To address this, both the National Science Foundation and Energy Department should establish partnerships that coordinate the sharing of computing resources with minority-serving institutions, which include Historically Black Colleges and Universities, Hispanic-serving institutions and tribal colleges and universities, the report said.

HPC systems are a critical tool for enabling AI applications for national security purposes, the study noted.

For example, the military is using high-performance computing systems at the Navy DoD Supercomputing Resource Center in Mississippi.

The Pentagon is investing in a new AI supercomputer for the center along with two similar systems for the Army and Air Force.

Additionally, the government uses supercomputers to test nuclear weapons through modeling and simulation.

“The nature of warfare is rapidly changing to one wherein weapons systems and warfighters need to have an algorithmic and informational advantage to outmaneuver adversaries,” the report said.
