From upgrading the Global Forecast System (GFS) to acquiring new supercomputers, the National Oceanic and Atmospheric Administration (NOAA) has been making big moves in the HPC sphere over the last few years—but now it’s setting the bar even higher. In a new report, NOAA’s Science Advisory Board (SAB) highlighted several dozen priorities for weather research over the next decade. Among them: expanding NOAA’s HPC capacity a hundred times over by 2031.
“Weather forecasts produced by the NWS over the years have saved thousands of lives and provided billions of dollars in economic benefits,” the document reads. “However, the United States does not currently have the best possible weather forecast capabilities, in part because its numerical weather modeling portfolio does not represent the best the science can achieve.” By way of example, the authors show that NOAA’s GFS currently “lags the models of two to three other forecast centers” (those of the European Centre for Medium-Range Weather Forecasts [ECMWF], which holds the top spot; the UK’s Met Office; and the Canadian Meteorological Center, with which the GFS is roughly tied).
A comparison of the five-day forecast skill of popular global weather forecasting models. The United States’ GFS (black line) trails Europe’s ECMWF model (red line) and the UK’s Met Office model (orange line) and more or less paces alongside the Canadian Meteorological Center model. Image courtesy of NOAA.
“This indicates that not only are we under-serving the American public but also that the United States has the potential to provide more accurate and reliable weather information,” the authors argue. “The public benefits of NOAA regaining a leadership role would be increased forecast accuracy, longer lead times, and finer-scale detail for severe weather, flooding and hurricanes.”
To that end, they identify three key priorities: remedying observation gaps in existing networks; more comprehensive modeling of the entire Earth system; and “major investments … in computing resources[.]” These computing investments, they say, should include “cloud computing, next-generation computers, storage and bandwidth, especially for research computing, but also for the operational implementation of more comprehensive models[.]”
“Without increased computing resources,” they write, “none of the recommended new models and data assimilation that improve the forecasts will be able to run on time. … Improvements in weather forecasts are directly limited by the availability of sufficient computing resources to develop, test and operate next-generation forecasting technologies.”
As to the hundred-fold increase, it’s not arbitrary—NOAA did its homework on that, too.
“From an operational NWP perspective, a four-fold increase in model resolution in the next ten years”—which the report points out would be sufficient for reaching major NWP capability milestones—“requires on the order of 100 times the current operational computing capacity. Such an increase would imply NOAA needs a few exaflops of operational computing by 2031. … To achieve a 3:1 ratio of research to operational HPC, NOAA will need an additional five to ten exaflops of weather research and development computing by 2031.”
Vis-à-vis the agency’s current portfolio: last we heard, NOAA was set to triple its operational weather and climate supercomputing capacity by early 2022 with the introduction of two 12-petaflops HPE Cray systems, reaching an aggregate 40 petaflops of HPC power once its 16 petaflops of research and development supercomputing (spread across various smaller systems) are taken into account. A hundred-fold increase over those 24 petaflops of operational weather and climate supercomputing would yield 2.4 exaflops, and the desired 3:1 research-to-operational ratio would add another 7.2 exaflops, for a total of some 9.6 peak exaflops of desired capacity by 2031.
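For readers who want to trace that arithmetic, here is a minimal back-of-the-envelope sketch in Python. It assumes the 24-petaflops operational and 16-petaflops research figures cited above; the variable names and the petaflops-to-exaflops conversion are illustrative, not anything specified in the report.

```python
# Back-of-the-envelope projection of NOAA's 2031 HPC targets,
# using the capacity figures cited in this article (assumed, not official).

PFLOPS_PER_EFLOPS = 1_000  # 1 exaflops = 1,000 petaflops

current_operational_pf = 24  # two 12-petaflops HPE Cray systems
current_research_pf = 16     # R&D capacity spread across smaller systems
current_total_pf = current_operational_pf + current_research_pf  # ~40 PF aggregate

# The SAB report calls for roughly 100x operational capacity by 2031...
target_operational_pf = current_operational_pf * 100

# ...and a 3:1 ratio of research to operational HPC on top of that.
target_research_pf = target_operational_pf * 3

target_total_pf = target_operational_pf + target_research_pf

print(f"Current aggregate:   {current_total_pf} petaflops")
print(f"Operational target:  {target_operational_pf / PFLOPS_PER_EFLOPS:.1f} exaflops")  # 2.4
print(f"Research target:     {target_research_pf / PFLOPS_PER_EFLOPS:.1f} exaflops")     # 7.2
print(f"Combined target:     {target_total_pf / PFLOPS_PER_EFLOPS:.1f} exaflops")        # 9.6
```

The 7.2-exaflops research figure falls within the “five to ten exaflops” range the report itself cites for research and development computing.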
These numbers might sound exorbitant on the face of things, with the U.S.’ first exascale system not even fully deployed yet, but NOAA argues that they represent a small—but crucial—share of the HPC pie. “[It] is likely that … national HPC laboratories will approach 100 exaflops by 2031,” the report reads. “Because HPC resources are essential to achieving the outcomes discussed in this report, it is reasonable for NOAA to aspire to a few percent of the computing capacity of these other national labs at a minimum.”
Beyond capacity growth, NOAA insists that it must become better prepared to adopt and apply new technologies like cloud computing, AI, GPUs and quantum computing through aggressive investments both in those technologies and in the trained workforce needed to operate them. “NOAA is insufficiently prepared to leverage these new computing technologies from both an application and modeling, and workforce perspective,” the authors write, “and as a result, will be inhibited in its ability to advance weather forecasting in the coming decades unless it becomes a more proactive and not reactive adopter of new computing technologies.”
Speaking to the current reactivity, the board highlights a “lack of long-term (decadal) and sustained Congressional commitments to advance NOAA’s computing portfolio” that “inhibits NOAA’s ability to be more proactive in developing next-generation HPC strategies, expertise and applications.” The answer, the report reiterates: “NOAA must immediately invest in long-term programs to convert, prepare for and leverage new and emerging high-performance computing architectures such as cloud, GPUs, exascale and quantum. … HPC must be an immediate and ongoing investment.”
“Without sufficient HPC investments,” the authors conclude, “the loss of potential advancements is tremendous and cannot be overstated.”
To read the full report, click here.