supercomputers Archives | FedScoop https://fedscoop.com/tag/supercomputers/ FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

One threat facing the world’s fastest supercomputer? Raccoons https://fedscoop.com/oak-ridge-supercomputers-raccoons-possums/ Fri, 06 Oct 2023 16:15:45 +0000 At times, wildlife have gotten into infrastructure used to provide power to the supercomputers at the Oak Ridge National Lab.

The post One threat facing the world’s fastest supercomputer? Raccoons appeared first on FedScoop.

At the Oak Ridge National Laboratory, an unlikely threat faces two of the world’s fastest supercomputers: raccoons and possums. 

The national lab, which is famous for its role in the Manhattan Project, sits within a rural Tennessee campus of just over 4,400 acres. That location is strategic and helps protect the facility, home to the Summit and Frontier supercomputers, the latter of which became the fastest supercomputer in the world last year. These computers are central to the Department of Energy’s research agenda, helping scientists across the world build advanced models and process large datasets. 

But nearby nature also means that wildlife is able to access infrastructure that supports the lab’s science operations. At times, animals can get into power lines or substations, creating a dip in voltage. That voltage dip can ultimately push a supercomputer offline, according to Bronson Messer, the director of science at the Oak Ridge Leadership Computing Facility. 

“There’s lots of raccoons and possums around here. And unfortunately, we’ve taken a few out,” Messer told FedScoop in a recent interview. “Power feed? I mean, it tastes good to wildlife. They bite into a 480-volt line and it’s not a good scene.” 

Animal-related outages have only happened two or three times, according to Messer. When an outage occurred this past August, animals only took a handful of computing nodes offline because of the redundancy built into the system. Still, small mammals prone to climbing, along with any animal looking to nibble on a power feed, could theoretically be at risk. 

While the problem is relatively small, the threat of wildlife is a reminder that physical security, and not just cybersecurity, remains a part of protecting digital infrastructure. At the same time, the lab says it’s also taking steps to combat this challenge. 

“In the past five years, we’ve made significant improvements in preventing interruptions from animal or tree-imposed voltage sags,” Messer said in an email to FedScoop. “For example, we added bands around power poles that keep animals from climbing them. We also improved fencing around substations. Finally, we have done some infrastructure cleanup to remove obvious risks from trees and limbs.”

Department of Energy launches ESnet6 high-performance data-sharing network https://fedscoop.com/doe-unveils-esnet6-network/ Tue, 11 Oct 2022 20:00:00 +0000 ESnet6 increases the Energy Science Network's bandwidth to more than 46 terabits per second for handling data from experiments, models and simulations.

The Department of Energy on Tuesday launched the next generation of its high-performance network for sharing data across its multi-billion-dollar research portfolio.

ESnet6 increases bandwidth to more than 46 terabits per second for processing, storing, visualizing, analyzing and sharing data from experiments, models and simulations between all of DOE’s national laboratories, tens of thousands of researchers, premier scientific instruments and supercomputing centers. It is the latest iteration of the Energy Science Network.
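To get a rough sense of what that bandwidth means in practice, the sketch below estimates how long one exabyte would take to move. The 46 Tbps figure is from this article; treating the full aggregate bandwidth as usable for a single end-to-end transfer is an idealization for illustration only.

```python
# Back-of-the-envelope: time to move one exabyte at ESnet6's stated
# 46 Tbps aggregate bandwidth. Real transfers would see less.
bits_per_exabyte = 10**18 * 8      # 1 EB expressed in bits
bandwidth_bps = 46 * 10**12        # 46 terabits per second
seconds = bits_per_exabyte / bandwidth_bps
print(round(seconds / 3600, 1))    # ~48.3 hours
```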

Lawrence Berkeley National Lab has maintained ESnet for 35 years, but complex instruments such as genome sequencers, telescope observatories, X-ray light sources and particle accelerators now produce ever-larger volumes of data: network traffic grows roughly tenfold every four years.

“As scientific instruments grow in complexity and supercomputers simulate scientific phenomena at higher resolutions, the science community is facing a growing challenge: data volumes that are increasing exponentially, coupled with the need to move, share and process this data globally and faster than ever before,” said Barbara Helland, associate director of the Office of Science’s Advanced Scientific Computing Research program, in the announcement. “With ESnet6, DOE researchers are equipped with the most sophisticated technology to help tackle the grand challenges we face today in areas like climate science, clean energy, semiconductor production, microelectronics, the discovery of quantum information science and more.”

ESnet6 launched ahead of schedule and boasts data transfers between 400 gigabits and a record 1 terabit per second, a new automation platform for customizing network services, a forthcoming application programming interface platform for scientists to request those services, high-precision telemetry to improve its performance, and improved cybersecurity.

Engineers developed the automation platform to handle current multi-petabyte dataflows and eventually support the exabyte data era by rapidly configuring new network paths for sharing massive datasets in under two minutes. ESnet already carried more than 1.1 exabytes of science data in 2021 in preparation for exascale computers like Frontier, which launched this year.
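Combining two figures from this article (1.1 exabytes carried in 2021, and traffic growing roughly tenfold every four years), a simple extrapolation gives a feel for the exabyte era ESnet6 was built for. This is purely illustrative arithmetic, not a DOE projection.

```python
def projected_traffic(base_exabytes: float, base_year: int, year: int) -> float:
    """Extrapolate annual traffic assuming 10x growth every four years."""
    return base_exabytes * 10 ** ((year - base_year) / 4)

# 1.1 EB carried in 2021 (figure from the article); the same growth
# rate implies roughly 11 EB per year by 2025.
print(round(projected_traffic(1.1, 2021, 2025), 1))  # 11.0
```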

“ESnet6 is particularly important obviously in this stage as we enter this new era where we will regularly be producing exabytes of data that must be moved seamlessly between facilities, collaborators, laboratories no matter where they may be,” said Asmeret Asefaw Berhe, director of the Office of Science, during the ESnet6 launch event at LBNL on Tuesday.

ESnet is already the exclusive network carrying data — 480 petabytes of it over the last decade — from the Large Hadron Collider in Geneva, Switzerland, to DOE’s experimental facilities. The world’s largest instrument is receiving upgrades expected to increase its data output tenfold, and ESnet6 was designed with that surge in mind.

Lumen Technologies provided DOE with data center space and dark fiber connections serving as the backbone for its 15,000-mile fiber-optic network. Companies like Ciena, Infinera, Nokia and AMD helped DOE navigate supply chain delays by offering other enabling technologies like optical switching and routing platforms and an extreme scale packet monitoring system.

ESnet6 “pushed the envelope” of commercially available resources, said Vint Cerf, vice president and chief internet evangelist at Google, during the launch event.

“It has the capability to do much more refined resource allocation and, I would say, the configuration of its resources to support unusual — either very high speed or certain timely — demands and the ability to adapt quickly at need,” Cerf said.

Google shares DOE’s zeal for ESnet6’s high speeds, he added.

DOE increasingly relies on ESnet’s virtual integration of scientific experimental facilities, supercomputers and global science teams to make research discoveries — including exascale research into wind energy and earthquake simulations.

“ESnet6 represents a transformational change in the way networks are built for research with improved capacity, resiliency and flexibility,” said Inder Monga, executive director of ESnet, in a statement. “Together, these new capabilities make it faster, easier and more efficient for scientists around the world to conduct and collaborate on ground-breaking research.”

NOAA unveils 2 weather and climate supercomputers for improved forecasting https://fedscoop.com/noaa-unveils-2-supercomputers/ Tue, 28 Jun 2022 22:16:23 +0000 The new computing power will allow the service to get further ahead of catastrophic weather events like hurricanes.

The National Oceanic and Atmospheric Administration’s two new weather and climate supercomputers, expected to improve forecasts and warnings protecting life and property, became operational at 8 a.m. Eastern time Tuesday.

Dogwood in Manassas, Virginia, and Cactus in Phoenix are configured identically: operating three times faster than their predecessors at 12.1 petaflops and boasting double the storage at 26 petabytes each.
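Working backward from those figures (assuming "three times faster" and "double the storage" are exact ratios, which the article does not guarantee) gives a rough picture of the machines they replaced:

```python
# Dogwood/Cactus specs as reported; predecessor figures are inferred,
# not stated in the article.
new_speed_pf = 12.1     # petaflops per machine
new_storage_pb = 26     # petabytes per machine

old_speed_pf = new_speed_pf / 3       # implied predecessor speed
old_storage_pb = new_storage_pb / 2   # implied predecessor storage
print(round(old_speed_pf, 1), old_storage_pb)  # 4.0 13.0
```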

NOAA awarded General Dynamics IT the first, $150 million task order on the $505.2 million Weather and Climate Operational Supercomputing System (WCOSS) contract in February 2020 with the goal of improving models guiding forecasts.

“As forecasts become more accurate, and weather and climate events become more extreme, the public needs more detailed forecast information further in advance,” Ken Graham, director of the National Weather Service, said on a press call Tuesday. “And this takes more advanced computing.”

The twin Hewlett Packard Enterprise Cray supercomputers, ranked 49th and 50th fastest in the world by the TOP500 project, will provide NOAA’s NWS with:

  • high-resolution models that better capture small-scale features like severe thunderstorms,
  • more realistic model physics that better represent the formation of clouds and precipitation,
  • a larger number of individual model simulations to better quantify confidence in results, and
  • improved use of billions of weather observations to produce better forecasts.

Dogwood and Cactus will further pave the way for fall upgrades to the Global Forecast System (GFS), air quality models, and ocean-going and Great Lakes wave prediction systems.

NOAA is moving from deterministic models to ensemble-based systems that couple atmosphere and oceans, giving forecasters the ability to assess the probability something might happen.

“Being able to provide probabilistic information to the public, through the use of ensemble-based modeling systems, is going to be a very exciting change coming up,” said Brian Gross, director of NOAA’s Environmental Modeling Center.

NOAA further plans to launch a new hurricane forecast model, the Hurricane Analysis and Forecast System (HAFS), ahead of the 2023 hurricane season — pending tests and evaluations. HAFS replaces two legacy systems, and will predict the track and intensity of tropical cyclones.

NWS will be able to extend hurricane forecasts to seven days, Graham said. 

Dogwood and Cactus’ predecessors were located in Reston, Virginia, and Orlando, Florida — a problem if a single catastrophic weather event hitting the East Coast were to take both offline. That’s why the new supercomputers are hosted on opposite sides of the country.

WCOSS is an eight-year base contract with a two-year renewal option. While its predecessor locked in performance enhancements up front — requiring IBM to guarantee price performance 10 years out — NOAA only required GDIT to propose the first task order for this contract.

“I’m actually really excited about that because it leaves us open to be able to look at what are experiences on the existing system, where we need to make improvements in balance in the computing system, or as different technologies evolve we can take advantage of those and not be strapped to that initial price performance guess,” said Dave Michaud, director of NWS’s Office of Central Processing.

NOAA anticipates the second phase task order, covering the last five years of the contract, will be awarded in the 2024-25 timeframe.

The agency will work with GDIT to identify industry trends and incorporate those, along with any new computing requirements, into the next phase of WCOSS.

“We’ve actually left that wide open,” Michaud said.

Aging infrastructure the ‘single, greatest threat’ to NASA missions and technology https://fedscoop.com/nasa-aging-infrastructure-funding/ Thu, 29 Jul 2021 20:05:20 +0000 Artemis Program investments are needed to maintain the agency's technical capabilities, according to one NASA official.

NASA infrastructure should be part of the wider effort to fund federal research and development infrastructure, said Rep. Eddie Bernice Johnson, D-Texas, during a House Science Subcommittee on Space and Aeronautics hearing Thursday.

The chair of the full committee said NASA‘s infrastructure needs include one of the nation’s most powerful supercomputers, utility and access systems across nine centers and other research and test facilities, wind tunnels for developing subsonic and hypersonic aircraft, and clean rooms and vacuum chambers for building sensitive interplanetary spacecraft.

NASA Administrator Bill Nelson has previously said the agency’s full list of infrastructure needs is more than $5.4 billion, which includes $2.6 billion in deferred maintenance — roughly 7% of its $39 billion asset value.

“NASA’s infrastructure represents the single, greatest threat to mission success,” said Robert Gibbs, associate administrator for NASA’s Mission Support Directorate. “Practically 82% of our facilities are beyond their designed life.”

Annual maintenance requirements increase every year and exceed NASA’s resources, which will begin jeopardizing efforts to return to the moon, establish a permanent lunar base and reach Mars, understand the impacts of climate change, and make engineering breakthroughs in the “near future,” Gibbs added.

Many of NASA’s buildings and laboratories date back to the National Advisory Committee for Aeronautics and Mercury, Gemini and Apollo projects, and Artemis Program investments are needed to maintain technical capabilities.

In addition to the hypersonic tunnel at Langley Research Center with high-speed air travel implications, NASA wants increased funding for its new robotics lab within the Jet Propulsion Laboratory benefitting Mars exploration and its recently opened Health and Human Performance Laboratory at Johnson Space Center studying the impacts of space travel.

JSC also runs NASA’s space sample return program, which hasn’t yet sited a new facility for processing samples returned from Mars. Missions like that are consistently prioritized over NASA’s deferred maintenance backlog across 5,000-plus buildings and structures.

“The reality is I’ve had to delay, defer, descope 47 [construction] projects over the last two budget cycles because I just didn’t have the resources,” Gibbs said.

The House Science Committee hadn’t examined NASA’s infrastructure since 2013, when the deferred maintenance backlog was $2.1 billion, and the agency has received less funding than it requested in 10 of the last 12 budget cycles.

Rep. Brian Babin, R-Texas, said NASA’s infrastructure funding request needs to be formalized, rather than remaining an “off-budget wishlist.”

NASA’s fiscal 2022 request represents a 6.3% increase on last year with an additional $3 billion for safety, security and mission services that include maintenance and operations. But the agency’s request for construction, environmental compliance and remediation represents a 9% reduction.

“If NASA’s facilities and infrastructure are in need, they should be appropriately prioritized in the agency’s budget request,” Babin said.

Gibbs said mission, safety and health remained higher priorities in the budgeting process, but there’s an opportunity to invest in state-of-the-art facilities and demolition of outdated ones to shrink NASA’s footprint.

Exactly how lawmakers plan to approach the problem remains up in the air.

“While the path forward in Congress might not yet be totally clear, my commitment to addressing our R&D infrastructure needs is steadfast,” Johnson said. “Science, research and innovation are our future.”

Energy awards $28M to 5 supercomputing projects https://fedscoop.com/doe-28-million-supercomputing-projects/ Mon, 19 Jul 2021 20:10:37 +0000 Winners will research quantum information science and chemical reactions with clean energy applications.

The Department of Energy will give $28 million to five research projects developing software for its supercomputers, the Scientific Discovery Through Advanced Computing (SciDAC) program announced Friday.

The projects DOE selected will develop computational methods, algorithms and software benefitting research into quantum information science and chemical reactions with clean energy applications.

SciDAC brings together interdisciplinary groups of experts to make use of DOE’s high-performance computing resources, and the five teams will partner with one or both of its institutes, FASTMath and RAPIDS2, out of the Lawrence Berkeley and Argonne national laboratories.

“DOE’s national labs are home to some of the world’s fastest supercomputers, and with more advanced software programs we can fully harness the power of these supercomputers to make breakthrough discoveries and solve the world’s hardest to crack problems,” said Secretary of Energy Jennifer Granholm in an announcement. “These investments will help sustain U.S. leadership in science, accelerate basic research in energy, and advance solutions to the nation’s clean energy priorities.”

The five awardees are:

  • California Institute of Technology for its project on traversing the “death valley” separating short and long times in non-equilibrium quantum dynamical simulations of real materials;
  • Florida State University for its project on relativistic quantum dynamics in the non-equilibrium regime;
  • Lawrence Berkeley National Lab for its project on large-scale algorithms and software for modeling chemical reactivity in complex systems;
  • University of California-Santa Barbara for its project on real-time dynamics of driven correlated electrons in quantum materials; and
  • University of California-Riverside for its Data-driven Exascale Control of Optically Driven Excitations (DECODE) project dealing with chemical and material systems.

The projects were chosen through a competitive, peer review process under a DOE Funding Opportunity Announcement open to universities, national labs and other research organizations. DOE has yet to negotiate final project details for the awardees, but $7 million of the total funding has been allocated for fiscal 2021, contingent upon congressional appropriations.

Biden administration proposes record $171.3B for R&D across civilian agencies https://fedscoop.com/biden-2022-budget-rd-spending/ Fri, 28 May 2021 19:34:19 +0000 The proposed sum would come alongside extra funding for research included in the American Jobs Plan.

The Biden administration proposed the largest-ever increase in non-defense research and development spending as it looks to out-compete China in emerging technologies with its first budget released Friday.

Officials likened the $171.3 billion proposed for R&D across more than 20 federal agencies — which represents a 9% increase from prior-year proposals — to “space race”-era spending. It comes in addition to American Jobs Plan investments of $50 billion for the National Science Foundation, $40 billion for laboratory upgrades across the country, and $30 billion for general innovation and job creation.

Historic increases in foundational R&D at scientific agencies like NSF, NASA, the Department of Energy, and the National Institute of Standards and Technology accompany growing concern that the U.S. has lost its global edge in the development of key technology such as satellites and supercomputers.

“[T]he nation is falling behind its biggest competitors in research and development, manufacturing and training,” reads the budget. “It has never been more important to invest in strengthening the nation’s infrastructure and competitiveness, and in creating the good-paying, union jobs of the future.”

Officials cited “overly restrictive” budget caps during the last decade for diminishing the country’s global advantage in emerging technologies.

The fiscal 2022 budget proposes a $7.7 billion increase to the Department of Health and Human Services‘ R&D budget, bringing the total to $51.2 billion, an 18% jump.

Of that figure, $8.7 billion would go to the Centers for Disease Control and Prevention, its largest increase in two decades, for improving disease surveillance — notably by modernizing public health data collection nationwide. A separate $1 billion is proposed for foreign assistance in crosscutting research and viral discovery programs to detect outbreaks that could lead to pandemics.

An additional $6.5 billion is proposed for the creation of the Advanced Research Projects Agency for Health, which would be modeled after the Defense Advanced Research Projects Agency that helped create the internet. ARPA-H would sit within the National Institutes of Health and research advances in cancer, diabetes and Alzheimer’s treatments.

If the proposals are enacted, the Department of Veterans Affairs would receive a $78 million R&D budget increase to $1.5 billion, $882 million of which would go toward medical and prosthetic, traumatic brain injury, and toxic exposure effects research benefitting disabled veterans.

Another big winner in the Biden budget is the Department of Energy, which would receive a $2.1 billion, or 11%, increase to its R&D budget — bringing the total to $21.5 billion. A portion of that would go toward upgrading the National Laboratories seeking climate and clean energy breakthroughs, as well as competing with China to develop the first exascale supercomputers.

The Biden administration is prioritizing climate R&D with $36 billion in discretionary climate investments, including a proposed $10 billion, a 30% increase, for non-defense clean energy innovation. The goal is to achieve a net-zero carbon economy by 2050.

Another $7 billion, representing a $1.5 billion increase, would go to the National Oceanic and Atmospheric Administration for improving climate observation and forecasting and data provided to decision-makers. Meanwhile the Department of the Interior, NASA and NSF would receive another $4 billion to fund climate science.

The budget also proposes $1 billion for an Advanced Research Projects Agency for Climate and invests in the existing Advanced Research Projects Agency for Energy.

A smaller $600 million is proposed for electric vehicles and charging infrastructure at 18 agencies. The General Services Administration would see dedicated funds for other agencies, and the U.S. Postal Service would receive money for charging infrastructure.

Other R&D budget upticks of note would be a $1.3 billion increase at NASA to $14.6 billion, $765 million increase at NSF to $8.2 billion, and a $621 million increase at the Department of Commerce that houses NIST to $2.7 billion.

NASA would bolster everything from its next-generation satellite projects to its efforts to broaden diversity and inclusion in the science, technology, engineering and mathematics fields.

An additional $50 billion proposed for NSF in the American Jobs Plan would go, in part, toward creating a technology directorate that will collaborate and build on existing semiconductor, advanced computing and communications, energy, and biotechnology programs across government.

The request for NIST includes program increases for measurement research and services across advanced communications, climate and energy, trustworthy artificial intelligence, quantum information science, engineering, semiconductor metrology, and the bioeconomy.

Biden’s budget reemphasized his administration’s effort to foster scientific integrity and evidence-based decision-making across agencies.

“The administration’s commitment to evidence-based policymaking and program evaluation is reflected in the prioritization and design of the budget’s historic investments in addressing climate change, environmental justice, health security, and pandemic preparedness and will be equally central to implementing these initiatives,” reads the document. “Agencies’ learning agendas and annual evaluation plans should reflect their plans to build evidence in these and other priority areas.”

Another administration goal with the budget was to improve the country’s long-term finances while making critical investments in emerging technologies, said Shalanda Young, acting director of the Office of Management and Budget, on a call with reporters Friday morning.

Officials expect the budget’s corresponding American Jobs Plan and American Families Plan, as budgeted, to fully offset within 15 years.

“This is a very important and forward-looking budget,” said Cecilia Rouse, chair of the Council of Economic Advisors, on the same call. “The policies proposed are premised on the idea that to move forward as a country we need to invest in innovation, and the public sector is critical to building a robust and inclusive economy.”

Speaking to FedScoop, Grant Thornton Public Sector director Kelly Morrison said the increase in R&D spending suggests an increased commitment from the federal government to strength-testing projects in their early stages.

“I think it’s a recognition that more needs to be done up front to prove out value and fail fast so that we’re not spending millions of dollars on initiatives that are behind schedule and don’t produce the value intended.

“In order to fix that symptom, more money needs to be invested in that R&D phase,” she said.

Next Generation Computing Act proposed to speed U.S. supercomputer development https://fedscoop.com/beyond-exascale-computing-program-bill/ Mon, 24 May 2021 21:42:48 +0000 The Beyond Exascale Computing Program would provide support for a new generation of supercomputers.

Rep. Jay Obernolte, R-Calif., has introduced legislation to bolster one of two high-priority, advanced scientific computing programs at the Department of Energy (DOE).

The Next Generation Computing Research and Development Act would create the Beyond Exascale Computing Program for developing systems with capabilities that exceed those of the fastest supercomputers in the U.S., set to start arriving at the National Laboratories later this year.

The proposed legislation was introduced last Thursday in the House of Representatives, and comes as U.S. lawmakers devote increasing attention to the country’s supercomputing arms race with China. The DOE is targeting the launch of a new exascale computing platform, which is known as Frontier, in October this year.

Exascale refers to a computing system that can perform at least one exaflop – or one quintillion (a billion-billion) calculations per second.
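To put a quintillion in perspective, a quick illustration (the roughly 8-billion world population figure is an assumption added for scale, not from the article):

```python
# One exaflop = 10**18 floating-point operations per second.
EXAFLOP = 10**18

# If roughly 8 billion people each did one calculation per second,
# matching a single second of an exascale machine would take them years.
people = 8_000_000_000
years = EXAFLOP / people / (365 * 24 * 3600)
print(round(years, 1))  # ~4 years
```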

DOE has spent $460 million on its Exascale Computing Project to date. And the proposed successor program would refocus those efforts on new computing architectures, models and simulations, artificial intelligence, and quantum computing.

Commenting on the proposed legislation, Obernolte said: “The future of innovation lies in our ability to unlock new answers about the workings of our world.”

“Those answers will only come with the help of the next generation of supercomputers,” he added.

Obernolte’s bill would establish a special energy efficient computing program, where national labs partner with industry and academia to develop technology and applications that decrease supercomputers’ energy needs. Federal partners would be selected through a competitive process.

DOE would have a year from the bill’s passage to report back on the progress of both new programs.

The bill also requires an upgrade to a user facility designed for the secure transport of researchers’ data, as well as a workforce development program out of the Office of Advanced Scientific Computing Research. It would also create a computational science fellowship with a $21 million-a-year grant.

Argonne National Lab adds ‘AI supercomputer,’ boosting work of COVID-19 consortium https://fedscoop.com/argonne-national-lab-ai-supercomputer/ Thu, 14 May 2020 19:56:10 +0000 The NVIDIA DGX A100 cluster consists of 24 nodes for 120 petaflops of compute power, making it the fastest "AI supercomputer" at the consortium's disposal.

The COVID-19 High Performance Computing Consortium has added another supercomputer to its effort to speed the work of vaccine researchers, with the NVIDIA DGX A100 now operational at Argonne National Laboratory.

The Department of Energy’s Office of Science is developing a new generation of exascale computers at the Argonne, Oak Ridge and Lawrence Livermore national labs, and Argonne is the first DGX A100 buyer, NVIDIA announced Thursday. CEO Jensen Huang says the machine is designed for artificial intelligence applications, “for the end-to-end machine learning workflow — from data analytics to training to inference.”

The compute power “will help researchers explore treatments and vaccines and study the spread of the virus, enabling scientists to do years’ worth of AI-accelerated work in months or days,” said Rick Stevens, an associate lab director at Argonne, in the announcement.

Convened by the White House Office of Science and Technology Policy, the consortium of government, industry and academia already boasts the No. 1 and No. 2 fastest supercomputers in the world: Summit and Sierra.

The DGX A100, meanwhile, has the distinction of being the “fastest AI supercomputer” being used by the consortium, said Kimberly Powell, vice president of health care at NVIDIA, during a Wednesday news briefing. A single node costs $199,000 and delivers 5 petaflops of compute power, and Argonne’s cluster contains 24 nodes — several of which are operational — for 120 petaflops.

For comparison, Summit delivers 200 petaflops (previously about half of the consortium's total compute power), meaning it remains the consortium's fastest supercomputer overall.
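The aggregate figure follows directly from the per-node number. A quick back-of-the-envelope check (the constants are the article's; the variable names are ours):

```python
# Back-of-the-envelope check of the throughput figures quoted above.
NODE_PFLOPS = 5                        # peak AI petaflops per DGX A100 node
NODES = 24

cluster_pflops = NODE_PFLOPS * NODES
print(cluster_pflops)                  # 120 petaflops for Argonne's full cluster

SUMMIT_PFLOPS = 200
print(SUMMIT_PFLOPS > cluster_pflops)  # True: Summit alone still outpaces it
```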

Argonne received its system on May 6, began installation May 7 and had it operational May 9, according to a DOE spokesperson.

The DGX A100 can screen 1 billion small molecule drugs in a day, which is “unprecedented” as researchers look for the molecule capable of blocking the coronavirus from binding with cells, Powell said. The faster that drug is identified, the sooner it can be moved into experimentation and clinical trials.
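Spelled out as a rate, the screening figure works out to more than 11,000 molecules every second. A rough sketch, assuming uniform throughput (real screening pipelines are batched and bursty):

```python
# Rate implied by "1 billion small-molecule screens per day," assuming
# uniform throughput rather than real-world batched, bursty processing.
MOLECULES_PER_DAY = 1_000_000_000
SECONDS_PER_DAY = 24 * 60 * 60       # 86,400
rate = MOLECULES_PER_DAY / SECONDS_PER_DAY
print(round(rate))                   # -> 11574 molecules per second
```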

Supercomputers are also being used to assemble genomic data fragments to understand how COVID-19 affects the microbiomes of the lungs and digestive system, and the DGX A100 just helped NVIDIA set a genomics analysis speed record.

“We have just achieved the fastest gold standard in sequencing analysis, and we can sequence a whole genome in less than 20 minutes,” Powell said.

Sequencing the viral genome globally will show researchers how the coronavirus is migrating and mutating, a process that takes NVIDIA seven hours with the help of Oxford Nanopore Technologies’ handheld sequencer. Deployed worldwide, the devices send sequences to a GISAID database of about 25,000 viral genomes.

NVIDIA made its Clara Parabricks software package free to more than 650 COVID-19 researchers now sequencing entire populations.

The tech company also built its own in-house DGX A100 supercomputer, consisting of 50 nodes, to work with the COVID-19 ecosystem in a matter of three weeks.

“That is the beauty of the DGX architecture,” Powell said. “This is a data center-level supercomputer in a node.”

]]>
Air Force wants to turn its computers into brains for AI https://fedscoop.com/air-force-ai-computer-power-structure/ https://fedscoop.com/air-force-ai-computer-power-structure/#respond Fri, 01 May 2020 16:53:12 +0000 https://fedscoop.com/?p=36469 The Air Force Research Lab wants insights into how to configure computers to field AI. One idea is to model computer systems after the brain.

The post Air Force wants to turn its computers into brains for AI appeared first on FedScoop.

]]>
The Air Force is asking for industry input on how best to use and upgrade its compute power, including modeling it after the brain, to meet the needs of developing artificial intelligence.

The service is especially interested in how it can reconfigure its computational architecture to increase the efficiency of its systems. For algorithms to come to life in neural networks and other AI approaches, engineers need massive computing power to churn through data. That power could come from more efficient computing structures, and the Air Force wants to know how to build them.

“Unconventional computing architectures are necessary to achieve advanced and new capabilities,” according to the Broad Agency Announcement.

The Air Force Research Laboratory will review submitted white papers, offering between $1 million and $3 million for effective solutions.

One idea the Air Force is toying with to make better use of its computing systems is modeling them on “brain-inspired computing architectures.” Known as “neuromorphic,” these architectures date to the late 1980s, when researchers began designing analog circuits that imitate how synapses in the brain transmit information.

The lab’s intent is to develop innovative modular computing systems, potentially like neuromorphic computing, to meet increased future demand for data bandwidth. More data usually means more accurate AI.
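To make “neuromorphic” concrete, here is a minimal software sketch of the spiking-neuron behavior that such architectures implement as analog circuits. The model and every parameter are illustrative, not drawn from the AFRL announcement:

```python
# Minimal leaky integrate-and-fire (LIF) neuron -- the basic unit that
# neuromorphic architectures realize in hardware. All parameters here
# are illustrative.
def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Return a 0/1 spike train for a sequence of input currents."""
    v, spikes = 0.0, []
    for current in inputs:
        v = leak * v + current    # membrane potential decays, then integrates
        if v >= threshold:        # fire once the threshold is crossed...
            spikes.append(1)
            v = 0.0               # ...then reset, mimicking a biological neuron
        else:
            spikes.append(0)
    return spikes

print(lif_spikes([0.5, 0.5, 0.5, 0.0, 1.2]))  # -> [0, 0, 1, 0, 1]
```

Information is carried in the timing of discrete spikes rather than in continuous values, which is why hardware built this way can be far more power-efficient than conventional processors.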

This type of research has been championed by outside groups that have urged the department to invest in foundational research on AI. The white papers the lab is requesting will be used to improve the Defense Department’s future fielding of AI.

Air Force and other IT leaders have said that the department is in its first, “foundational” phase of collecting and governing data more effectively to turn that data into the fuel that AI algorithms can run on. So far, data in the department has been scattered and difficult to use properly without uniform cloud-based computing systems, the officials have said.

The call for white papers will stay open for years: the Air Force has set aside $99 million to fund submissions through September 2023.

“The overarching objective is to achieve orders of magnitude improvement in size, weight and power for deploying robust artificial intelligence and machine learning (AI/ML) capabilities in an embedded computing environment,” the BAA states.

]]>
New supercomputer will let nuclear security agency do ‘1.5 quintillion calculations per second’ https://fedscoop.com/nnsa-supercomputer-ai/ https://fedscoop.com/nnsa-supercomputer-ai/#respond Tue, 13 Aug 2019 15:01:29 +0000 https://fedscoop.com/?p=33383 Imagine a computer that can do 1,500,000,000,000,000,000 calculations every second. It's coming to the Lawrence Livermore National Lab in 2022.

The post New supercomputer will let nuclear security agency do ‘1.5 quintillion calculations per second’ appeared first on FedScoop.

]]>
The National Nuclear Security Administration is upping its computing power with the purchase of a new supercomputer it says will help drive the development of artificial intelligence to maintain the U.S. nuclear stockpile.

NNSA announced the $600 million deal Tuesday morning with Cray, a supercomputing company that regularly contracts with the government. The computer, dubbed El Capitan, is expected to arrive at the Lawrence Livermore National Laboratory in California in late 2022 and be up and running by 2023.

Cray said the supercomputer will have a peak performance of 1.5 exaFLOPS, or 1.5 quintillion calculations per second (a 15 followed by 17 zeros), and run some applications 50 times faster than Lawrence Livermore’s current system. All that added computing power is critical to the NNSA’s mission of maintaining the nation’s nuclear stockpile, NNSA Administrator Lisa Gordon-Hagerty said.
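The prefix arithmetic is easy to verify. A quick sketch checking only the unit conversion, not Cray's benchmark methodology:

```python
# Unit check: 1.5 exaFLOPS written out in calculations per second.
EXA = 10**18
el_capitan_flops = 15 * EXA // 10    # 1.5 exaFLOPS, kept as an exact integer
print(el_capitan_flops)              # -> 1500000000000000000
print(len(str(el_capitan_flops)))    # -> 19 digits: a 15 followed by 17 zeros
```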

“El Capitan will allow us to be more responsive, innovative and forward-thinking when it comes to maintaining a nuclear deterrent that is second to none in a rapidly-evolving threat environment,” Gordon-Hagerty said.

The enhanced capabilities will be especially significant in NNSA’s development and use of artificial intelligence and machine learning, said Bill Goldstein, Lawrence Livermore’s lab director. The lab will use the supercomputer to expand simulations that apply advanced statistical sampling to physical and chemical “uncertainties,” Goldstein said. The math behind the simulations is ideally suited to machine learning, he added.

The simulations run with the assistance of machine learning are dubbed “cognitive simulations,” Goldstein said. With the nuclear stockpile aging, and without real-world testing, NNSA will use cognitive simulations powered by the El Capitan system to better maintain the warheads and radioactive materials.
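The statistical sampling Goldstein describes can be sketched in miniature as Monte Carlo uncertainty propagation. The toy model and every number below are invented for illustration and have nothing to do with actual stockpile codes:

```python
import random

# Toy Monte Carlo uncertainty propagation -- the statistical-sampling idea
# behind "cognitive simulations," reduced to a miniature example: run a model
# many times with inputs drawn from an uncertain parameter's distribution.
def toy_model(param):
    return param ** 2                 # stand-in for an expensive physics code

random.seed(0)
samples = [toy_model(random.gauss(1.0, 0.1)) for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(f"mean response: {mean:.3f}")   # near 1.01, since E[X^2] = mu^2 + sigma^2
```

In practice, machine learning replaces the brute-force sampling by learning a cheap surrogate for the expensive model, which is where the speedup Goldstein describes comes from.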

“Machine learning can be a huge boost to our abilities,” Goldstein said.

The threat is growing in complexity and scale. China and Russia continue to field new nuclear weapons and capabilities, Gordon-Hagerty said. Securing nuclear deterrents is at the heart of the U.S. national security apparatus as it pivots from counterterrorism to defending against so-called great-power conflict.

“We really think that this is the next generation,” Goldstein said.

Last week, Cray announced a deal with the Air Force to provide supercomputing power for its weather predictions. That system will be housed at the Department of Energy’s Oak Ridge lab, with capabilities delivered as a service to the Air Force. The El Capitan system will be installed directly at the Lawrence Livermore lab and will be air-gapped after an initial period of general research testing.

]]>