Supercomputing Archives | FedScoop
https://fedscoop.com/tag/supercomputing/

How one national lab is getting its supercomputers ready for the AI age
https://fedscoop.com/how-one-national-lab-is-getting-its-supercomputers-ready-for-the-ai-age/ | Wed, 29 Nov 2023
As the Biden administration pushes to invest in AI, the Department of Energy is turning to its fleet of supercomputers, including Frontier, the world’s fastest.

OAK RIDGE, Tenn. — At Oak Ridge National Laboratory, the government-funded science research facility nestled between Tennessee’s Great Smoky Mountains and Cumberland Plateau that is perhaps best known for its role in the Manhattan Project, two supercomputers are currently rattling away, speedily making calculations meant to help tackle some of the biggest problems facing humanity.

You wouldn’t be able to tell from looking at them. A supercomputer called Summit mostly comprises hundreds of black cabinets filled with cords, flashing lights and powerful graphics processing units, or GPUs. Tens of thousands of spinning disks in the computer’s file systems, along with air-cooling technology for ancillary equipment, make the device sound somewhat like a wind turbine — and, at least to the naked eye, the contraption doesn’t look much different from any other corporate data center. Its next-door neighbor, Frontier, is set up in a similar manner across the hall, though it’s a little quieter and the cabinets have a different design.

Yet inside those arrays of cabinets are powerful specialty chips and components capable of, collectively, training some of the largest AI models known. Frontier is currently the world’s fastest supercomputer, and Summit is the world’s seventh-fastest supercomputer, according to rankings published earlier this month. Now, as the Biden administration boosts its focus on artificial intelligence and touts a new executive order for the technology, there’s growing interest in using these supercomputers to their full AI potential.

“The more computation you use, the better you do,” said Neil Thompson, a professor and the director of the FutureTech project at MIT’s Initiative on the Digital Economy. “There’s this incredibly predictive relationship between the amount of computing you use in an AI system and how well you can do.”

The Frontier supercomputer. Credit: Oak Ridge National Laboratory.

At the department level, the new executive order charges the Department of Energy with creating an office to coordinate AI development among the agency and its 17 national laboratories, including Oak Ridge. Critically, the order also calls on the DOE to use its computing and AI resources for foundation models that could support climate risk preparedness, national security and grid resilience, among other applications — which means increased focus on systems like Frontier and Summit. 

“The executive order provided us clear direction to, first of all, leverage our capabilities to make sure that we are making advances in AI, but we’re doing it in a trustworthy and secure way,” said Ceren Susut, the DOE’s associate director of science for Advanced Scientific Computing Research. “That includes our expertise accumulated in the DOE national labs, and the workforce, of course, but also the compute capabilities that we have.” 

The government’s AI specs

Supercomputers like Summit and Frontier can be measured by performance. Often, they’re measured in exaflops, defined as the ability to calculate a billion billion — no, that isn’t a typo — floating point operations per second. Frontier sits at 1.194 exaflops, while Summit’s figure is a little less impressive, at 148.60 petaflops. But they can also be measured by their number of GPUs: Summit has slightly more than 27,000, while Frontier has nearly 10,000 more.
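Those units are easy to sanity-check with a few lines of arithmetic. Here is a minimal sketch in Python, using only the figures quoted above:

```python
# Units: a petaflop is 1e15 floating-point operations per second;
# an exaflop is 1e18, i.e., a billion billion per second.
PETA = 1e15
EXA = 1e18

frontier = 1.194 * EXA     # Frontier's ranked performance, cited above
summit = 148.60 * PETA     # Summit's ranked performance, cited above

print(f"Summit in exaflops: {summit / EXA:.3f}")              # ~0.149
print(f"Frontier-to-Summit ratio: {frontier / summit:.1f}x")  # ~8.0x
```

The roughly eightfold gap is consistent with earlier coverage noting that Frontier would compute eight times faster than Summit.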

These chips are particularly helpful, experts explain, for the types of matrix algebra calculations one might need for training AI models. Notably, the DOE is nearing the completion of its Exascale Computing Project, an initiative across the national labs to rewrite scientific software so it can exploit GPUs, the same hardware that underpins AI. “Many of these applications are integrating AI techniques as one way in which they take advantage of GPUs,” Susut told FedScoop in an email.
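For readers unfamiliar with why GPUs matter here: the inner loop of training a neural network is dominated by dense matrix multiplication, which GPUs parallelize extremely well. A hedged illustration in plain NumPy — the layer sizes below are hypothetical, chosen only to show the operation count:

```python
import numpy as np

# A dense layer's forward pass is one matrix multiply: a batch of
# activations (batch x features) times a weight matrix (features x units).
batch, features, units = 512, 4096, 4096
x = np.random.rand(batch, features).astype(np.float32)  # input batch
w = np.random.rand(features, units).astype(np.float32)  # weight matrix

y = x @ w  # the kind of matrix algebra GPUs accelerate

# Each output element needs `features` multiply-adds, so the total is
# roughly 2 * batch * features * units floating-point operations.
flops = 2 * batch * features * units
print(f"{flops / 1e9:.1f} GFLOPs for one layer, one batch")  # ~17.2
```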

One of the biggest requirements for building advanced AI systems, including AI tools developed with the help of the government, has become “compute,” or computational resources. As a result, the technical needs of the most powerful supercomputers and the most demanding AI models often line up. That’s where systems like Frontier and Summit come in.

“I’ve read so many papers recently about how AI and [machine learning] need high bandwidth, low latency, high-performance networks around high memory nodes that have really fast processors on them,” said Bronson Messer, the director of science at the Oak Ridge Leadership Computing Facility, which houses the two supercomputers. “I’m like, wow, that’s exactly what I’ve always wanted for 20 years.”

The Summit supercomputer. Credit: Oak Ridge National Laboratory.

MIT’s Thompson noted that in the field of computer vision, about 70 percent of the improvements in these systems can be attributed to increased computing power.
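Claims like Thompson’s are typically backed by fitting a power law to compute-versus-performance data: on log-log axes the relationship is a straight line. The sketch below uses synthetic numbers purely for illustration; the exponent is invented, not drawn from Thompson’s research:

```python
import numpy as np

# Synthetic benchmark: error falls as a power law in training compute,
# error = a * compute^(-b), plus a little noise. Values are invented.
rng = np.random.default_rng(seed=0)
compute = np.logspace(15, 21, num=12)                       # FLOPs spent
error = 2.0 * compute**-0.07 * rng.normal(1.0, 0.02, size=12)

# A power law is linear in log-log space, so a least-squares fit of
# log(error) on log(compute) recovers the exponent -b.
slope, intercept = np.polyfit(np.log(compute), np.log(error), deg=1)
print(f"fitted exponent: {slope:.3f}")  # ~ -0.07: more compute, less error
```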

There are already efforts to train AI models, including large language models, at the lab. So far, researchers at Oak Ridge have used the lab’s computing resources to develop a machine learning algorithm designed to create simulations that support greener flight technology; an algorithm to study potential links between different medical problems based on scans of millions of scientific publications; and datasets reflecting how molecules might be impacted by light — information that could eventually be used for medical imaging and solar cell applications.

There’s also a collaboration with the National Cancer Institute focused on building a better way of tracking cancer across the country, based on a large dataset, sometimes called a corpus, of medical documents. 

“We end up with something on the order of 20 to 30 billion tokens, or words, within the corpus,” said John Gounley, a computational scientist at Oak Ridge working on that project. “That’s something where you can start legitimately training a large language model on a dataset that’s that large. So that’s where the supercomputer really comes in.”
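Counting tokens is the standard way to size such a corpus. Below is a rough sketch of how a count like “20 to 30 billion” might be produced; real pipelines use a trained subword tokenizer, so treat the whitespace split, the 1.3x scaling factor, and the file layout as illustrative assumptions:

```python
import glob

def estimate_corpus_tokens(pattern="corpus/*.txt"):
    """Crude corpus sizing: count whitespace-separated words, then scale.

    Subword tokenizers (BPE, SentencePiece) usually emit somewhat more
    tokens than words for English text; 1.3x is a rough rule of thumb.
    """
    words = 0
    for path in glob.glob(pattern):  # hypothetical document layout
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                words += len(line.split())
    return words, int(words * 1.3)  # (word count, estimated token count)
```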

More AI initiatives at the facility will soon go online. The DOE’s support for the Summit supercomputer has been extended, in part, to propel the National Artificial Intelligence Research Resource, which aims to improve government support for AI research infrastructure. Starting next year, several projects focused on building foundation models are set to begin on Frontier, including one centered on energy storage and a large language model built with a Veterans Affairs data warehouse.

How DOE pivots for the AI era

As part of the executive order, the Department of Energy is charged with building tools to mitigate AI risks, training new AI researchers, investigating biological and environmental hazards that could be caused by AI, and developing AI safety and security guidelines. But the agency doesn’t need to be pushed to invest in the technology. 

This past summer, the DOE disclosed around 180 public AI use cases as part of a required inventory. Energy is also working on preliminary generative AI programs, including new IT guidance and a designated “Discovery Zone,” a sandbox for trying out the technology. Earlier this year, the Senate hosted a hearing focused specifically on the DOE’s work with the technology, and the agency’s Office of Science has requested more resources to support its work on AI, too.

But as the agency looks to deploy supercomputers for AI, there are challenges to consider. For one, the increased attention toward the technology marks a significant pivot for the supercomputing field, according to Paola Buitrago, the director of artificial intelligence and data at the Pittsburgh Supercomputing Center. Traditionally, research on supercomputers has focused on topics like genomics and computational astrophysics — research that has different requirements than artificial intelligence, she explained. Those limitations aren’t just technical, but apply to talent and workforce as well. 

“Most of the power of the impressive supercomputers could not always be leveraged completely or efficiently to service the AI computing needs,” Buitrago said in an email. “There is a mindset in the supercomputing field that doesn’t completely align with what is needed to advance AI.”

And the government only has so many resources. While there are several supercomputers distributed across some of the national labs, Oak Ridge itself can only support so much research at a time. Lawrence Berkeley National Laboratory’s supercomputer might handle several hundred projects in a year, but Messer said Frontier and Summit have a smaller number of projects than other labs because the projects tend to run significantly longer. 

There’s also more demand for supercomputing facilities than supply. Only a fraction of projects proposed to Oak Ridge are accepted. And while training foundation models is incredibly computationally demanding, and only the largest supercomputers can support it, building such models is just one of several priorities the agency must consider.

“DOE is actively considering these ideas and must also balance the use of our supercomputers across a range of high-priority mission applications,” said Susut, the DOE supercomputer expert. “Our supercomputers are open to the research community through merit-based competitive allocation programs, and we have a wide diversity of users.” 

Even as the Department of Energy plans potential successors to Frontier, MIT’s Thompson noted that there are still other hurdles ahead. 

For one, there’s a tradeoff between the flexibility of these computers and efficiency, especially as the agency seeks even greater performance. Supercomputers, of course, are extremely expensive systems — and costs aren’t dropping as fast as they used to. And they take time to build. At Oak Ridge, plans for a new computer, which will have AI as a key area of focus, are already in the works. But the device isn’t expected to go online until 2027.

“The reality is that the U.S. private sector has led research in AI starting in the past few years, as they have the data, the computing capacity and the talent,” Buitrago said. “Whether or not that continues to be the case depends on how much the government prioritizes AI and its needs. To an extent, some may say the government is slowly catching up.”

NOAA supercomputer gets a 20% boost to help make better weather predictions
https://fedscoop.com/noaa-supercomputer-gets-a-20-boost-in-capadity/ | Thu, 10 Aug 2023
NOAA's forecast system can now process 29 quadrillion calculations per second.

The National Oceanic and Atmospheric Administration on Thursday announced the completion of upgrades that will expand the capacity of its weather supercomputing system by 20%. 

With this upgrade, NOAA’s twin supercomputers, located in Manassas, Virginia, and Phoenix, Arizona, will now operate at a speed of 14.5 petaflops each, and together, the forecast system can process 29 quadrillion calculations per second.
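The headline figures are internally consistent, since one petaflop is one quadrillion calculations per second; a quick check:

```python
PETA = 1e15  # one petaflop = one quadrillion calculations per second

per_site = 14.5 * PETA   # each twin WCOSS machine (Manassas and Phoenix)
combined = 2 * per_site
print(f"{combined / PETA:.0f} quadrillion calculations per second")  # 29
```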

The boost to NOAA’s Weather and Climate Operational Supercomputing System (WCOSS) through increased computing power and storage will help improve U.S. forecast model guidance in the coming years by allowing more data to be fed into and analyzed by the system.

“This increased supercomputing power allows for upgrades to specific modeling systems that will help weather forecasters deliver more accurate weather forecasts, watches and warnings and improved certainty in a forecast,” said Ken Graham, director of NOAA’s National Weather Service.

The faster supercomputer will permit upgrades to NOAA’s weather forecasting systems and models over the next few years, including:

  • upgrades to the U.S. Global Forecast System to make it higher-resolution;
  • a new Rapid Refresh Forecast System, which will allow for larger ensembles with more data included;
  • upgrades to the Global Ensemble Forecast System to more accurately capture what is known as radiatively active aerosols, better modeling emissions such as wildfire smoke, dust and fog;
  • and expansions in compute power and storage, which will provide operational capacity to implement research and development advancements made through NOAA’s Earth Prediction Innovation Center.

Oak Ridge National Lab officials view new innovation push as modern day ‘Manhattan Project’
https://fedscoop.com/oak-ridge-national-lab-leaders-view-current-innovation-efforts-as-modern-day-manhattan-project/ | Wed, 17 Aug 2022
During her first stop on a four-state, technology-focused tour, Deputy Defense Secretary Kathleen Hicks heard from multiple senior Oak Ridge lab officials that the massive investments from the recently passed Inflation Reduction Act and CHIPS and Science Act present a once-in-a-lifetime chance to innovate.

OAK RIDGE, Tenn. — The U.S. government has perhaps its best chance in recent decades to drive technological innovation, Oak Ridge National Laboratory (ORNL) officials told leaders of the departments of Defense and Energy on Wednesday, with some likening it to the push for new capabilities during the World War II era.

The Tennessee-based lab has roots that trace back to massive investments during the 1940s supporting the Manhattan Project, which led to America’s development of the atomic bomb — a feat of research, development and engineering that changed the world and gave the United States a major strategic asset in its competition with advanced adversaries. Now, U.S. leaders say, the nation needs new innovations to compete with China and address other challenges of the modern era.

During her first stop on a four-state, technology-focused tour, Deputy Defense Secretary Kathleen Hicks heard from multiple senior Oak Ridge lab officials that the infusion of funding from the recently passed Inflation Reduction Act and CHIPS and Science Act presents a once-in-a-lifetime chance to innovate. The legislation, championed by the Biden administration, will provide funding for a variety of high-tech initiatives.

“We are treating this as the second Manhattan Project, essentially. We have an urgency to deliver,” Dr. Xin Sun, associate lab director for ORNL’s Energy, Science and Technology Directorate, said during a briefing during Hicks’ visit.

Oak Ridge’s technology focus areas now extend well beyond nuclear science and include applied materials, advanced manufacturing, biosecurity, transportation, supercomputing and more.

During their half-day visit, Hicks and Deputy Secretary of Energy David Turk jointly toured America’s largest open-access battery manufacturing research and development center, based at that lab, and visited the Battery Manufacturing Facility there. They also saw the debut of Frontier, the United States’ first exascale — and currently most powerful — supercomputer.

“U.S. taxpayers have already put substantial R&D dollars down against this. What we want to see now is where that’s paying off and where we need to take it from here,” Hicks told FedScoop during the flight to the lab.

She and Turk also connected with dozens of scientists and engineers during the stop — and met with lab leadership.

“It’s a historic opportunity,” ORNL Director Thomas Zacharia said of the additional financial backing that the nation’s labs are receiving.

“If you look back to the Manhattan Project,” the government had the support, investments and resources that led to the introduction of entirely new technologies and associated fields, he noted.

Now, he said, the national lab system once again has boosted resources and a responsibility to deploy and demonstrate innovative capabilities to drive new breakthroughs supporting national security.

“We have a once-in-a-generation opportunity to make this real,” Zacharia added.

DOD awards $699M contract to support supercomputing resource centers
https://fedscoop.com/dod-awards-699m-contract-to-support-supercomputing-resource-centers/ | Fri, 15 Jul 2022
BAE Systems has been tapped to supply a variety of services and capabilities to bolster high performance computing (HPC) efforts across multiple military components.

The Pentagon has awarded BAE Systems a five-year, $699 million contract to supply a variety of services and capabilities to bolster high performance computing (HPC) efforts across multiple military components. 

National security-focused scientists and engineers conduct a broad range of research and development, testing and other activities via the Defense Department’s High Performance Computing Modernization Program (HPCMP). This newly awarded contract will provide Defense Supercomputing Resource Center (DSRC) operations, maintenance and management services in several U.S. regions. 

The program is managed by the Army’s Engineer Research and Development Center (ERDC). Its growing network of supercomputers enables Pentagon officials to process massive volumes of data quickly to aid a variety of initiatives, such as designing cutting-edge military weapons, simulating weather patterns and more.

“This contract will provide services to our five DSRCs, such as system administration and program support,” HPCMP Deputy Director Kevin Newmeyer told FedScoop on Friday.

The envisioned services may enable specialized analysis or software support for particular programs, for example.

This newly announced contract replaces an expired one and was awarded through a competitive proposal process, according to Newmeyer. 

“We will not be getting any specialized digital infrastructure,” he noted.

The program’s five centers include:

  • Navy DSRC, Stennis Space Center, Mississippi 
  • Army Corps of Engineers, ERDC DSRC, Vicksburg, Mississippi
  • Air Force Research Laboratory (AFRL) DSRC, Wright-Patterson AFB, Ohio
  • Army Research Laboratory (ARL) DSRC, Aberdeen Proving Ground, Maryland
  • Maui High Performance Computing Center, Kihei, Maui, Hawaii

“We look forward to working with BAE to deliver high-end computing services to [defense science and technology, test and evaluation] and acquisition communities in support of DOD objectives,” Newmeyer said.

Oak Ridge lab leader says further investment key to U.S. leadership in supercomputing
https://fedscoop.com/oak-ridge-lab-leader-says-further-investment-key-to-u-s-leadership-in-supercomputing/ | Fri, 21 May 2021
Researcher Georgia Tourassi warns that without additional funding scientific innovation could stagnate.

A supercomputing expert at the Oak Ridge National Laboratory has warned that investment is key to U.S. leadership in exascale computing and that scientific innovation could “stagnate” if it is not forthcoming.

“Without investment, essentially we are going to stagnate scientific innovation,” said Georgia Tourassi, responding to lawmakers’ questions on Wednesday. “We will stop innovating not only across basic sciences but across applied sciences.”

Tourassi is director of the National Center for Computational Sciences at Oak Ridge National Laboratory, which is a multiprogram science and technology laboratory sponsored by the U.S. Department of Energy. The research leader testified at a subcommittee hearing of the House Committee on Science, Space and Technology.

Oak Ridge is developing a new exascale computing system called Frontier, which is expected to be completed in October. It will compute eight times faster than the nation’s current most powerful supercomputer, Summit, which is also housed at the laboratory.

Congress has so far sought to fast-track development of exascale computing by appropriating $1 billion during fiscal 2021 to the Department of Energy’s Advanced Scientific Computing Research program, which is leading development of the Frontier exascale computing system. Exascale refers to a computing system that can perform at least one exaflop – or one quintillion (a billion-billion) calculations per second.

All told, the Department of Energy and the National Nuclear Security Administration within DOE have spent $460 million on their joint Exascale Computing Project to date. The hearing on Wednesday comes as the U.S. races to catch up with China in a supercomputing arms race.

“It is imperative for the United States to expand and enhance the national research computing ecosystem,” added Tourassi, testifying at the hearing. “The DOE has asked us to deliver Frontier one year earlier than planned, and we’re focusing our efforts on meeting that effort.”

Another exascale computing system will go to Argonne National Lab in 2022 and a third to Lawrence Livermore National Lab in 2023. But high-performance computing is also an investment priority for U.S. competitors China, Japan and the European Union.

Commenting on the U.S.’s development of supercomputing capabilities, Rep. Frank Lucas, R-Okla., the ranking member of the House Science Committee, said: “We know that our international competitors, like China, are outpacing us in basic research investment and are closing the gap in key computing focus areas like artificial intelligence and quantum sciences.

“Expanding our capacities in these fields requires a strategic effort with strong federal investment and active public-private partnerships,” he added.

Lucas is involved in crafting the Securing American Leadership in Science and Technology (SALSTA) Act that would roughly double ASCR’s funding over the next 10 years.

Lawmakers are also considering the Quantum User Expansion for Science and Technology (QUEST) Act, which would establish a DOE program for forming public-private partnerships around resource use and encourage increased participation in quantum information science.

New AI consortium for first responders fuses efforts from DOE, JAIC and Microsoft
https://fedscoop.com/ai-consortium-first-reponders-doe-microsoft-jaic/ | Tue, 18 Aug 2020
The project will focus on getting technology to state and local first responders to better track and respond to natural disasters.

A new project from the Department of Energy, Department of Defense, the White House and Microsoft aims to provide artificial intelligence-enabled systems for first responders.

It’s called the “First Five Consortium,” a reference to an adage that the first five minutes in a crisis are the most critical. Announced Tuesday, the project aims to have the different organizations “share lessons learned” and work to deliver technology.

Potential areas of concentration include digitizing the mapping of natural disasters, like wildfires, through computer vision. The consortium comes as deadly natural disasters increase in severity and fire season takes hold in Western states. The DOE’s AI and Technology Office and Microsoft will co-chair the consortium.

“The collaboration will include both sharing of best practices as well as commonly shared data training and develop code for algorithms,” a spokesperson for the consortium told FedScoop. “Participants have agreed to make datasets, infrastructure and related resources available.”

Meetings to work out details will begin in early September. For now, the shared technologies include the DOD’s Joint AI Center prototypes that map fire lines and floods. DOE’s Pacific Northwest National Laboratory is now working on scaling the technology.

“These are just the kind of consortia that we like to enter into,” DOE’s lead AI official, Cheryl Ingstad, said in a press call. “We think we can bring AI to bear here and help save lives.” DOE operates some of the country’s most powerful supercomputers.

The consortium was formed after the White House made a call for such projects in January. The Trump administration had hosted a forum on AI’s role in “Humanitarian Assistance and Disaster Response” and asked industry, federal agencies and nonprofits to contribute resources and work on the issue.

The JAIC was moved to participate since it had already invested in humanitarian missions as its first stepping stones in AI and is now looking to pivot its efforts to integrating AI into warfighting. It wanted “low-consequences” missions to “prove out” its AI work so that unproven algorithms couldn’t potentially skew lethal operations.

“We are delighted to work alongside our partners in government and private industry to advance the role of AI in battling natural disasters,” Nand Mulchandani, acting director of the JAIC, said in a statement. “The JAIC’s journey with developing AI solutions for humanitarian relief operations began more than a year ago, and we’d like to thank the White House for identifying and encouraging the broader use of government-built technology to directly benefit the American people when disasters strike.”

Supercomputing consortium adds members as number of coronavirus projects increases, too
https://fedscoop.com/coronavirus-supercomputing-consortium-expands/ | Mon, 06 Apr 2020
AMD, NVIDIA and the National Center for Atmospheric Research's Wyoming Supercomputing Center have joined the growing group. A White House official said the consortium has provided computing for 15 projects.

The White House added three partners Monday to its supercomputing consortium trying to speed the work of coronavirus researchers.

The chipmaker Advanced Micro Devices (AMD) and graphics processing unit (GPU) producer NVIDIA joined, along with the National Center for Atmospheric Research’s Wyoming Supercomputing Center. The growing group of government, industry and academia members is lending infrastructure and resources to studies on limiting the virus’ spread.

The COVID-19 High Performance Computing Consortium has already supplied 15 research proposals with the compute power they’ve requested online since it launched on March 22, said Michael Kratsios, U.S. chief technology officer, in a tweet.

“This is truly a whole-of-America effort to advance COVID-19 research,” Kratsios said.

More than 30 supercomputers are involved in the effort to complete complex bioinformatics, epidemiology, molecular modeling and health care system response calculations in mere hours or days.

NVIDIA plans to help scientific teams ingest and process data faster by optimizing the throughput of supercomputers. A 20% throughput gain on a 330-petaflop system works out to 66 petaflops of added effective capacity, rivaling the fourth-fastest supercomputer in the world.

The company’s team of computer scientists will provide molecular biology, medical imaging, genomics, and computational fluid dynamics and visualization expertise.

Software for artificial intelligence (AI) and life-sciences applications will be packaged using research tools in NVIDIA NGC, a hub for GPU-accelerated work.

“The COVID-19 HPC Consortium is the Apollo Program of our time. Not a race to the moon, this is a race for humanity,” said Ian Buck, vice president of accelerated computing at NVIDIA, in a statement. “The rocket ships are GPU supercomputers, and their fuel is scientific knowledge. NVIDIA is going to help by making these rockets travel as fast as they can.”

The consortium’s new arrivals join the Department of Energy, National Science Foundation, NASA, IBM, Amazon Web Services, Google Cloud, Microsoft, Hewlett Packard Enterprise, Massachusetts Institute of Technology and Rensselaer Polytechnic Institute.

On March 11, the White House Office of Science and Technology Policy, which convened the consortium, held an initial call with tech companies seeking AI breakthroughs in coronavirus response.

The Brookings Institution has since warned that AI is only as good as the subject matter experts using it to make predictions and that, without a database of prior COVID-19 outbreaks, it will be tough for algorithms to accurately make pandemic projections.

“Like many tools, AI has a role to play, but its effect on the outbreak is probably small,” wrote Alex Engler, a Rubenstein fellow of governance studies at Brookings. “While this may change in the future, technologies like data reporting, telemedicine, and conventional diagnostic tools are currently far more impactful than AI.”

Some agencies have more experience than others in working on AI.

DOE — which runs the national laboratory system — is already using machine learning to search for drugs that might be effective against the coronavirus, proteins in the virus and its hosts that might make good drug targets, and ways to reduce the burden on the health system and vulnerable populations.

“This effort is enabled by work our labs have been doing for several years leveraging AI in the fight against cancer,” said Paul Dabbar, undersecretary for science, in a department interview. “The lessons learned from that work have been quickly transitioned to support similar efforts for COVID-19.”

Supercomputing consortium further solidifies White House partnership with tech on coronavirus response
https://fedscoop.com/supercomputing-consortium-white-house-tech/ | Mon, 23 Mar 2020
Government, industry and academic partners' 16 systems will provide a combined 330 petaflops of supercomputing capacity for select COVID-19 researchers.

The White House’s loose partnership with the tech industry on coronavirus response continues to take shape with the announcement of a supercomputing consortium to speed the work of COVID-19 researchers.

Government, industry and academic partners have volunteered supercomputing time and other resources to researchers who are studying ways to limit the virus’ spread. The White House unveiled the COVID-19 High Performance Computing Consortium on Sunday.

Combined, the consortium’s 16 systems equal 330 petaflops of supercomputing capacity that can complete, in hours or days, the required calculations on bioinformatics, epidemiology, molecular modeling and health-care system response. Additional cloud computing resources are forthcoming.

“The Department of Energy is home to the world’s fastest and most powerful supercomputers, and we are excited to partner with leaders across the scientific community who will use our world class innovation and technology to combat COVID-19,” said Energy Secretary Dan Brouillette in the announcement.

Five DOE national laboratories, the National Science Foundation and NASA belong to the consortium, which was convened by the White House Office of Science and Technology Policy.

OSTP encouraged researchers to submit COVID-19 research proposals for review and matching with computing resources via an online portal. A panel of scientists and computing researchers will assess the public health benefits of each proposal.

“Accelerating the process of discovery to unlock treatments and a cure for COVID-19 is of vital importance to us all,” said Dario Gil, director of IBM research, in a statement. “By bringing together the world’s most advanced supercomputers and matching them with the best ideas and expertise, this consortium can drive real progress in this global fight.”

IBM’s Summit is the most powerful supercomputer on the planet, according to the tech company. The POWER9-based supercomputer has already allowed researchers at the Oak Ridge National Laboratory and the University of Tennessee to screen about 8,000 compounds for those most likely to bind to the coronavirus’ main “spike” protein — preventing it from infecting host cells.

The 77 small-molecule drug compounds recommended can now be experimentally tested.

Other industry partners in the consortium are Amazon Web Services, Google Cloud, Microsoft and Hewlett Packard Enterprise. Academic partners are the Massachusetts Institute of Technology and Rensselaer Polytechnic Institute.

OSTP’s consortium comes on the heels of its release of the most extensive collection of machine-readable coronavirus literature for data and text mining by the tech industry on behalf of scientists.

On March 11, the office held an initial call with tech companies seeking artificial intelligence breakthroughs in coronavirus response.

“America is coming together to fight COVID-19,” said Michael Kratsios, U.S. chief technology officer, in a statement. “And that means unleashing the full capacity of our world-class supercomputers to rapidly advance scientific research for treatments and a vaccine.”

DOE creates an AI office to capitalize on the technology’s ‘Golden Age’
https://fedscoop.com/doe-ai-office-aito-rick-perry/ | Fri, 06 Sep 2019
The office will "transform DOE into a world-leading AI enterprise," but there are few details available beyond that.

Energy Secretary Rick Perry has become a cheerleader for the government’s use of artificial intelligence, and now he has a single office to oversee the department’s efforts with the developing technology.

The new DOE Artificial Intelligence and Technology Office (AITO), announced Friday, will serve as a “coordinating hub for the work being done across the DOE enterprise in Artificial Intelligence,” a press release states. The vision is to “transform DOE into a world-leading AI enterprise,” but there are few details about the office available beyond that.

The rationale seems to be that the development of AI requires computing power, and DOE, with its National Laboratories system and abundant supercomputing resources, has that in spades.

“The world is in the midst of the Golden Age of AI, and DOE’s world class scientific and computing capabilities will be critical to securing America’s dominance in this field,” Secretary Perry said in a statement. The office is new, he added, but its work will be focused on supporting existing department projects.

The office currently has a staff of nine detailees from around the agency, a source told FedScoop, and it expects to staff up further as resources become available. The office reports to DOE Undersecretary for Science Paul Dabbar.

DOE cites President Trump’s American AI Initiative as the impetus for the establishment of the office. Indeed, AI has been a big focus of the administration’s tech policy in the past year — the technology even has its own page on Whitehouse.gov. This coming Monday, the Office of Science and Technology Policy plans to host a summit on the use of AI in government.

Department of Energy puts AI to biomedical use
https://fedscoop.com/doe-ai-biomedical-research-weill/ | Fri, 30 Aug 2019
The agency started a public-private partnership with the Weill Family Foundation seeking supercomputing breakthroughs that address neurological disorders like traumatic brain injuries.

The Department of Energy wants to lend its world-class artificial intelligence technology for private sector biomedical and public health research to develop better treatments for brain diseases and disorders.

Secretary of Energy Rick Perry and Weill Family Foundation founder Sandy Weill signed a memorandum of understanding this week establishing a public-private partnership seeking supercomputing breakthroughs addressing neurological disorders like traumatic brain injuries.

Research could range from better understanding brain function to finding new ways to treat, prevent and repair damage done by brain diseases.

“By signing this MOU, we’re collaborating at the critical nexus of leading-edge technology, our own national labs, world class capabilities at [University of California, San Francisco] — the research excellence and health care expertise that’s there,” Perry said. “I know [University of California, Berkeley] is going to be involved with this as well, and there’s just this vast potential of philanthropy from the private sector.”

DOE plans to build one of three next-generation exascale computers — the $600 million El Capitan — at Lawrence Livermore National Laboratory in California by 2022. Nearby, UCSF is home to the Weill Institute for Neurosciences.

Together, DOE and WFF will work first to identify focus areas where they can simultaneously accelerate progress on AI and neuroscience, analyzing shared and other datasets to create a new generation of tools.

DOE, WFF and interested third parties in academia, industry, philanthropy and government can all coordinate to study additional, related subjects of mutual interest, per the agreement.

“This is where we should have more public-private partnerships,” Weill said. “We should think about the companies — both new ones and established ones — that would be great partners with what we’re doing, as well as the smartest, brightest people.”

DOE and WFF agreed any funding decisions concerning research will be made separately, and either party can terminate the MOU at any time.

If any intellectual property comes out of the partnership, mutually acceptable written agreements will be drawn up dictating ownership.

“DOE’s scientific and technological innovation mission in particular includes the development and deployment of technologies to support economic prosperity, including the application of technologies it develops in areas outside of the energy sphere, such as those benefiting public health and medicine,” reads the MOU. “To this end, DOE’s role is focused on applying artificial intelligence research capabilities and techniques to support innovative learning from ever larger and more complex data.”
