Dave Nyczepir Archives | FedScoop
https://fedscoop.com/author/dave-nyczepir/
FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

HHS, health information networks expect rollout of trusted data exchange next year: Micky Tripathi
https://fedscoop.com/health-information-networks-tefca-success/
Thu, 22 Dec 2022 19:00:00 +0000
About 30% of hospitals remain unconnected to a health information network, but the implementation of network-to-network interoperability may change that.

Multiple applicants expect to have fully operational health information networks for securely sharing clinical data within a year of receiving approval, according to National Coordinator for Health IT Micky Tripathi.

A couple networks are live, and the Office of the National Coordinator for Health IT hopes the first group — among 12 entities that submitted letters of intent — will be officially designated qualified health information networks (QHINs) in early 2023.

Part of the Department of Health and Human Services, ONC published a framework in January for exchanging health information nationwide: the Trusted Exchange Framework and Common Agreement (TEFCA). Required by the 21st Century Cures Act, the framework provides non-binding principles and the common agreement sets the technical terms; it now falls to ONC’s recognized coordinating entity, The Sequoia Project, to approve interoperable QHINs.

“What we’ve heard informally from a number of the prospective QHINs is that they are building in anticipation of getting approved,” Tripathi said during eHealth Exchange’s annual meeting on Dec. 15. “They think that they would have a pretty good opportunity to do this in the 12-month window and hopefully shorter than that with some of them.”

QHINs will be added on a rolling basis to include electronic health record (EHR) vendors, ambulatory practices, hospitals, health centers, federal and public health agencies, and payers. In August, Epic Systems became the first EHR vendor to announce it would seek QHIN status; it was later joined by the likes of the eHealth Exchange network and the trade association CommonWell Health Alliance.

How TEFCA coexists with other exchanges when it comes to benefits determinations, health care operations, treatment, payment and individual access remains to be seen. But scaling TEFCA will be the “real challenge” and one for which incorporating the Health Level Seven (HL7) Fast Healthcare Interoperability Resource (FHIR) data standard will be key, Tripathi said.

FHIR application programming interfaces streamline health information exchange by eliminating the need for separate data use agreements, and eventually they’ll enable questionnaires, scheduling, links, Clinical Decision Support hooks and subscriptions. That’s why there are already federal deadlines in place for their steady adoption across the public health ecosystem, but QHIN-to-QHIN brokered exchange remains years away.
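
For illustration, here is a minimal sketch of the kind of standards-based query a FHIR API makes possible, using generic HL7 FHIR REST conventions. The base URL, patient ID and use of the requests library are assumptions for the example, not any agency's or QHIN's actual endpoint.

```python
# Hypothetical example: querying a FHIR R4 server for a patient's records.
# The base URL and patient ID are placeholders, not a real QHIN endpoint.
import requests

FHIR_BASE = "https://example-qhin.org/fhir"  # assumption: some FHIR R4 endpoint

def fetch_patient_observations(patient_id: str) -> list:
    """Return Observation resources for a patient via a standard FHIR search."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "_sort": "-date", "_count": 10},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # FHIR returns search results as a Bundle resource
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for obs in fetch_patient_observations("example-patient-123"):
        print(obs.get("code", {}).get("text"), obs.get("valueQuantity"))
```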

By the end of 2022, certified EHR vendors must make a FHIR API available to customers.

HL7’s Helios FHIR Accelerator aims to improve the exchange of situational awareness information on available hospital and intensive care unit beds, ventilator counts, personal protective equipment counts, and vaccinations. The HHS Protect system launched during the height of the COVID-19 pandemic provides a lot of that information right now.

“But it’s done via spreadsheets,” Tripathi told FedScoop in July. “A lot of manual work is still done to populate that now.”

The government has spent about $40 billion on EHR infrastructure since the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009. Yet clinical operations and health payment systems remain largely rooted in paper because states — most of which still don’t require electronic case reporting — hold public health authority in the U.S.

Jurisdictional issues and scarce resources are some reasons why about 30% of U.S. hospitals still don’t connect to a health information network, Tripathi said Dec. 15.

Naturally issues with case reports, lab and testing results, and vital records arose early in the pandemic, when they were often being shared by phone or fax.

For all these reasons the Centers for Disease Control and Prevention launched its Data Modernization Initiative (DMI) in 2020 to streamline sharing of electronic health information between care providers and state, local, tribal and territorial (SLTT) health departments. 

The DMI’s first phase has involved getting data from electronic sources into a Microsoft Azure cloud environment, called the Enterprise Data Analytics and Visualization (EDAV) platform, while providing SLTT health departments with automated forecasting analytics tools.

Data standardization is key to improving information sharing between these systems, which is why ONC is working closely with the CDC on its North Star Architecture. The U.S. Core Data for Interoperability (USCDI) Version 4 (v4) that ONC has planned for 2023 will become the de facto minimum set of health data classes and elements for nationwide, interoperable information exchange.

At the same time ONC is developing USCDI+, a nationwide public health data model, for release beyond 2023. Discussions with the CDC and Centers for Medicare and Medicaid Services revealed more than 20 data elements that overlapped, allowing the agencies to agree on a common approach.

ONC is now speaking with the White House Office of Science and Technology Policy and the National Institutes of Health about tailoring a USCDI+ program for President Biden’s Cancer Moonshot program.

EHR vendors support TEFCA and the DMI because they’ll be able to maintain just one customer interface, rather than hundreds to meet the various jurisdictional requirements of SLTT health departments, Tripathi said in July.

Phase I of the DMI is also improving the CDC’s situational awareness, which is based on the Data Collation and Integration for Public Health Event Response (DCIPHER) platform — originally intended to track food-borne diseases. DCIPHER gave rise to HHS Protect and has since had hospital capacity, social vulnerability, mobility, race and ethnicity, social determinants of health, economic, two-on-one, and climate data layered atop it as part of the new Center for Forecasting and Outbreak Analytics’ work, Dr. Dan Jernigan, deputy director for public health science and surveillance, told FedScoop in August.

The center can already do weekly influenza and some mpox forecasting and has visibility into emerging problems at about 70% of emergency departments.

“To see a fully formed prediction center, it’s going to be a couple years,” Jernigan said. “The numbers of staff that are in the Center for Forecasting right now are in the tens to thirties, but it is anticipated to be a much larger group.”

As part of DMI Phase I, 128 reportable diseases now automatically trigger EHR electronic case reporting, which is routed to the Association of Public Health Laboratories’ APHL Informatics Messaging Services (APHL-AIMS) cloud platform and then to SLTT health departments. Electronic case reporting increased from 187 facilities pre-pandemic to more than 14,000, more than 30 of which turned on mpox reporting.

While the effort highlights the CDC’s move toward pathogen- and program-agnostic systems through its DMI, electronic case reporting continues to fall short.

“It’s not nearly the volume that we need it to be,” Tripathi said in July. “But at least we’re starting to set up those pathways.”

At the same time the DMI has seen “dramatic improvements” in COVID-19 reporting across immunization information systems (IISs), he added.

IISs were slow to take adult COVID-19 vaccination information, but now they accept line-listed records using privacy-preserving record linkage — even for mpox.

The CDC recently revised its DMI implementation plan, and Phase 2 will focus on improving state health departments’ cloud infrastructure and updating the National Electronic Disease Surveillance System (NEDSS) Base System (NBS) that 26 states use for case management.

Cloud migration allows doctors like Philip Huang, director of Dallas County Health and Human Services, to match immunization, lab and death records to know whether a patient who died had tested positive for COVID-19 and whether they were vaccinated.

“That ability to put that data together and integrate it with other kinds of information, even down to the neighborhood level, helps him do his prevention work and his mitigation work in a much more targeted way,” Jernigan said.

CDC proposed the DMI receive about $200 million in fiscal 2023 to continue its “incremental” progress, but the Healthcare Information and Management Systems Society estimated the initiative needs $33 billion over the next 10 years to be successful, he added.

Meanwhile ONC, unable to enforce TEFCA, is working with federal partners to highlight the need for network-to-network interoperability and hoping its rollout leads outside providers to question why they’re still faxing records.

“We were given no dollars and no new authorities to do this,” Tripathi said.

CMS subcontractor breach potentially exposes data of 254,000 Medicare beneficiaries
https://fedscoop.com/cms-subcontractor-data-breach/
Fri, 16 Dec 2022 21:03:17 +0000
Healthcare Management Solutions, LLC suffered a ransomware attack on its corporate network on Oct. 8, which CMS has been investigating since.

A Centers for Medicare and Medicaid Services subcontractor experienced a breach that may have exposed Medicare beneficiaries’ banking information, Social Security Numbers and other sensitive data, the agency announced Wednesday.

Healthcare Management Solutions, LLC (HMS), a subcontractor of ASRC Federal Data Solutions, LLC (ASRC Federal), violated its obligations to CMS, and the personally identifiable and protected health information of potentially 254,000 of the agency’s 64 million Medicare beneficiaries may have been exfiltrated, according to the agency.

President Biden issued an executive order in February 2021 in an effort to shore up agencies’ supply chains, after Russia-linked hackers breached federal contractor SolarWinds’ software supply chain  — compromising nine agencies. Supply chain attacks continue to increase, prompting multiple reviews by the Department of Homeland Security’s Cyber Safety Review Board.

“The safeguarding and security of beneficiary information is of the utmost importance to this agency,” said CMS Administrator Chiquita Brooks-LaSure in a statement. “We continue to assess the impact of the breach involving the subcontractor, facilitate support to individuals potentially affected by the incident and will take all necessary actions needed to safeguard the information entrusted to CMS.”

ASRC Federal resolves system errors related to Medicare beneficiary entitlement and premium payment records and supports premium collection from direct payers for CMS. Subcontractor HMS suffered a ransomware attack on its corporate network on Oct. 8 and notified CMS the next day.

After an initial investigation, CMS concluded on Oct. 18 that data HMS handled for some Medicare beneficiaries was potentially compromised.

CMS continues to notify by letter beneficiaries whose information may have been exfiltrated, informing them that they’ll receive an updated Medicare card with a new Medicare Beneficiary Identifier (the existing identifier also may have been compromised), free credit monitoring services and incident updates.

No CMS systems were breached, and no Medicare claims data was involved. But names, addresses, dates of birth, phone numbers, Social Security Numbers, Medicare Beneficiary Identifiers, banking information including routing and account numbers, and Medicare entitlement, enrollment and premium information were potentially compromised, according to the agency.

Affected beneficiaries are advised to destroy their old Medicare card upon receipt of the new one, contact their financial institutions and enroll in Equifax Complete Premier credit monitoring for free using the letter’s instructions.

“At this time, we’re not aware of any reports of identity fraud or improper use of your information as a direct result of this incident,” reads the letter sent to affected beneficiaries.

Healthcare Management Solutions was contacted for comment.

NIST retires an early cryptographic algorithm
https://fedscoop.com/nist-retires-sha-1-algorithm/
Fri, 16 Dec 2022 01:56:26 +0000
Modules that still use SHA-1 after 2030 will not be permitted for purchase by the federal government.

The National Institute of Standards and Technology on Thursday retired one of the first widely used cryptographic algorithms, citing vulnerabilities that make its further use inadvisable.

NIST recommended IT professionals replace Secure Hash Algorithm 1 (SHA-1) with more secure algorithms from the SHA-2 and SHA-3 groups by Dec. 31, 2030, to protect electronic information.

SHA-1 became part of the Federal Information Processing Standard (FIPS 180-1) in 1995, and its limited use by security applications like website validators continues despite increasingly severe attacks on it by more powerful computers. NIST’s recommendation comes on the heels of the White House’s aggressive deadlines for agencies to develop post-quantum cryptography strategies, given concerns quantum computers capable of cracking the traditional public-key encryption most systems rely on may go live anywhere from three years to a decade from now.

“Modules that still use SHA-1 after 2030 will not be permitted for purchase by the federal government,” said Chris Celi, NIST computer scientist, in the announcement. “Companies have eight years to submit updated modules that no longer use SHA-1.”

Every five years, NIST’s Cryptographic Module Validation Program (CMVP) assesses whether the modules used in federal encryption, the building blocks of encryption systems, work effectively.

The agency plans to publish a transition strategy for validating cryptographic modules and algorithms before Dec. 31, 2030.

“Because there is often a backlog of submissions before a deadline, we recommend that developers submit their updated modules well in advance so that CMVP has time to respond,” Celi said.

NIST also intends to publish a FIPS revision, FIPS 180-5, and revise other publications affected by SHA-1’s retirement by its deadline.

SHA-1 secures information by performing a complex math operation on the characters of a message to produce a short string of characters called a hash. While the original message can’t be reconstructed with just the hash, knowing the hash lets the recipient check if the message was compromised because even a slight change alters the hash significantly.
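
As a quick illustration of that behavior (a generic sketch using Python's standard hashlib, not text from NIST's announcement), even a one-character change produces a completely different digest, and a SHA-2 function such as SHA-256 is used the same way as the SHA-1 it replaces:

```python
# Illustration only: how a hash flags tampering, and why SHA-2 family
# functions like SHA-256 are the recommended replacement for SHA-1.
import hashlib

message = b"Transfer $100 to account 12345"
tampered = b"Transfer $900 to account 12345"  # a one-character change

print(hashlib.sha1(message).hexdigest())   # legacy SHA-1, 160-bit digest
print(hashlib.sha1(tampered).hexdigest())  # a tiny change yields a very different hash

# SHA-256 (SHA-2 family) is called the same way and is what NIST now recommends.
print(hashlib.sha256(message).hexdigest())
```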

Recent collision attacks use today’s more powerful computers to craft a fraudulent message that produces the same hash as the original, compromising the message. NIST has already warned agencies against using SHA-1 to protect critical processes like the creation of digital signatures.

7 agencies improve FITARA grades amid more scorecard changes
https://fedscoop.com/fitara-15-0-scorecard-grades/
Thu, 15 Dec 2022 21:00:00 +0000
All other agencies' grades remained unchanged.

Seven agencies improved their FITARA scorecard grades after the Government Accountability Office continued to update its scoring methodology around data center consolidation, cybersecurity and network modernization components.

The grades of the Commerce, Defense, Justice, Transportation, and Treasury departments, as well as the Environmental Protection Agency and NASA rose. All other agencies’ grades remained unchanged.

GAO began issuing grades twice a year in November 2015 to monitor agencies’ progress implementing IT modernization and cybersecurity improvements required by the Federal Information Technology Acquisition Reform Act (FITARA). Evolving the scorecard has long been a priority of Rep. Gerry Connolly, D-Va., who aspired to the House Oversight Committee chairmanship before Republicans wrested control of the House in the November election.

“We must continue to reap dividends from modernizing legacy IT systems, migrating to the cloud and maintaining a strong cyber posture,” Connolly said in a statement. “I look forward to continuing the scorecard and the longstanding tradition of bipartisan FITARA oversight in the 118th Congress.”

The FITARA 15.0 scorecard further modifies the new data center consolidation component to give credit to agencies that justified future data center closures. Agencies responding with no future closures received A grades, and the five that justified their need for future closures received Bs.

GAO changed cyber component scoring to a weighted, rather than traditional, average. The predominant Federal Information Security Modernization Act maturity level among all 24 agencies scored was level four, managed and measurable security, which meant the General Services Administration and National Science Foundation scored more than 100% for their optimized postures and received A grades.
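
The article doesn't spell out GAO's formula, but a hypothetical illustration shows how weighting against a government-wide baseline can push a score past 100%: if the predominant maturity level (level four of five) is treated as the 100% mark, an agency rated at level five lands above it. The numbers below are invented for illustration only, not GAO's published methodology.

```python
# Hypothetical illustration only -- not GAO's actual FITARA cyber methodology.
# Treat the predominant FISMA maturity level (4, "managed and measurable") as 100%.
PREDOMINANT_LEVEL = 4

def weighted_cyber_score(agency_maturity_level: float) -> float:
    """Score an agency's cyber maturity relative to the government-wide baseline."""
    return 100 * agency_maturity_level / PREDOMINANT_LEVEL

for level in (3, 4, 5):
    print(level, weighted_cyber_score(level))  # 75.0, 100.0, 125.0 -> level 5 exceeds 100%
```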

Lastly GAO changed its scoring of agencies’ transition from expiring telecommunications and network contracts to the $50 billion Enterprise Infrastructure Solutions modernization vehicle. GSA expected agencies to be 90% transitioned by March and 100% transitioned by September, so July’s FITARA 14.0 scorecard graded their progress toward the 90% benchmark with 11 receiving Fs.

For FITARA 15.0, GAO cracked down by issuing pass-fail grades based on whether an agency reached the 90% benchmark, with 19 receiving Fs. Only the U.S. Agency for International Development was 100% transitioned by GSA’s deadline, while the Health and Human Services and Treasury departments, NASA and the Nuclear Regulatory Commission passed for being more than 90% transitioned.

Special committee calls for ‘reboot’ of federal technology accessibility oversight
https://fedscoop.com/reboot-technology-accessibility-oversight/
Wed, 14 Dec 2022 23:33:55 +0000
Section 508 of the Rehabilitation Act hasn't been updated since 1998, and in the meantime departments like VA have committed hundreds of thousands of violations.

The Department of Justice must “reboot” critical oversight of federal technology accessibility by resuming required biennial progress reports not issued since 2012, according to the Senate Special Committee on Aging’s Democrats.

Majority staff released its “Unlocking the Virtual Front Door” report after an 11-month investigation that found the Department of Veterans Affairs committed hundreds of thousands of accessibility violations that took years to address due to insufficient oversight — a problem extending to other federal departments.

Congress added Section 508 to the Rehabilitation Act in 1986, mandating that federal technology be accessible to people with disabilities, and last strengthened the statute in 1998. But oversight and enforcement haven’t kept pace with the U.S.’s aging population, with 80.8 million people expected to be over 65 by 2040.

“The entire federal government needs to wake up to this issue because a whole-of-government approach is what we need to remedy it,” said committee chair Sen. Bob Casey, D-Penn., in a statement. “We would not ask someone using a wheelchair to walk up the courthouse steps, but we are doing something similar when we ask people with disabilities to use federal websites that are not accessible.”

Blind Air Force veteran Ron Biglin from Clarks Summit, Pennsylvania, can’t access the VA’s My HealtheVet website with the screen reader the department provided him, and the Centers for Disease Control and Prevention‘s website for COVID-19 prevalence data was no better, the committee’s majority staff found.

DOJ recently committed to resuming Section 508 progress reports, and the committee’s majority staff recommends that the General Services Administration begin publishing data on Section 508 compliance and that inspectors general increase oversight of their agencies’ compliance.

“Increased oversight from inspectors general would likely result in improved accessibility for taxpayers and workers using federal technology,” reads the report. “Such oversight may also lead to cost savings, given that remediating non-compliant websites resulted in additional costs for departments and agencies.”

The report further recommends agencies like the VA maintain their ability to conduct automated Section 508 compliance scans, after the department cancelled a contract undermining that capability for a year.

Agencies should include people with disabilities and older adults in their technology planning and evaluation and consider using human testers to evaluate Section 508 compliance and technology accessibility, akin to the Department of Homeland Security‘s Trusted Tester program. Appointing accessibility officers, whose job it is to ensure Section 508 compliance within their agency, could also help, according to majority staff.

“Rather than locating Section 508 compliance among multiple offices, housing Section 508 compliance responsibilities within existing accessibility offices or creating offices whose responsibilities include oversight of Section 508 compliance could improve the accessibility of technology within these organizations,” reads the report.

The report also recommends agencies ensure Section 508 complaints can be filed and that Congress:

  • consider updating the statute’s language to add accountability measures, grant the Access Board enforcement authority, allocate funding and evaluate complaint resolutions;
  • have authorizing committees and spending bills hold other agencies accountable for Section 508 compliance; and
  • ensure accessibility of its own technology and websites.

“Accessible technology is crucial for people seeking to secure health care, receive Social Security and VA benefits, pay taxes, and navigate federal information ranging from weather forecasts to economic data,” reads the report. “The COVID-19 pandemic increased the nation’s reliance on the internet to access basic services — driving home the importance of accessible federal websites and communications technology.”

Machine-learning models predicted ignition in fusion breakthrough experiment
https://fedscoop.com/machine-learning-fusion-ignition/
Wed, 14 Dec 2022 02:40:32 +0000
Recent ML advances helped ensure Lawrence Livermore National Laboratory's historic achievement on the path to zero-carbon energy.

Lawrence Livermore National Laboratory’s machine-learning models predicted the historic achievement of fusion ignition the week before its successful experiment on Dec. 5.

The National Ignition Facility’s design team fed the experimental design to the Cognitive Simulation (CogSim) machine-learning team for analysis, and it found the resulting fusion reactions would likely create more energy than was used to start the process — leading to ignition.

LLNL’s laser-based inertial confinement fusion research device, the size of three football fields, fired 192 laser beams — delivering 2.05 megajoules of ultraviolet energy to an almost perfectly round fuel capsule made of diamond — and produced 3.15 megajoules of fusion energy output, achieving ignition in a lab for the first time. The achievement strengthens U.S. energy independence and national security at a time when nuclear testing is prohibited, and CogSim machine-learning models helped ensure the experiment avoided the previous year’s pitfalls.

“Last week our pre-shot predictions, improved by machine learning and the wealth of data we’ve collected, indicated that we had a better than 50% chance of exceeding the target gain of 1,” said LLNL Director Kim Budil, during a press conference at the Department of Energy on Tuesday.

NIF’s design team benchmarks complex plasma physics simulations and analytical models against experimental data collected over 60 years to create designs that will reach the extreme conditions required for fusion ignition. The most recent experiment reached pressures two times greater than the Sun’s and a temperature of 150 million degrees.

CogSim may run thousands of machine-learning simulations of an experimental design in the lead up.

“We have made quite a bit of advancements in our machine-learning models to kind of tie together our complex radiation hydrodynamics simulations of the experimental data and learning,” said Annie Kritcher, principal designer.
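
For readers unfamiliar with the approach, here is a toy sketch of the general idea behind pre-shot surrogate predictions: train a fast model on prior simulation results, then use the spread of an ensemble's predictions to estimate the odds that a proposed design exceeds a target gain of 1. The features, training data and model below are invented placeholders, not LLNL's CogSim models.

```python
# Hypothetical sketch of an ensemble surrogate for pre-shot prediction.
# Feature columns and training data are invented placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Placeholder design parameters: [laser energy (MJ), capsule thickness (um), symmetry metric]
X = rng.uniform([1.8, 75, 0.9], [2.1, 90, 1.1], size=(500, 3))
# Placeholder "simulated gain" response, standing in for physics-code output.
y = 0.8 * X[:, 0] + 0.02 * (X[:, 1] - 75) - 2.0 * np.abs(X[:, 2] - 1.0) + rng.normal(0, 0.1, 500)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

proposed_design = np.array([[2.05, 85, 1.0]])
# The spread across the ensemble's trees gives a rough probability that gain exceeds 1.
tree_preds = np.array([t.predict(proposed_design)[0] for t in surrogate.estimators_])
print(f"P(gain > 1) ~ {(tree_preds > 1.0).mean():.0%}")
```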

But NIF’s August 8, 2021 experiment reached the threshold for ignition, and September’s experiment paved the way for a new laser capability. So the design team relied on traditional methods for the latest experiment and only used machine learning for predictions.

For this experiment, the design team thickened the fuel capsule to widen the margin of success and burn more fuel, and it used improved models to increase the symmetry of the implosion by transferring more energy between laser beams in the second half of the pulse and readjusting the first half.

Kritcher credited those changes for the experiment’s success, though she called capsule defects, which are tougher to model and predict, the “main driver” in performance. While the diamond capsule is 100 times smoother than a mirror, X-ray tomography must be used to see, measure and count defects — generating a lot of data that software now helps analyze.

The robust capsule employed in the most recent experiment was not the most effective option, meaning future experiments should see improved performance, said Michael Stadermann, Target Fabrication program manager.

Firing the laser required an additional 300 megajoules of energy pulled from the power grid, which highlights an important point about the NIF: It’s a scientific demonstration facility, not an optimized one.

“The laser wasn’t designed to be efficient,” said Mark Herrmann, LLNL Weapons, Physics and Design program director. “The laser was designed to give us as much juice as possible to make these incredible conditions possible.”

The NIF is more than 20 years old, and some of its technology dates back to the 1980s.

New laser architectures, target fabrication methods, materials, computation and simulations, and machine learning have federal officials optimistic the U.S. can achieve President Biden’s goal of a commercial fusion reactor within the decade.

DOE invested $50 million in a public-private partnership in September around fusion pilot power plant designs, but Budil said such a plant is likely still four decades away without “concerted effort and investment” on the technology side.

The private sector invested $3 billion in fusion research last year, and DOE is partnering with the White House Office of Science and Technology Policy to map out a vision for commercial fusion with zero-carbon energy powering homes, cars and heavy industry.

To its credit, the Biden administration proposed the biggest research and development budget in U.S. history, and recent investments enabled LLNL’s latest achievement.

“I think this is an amazing example of the power of America’s research and development enterprise,” said OSTP Director Arati Prabhakar.

Quantum-ready workforce tops White House, scientists’ list of needs
https://fedscoop.com/quantum-ready-workforce-david-awschalom/
Tue, 13 Dec 2022 20:26:35 +0000
“This is the time to change the model for how you build a technology workforce,” said physicist David Awschalom, after a White House meeting with top quantum scientists.

Workforce was the topic on most quantum scientists’ minds when 30 of the country’s best met at the White House on Dec. 2 to discuss the global quantum race.

Leaders of the five National Quantum Information Science Research Centers (NQISRCs) were among the attendees assessing their success accelerating QIS research and development, technology transfer, and workforce development since their launch mid-pandemic.

The National Quantum Initiative Act of 2018 allotted the Department of Energy $625 million for the centers, which have begun integrating companies into the U.S. QIS ecosystem. Gone are the days when monopolies like Bell Labs and IBM funded basic science in house, meaning U.S. investment in accessing the best quantum engineers is more important than ever to winning what has become a global race with China and Europe.

“This is the time to change the model for how you build a technology workforce,” David Awschalom, professor at the University of Chicago’s Pritzker School of Molecular Engineering and senior scientist at Argonne National Laboratory, told FedScoop. “This is an opportunity to build a very diverse, very inclusive and very equitable workforce.”

Awschalom participated in the Dec. 2 White House meeting, about half of which he said consisted of roundtable discussions on growing a quantum-ready workforce.

Companies participating in the Chicago Quantum Exchange, housed at the University of Chicago, are more concerned about the talent shortage than about even a more reliable qubit, Awschalom said.

Awschalom envisions an ecosystem where “quantum” is no longer an intimidating word for students, QIS is taught in high schools and community colleges, and the development of new technologies launches the careers of students at universities like Chicago State. The predominantly Black university regularly sees impressive students working full-time jobs to stay enrolled or making class sacrifices to care for children, he said.

Alleviating those complications could see tens of thousands more students enter the quantum workforce.

“We realized that to really address these questions properly we should probably have another meeting like this, where we bring in members from those communities to tell us what they need,” Awschalom said.

While no date was set, National Quantum Coordination Office Director Charles Tahan was clear that his door is open, and the White House wants to work together more with the QIS ecosystem, he added.

The second major topic of discussion at the White House meeting was “big” delays in obtaining rare-earth elements and unusual materials found outside the U.S. but required for a lot of quantum components, Awschalom said.

Helium-3 is needed for cryogenic experiments but became harder and more costly to obtain due to Russia’s war on Ukraine, while elements critical to quantum memories can only be mined in a few places globally.

Fortunately the U.S. is adept at nurturing startups, which could prove key to developing compact cryogenics and on-chip memories with silicon-compatible materials, Awschalom said.

The universities of Chicago and Illinois partner on the Duality Quantum Accelerator, which has hosted 11 quantum startups — including four from Europe — for research and development. At the NQISRC level, joint programs are forming between centers.

Since the birth of the qubit, rudimentary quantum processors have begun performing “remarkably fast” computations, and prototype networks are sending encrypted information around the world, Awschalom said. The University of Chicago’s 124-mile quantum link, which connects the university, the city of Chicago and the national labs, serves as a testbed for industry prototypes and will eventually extend into southern Illinois.

Whether the first beneficiary of quantum computing is precision GPS for microsurgery, improved telescope strength or some as-yet-unrealized application remains to be seen.

“The one thing we all know for sure in this field is that we don’t yet know the biggest impact,” Awschalom said. “So the United States and our centers have to be prepared; we need to be nimble.”

Post-quantum cryptography experts brace for long transition despite White House deadlines
https://fedscoop.com/quantum-crytography-experts-long-transition/
Mon, 12 Dec 2022 21:33:15 +0000
Agencies are finally starting to take the threat of quantum computers to their sensitive data seriously, but the task of inventorying vulnerable systems remains daunting.

The White House’s aggressive deadlines for agencies to develop post-quantum cryptography strategies make the U.S. the global leader on protection, but the transition will take at least a decade, experts say.

Canada led the Western world in considering a switch to post-quantum cryptography (PQC) prior to the Office of Management and Budget issuing its benchmark-setting memo on Nov. 18, which has agencies running to next-generation encryption companies with questions about next steps.

The memo gives agencies until May 4, 2023, to submit their first cryptographic system inventories identifying vulnerable systems, but they’ll find the number of systems reliant on public-key encryption — which experts predict forthcoming quantum computers will crack with ease — is in the hundreds or thousands. Applications, software, servers and switches often have their own cryptography, and agencies don’t necessarily have the technical expertise on staff to understand the underlying math.
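
A small, hedged sketch of what one slice of that inventory work can look like in practice: recording the TLS version and cipher each known endpoint negotiates, so systems that still depend on classical public-key exchange can be flagged. The hostnames are placeholders, and a real inventory would cover far more than TLS endpoints (libraries, VPNs, signed firmware and so on).

```python
# Hypothetical sketch of one small piece of a cryptographic inventory:
# record the TLS version and cipher each internal endpoint negotiates.
# Hostnames are placeholders; a real inventory would go much further.
import socket
import ssl

ENDPOINTS = ["example.gov", "intranet.example.gov"]  # assumed placeholder hosts

def tls_summary(host: str, port: int = 443) -> dict:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cipher, version, bits = tls.cipher()
            return {"host": host, "tls_version": version, "cipher": cipher, "bits": bits}

if __name__ == "__main__":
    for host in ENDPOINTS:
        try:
            print(tls_summary(host))
        except OSError as err:
            print({"host": host, "error": str(err)})
```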

“This will be the largest upgrade cycle in all human history because every single device, 27 billion devices, every network and communication needs to upgrade to post-quantum resilience,” Skip Sanzeri, chief operating officer at quantum security-as-a-service company QuSecure, told FedScoop. “So it’s a massive upgrade, and we have to do it because these quantum systems should be online — we don’t know exactly when — but early estimates are three, four years for something strong enough.”

Bearish projections have the first quantum computer going live in about a decade, or never, with scientists still debating what the definition of a qubit — the quantum mechanical analogue to a bit — should even be.

QuSecure launched three years ago but became the first company to deploy PQC for the government this summer, when it proved to the U.S. Northern Command and North American Aerospace Defense Command that it could create a quantum channel for secure aerospace data transmissions at the Catalyst Campus in Colorado Springs, Colorado. The company used the CRYSTALS-KYBER cryptographic algorithm, one of four the National Institute of Standards and Technology announced it would standardize, but a quantum computer doesn’t yet exist to truly test the security.

The first quantum security-as-a-service company to be awarded a Phase III contract by the Small Business Innovation Research program, QuSecure can contract with all federal agencies immediately. Customers already include the Army, Navy, Marines and Air Force, and the State, Agriculture, Treasury and Justice departments have inquired about services, Sanzeri said. 

QuSecure isn’t alone.

“We are having discussions right now with various federal agencies around what they should be doing, what they can be doing, in order to start today — whether it’s in building out the network architecture or looking at Internet of Things devices that are being sent into the field,” said Kaniah Konkoly-Thege, chief legal officer and senior vice president of government relations at Quantinuum, in an interview.

Defense and intelligence agencies are better funded and more familiar with classified programs requiring encryption services and therefore “probably in a much better position” to transition to PQC, Konkoly-Thege said.

Having served in the departments of the Interior and Energy, Konkoly-Thege said she’s “concerned” other agencies may struggle with migration.

“There are a lot of federal agencies that are underfunded and don’t have the resources, either in people or funding, to come and do what’s necessary,” she said. “And yet those agencies hold very important information.”

That information is already being exfiltrated in cyberattacks like the 2015 Office of Personnel Management hack, part of a harvest now, decrypt later (HNDL) strategy in which China collects encrypted data today to unlock it once fully realized quantum computers arrive.

Post-Quantum CEO Andersen Cheng coined the term, and his company’s joint NTS-KEM error-correcting code is in Round 4 of NIST’s PQC algorithm competition.

Cheng points to the fact he could trademark his company’s name as proof PQC wasn’t being taken seriously even in 2015 and certainly not the year prior, when he and two colleagues were the first to get a PQC algorithm to work in a real-world situation: a WhatsApp messaging application downloadable from the app store.

They took it down within 12 months.

“One of my friends in the intelligence world called me one day saying, ‘You’re very well known.’ I said, ‘Why?’ He said, ‘Well, your tool is the recommended tool by ISIS,’” Cheng told FedScoop in an interview. “It was a wonderful endorsement from the wrong party.”

While there wasn’t one moment that caused the U.S. government to take PQC seriously, Cheng said the “biggest” turning point was the release of National Security Memo-10 — which OMB’s latest memo serves as guidance for implementing — in May. That’s when the largest U.S. companies in network security infrastructure and finance began reaching out to Post-Quantum for consultation.

Post-Quantum now offers a portfolio of quantum-ready modules for not only secure messaging but identity, quorum sensing and key splitting.

Cheng said the Quantum Computing Cybersecurity Preparedness Act, sent to President Biden’s desk Friday, should become law given PQC’s momentum, but he has “slight” reservations about the OMB memo’s aggressive deadlines for agencies to declare a migration lead and to conduct an inventory audit.

“People are probably underestimating the time it will take because the entire migration — I’ve spoken to some very top-end cryptographers like the head of crypto at Microsoft and so on — our consensus is this is a multi-year migration effort,” Cheng said. “It will take 10 years, at least, to migrate.”

That’s because public-key encryption protects everything from Zoom calls to cellphones, and the National Security Agency isn’t yet recommending hybridization, which would allow for interoperability among the various NIST-approved algorithms and also whichever ones other countries choose. Agencies and companies won’t want to swap PKE out for new PQC algorithms that won’t work with each other, Cheng said.
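
Hybridization generally means pairing a classical key exchange with a post-quantum one so a session stays protected if either is broken. Below is a minimal sketch of only the combining step, assuming both shared secrets are already in hand; the placeholder byte strings stand in for an ECDH output and a Kyber-style KEM output, and the HMAC-based derivation shown is illustrative rather than any specific standard's construction.

```python
# Hypothetical sketch of hybrid key derivation: combine a classical and a
# post-quantum shared secret so the session key survives a break of either.
# The two input secrets are placeholders; real code would obtain them from an
# ECDH exchange and a NIST-selected KEM such as CRYSTALS-Kyber.
import hashlib
import hmac

classical_secret = bytes.fromhex("aa" * 32)     # placeholder ECDH output
post_quantum_secret = bytes.fromhex("bb" * 32)  # placeholder KEM output

def hybrid_session_key(secret_a: bytes, secret_b: bytes, context: bytes) -> bytes:
    """Derive one session key from both secrets with an HMAC-based extract/expand step."""
    ikm = secret_a + secret_b
    prk = hmac.new(b"hybrid-kdf-salt", ikm, hashlib.sha256).digest()      # extract
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()      # expand

key = hybrid_session_key(classical_secret, post_quantum_secret, b"example-session")
print(key.hex())
```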

Complicating matters further, NIST is approving the math behind PQC algorithms, but the Internet Engineering Task Force generally winds up defining connectivity standards. Post-Quantum’s hybrid PQ virtual private network is still being standardized by IETF, and only then can it be added to systems and sold to agencies.

Cheng recommends agencies not wait until their inventory audits are complete to begin talking to consultants and software vendors about transitioning their mission-critical systems because PQC expertise is in short supply. Large consulting firms have been “quietly” building out their quantum consulting arms for months, he said.

OMB’s latest memo gives agencies 30 days after they submit their cryptographic system inventory to submit funding assessments, a sign it won’t be an unfunded mandate, Sanzeri said. 

“This is showing that all of federal will be well into the upgrade process, certainly within 12 months,” he said.

Agencies finding their zero-trust priorities vary, funding needs less so
https://fedscoop.com/agency-zero-trust-priorities-vary/
Fri, 09 Dec 2022 01:34:20 +0000
HHS OIG is adjusting its zero-trust roadmap, while DOD's CIO for cybersecurity needs a data tagging and labeling standard and USCIS wants a more adaptive trust model.

Civilian and defense agencies have differing priorities in implementing their zero-trust security architectures, and they’re exploring a variety of avenues to fund their projects.

The Department of Health and Human Services Office of Inspector General is adjusting six foundational, zero-trust projects it identified based on the zero-trust strategy the Department of Defense released in November.

HHS OIG already has zero-trust technology procurements underway, though no deployments as of yet.

“We’re going to adjust that roadmap, based on the strategy that was released, because I like some of the 91 points that are in there,” said Chief Information Officer Gerald Caron, during the Fortinet Security Transformation Summit produced by Scoop News Group.

HHS OIG is “chasing” Technology Modernization Fund dollars right now, which would be a “gamechanger” for its zero-trust projects, Caron said. The agency recently entered Phase 2 of that process. 

Meanwhile DOD’s CIO for cybersecurity is working with the Chief Digital and Artificial Intelligence Office to propose a data tagging and labeling standard before the end of fiscal 2023.

“That’s critical to get to the later stages of zero trust, especially if you want to go to advanced zero trust, especially if you want to get sophisticated in the visibility and analytics pieces of zero trust,” said Randy Resnick, director of DOD’s Zero Trust Portfolio Management Office. “Because if you don’t know what data you’re sitting on, if it’s not properly tagged or labeled, it’s very difficult to do that analytics.”

The standard will also enable better data sharing across the enterprise.

DOD is in the midst of its Program Objective Memorandum cycle, where components seeking funding for zero-trust projects may place their requests, but they won’t receive the money for two years after approval. Resnick’s office is willing to offer bridge funding if a component can prove a “legitimate need,” given that zero trust is a “high priority” for DOD, Resnick said.

“I would suggest work with the portfolio office, and we will try to advocate for you for this year’s dollars, move things around,” he said.

U.S. Citizenship and Immigration Services is developing a more adaptive, fluid trust model because today a user or device on the network is often simply trusted or it’s not.

Machine learning will soon be making those decisions and “driving a very different type of risk model,” said Chief Information Security Officer Shane Barney.

“You’re going to be adapting trust more in a real-time sense,” Barney said. “And it’s going to be taking a number of very critical factors that do that in your environment.”
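
A toy sketch of what recomputing trust per request from several signals might look like follows; the signal names, weights and thresholds are invented for illustration and are not USCIS's actual model.

```python
# Hypothetical illustration of continuous, signal-based trust scoring.
# Feature names, weights and thresholds are invented, not USCIS's model.
from dataclasses import dataclass

@dataclass
class RequestContext:
    device_compliant: bool      # endpoint posture check passed
    mfa_verified: bool          # phishing-resistant MFA within this session
    geo_velocity_ok: bool       # no impossible-travel anomaly
    behavior_anomaly: float     # 0.0 (normal) to 1.0 (highly anomalous), e.g. from an ML model

def trust_score(ctx: RequestContext) -> float:
    score = 0.0
    score += 0.35 if ctx.device_compliant else 0.0
    score += 0.30 if ctx.mfa_verified else 0.0
    score += 0.15 if ctx.geo_velocity_ok else 0.0
    score += 0.20 * (1.0 - ctx.behavior_anomaly)
    return score

def access_decision(ctx: RequestContext) -> str:
    s = trust_score(ctx)
    if s >= 0.8:
        return "allow"
    if s >= 0.5:
        return "allow read-only, require step-up auth for writes"
    return "deny and alert"

print(access_decision(RequestContext(True, True, True, 0.1)))   # allow
print(access_decision(RequestContext(True, False, True, 0.6)))  # step-up
```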

Office of Space Commerce’s space traffic coordination pilot commences
https://fedscoop.com/osc-space-traffic-coordination-pilot/
Thu, 08 Dec 2022 01:38:21 +0000
The office is testing commercial space situational awareness services to determine if they can form the core of its new system.

Commercial space firms on Monday began conducting space situational awareness data analysis for the Office of Space Commerce as it takes over the role of space traffic coordinator.

Part of a two-month pilot providing spaceflight safety mission assurance to select spacecraft in medium Earth orbit (MEO) and geostationary Earth orbit (GEO), the analysis supports satellite tracking, safety notifications, and anomaly detection and alerts.
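
As a rough illustration of the kind of computation behind conjunction "safety notifications," the sketch below propagates two simplified circular orbits and flags any time step where the miss distance falls below a threshold. The orbits, threshold and time steps are invented placeholders, not the pilot's data or algorithms.

```python
# Toy conjunction screening: flag times when two objects pass within a
# miss-distance threshold. State vectors are invented circular-orbit
# placeholders, not real catalog data.
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def circular_orbit(radius_km: float, phase_rad: float, t: np.ndarray) -> np.ndarray:
    """Positions (km) of an object on an equatorial circular orbit over times t (s)."""
    n = np.sqrt(MU / radius_km**3)  # mean motion, rad/s
    ang = n * t + phase_rad
    return np.stack([radius_km * np.cos(ang), radius_km * np.sin(ang), np.zeros_like(t)], axis=1)

t = np.arange(0, 86400, 10.0)              # one day, 10-second steps
sat_a = circular_orbit(42164.0, 0.000, t)  # GEO-like object
sat_b = circular_orbit(42164.0, 0.001, t)  # nearby object, slightly offset

miss = np.linalg.norm(sat_a - sat_b, axis=1)
THRESHOLD_KM = 50.0
for i in np.flatnonzero(miss < THRESHOLD_KM)[:3]:
    print(f"t={t[i]:.0f}s  miss distance {miss[i]:.1f} km -> alert")
```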

OSC resides within the National Oceanic and Atmospheric Administration, part of the Department of Commerce, which a 2018 presidential directive gave until 2024 to assume the Department of Defense‘s responsibility for coordinating increasingly congested commercial and civil satellite orbits around Earth. The office contributed $850,000 to DOD for the award of space situational awareness (SSA) data analysis contracts to seven firms via the Joint Task Force-Space Defense Commercial Operations (JCO) vehicle, enabling the pilot.

“The space traffic coordination pilot project contracts — combined with the approximately $3.1 million in contracts that the Department of Commerce let in the summer for data purchase — are designed to demonstrate commercial capabilities and to help the Office of Space Commerce team develop the structure and operational requirements for the future system,” a NOAA spokesperson told FedScoop.

OSC hopes the pilot will help determine the extent to which commercial SSA services can augment or replace DOD’s existing capabilities, ideally forming the core of the new system.

COMSPOC Corp.; ExoAnalytic Solutions; Kayhan Space; KBR; NorthStar Earth & Space, Inc.; Slingshot Aerospace; and the Space Data Association received contracts.

DOD already awarded five contracts for GEO space object tracking data in September that also support the pilot.

Members of the Space Data Association will gather feedback from commercial satellite operators on the usefulness of the SSA services provided. OSC will then compare results to determine the maturity of SSA services and inform its approach to low Earth orbit (LEO).

“This pilot project helps usher in a new phase in how government and commercial operators work together to coordinate activities on-orbit,” said NOAA Administrator Rick Spinrad in a statement. “NOAA looks forward to continued collaborations that safely enhance the economic and technical potential of the U.S. commercial space sector.”
