Centers for Disease Control and Prevention Archives | FedScoop
https://fedscoop.com/tag/centers-for-disease-control-and-prevention/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community’s platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, to exchange best practices and to identify how to achieve common goals.

CDC’s generative AI pilots include school closure tracking, website updates
https://fedscoop.com/cdc-generative-ai-pilots-school-closure-tracking-website-updates/
Fri, 05 Apr 2024 18:29:29 +0000

The Centers for Disease Control and Prevention is testing out use cases for generative AI and sharing its approach with other federal partners as it plans to develop an agencywide AI strategy.

An artificial intelligence service deployed within the Centers for Disease Control and Prevention is being put to the test for things like modernizing its websites and capturing information on school closures, the agency’s top data official said. 

The tool — Microsoft’s Azure OpenAI service, configured for CDC use within the agency’s cloud infrastructure — offers both a chatbot for employees and the ability for more technical staff to build applications that connect to the service via an application programming interface (API), Alan Sim, CDC’s chief data officer, said in an interview with FedScoop. 

“The idea here is that we can allow for our CDC staff to practice innovation and gen AI safely, within CDC boundaries, rather than going out to third-party sites,” Sim said. 

In total, CDC has 15 pilots using the agency’s generative AI capabilities, primarily through the Azure OpenAI service, a spokesperson said.

Exploring generative AI uses comes as the CDC, like agencies throughout the federal government, looks to create its own approach to artificial intelligence. Roughly a year ago, CDC leadership got together to develop an AI roadmap, Sim said, and since then, it’s prioritized goals like working on the chatbot and developing guidance that it’s shared with others in the federal government.

Now, the agency is planning to develop an AI strategy that Sim said he’s hopeful will be released in late spring to early summer. That strategy will aim to set “high-level principles” for how the CDC wants to use AI to support the public, Sim said. 

“We’re still learning, but we’re trying our best to be innovative, responsive, and obviously sharing as we learn with our partners,” he said.

Piloted uses

The CDC’s pilots are varied in terms of application and topic, including HIV, polio containment, communications, analyzing public comments, and survey design. So far, there’s been positive feedback from the pilots that generative AI has “significantly enhanced data analysis, efficiency, and productivity,” Sim said.

In one of the more operational pilots, for example, communications staff is using AI to assist with updates to the CDC’s websites across the agency.

That process tends to be “tedious” and “manual,” Sim said. To make it easier, the Office of Communications is using an application, created by a data scientist at the agency, that connects to the Azure OpenAI API.

“This has allowed staff to begin summarizing, leveraging … the benefits of generative AI to help speed up the work,” Sim said. 
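An application wired to that API can be quite small. The sketch below builds an Azure OpenAI chat-completions request for a page-summarization task using only the Python standard library; the endpoint, deployment name, prompt, and API version are illustrative placeholders, not CDC’s actual configuration.

```python
import json
from urllib import request

API_VERSION = "2024-02-01"  # assumed api-version; check your resource

def build_chat_request(endpoint, deployment, api_key, page_text):
    """Build (but do not send) an Azure OpenAI chat-completions request
    for summarizing one web page. All names here are placeholders."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={API_VERSION}")
    body = {
        "messages": [
            {"role": "system",
             "content": "Summarize this web page content in plain language."},
            {"role": "user", "content": page_text},
        ],
        "temperature": 0.2,
    }
    return request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("https://example-res.openai.azure.com",
                         "gpt-4o-summaries", "REDACTED",
                         "About 1 in 5 adults reported ...")
# Sending would be: resp = request.urlopen(req) -- omitted here.
print(req.full_url)
```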

CDC is also looking to AI for tracking school closures, which it did during the COVID-19 pandemic to watch for potential outbreaks. 

That tracking — which included monitoring thousands of school district websites and various types of closures, from weather to disease outbreaks — was done manually. And although the funding for those efforts stopped in December 2022, Sim said, there’s “a recognition that it’s still important from a public health perspective to keep track of school closure information.” 

As a result, CDC developed an AI prototype to collect information via social media about closures at roughly 45,000 school districts and schools. That prototype is still being evaluated for effectiveness and for whether it’s something that can be scaled, but it’s something CDC is looking into, Sim said.
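CDC hasn’t described how the prototype works, but the core task, screening a post for a closure mention and a likely cause, can be illustrated with a deliberately simple keyword triage:

```python
import re

# Keyword triage for social posts. The real CDC prototype's method isn't
# public; this only illustrates the shape of the task.
CLOSURE_PATTERN = re.compile(
    r"\b(closed|closure|no school|e-?learning day|remote learning)\b", re.I)
CAUSE_PATTERNS = {
    "weather": re.compile(r"\b(snow|ice|storm|flood|hurricane)\b", re.I),
    "illness": re.compile(r"\b(flu|illness|outbreak|covid)\b", re.I),
}

def triage_post(text):
    """Return (is_closure, cause) for one social-media post."""
    if not CLOSURE_PATTERN.search(text):
        return False, None
    for cause, pattern in CAUSE_PATTERNS.items():
        if pattern.search(text):
            return True, cause
    return True, "unspecified"

print(triage_post("District schools closed Friday due to flu outbreak."))
# (True, 'illness')
```

A production system would need a real classifier; posts announcing reopenings or using sarcasm defeat keyword rules quickly.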

While the CDC isn’t using agency data with the generative AI service, training against relevant datasets could happen in the future, Sim said. “We haven’t gotten there yet, but that’s part of our roadmap is to sort of mature and learn from these initial pilots, and then just build upon that work,” he said. 

Generative AI guidance

In addition to working toward potential uses, CDC has also developed guidance for generative AI. That document “gets into some of the details” of leveraging generative AI tools responsibly, safely and equitably, Sim said. 

It’s also something the agency is sharing. Sim said CDC presented that guidance at the Chief Artificial Intelligence Officers Council and he’s shared the guidance with “many federal agencies.”

“We are just trying to do our part,” he said. “We are not necessarily experts, but we are sharing the progress that we’ve made.” 

Throughout the federal government, agencies have been creating their own generative AI policies for their employees that detail things like whether third-party tools are prohibited, what information shouldn’t be used in queries, and processes for approving potential uses of the technology. A recent Office of Management and Budget memo further directs agencies to “assess potential beneficial uses” of generative AI and establish safeguards. 

CDC declined to share a copy of its guidance.

Even though deploying an AI tool within CDC’s cloud infrastructure provides more security, Sim said there are always concerns. One of the reasons the agency is focused on machine-learning operations is so it can explore and provide guidance on best practices on things like ensuring developers are being transparent, being able to detect “model drift,” and certifying that a model isn’t amplifying bias.
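The article doesn’t say how CDC detects model drift; one common generic check is the Population Stability Index (PSI), which compares the distribution of a model’s scores at deployment against its current scores. A minimal sketch, not CDC’s actual method:

```python
import math

def _bin_fracs(sample, bins):
    # Histogram of scores assumed to lie in [0, 1].
    counts = [0] * bins
    for x in sample:
        counts[min(int(x * bins), bins - 1)] += 1
    total = len(sample)
    # Small floor keeps log() defined for empty bins.
    return [max(c / total, 1e-6) for c in counts]

def psi(expected, actual, bins=10):
    """Population Stability Index; rule of thumb: <0.1 stable,
    0.1-0.25 moderate shift, >0.25 likely drift."""
    e, a = _bin_fracs(expected, bins), _bin_fracs(actual, bins)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [i / 100 for i in range(100)]            # scores at deployment
shifted = [min(x + 0.3, 0.999) for x in baseline]   # scores drifted upward
print(round(psi(baseline, baseline), 4))  # 0.0: identical distributions
print(psi(baseline, shifted) > 0.25)      # True: large shift flags drift
```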

Ultimately, CDC wants to take a proactive approach to AI and machine learning so the agency is prepared for the next outbreak response and to empower state, local, tribal and territorial partners to leverage their data to gain efficiencies where it’s possible, Sim said.

“Any efficiencies that we can gain through these types of innovations, we’re always trying to support and encourage,” Sim said. 

CDC eyeing ‘model cards’ to detail generative AI tool information
https://fedscoop.com/cdc-eyeing-model-cards-to-detail-generative-ai-tool-information/
Wed, 31 Jan 2024 22:36:27 +0000

“Model cards” could provide context for using generative AI tools, an official told attendees at AFCEA Bethesda’s Health IT Summit 2024.

The Centers for Disease Control and Prevention is weighing the use of so-called “model cards” to detail key information about generative AI models it deploys, an agency data official said.

As part of its broader approach to AI governance, the CDC is considering “at least as a minimum” having model cards — which contain information like what’s in a model and how it’s made — deployed alongside its generative AI tools, Travis Hoppe, associate director for data science and analytics at the agency’s National Center for Health Statistics, said Tuesday.

“There’s always a risk when running a model, and you need that context for use,” Hoppe said at AFCEA Bethesda’s Health IT Summit 2024. “You need all of the quantitative metrics … but you also need this kind of qualitative sense, and the model card does capture that.” That information could be useful for evaluating potential risks when someone is considering new uses for a system years after it was initially deployed, Hoppe explained.
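A model card is ultimately just structured metadata shipped alongside the model. A minimal sketch, with illustrative fields rather than CDC’s actual template:

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class ModelCard:
    """Minimal model-card fields; every value below is hypothetical."""
    name: str
    version: str
    intended_use: str
    training_data: str
    quantitative_metrics: dict
    limitations: list = field(default_factory=list)

card = ModelCard(
    name="closure-post-classifier",
    version="0.1.0",
    intended_use="Flag social posts that may describe a school closure.",
    training_data="Hypothetical labeled posts; not a real dataset.",
    quantitative_metrics={"precision": 0.91, "recall": 0.84},
    limitations=["English-only", "May misread sarcasm or reopening notices"],
)
# Deploy this JSON alongside the model so later users have context for use.
print(json.dumps(asdict(card), indent=2))
```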

Considering model cards comes as the CDC, along with many other federal agencies, is exploring its own approach to governing generative AI use. The guardrails that agencies develop will ultimately play an important role in how the government interacts with the rapidly growing technology that it’s already using.

The CDC, for example, has started 15 generative AI pilots, Hoppe said, though he noted that those projects “are not particularly focused on public health impact.” Hoppe said the agency wanted to “tease out” things like security, how its IT infrastructure worked, and how employees interact with the tools before thinking about expanding uses in the rest of the agency. 

Meanwhile, Hoppe said the agency is in the process of developing guidance for generative AI. While the CDC is looking to executive orders, NIST’s AI Risk Management Framework, and the Department of Health and Human Services’ Trustworthy AI Playbook, he said much of what already exists isn’t “fully prescriptive” of what agencies should do.

“So we’re starting to write out some of these very prescriptive things that we should be doing, and kind of adapting it for our specific mission, which is obviously focused on public health,” Hoppe said. 

The panel discussion about generative AI featured several other HHS officials and was moderated by Paul Brubaker, deputy chief information officer for strategic integration of emerging concepts at the Department of Veterans Affairs Office of Information Technology.


Kevin Duvall, the Administration for Children and Families’ chief technology officer, said during the panel that his agency’s approach to generative AI is detailed in an interim policy that permits employee use of those tools with some constraints. That approach contrasted with other agencies’ prohibitions of third-party generative AI tools. 

Duvall said he doesn’t find it useful for the “government to artificially constrain something,” though he said there needs to be “checks and balances.” 

“I really make a comparison to probably discussions we were having 20, 25 years ago about search engines. You know, search engines can give unreliable results, so can gen AI,” Duvall said. 

One use case the agency has looked into for the technology is in the grants-making area, much of which is done through text, Duvall said, adding that the agency sees it as a “decision-assisting tool” and “not a decision-making tool.” 

HHS’s artificial intelligence use cases more than triple from previous year
https://fedscoop.com/hhs-ai-use-cases-more-than-triple/
Tue, 15 Aug 2023 17:09:31 +0000

The Department of Health and Human Services' annual AI use case inventory for fiscal 2023 includes 163 instances — up from 50 the previous year.

The Department of Health and Human Services’ publicly reported artificial intelligence footprint more than tripled from the previous year, adding new current and planned uses to its AI inventory like the classification of HIV grants and the removal of personally identifiable information from data.

The agency’s updated fiscal year 2023 AI use case inventory — which is required of agencies under a Trump-era executive order — shows 163 instances of the technology being operated, implemented, or developed and acquired by the agency. HHS’s public inventory for the previous fiscal year had 50 use cases.

“Artificial intelligence use cases tripling from FY22 to FY23 is indicative of HHS’s commitment to leverage trustworthy AI as a critical enabler of our mission,” HHS’s Chief Information Officer Karl S. Mathias told FedScoop in an email.

The increase in reported uses at the agency comes as the conversations about AI’s possible applications and risks have intensified with the rise in popularity of tools like ChatGPT. The Biden administration, which has made AI a focus, is crafting an executive order to address the budding technology and provide guidance to federal agencies on its use.

The largest share of AI tools used by HHS – 47 of them – is managed by the National Institutes of Health, according to FedScoop’s analysis of the data. The FDA manages 44, the second-highest number of uses, and the Administration for Strategic Preparedness and Response follows with 25 AI tools.  

Among the new instances reported in the inventory are tools used by NIH for classifying HIV-related grants and predicting stem cell research subcategories of applications, which were both implemented earlier this year. 

Meanwhile, the Centers for Disease Control and Prevention’s National Center for Health Statistics (NCHS) is exploring using an AI tool to transcribe cognitive interviews, which are used to evaluate survey questions and offer a detailed depiction of respondents’ meanings. According to the inventory, it plans to compare outputs from OpenAI’s automatic speech recognition system Whisper with those of VideoBank, a company that provides tools for managing digital assets such as recordings, and with manual transcription.
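A standard way to score such a comparison is word error rate (WER): the word-level edit distance between an ASR transcript and a manual reference, divided by the reference length. The inventory doesn’t say NCHS uses this metric; it’s shown here as the generic approach:

```python
def wer(reference, hypothesis):
    """Word error rate: (substitutions + insertions + deletions) / ref words,
    computed with a dynamic-programming edit distance over words."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[-1][-1] / max(len(ref), 1)

manual = "how often do you visit a doctor"   # hypothetical manual transcript
asr = "how often do you visit the doctor"    # hypothetical ASR output
print(round(wer(manual, asr), 3))  # 0.143: one substitution in seven words
```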

Also at NCHS, the agency is evaluating a tool from Private AI to identify, redact, and replace personally identifiable information in “free text data sets across platforms within the CDC network.” The database states that use is in the development and acquisition phase, though it also includes an implementation date of May 2, 2023.

AI use case inventories are required of federal agencies under a Trump-era executive order (EO 13960) aimed at promoting trustworthy AI in government. Under that order, agencies must review their current and planned AI uses annually, check for compliance with the order, share them with other agencies, and post them publicly.

A recent FedScoop review of large agencies’ handling of those inventories showed that efforts across the federal government have so far been inconsistent, varying in terms of process, what they include, and timelines for publication.

The new HHS inventory offers a more detailed look into the agency’s AI uses than its inventory last year and includes nearly every category required under the Chief Information Officers Council’s more expansive guidance for documenting uses in fiscal year 2023.

The agency’s inventory for fiscal 2022 included the name, agency, and description of each use. The fiscal 2023 inventory includes those categories plus the stage of every use case and whether it was contracted. Some uses also include the dates it was initiated, began development and acquisition, and was implemented. 

A little more than a third, 36%, of HHS’s reported AI uses are in the operation and maintenance phase, 28% are in development and acquisition, 20% are in initiation, and 16% are in implementation. 

One key requirement of the executive order was to bring into compliance or retire uses that didn’t comply with its framework for AI use in government.

In response to an inquiry about any use cases that were retired or abandoned by agencies since the last inventory, Mathias said: “Some artificial intelligence use cases, like other technology projects, have pivoted or are no longer pursued for various reasons, but none have been retired because of lack of consistency with principles of Executive Order 13960 of December 3, 2020.”

HHS, health information networks expect rollout of trusted data exchange next year: Micky Tripathi
https://fedscoop.com/health-information-networks-tefca-success/
Thu, 22 Dec 2022 19:00:00 +0000

About 30% of hospitals remain unconnected to a health information network, but the implementation of network-to-network interoperability may change that.

Multiple applicants expect to have fully operational health information networks for securely sharing clinical data within a year of receiving approval, according to National Coordinator for Health IT Micky Tripathi.

A couple of networks are already live, and the Office of the National Coordinator for Health IT hopes the first group — among 12 entities that submitted letters of intent — will be officially designated qualified health information networks (QHINs) in early 2023.

Part of the Department of Health and Human Services, ONC published a framework in January for exchanging health information nationwide: the Trusted Exchange Framework and Common Agreement (TEFCA). Required by the 21st Century Cures Act, the framework provides non-binding principles while the common agreement sets technical terms, and it now falls to ONC’s recognized coordinating entity, The Sequoia Project, to approve interoperable QHINs.

“What we’ve heard informally from a number of the prospective QHINs is that they are building in anticipation of getting approved,” Tripathi said, during eHealth Exchange’s annual meeting on Dec. 15. “They think that they would have a pretty good opportunity to do this in the 12-month window and hopefully shorter than that with some of them.”

QHINs will be added on a rolling basis to include electronic health record (EHR) vendors, ambulatory practices, hospitals, health centers, federal and public health agencies, and payers. Epic Systems became the first EHR vendor to announce it would seek QHIN status in August and was later joined by the likes of the eHealth Exchange network and trade association CommonWell Health Alliance.

How TEFCA coexists with other exchanges when it comes to benefits determinations, health care operations, treatment, payment and individual access remains to be seen. But scaling TEFCA will be the “real challenge” and one for which incorporating the Health Level Seven (HL7) Fast Healthcare Interoperability Resource (FHIR) data standard will be key, Tripathi said.

FHIR application programming interfaces streamline health information exchange by eliminating the need for separate data use agreements, and eventually they’ll enable questionnaires, scheduling, links, Clinical Decision Support hooks and subscriptions. That’s why there are already federal deadlines in place for their steady adoption across the public health ecosystem, but QHIN-to-QHIN brokered exchange remains years away.

By the end of 2022, certified EHR vendors must make a FHIR API available to customers.
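At its simplest, a FHIR API serves resources as JSON. The sketch below constructs a minimal FHIR R4 Patient resource of the kind a GET /Patient/{id} call returns; real exchanges validate against the full R4 specification and US Core profiles, which require far more than this:

```python
import json

def minimal_patient(patient_id, family, given):
    """A minimal FHIR R4 Patient resource as a plain dict. This only shows
    the JSON shape a FHIR API returns; production systems validate against
    the full StructureDefinition and applicable profiles."""
    return {
        "resourceType": "Patient",
        "id": patient_id,
        "name": [{"family": family, "given": [given]}],
    }

# All values here are invented for illustration.
patient = minimal_patient("example-123", "Doe", "Jane")
print(json.dumps(patient, indent=2))
```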

HL7’s Helios FHIR Accelerator aims to improve the exchange of situational awareness information on available hospital and intensive care unit beds, ventilator counts, personal protective equipment counts, and vaccinations. The HHS Protect system launched during the height of the COVID-19 pandemic provides a lot of that information right now.

“But it’s done via spreadsheets,” Tripathi told FedScoop in July. “A lot of manual work is still done to populate that now.”

The government has spent about $40 billion on EHR infrastructure since the passage of the Health IT for Economic and Clinical Health (HITECH) Act in 2009. Yet clinical operations and health payment systems remain largely rooted in paper because states — most of which still don’t require electronic case reporting — have health authority in the U.S.

Jurisdictional issues and scarce resources are some reasons why about 30% of U.S. hospitals still don’t connect to a health information network, Tripathi said Dec. 15.

Naturally issues with case reports, lab and testing results, and vital records arose early in the pandemic, when they were often being shared by phone or fax.

For all these reasons the Centers for Disease Control and Prevention launched its Data Modernization Initiative (DMI) in 2020 to streamline sharing of electronic health information between care providers and state, local, tribal and territorial (SLTT) health departments. 

The DMI’s first phase has involved getting data from electronic sources into a Microsoft Azure cloud environment, called the Enterprise Data Analytics and Visualization (EDAV) platform, while providing SLTT health departments with automated forecasting analytics tools.

Data standardization is key to improving information sharing between these systems, which is why ONC is working closely with the CDC on its North Star Architecture. The U.S. Core Data for Interoperability (USCDI) Version 4 (v4) that ONC has planned for 2023 will become the de facto minimum set of health data classes and elements for nationwide, interoperable information exchange.

At the same time ONC is developing USCDI+, a nationwide public health data model, for release beyond 2023. Discussions with the CDC and Centers for Medicare and Medicaid Services revealed more than 20 data elements that overlapped, allowing the agencies to agree on a common approach.

ONC is now speaking with the White House Office of Science and Technology Policy and the National Institutes of Health about tailoring a USCDI+ program for President Biden’s Cancer Moonshot program.

EHR vendors support TEFCA and the DMI because they’ll be able to maintain just one customer interface, rather than hundreds to meet the various jurisdictional requirements of SLTT health departments, Tripathi said in July.

Phase I of the DMI is also improving the CDC’s situational awareness, which is based on the Data Collation and Integration for Public Health Event Response (DCIPHER) platform — originally intended to track food-borne diseases. DCIPHER gave rise to HHS Protect and has since had hospital capacity, social vulnerability, mobility, race and ethnicity, social determinants of health, economic, two-on-one, and climate data layered atop it as part of the new Center for Forecasting and Outbreak Analytics’ work, Dr. Dan Jernigan, deputy director for public health science and surveillance, told FedScoop in August.

The center can already do weekly influenza and some Mpox forecasting and has visibility into emerging problems at about 70% of emergency departments.

“To see a fully formed prediction center, it’s going to be a couple years,” Jernigan said. “The numbers of staff that are in the Center for Forecasting right now are in the tens to thirties, but it is anticipated to be a much larger group.”

As part of DMI Phase I, 128 reportable diseases now automatically trigger EHR electronic case reporting, which is routed to the Association of Public Health Laboratories-APHL Informatics Messaging Services (APHL-AIMS) cloud platform and then SLTT health departments. Electronic case reporting increased from 187 facilities pre-pandemic to more than 14,000, more than 30 of which turned on Monkeypox reporting.

While the effort highlights the CDC’s move toward pathogen- and program-agnostic systems through its DMI, electronic case reporting continues to fall short.

“It’s not nearly the volume that we need it to be,” Tripathi said in July. “But at least we’re starting to set up those pathways.”

At the same time the DMI has seen “dramatic improvements” in COVID-19 reporting across immunization information systems (IISs), he added.

IISs were slow to take adult COVID-19 vaccination information, but now they accept line-listed records using privacy-preserving record linkage — even for Monkeypox.

The CDC recently revised its DMI implementation plan, and Phase 2 will focus on improving state health departments’ cloud infrastructure and updating the National Electronic Disease Surveillance System (NEDSS) Base System (NBS) that 26 states use for case management.

Cloud migration allows doctors like Phil Huang, director of Dallas Health and Human Services, to match immunization, lab and death records to know if a patient who passed away tested positive for COVID-19 and was vaccinated. 

“That ability to put that data together and integrate it with other kinds of information, even down to the neighborhood level, helps him do his prevention work and his mitigation work in a much more targeted way,” Jernigan said. 

CDC proposed the DMI receive about $200 million in fiscal 2023 to continue its “incremental” progress, but the Healthcare Information and Management Systems Society estimated the initiative needs $33 billion over the next 10 years to be successful, he added.

Meanwhile ONC, unable to enforce TEFCA, is working with federal partners to highlight the need for network-to-network interoperability and hoping its rollout leads outside providers to question why they’re still faxing records.

“We were given no dollars and no new authorities to do this,” Tripathi said.

Special committee calls for ‘reboot’ of federal technology accessibility oversight
https://fedscoop.com/reboot-technology-accessibility-oversight/
Wed, 14 Dec 2022 23:33:55 +0000

Section 508 of the Rehabilitation Act hasn't been updated since 1998, and in the meantime departments like VA have committed hundreds of thousands of violations.

The Department of Justice must “reboot” critical oversight of federal technology accessibility by resuming required biennial progress reports not issued since 2012, according to the Senate Special Committee on Aging’s Democrats.

Majority staff released its “Unlocking the Virtual Front Door” report after an 11-month investigation, which found that the Department of Veterans Affairs committed hundreds of thousands of accessibility violations that took years to address due to insufficient oversight — a problem extending to other federal departments.

Congress added Section 508 to the Rehabilitation Act in 1986, mandating that federal technology be accessible to people with disabilities, and last strengthened the statute in 1998. But oversight and enforcement haven’t kept pace with the U.S.’s aging population, with 80.8 million people expected to be over 65 by 2040.

“The entire federal government needs to wake up to this issue because a whole-of-government approach is what we need to remedy it,” said committee chair Sen. Bob Casey, D-Penn., in a statement. “We would not ask someone using a wheelchair to walk up the courthouse steps, but we are doing something similar when we ask people with disabilities to use federal websites that are not accessible.”

Blind Air Force veteran Ron Biglin from Clarks Summit, Pennsylvania, can’t access the VA’s My HealtheVet website with the screen reader the department provided him, and the Centers for Disease Control and Prevention‘s website for COVID-19 prevalence data was no better, the committee’s majority staff found.

DOJ recently committed to resuming Section 508 progress reports, and the committee’s majority staff is recommending the General Services Administration begin publishing data on Section 508 compliance and inspectors general increase oversight of their agencies’ compliance.

“Increased oversight from inspectors general would likely result in improved accessibility for taxpayers and workers using federal technology,” reads the report. “Such oversight may also lead to cost savings, given that remediating non-compliant websites resulted in additional costs for departments and agencies.”

The report further recommends that agencies like the VA maintain their ability to conduct automated Section 508 compliance scans; the department had canceled a contract, undermining that capability for a year.
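Automated scans apply rule checks to page markup. One of the simplest such rules, the requirement that images carry text alternatives, can be sketched with Python’s standard-library HTML parser; real Section 508 scanners test hundreds of criteria:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags that lack an alt attribute entirely.
    (An empty alt="" is valid for decorative images.)"""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())  # (line, column)

checker = MissingAltChecker()
checker.feed('<p>Forecast</p><img src="map.png">'
             '<img src="logo.png" alt="CDC logo">')
print(len(checker.violations))  # 1: map.png lacks alt text
```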

Agencies should include people with disabilities and older adults in their technology planning and evaluation and consider using human testers to evaluate Section 508 compliance and technology accessibility, akin to the Department of Homeland Security‘s Trusted Tester program. Appointing accessibility officers, whose job it is to ensure Section 508 compliance within their agency, could also help, according to majority staff.

“Rather than locating Section 508 compliance among multiple offices, housing Section 508 compliance responsibilities within existing accessibility offices or creating offices whose responsibilities include oversight of Section 508 compliance could improve the accessibility of technology within these organizations,” reads the report.

The report also recommends agencies ensure Section 508 complaints can be filed and that Congress:

  • consider updating the statute’s language to add accountability measures, grant the Access Board enforcement authority, allocate funding and evaluate complaint resolutions;
  • have authorizing committees and spending bills hold other agencies accountable for Section 508 compliance; and
  • ensure accessibility of its own technology and websites.

“Accessible technology is crucial for people seeking to secure health care, receive Social Security and VA benefits, pay taxes, and navigate federal information ranging from weather forecasts to economic data,” reads the report. “The COVID-19 pandemic increased the nation’s reliance on the internet to access basic services — driving home the importance of accessible federal websites and communications technology.”

CDC awards Palantir consolidated disease surveillance contract worth $443M
https://fedscoop.com/cdc-palantir-common-operating-picture/
Wed, 07 Dec 2022 16:00:00 +0000

Public health infrastructure the tech company deployed at the height of the pandemic is being scaled for generalized response to diseases like Monkeypox.

The Centers for Disease Control and Prevention awarded a $443 million contract consolidating and renewing software and digital capabilities Palantir provides for disease surveillance and outbreak response, the technology company announced Wednesday.

Running five years, the contract unites the Palantir-driven Health and Human Services (HHS) Protect, Administration for Strategic Preparedness and Response (ASPR) Engage, Tiberius and DCIPHER programs into what the CDC is calling its Common Operating Picture.

The CDC contracted Palantir to launch the public health infrastructure programs during the height of the pandemic, and the new Common Operating Picture approach will allow for long-term, interagency planning and operational consistency around outbreak and incident preparedness.

“There’s no way Palantir could do what we’re doing in this space without a really deep emphasis on partnership and interoperability, not only with our federal partners but with other technology systems and the other key players in the public health technology landscape,” Hirsh Jain, head of public health federal at Palantir, told FedScoop in an exclusive interview. “By definition a Common Operating Picture really requires that level of engagement with other systems and other entities.”

The Common Operating Picture allows Palantir to scale its modular technology beyond the specifics of COVID-19 for more generalized public health response to diseases like Monkeypox and respiratory syncytial virus (RSV).

Beneficiaries include federal agencies, jurisdictional health departments and industry partners, which rely on the Common Operating Picture for disease surveillance, outbreak preparedness and response, and supply chain visibility and management.

“Every platform is being used for use cases and mission areas beyond COVID,” Jain said. “The underlying modules were configured in a way that allows for pretty immediate expansion beyond COVID and reusability across that broader space of diseases.”

The National Wastewater Surveillance System uses Palantir software for wastewater genomics, while the Predict Division within the CDC’s new Center for Forecasting and Outbreak Analytics is embarking on an “ambitious” effort to deploy models and analytics addressing critical needs as they arise, Jain said.

Advances in such work are more likely given the length of the Common Operating Picture contract.

“We’re really excited about the 5-year commitment here, knowing what the last five years have been like,” Jain said. “Having long-term preparedness and public health response infrastructure in place is so critical, and this gives Palantir the place to support CDC and the broader public health ecosystem in delivering that.”

National Center for Health Statistics targeting fall launch of virtual data enclave https://fedscoop.com/nchs-to-launch-virtual-data-enclave/ Thu, 11 Aug 2022 20:46:24 +0000 https://fedscoop.com/?p=57941 The launch of the enclave represents a culture shift as it works to share more data with researchers, says Neil Russell.

The post National Center for Health Statistics targeting fall launch of virtual data enclave appeared first on FedScoop.

The National Center for Health Statistics is testing a virtual data enclave internally to make its sensitive data available to more researchers, with plans to onboard select pilot projects in the fall, according to its Research Data Center director.

Speaking at the Joint Statistical Meetings in Washington, Neil Russell said researchers will be able to use the virtual data enclave (VDE) from wherever they are to find and request data from NCHS.

The launch of the enclave represents a culture shift for a “fairly conservative” federal statistical agency, in response to the Foundations for Evidence-Based Policymaking Act of 2019 encouraging increased data releases, Russell said. NCHS — the Centers for Disease Control and Prevention center that tracks vital statistics on births, deaths, diseases and conditions to inform public health decisions — recognized researchers having to go to one of four secure research data centers (RDCs) or 32 federal statistical RDCs (FSRDCs) nationwide to access its restricted-use data was impractical.

“There is a definite financial hurdle to accessing restricted-use data through the physical data enclave model,” Russell said. “And we’re hopeful that a whole new group, or cohort, of researchers may be motivated to access the restricted-use data through a virtual data enclave.”

A researcher in New Mexico, which lacks any RDCs or FSRDCs, will no longer need to travel to Texas, Colorado or Utah to obtain the restricted-use data they need for their work. And no background investigations will be required of researchers at NCHS, which sponsored the VDE.

RDCs closed at the height of the COVID-19 pandemic, but the VDE can operate 24/7 in theory.

The VDE is 99% built and Windows-based with familiar software — namely SAS so researchers can write code to generate outputs they then request from NCHS — to be customer friendly, Russell said.

NCHS’s sister agency, the National Institute for Occupational Safety and Health, already had an operational VDE, so NCHS didn’t require a new contract. Instead, in September, NCHS sent NIOSH its enclave requirements, designed for data scientists, along with payment drawn from CDC Data Modernization Initiative funds.

NIOSH had no way of handling non-CDC employees logging into the VDE, so the General Services Administration’s Login.gov service was used. Outside researchers must show their driver’s license to create an account, and NCHS conducts virtual inspections of their offsite locations.

NCHS further had NIOSH build a tracking system to create an audit trail for all data released.

NIOSH’s VDE already had an authority to operate at the Federal Information Security Management Act moderate level; it encrypted researchers’ connections, required two-factor authentication, and prevented downloading, copy-pasting, printing and emailing of data.

To address the rest of the risk of data exfiltration, NCHS requires researchers and, in some cases, their employers to sign data-use agreements specifying where they’d like to access the data from via a secure server.

While NCHS can’t prevent violations of that agreement, such as a researcher photographing their output before submitting it to NCHS for review, violators can be caught.

“I’ve seen journal articles produced through restricted use data that we didn’t know where they got it from; we know it happens,” Russell said. “Your access to the data will be terminated and your employer notified.”

Researchers still must pay a data access fee, and NCHS hasn’t calculated the true operational cost of the VDE just yet.

If more researchers seek VDE access than NCHS can handle, which seems likely, Russell will have to ask the CDC for additional funding to scale the environment.

“It is possible that the demand for this mode of access will outstrip our supply,” Russell said. “Currently I only have approval to stand up 10 virtual machines, which seems ridiculous.”

White House unveils HEAT.gov to help address record-breaking temperatures https://fedscoop.com/white-house-unveils-heat-gov/ Tue, 26 Jul 2022 19:05:53 +0000 https://fedscoop.com/?p=56492 The website will help agencies share real-time data on extreme heat conditions to improve response in light of climate change.

The post White House unveils HEAT.gov to help address record-breaking temperatures appeared first on FedScoop.

A group of federal agencies launched Heat.gov to share real-time data on extreme heat conditions and response with each other, state and local officials, and the public.

The website contains information from the National Integrated Heat Health Information System (NIHHIS), an interagency partnership developed by the Centers for Disease Control and Prevention and the National Oceanic and Atmospheric Administration, which also includes extreme heat preparedness and response resources.

Heat.gov is one in a series of actions the Biden administration is taking to address record-breaking temperatures exceeding 100 degrees Fahrenheit, driven by climate change, that are sending tens of thousands of Americans to emergency rooms and increasing health problems for seniors, children and workers.

“Heat related illnesses and death are largely preventable with proper planning, education and action,” reads the Heat.gov homepage. “Heat.gov serves as the premier source of heat and health information for the nation to reduce the health, economic and infrastructural impacts of extreme heat.”

The website’s first visual depicts the number of people in the U.S. under a National Weather Service extreme heat advisory, watch or warning during the last 30 days. Below that are the current national heat forecast, the CDC Heat & Health Tracker and The Climate Explorer, which is derived from global climate models projecting coming decades.
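The alert data behind a visual like that is publicly available: the National Weather Service exposes active alerts as GeoJSON via api.weather.gov. The sketch below is illustrative only, not how Heat.gov itself is built; it assumes the standard shape of the NWS `/alerts/active` payload and filters on the NWS heat-related product names.

```python
# NWS heat-related alert products, matching the advisory/watch/warning
# categories Heat.gov's visual describes.
HEAT_EVENTS = {"Excessive Heat Warning", "Excessive Heat Watch", "Heat Advisory"}

def count_heat_alerts(payload: dict) -> int:
    """Count heat-related alerts in an NWS /alerts/active GeoJSON payload."""
    return sum(
        1
        for feature in payload.get("features", [])
        if feature.get("properties", {}).get("event") in HEAT_EVENTS
    )

# To query live, GET https://api.weather.gov/alerts/active (the NWS API asks
# for a User-Agent header identifying the caller), parse the JSON response,
# and pass the resulting dict to count_heat_alerts().
```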

Meanwhile the Department of Health and Human Services, through the White House’s new Office of Climate Change and Health Equity, launched a Climate and Health Outlook to inform health professionals of climate events in the next 30 to 60 days and improve response.

US, UK governments offer $1.6M for tech that trains AI while preserving privacy https://fedscoop.com/us-uk-ai-privacy-competition/ Wed, 20 Jul 2022 11:00:00 +0000 https://fedscoop.com/?p=55966 The prize money will be split between winners in financial crime prevention and pandemic response tracks.

The post US, UK governments offer $1.6M for tech that trains AI while preserving privacy appeared first on FedScoop.

The U.S. and U.K. governments launched a competition series Wednesday offering $1.6 million in prize money for developing privacy-enhancing technologies allowing artificial intelligence models to be trained with sensitive data safely.

Entrants will create federated learning solutions that prevent organizations’ raw data from being revealed, shared or combined as it’s used to train AI improving financial crime detection, forecasting a person’s risk of infection during a pandemic, or both.
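The core idea behind the federated learning solutions the competition calls for can be sketched briefly: each organization trains on its own private data and shares only model parameters, which a coordinator averages (the "FedAvg" scheme). This is a minimal illustration with synthetic data and invented numbers, not the competition's actual framework or datasets.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's gradient steps on its private data (logistic regression).
    Raw X and y never leave the client; only the updated weights do."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-X @ w))
        w -= lr * (X.T @ (preds - y)) / len(y)
    return w

def federated_average(weights_list, sizes):
    """Coordinator averages client models, weighted by local dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(weights_list, sizes))

# Two "organizations" with private datasets drawn from the same task.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(100, 3)), rng.normal(size=(80, 3))
true_w = np.array([1.0, -2.0, 0.5])
y1 = (X1 @ true_w > 0).astype(float)
y2 = (X2 @ true_w > 0).astype(float)

global_w = np.zeros(3)
for _ in range(20):  # communication rounds: only weights cross the wire
    w1 = local_update(global_w, X1, y1)
    w2 = local_update(global_w, X2, y2)
    global_w = federated_average([w1, w2], [len(y1), len(y2)])
```

In a real privacy-enhancing deployment the averaging step would itself be protected (for example with secure aggregation or differential privacy) so the coordinator never sees individual clients' weights in the clear.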

The series was announced at the 2021 Summit for Democracy, and winning solutions in the financial crime and public health emergency tracks, or generalized solutions, will be showcased when President Biden convenes the summit again in early 2023.

“This important initiative reflects our common purpose of developing technologies and driving innovation in a manner that reinforces our commitment to and expression of democratic values and the fundamental right to privacy,” said Alondra Nelson, outgoing director of the White House Office of Science and Technology Policy, in the announcement.

The financial crime prevention track is aimed at the problem of international money laundering — a $2 trillion-a-year challenge, according to U.N. estimates — and will supply entrants with realistic but synthetic global transaction data created by financial messaging service SWIFT. Solutions should preserve privacy while facilitating information sharing and collaborative analytics to detect anomalous payments.

Entrants in the track will be afforded the opportunity to engage with international regulators: the Financial Crimes Enforcement Network and the U.K.‘s Financial Conduct Authority, Information Commissioner’s Office, and National Economic Crime Centre.

The pandemic response track is intended to strengthen global readiness for current and future disease outbreaks by providing entrants with a synthetic, regional population dataset from the University of Virginia’s Biocomplexity Institute in designing secure infection risk forecasting solutions.

Entrants in the track will be able to contact the Centers for Disease Control and Prevention and the U.K.’s National Health Service and Data and Analytics Research Environments.

“Building on decades of [National Science Foundation] research investment in the field, these prize challenges will accelerate the translation of game-changing privacy-enhancing technologies,” said Sethuraman Panchanathan, NSF director, in a statement. “In this way, these prize challenges … illustrate the synergy of foundational research and translational activities in moving research to practice.”

Larry Ellison: Oracle will modernize and expand Cerner’s Millennium platform https://fedscoop.com/larry-ellison-oracle-will-modernize-and-expand-cerners-millennium-platform/ Thu, 09 Jun 2022 22:01:03 +0000 https://fedscoop.com/?p=53501 Oracle’s chairman and CTO provides details about the future of the platform behind the VA's troubled electronic health records modernization program.

The post Larry Ellison: Oracle will modernize and expand Cerner’s Millennium platform appeared first on FedScoop.

Oracle will undertake work to modernize and expand Cerner’s Millennium electronic health records platform, including adding modules such as voice activation and integrated telemedicine, according to the company’s chairman and CTO.

Speaking Thursday at the first public briefing by executives after Oracle’s $28.3 billion acquisition of Cerner closed, Larry Ellison set out a future vision in which the medical records company will play a key role in the cloud giant’s further expansion into global healthcare technology.

“We [a combination of Oracle and Cerner] are going to modernize and expand Millennium substantially. The first thing we are going to do is to make it easy to use. We’re going to have a voice user interface to Millennium that makes it easier for doctors to access medicine and orders.”

He added: “There is also an integrated telemedicine module that allows [users] to consult. If you are living in rural America and you want to consult with a specialist at MD Anderson for cancer, then you can do that via secure video teleconference.”

According to the executive, Oracle will also add disease-specific AI modules — including a recently developed cancer-specific module — to the system.

Details of the changes come a day after Oracle’s $28.3 billion acquisition of medical records company Cerner closed. The U.S. government is among Cerner’s biggest clients, and the technology company currently has contracts with the Coast Guard, Centers for Disease Control and Prevention, Department of Health and Human Services, Centers for Medicare and Medicaid Services, Department of Defense and Department of Veterans Affairs.

At the Department of Veterans Affairs, Cerner’s Millennium platform has formed the backbone of the agency’s troubled electronic health records (EHR) modernization program, which continues to attract scrutiny from lawmakers and government oversight bodies because of persistent outages.

Late last month, Senate lawmakers unanimously passed legislation that would require the Department of Veterans Affairs to report the costs of its EHR modernization program more regularly and in greater detail.

Speaking at Thursday’s briefing event, Ellison also set out Oracle’s wider vision for Cerner, which will serve as the foundation of a new project: plans for the cloud giant to build a unified health database holding anonymized data belonging to millions of Americans.

Oracle wants to create a new overarching nationwide system for patient health records across the U.S., which according to Ellison could overcome patient data fragmentation and allow doctors at any hospital to access a patient’s data when needed.

The new health records system would also help public health officials access data more quickly during a global health crisis such as a pandemic, he said.
