Data Modernization Initiative Archives | FedScoop
https://fedscoop.com/tag/data-modernization-initiative/

HHS, health information networks expect rollout of trusted data exchange next year: Micky Tripathi
https://fedscoop.com/health-information-networks-tefca-success/ | Dec. 22, 2022
About 30% of hospitals remain unconnected to a health information network, but the implementation of network-to-network interoperability may change that.

Multiple applicants expect to have fully operational health information networks for securely sharing clinical data within a year of receiving approval, according to National Coordinator for Health IT Micky Tripathi.

A couple networks are live, and the Office of the National Coordinator for Health IT hopes the first group — among 12 entities that submitted letters of intent — will be officially designated qualified health information networks (QHINs) in early 2023.

Part of the Department of Health and Human Services, ONC published a framework in January for exchanging health information nationwide: the Trusted Exchange Framework and Common Agreement (TEFCA). Required by the 21st Century Cures Act, the framework lays out non-binding principles while the Common Agreement supplies the technical terms, and it now falls to ONC’s recognized coordinating entity, The Sequoia Project, to approve interoperable QHINs.

“What we’ve heard informally from a number of the prospective QHINs is that they are building in anticipation of getting approved,” Tripathi said, during eHealth Exchange’s annual meeting on Dec. 15. “They think that they would have a pretty good opportunity to do this in the 12-month window and hopefully shorter than that with some of them.”

QHINs will be added on a rolling basis to include electronic health record (EHR) vendors, ambulatory practices, hospitals, health centers, federal and public health agencies, and payers. In August, Epic Systems became the first EHR vendor to announce it would seek QHIN status; it was later joined by the likes of the eHealth Exchange network and the trade association CommonWell Health Alliance.

How TEFCA coexists with other exchanges when it comes to benefits determinations, health care operations, treatment, payment and individual access remains to be seen. But scaling TEFCA will be the “real challenge” and one for which incorporating the Health Level Seven (HL7) Fast Healthcare Interoperability Resource (FHIR) data standard will be key, Tripathi said.

FHIR application programming interfaces streamline health information exchange by eliminating the need for separate data use agreements, and eventually they’ll enable questionnaires, scheduling, links, Clinical Decision Support hooks and subscriptions. That’s why there are already federal deadlines in place for their steady adoption across the public health ecosystem, but QHIN-to-QHIN brokered exchange remains years away.

By the end of 2022, certified EHR vendors must make a FHIR API available to customers.
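
For readers unfamiliar with FHIR, the API is a plain RESTful interface over standardized resources. The sketch below shows what a standards-pattern read and search might look like from a client’s perspective; the server URL, token and resource IDs are placeholders, not any real vendor’s endpoint.

```python
import requests

# Hypothetical FHIR R4 server and OAuth token -- placeholders, not a real endpoint.
BASE_URL = "https://ehr.example.org/fhir/R4"
HEADERS = {
    "Authorization": "Bearer example-oauth2-token",
    "Accept": "application/fhir+json",
}

def get_patient(patient_id: str) -> dict:
    """Read a single Patient resource using the standard FHIR REST pattern."""
    resp = requests.get(f"{BASE_URL}/Patient/{patient_id}", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

def get_immunizations(patient_id: str) -> list:
    """Search Immunization resources for a patient via standard FHIR search parameters."""
    resp = requests.get(
        f"{BASE_URL}/Immunization",
        params={"patient": patient_id},
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    patient = get_patient("example-patient-id")
    print(patient.get("name"))
    for immunization in get_immunizations("example-patient-id"):
        print(immunization.get("vaccineCode", {}).get("text"))
```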

HL7’s Helios FHIR Accelerator aims to improve the exchange of situational awareness information on available hospital and intensive care unit beds, ventilator counts, personal protective equipment counts, and vaccinations. The HHS Protect system launched during the height of the COVID-19 pandemic provides much of that information right now.

“But it’s done via spreadsheets,” Tripathi told FedScoop in July. “A lot of manual work is still done to populate that now.”

The government has spent about $40 billion on EHR infrastructure since the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009. Yet clinical operations and health payment systems remain largely rooted in paper because public health authority in the U.S. rests with the states, most of which still don’t require electronic case reporting.

Jurisdictional issues and scarce resources are some reasons why about 30% of U.S. hospitals still don’t connect to a health information network, Tripathi said Dec. 15.

Naturally, issues with case reports, lab and testing results, and vital records arose early in the pandemic, when they were often shared by phone or fax.

For all these reasons the Centers for Disease Control and Prevention launched its Data Modernization Initiative (DMI) in 2020 to streamline sharing of electronic health information between care providers and state, local, tribal and territorial (SLTT) health departments. 

The DMI’s first phase has involved getting data from electronic sources into a Microsoft Azure cloud environment, called the Enterprise Data Analytics and Visualization (EDAV) platform, while providing SLTT health departments with automated forecasting analytics tools.

Data standardization is key to improving information sharing between these systems, which is why ONC is working closely with the CDC on its North Star Architecture. The U.S. Core Data for Interoperability (USCDI) Version 4 (v4) that ONC has planned for 2023 will become the de facto minimum set of health data classes and elements for nationwide, interoperable information exchange.
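
In practice, a minimum set of data classes and elements functions like a conformance checklist that exchanged records are validated against. A minimal sketch of that idea, using a shortened, hypothetical subset of classes and elements rather than the actual USCDI v4 specification:

```python
# Simplified illustration of checking a record against a minimum data set.
# The classes and elements below are a shortened, hypothetical subset --
# consult the published USCDI standard for the authoritative list.
MINIMUM_ELEMENTS = {
    "Patient Demographics": ["first_name", "last_name", "date_of_birth", "race", "ethnicity"],
    "Laboratory": ["test_code", "result_value", "result_date"],
    "Immunizations": ["vaccine_code", "administration_date"],
}

def missing_elements(record: dict) -> dict:
    """Return the elements each data class is missing, keyed by class name."""
    gaps = {}
    for data_class, elements in MINIMUM_ELEMENTS.items():
        provided = record.get(data_class, {})
        absent = [e for e in elements if e not in provided]
        if absent:
            gaps[data_class] = absent
    return gaps

record = {
    "Patient Demographics": {"first_name": "Ana", "last_name": "Lee", "date_of_birth": "1980-01-02"},
    "Immunizations": {"vaccine_code": "208", "administration_date": "2021-04-01"},
}
print(missing_elements(record))
# {'Patient Demographics': ['race', 'ethnicity'], 'Laboratory': ['test_code', 'result_value', 'result_date']}
```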

At the same time ONC is developing USCDI+, a nationwide public health data model, for release beyond 2023. Discussions with the CDC and Centers for Medicare and Medicaid Services revealed more than 20 data elements that overlapped, allowing the agencies to agree on a common approach.

ONC is now speaking with the White House Office of Science and Technology Policy and the National Institutes of Health about tailoring a USCDI+ program for President Biden’s Cancer Moonshot program.

EHR vendors support TEFCA and the DMI because they’ll be able to maintain just one customer interface, rather than hundreds to meet the various jurisdictional requirements of SLTT health departments, Tripathi said in July.

Phase I of the DMI is also improving the CDC’s situational awareness, which is based on the Data Collation and Integration for Public Health Event Response (DCIPHER) platform, originally intended to track food-borne diseases. DCIPHER gave rise to HHS Protect and, as part of the new Center for Forecasting and Outbreak Analytics’ work, has since had hospital capacity, social vulnerability, mobility, race and ethnicity, social determinants of health, economic, 2-1-1 and climate data layered atop it, Dr. Dan Jernigan, deputy director for public health science and surveillance, told FedScoop in August.

The center can already do weekly influenza and some mpox forecasting and has visibility into emerging problems at about 70% of emergency departments.

“To see a fully formed prediction center, it’s going to be a couple years,” Jernigan said. “The numbers of staff that are in the Center for Forecasting right now are in the tens to thirties, but it is anticipated to be a much larger group.”

As part of DMI Phase I, 128 reportable diseases now automatically trigger electronic case reporting from EHRs, which is routed to the Association of Public Health Laboratories’ APHL Informatics Messaging Services (AIMS) cloud platform and then on to SLTT health departments. Electronic case reporting has grown from 187 facilities pre-pandemic to more than 14,000, more than 30 of which have turned on mpox reporting.
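
Conceptually, trigger-based electronic case reporting checks each encounter’s diagnosis codes against a reportable-condition list and forwards matches to an intermediary for routing. The sketch below illustrates that pattern only; the mpox code, endpoint and report shape are placeholders, and real eCR relies on published trigger-code value sets and the AIMS platform’s actual interfaces.

```python
from typing import Optional

# Illustrative reportable-condition trigger codes. 840539006 is the SNOMED CT code
# for COVID-19; the mpox entry is a placeholder, not a real code.
REPORTABLE_CONDITIONS = {
    "840539006": "COVID-19",
    "EXAMPLE-MPOX-CODE": "Mpox",
}

AIMS_ENDPOINT = "https://aims.example.org/ecr"  # placeholder, not the real AIMS URL

def maybe_build_case_report(encounter: dict) -> Optional[dict]:
    """Return a case report if the encounter carries a reportable diagnosis code."""
    for code in encounter.get("diagnosis_codes", []):
        if code in REPORTABLE_CONDITIONS:
            return {
                "condition": REPORTABLE_CONDITIONS[code],
                "code": code,
                "facility": encounter["facility_id"],
                "jurisdiction": encounter["patient_state"],
            }
    return None  # nothing reportable, so no report is generated

def route_report(report: dict) -> None:
    """Hand the report to the intermediary, which forwards it to the right SLTT health department."""
    # In practice this would be an authenticated POST to the AIMS platform;
    # printing keeps the sketch self-contained.
    print(f"POST {AIMS_ENDPOINT} -> {report['jurisdiction']}: {report['condition']}")

if __name__ == "__main__":
    encounter = {
        "facility_id": "HOSP-001",
        "patient_state": "TX",
        "diagnosis_codes": ["840539006"],
    }
    report = maybe_build_case_report(encounter)
    if report:
        route_report(report)
```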

While the effort highlights the CDC’s move toward pathogen- and program-agnostic systems through its DMI, electronic case reporting continues to fall short.

“It’s not nearly the volume that we need it to be,” Tripathi said in July. “But at least we’re starting to set up those pathways.”

At the same time the DMI has seen “dramatic improvements” in COVID-19 reporting across immunization information systems (IISs), he added.

IISs were slow to take adult COVID-19 vaccination information, but now they accept line-listed records using privacy-preserving record linkage, even for mpox.
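
Privacy-preserving record linkage generally means both data holders derive the same opaque token from a person’s identifiers, so records can be matched without exchanging names or birthdates in the clear. The keyed-hash sketch below shows the core idea only; production IIS implementations use more elaborate tokenization services, and the key and fields here are assumptions.

```python
import hashlib
import hmac

# Shared secret known to both data holders -- a placeholder, not a real key.
SHARED_KEY = b"example-shared-linkage-key"

def linkage_token(first_name: str, last_name: str, dob: str) -> str:
    """Derive a keyed hash from normalized identifiers so two parties can
    match records without ever exchanging the identifiers themselves."""
    normalized = f"{first_name.strip().lower()}|{last_name.strip().lower()}|{dob}"
    return hmac.new(SHARED_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# Immunization registry side
iis_records = {
    linkage_token("Maria", "Garcia", "1957-03-14"): {"vaccine": "COVID-19", "doses": 3},
}

# Case-report side
case_token = linkage_token("maria", "GARCIA ", "1957-03-14")

# The tokens match even though neither side shared raw identifiers.
print(iis_records.get(case_token))  # {'vaccine': 'COVID-19', 'doses': 3}
```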

The CDC recently revised its DMI implementation plan, and Phase 2 will focus on improving state health departments’ cloud infrastructure and updating the National Electronic Disease Surveillance System (NEDSS) Base System (NBS) that 26 states use for case management.

Cloud migration allows doctors like Phil Huang, director of Dallas County Health and Human Services, to match immunization, lab and death records and determine whether a patient who died had tested positive for COVID-19 and had been vaccinated.

“That ability to put that data together and integrate it with other kinds of information, even down to the neighborhood level, helps him do his prevention work and his mitigation work in a much more targeted way,” Jernigan said.
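
The matching Jernigan describes boils down to joining death, lab and immunization records on a shared person-level key and then aggregating the result, for example by neighborhood. A minimal sketch with toy data; the field names and linkage IDs are illustrative, not any health department’s actual schema.

```python
from typing import Optional

# Toy registries keyed by a shared person-level linkage ID -- illustrative data only.
immunizations = {"p001": {"covid_doses": 2}, "p002": {"covid_doses": 0}}
lab_results = {"p001": {"covid_positive": True}, "p003": {"covid_positive": True}}
deaths = {"p001": {"date_of_death": "2021-12-30", "census_tract": "48113-0042"}}

def deceased_covid_profile(person_id: str) -> Optional[dict]:
    """For a death record, pull together that person's lab and vaccination status."""
    death = deaths.get(person_id)
    if death is None:
        return None
    return {
        **death,
        "covid_positive": lab_results.get(person_id, {}).get("covid_positive", False),
        "covid_doses": immunizations.get(person_id, {}).get("covid_doses", 0),
    }

print(deceased_covid_profile("p001"))
# {'date_of_death': '2021-12-30', 'census_tract': '48113-0042', 'covid_positive': True, 'covid_doses': 2}
```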

CDC proposed the DMI receive about $200 million in fiscal 2023 to continue its “incremental” progress, but the Healthcare Information and Management Systems Society estimated the initiative needs $33 billion over the next 10 years to be successful, he added.

Meanwhile ONC, unable to enforce TEFCA, is working with federal partners to highlight the need for network-to-network interoperability and hoping its rollout leads outside providers to question why they’re still faxing records.

“We were given no dollars and no new authorities to do this,” Tripathi said.

Opinion: How do you make the State Department data-driven? One campaign at a time
https://fedscoop.com/opinion-how-do-you-make-the-state-department-data-driven-one-campaign-at-a-time/ | Sept. 28, 2022
State Department Chief Data Officer Matthew Graviss and his deputy Garrett Berntsen explain their progress reshaping the agency's use of data in support of U.S. diplomatic efforts.

The post Opinion: How do you make the State Department data-driven? One campaign at a time appeared first on FedScoop.

]]>
Digital transformation is hard no matter where you do it. In fact, Boston Consulting Group estimates that 70% of digital transformation efforts fail or underwhelm — and that’s in the private sector. Now, imagine trying it inside America’s oldest cabinet agency: the U.S. Department of State. Driven by well-honed intuition, humanistic expertise, and gut instinct, the department sets the standard for “the art of diplomacy.” But as “the second oldest profession,” diplomacy can be rife with nuance, traditions, and rituals — not exactly the stuff of spreadsheets and decimal points. 

Secretary Antony Blinken’s modernization agenda and the department’s first-ever Enterprise Data Strategy — signed one year ago this week — are changing the game. The Enterprise Data Strategy (EDS) helped the Department surge data policy, data engineering, and data science resources to high-priority, high-visibility mission and management topics in successive, six-month “data campaigns.” Already, data is informing decisions across management issues like anomalous health incidents and cybersecurity, crisis operations like the Afghanistan retrograde, and foreign policy issues including strategic competition with China and U.S. engagement in multilateral organizations. We see our unique “campaign” approach to delivering on the data strategy as a key reason the State Department is currently seeing so much success.

What makes our campaigns different? Oftentimes in government, a strategy is blessed by senior leaders and organizations commit to goals that prove impossible to meet over the long term. And organizational and cultural change in large government organizations is famously hard, even with senior-leader buy-in.

Instead, we recognize that, as with any technical project, priorities change quickly and no proposed implementation plan would survive first contact with mission realities. The EDS therefore keeps its three-to-five-year “implementation roadmap” intentionally abstract, centered on delivering decision advantage through analytics, effectively governing data as a strategic asset, building a culture of data literacy, and developing the needed tech backbone.

The data campaigns are vehicles for applying the EDS to targeted priorities with which the department workforce is already intimately engaged. For example, the first two data campaigns were strategic competition with the People’s Republic of China (PRC), and workforce diversity, equity, inclusion, and accessibility (DEIA). Our bet was that if we could prove data and analytics — and a cultural mindset supportive of them — could deliver real, tangible results on strategic competition and workforce DEIA, people throughout the organization would start to trust that data and analytics could help with their mission too. Change by showing, not telling. 

Animating philosophy and campaign structure

The animating philosophy for our data campaigns is “12-8-4”: We plan to accomplish 12 things, we’ll get eight of them done, and that will still be four more than anyone thought possible. We back up this philosophy with an aggressive surge of resources to apply analytics to a topic with a go-fast, sprint-to-the-finish mentality. We believe the core benefit of a campaign-based approach to digital transformation is that high-priority mission and management topics motivate people to partner, get to yes, and deliver results more quickly and robustly than abstract strategic goals. No one gets excited about showing up to a “data quality working group,” but a working group on how to unlock HR data to dismantle structural barriers to racial equity in our diplomatic workforce? That’s an effort worth getting behind. 

Functionally, a campaign is all about integrated, cross-functional delivery across our own teams. Each campaign is assigned not just dedicated analytics teams, but also communications staff, data policy and data-sharing agreement specialists, and full-stack engineers. Cross-functional delivery ensures we bring the right tools to the problem: if what we need is a new data policy, not machine learning, we shouldn’t be technological determinists just because machine learning sounds more innovative. Often what our customers need first and foremost is systematized, sustainable data management and information synthesis, not predictive algorithms.

Crucially, the campaign construct attracts executive attention. Working on the highest-priority issues means analytics teams get the attention and support needed from senior leaders to actually implement the strategy in the face of inevitable technical and bureaucratic blockers. In exchange, campaign teams are accountable for actually delivering value — not producing “shelfware” strategy-implementation status reports. Fortunately, this visibility has helped deliver results. And we’ve been able to build trust among data skeptics and help leaders and staff alike understand the value of data to their own goals – why analytics is not just a “nice to have,” but a “must-have” on foreign assistance, competition with China, diplomatic engagement, or crisis operations.

Data campaigns and beyond

To help bring better data to the massive challenge of diversity, equity, inclusion, and accessibility (DEIA), we took a collaborative approach with Ambassador Gina Abercrombie-Winstanley and her new Office of Diversity and Inclusion, the Bureau of Global Talent Management, the Office of Civil Rights, and the Office of the Legal Advisor. After a six-month DEIA data campaign, we produced a baseline assessment available to the entire department to bring hard numbers to the challenge of DEIA. One career ambassador told us this effort was “the most transparent and actionable information on DEIA” they had seen in their 30 years of service. To build this report, our campaign team worked with partners throughout the agency to develop the first DEIA data policy in the history of the U.S. government. The policy has made HR information more transparent and accessible while protecting individual privacy and meeting all legal requirements. 

Our China work has also been essential to the Department’s growing focus on strategic competition with the PRC. First, we developed an analytics platform tracking PRC activities around the world, which is regularly used to inform our foreign policy, strategic planning, global presence, and resource allocations. We also took a hybrid subject matter expertise survey and algorithmic approach to recommend foreign assistance projects under the Countering PRC Influence Fund, aligning foreign assistance to strategic priorities. We have also built a prototype platform to derive insights from millions of diplomatic cables at scale using machine learning, which helps the State Department make fuller use of our most important novel data asset: on-the-ground reporting from our worldwide network of diplomatic posts. 
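
The department hasn’t published the details of its cable-analytics models, but one common, lightweight approach to surfacing related reporting from a large text corpus is vector similarity search. What follows is a hedged sketch of that generic technique with toy text, not the State Department’s implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A toy corpus standing in for reporting cables -- entirely illustrative text.
cables = [
    "Host government announces new port financing agreement with foreign lender.",
    "Ministry of health requests technical assistance on vaccine cold chain.",
    "State-owned firm signs telecommunications infrastructure memorandum.",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(cables)

def most_similar(query: str, top_k: int = 2):
    """Rank cables by cosine similarity to a free-text query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, matrix).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [(cables[i], float(scores[i])) for i in ranked]

for text, score in most_similar("infrastructure financing by state-backed lenders"):
    print(f"{score:.2f}  {text}")
```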

The value of data to diplomacy that these campaigns and other efforts have shown has begun to pay dividends elsewhere. We’ve seen record enrollment in the Foreign Service Institute’s training courses on data literacy and analytics and the inclusion of data literacy in promotion precepts for the Foreign Service. We are also successfully competing for the top data science talent in the industry. Over the past year, dozens of new data scientists joined the State Department across an array of mission areas and bureaus – not just the Office of Management Strategy and Solutions’ Center for Analytics. And State’s current initiative to hire at least 50 data scientists for positions across the department received over 400 applications in only a few days. As Deputy Secretary for Management and Resources Brian McKeon said to Congress, it may surprise you to learn that top data scientists want to serve their country at the State Department, and are leaving top jobs in industry and academia to do so. With the unique opportunity afforded by the EDS and our data campaigns, this does not surprise us. 

When British Prime Minister Lord Palmerston received his first telegraph message in the 1860s, he exclaimed, “My God, this is the end of diplomacy!” Yet here we are, and needless to say, technology will never be the “end” of diplomacy. Rather, by infusing the art of diplomacy with modern technology and the science of data, we are strengthening the digital backbone of America’s diplomatic corps and ensuring the country’s oldest cabinet agency delivers results for the American people far into the future — one campaign at a time. 

Matthew Graviss is the chief data officer at the Department of State. Garrett Berntsen is his deputy.

National Center for Health Statistics targeting fall launch of virtual data enclave
https://fedscoop.com/nchs-to-launch-virtual-data-enclave/ | Aug. 11, 2022
The launch of the enclave represents a culture shift as it works to share more data with researchers, says Neil Russell.

The post National Center for Health Statistics targeting fall launch of virtual data enclave appeared first on FedScoop.

]]>
The National Center for Health Statistics is testing a virtual data enclave internally to make its sensitive data available to more researchers, with plans to onboard select pilot projects in the fall, according to its Research Data Center director.

Speaking at the Joint Statistical Meetings in Washington, Neil Russell said researchers will be able to use the virtual data enclave (VDE) from wherever they are to find and request data from NCHS.

The launch of the enclave represents a culture shift for a “fairly conservative” federal statistical agency, in response to the Foundations for Evidence-Based Policymaking Act of 2019 encouraging increased data releases, Russell said. NCHS — the Centers for Disease Control and Prevention center that tracks vital statistics on births, deaths, diseases and conditions to inform public health decisions — recognized that requiring researchers to travel to one of four secure research data centers (RDCs) or 32 federal statistical RDCs (FSRDCs) nationwide to access its restricted-use data was impractical.

“There is a definite financial hurdle to accessing restricted-use data through the physical data enclave model,” Russell said. “And we’re hopeful that a whole new group, or cohort, of researchers may be motivated to access the restricted-use data through a virtual data enclave.”

A researcher in New Mexico, which lacks any RDCs or FSRDCs, will no longer need to travel to Texas, Colorado or Utah to obtain the restricted-use data they need for their work. And NCHS, which sponsored the VDE, will not require background investigations of researchers.

RDCs closed at the height of the COVID-19 pandemic, but the VDE can operate 24/7 in theory.

The VDE is 99% built and designed to be customer friendly: it’s Windows-based with familiar software, namely SAS, so researchers can write code to generate outputs they then request from NCHS, Russell said.

NCHS’s sister agency, the National Institute for Occupational Safety and Health (NIOSH), already had an operational VDE, so NCHS didn’t need a contract. Instead, in September it sent NIOSH its enclave requirements, designed with data scientists in mind, along with payment that came out of CDC Data Modernization Initiative funds.

NIOSH had no way of handling non-CDC employees logging into the VDE, so the General Services Administration’s Login.gov service was used. Outside researchers must show their driver’s license to create an account, and NCHS conducts virtual inspections of their offsite locations.

NCHS further had NIOSH build a tracking system to create an audit trail for all data released.

NIOSH’s VDE already had an authority to operate at the Federal Information Security Management Act moderate level; it encrypted researchers’ connections, required two-factor authentication, and prevented downloading, copy-pasting, printing and emailing of data.

To address the remaining risk of data exfiltration, NCHS requires researchers, and in some cases their employers, to sign data-use agreements specifying where they will access the data from via a secure server.

While NCHS can’t prevent violations of that agreement, such as a researcher photographing their output before submitting it to NCHS for review, violators can be caught.

“I’ve seen journal articles produced through restricted use data that we didn’t know where they got it from; we know it happens,” Russell said. “Your access to the data will be terminated and your employer notified.”

Researchers still must pay a data access fee, and NCHS hasn’t calculated the true operational cost of the VDE just yet.

If more researchers seek VDE access than NCHS can handle, which seems likely, Russell will have to ask the CDC for additional funding to scale the environment.

“It is possible that the demand for this mode of access will outstrip our supply,” Russell said. “Currently I only have approval to stand up 10 virtual machines, which seems ridiculous.”

CDC looks to improve internal data sharing with centralized, cloud-based ecosystem
https://fedscoop.com/cdc-edav-data-sharing-platform/ | May 27, 2022
The Enterprise Data Analytics and Visualization (EDAV) platform lets CDC scientists catalog, analyze and publish findings faster.

The post CDC looks to improve internal data sharing with centralized, cloud-based ecosystem appeared first on FedScoop.

]]>
The Centers for Disease Control and Prevention launched a centralized, cloud-based data ecosystem to streamline intra-agency information sharing, according to the deputy director for public health science and surveillance.

Speaking during the National Center for Health Statistics Board of Scientific Counselors meeting Thursday, Dan Jernigan said the Enterprise Data Analytics and Visualization (EDAV) platform lets CDC scientists catalog, analyze and publish findings faster.

The COVID-19 pandemic revealed the CDC has a data-sharing problem due to its many one-off, proprietary systems tracking individual diseases, but EDAV allows for reuse of forecasting solutions.

“This is an important way that we are trying to have systems that are not unique to the pathogen but are pathogen and program agnostic,” Jernigan said.

Next, the agency’s Data Modernization Initiative (DMI) will connect EDAV core services with its Microsoft Azure cloud environment to rearchitect siloed systems like ArboNET, used to share arbovirus case information.

While Azure is the primary EDAV environment, the CDC sees itself as a multi-cloud organization, and future building blocks and applications will be cloud agnostic “as much as possible,” Jernigan said. The agency brought in architects familiar with different environments and is working with other cloud providers to develop tools that work across multiple environments for the benefit of public health.

Amazon Web Services is used for the National Syndromic Surveillance Program, and Amazon provides the platform for the CDC’s biggest data-sharing intermediary, the AIMS Platform used by the Association of Public Health Laboratories.

The DMI is also reimagining data flow into the CDC through a consortium, which includes the Office of the National Coordinator for Health IT (ONC), that is developing a North Star Architecture. The future-state public health ecosystem will ensure federal, state and local health department information systems are connected and interoperable.

ONC is developing new data standards, in accordance with 21st Century Cures Act requirements, that the North Star Architecture will use to decrease the reporting burden on health care providers and hospitals, eliminate the need for phone calls, and improve national disease forecasting and mitigations.

The CDC further established a Consortium for Data Modernization, with public health partners and industry associations, that meets biweekly to identify issues and decide who will address them. The agency will also reestablish the Data and Surveillance Workgroup under the Advisory Committee for the Director this summer.

Lastly, the CDC is holding listening sessions with potential private sector partners on the development of prototypes that will further the DMI.

“We don’t have all the funding that we need to do it,” Jernigan said. “But we are going to be targeting that funding to get critical efforts underway.”

The CDC is budgeting for the DMI based on five priorities: building the right foundation, accelerating data to action, workforce, partnerships, and change management.

Building the right foundation involves getting data from the appropriate sources. For instance, the National Center for Health Statistics (NCHS) is part of a $200 million grant that will fund states standardizing vital statistics, immunization, and laboratory case reporting data from electronic health records and other sources.

From there the CDC will ensure there’s a secure, accessible cloud environment for the data to land and that there are tools available to state and local health departments to analyze the information available.

“We want to be able to have rapid outbreak responses and develop common operating pictures,” Jernigan said.

The DMI is integrating data from nontraditional sources, and the CDC received $3 billion through the American Rescue Plan Act for a five-year program grant that will help hire data scientists and other personnel.

Each of the five DMI priorities has an implementation team associated with it that is standing up communities of practice for developing definitions, identifying barriers and risks, and setting objectives and desired results.

“Our ultimate goal is to move from siloed and brittle public health data systems to connected, resilient, adaptable and sustainable. And that sustainable piece is going to be important as we move forward, thinking about how we’re going to keep these efforts going — response-ready systems that can help us solve problems before they happen and reduce the harm caused by the problems that do happen,” Jernigan said. “So essentially better, faster, actionable intelligence for decision-making at all levels of public health.”

Report: CDC Data Modernization Initiative remains largely unplanned
https://fedscoop.com/cdc-data-modernization-initiative-largely-unplanned/ | April 27, 2022
The agency hasn't provided actions or deadlines for most projects and lacks a plan for allocating $1.1 billion in federal funding.

The post Report: CDC Data Modernization Initiative remains largely unplanned appeared first on FedScoop.

]]>
The Centers for Disease Control and Prevention lacks a detailed Data Modernization Initiative framework due to projects’ complexity and COVID-19 response being a competing priority, according to a Government Accountability Office report released Wednesday.

While the Department of Health and Human Services agreed CDC needs to provide actions and deadlines for data modernization projects, the agency continues to look for subject matter experts up to the task.

CDC received $1.1 billion for its Data Modernization Initiative beginning in 2020, when the COVID-19 pandemic made clear its infrastructure was lacking, but there’s still no plan for allocating those funds.

“Without more specific, actionable plans, CDC may not be able to gauge its progress on the initiative or achieve key results in a timely manner,” reads the GAO report. “In addition, such lack of progress to implement enhanced surveillance systems could affect the quality and timeliness of data needed to respond to future public health emergencies.”

CDC managed to launch a COVID-19 Electronic Laboratory Reporting system in 2020 and has plans to modernize the National Syndromic Surveillance Program, National Notifiable Diseases Surveillance System and National Vital Statistics System, but it still has more than 100 surveillance systems that require continued updates, maintenance and IT innovations.

Many of these systems are plagued by outdated data collection and transmission methods like manual entry and faxing, a lack of common data standards across state and local systems leading to data inconsistencies, and a lack of interoperability hindering data sharing. The traditional federal funding model led to the rise of disparate surveillance systems among states and localities, the health departments of which struggle to maintain the IT workforces that tend to such systems.

“The federal government lacks an interoperable network of systems for near real-time public health situational awareness 15 years after federal law first mandated that the Department of Health and Human Services establish such a network,” reads the report.

While lawmakers have passed three such laws since 2006, the latest being the Pandemic and All-Hazards Preparedness and Advancing Innovation Act of 2019, GAO found HHS had made “little progress” planning a network since 2017, hindering the ability of public health officials to monitor and respond to COVID-19 or future pandemics. HHS hasn’t even established an integrated planning team and didn’t comment on GAO’s finding.

GAO further found that CDC could be better positioned to coordinate national COVID-19 surveillance if its approach included objectives and ways to measure progress, a finding HHS agreed with.

Some public health officials have called for CDC to update its surveillance strategies to reflect the COVID-19 pandemic shifting from a crisis to control situation.

GAO suggested CDC might set an objective of relying on wastewater surveillance where clinical testing is limited, to achieve its goal of identifying disease outbreaks, and then set targets for measuring how well it’s filling in reporting gaps. Other objectives and measures might be tied to disease variant surveillance.

HHS noted in its response that defining objectives won’t solve all its surveillance woes.

“Progress toward meeting surveillance goals is affected by agency efforts, as well as jurisdictional activities, funding, data use agreements and reporting authorities that are outside of the control of the agency,” reads the response.
