Dan Jernigan Archives | FedScoop
https://fedscoop.com/tag/dan-jernigan/

HHS, health information networks expect rollout of trusted data exchange next year: Micky Tripathi
https://fedscoop.com/health-information-networks-tefca-success/
Thu, 22 Dec 2022

About 30% of hospitals remain unconnected to a health information network, but the implementation of network-to-network interoperability may change that.

Multiple applicants expect to have fully operational health information networks for securely sharing clinical data within a year of receiving approval, according to National Coordinator for Health IT Micky Tripathi.

A couple networks are live, and the Office of the National Coordinator for Health IT hopes the first group — among 12 entities that submitted letters of intent — will be officially designated qualified health information networks (QHINs) in early 2023.

Part of the Department of Health and Human Services, ONC published a framework in January for exchanging health information nationwide: the Trusted Exchange Framework and Common Agreement (TEFCA). Required by the 21st Century Cures Act, the framework lays out non-binding principles while the common agreement sets the technical terms, and it now falls to ONC’s recognized coordinating entity, The Sequoia Project, to approve interoperable QHINs.

“What we’ve heard informally from a number of the prospective QHINs is that they are building in anticipation of getting approved,” Tripathi said during eHealth Exchange’s annual meeting on Dec. 15. “They think that they would have a pretty good opportunity to do this in the 12-month window and hopefully shorter than that with some of them.”

QHINs will be added on a rolling basis to include electronic health record (EHR) vendors, ambulatory practices, hospitals, health centers, federal and public health agencies, and payers. In August, Epic Systems became the first EHR vendor to announce it would seek QHIN status; it was later joined by the likes of the eHealth Exchange network and the trade association CommonWell Health Alliance.

How TEFCA coexists with other exchanges when it comes to benefits determinations, health care operations, treatment, payment and individual access remains to be seen. But scaling TEFCA will be the “real challenge” and one for which incorporating the Health Level Seven (HL7) Fast Healthcare Interoperability Resource (FHIR) data standard will be key, Tripathi said.

FHIR application programming interfaces streamline health information exchange by eliminating the need for separate data use agreements, and eventually they’ll enable questionnaires, scheduling, links, Clinical Decision Support hooks and subscriptions. That’s why there are already federal deadlines in place for their steady adoption across the public health ecosystem, but QHIN-to-QHIN brokered exchange remains years away.

By the end of 2022, certified EHR vendors must make a FHIR API available to customers.
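For a rough sense of what a standardized FHIR API buys developers, the sketch below queries a generic FHIR R4 server for a patient’s immunization records over REST; the base URL, token and patient ID are placeholders rather than any particular vendor’s endpoint.

```python
# Minimal sketch of reading immunization records over a FHIR R4 REST API.
# The base URL, bearer token and patient ID are hypothetical placeholders.
import requests

FHIR_BASE = "https://fhir.example-ehr.test/r4"   # hypothetical endpoint
TOKEN = "REPLACE_WITH_OAUTH_TOKEN"

def get_immunizations(patient_id: str) -> list:
    """Return Immunization resources for one patient from a FHIR R4 server."""
    resp = requests.get(
        f"{FHIR_BASE}/Immunization",
        params={"patient": patient_id},
        headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # a FHIR "searchset" Bundle
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for imm in get_immunizations("example-patient-123"):
        print(imm.get("vaccineCode", {}).get("text"), imm.get("occurrenceDateTime"))
```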

HL7’s Helios FHIR Accelerator aims to improve the exchange of situational awareness information on available hospital and intensive care unit beds, ventilators, personal protective equipment and vaccinations. The HHS Protect system, launched during the height of the COVID-19 pandemic, provides a lot of that information right now.

“But it’s done via spreadsheets,” Tripathi told FedScoop in July. “A lot of manual work is still done to populate that now.”
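The gap between spreadsheet reporting and an automated feed is, in practice, the gap between manual re-keying and a small script like the hypothetical one below, which reads a capacity spreadsheet and posts each row as structured JSON. The file, column names and endpoint are illustrative assumptions, not HHS Protect’s actual interface.

```python
# Hypothetical sketch: convert a manually maintained capacity spreadsheet into
# structured JSON records and submit them to a reporting endpoint.
# The file name, column names and URL are illustrative assumptions only.
import pandas as pd
import requests

REPORTING_URL = "https://reporting.example.test/api/capacity"  # placeholder

def submit_capacity_report(xlsx_path: str) -> None:
    # columns assumed: hospital_id, staffed_beds, icu_beds, ventilators
    df = pd.read_excel(xlsx_path)
    for record in df.to_dict(orient="records"):
        payload = {
            "hospital_id": record["hospital_id"],
            "staffed_beds": int(record["staffed_beds"]),
            "icu_beds": int(record["icu_beds"]),
            "ventilators": int(record["ventilators"]),
        }
        requests.post(REPORTING_URL, json=payload, timeout=30).raise_for_status()

if __name__ == "__main__":
    submit_capacity_report("daily_capacity.xlsx")
```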

The government has spent about $40 billion on EHR infrastructure since the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009. Yet clinical operations and health payment systems remain largely rooted in paper because states, most of which still don’t require electronic case reporting, hold health authority in the U.S.

Jurisdictional issues and scarce resources are some reasons why about 30% of U.S. hospitals still don’t connect to a health information network, Tripathi said Dec. 15.

Naturally, issues with case reports, lab and testing results, and vital records arose early in the pandemic, when they were often shared by phone or fax.

For all these reasons the Centers for Disease Control and Prevention launched its Data Modernization Initiative (DMI) in 2020 to streamline sharing of electronic health information between care providers and state, local, tribal and territorial (SLTT) health departments. 

The DMI’s first phase has involved getting data from electronic sources into a Microsoft Azure cloud environment, called the Enterprise Data Analytics and Visualization (EDAV) platform, while providing SLTT health departments with automated forecasting analytics tools.
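The ingestion step amounts to landing extracts from electronic sources in cloud object storage where analytics tools can reach them. The sketch below illustrates that general pattern with the Azure Blob Storage SDK; the connection string, container and file names are placeholders, not the EDAV platform’s actual configuration.

```python
# General pattern for landing a data extract in Azure Blob Storage, sketched
# for illustration; the connection string and names are placeholders only.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "REPLACE_WITH_AZURE_STORAGE_CONNECTION_STRING"

def upload_extract(local_path: str, container: str = "raw-case-data") -> None:
    """Upload one local file into a storage container for downstream analytics."""
    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    container_client = service.get_container_client(container)
    with open(local_path, "rb") as fh:
        # Blob name mirrors the local file name; overwrite any earlier upload.
        container_client.upload_blob(name=local_path, data=fh, overwrite=True)

if __name__ == "__main__":
    upload_extract("ecr_extract_2022-12-15.json")
```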

Data standardization is key to improving information sharing between these systems, which is why ONC is working closely with the CDC on its North Star Architecture. The U.S. Core Data for Interoperability (USCDI) Version 4 (v4) that ONC has planned for 2023 will become the de facto minimum set of health data classes and elements for nationwide, interoperable information exchange.

At the same time ONC is developing USCDI+, a nationwide public health data model, for release beyond 2023. Discussions with the CDC and Centers for Medicare and Medicaid Services revealed more than 20 data elements that overlapped, allowing the agencies to agree on a common approach.

ONC is now speaking with the White House Office of Science and Technology Policy and the National Institutes of Health about tailoring a USCDI+ program for President Biden’s Cancer Moonshot program.

EHR vendors support TEFCA and the DMI because they’ll be able to maintain just one customer interface, rather than hundreds to meet the various jurisdictional requirements of SLTT health departments, Tripathi said in July.

Phase I of the DMI is also improving the CDC’s situational awareness, which is based on the Data Collation and Integration for Public Health Event Response (DCIPHER) platform, originally intended to track food-borne diseases. DCIPHER gave rise to HHS Protect and, as part of the new Center for Forecasting and Outbreak Analytics’ work, has since had hospital capacity, social vulnerability, mobility, race and ethnicity, social determinants of health, economic, 2-1-1 and climate data layered atop it, Dr. Dan Jernigan, deputy director for public health science and surveillance, told FedScoop in August.

The center can already do weekly influenza and some mpox forecasting and has visibility into emerging problems at about 70% of emergency departments.
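As a toy illustration of weekly forecasting, the sketch below extrapolates a made-up series of weekly counts with Holt’s linear-trend smoothing; the center’s actual models are far more sophisticated.

```python
# Toy illustration of extrapolating weekly counts with Holt's linear-trend
# exponential smoothing. The data are invented; real forecasting centers use
# far richer ensemble models.

def holt_forecast(series, alpha=0.5, beta=0.3, steps=4):
    """Return `steps` future values using Holt's linear exponential smoothing."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(steps)]

weekly_flu_visits = [120, 150, 180, 240, 310, 400, 520]  # invented weekly counts
print([round(x) for x in holt_forecast(weekly_flu_visits)])
```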

“To see a fully formed prediction center, it’s going to be a couple years,” Jernigan said. “The numbers of staff that are in the Center for Forecasting right now are in the tens to thirties, but it is anticipated to be a much larger group.”

As part of DMI Phase I, 128 reportable diseases now automatically trigger electronic case reporting from EHRs, which is routed through the Association of Public Health Laboratories’ AIMS (APHL Informatics Messaging Services) cloud platform and on to SLTT health departments. Electronic case reporting grew from 187 facilities pre-pandemic to more than 14,000, more than 30 of which have turned on mpox reporting.
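Automated electronic case reporting hinges on trigger codes: when a diagnosis or lab result in the EHR matches a reportable-conditions list, a case report is generated and routed onward. The sketch below shows that matching step in miniature, with simplified codes and a simplified report structure rather than the production eCR specifications.

```python
# Simplified illustration of the eCR trigger-code step: diagnoses that match a
# reportable-conditions list generate a case report for routing onward.
# Codes and the report structure are placeholders, not production specifications.

REPORTABLE_CODES = {
    "840539006": "COVID-19",                 # illustrative subset; real trigger
    "EXAMPLE-CODE": "Example condition",     # lists come from maintained value sets
}

def build_case_reports(encounter_diagnoses: list) -> list:
    """Return one minimal case report per diagnosis that matches the list."""
    reports = []
    for dx in encounter_diagnoses:
        condition = REPORTABLE_CODES.get(dx["code"])
        if condition:
            reports.append({
                "condition": condition,
                "code": dx["code"],
                "patient_id": dx["patient_id"],
                "onset_date": dx.get("onset_date"),
                "destination": "public-health-intermediary",  # e.g. an AIMS-style router
            })
    return reports

print(build_case_reports([
    {"patient_id": "p1", "code": "840539006", "onset_date": "2022-12-01"},
    {"patient_id": "p2", "code": "EXAMPLE-OTHER"},  # not on the reportable list
]))
```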

While the effort highlights the CDC’s move toward pathogen- and program-agnostic systems through its DMI, electronic case reporting continues to fall short.

“It’s not nearly the volume that we need it to be,” Tripathi said in July. “But at least we’re starting to set up those pathways.”

At the same time the DMI has seen “dramatic improvements” in COVID-19 reporting across immunization information systems (IISs), he added.

IISs were slow to take adult COVID-19 vaccination information, but now they accept line-listed records using privacy-preserving record linkage, even for mpox.
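Privacy-preserving record linkage lets systems match line-listed records without exchanging names or birth dates in the clear: each data holder derives the same opaque token from the same identifiers. Below is a minimal sketch of that idea using a keyed hash over normalized identifiers; production PPRL schemes are more elaborate (Bloom-filter encodings, for example).

```python
# Minimal sketch of a privacy-preserving linkage token: both data holders apply
# the same keyed hash to the same normalized identifiers, then match on the token.
# Real PPRL deployments use more elaborate schemes (e.g., Bloom-filter encodings).
import hashlib
import hmac

SHARED_SECRET = b"REPLACE_WITH_SECRET_SHARED_BY_BOTH_PARTIES"  # placeholder

def linkage_token(first: str, last: str, dob: str) -> str:
    """Derive an opaque token from normalized identifiers (dob as YYYY-MM-DD)."""
    normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hmac.new(SHARED_SECRET, normalized.encode(), hashlib.sha256).hexdigest()

# Two sites compute the same token for the same person without sharing the PII itself.
print(linkage_token("Ada", "Lovelace", "1815-12-10"))
```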

The CDC recently revised its DMI implementation plan; Phase II will focus on improving state health departments’ cloud infrastructure and updating the National Electronic Disease Surveillance System (NEDSS) Base System (NBS) that 26 states use for case management.

Cloud migration allows doctors like Phil Huang, director of Dallas County Health and Human Services, to match immunization, lab and death records to determine whether a patient who died had tested positive for COVID-19 and been vaccinated.

“That ability to put that data together and integrate it with other kinds of information, even down to the neighborhood level, helps him do his prevention work and his mitigation work in a much more targeted way,” Jernigan said.
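Once immunization, lab and vital records carry a common linkage key, the matching Jernigan describes becomes a straightforward join. The sketch below illustrates it with pandas on invented data; the column names and linkage tokens are assumptions for demonstration only.

```python
# Illustrative join of immunization, lab and death records on a shared linkage
# token (see the hashing sketch above); all data and column names are invented.
import pandas as pd

deaths = pd.DataFrame({"token": ["t1", "t2"], "death_date": ["2022-11-02", "2022-11-20"]})
labs = pd.DataFrame({"token": ["t1", "t3"], "covid_positive": [True, False]})
immunizations = pd.DataFrame({"token": ["t1", "t2"], "covid_doses": [2, 0]})

linked = (
    deaths
    .merge(labs, on="token", how="left")
    .merge(immunizations, on="token", how="left")
)
# For each decedent: did they test positive, and were they vaccinated?
print(linked[["token", "covid_positive", "covid_doses"]])
```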

The CDC proposed that the DMI receive about $200 million in fiscal 2023 to continue its “incremental” progress, but the Healthcare Information and Management Systems Society estimated the initiative needs $33 billion over the next 10 years to be successful, he added.

Meanwhile ONC, which lacks the authority to enforce TEFCA, is working with federal partners to highlight the need for network-to-network interoperability, hoping the rollout leads outside providers to question why they’re still faxing records.

“We were given no dollars and no new authorities to do this,” Tripathi said.

CDC looks to improve internal data sharing with centralized, cloud-based ecosystem
https://fedscoop.com/cdc-edav-data-sharing-platform/
Fri, 27 May 2022

The Enterprise Data Analytics and Visualization (EDAV) platform lets CDC scientists catalog, analyze and publish findings faster.

The Centers for Disease Control and Prevention launched a centralized, cloud-based data ecosystem to streamline intra-agency information sharing, according to the deputy director for public health science and surveillance.

Speaking during the National Center for Health Statistics Board of Scientific Counselors meeting Thursday, Dan Jernigan said the Enterprise Data Analytics and Visualization (EDAV) platform lets CDC scientists catalog, analyze and publish findings faster.

The COVID-19 pandemic revealed that the CDC has a data-sharing problem rooted in its many one-off, proprietary systems for tracking individual diseases; EDAV, by contrast, allows forecasting solutions to be reused across programs.

“This is an important way that we are trying to have systems that are not unique to the pathogen but are pathogen and program agnostic,” Jernigan said.
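In practice, pathogen-agnostic means the record structure doesn’t change when the disease does: one generic case schema carries the condition as a coded field instead of a bespoke system per pathogen. The dataclass sketch below illustrates the design principle; it is not CDC’s actual data model, and the codes shown are illustrative.

```python
# Conceptual illustration of a pathogen-agnostic case record: the condition is
# just another coded field, so the same structure serves flu, COVID-19 or mpox.
# This sketches the design principle only; it is not CDC's actual data model.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class CaseRecord:
    case_id: str
    condition_code: str            # e.g. a SNOMED CT code identifying the condition
    jurisdiction: str              # reporting state, local, tribal or territorial agency
    onset_date: Optional[date] = None
    details: dict = field(default_factory=dict)  # condition-specific extras, if any

cases = [
    CaseRecord("c-001", "840539006", "TX", date(2022, 12, 1)),   # illustrative codes
    CaseRecord("c-002", "EXAMPLE-FLU", "TX", date(2022, 12, 3)),
]

# The same counting (or forecasting) code works no matter which pathogen is involved.
counts = {}
for c in cases:
    counts[c.condition_code] = counts.get(c.condition_code, 0) + 1
print(counts)
```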

Next, the agency’s Data Modernization Initiative (DMI) will connect EDAV core services with its Microsoft Azure cloud environment to rearchitect siloed systems like ArboNET, used to share arbovirus case information.

While Azure is the primary EDAV environment, the CDC sees itself as a multi-cloud organization, and future building blocks and applications will be cloud agnostic “as much as possible,” Jernigan said. The agency brought in architects familiar with different environments and is working with other cloud providers to develop tools that run in multiple environments for the benefit of public health.
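Keeping building blocks cloud agnostic typically comes down to coding against a narrow storage interface and writing a thin adapter per provider. The sketch below shows that pattern under the assumption that the azure-storage-blob and boto3 SDKs are installed; it reflects a common design, not CDC’s implementation.

```python
# Sketch of a cloud-agnostic storage seam: application code depends on a small
# interface, and each cloud gets a thin adapter. Not CDC's implementation;
# assumes the azure-storage-blob and boto3 SDKs are available for the adapters.
from typing import Protocol

class ObjectStore(Protocol):
    def put(self, key: str, data: bytes) -> None: ...

class AzureBlobStore:
    def __init__(self, connection_string: str, container: str):
        from azure.storage.blob import BlobServiceClient
        self._container = BlobServiceClient.from_connection_string(
            connection_string).get_container_client(container)

    def put(self, key: str, data: bytes) -> None:
        self._container.upload_blob(name=key, data=data, overwrite=True)

class S3Store:
    def __init__(self, bucket: str):
        import boto3
        self._s3, self._bucket = boto3.client("s3"), bucket

    def put(self, key: str, data: bytes) -> None:
        self._s3.put_object(Bucket=self._bucket, Key=key, Body=data)

def publish_report(store: ObjectStore, report: bytes) -> None:
    """Application code sees only the ObjectStore interface, not a specific cloud."""
    store.put("reports/latest.json", report)
```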

Amazon Web Services is used for the National Syndromic Surveillance Program, and Amazon provides the platform for the CDC’s biggest data-sharing intermediary, the AIMS Platform used by the Association of Public Health Laboratories.

The DMI is also reimagining data flow into the CDC through a consortium, which includes the Office of the National Coordinator for Health IT (ONC) and is developing a North Star Architecture. The future-state public health ecosystem will ensure federal, state and local health department information systems are connected and interoperable.

ONC is developing new data standards, in accordance with 21st Century Cures Act requirements, that the North Star Architecture will use to decrease the reporting burden on health care providers and hospitals, eliminate the need for phone calls, and improve national disease forecasting and mitigations.

The CDC further established a Consortium for Data Modernization, with public health partners and industry associations, that meets biweekly to identify issues and decide who will address them. The agency will also reestablish the Data and Surveillance Workgroup under the Advisory Committee for the Director this summer.

Lastly, the CDC is holding listening sessions with potential private sector partners on the development of prototypes that will further the DMI.

“We don’t have all the funding that we need to do it,” Jernigan said. “But we are going to be targeting that funding to get critical efforts underway.”

The CDC is budgeting for the DMI based on five priorities: building the right foundation, accelerating data to action, workforce, partnerships, and change management.

Building the right foundation involves getting data from the appropriate sources. For instance, the National Center for Health Statistics (NCHS) is part of a $200 million grant that will fund states’ work to standardize vital statistics, immunization and laboratory case reporting data from electronic health records and other sources.

From there, the CDC will ensure there’s a secure, accessible cloud environment for the data to land in and that state and local health departments have tools to analyze the information.

“We want to be able to have rapid outbreak responses and develop common operating pictures,” Jernigan said.

The DMI is integrating data from nontraditional sources, and the CDC received $3 billion through the American Rescue Plan Act for a five-year program grant that will help hire data scientists and other personnel.

Each of the five DMI priorities has an implementation team that is standing up communities of practice for developing definitions, identifying barriers and risks, and setting objectives and desired results.

“Our ultimate goal is to move from siloed and brittle public health data systems to connected, resilient, adaptable and sustainable. And that sustainable piece is going to be important as we move forward, thinking about how we’re going to keep these efforts going — response-ready systems that can help us solve problems before they happen and reduce the harm caused by the problems that do happen,” Jernigan said. “So essentially better, faster, actionable intelligence for decision-making at all levels of public health.”
