open data Archives | FedScoop
https://fedscoop.com/tag/open-data/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, to exchange best practices and to identify how to achieve common goals.

Hill staffers participate in first-ever Data Skills for Congress program
https://fedscoop.com/hill-staffers-participate-in-first-ever-data-skills-for-congress-program/
Tue, 13 Jun 2023

Staff and policy aides from the offices of Sen. Tim Scott, R-S.C., and Reps. Derek Kilmer, D-Wash., Gerry Connolly, D-Va., and Mark Takano, D-Calif., among others, took part in the program.

Dozens of congressional staffers from key offices across Capitol Hill will receive Congressional Data Certificates after participating in the first-ever Data Skills for Congress program, FedScoop has learned.

The course was designed to educate congressional staff on federal data policy, help them work more effectively with government data, and surface new policy ideas for modernizing government data.

Staff and policy aides from the offices of Sen. Tim Scott, R-S.C., and Reps. Derek Kilmer, D-Wash., Gerry Connolly, D-Va., and Mark Takano, D-Calif., among others, took part in the program run by the University of California at Berkeley and USAFacts, a nonprofit and nonpartisan civic initiative focused on making government data more accessible. The cohort was 60% Democrats, 20% Republicans and 20% nonpartisan, according to organizers.

Forty-two staffers enrolled in the program, which began in February and will conclude this month. The program included eight classroom sessions held remotely with a mix of live and recorded lectures and opportunities for in-person meetings in D.C.

“The Data Skills for Congress program, launched in 2023, equips member and professional staff with skills to use data in policy-making and constituent services, and write legislation to improve public data,” USAFacts said in a blog post last week.

“This free program isn’t just an education in data literacy in order to shape policies that ensure accurate, usable data flows within government. It’s a catalyst for congressional modernization and a rallying cry for greater data use across Congress,” the group added.

The Data Skills for Congress class is the first program of its kind approved by the House and Senate Ethics Committees and is intended to be a first step toward providing skills and context for data policy and practices.

Some members of Congress, like Kilmer, are pushing for greater data-driven decision-making through recently introduced bipartisan legislation that would create a commission on “evidence-based policymaking” within Congress, with the goal of grounding policymaking more in federal data and facts than in opinions. The bill would also push for the creation of a chief data office responsible for cultivating congressional data strategies.

The Data Skills for Congress organizers say they exceeded enrollment goals for this first program by 66%, and 87% of participants reported they would recommend the program to their peers.

“I learned a lot and I think these are basic skills all congressional staff should have,” one congressional participant said, according to USAFacts.

The pilot program was focused on five key objectives related to U.S. open data topics:

  • Educate participants on existing U.S. data policies through seminars led by data policy experts;
  • Develop an understanding of open data challenges and technologies common in the U.S.;
  • Build basic skills in data collection and visualization;
  • Apply new open data knowledge to produce reports based on publicly available data or draft policy to improve government data; and
  • Create relationships with other congressional staff who share an interest in open data and its use in Congress.

U.S. chief data scientist explains government’s push for greater use of disaggregated data
https://fedscoop.com/us-chief-data-scientist-interview/
Fri, 09 Sep 2022

Denice Ross spells out her priorities over the last 10 months leading the government's data strategy.

U.S. Chief Data Scientist Denice Ross remembers the killing of 18-year-old Michael Brown by police in Ferguson, Missouri, as a galvanizing moment for federal officials in their approach to open data.

At the time of the young man’s death in 2014, police departments did not release use of force data — basic information federal officials needed to determine how Black communities were being affected by law enforcement violence — a gap that led her to spearhead the novel Police Data Initiative.

The effort started with 14 police departments committing to opening at least three datasets — their use of force dataset almost always being one — and 129 jurisdictions were on board by the end of the Obama administration. Police and citizen expectations of what transparency and accountability should look like, and what data should be open, had changed, Ross told FedScoop in an exclusive interview.

“But we didn’t create a mechanism for turning that data into action, so that’s why I’m back,” Ross said. “Because open data is necessary and not sufficient to drive the type of action that we need to create a more equitable society.”

Ross was a Presidential Innovation Fellow at the time.

U.S. chief data scientist since November, Ross has focused on ensuring that the data agencies are using and publishing yields more equitable outcomes for Americans. And that, as she sees it, requires the “next generation of open data”: disaggregated data.

Disaggregated data is data separated into smaller units, often demographic ones, to answer questions like which populations are underserved by federal programs and policies, and to make course corrections that narrow service gaps. The process is time-intensive and requires skilled data practitioners, including career federal officials upskilled in data science, Ross said.
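Disaggregation is easier to picture with a toy example. The sketch below, in Python with pandas, assumes a hypothetical table of program records with a self-reported demographic field; every column name and figure is invented for illustration rather than drawn from any federal dataset.

    # A minimal sketch of disaggregation using pandas. All records,
    # column names and figures here are hypothetical.
    import pandas as pd

    records = pd.DataFrame({
        "state":    ["MO", "MO", "MO", "WA", "WA", "WA"],
        "race_eth": ["Black", "White", "Hispanic", "Black", "White", "Hispanic"],
        "eligible": [1200, 5400, 900, 800, 6100, 1100],
        "enrolled": [480, 3780, 450, 280, 4270, 660],
    })

    # Aggregate view: a single number that can hide unequal outcomes.
    overall = records["enrolled"].sum() / records["eligible"].sum()
    print(f"Overall uptake: {overall:.0%}")  # 64%

    # Disaggregated view: the same data split by demographic group,
    # exposing the service gaps the single number conceals.
    by_group = records.groupby("race_eth")[["eligible", "enrolled"]].sum()
    by_group["uptake"] = by_group["enrolled"] / by_group["eligible"]
    print(by_group.sort_values("uptake"))  # Black 38%, Hispanic 56%, White 70%

In this invented example the aggregate figure looks respectable at 64%, while the breakdown surfaces a 38% uptake rate for one group: exactly the kind of service gap disaggregation is meant to expose.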

President Biden stressed his commitment to equitable data from his first day in office with the immediate issuance of the Racial Equity Executive Order in support of underserved communities. When the White House’s Equitable Data Working Group (EDWG), created by the order, released its five recommendations for improving data use in April, normalizing disaggregated data while protecting privacy topped the list.

“Now everybody is talking about disaggregating data,” Ross said. “It used to be — I’ve been in this field for 20 years — I always avoided that word because it was so jargoney, and nobody knew what we were talking about.”

Still, agencies are a “mixed bag” when it comes to collecting disaggregated data and using it properly, she added.

Bright spots include the Federal Interagency Council on Statistical Policy recently releasing a searchable catalog of disaggregated datasets on Asian, Native Hawaiian and Pacific Islander populations, as well as agencies disaggregating grant data by location to ensure fairer distribution, Ross said.

The chief data scientist has spent the last 10 months building a small team within the White House Office of Science and Technology Policy that supports the Biden administration’s biggest priorities, like the Bipartisan Infrastructure Law (BIL) with $1.2 trillion behind it, the Inflation Reduction Act and the Customer Experience Executive Order reducing barriers to government benefits. For the first time, a dedicated team is applying disaggregated data to a president’s policy agenda.

“What I do is infuse equitable data into those priorities, so data isn’t a side thing,” Ross said. “It’s actually integrated into how we design programs and policies.”

For that reason it’s important the team be a diverse mix of genders, races, ethnicities and lived experiences, she added.

Ross finds the biggest obstacle to the team’s work is its hybrid nature; there aren’t as many in-person interagency meetings or civic tech innovation summits for sharing best practices since the pandemic began.

“The collaboration tools that we have are just mostly not compatible,” Ross said. “And so we end up making the most of what we can with a PowerPoint and a Zoom call, but that’s a far cry from being in the same room with a bunch of post-it notes and really doing some solid design thinking using the best available tools.”

Ross also assists with the hiring of data practitioners within agencies, including through the Office of Personnel Management’s surge team for the White House’s BIL implementation, which requires hundreds of STEM-trained personnel to support investments.

The chief data scientist’s team is responsible for operationalizing some of the EDWG’s recommendations as the group transitions into the National Science and Technology Council’s Subcommittee on Equitable Data. Ross will co-chair that subcommittee.

In addition to disaggregating data, the team is uncovering underused data; improving agencies’ capacity for policy and program equity assessments; creating public data visualization tools; and soliciting feedback from state, local, tribal and territorial (SLTT) communities. Some communities surpass the federal government when it comes to disaggregating data, which is why OSTP recently issued requests for information (RFIs) on LGBTQI+ equity and equitable data engagement and accountability on behalf of the new subcommittee.

“We’re really serious about these RFIs because we need the wisdom from the field in order to be able to implement these Equitable Data Working Group recommendations, in the most useful way, inside the federal government,” Ross said.

Other EDWG recommendations fall to the Office of Management and Budget and U.S. Chief Statistician Karin Orvis, who’s currently modernizing the 25-year-old Statistical Policy Directive No. 15 (SPD 15) on race and ethnicity data standards. 

OMB recently released a plain-language recommendation to agencies for making the best use of existing race and ethnicity standards because the revised guidance isn’t expected until summer 2024. The recommendation includes practical flexibilities for disaggregating race and ethnicity data, approaches to data on more than one race, and advice on adding additional race categories to forms or surveys.

“I’ll just spoil that one,” Ross said. “You should not add some other race category to your forms or surveys because then it makes your data really unusable.”
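Ross’ warning is easy to demonstrate with a toy tabulation. The sketch below is a hypothetical illustration rather than OMB guidance: check-all-that-apply race indicators stay tabulable even for multi-race responses, while a free-text catch-all fragments into variants that resist rollup. All field names and responses are invented.

    # A toy illustration of why a catch-all "some other race" field makes
    # data hard to use. All field names and responses are hypothetical.
    from collections import Counter

    # Check-all-that-apply responses stay tabulable: each category is a
    # defined indicator, and respondents reporting more than one race are
    # counted in every category they selected.
    responses = [
        {"white", "asian"},   # more than one race, cleanly encoded
        {"black"},
        {"black", "aian"},
    ]
    print(Counter(cat for resp in responses for cat in resp))
    # counts: black 2, white 1, asian 1, aian 1 (ordering may vary)

    # A free-text catch-all fragments into variants that cannot be rolled
    # up without heavy, error-prone cleaning.
    free_text = ["Other", "other ", "OTHER - see note", "n/a", "mixed"]
    print(Counter(v.strip().lower() for v in free_text))
    # Four distinct strings survive even basic case/whitespace cleanup.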

OMB’s first listening session on the SPD 15 revision is slated for Sept. 15, 2022, and an RFI will be issued soon, Ross said.

The chief data scientist expects the update will have a ripple effect on how SLTT governments collect their own data, with California already considering new race categories.

Ross will spend the rest of the year helping stand up the Working Group on Criminal Justice Statistics called for in May’s Policing and Criminal Justice Executive Order, harkening back to her work as a PIF, and ensuring her subcommittee hits the ground running.

“My priority for the rest of 2022 is to get these interagency collaborations going through the Subcommittee on Equitable Data,” Ross said. “That includes working on sexual orientation and gender identity data, infrastructure investment and equity assessments.”

Open data: A critical tool for police reform and racial equity
https://fedscoop.com/open-data-critical-tool-police-reform-racial-equity/
Thu, 14 Jan 2021

In a new op-ed, members of CODE make the case that improved policing data is needed to reform the criminal justice system and should be a major priority for the Biden administration.

Last week’s insurrection, in which far-right extremists stormed the Capitol, spurred lawmakers to consider important questions about the state of American democracy and the smooth transition of presidential power. The nature of the police response, and the contrast with police treatment of Black Lives Matter protesters, has also intensified concerns about racial bias in policing that have been building since the killing of George Floyd last May.

As the incoming Biden administration tackles police reform, open and transparent data can be one of its most important tools for progress. President-elect Joe Biden and his advisers have announced racial equity as a central plank of their approach to “Build Back Better,” and a key part of that agenda is police and criminal justice reform. The administration’s transition website states it will establish “a nationwide ban on chokeholds; stopping the transfer of weapons of war to police forces; improving oversight and accountability, to create a model use of force standard; [and] creating a national police oversight commission.”

For these policies to be successful, the new administration will require access to quality open data on crime, along with demographic data, from states and local police departments. The Center for Open Data Enterprise (CODE) — a Washington-based nonprofit whose mission is to maximize the value of open and shared data for the public good — has just published a Briefing Paper on Policing Data. The paper makes the urgent case that open policing data can inform criminal justice reform and provide insight on police involvement in marginalized communities.

Thanks to available data, we know that the U.S. criminal justice system is the largest in the world, with an incarceration rate of 698 per 100,000 residents that dwarfs those of other developed nations. As of 2020, 2.3 million Americans were incarcerated in federal, state, or local prisons and jails. African Americans are more likely than white Americans to be arrested, convicted, and given lengthy prison sentences. These structural problems start with increased police presence in Black communities and the prevalence of bias in the criminal justice system. Policing is the first interaction that many communities of color have with the justice system and can mark the entryway into the court and prison systems.

But what can the data now tell us about policing? Despite improved technology and reporting requirements, open policing data is sparse and inconsistent around the United States. CODE believes that improved policing data is needed to reform the criminal justice system and should be a major priority for the Biden administration.

Policing data can be used to improve accountability, build trust in the criminal justice system, and provide insights to drive better reform. Existing civil society databases and federal data sources demonstrate how open data is deployed for predictive policing, documenting bias in police departments, and measuring the impacts of use of force policies. However, the localized nature of police data and legacy data systems can make it very difficult to uniformly measure crime, access consistent police data, or understand which reforms are working and which are not.

CODE’s Briefing Paper identifies key civil society and state-level sources of policing data, outlines the policy landscape, and provides recommendations to improve and apply criminal justice data. The paper provides a deep dive into data use cases, cross-cutting issues and challenges, immediate opportunities to improve police data, and key questions to determine future progress. It recommends that policymakers select specific high-value datasets to open across the country, such as data on officer-involved shootings and complaints against officers.

Some jurisdictions are taking steps in the right direction by releasing up-to-date data about their local police forces. For example, the Citizen Complaint Authority in Cincinnati helps the public understand this data through graphs, charts, and maps, making it easier to devise better policies. Moreover, Wallkill, N.Y., publishes an annual spreadsheet of its police force’s demographics, including details like rank, years on the force, gender, and education levels for the 120 people in its department. This helps local citizens see whether the police forces policing their communities look like them.

But more discussion, best practices, and better data are needed. What kinds of federal oversight are needed to create better standards and data sharing for police departments? What kinds of nationally available criminal justice and policing data would improve police accountability and substantive criminal justice reform? How can data better help policymakers and researchers understand the disproportionate use of violence by police against Black and Brown communities? CODE believes that addressing these questions could help chart a path forward for a better open data landscape that supports police accountability and addresses racial inequities.

While the Federal Data Strategy and Foundations for Evidence-based Policymaking Act have continued to enable better data sharing at the federal level, much of the remaining work will require close collaboration with states and local municipalities. As the Biden administration plans to implement sweeping changes that impact the criminal justice system and other areas of racial equity, accessing, analyzing, and applying open data will enable a better understanding of the policy options on the table. This information may also begin the long and important process of increasing transparency and trust between police departments and the public.

Paul Kuhne is Roundtables Program Manager, and Temilola Afolabi is Research Associate, at the Center for Open Data Enterprise. CODE welcomes inquiries and opportunities for collaboration. Please contact temilola@odenterprise.org.

VA’s AI Tech Sprint yields a tool for matching patients with clinical trials, and more
https://fedscoop.com/va-ai-tech-sprint-students/
Fri, 10 Jan 2020

The application was built by a group of local high school students.

A group of high school students was one of the top teams to emerge from the recent AI Tech Sprint by the Department of Veterans Affairs, delivering a web application that could help match cancer patients to clinical trials.

The three students from Northern Virginia entered their work in a competition that included software companies like Oracle Healthcare and MyCancerDB. Digital consulting company Composite App took the $20,000 first-place prize for its solution — a tool for helping patients stay on track with their care plans — but the clinical trials team got an honorable mention.

The tech sprint was organized by the VA’s new AI institute, and it focused on partnering with outside organizations and companies interested in applying artificial intelligence tools and techniques to VA data.

The high school team’s members — Shreeja Kikkisetti, Ethan Ocasio and Neeyanth Kopparapu — met as part of the Northern Virginia-based nonprofit Girls Computing League. They were unique in a competition otherwise dominated by adult professionals from software and health care companies.

Their solution, the Clinical Trials Selector, takes lab results, diagnosis and demographic data from the VA and Centers for Medicare and Medicaid Services (CMS) and uses natural language processing as part of its technology to match patients to trials from the National Cancer Institute’s clinical trials database.
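The students’ code isn’t published, so the sketch below is a hypothetical, rule-based simplification of the matching step; the real tool reportedly layers natural language processing over the VA and CMS data. Every patient field, trial name and threshold is invented for illustration.

    # A simplified, hypothetical sketch of matching a patient record
    # against trial eligibility rules. All data here is invented; the
    # actual Clinical Trials Selector works from VA/CMS records and the
    # National Cancer Institute's trials database.
    from dataclasses import dataclass

    @dataclass
    class Trial:
        name: str
        diagnosis: str         # required diagnosis code
        min_age: int
        max_age: int
        max_creatinine: float  # example lab-based exclusion threshold

    patient = {"age": 67, "diagnosis": "C61", "creatinine": 1.1}

    trials = [
        Trial("Trial A", "C61", 50, 80, 1.5),
        Trial("Trial B", "C34", 40, 75, 1.5),
        Trial("Trial C", "C61", 18, 65, 2.0),
    ]

    def eligible(p: dict, t: Trial) -> bool:
        return (p["diagnosis"] == t.diagnosis
                and t.min_age <= p["age"] <= t.max_age
                and p["creatinine"] <= t.max_creatinine)

    print([t.name for t in trials if eligible(patient, t)])
    # ['Trial A']: diagnosis rules out Trial B, age rules out Trial C

In practice the hard part is turning free-text clinical notes into structured fields like these, which is where the team’s natural language processing comes in.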

The current process for matching patients to trials is a bit more laborious for both the patients themselves and for doctors, Gil Alterovitz, the VA’s director of AI, told FedScoop. It can involve veterans inputting their own data into the clinical trials database — a process that can be confusing and time-consuming.

The Clinical Trials team’s solution, on the other hand, envisions a world where veterans don’t need to have an in-depth understanding of their own medical charts — instead they can log into their personalized VA.gov portal and automatically be matched to potential trials. The application is “functional,” Ocasio told FedScoop, but the team is still working to move it out of a sandbox and make it “production ready” for the VA environment.

Kikkisetti and Ocasio told FedScoop that the biggest challenge of the sprint was learning to work with medical informatics. But they both said being a part of the sprint was a “fantastic opportunity” to put their coding skills to a real-world test. Ultimately, both said they envision the application becoming an open-source framework available for use by a variety of institutions, not just the VA.

Other companies involved in the sprint include Sanford Imagenetics, which created a product for determining the relevance of pharmacogenetic testing based on patient characteristics, and LifeOmic, which built a visualization tool for the data sharing platform known as the Veterans Precision Oncology Data Commons.

The VA has become increasingly interested in positioning itself as a federal leader in artificial intelligence research and development. The agency has also recently launched projects like an effort to use AI to reduce veterans’ wait times for health appointments, and another to scan medical records and evaluate suicide risk as part of the REACH VET program.

This bill could ‘turbocharge’ financial regulators’ analytics
https://fedscoop.com/financial-regulators-open-data-legislation/
Fri, 27 Sep 2019

The House Financial Services Committee's leadership is pushing legislation to standardize data at eight agencies, paving the way for RegTech and AI apps.

A bill that would require financial regulatory agencies to standardize and open their data has been reintroduced by House Financial Services Committee leadership.

The Financial Transparency Act would see eight regulators adopt data collection and dissemination standards for the information they collect, including a move to electronic forms.

Data would be made electronically searchable and downloadable in bulk without license restrictions.

A common data structure would streamline agencies’ ability to garner insights from their information, said Hudson Hollister — founder of HData and before that the Data Coalition — at the Data Driven Government event on Wednesday.

“The reason why that is huge is that this means we turbocharge the power of the analytics that regulators can deploy in order to protect their constituencies, in order to enforce their rules, in order to do their jobs,” Hollister said.

Entity identification — identifying relationships across records in dirty datasets — would no longer be a problem in financial regulation if the bill passes, he added.
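What entity identification involves is easier to see in miniature. The toy sketch below links dirty filings to a canonical record by normalizing entity names; the firm names and identifier values are invented, and real regulators lean on standards such as the Legal Entity Identifier rather than ad hoc string cleaning.

    # A toy sketch of entity identification: linking messy filings to one
    # canonical entity record via name normalization. Names and identifier
    # values are invented for illustration.
    import re

    def normalize(name: str) -> str:
        """Lowercase, strip punctuation and common corporate suffixes."""
        name = re.sub(r"[^\w\s]", "", name.lower())
        name = re.sub(r"\b(inc|corp|co|llc|ltd)\b", "", name)
        return " ".join(name.split())

    # Canonical registry: normalized name -> invented LEI-style identifier.
    registry = {"acme holdings": "5493001XAMPLE0000001"}

    filings = ["ACME Holdings, Inc.", "Acme Holdings Corp", "acme  holdings LLC"]
    for raw in filings:
        print(f"{raw!r:30} -> {registry.get(normalize(raw), 'NO MATCH')}")
    # All three messy variants resolve to the same canonical identifier.

Hollister’s point is that with standardized identifiers attached to filings in the first place, this kind of cleanup would largely disappear.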

Improving data accuracy will also enable the development of regulatory technology, or RegTech, and artificial intelligence applications, said Craig Clay, a president at risk and compliance solutions company DFIN, in a statement.

All eight financial regulators within the Financial Stability Oversight Council would be affected: the Board of Governors of the Federal Reserve System, Commodity Futures Trading Commission, Federal Deposit Insurance Corporation, Federal Housing Finance Agency, National Credit Union Administration, Office of the Comptroller of the Currency, Securities and Exchange Commission, and the Treasury Department.

“The benefits of applying data standards to financial regulatory information as proposed by this legislation are clear: reduced compliance costs for businesses, better information for investors, and a more efficient regulatory oversight system that can effectively identify and address bad actors,” said Nick Hart, CEO of the Data Coalition, in a statement.

Rep. Carolyn Maloney, D-N.Y., who chairs the subcommittee on investor protection, and Rep. Patrick McHenry, R-N.C., ranking member of the main committee, reintroduced the legislation.

In the announcement, Maloney said the bill would “bring financial reporting into the 21st century.”

“Technology plays a key role in how Americans pay bills, save for a home, or even start a new business — it just makes sense for financial regulators to use that same technology to make public data more easily accessible,” McHenry said in a statement.

AI development requires good datasets, and OMB wants ideas on how to help
https://fedscoop.com/america-ai-initiative-data-omb-rfi/
Thu, 11 Jul 2019

What new or improved open federal data sets could help with AI?

The White House Office of Management and Budget is looking for feedback on which government datasets could be released, opened up or otherwise improved to help support the development of artificial intelligence.

The office published its request for information in the Federal Register on Wednesday.

The RFI is part of the administration’s American AI Initiative, launched by an executive order that President Trump signed in February. The directive aims to promote American leadership in the development of this new technology.

As part of this effort, OMB wants feedback on what kind of data people want and how that data would help in AI research and development. The RFI outlines questions like “what Federal data and models are you seeking to use that are restricted to the public?” and “what characteristics should the Federal Government consider to increase a data set or model’s utility for AI R&D (e.g., documentation, provenance, metadata)?”, among others.
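For a concrete picture of the “documentation, provenance, metadata” characteristics the RFI mentions, here is a small, hypothetical catalog entry loosely patterned on the DCAT-style data.json metadata agencies already publish to data.gov. The dataset and all values are invented, and the fields shown are a simplified subset.

    # A hypothetical, simplified catalog entry for an AI-ready dataset,
    # loosely patterned on the data.json metadata schema used on data.gov.
    # Every value below is invented for illustration.
    import json

    entry = {
        "title": "Example Labeled Imagery Dataset",
        "description": "Hypothetical annotated satellite imagery for AI R&D.",
        "accessLevel": "public",
        "modified": "2019-07-01",
        "publisher": {"name": "Example Federal Agency"},
        "keyword": ["AI", "machine learning", "imagery"],
        "distribution": [{
            "downloadURL": "https://example.gov/data/imagery.zip",
            "mediaType": "application/zip",
        }],
        # Documentation/provenance pointer of the kind the RFI highlights:
        "describedBy": "https://example.gov/data/imagery-codebook.pdf",
    }

    print(json.dumps(entry, indent=2))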

“Over the years, a number of data sets have already been made available via data.gov,” the RFI document states. “Some of these datasets are fully publicly available, while others have restricted use (see restricted use data sets). However, these data sets may or may not be useful or suitable for AI R&D and testing.”

OMB is betting that the public, academia and private sector companies may have some useful feedback on this, though.

“The Trump Administration understands that unleashing Federal data and models is critical to drive top-notch AI research and promote technological breakthroughs and competitiveness,” Michael Kratsios, deputy assistant to the president for technology policy, said in a statement. “Importantly, as the RFI and the American AI Initiative reflect, we also take seriously the need to balance access to data and maintaining the civil liberty and privacy protections Americans expect.”

Another component of the American AI Initiative is the National Institute of Standards and Technology’s effort to craft a “plan for federal engagement in AI standards.” NIST published a draft version of the plan last week, and is accepting public comments until July 19.

Comments on the OMB RFI will be accepted until Aug. 10.

“This RFI represents yet another step forward in the American AI Initiative to accelerate our leadership and empower our innovators and the American people,” Kratsios said.

How federal agencies can use agile development to apply open data
https://fedscoop.com/the-opportunity-project-report-center-for-open-data-enterprise/
Wed, 19 Jun 2019

The Center for Open Data Enterprise published a report on The Opportunity Project and the lessons of its success.

Over the past three years, a small, dedicated team of federal innovators has been steadily transforming the way government agencies apply open government data. The Opportunity Project (TOP), housed in the Census Bureau at the U.S. Department of Commerce, has adopted agile development principles to help government, communities, and the technology industry build high-impact, data-driven digital tools and platforms.

TOP facilitates 12- to 14-week technology development cycles that bring together tech teams from industry, academia, nonprofits, and user communities to work with federal data stewards on public challenges in health care, education, transportation, and other areas. This innovative approach has produced more than 70 digital tools since TOP was launched in 2016 and is now being replicated by others in government. Some of the most recent TOP products include:

  • A risk assessment tool using satellite data to find areas where poor infrastructure may be vulnerable during natural disasters;
  • A model using geospatial and emergency medical services (EMS) data to predict where and when the risk of opioid overdose will be high;
  • Online tools that use artificial intelligence (AI) and occupational data to match veterans with job apprenticeships; and
  • A website that uses federal spending and audit data to maximize the use of funds for addressing homelessness.

Today, the IBM Center for the Business of Government published our report on TOP and the lessons of its success. The report, “Agile Problem Solving in Government: A Case Study of the Opportunity Project,” describes what has made TOP successful, how TOP can be further developed, and how government agencies can adopt the TOP model. The Center for Open Data Enterprise (CODE) participated in the 2018 TOP technology development cycle and wrote the report as part of our organization’s mission to maximize the value of open government data for the public good.

As the report explains, The Opportunity Project combines three key elements: the power of open government data, public-private collaboration, and high-energy agile approaches to software development. By adopting and adapting TOP’s approach to their own missions and programs, government agencies can apply agile, collaborative, data-driven solutions to a wide range of public problems. Here are some of the elements that TOP has shown to be key to success.

Collaboration based on shared benefits 

The Opportunity Project recruits diverse participants by delivering benefits for all stakeholders. Government agencies benefit from the resources that tech teams contribute, and from the experience of participating in agile, user-focused projects. Companies that contribute their work through tech teams can showcase their capabilities while learning about government data resources. And user advocates participating in TOP help ensure that their needs have priority in government data programs and the resulting platforms and tools.

Organizing projects around common themes

In 2018, The Opportunity Project used a common theme to organize and provide focus for a number of their projects. About half the projects last year were organized through a “geo-cohort” that included experts in geospatial data to tackle a number of related problems using the federal government’s geospatial data assets. This year, TOP is organizing most of its work around two major themes: workforce development and increasing participation in the 2020 U.S. Census.

Commitment to user engagement

From the beginning, TOP has engaged user advocates who can help shape projects to ensure that end-users’ needs are met. Recent user advocates have represented groups as diverse as military veterans and the Choctaw Nation of Oklahoma. The user perspective has influenced many TOP projects and contributed to their success.

Combining virtual and in-person collaboration

For most of the past three years, TOP has functioned almost exclusively through virtual meetings and participation. Federal agency representatives and tech teams have connected by phone and online conferencing, and have met in person only after their work is complete. In 2018, however, TOP supplemented this virtual collaboration with two in-person meetings: a meeting in Puerto Rico to convene participants working on geospatial projects, and a user engagement workshop for projects led by the U.S. Department of the Treasury and the White House Office of Management and Budget. The experience showed that combining virtual and in-person meetings is a powerful model for collaboration, and TOP plans to continue this approach with an initial workshop on July 2 in Chicago.

Showcasing success

Every year, TOP showcases the results of its work with a Demo Day, in which teams present the products they have built. The March 2019 Demo Day launched more than 20 digital tools and platforms produced during the 2018 development cycle for an audience of more than 200 people. The Demo Day combined dynamic speakers, effective presentations, and keynotes from government leaders, all live-streamed and made available online. These annual events encourage further adoption of the TOP methodology, participation in upcoming projects, and potential partnerships to make the products developed through TOP sustainable and scalable.

Over the past three years, TOP has developed a network of government agency partners and a clear, replicable methodology that set the stage for applying its agile approach to data-driven projects across government. Last October, the U.S. Department of Health and Human Services launched the first federal agency effort specifically based on the TOP methodology. The success of this HHS Health Tech Sprint shows that the TOP approach can be adapted by individual agencies to address complex public problems in service of their mission.

The Opportunity Project’s website includes many examples of its work and a detailed toolkit for using its methodology. We hope that our new report will help increase awareness of TOP’s work and encourage more federal agencies to follow this innovative, successful model.

Joel Gurin is president of the Center for Open Data Enterprise (CODE), a Washington-based nonprofit that works to maximize the value of open government data for the public good. Katarina Rebello is CODE’s director of programs.

U.S. CIO teases forthcoming Federal Data Strategy
https://fedscoop.com/federal-data-strategy-2019-suzette-kent/
Wed, 27 Mar 2019

Suzette Kent expects the data strategy will be released in coming weeks with the White House's key data priorities for 2019.

As the White House plans to release the first Federal Data Strategy “very soon,” U.S. CIO Suzette Kent teased Wednesday several of the priorities the administration plans to include in the strategy.

Kent said the data strategy will arrive in the coming weeks with the White House’s key data priorities for 2019 — “the actions that we’re asking agencies to prioritize for 2019” and “things that will continue to propel us forward as quickly as possible.”

She briefly described what to expect when the strategy drops: “We’re ensuring that we’re using federal data to grow the economy and increase effectiveness in government,” Kent said. The administration is also focused, she said, on “preserving individual privacy and building citizen trust. And ensuring that we’re prioritizing certain data sets that are important to stimulating our economy, protecting our nation and continuing research in this field.”

“We’re also expanding some of the things that we have done in the geospatial area, which is one of the most successful areas of open data so far,” she noted, adding that there will be continued and added focus on fiscal data transparency.

Along with those priorities, Kent said the administration has teed up “a set of activities that are focused around expanding our ethical frameworks, building data and access tools that are reusable.” Data.gov, for instance, is a great tool in its intent but perhaps isn’t as easy to use as federal agencies would like, she said. “We have to make those more usable and leverage modern technology.”

Speaking at the AFCEA DC Artificial Intelligence and Machine Learning Summit, Kent tied the new data strategy back to AI, calling it the arduous but necessary foundation that must be laid to ensure the success of intelligent automation.

“Some of the investments we have to make in data are some of the toughest work that we’re going to have to do in the next few years,” she said. But in doing that tough work, it gets federal agencies to a place of “transformative capabilities” that “help us solve some of our most complex problems faster and in ways that we couldn’t even imagine many years ago.”

The administration has been developing the Federal Data Strategy since the release of the President’s Management Agenda, which features a cross-agency priority goal for “leveraging data as a strategic asset.” The team leading that goal issued several requests for public input into the development of the strategy.

Later in 2019, after the release of the strategy, the administration will issue a one-year action plan covering the actions the strategy sets out.

How to run an agency Opportunity Project sprint
https://fedscoop.com/run-agency-opportunity-project-sprint/
Fri, 01 Mar 2019

Find a tough question, then get out of the way — what HHS learned from “TOP Health,” a 14-week pilot sprint modeled after The Opportunity Project at the Census Bureau.

When it comes to running an Opportunity Project sprint at a federal agency, the best advice Kristen Honey can give is to craft a really precise, difficult question, and then get out of the way and let your collaborators take over.

“They will surprise you,” Honey said.

Honey, an innovator in residence at the Office of the CTO at the Department of Health and Human Services, is fresh off “TOP Health,” a 14-week pilot sprint modeled after The Opportunity Project (TOP) at the Census Bureau and run by HHS with help from the Presidential Innovation Fellows program. TOP Health began in October 2018 and wrapped up in January — because of the partial government shutdown, the initiative’s showcase and demo day were postponed until this week.

TOP Health focused on two big and thorny questions: How can emerging technologies like artificial intelligence help match patients to experimental therapies, and how can digital tools and data be used for the prevention of Lyme disease?

While the Census Bureau has run a number of TOP sprints since it took over the program, which began at the White House under President Obama, TOP Health was something different. The experiment: Can another agency organize a TOP sprint around its own data and problem sets? Is the methodology mature enough?

Well, yes. But the three-month-long pilot wasn’t without its challenges. For example, there isn’t yet a formal how-to playbook for TOP. So while HHS had the example of sprints run by Census, it didn’t have any ready-made materials that would have made organizing the initiative a little easier. This is something Honey hopes will come out of the HHS experiment, a set of “TOP in a box” materials that other agencies can draw on moving forward.

But despite organizational challenges, there are big benefits to running a project like this that make wading through the unknown worth it, Honey said.

Above and beyond the products and tools created, which are owned by the industry collaborators, getting feedback on open data from the people and organizations that use it is really valuable for agencies.

“The federal government has unlocked a lot of data,” Honey said. “But they range all over the board on discovery, and how accessible they are.” The feedback loop between federal data stewards on the one hand and industry, nonprofit or academic collaborators on the other allows agencies to see the actionable insights that can come from the data.

“And really that’s why we’re unlocking all this data,” Honey said. “It’s not just to add more numbers, to get a higher number on data.gov. You actually want this data unlocked for a purpose.”

All in all, while the future of similar agency-specific TOP sprints remains to be seen, Honey and team are celebrating a successful pilot edition. “The outcome exceeded our expectations,” she said.

The future of the OPEN Government Data Act relies largely on what CDOs do with it
https://fedscoop.com/future-open-government-data-act-relies-largely-cdos/
Fri, 08 Feb 2019

Advocates hope that the CDOs will be able to help agencies overcome a "natural bureaucratic aversion to openness."

The OPEN Government Data Act, which President Trump signed into law in January as part of the Foundations for Evidence-Based Policymaking Act, requires that all agencies designate a nonpolitical chief data officer. And this is a good thing, advocates argued Thursday, because the future of the law relies very much on what these CDOs do.

“The CDO position is going to be really important to help shifting the culture within agencies,” Christian Troncoso, policy director at BSA, said during a panel discussion hosted by the Data Coalition. “For the first time, we’ll have someone at every agency whose job it is to evangelize the importance of open data.”

While some agencies already have a CDO role, many don’t. But with a push from this new law, that’s about to change. Supporters of the legislation see these data leaders as key allies inside agencies, working to make sure the law gets implemented according to plan.

Per Troncoso, these new CDOs will have a big role to play in helping to overcome agencies’ “natural bureaucratic aversion to openness.”

“There’s, I think, naturally, a tendency within government or any other large institution to favor risk aversion and opacity… people take a sort of siloed view of what they’re working on and don’t necessarily appreciate the fact that the data they may be generating in the course of a project could also be helpful to their colleagues within the agency, certainly, but then to their colleagues across government as well,” Troncoso said. “The CDOs are going to have a really important role in sort of changing the culture within government on these sorts of issues.”

Christian Hoehner of the Data Coalition shared a theory for how CDOs can go about building influence and trust within their agencies — by being helpful. “The extent to which CDOs can kind of position themselves as that internal help desk or clearinghouse… I think that’s where they’ll start amassing a lot of influence through the time that they’re saving their colleagues,” he said.

Overall, the group celebrated the passage of the law but acknowledged that there remains a long road to implementation ahead.

As Rep. Derek Kilmer, D-Wash., the bill’s original sponsor, said in his remarks at the opening of the event — “passing the OPEN Government Data Act was a big step, but it wasn’t the last step.”
