GSA won’t use facial recognition with Login.gov for now
https://fedscoop.com/gsa-forgoes-facial-recognition-for-now/
Feb. 9, 2022

The agency’s secure sign-in team continues to research the technology and to conduct equity and accessibility studies.

The General Services Administration won’t use facial recognition to grant users access to government benefits and services for now, but its secure sign-in team continues to research the technology.

“Although the Login.gov team is researching facial recognition technology and conducting equity and accessibility studies, GSA has made the decision for now not to use facial recognition, liveness detection, or any other emerging technology in connection with government benefits and services until rigorous review has given us confidence that we can do so equitably and without causing harm to vulnerable populations,” said Dave Zvenyach, director of GSA’s Technology Transformation Services (TTS), in a statement provided to FedScoop.

“There are a number of ways to authenticate identity using other proofing approaches that protect privacy and ensure accessibility and equity.”

Login.gov authenticates users for agencies’ services and verifies their identities, and the TTS team that manages it is also studying the equity and accessibility of facial recognition.

GSA’s methodical evaluation of the technology contrasts with that of the IRS, which announced Monday that it would transition away from using ID.me’s service for verifying new online accounts after the company acknowledged it had misrepresented its reliance on 1:many facial recognition, an approach that poses greater risks of inaccuracy and racial bias than 1:1 verification.
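The risk difference between the two modes is largely a matter of arithmetic: a 1:1 verification check compares a probe image against a single enrolled template, while a 1:many search compares it against an entire gallery, so the chance of at least one false match compounds with gallery size. The sketch below illustrates that effect; the per-comparison false match rate and gallery sizes are illustrative assumptions, not figures for ID.me, Login.gov or any other real system.

```python
# Illustrative only: how the chance of at least one false match grows when a
# probe is searched against a gallery (1:many) versus a single enrolled
# template (1:1). Assumes independent comparisons and an assumed
# per-comparison false match rate; neither reflects any real deployment.

def false_match_probability(fmr: float, gallery_size: int) -> float:
    """Probability of at least one false match across independent comparisons."""
    return 1.0 - (1.0 - fmr) ** gallery_size

FMR = 1e-5  # assumed per-comparison false match rate

for n in (1, 1_000, 100_000, 1_000_000):
    label = "1:1 verification" if n == 1 else f"1:{n:,} identification"
    print(f"{label:>26}: P(>=1 false match) = {false_match_probability(FMR, n):.5f}")
```

Even a highly accurate matcher, searched against a large enough gallery, will produce false matches at a meaningful rate, which is one reason 1:many identification draws more scrutiny than 1:1 verification.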

Login.gov currently collects a photo of a state-issued ID and other personally identifiable information, which are validated against authoritative data sources. The last step involves either sending a text message to the user’s phone number or a letter to their address containing a code that must be provided to Login.gov to complete identity verification.
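A minimal sketch of that kind of document-and-possession flow is below, assuming a stubbed-out check against authoritative sources; the function names and structure are hypothetical stand-ins for illustration, not Login.gov’s actual API.

```python
# Hypothetical sketch of a document-and-possession identity-proofing flow
# shaped like the one described above. All function names and checks are
# illustrative placeholders, not Login.gov's actual API.
import secrets


def validate_document(id_photo: bytes, pii: dict) -> bool:
    """Stub: validate the state-issued ID photo and PII against authoritative sources."""
    return bool(id_photo) and {"name", "dob", "address"} <= pii.keys()


def send_one_time_code(phone: str | None, address: str | None) -> str:
    """Stub: deliver a short code by text message if a phone is on file, otherwise by letter."""
    code = secrets.token_hex(3)
    channel = f"SMS to {phone}" if phone else f"letter mailed to {address}"
    print(f"(would send code {code!r} via {channel})")
    return code


def verify_identity(id_photo: bytes, pii: dict, phone: str | None = None,
                    address: str | None = None) -> bool:
    """Validate the document first, then prove possession of a phone number or address."""
    if not validate_document(id_photo, pii):
        return False
    expected = send_one_time_code(phone, address)
    supplied = input("Enter the code you received: ")
    return secrets.compare_digest(supplied, expected)
```

The notable design point, reflected in the sketch, is that no step requires a biometric comparison: identity is established from the document and data checks, and possession of the phone or mailing address stands in for a selfie match.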

More than 60 applications across 17 agencies — including USAJOBS at the Office of Personnel Management and the Paycheck Protection and Disaster Loan Application programs at the Small Business Administration — use Login.gov, encompassing more than 17 million users.

GSA’s decision to forgo facial recognition for Login.gov was first reported by The Washington Post, but the technology remains a likely part of the agency’s, and the government’s, future.

The White House Office of Science and Technology Policy is crafting an Artificial Intelligence Bill of Rights to protect people from technology-enabled harms, and it focused its initial request for information on biometrics such as facial recognition.

MITRE argues that OSTP’s definition of biometrics needs refining and that not all facial recognition algorithms are prejudicially biased; technical and operational biases also exist, and they don’t necessarily lead to inequitable outcomes.

“There are not direct correlations between technical and operational biases and prejudicial bias,” Duane Blackburn, science and technology policy lead at MITRE‘s Center for Data-Driven Policy, told FedScoop in January. “Even though in a lot of policy analyses they’re treated as equivalent.”

MITRE: White House biometrics definition requires rethink
https://fedscoop.com/biometrics-definition-ai-bill-of-rights/
Feb. 9, 2022

OSTP conflated three distinct concepts as biometrics, which will lead to confusion as it attempts to craft an AI Bill of Rights.

MITRE’s Center for Data-Driven Policy recommended the White House redefine biometrics as it develops an Artificial Intelligence Bill of Rights, in a request for information response submitted last month.

In its RFI, the Office of Science and Technology Policy lumped together biometrics used for identification, technology for inferring emotion or intent, and medicine’s understanding of the term as any biologically based data. MITRE would rather OSTP use the National Science and Technology Council‘s internationally accepted definition, which limits biometrics to matters of identity.

The U.S. lacks a comprehensive privacy law that would serve as the foundation for regulating AI, which has policy groups like the Open Technology Institute pressing the Biden administration for increased oversight and safeguards. OSTP wanted RFI respondents to examine biometrics through the lens of AI to inform the AI Bill of Rights the government will use to protect people from problematic technologies, but in doing so it conflated three distinct concepts, which MITRE holds will lead to confusion.

“They kind of grouped multiple, different technologies into a single grouping, and those technologies all have different backgrounds, different operational issues and different policy considerations,” Duane Blackburn, science and technology policy lead at the Center for Data-Driven Policy, told FedScoop. “Grouping them together like that is going to really complicate the policy analysis and potentially leads to making improper decisions.”

MITRE’s second recommendation is that OSTP make evidence- and science-based policy decisions, because misconceptions about identity biometrics abound. The first misconception is that they’re not scientific in nature; Blackburn pointed to decades of biometrics research, international standards, accreditation programs for examiners and university degree programs.

The second misconception concerns how face recognition technologies, specifically, are biased. Most people assume the bias is prejudicial, for or against certain ethnic groups, and while that may be true for some algorithms, the assumption overlooks technical and operational bias, Blackburn said.

When face recognition technologies were first being developed 20 years ago, image lighting, pose angle and pixel count greatly affected results; performance differences driven by such image-quality factors are known as technical bias.

One face recognition algorithm performing more accurately than another because it was trained for longer with more data is an example of operational bias, which affects how the system works in practice.

“There are not direct correlations between technical and operational biases and prejudicial bias, even though in a lot of policy analyses they’re treated as equivalent,” Blackburn said. “You can take a biometric algorithm with no differential performance technical bias and create systems with massive prejudicial bias.”

The opposite is also true, he added.
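One way to make the distinction concrete is that technical and operational bias are measured as differential error rates across slices of an evaluation set (by lighting, pose, demographic group and so on), independently of whether a deployed system treats anyone unfairly. Below is a minimal sketch of such a measurement, assuming a labeled set of comparison trials and a threshold-based matcher; the data structure and field names are hypothetical.

```python
# Minimal sketch: differential error rates across slices of an evaluation set
# (e.g. by lighting condition, pose or demographic group). The Trial structure
# and its fields are assumptions for illustration, not a standard format.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Trial:
    group: str       # slice label, e.g. "low_light", "frontal_pose" or a demographic group
    is_mated: bool   # True if both images show the same person
    score: float     # similarity score from the matcher


def error_rates_by_group(trials: list[Trial], threshold: float) -> dict[str, dict[str, float]]:
    """False non-match rate (FNMR) and false match rate (FMR) per slice."""
    counts = defaultdict(lambda: {"fn": 0, "mated": 0, "fm": 0, "nonmated": 0})
    for t in trials:
        c = counts[t.group]
        if t.is_mated:
            c["mated"] += 1
            c["fn"] += t.score < threshold    # same-person pair wrongly rejected
        else:
            c["nonmated"] += 1
            c["fm"] += t.score >= threshold   # different-person pair wrongly accepted
    return {
        group: {"FNMR": c["fn"] / max(c["mated"], 1),
                "FMR": c["fm"] / max(c["nonmated"], 1)}
        for group, c in counts.items()
    }
```

A gap in error rates between slices indicates technical or operational bias in the algorithm itself; whether that gap produces prejudicial outcomes depends on how the surrounding system is designed and used, which is the distinction Blackburn draws.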

Lastly, MITRE recommends OSTP ensure any policy decisions around biometrics are focused and nuanced, given the many biometric modalities that exist: fingerprints, face recognition, iris recognition and some aspects of DNA.

“You can’t really come up with a singular policy that’s going to be proper for all three or four of those modalities,” Blackburn said.

Using biometrics to unlock a phone is “significantly different” from law enforcement using them to identify a criminal, and decisions will need to be made about what data sharing is allowable under the AI Bill of Rights, he added.

An OSTP task force released a report on scientific integrity in early January that reinforced the need for technical accuracy when making policy decisions. Challenges aside, Blackburn said he remains optimistic that OSTP is up to the task of crafting an AI Bill of Rights.

“How can we set up the policy so that it’s accurate from a technical, scientific-integrity perspective, while also meeting the objectives of the public that they represent,” Blackburn said. “It’s not easy, it takes a lot of time and effort, but OSTP and the federal agencies working on these issues have a lot of experience doing that.”
