{"id":45244,"date":"2021-06-30T13:50:24","date_gmt":"2021-06-30T10:50:24","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=45244"},"modified":"2025-08-31T20:50:37","modified_gmt":"2025-08-31T17:50:37","slug":"study-finds-u-s-federal-agencies-lax-in-facial-recognition-use","status":"publish","type":"post","link":"https:\/\/u1f987.com\/en\/study-finds-u-s-federal-agencies-lax-in-facial-recognition-use\/","title":{"rendered":"Study finds U.S. federal agencies lax in facial-recognition use"},"content":{"rendered":"<p>The U.S. Government Accountability Office found an almost complete lack of accountability by federal agencies in their use of private facial-recognition systems such as Clearview AI. The matter is described in a<a href=\"https:\/\/www.gao.gov\/products\/gao-21-518\" target=\"_blank\" rel=\"noreferrer noopener nofollow\"> report<\/a> by the agency.<\/p>\n<p>According to the study, of 14 federal agencies that use private facial-recognition systems for criminal investigations, only U.S. Immigration and Customs Enforcement developed a list of approved software vendors and a log of its use.<\/p>\n<p>The agencies were also asked how they deployed the technology during the protests in the United States in the summer of 2020, and during the Capitol Hill unrest on January 6, 2021.<\/p>\n<p>Six agencies, including <span data-descr=\"Federal Bureau of Investigation\" class=\"old_tooltip\">FBI<\/span> and the United States Postal Inspection Service, used biometric identification against \u201cpersons suspected of violating the law\u201d during the protests.<\/p>\n<p>Capitol Police, U.S. Customs and Border Protection, and the Bureau of Diplomatic Security also used the technology to investigate events at the U.S. 
Capitol.<\/p>\n<p>According to the researchers, if federal agencies do not know which facial-recognition systems they are using, they have no way to mitigate privacy and security risks or ensure high algorithmic accuracy.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cWhen agencies use facial-recognition technology without a privacy impact assessment, there is a risk that they will not comply with privacy-related laws, rules, and policies,\u201d the report says.<\/p>\n<\/blockquote>\n<p>Besides private systems, many agencies use government tools for biometric identification. Twenty of 42 surveyed federal agencies reported using facial-recognition technologies supported by the Pentagon and the Department of Homeland Security.<\/p>\n<p>Government tools can contain huge numbers of identifiers: according to the GAO, the Department of Homeland Security\u2019s system stores more than 835 million biometric identifiers.<\/p>\n<p>The authors of the report issued 26 recommendations for federal agencies using facial recognition.<\/p>\n<p>In June, Amnesty International reported that <a href=\"https:\/\/u1f987.com\/en\/news\/amnesty-international-nypd-operates-15280-facial-recognition-cameras\">the New York Police Department may track people and recognize their faces<\/a> using 15,280 surveillance cameras.<\/p>\n<p>In April, U.S. 
Senator Ron Wyden introduced a bill that <a href=\"https:\/\/u1f987.com\/en\/news\/us-lawmakers-propose-to-ban-clearview-ai-by-law\">would prohibit government agencies from using Clearview AI<\/a> without a court order.<\/p>\n<p>In March, activists filed a lawsuit in California <a href=\"https:\/\/u1f987.com\/en\/news\/covid-19-diagnosis-from-cough-sounds-computer-vision-fooled-by-stickers-and-other-ai-news\">seeking to halt the operations of Clearview AI<\/a> in the state.<\/p>\n<p>Subscribe to ForkLog News on Telegram: <a href=\"https:\/\/t.me\/forklogAI\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ForkLog AI<\/a> \u2014 all the news from the world of AI!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The U.S. GAO found an almost complete lack of accountability by federal agencies in their use of private facial-recognition systems such as Clearview AI.<\/p>\n","protected":false},"author":1,"featured_media":45245,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,1515,26],"class_list":["post-45244","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-tracking","tag-usa"],"aioseo_notices":[],"amp_enabled":true,"views":"19","promo_type":"1","layout_type":"1","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/45244","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\
/comments?post=45244"}],"version-history":[{"count":1,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/45244\/revisions"}],"predecessor-version":[{"id":45246,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/45244\/revisions\/45246"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media\/45245"}],"wp:attachment":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media?parent=45244"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/categories?post=45244"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/tags?post=45244"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}