{"id":48000,"date":"2021-08-20T12:05:08","date_gmt":"2021-08-20T09:05:08","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=48000"},"modified":"2025-09-02T00:05:10","modified_gmt":"2025-09-01T21:05:10","slug":"rights-groups-urge-apple-not-to-deploy-mass-photo-scanning-tool","status":"publish","type":"post","link":"https:\/\/u1f987.com\/en\/rights-groups-urge-apple-not-to-deploy-mass-photo-scanning-tool\/","title":{"rendered":"Rights groups urge Apple not to deploy mass photo-scanning tool"},"content":{"rendered":"<p>More than 90 human-rights groups from around the world <a href=\\\"https:\/\/cdt.org\/press\/in-letter-to-apple-ceo-90-organizations-urge-tim-cook-to-scrap-plans-to-weaken-digital-privacy-and-security\/\\\" target=\\\"_blank\\\" rel=\\\"noreferrer noopener nofollow\\\">condemned<\/a> Apple\u2019s plans to scan users\u2019 devices for material depicting child abuse (CSAM).<\/p>\n<p>The Center for Democracy and Technology published an <a href=\\\"https:\/\/cdt.org\/wp-content\/uploads\/2021\/08\/CDT-Coalition-ltr-to-Apple-19-August-2021.pdf\\\" target=\\\"_blank\\\" rel=\\\"noreferrer noopener nofollow\\\">open letter<\/a>, in which it urged the corporation to abandon the use of the tool. In their view, the tech giant could usher in \u201ccensorship, surveillance, and persecution on a global scale.\u201d<\/p>\n<p>Rights groups have expressed concerns about the accuracy of Apple\u2019s technology. They say such algorithms tend to mislabel artworks, health information, educational resources, propaganda messages and other images. 
There is also a risk of government interference, the organizations say.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cAs soon as this capability is built into Apple devices, the company and its competitors will face immense pressure \u2014 and possibly legal requirements \u2014 from governments around the world to scan photos not only for CSAM but also for other images that the government deems undesirable,\u201d the letter says.<\/p>\n<\/blockquote>\n<p>The rights groups fear that authorities could pressure the company to scan devices for \u201cterrorist and extremist materials\u201d or even unflattering images of politicians.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cApple will lay the groundwork for censorship, surveillance, and persecution worldwide,\u201d they concluded.<\/p>\n<\/blockquote>\n<p>Earlier in August, Apple described its tool for <a href=\"https:\/\/u1f987.com\/en\/news\/apple-deploys-ai-to-scan-user-iphones-for-illegal-content\">scanning users&#8217; photos<\/a> for CSAM.<\/p>\n<p>In mid-August, the company promised <a href=\"https:\/\/u1f987.com\/en\/news\/apple-vows-not-to-turn-its-child-abuse-content-detection-algorithm-into-a-surveillance-tool\">not to turn its illegal-content detection algorithm into a surveillance tool<\/a>.<\/p>\n<p>Subscribe to ForkLog&#8217;s Telegram updates: <a href=\"https:\/\/t.me\/forklogAI\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ForkLog AI<\/a> \u2014 all the news from the world of AI!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>More than 90 human-rights groups worldwide condemned Apple&#8217;s plans to scan users&#8217; devices for child sexual abuse material 
(CSAM).<\/p>\n","protected":false},"author":1,"featured_media":48001,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[1112,438,1515],"class_list":["post-48000","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-apple","tag-artificial-intelligence","tag-tracking"],"aioseo_notices":[],"amp_enabled":true,"views":"8","promo_type":"1","layout_type":"1","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/48000","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/comments?post=48000"}],"version-history":[{"count":1,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/48000\/revisions"}],"predecessor-version":[{"id":48002,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/48000\/revisions\/48002"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media\/48001"}],"wp:attachment":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media?parent=48000"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/categories?post=48000"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/tags?post=48000"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}