{"id":72913,"date":"2023-01-23T13:16:27","date_gmt":"2023-01-23T11:16:27","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=72913"},"modified":"2025-09-09T11:11:11","modified_gmt":"2025-09-09T08:11:11","slug":"tech-giants-urge-us-supreme-court-to-bar-lawsuits-against-algorithms","status":"publish","type":"post","link":"https:\/\/u1f987.com\/en\/tech-giants-urge-us-supreme-court-to-bar-lawsuits-against-algorithms\/","title":{"rendered":"Tech giants urge US Supreme Court to bar lawsuits against algorithms"},"content":{"rendered":"<p>A group of companies, users, scholars and human-rights experts have spoken out in support of the tech giants in the YouTube algorithms case being heard by the US Supreme Court, CNN reports.<\/p>\n<p>The case is Gonzalez v. Google. In 2015, a relative of the plaintiffs died in an ISIS attack in Paris. According to the plaintiffs, YouTube&#8217;s recommendation algorithms were partly responsible for the spread of recruitment videos for the terrorist organization.<\/p>\n<p>Google argues that Section 230 of the Communications Decency Act does not permit such lawsuits. Excluding AI-based recommendation systems from legal protection could lead to radical changes to the Internet, allies of the tech giants say.<\/p>\n<p>Companies such as Meta, Twitter and Microsoft have sided with Google, along with critics of the corporations, including Yelp and the Electronic Frontier Foundation.<\/p>\n<p>They are joined by Reddit and a group of the platform&#8217;s volunteer moderators, who say the lawsuit would set a dangerous precedent: in the future, the ruling could lead to lawsuits over non-algorithmic forms of recommendation and against individual users.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cThe entire Reddit platform is built on recommendations of content by users through voting and pinning posts. 
In this case there is no doubt about the consequences of the suit: their theory would sharply expand the ability to hold people accountable for their online interactions,\u201d the company said.<\/p>\n<\/blockquote>\n<p>Yelp says its business depends on providing relevant and non-deceptive reviews to its users. A decision in favor of the plaintiffs could disrupt the service&#8217;s core functions, effectively forcing it to stop curating reviews altogether, including screening out manipulative or fake ones.<\/p>\n<p>Section 230 guarantees that platforms can moderate content to provide users with the most relevant material from the vast amount of information on the Internet, Twitter said.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cToday, an average user would need about 181 million years to download all the data on the Internet,\u201d the company added.<\/p>\n<\/blockquote>\n<p>Meta argues that a new interpretation of Section 230 would spark broad debate about what it means to \u201crecommend\u201d something on the Internet.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cIf merely displaying a third-party post in a user\u2019s feed qualifies as a \u2018recommendation,\u2019 many services would face potential liability for essentially all third-party content they host,\u201d the company said.<\/p>\n<\/blockquote>\n<p>Meta representatives added that nearly all decisions about sorting, selecting, organizing and displaying third-party content could be interpreted as a \u201crecommendation.\u201d<\/p>\n<p>An adverse ruling would also threaten GitHub. 
Microsoft says that for a platform with 94 million users, the consequences of restricting Section 230 would be \u201cdevastating.\u201d<\/p>\n<p>A company spokesperson said Bing search and the LinkedIn social network also rely on the algorithmic protections of the same provision.<\/p>\n<p>According to the Stern Center for Business and Human Rights at New York University, it is impossible to craft a rule that would isolate algorithmic recommendation as a meaningful liability category. Such attempts could lead to \u201cthe loss or suppression of a substantial amount of valuable speech,\u201d especially for minorities.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cWebsites use \u2018targeted recommendations\u2019 because they make their platforms convenient and useful. Without protection for recommendations, services would have to remove third-party content [&#8230;]. Valuable freedom of speech would disappear,\u201d the NYU statement says.<\/p>\n<\/blockquote>\n<p>In November 2022, Twitter leadership <a href=\"https:\/\/u1f987.com\/en\/news\/elon-musk-axes-twitters-ethical-ai-team\">laid off a group of AI researchers<\/a> working on the transparency and fairness of the platform&#8217;s algorithms.<\/p>\n<p>In September, Facebook&#8217;s recommendation system was accused of the <a href=\"https:\/\/u1f987.com\/en\/news\/facebook-algorithms-accused-of-aiding-rohingya-genocide-in-myanmar\">\u201ctargeted incitement of atrocities\u201d<\/a> committed by the Myanmar military against the Rohingya.<\/p>\n<p>Subscribe to ForkLog news on Telegram: <a href=\"https:\/\/t.me\/forklogAI\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">ForkLog AI<\/a> \u2014 all the news from the world of AI!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A group of companies, users, scholars and human-rights experts have spoken out in support of the tech giants in the YouTube algorithms case before the US Supreme 
Court.<\/p>\n","protected":false},"author":1,"featured_media":72914,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,1162,738,26],"class_list":["post-72913","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-court-cases","tag-google","tag-usa"],"aioseo_notices":[],"amp_enabled":true,"views":"15","promo_type":"1","layout_type":"1","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/72913","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/comments?post=72913"}],"version-history":[{"count":1,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/72913\/revisions"}],"predecessor-version":[{"id":72915,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/72913\/revisions\/72915"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media\/72914"}],"wp:attachment":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media?parent=72913"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/categories?post=72913"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/tags?post=72913"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}