{"id":43197,"date":"2021-05-26T11:55:00","date_gmt":"2021-05-26T08:55:00","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=43197"},"modified":"2025-08-31T02:34:30","modified_gmt":"2025-08-30T23:34:30","slug":"china-tests-emotion-detection-ai-on-uyghurs","status":"publish","type":"post","link":"https:\/\/u1f987.com\/en\/china-tests-emotion-detection-ai-on-uyghurs\/","title":{"rendered":"China tests emotion-detection AI on Uyghurs"},"content":{"rendered":"<p>A software engineer described testing an emotion-recognition system on Uyghurs at police stations in <span data-descr=\"the western region of China home to some 12 million Uyghurs and other ethnic minorities\" class=\"old_tooltip\">Xinjiang<\/span>, according to the <a href=\"https:\/\/www.bbc.com\/news\/technology-57101248\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">BBC<\/a>.<\/p>\n<p><!--more--><\/p>\n<p>According to the engineer, the system uses artificial intelligence and facial recognition to capture a person\u2019s emotions. The technology resembles a lie detector, but with &#8220;much more advanced&#8221; capabilities.<\/p>\n<p>Video cameras were positioned three metres from the subjects, who were seated in special chairs with their wrists and ankles secured by metal restraints, an eyewitness said.<\/p>\n<p>The engineer showed journalists and human rights advocates five photographs of detained Uyghurs and noted that the recognition system is intended for &#8220;pre-judgment without any reliable evidence.&#8221;<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cThe Chinese government uses Uyghurs as test subjects for various experiments, just as rats are used in laboratories,\u201d he said.<\/p>\n<\/blockquote>\n<p>According to the programmer, the system detects and analyzes even subtle changes in facial expressions. 
The software creates a pie chart, with red segments representing negative emotions.<\/p>\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh4.googleusercontent.com\/6ye9mZ6HnyhMyeKkI5Z-VIGNSjm5nhJEJmGz4Q1MjJEtO2ibNfL9NStGj5c4D7FHHksuAJwZalDjxLu9n5zjIsdNVoUuAV1B75u4glx2xnp4-cJfxVLU9vv5jNy7-b8_S1Pw9Zjr\" alt=\"China tests emotion-detection AI on Uyghurs\"\/><figcaption><em>Conclusions of the emotional well-being assessment; red indicates a negative or anxious state. Data: BBC.<\/em><\/figcaption><\/figure>\n<p>China\u2019s embassy in London did not directly answer journalists\u2019 questions about the use of emotion-recognition software in the region, saying only that it respects minority rights.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cPolitical, economic and social rights and the freedom of religious belief for all ethnic groups in Xinjiang are fully guaranteed,\u201d the embassy said.<\/p>\n<\/blockquote>\n<p>In March 2021, Facebook <a href=\"https:\/\/u1f987.com\/en\/news\/espionage-against-uyghurs-on-facebook-ransomware-attacks-and-other-cybersecurity-news\">blocked a group of Chinese hackers<\/a> who used the platform to surveil Uyghur journalists and activists.<\/p>\n<p>In December 2020, Chinese IT company Alibaba acknowledged that <a href=\"https:\/\/u1f987.com\/en\/news\/alibaba-used-ai-to-identify-minorities-uk-shows-queen-elizabeth-ii-deepfake-and-other-ai-news\">it had developed facial-recognition technology to identify the Uyghur minority<\/a> in China.<\/p>\n<p>Subscribe to ForkLog news on Telegram: <a href=\"https:\/\/t.me\/forklogAI\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ForkLog AI<\/a> \u2014 all the news from the world of AI!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A software engineer described testing an emotion-recognition system on Uyghurs at police stations in 
Xinjiang.<\/p>\n","protected":false},"author":1,"featured_media":43198,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,133,1515],"class_list":["post-43197","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-china","tag-tracking"],"aioseo_notices":[],"amp_enabled":true,"views":"31","promo_type":"1","layout_type":"1","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/43197","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/comments?post=43197"}],"version-history":[{"count":1,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/43197\/revisions"}],"predecessor-version":[{"id":43199,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/43197\/revisions\/43199"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media\/43198"}],"wp:attachment":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media?parent=43197"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/categories?post=43197"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/tags?post=43197"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}