
AI emotion-detection software tested on Uyghurs by Chinese authorities: a grave concern for basic freedom, privacy and the diversity of human personality


Yesterday, on 26 May 2021, the BBC published an article headlined “AI emotion-detection software tested on Uyghurs”. The article gives a detailed overview of the aggressive, totalitarian use of technology by the Chinese government, in which smartphones serve as a central tool. The Uyghur minority in China is collectively under heavy surveillance. Having already deployed Uyghur-detection technology through advanced AI-based facial recognition software, China is now testing emotion-detection technology on Uyghurs, who are also frequently used as test subjects for other technological innovations. According to BBC investigations, the roughly 12 million Uyghurs in China, most of whom are Muslims and many of whom have been detained in so-called “re-education centers”, must regularly provide DNA samples to local officials, undergo digital scans and download a government phone app that gathers personal data, including contact lists and text messages.

This extreme form of control, espionage and surveillance endangers natural, unconstrained human development. People who are so heavily controlled are bound to suffer anxiety, nervousness and lasting psychological damage, and to develop negative emotional patterns. So, by using this sort of technology, the Chinese authorities are only harming the healthy development of the Uyghur minority.

A warning for the people of Bangladesh

Several Chinese companies have shown interest in developing next-generation tech infrastructure in Bangladesh, including 5G networks. One of them is Huawei, which developed AI-based Uyghur-detection software in the past; for that reason, we have removed the brand from our site. In general, there is no problem with working with a Chinese company to develop tech infrastructure, but we urge the government and the people of Bangladesh to be extremely careful about protecting data privacy and basic freedom. Of course, technology can be used to reduce crime and improve everyday life, but mass control, surveillance, using humans as test subjects and evaluating emotions via AI are not the way to go. In the long run this approach may backfire, as people will not be willing to live as slaves to an artificial (AI-based) system.

Source: BBC, techshohor