Huawei worked on several surveillance systems promoted to identify ethnicity, documents show
December 08, 2020

The Chinese tech giant Huawei has tested facial recognition software that could send automated “Uighur alarms” to government authorities when its camera systems identify members of the oppressed minority group, according to an internal document that provides further details about China’s artificial-intelligence surveillance regime.
A document signed by Huawei representatives — discovered by the research organization IPVM and shared exclusively with The Washington Post — shows that the telecommunications firm worked in 2018 with the facial recognition start-up Megvii to test an artificial-intelligence camera system that could scan faces in a crowd and estimate each person’s age, sex and ethnicity.
If the system detected the face of a member of the mostly Muslim minority group, the test report said, it could trigger a “Uighur alarm” — potentially flagging the person for police in China, where members of the group have been detained en masse as part of a brutal government crackdown. The document, which was found on Huawei’s website, was removed shortly after The Post and IPVM asked the companies for comment.