China's Huawei has tested AI software that could identify Uyghur Muslims and alert police, report says

The logo of the Chinese company Huawei in its main UK offices on January 28, 2020.

Daniel Leal-Olivas | AFP via Getty Images

GUANGZHOU, China – Huawei, in collaboration with Megvii, one of China's largest artificial intelligence (AI) companies, has tested a facial recognition system that could be used to detect members of a Muslim minority group and send alerts to authorities, according to a new report.

The Uyghurs, an ethnic minority who live primarily in China's western Xinjiang region, are a Muslim group frequently targeted and repressed by the Chinese government.

An official 2018 document produced by the two Chinese companies showed that Huawei tested Megvii's software on its video cloud infrastructure. The document was discovered by IPVM, an American research company specializing in video surveillance analysis. IPVM shared its discovery with The Washington Post, which on Tuesday was the first media organization to report on its contents.

The test was carried out to see whether Huawei's hardware was compatible with Megvii's facial recognition software, the IPVM report says. Huawei provided hardware such as cameras, servers and cloud computing infrastructure, while Megvii provided the software, it added.

As part of the trial, a feature called "Uyghur Alert" was tested. The software could also determine "ethnicity" as part of its "facial attribute analysis," according to the IPVM report.

‘Uyghur alert’

A feature such as “Uyghur Alert” could be used to report a member of the minority group to authorities, according to IPVM.

“Systems like Megvii are built into the Huawei system so that information and alarms (like on Uyghurs) are generated by Megvii and then sent into the Huawei system so that monitors (e.g. police) can examine and respond,” John Honovich, president of IPVM, told CNBC via email, explaining the potential functionality of the “Uyghur alert” feature.

Huawei and Megvii’s collaboration on Uyghur alarms further proves that many large Chinese CCTV / facial recognition companies are deeply involved in the Uyghur crackdown.

John Honovich

President, IPVM

It is another tool in the arsenal of Chinese authorities, who have used technology to crack down on the Muslim minority group. The New York Times reported last year that facial recognition was used to track Uyghurs and keep tabs on their movements.

IPVM found the Huawei and Megvii document, marked “confidential,” via a Google search. It was uploaded to Huawei’s website but has since been deleted.

“Huawei and Megvii’s collaboration on Uyghur alarms further proves that many large Chinese CCTV / facial recognition companies are deeply involved in the Uyghur crackdown. Anyone who does business with these companies should take note,” concluded the IPVM report, written by Honovich.

Huawei and Megvii respond

Neither Huawei nor Megvii has denied the veracity of the document discovered by IPVM.

A Huawei spokesperson referred CNBC to a statement the company gave IPVM, in which it said the system had not been used in a real-life scenario.

“This report is just a test and it has not seen real world application,” the statement said. “Huawei only provides general purpose products for this type of testing. We do not provide custom algorithms or applications.”

“Huawei operates in accordance with the laws and regulations of all countries and regions in which we operate,” the statement continued, “and only provides ICT (information and communication technology) products and solutions that meet recognized industry standards.”

Huawei declined to answer further questions about the report.

Megvii told CNBC that “its solutions are not designed or customized to target or label ethnic groups.”

“Our business is focused on the well-being and safety of individuals, not on monitoring particular demographic groups,” said a spokesperson for Megvii.

US government allegations

IPVM’s Honovich noted via email that from a technical standpoint, facial recognition based on ethnicity is difficult.

“We remain skeptical of the accuracy of ethnic recognition, be it Uyghur or otherwise, even under perfect conditions, with the actual conditions of surveillance cameras (bad angles, poor lighting, far distances, etc.) exacerbating this,” he said.

This is not the first time Chinese tech companies have been linked with Uyghur surveillance. Last year, the United States placed 28 organizations on its so-called entity list. US companies are prohibited from doing business with companies on this blacklist, which includes Chinese AI champions such as Megvii, SenseTime, Hikvision and Iflytek.

Washington alleged that “these entities have been implicated in human rights violations and abuses in the implementation of China’s campaign of repression, mass arbitrary detention and high-tech surveillance against the Uyghurs, Kazakhs and other members of Muslim minority groups” in China’s Xinjiang region.

US technology used, report says

IPVM said the document showed the US semiconductor giant Nvidia helped power the joint Megvii and Huawei surveillance system with its Tesla P4 GPU chip.

The report noted that it was not clear whether Nvidia knew what its chips were being used for. Nvidia did not respond to a request for comment when contacted by CNBC.

Last month, The New York Times reported that Intel and Nvidia chips were used to power computers capable of processing and viewing surveillance footage as part of China’s surveillance of Uyghurs in Xinjiang.

On Tuesday, US Senator Marco Rubio and US Representative Jim McGovern sent letters to the Intel and Nvidia CEOs in response to the NYT story. The lawmakers asked the companies whether they knew how their technology was being used and whether they were taking steps to ensure their chips were “not used for human rights violations or to compromise US national security.”

Nvidia and Intel were not immediately available for comment when asked about the letters.
