iPhone surveillance for CSAM too dangerous, say civil rights groups

More than 90 civil rights groups around the world have signed an open letter objecting to the so-called iPhone monitoring features and asking Apple to abandon its CSAM scanning plans.

They also want the iPhone maker to abandon its plans for nudity detection in iMessage, arguing that it could put LGBTQ+ young people at risk.

The signatories to the letter include the American Civil Liberties Union (ACLU), the Canadian Civil Liberties Association, Digital Rights Watch in Australia, Liberty in the UK, and the global organization Privacy International.

The letter highlights the main risk, raised by many observers: abuse by repressive governments.

Once this capability is built into Apple products, the company and its competitors will face tremendous pressure – and possibly legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable.

Those images may be of human rights abuses, political protests, images companies have labeled as “terrorist” or violent extremist content, or even unflattering images of the very politicians who pressure the company to scan for them.

And that pressure could extend to all of the images stored on the device, not just those uploaded to iCloud. In doing so, Apple will have laid the foundation for censorship, surveillance, and persecution on a global scale.

But the letter also says that separately scanning children’s iMessage accounts for nudity, another form of iPhone surveillance, could put children at risk.

The system developed by Apple assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child and that these individuals are in a healthy relationship. This may not always be the case; an abusive adult could be the organizer of the account, and the consequences of notifying the parents could endanger the child’s safety and well-being. LGBTQ+ adolescents on family accounts with unsympathetic parents are particularly at risk.

The letter states that the groups respect Apple’s intentions, but that the company should stand by its privacy values.

We support efforts to protect children and take a firm stand against the spread of CSAM. But the changes announced by Apple put children and its other users at risk now and in the future. We urge Apple to abandon these changes and reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to consult more regularly with civil society groups and vulnerable communities that may be disproportionately affected by changes to its products and services.

It follows a similar letter sent to Apple by the German Bundestag a few days ago.

Via Reuters. Image: Yahoo
