We are all suspects until we prove otherwise

Hi, I’m Chris.

I’m here to help you through every step of your PRVCY journey.

Whether you’re already taking the PRVCY online courses or are a new subscriber, I’ll regularly post news and information based on our research to help you take back control of your PRVCY!

In the words of the European Union, the proliferation of images and videos of child sexual abuse, which has dramatically increased with the growth of the digital world, justifies “solutions” in which the monitoring of all our digital activities is deemed acceptable.

Even though the harm suffered by the victims is immense, and perpetrators have found new ways to access and exploit children through digital services such as hosting and interpersonal communication platforms, our privacy and intimacy remain fundamental human rights.

Public authorities worldwide want us to believe that their surveillance policies represent responsible and careful behavior essential for a safe, predictable, and trustworthy online environment and the exercise of guaranteed fundamental rights. In the United Kingdom, government officials have expressed concerns for years that online services are not doing enough to combat illegal content, especially material related to child sexual abuse.

The “solution” was the Online Safety Bill, which aims to make the United Kingdom the safest place in the world for internet use. But don’t worry, this dreadful law doesn’t only affect the United Kingdom—it serves as a blueprint for oppression worldwide. Advocates of the law point to the worst online content such as terrorist posts and child abuse, but surveillance won’t stop there.

Companies will be compelled to monitor additional categories of content and exchange information about users between different countries. Journalists and human rights activists will inevitably become targets. And users will never be sure if their private messages are being read and intercepted by private companies.

 

Apple is always one step ahead when collaborating with governments

In August 2021, Apple announced a plan to scan photos stored by users in iCloud for Child Sexual Abuse Material (CSAM). The tool was designed to protect privacy and allow the company to flag potentially problematic and abusive content without disclosing anything else.

However, the initiative was controversial, soon drawing widespread criticism from privacy and security researchers as well as digital rights groups who feared that the surveillance feature itself could be abused to undermine the privacy and security of iCloud users worldwide. In early September 2021, Apple announced a pause in the rollout of the feature to “collect feedback and make improvements before these important child safety features are released.”

In the end, the rollout never resumed: the company now states that the CSAM detection tool for iCloud Photos has been abandoned in response to the feedback and insights it received.

Instead, as Apple explained to WIRED this week, the company is focusing its efforts and investments on combating CSAM through its “Communication Safety” features, announced in August 2021 and introduced last December. Parents and caregivers can opt in through family iCloud accounts for protection. The features work in Siri, Apple’s Spotlight search, and Safari search to warn if someone is viewing or searching for Child Sexual Abuse Material and to provide resources on the spot to report the content and seek help.

 

Children, like adults, rely on encrypted communication apps like WhatsApp or Signal and rightly expect not to be subjected to mandatory identity verification, arbitrary filtering, and surveillance. Especially abused children need private and secure channels to report what has happened to them. The bill aims to protect children but disregards their privacy and violates internationally recognized principles on children’s rights.

But Parliament has not cared. Even worse, the United Kingdom is not alone in this: Since it is not feasible to garner public support for the idea that the police scan every digital message, lawmakers in other liberal democracies have resorted to workarounds, claiming that backdoors in encryption are necessary to investigate files related to the most heinous crimes. They have falsely claimed that certain methods of checking user files and messages, such as client-side scanning, do not break encryption at all.

Authorities have also tried—and thankfully failed—to pressure Apple to introduce a system of software scanners on every device that constantly searches for images of child abuse and reports to the authorities.

The importance of preventing and combating child abuse must not become a pretext for introducing mass surveillance programs. States must find solutions that keep citizens’ personal information, communications, and devices private. But because experience shows time and again that we cannot trust governments on this issue, PRVCY World was created to do exactly that.

Take a look at our PRVCY courses:
