Meredith Whittaker

Ignoring expert consensus, and showing no shame after exposés revealed tech lobbyists shaping these EU surveillance proposals, EU politicians are at it again

So, we'll reiterate: Signal would rather leave the EU market than subject our users to mass gov surveillance. FULL STOP

netzpolitik.org/2024/internes-

@Mer__edith Any non-xitter link for this seemingly interesting statement?

@richlv@mastodon.social @Mer__edith@mastodon.world


Statement, signed by 250+ researchers, warning that the modifications to the CSAM-detection Regulation proposed by the EU presidency don't solve the issues pointed out by experts. It still introduces societal risks without solving the CSA problem.
http://www.csa-scientist-open-letter.org/

I summarized the issues with the regulation before
https://twitter.com/carmelatroncoso/status/1676192115414519808
1) Given the state of detection technology (too many errors, and easy to evade), it can't be effective
2) Introducing detection technologies on users' devices undermines the protection of end-to-end encryption


The @EU2024BE presidency proposes two modifications to address these issues:
1) require more than one suspicious detection to trigger a report
2) only do mandatory detection in high-risk services
They also say that all of this will be done while respecting the protection provided by encryption

The first modification does not address the poor quality of the detectors. False positives will still be numerous even with two detections. The safeguard assumes such false alarms are independent, but in reality they are not: sharing several beach pictures, for example, can trip the same filter repeatedly.
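The dependence point can be made concrete with a back-of-the-envelope calculation. All rates below are made up, purely for illustration: requiring two detections only squares the false-positive rate if alarms are independent; when a detector that misfires on one beach photo misfires on a similar one too, the reduction almost disappears.

```python
# Hypothetical rates, purely illustrative (not real detector statistics).
p_alarm = 0.01               # assumed false-positive rate on a single image
p_second_given_first = 0.9   # correlated case: a second similar photo trips
                             # the same misfiring filter almost surely

# Probability that an innocent pair of photos triggers a report
# under the "two detections required" rule:
p_report_independent = p_alarm * p_alarm               # the hoped-for p^2 reduction
p_report_correlated = p_alarm * p_second_given_first   # barely below a single alarm

print(f"independent alarms: {p_report_independent:.6f}")
print(f"correlated alarms:  {p_report_correlated:.6f}")
print(f"correlation makes a report {p_report_correlated / p_report_independent:.0f}x more likely")
```

With these illustrative numbers, correlation makes an innocent report 90 times more likely than the independence assumption predicts.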

Such a modification also does not address the issue that detectors are easy to evade: those wanting to share illicit material will do so, while innocent users will get caught again and again by false positives.

The second modification does not help with the issue of proportionality either. End-to-end encrypted services (like the most-used messaging systems: Signal, WhatsApp, Threema, Telegram) are high risk. Thus their billions of users will still be subject to mandatory detection.

Also, as we have pointed out before, if these platforms become a problem for those sharing CSAM, they will move to other platforms (proprietary ones, or those without detection), while legitimate users will stay and be subject to continuous screening for no reason.

More importantly, adding detection capabilities to end-to-end encrypted services BREAKS the protection given by encryption. Once messages can be read by anyone who is not the sender or receiver, encryption is useless...

... just as an envelope provides no privacy if someone can read the letter before it is sealed, and walls provide no privacy if a camera can be placed inside, even if this camera/letter-reader only produces reports when it detects suspicious activity.


We are also worried that the regulation pushes for further uses of technology, such as age verification, when it is not clear that such technology (like CSAM detectors) is ready, as the EU Parliament itself recognizes
https://www.europarl.europa.eu/RegData/etudes/ATAG/2023/739350/EPRS_ATA(2023)739350_EN.pdf

(1/2)


@richlv@mastodon.social @Mer__edith@mastodon.world

Finally, we stress our disappointment that, despite the many critiques of the process, it continues to evolve without consulting academic experts, and without any transparency about who is consulted or about the methods that will be used to implement the regulation

Protecting children from online abuse while preserving their right to secure communications is critical. Eradicating CSAM relies on eradicating abuse, not only abuse material. Technocentric approaches focused on shared material don't tackle the core of the problem.

We recommend substantial increases in investment and effort to support existing, proven approaches to eradicating abuse (as indicated in the letter) and, with it, abusive material. Our communications don't need to be threatened to achieve this.

(2/2)

@richlv @Mer__edith I was gonna recommend following the corresponding @carmelatroncoso account via the bird.makeup service, but that mirror hasn't cached any corresponding tweets yet. More info on the bird.makeup service:


@taivlam @Mer__edith
Well, that's sort of spending time and money to give attention (donate resources, basically) to xitter.
Self-harm, in a way :)

@Wittenborner Sure, it would be sad. But saying that Signal is “too secure to be legal in the EU” is a powerful statement.

I really hope that Threema will do the same. From WhatsApp and Telegram, I expect nothing though. They have other priorities than security.

@shred @Mer__edith Thinking about this again, I must say that you are absolutely right!

@Mer__edith you guys should make it possible to sign up without a phone number and such; this legislation will happen. From the user end we can deal with the EU exit using VPNs, but not if Signal relies on the app stores and such.

Well, if you provide an .apk somewhere online, then it's not really possible to make you leave 😉

@Mer__edith
What would "leave the EU market" even mean?
Just not available in App Stores, or would you also make it impossible to install Signal and use it with EU phone numbers?

@Mer__edith Having worked for multiple years in content moderation (CSAM being one of the areas), I can 100% confirm that this act (like many have already said) will do nothing. There are automations being used to try to help detect CSAM, but they are not effective/accurate enough to be reliable.

Also, the “rules & regulations” that content moderators have to follow are often very arbitrary and don’t help solve the problem.

@Mer__edith If you can still send text, you can send pictures encoded in blobs, or simply share URLs to criminal content.
Criminals could also sideload apps and use VPNs to "officially" send their pics within the EU.
Pointless nonsense...
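The "encoded in blobs" point is just ordinary text encoding of binary data. A minimal sketch with placeholder bytes (not a real image) shows why it defeats image-only scanning: any binary payload survives a text channel losslessly.

```python
import base64

# Placeholder bytes standing in for an image file (not real image data).
payload = b"\x89PNG\r\n\x1a\n" + b"example-binary-payload"

# Encode to plain ASCII text: a scanner that only inspects image
# attachments sees nothing but an ordinary string.
blob = base64.b64encode(payload).decode("ascii")

# The receiver recovers the exact original bytes.
restored = base64.b64decode(blob)
assert restored == payload  # lossless round trip
```

This is why, as the post argues, scanning attachments while leaving text intact cannot actually block the transfer of material.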

@Mer__edith Thanks for stepping up!

In order for citizens to have control over the state (a basis for democracy, and written into the German constitution), citizens need space without state supervision in which to hold independent discussions. The state, in turn, under democratic legitimacy, has to accept that basic principle of protection of privacy.

Scanning of private data for conformity with the state is a direct violation and intimidation, and embodies the opposite: the state's power over citizens.

@Mer__edith I think that one would prepare such legislation if one thinks that civil discourse will become illegal under coming legislation and there are no measures at the moment to censor accordingly. There seems to be a shift towards state power over people, towards taking control over communication. This meddles with and breaks the basic interpersonal fabric of our relationships being personal. As such it drives division between the people. #divideandrule is not in serving the people.

@Mer__edith The answer to any demand for back doors in encryption is “You first.” All the reasons they can’t (or won’t) implement an encryption standard with a master key are all the same reasons we can’t or won’t.