Signal Is More Than Encrypted Messaging. Under Meredith Whittaker, It’s Out to Prove Surveillance Capitalism Wrong | WIRED

24/04/2025


WhatsApp’s (and Apple’s… and presumably Messenger’s) end-to-end encryption is the same protocol (the Signal protocol) that Signal uses. Didn’t know that!!
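The core idea behind the Signal protocol’s forward secrecy is a KDF chain: each message key is derived from a chain key that is then ratcheted forward and thrown away. Below is a minimal sketch of that symmetric-key ratchet idea using Python’s stdlib HMAC as the KDF; the label constants and the placeholder initial key are illustrative assumptions, not Signal’s actual implementation (the real protocol also layers a Diffie–Hellman ratchet on top).

```python
import hmac
import hashlib

def kdf_chain_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Advance the chain one step: derive a per-message key and the
    next chain key. The 0x01/0x02 labels are illustrative, not
    Signal's exact constants."""
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain_key

# Two parties sharing the same initial chain key derive the same
# sequence of per-message keys. Deleting old chain keys after use is
# what gives forward secrecy: past messages stay safe even if a
# current key leaks.
chain = b"\x00" * 32  # placeholder for a shared secret from key agreement
for i in range(3):
    message_key, chain = kdf_chain_step(chain)
    print(f"message {i}: key {message_key.hex()[:16]}…")
```

Because HMAC is one-way, knowing a later chain key doesn’t let you recover earlier message keys, which is the property the ratchet is after.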

Big tech isn’t responsible. Look at the massive global outages CrowdStrike and Microsoft caused by cutting some QA corners. Signal tries to be responsible. And also they don’t surveil.

“This is not a sclerotic kind of museum piece. This is an adolescent animal that is about to double, triple in size.” – haha, she does not speak like other organisation presidents.

Now that it’s gone big, paranoid hackers are asserting there must be a backdoor. As far as anyone knows, there is not. But it’s difficult to continuously prove a negative. Security researchers spreading misinformation that drives people off Signal onto other, less secure alternatives is bad, and those people should feel bad.

They have to take these stories seriously, and work hard to reassure people that they aren’t true.

So, let’s talk about money. How does Signal keep itself afloat?

…donations!? They’re a nonprofit, because at this moment that is the best way to keep their incentives pointed towards privacy and away from surveillance. If they became a for-profit company, then people would start trying to make a profit out of privacy, and you can’t do that. “There isn’t a business model for privacy on the internet” (because these people view privacy on the internet as a fundamental, free, essential human right). I imagine you could quite reasonably run it as a subscription service, but then you would be inaccessible to some people (and have to collect payment information…). Hmm.

It helps that Signal is pretty cheap to run. One of Whittaker’s goals for the Signal project is to develop these other business models.

Whittaker takes home less than some of her engineers, and they try to pay those engineers about as much as Silicon Valley does. They don’t have equity (stocks) to pay them with, so it’s “just” a good salary. This is because they run critical global infrastructure. They need salaried staff on pagers to wake up at 02:00 and fix servers.

Whittaker “started out” (i.e., first job after graduating in English and Rhetoric – ohhh, that’s why she speaks so well) in a sort of customer support role at Google. This was in 2006, as Google was about to explode, and they were really into letting their employees do weird shit and leaning into that self-satisfied “don’t be evil”, “we can do ethical tech” thing. So she ended up leading a research group, working on standards, that sort of thing.

Now we get to the “Google’s business model is deeply toxic” bit. Okay, so she started something called “Measurement Lab”, which was collecting a lot of open data on internet performance, trying to put some meat on the bones of the whole net neutrality thing (and they found that actually, yeah, there were some issues at interconnects).

Then machine learning blew onto the scene, which was supposedly doing wonderful things with very dodgy input data, way harder to measure than network data; really it was doing statistics and then calling that intelligence.

Plus she learned that the infrastructure for Measurement Lab cost $40 million a year in uplink connectivity. “This isn’t innovation, it’s capital.”

So the AI boom came out of all this capital tech had, the massive amounts of data from their social media companies, and a great interest in using AI to target ads. Oh no. (Damn, for a verbal interview, this is very well structured.)

Eventually she came to the realisation that Google kept her around as a critic so they could claim to be listening to criticism, hers, but then just ignored it. Then came military contracts: AI targeting strikes based on metadata. Google had previously disavowed doing military work. Eventually Whittaker and other senior AI peeps at Google started walkouts, with thousands and thousands of participants.

Also, the whole thing doesn’t work: the Project Maven systems are shitty and buggy. And that was years ago; it’s only gotten worse. The IDF is using AI to target strikes in Gaza. This keys into well-known facts about Israel’s tech/defence industry and surveillance. Gaza is extremely surveilled.

AI is the product of mass surveillance business models; they aren’t separate technological phenomena. That’s why Signal (i.e., Whittaker) cares about both. “AI is the narrative. It’s not the technology. Surveillance and infrastructure are the material conditions.”

Suppose we got AI-powered radiological detection that works, but then it’s released into a system where it’s not used to treat people, but to deny coverage or turn patients away. How is it really going to be used? (Ok, this bit is US-focused.) It’s about power, material conditions, who owns the capital, and so on.

Signal is the example of the other paradigm: proof things can be done differently. It doesn’t single-handedly constitute a weapon to defeat surveillance capitalism and AI. Proton recently became a nonprofit.

My biggest thoughts are about the infrastructure. How do you draw down and democratise that cloud (“other people’s computers”) infrastructure?

Nonprofit means there’s no motivation for them to expand to fill the market, no need for a Signal search, Signal storage, and so on. There is room for others. They aren’t competing. Companies don’t need to dominate everything.

Currently, users are coerced onto big tech by a lack of alternatives, not because they like it. Everybody uses Windows because it is preinstalled on their laptops, and getting Linux to work is something you have to do yourself, and could be tricky.

Signal is just one example, but the first.

Okay, what do I think:

- lol, I’d enjoy working at Signal. I hope more (sustainable, long-lasting) companies like it exist by the time I’m looking for work.
- Very Cory Doctorow core.
- Signal was founded by, like, a guy. I’m just a person. Can I do cool things like that too? (yeah!)