Signal’s Meredith Whittaker: ‘I see AI as born out of surveillance’

Meredith Whittaker is not a follower of norms. It takes a number of tries to get her to choose a restaurant to meet at — her suggestions include the lobby of her hotel in London and a coffee shop that doesn’t take reservations, which makes me nervous.

Eventually, she relents and chooses Your Mum’s Kitchen, a tiny family-run eatery in the basement of a Korean supermarket on a quiet road in north London. Korean cuisine is her comfort food, she says, having grown up in Los Angeles’s Koreatown neighbourhood. Even now, Korean hotpots and kimchi jjigae, a spicy kimchi stew, are her go-to recipes when she’s cooking at home in Brooklyn.

Whittaker, who is petite, with a signature sweep of dark curls streaked with grey, is arguably Silicon Valley’s most famous gadfly. Over the past few years, she has come to represent the antithesis of Big Tech, an industry that has built its wildly successful economic model largely through “surveillance capitalism” — profiting from people’s personal data.

Whittaker, meanwhile, is the iconoclast president of the foundation behind the popular encrypted messaging app Signal, which is funded primarily by donations, and has been downloaded hundreds of millions of times by people all over the world. She is a rare tech executive who decries the excesses of corporate power, rails against what she calls a “mass surveillance business model” and lobbies for the preservation of privacy.

Trying to pin down Whittaker is like bottling lightning. She is largely itinerant, travelling the world speaking on her pet themes of privacy, organised labour and power, and is constantly fizzing with words and concepts that are likely to shoot off in unexpected directions. Her day-to-day involves running a tech company, but she also publishes academic papers on the sociopolitics of AI and is an outspoken anti-surveillance activist. To her, the disparate threads form a coherent picture of what she stands for.

“I see AI as born out of the surveillance business model . . . AI is largely a way of deriving more power, more revenue, more market reach,” she says. “A world that has bigger and better AI, that needs more and more data . . . and more centralised infrastructure (is) a world that’s the opposite of what Signal is providing.”

At Google, where she began her career in 2006, Whittaker witnessed the rise of this new wave of so-called artificial intelligence — the ability to pull out patterns from data to generate predictions, and more recently create text, images and code — as Google began to leverage the valuable data trails it was harvesting from its users.

“Suddenly, all over the place, there were little courses in Google that were like, learn machine learning, apply machine learning to your thing,” she says. “We hadn’t decided it was called AI then. The branding was still kind of up in the air.”

In 2014, a visiting engineer from Harvard told Whittaker about an idea he had to use AI software to predict genocides. She remembers this as the moment she began to have ethical concerns about the technology.

She knew the software to be imperfect, shaped heavily by the human behavioural data it was trained on, which was often biased, incomplete and messy. The idea still haunts her: “I was like, how do you know that this is actually accurate, and how do you take responsibility for the fact that a prediction itself can tip the scales? And what’s the role of such a thing?”

It prompted her to co-found the AI Now Institute in New York, alongside Kate Crawford, a colleague at Microsoft, to research the urgent societal impacts of AI, focused on the present rather than an amorphous future. Since then, she has been involved with organising worldwide worker walkouts to protest against Google’s military contracts, and has advised Lina Khan, the chair of the US Federal Trade Commission, on the link between corporate concentration of power and AI harms.

Whittaker became president of the Signal Foundation in 2022. The Signal app it runs is used in the most sensitive scenarios by militaries, politicians and CEOs as well as dissidents and whistleblowers, and the encryption techniques developed by its engineers over the past decade are used by its rivals WhatsApp and Facebook Messenger. It currently oscillates between 70mn and 100mn users every month, depending on external triggers such as the wars in Ukraine and Gaza, which cause spikes in sign-ups, she explains.

“Within Signal, we’re constantly trying to collect no data,” she says. “Having to actually make the thing (when) we’re dependent on large actors in the ecosystem who set the norms, own the infrastructure . . . as an intellectual proposition is really interesting.

“It’s given me a lot of analytical tools to use in the AI space and think about the political economy from inside.”


Whittaker is clearly the expert between us on the menu, and she spearheads the ordering of sundubu jjigae — a soft tofu stew with seafood that she later proclaims “the perfect food” — plump vegetable dumplings and sour-hot kimchi on the side. My ramen bowl is piled with sliced bean curd and a fried egg floats, jewel-like, on the surface. The café is unlicensed (you can bring your own booze) but Whittaker tries to interest me in barley tea. It’s even better iced on a hot day, she says wistfully.

Menu

Your Mum’s Kitchen

Sundubu jjigae £11.50
Ramen £5.80
Fried bean curd £2.30
Vegetable dumplings £3
Kim £1.50
Kimchi £3.50
Barley tea x2 £4.60
Ginger honey tea x3 £5.60
Total inc service £41.58

The mother-daughter duo in the open kitchen are working in practised tandem, plating and ladling out steaming bowls whose aromas permeate the compact pastel-painted room. Whittaker says it reminds her of dinners with her dad as a teenager at a student joint called Tofu House in LA, usually after an argument. It’s now been turned into a chain, she complains.

Whittaker arrived at Google straight from a degree in English literature and rhetoric at the University of California, Berkeley. Before then, the only jobs she’d had were helping out in LA jazz clubs where her dad played the trombone. “I didn’t have a career ambition, it was very happenstantial,” she says. “I see kids now whose lives are structured within an inch of every minute. And it seems really hard as a teenager who’s developing and trying to figure out yourself in the world.”

The job was in customer support and involved resolving user complaints. But because this was Google in the late 2000s, the team was staffed by college graduates with humanities degrees from elite universities, including a woman who as an undergraduate had codified a Central American language that had never been mapped before. Their performance was scored based on how many reported bugs the engineering team resolved, but this was particularly hard to do, because the team was in a different building to the engineers.

Whittaker decided the efficient thing to do would be to ride over to the engineering building (Google has free campus bikes) and set herself up on a couch there, so that she could collaborate directly with the engineers and solve problems in real time. It was her first taste of how large tech companies worked — if you wanted to fix something, you had to work around the bureaucracy and do it yourself, even if it meant making enemies of your managers and colleagues.

It was at Google that Whittaker also learnt the subtleties of building a global internet business. She worked closely with the internally powerful technical infrastructure team, which would go on to become Google Cloud, and became particularly interested in net neutrality — the concept of an open and democratised internet. It led her to found an open-source research group known as M-Lab, working with civil society and privacy researchers outside Google to measure the global speed and performance of the internet. Google funded M-Lab with roughly $40mn a year, which Whittaker calls a “rounding error” for the search giant. But it gave her an idea of how much it would cost an independent, community-led project to build new internet infrastructure without a Big Tech sponsor.

In 2010, through her work at M-Lab, Whittaker became involved in online security circles, getting to know digital mavericks accused of being paranoid about privacy in a pre-Snowden era. “I thought of it as cutting tributaries in the river (of Google) and releasing some of it to anarchist privacy projects, like . . . Tor,” she explains, referring to the non-profit that runs anonymous web browsers. “I was just trying to figure out how do we support the community that’s effectively building prophylactics to the business model I’m starting to understand?”

That was when she first met Moxie Marlinspike, a cryptographer and entrepreneur who had founded Signal, which she was helping to fundraise for at the time. “There just wasn’t an understanding then of what it actually meant economically to build large-scale tech . . . and there still isn’t,” Whittaker says. “People are too afraid to ask the political and economic questions.”

More than a decade later, as president of the Signal Foundation, she remains a privacy absolutist — committed to the idea of end-to-end encryption and the need for pockets of digital anonymity in an industry fuelled by the monetisation of data — despite political pushback from governments all over the world.

She has also chosen to publish a detailed breakdown of Signal’s operating costs, estimating that by 2025 Signal will require roughly $50mn a year to operate. Most of the cost is to maintain the digital infrastructure required to run a real-time consumer messaging app, such as servers and storage, but the app’s end-to-end encrypted calling functionality is one of the most expensive services it provides.

“I wanted to talk about the money, and how not free tech is, if you’re not willing to monetise surveillance. What’s paying for this if you don’t know the cost?” she says. “Not that many people have access to this information, and one thing I can do is shift the narrative by speaking truthfully about the economics.”

Until 2017, Whittaker had thought she could successfully mobilise change from inside the machine, building up ethical AI research and development programmes at Google in collaboration with academics at universities and companies such as Microsoft. But in the autumn of that year, a colleague contacted her about a project they were working on. They had learnt it was part of a Department of Defense pilot contract, codenamed Project Maven, that used AI to analyse video imagery and eventually improve drone strikes. “I was basically just a . . . dissent court jester,” she says, still visibly disenchanted.

She drafted an open letter to Google’s chief executive, Sundar Pichai, that received more than 3,000 employee signatures, urging the company to pull out of the contract. “We believe that Google should not be in the business of war,” the letter said.

“The Maven letter was kind of like, I can’t make my name as an ethical actor redounding to Google’s benefit,” she says. “You’re talking about Google becoming a military contractor. It’s still shocking, even though it’s become normalised for us, but this is a centralised surveillance company with more kompromat than anyone could ever dream of, and now they’re partnering with the world’s most lethal military, as they call themselves.

“Yeah, that was the end of my rope.”

Whittaker went on to help organise worker protests and walkouts, in which more than 20,000 Google staff participated, against the company’s handling of other ethical matters such as sexual harassment allegations against high-profile executives. At the time, Google’s management opted not to renew the Pentagon contract once it expired. But Whittaker left Google in 2019, after the company presented her with a set of options that she says gave her no choice but to quit. “It was like, you can go be an administrator, doing spreadsheets and budgets for the open source office (and) stop all the shit I had been building forever.”

In recent academic papers that Whittaker has published on the closed nature of AI technology and the industrial capture of the field, she often refers to the power of workplace activism and organising inside tech firms and universities as a lever to check the tech industry’s dominance and power over civil society, academia and governments.

“This is how I landed on labour organising and social movements. It wasn’t an ideological a priori radicalism,” she says. “Politics aside, I tried force of argument, I tried working from the inside, I tried working in government. The place that seems to have the capacity to rein in capital appears to be labour.”


Whittaker is on the road for more than 120 days a year. To stay sane, she says she sticks to little routines wherever she is in the world. Like making a French press coffee from a particular chicory-mushroom blend she likes, or doing a daily yoga class via video from her teacher’s New York studio. She always tries to cook dinner on the day she gets home from a trip, to feel grounded again, “like a human in the world, not a brain on a stick”.

She reads voraciously on the political history of technology and today is animated by efforts to reshape the current tech ecosystem through the lessons she is learning.

“The (AI) market is crazy right now. Seventy per cent of Series A (early-stage start-up) investment is coming from the hyperscalers, and the majority of that goes back to the hyperscalers,” she says, referring to cloud companies Microsoft, Google and Amazon. “It’s like a Potemkin market, it’s not a real market.”

The consequences, according to Whittaker, are a handful of centralised actors that are determining the shape of AI and, ultimately, who gets to use systems that are capable of making sensitive determinations in health, war, financial services and energy. “There’s been a real problematic assumption that you have to have a computer science degree to make decisions about medical care or education or resource distribution from public agencies,” she says.

“We are led to see these (AI systems) as a kind of . . . revolutionary inflection point in scientific progress. I don’t think they’re that. They are the derivatives of big network monopolies, and they’re a way for these monopolies to grow their reach.”

I ask if she sees any potential for the AI systems we’re building now to have a positive impact, but she pushes back. “Not without radical social change,” she says — a restructuring that would disrupt the economic forces around AI and the handful of private companies that currently control it, but also one that prioritises social goals over revenue and growth. In order to do that, she refers to the idea proposed by Maria Farrell and Robin Berjon of “rewilding the internet” — renewing the current ecosystem so that it’s surrounded by “other pine trees and forest life”, a plurality of digital life.

This is where she feels that projects such as Signal play a role. “Independent alternatives (to Big Tech) are actually safe havens from those paradigms,” she says. “For me, fighting for another model of tech, as an infrastructure for dissent and honesty in the face of . . . surveillance, it’s the same project. It’s just another place to do it.”
