From Gaza to Ukraine, today's war zones are being used as testing grounds for brand-new systems based on artificial intelligence. Billions of dollars are currently being pumped into AI weapons technology, much of it from Silicon Valley venture capitalists.
In this episode of The Conversation Weekly podcast, we talk to Elke Schwarz, who studies the ethics of autonomous weapons systems, about what this influx of new investment means for the future of warfare.
The introduction of AI into the defence industry is now attracting significant amounts of money. In 2024, the global military AI market was worth an estimated $13.3 billion (£10.8 billion), with projected growth to $35 billion over the next seven years. Elke Schwarz, a lecturer in political theory at Queen Mary University of London in the United Kingdom, has just published a new study showing that a key driver of the growth of military startup products, such as autonomous drones and other AI-powered systems, is the influx of large investments and influence from venture capital firms.
Venture capitalists have traditionally been wary of the defence sector. U.S. military contracts were typically won by a few large firms, and the industry was considered difficult to gain a foothold in. Schwarz says it also remains "ethically frowned upon" to profit from conflict. But those "moral concerns were put aside very quickly," she says, when it seemed possible to disrupt the defence sector.
In 2016, the technology startup Palantir sued the US Army over procurement rules that excluded other firms from competing for a particular contract. A judge ruled in Palantir's favour, and the company went on to secure a contract worth $823 million. This paved the way for more startups to compete for contracts. In December, the Financial Times reported that Palantir and another defence startup called Anduril were in discussions with about a dozen other technology firms to create a consortium that would compete directly for U.S. government contracts.
VC logic enters the game
Successful startups need to grow quickly and be ambitious if they want to keep attracting rounds of investment. And that logic shapes the narrative around startups promoting AI products, says Schwarz:
You have to make big promises. You have to think big. You have to express great intentions, perhaps unattainable but truly tempting goals. It's not that it's all fantasy, but it is all really exaggerated… and you have to make yourself indispensable to create a vision of inevitability.
Using this language of inevitability, Schwarz says, the most vocal startup founders and their VC backers claim that "war can only be won with more AI." They argue that AI systems will make it possible to win wars faster and more precisely than in the past.
However, her research questions the impact of incorporating AI systems into military decision-making, and particularly into the kill chain. She points to a report by the investigative magazine +972 on Israel's alleged use of AI-powered systems in the Gaza war to identify Hamas militants as targets for possible airstrikes. For Schwarz, such developments suggest that "there may be an inclination to use technologies indiscriminately or imprecisely, no matter how precise they may be."
And she fears that rather than resolving conflicts more humanely and with less violence, as advocates of military AI suggest, these systems could actually "lower the threshold for resorting to violence."
Listen to the interview with Elke Schwarz on The Conversation Weekly podcast to find out more. A transcript is available on Apple Podcasts. You can also read an article she wrote for The Conversation's Insights series about her research.