
Deepfake detection company Pindrop receives $100 million loan to expand its offering

The threat of deepfakes is growing as the AI tools to create them become widely available. There was a 245% increase in deepfakes worldwide from 2023 to 2024, an upward trend that, according to verification provider Sumsub, has been partly driven by upcoming election cycles. This is also affecting the corporate sector: a recent survey by Business.com found that 10% of companies were victims of deepfake fraud, such as cloned voices.

Not surprisingly, this trend has been a boon for companies that market tools to counter deepfakes and the technologies that create them. One of those companies, Pindrop, announced on Wednesday that it had secured a five-year loan of $100 million from Hercules Capital. The funds will be earmarked for product development and hiring, according to CEO Vijay Balasubramaniyan.

“With advances in generative AI, voice cloning in particular has become a powerful tool,” Balasubramaniyan told TechCrunch. “Deepfake detection using AI detection technologies is now required in every call center to stay one step ahead of fraudsters.”

Pindrop develops anti-deepfake and multi-factor authentication products aimed at companies in banking, finance and related industries. The company claims that its tools can, for instance, identify callers in call centers, and do so with greater accuracy than competing solutions.

“Pindrop uses a dataset of over 20 million utterances, both synthetic and real, to train the AI models to differentiate between real human voices and artificial ones,” said Balasubramaniyan. “We also trained on over 330 text-to-speech (TTS) models to identify the TTS models used to create deepfakes.”
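To make the idea concrete, here is a minimal, purely illustrative sketch of how a real-versus-synthetic voice classifier can be trained on labeled utterances. The directory layout, feature choice (MFCCs via librosa) and classifier below are assumptions for demonstration; Pindrop's actual models, features and training pipeline are not public.

```python
# Illustrative sketch only: a toy real-vs-synthetic voice classifier.
# The file layout, features and model are assumptions; they do not
# reflect Pindrop's actual (non-public) pipeline.
import glob

import librosa
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def mfcc_features(path: str) -> np.ndarray:
    """Summarize one utterance as its mean MFCC vector."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)  # 20-dimensional summary per utterance


# Assumed layout: real/ holds human recordings, synthetic/ holds TTS output.
real = [mfcc_features(p) for p in glob.glob("real/*.wav")]
fake = [mfcc_features(p) for p in glob.glob("synthetic/*.wav")]

X = np.vstack(real + fake)
y = np.array([0] * len(real) + [1] * len(fake))  # 0 = human, 1 = synthetic

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

A production detector would use far richer features and models, but the basic supervised setup, labeled real and synthetic utterances feeding a classifier, is the same.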

Bias is a common problem in deepfake detection models. Many audio models tend to be biased toward recognizing Western, American voices and perform poorly with other accents and dialects, which can lead a detector to classify a real voice as a deepfake.

It is debatable whether synthetic training data, that is, training data generated by AI models themselves, mitigates or aggravates this bias. Balasubramaniyan clearly believes the former, claiming that Pindrop's voice authentication products focus on the “acoustic and spectro-temporal characteristics” of voices rather than on pronunciation or language.
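For readers unfamiliar with the term, "spectro-temporal" features describe how a voice's energy is distributed across frequencies over time, independent of the words being spoken. A common, generic way to compute such a representation is a log-mel spectrogram, sketched below with librosa; this is a standard audio-processing technique used here as an assumption for illustration, not a description of Pindrop's feature pipeline.

```python
# Generic example of a spectro-temporal representation (log-mel spectrogram).
# This is standard audio processing, not Pindrop's proprietary feature set.
import librosa
import numpy as np


def log_mel_spectrogram(path: str, sr: int = 16000, n_mels: int = 64) -> np.ndarray:
    """Return an (n_mels, frames) matrix of log-scaled mel-band energies."""
    audio, _ = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)


# "utterance.wav" is a placeholder filename.
features = log_mel_spectrogram("utterance.wav")
print(features.shape)  # (64, number_of_frames)
```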

“AI-based speech recognition systems tend to display biased results related to differences in tone, accent, slang and dialect, which can have racial implications,” Balasubramaniyan said. “These biases arise from the homogeneity of the data used to train the systems, which may not reflect diverse ethnic, racial, gender or other differences, thereby limiting the range of the data used to train the AI systems.”

Regardless of the effectiveness of its products, Pindrop has made significant progress since 2011, when Balasubramaniyan, a former Google employee, founded the company with former Barracuda Networks research director Paul Judge and Mustaque Ahamad. To date, the Atlanta-based company, which employs around 250 people, has raised $234.77 million in venture capital from investors including Vitruvian Partners, CapitalG, IVP and Andreessen Horowitz.

When asked why Pindrop chose debt instead of equity this time, Balasubramaniyan said it was an “attractive option” to “efficiently raise growth capital” without diluting Pindrop's equity. (This is a common strategy.)

The proceeds from the loan will enable Pindrop to bring its technology into new sectors, Balasubramaniyan added, such as healthcare, retail, media and travel.

“With the rise of generative AI, we see tremendous demand for our solutions globally and plan to expand into countries where there are significant deepfake threats,” said Balasubramaniyan. “Pindrop is positioned to help businesses protect themselves and their consumers from increasing fraud and deepfake threats with fraud prevention, authentication and authenticity solutions.”
