Friend, a startup developing a $99 AI-powered necklace designed to function as a digital companion, has delayed its first shipments until the third quarter.
Friend had planned to ship devices to pre-order customers in the first quarter. But according to co-founder and CEO Avi Schiffmann, that is no longer feasible.
“Although I’d have liked to ship in the first quarter of this year, I still have improvements to make, and unfortunately you can’t start making electronics until you’re 95% done with your design,” Schiffmann said in an email to customers. “I estimate that we will start our final push at the end of February, when our prototype is ready.”
An email I sent to all Friend pre-order customers: pic.twitter.com/wUPR0OhpI4
– Avi (@AviSchiffmann) January 20, 2025
Friend, which has an eight-person engineering team and $8.5 million in capital from investors including Perplexity CEO Aravind Srinivas, made waves when it spent $1.8 million on the domain name Friend.com. Last fall, as part of what Schiffmann called an “experiment,” Friend launched a web platform at Friend.com that let people talk to random AI characters.
The reception was mixed. TechRadar’s Eric Schwartz noted that Friend’s chatbots often inexplicably opened conversations with anecdotes about trauma, including assaults and layoffs. Indeed, when this reporter visited Friend.com on Monday afternoon, a chatbot named Donald shared that the “ghosts of his past” were “scaring the hell out of him.”
In the same email, Schiffmann also said that Friend would be discontinuing its chatbot experience.
“We are pleased that millions were able to play around with what I believe to be the most realistic chatbot,” Schiffmann wrote. “That really demonstrated our internal ability to manage traffic and taught us a lot about digital companionship… (But) I want us to focus solely on the hardware, and I’ve realized that digital chatbots and embodied companions don’t mix well.”
AI-powered companions have become a hot topic. Character.AI, a Google-backed chatbot platform, has been named in two separate lawsuits alleging psychological harm to children. Some experts have expressed concerns that AI companions could worsen isolation by replacing human relationships with artificial ones, and could generate harmful content that aggravates mental health problems.