Friend, a startup developing a $99 AI-powered necklace designed to act as a digital companion, has delayed shipping its first batch until the third quarter.
Friend had planned to ship the device to pre-order customers in the first quarter. But co-founder and CEO Avi Schiffmann says that's no longer possible.
“We had hoped to ship in the first quarter of this year, but improvements are still needed. Unfortunately, we can only begin manufacturing the electronics once the design is 95% complete,” Schiffmann said in an email to customers. “We hope to begin the final sprint by the end of February, when the prototype is complete.”
Email sent to all Friend pre-order customers: pic.twitter.com/wUPR0OhpI4
— Avi (@AviSchiffmann) January 20, 2025
Friend, which has an engineering staff of eight and $8.5 million in funding from investors including Perplexity CEO Aravind Srinivas, raised eyebrows when it spent $1.8 million on the domain name Friend.com. This fall, as part of what Schiffmann called an “experiment,” Friend debuted a web platform on Friend.com that lets people converse with random AI characters.
Reception was mixed. TechRadar's Eric Schwartz noted that Friend's chatbots often inexplicably open conversations with traumatic anecdotes, such as robberies or shootings. Indeed, when this reporter visited Friend.com on Monday afternoon, a chatbot named Donald shared that [his] “past” had “astonished him.”
My experience using Friend's chatbot. Image Credits: Friend
In the same email, Schiffmann also said that Friend plans to end the chatbot experience.
“I'm thrilled that millions of people have played with what I believe is the most realistic chatbot,” Schiffmann wrote. “This really demonstrated our in-house ability to manage traffic and taught us a lot about digital companionship… [But] I'd like to focus solely on hardware going forward; I've found that digital chatbots and physical companions don't mix well.”
AI-powered companions have become a hot-button topic. Character.AI, a Google-backed chatbot platform, has been accused in two separate lawsuits of causing psychological harm to children. Some experts worry that AI companions could worsen isolation by replacing human relationships with artificial ones, and that they could generate content harmful to users' mental health.