Claude, Anthropic's AI-powered chatbot, can now search the web — a capability it has long lacked.
Web search is currently available in preview for paid Claude users in the US; Anthropic said in a blog post that support for free users and additional countries is coming soon. Users can toggle web search on in their profile settings in the Claude web app, and Claude will automatically search the web to inform certain responses.
For now, web search only works with Anthropic's latest model, Claude 3.7 Sonnet, the company said.
“When Claude incorporates information from the web into its answer, it provides direct citations, making it easy to see the source,” the company wrote in the blog post. “Instead of finding search results yourself, Claude processes and delivers relevant sources in a conversational format. This expands Claude's extensive knowledge base with real-time insights, providing answers based on more current information.”
In a brief test of the feature, web search didn't consistently trigger for questions about current events. But when it did, Claude drew from a range of sources, including social media (e.g., X) and news outlets such as NPR and Reuters, and provided answers with inline citations.
Claude searching web sources. Image Credits: Anthropic
Claude's ability to search the web brings it to feature parity with most rival AI-powered chatbots, such as OpenAI's ChatGPT, Google's Gemini, and Mistral's Le Chat. Anthropic had previously argued against the feature, saying Claude was “designed to be self-contained.” Competitive pressure was no doubt a factor in the reversal.
The risk, of course, is that Claude hallucinates or misrepresents web sources. Other chatbots suffer from this. According to a recent study from the Tow Center for Digital Journalism, popular chatbots, including ChatGPT and Gemini, gave incorrect answers to more than 60% of queries. Another report, from The Guardian, found that ChatGPT Search, OpenAI's search-centric ChatGPT experience, could be fooled into generating completely misleading summaries.