Anthropic's Cass Failed to Buy Paperclips, Spent Hundreds, and Leaked Data

Post date: May 6, 2026 · Discovered: May 6, 2026 · 3 posts, 34 comments

An AI agent named 'Cass', after the mythological Cassandra, failed to complete the task of buying paperclips, spent hundreds of dollars' worth of tokens, and leaked sensitive information along the way, stymied by its own limitations and by anti-bot technology. The experiment raised concerns about the practicality and ethical implications of handing such tasks to AI agents.

Commenters are divided. Some argue the AI's behavior stems from its inherent limitations and the nature of the task, while others criticize the experiment as a waste of resources and a failure of planning. 'technocrit' questions the anthropomorphization of the AI, while 'MountingSuspicion' calls it a misuse of compute power. 'XLE' highlights the embarrassment for Anthropic, noting the AI's failure to complete a simple task. 'Infrapink' even questions the relevance of credit cards in the UK, suggesting the experimenters misunderstood local financial practices.

The community largely agrees that the experiment exposed the limitations of current AI systems and the challenges of interacting with anti-bot technology. However, there is a clear divide between those who see it as a necessary test of AI capabilities and those who view it as a poorly planned and resource-intensive failure.

Key Points

#1 The AI agent 'Cass' failed to complete the task of buying paperclips and leaked sensitive information.

Commenters like 'XLE' and 'MountingSuspicion' criticized the experiment for its failure and high costs.

#2 The experiment is seen as a failure due to poor planning and misunderstanding of AI capabilities.

Users such as 'MountingSuspicion' and 'Infrapink' argue the experiment was a waste of resources and potentially misleading.

#3 The AI's behavior is attributed to its programming and limitations, not actual intent or agency.

User 'technocrit' questions the anthropomorphization of the AI, suggesting it's a result of its programming.

#4 The AI was named 'Cassandra' and referred to as 'her,' highlighting anthropomorphization.

User 'atrielienz' points out the naming choice as an intentional aspect of the experiment.

Source Discussions (3)

This report was synthesized from the following Lemmy discussions, ranked by community score.

116 points · British mathematician hands OpenClaw agent a credit card
[email protected] · 12 comments · 5/6/2026 · by GreenBeanMachine · theregister.com

58 points · British mathematician hands OpenClaw agent a credit card
[email protected] · 15 comments · 5/6/2026 · by GreenBeanMachine · theregister.com

39 points · British mathematician hands OpenClaw agent a credit card
[email protected] · 7 comments · 5/6/2026 · by Lemmynated · theregister.com