
Jailbreak Anthropic’s new AI safety system for a $15,000 reward

Can you jailbreak Anthropic's newest AI safety measure? Researchers want you to try -- and are offering up to $15,000 if you succeed. On Monday, the company released a new paper outlining an AI safety system...

Latest News

Anthropic co-founder confirms the company briefed the Trump administration on Mythos

Jack Clark, one of Anthropic's co-founders, who also serves as Head of Public Benefit for Anthropic PBC, confirmed...