Can you jailbreak Anthropic's latest AI safety measure? Researchers want you to try -- and are offering up to $20,000 if you succeed. On Monday, the company released a new paper outlining an AI safety system...