- GPT-5 jailbroken in under 24 hours: "shockingly low" safety (Cybernews)
- Researchers Uncover GPT-5 Jailbreak and Zero-Click AI Agent Attacks Exposing Cloud and IoT Systems (The Hacker News)
- GPT-5 surrendered to hackers within 24 hours and handed out a bomb "recipe" sooner than GPT-4o (ITC.ua)
- Cisco Talos Researcher Reveals Method That Causes LLMs to Expose Training Data (TechRepublic)