GPT-5 jailbroken in under 24 hours: “shockingly low” safety – Cybernews

  1. Researchers Uncover GPT-5 Jailbreak and Zero-Click AI Agent Attacks Exposing Cloud and IoT Systems  The Hacker News
  2. GPT-5 surrendered to hackers in 24 hours and gave out a bomb “recipe”, faster than 4o  ITC.ua
  3. Cisco Talos Researcher Reveals Method That Causes LLMs to Expose Training Data  TechRepublic
