Jailbreaking LLM-Controlled Robots
Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.
Tags: hacking, LLM, robotics, social engineering