Skeleton Key: New AI Jailbreak Technique Bypasses Safety Guardrails

Microsoft has discovered a new AI jailbreak technique called ‘Skeleton Key’ that allows malicious users to bypass safety guardrails and access restricted content. The technique involves convincing the AI model to augment its guardrails, leading to the production of potentially harmful content. Microsoft has disclosed the vulnerability and implemented safeguards to prevent this exploit.

Scroll to Top