diff --git a/src/data/roadmaps/ai-red-teaming/content/jailbreak-techniques@Ds8pqn4y9Npo7z6ubunvc.md b/src/data/roadmaps/ai-red-teaming/content/jailbreak-techniques@Ds8pqn4y9Npo7z6ubunvc.md
index b1e5ed972..245444010 100644
--- a/src/data/roadmaps/ai-red-teaming/content/jailbreak-techniques@Ds8pqn4y9Npo7z6ubunvc.md
+++ b/src/data/roadmaps/ai-red-teaming/content/jailbreak-techniques@Ds8pqn4y9Npo7z6ubunvc.md
@@ -4,6 +4,6 @@ Jailbreaking is a specific category of prompt hacking where the AI Red Teamer ai
 
 Learn more from the following resources:
 
-- [@article@InjectPrompt (David Willis-Owen)](https://injectprompt.com)
+- [@guide@InjectPrompt](https://injectprompt.com)
 - [@guide@Jailbreaking Guide - Learn Prompting](https://learnprompting.org/docs/prompt_hacking/jailbreaking)
 - [@paper@Jailbroken: How Does LLM Safety Training Fail? (arXiv)](https://arxiv.org/abs/2307.02483)