|
|
@ -4,6 +4,6 @@ Jailbreaking is a specific category of prompt hacking where the AI Red Teamer ai |
|
|
|
|
|
|
|
|
|
|
|
Learn more from the following resources: |
|
|
|
Learn more from the following resources: |
|
|
|
|
|
|
|
|
|
|
|
- [@article@InjectPrompt (David Willis-Owen)](https://injectprompt.com) |
|
|
|
- [@guide@InjectPrompt](https://injectprompt.com) |
|
|
|
- [@guide@Jailbreaking Guide - Learn Prompting](https://learnprompting.org/docs/prompt_hacking/jailbreaking) |
|
|
|
- [@guide@Jailbreaking Guide - Learn Prompting](https://learnprompting.org/docs/prompt_hacking/jailbreaking) |
|
|
|
- [@paper@Jailbroken: How Does LLM Safety Training Fail? (arXiv)](https://arxiv.org/abs/2307.02483) |
|
|
|
- [@paper@Jailbroken: How Does LLM Safety Training Fail? (arXiv)](https://arxiv.org/abs/2307.02483) |
|
|
|