computer-scienceangular-roadmapbackend-roadmapblockchain-roadmapdba-roadmapdeveloper-roadmapdevops-roadmapfrontend-roadmapgo-roadmaphactoberfestjava-roadmapjavascript-roadmapnodejs-roadmappython-roadmapqa-roadmapreact-roadmaproadmapstudy-planvue-roadmapweb3-roadmap
You can not select more than 25 topics
Topics must start with a letter or number, can include dashes ('-') and can be up to 35 characters long.
1.3 KiB
1.3 KiB
Prompt Engineering
For AI Red Teamers, prompt engineering is both a tool and a target. It's a tool for crafting inputs to test model boundaries and vulnerabilities (e.g., creating jailbreak prompts). It's a target because understanding how prompts influence LLMs is key to identifying prompt injection vulnerabilities and designing defenses. Mastering prompt design is fundamental to effective LLM red teaming.
Learn more from the following resources:
- @article@Introduction to Prompt Engineering - Datacamp - Tutorial covering basics.
- @article@System Prompts - InjectPrompt - Look at the system prompts of flagship LLMs.
- @course@Introduction to Prompt Engineering - Learn Prompting - Foundational course from Learn Prompting.
- @guide@Prompt Engineering Guide - Learn Prompting - Comprehensive guide from Learn Prompting.
- @guide@The Ultimate Guide to Red Teaming LLMs and Adversarial Prompts (Kili Technology) - Connects prompt engineering directly to LLM red teaming concepts.