From cf5a7d055aa0d7bd842e04ebe99d23e4a64c3a41 Mon Sep 17 00:00:00 2001
From: "Charles J. Fowler"
Date: Thu, 31 Oct 2024 10:46:54 +0000
Subject: [PATCH] Improve Prompt Engineering - Basic LLM & Prompt Introduction: Links (#7639)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* 📃 docs, data (Image Prompting) Update Topic/Sub Topics - In Place Edits.

  - intent: Update topic from May 2023 to Oct 2024
  - data: src/data/roadmaps/prompt-engineering/content/
    - modify: 10x .md files

---

Co-authored-by: @iPoetDev

* 📃 docs, data (Prompt Engineering Roadmap) Basic Concepts - In Place Edits.

  - changes: single paragraphs (74-125 words)
  - concerns: if made any more concise, topics lose fidelity, meaning and utility.
  - data: src/data/roadmaps/prompt-engineering/content/
    - 📂 100-basic-llm
      - modify: Topic - update content:
        - index.md
        - 100-what-are-llms.md
        - 101-llm-types.md
        - 102-how-llms-built.md

---

Co-authored-by: @iPoetDev

* 📃 docs: (Prompt Eng.) Basic LLM Concepts - New Links.

  - intent: Update topic from May 2023 to Oct 2024
  - 📂 100-basic-llm
    - modify topics:
      - add links
        - 100-what-are-llms.md
        - 101-llm-types.md
        - 102-how-llms-built.md

  BREAKING CHANGE: ❌

---

Co-authored-by: @iPoetDev

* docs: (Prompt Eng.) Prompting Introduction - New Links.

  - intent: Update topic from May 2023 to Oct 2024
  - 📂 101-prompting-introduction
    - modify topics:
      - add links
        - index.md
        - 100-basic-prompting.md
        - 101-need-for-prompting.md

  BREAKING CHANGE: ❌

---

Co-authored-by: @iPoetDev
---
 .../content/100-basic-llm/100-what-are-llms.md | 6 ++++++
 .../content/100-basic-llm/101-llm-types.md | 6 ++++++
 .../content/100-basic-llm/102-how-llms-built.md | 6 ++++++
 .../101-prompting-introduction/100-basic-prompting.md | 6 ++++++
 .../101-prompting-introduction/101-need-for-prompting.md | 5 ++++-
 .../content/101-prompting-introduction/index.md | 3 ++-
 6 files changed, 30 insertions(+), 2 deletions(-)

diff --git a/src/data/roadmaps/prompt-engineering/content/100-basic-llm/100-what-are-llms.md b/src/data/roadmaps/prompt-engineering/content/100-basic-llm/100-what-are-llms.md
index 2a910c582..2f9d22eca 100644
--- a/src/data/roadmaps/prompt-engineering/content/100-basic-llm/100-what-are-llms.md
+++ b/src/data/roadmaps/prompt-engineering/content/100-basic-llm/100-what-are-llms.md
@@ -6,6 +6,12 @@ LLMs have the ability to achieve state-of-the-art performance in multiple Natura
 As an example, OpenAI's GPT-3 is a prominent LLM that has gained significant attention due to its capability to generate high-quality text and perform a variety of language tasks with minimal fine-tuning.
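The context paragraph above describes an LLM as a model that produces high-quality text for many language tasks. As a minimal sketch of that core idea (text generated by repeatedly predicting the next token), assuming the Hugging Face `transformers` package and the small `gpt2` checkpoint as purely illustrative stand-ins for a production model such as GPT-3:

```python
# Minimal sketch: an LLM extends a prompt by predicting likely next tokens.
# Assumes the `transformers` package and the small, publicly hosted "gpt2" model;
# large models like GPT-3 apply the same principle at a far larger scale.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Large Language Models are"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedily generate 20 additional tokens and print the continuation.
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```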
+Learn more from the following resources:
+- [@roadmap.sh@Introduction to LLMs](https://roadmap.sh/guides/introduction-to-llms)
+- [@article@Large language model](https://en.wikipedia.org/wiki/Large_language_model)
+- [@video@Intro to Large Language Models](https://www.youtube.com/watch?v=zjkBMFhNj_g)
+- [@video@Large Language Model Operations (LLMOps) Explained](https://www.youtube.com/watch?v=cvPEiPt7HXo)
+- [@video@How Large Language Models Work](https://youtu.be/5sLYAQS9sWQ)
+- [@feed@Explore top posts about LLM](https://app.daily.dev/tags/llm?ref=roadmapsh)
+
diff --git a/src/data/roadmaps/prompt-engineering/content/100-basic-llm/101-llm-types.md b/src/data/roadmaps/prompt-engineering/content/100-basic-llm/101-llm-types.md
index e5bb2a42b..d227bb7b1 100644
--- a/src/data/roadmaps/prompt-engineering/content/100-basic-llm/101-llm-types.md
+++ b/src/data/roadmaps/prompt-engineering/content/100-basic-llm/101-llm-types.md
@@ -17,3 +17,9 @@ Instruction Tuned LLMs = Base LLMs + Further Tuning + RLHF
 ```

 To build an Instruction Tuned LLM, a Base LLM is taken and is further trained using a large dataset covering sample "Instructions" and how the model should perform as a result of those instructions. The model is then fine-tuned using a technique called "Reinforcement Learning with Human Feedback" (RLHF) which allows the model to learn from human feedback and improve its performance over time.
+
+Learn more from the following resources:
+
+- [@article@Understanding AI Models: Base Language Learning Models vs. Instruction Tuned Language Learning Models - Olivier Mills](https://oliviermills.com/articles/understanding-ai-models-base-language-learning-models-vs-instruction-tuned-language-learning-models)
+- [@video@Why Are There So Many Foundation Models?](https://www.youtube.com/watch?v=QPQy7jUpmyA)
+- [@video@How to Pick the Right AI Foundation Model](https://www.youtube.com/watch?v=pePAAGfh-IU)
\ No newline at end of file
diff --git a/src/data/roadmaps/prompt-engineering/content/100-basic-llm/102-how-llms-built.md b/src/data/roadmaps/prompt-engineering/content/100-basic-llm/102-how-llms-built.md
index dcd58eafd..f2112f18a 100644
--- a/src/data/roadmaps/prompt-engineering/content/100-basic-llm/102-how-llms-built.md
+++ b/src/data/roadmaps/prompt-engineering/content/100-basic-llm/102-how-llms-built.md
@@ -9,3 +9,9 @@ On a high level, training an LLM model involves three steps i.e. data collection
 - **Evaluation**: The final step is to evaluate the performance of the model to see how well it performs on various tasks such as question answering, summarization, translation etc.

 The output from the training Pipeline is an LLM model which is simply the parameters or weights which capture the knowledge learned during the training process. These parameters or weights are typically serialized and stored in a file, which can then be loaded into any application that requires language processing capabilities e.g. text generation, question answering, language processing etc.
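The paragraph above notes that the output of the training pipeline is just a set of parameters serialized to a file and loaded wherever language capabilities are needed. A minimal sketch of that save/load pattern, assuming PyTorch and a deliberately tiny stand-in model (real LLM checkpoints follow the same pattern with billions of parameters):

```python
# Minimal sketch of "weights serialized to a file, then loaded by an application".
# Assumes PyTorch; the two-layer model is an illustrative toy, not a real LLM.
import torch
import torch.nn as nn

# Toy stand-in for a trained language model: token embeddings plus a linear head.
model = nn.Sequential(nn.Embedding(100, 16), nn.Linear(16, 100))

# Serialize the learned parameters (the model's "knowledge") to a checkpoint file.
torch.save(model.state_dict(), "toy_llm.pt")

# Any application can recreate the same architecture and load the weights back in.
restored = nn.Sequential(nn.Embedding(100, 16), nn.Linear(16, 100))
restored.load_state_dict(torch.load("toy_llm.pt"))
```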
+
+Learn more from the following resources:
+
+- [@article@What is LLM & How to Build Your Own Large Language Models?](https://www.signitysolutions.com/blog/how-to-build-large-language-models)
+- [@guides@Large language model](https://en.wikipedia.org/wiki/Large_language_model)
+- [@video@Five Steps to Create a New AI Model](https://youtu.be/jcgaNrC4ElU)
\ No newline at end of file
diff --git a/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/100-basic-prompting.md b/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/100-basic-prompting.md
index 06a11331f..7937c4e79 100644
--- a/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/100-basic-prompting.md
+++ b/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/100-basic-prompting.md
@@ -26,3 +26,9 @@ Write me an introductory guide about Prompt Engineering.
 ```

 However, using plain text as prompts i.e. without using any best practices you may not be able to fully utilise the power of LLMs. That's where "Prompt Engineering" or knowing the best practices for writing better prompts and getting the most out of LLMs comes in.
+
+- [@guides@Basics of Prompting | Prompt Engineering Guide](https://www.promptingguide.ai/introduction/basics)
+- [@article@Prompting Basics](https://learnprompting.org/docs/basics/prompting)
+- [@official@Prompt engineering - OpenAI API](https://platform.openai.com/docs/guides/prompt-engineering)
+- [@official@Prompt engineering overview - Anthropic](https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview)
+- [@course@Introduction to Prompt Engineering (Playlist)](https://youtube.com/playlist?list=PLYio3GBcDKsPP2_zuxEp8eCulgFjI5a3g&si=n3Ot-tFECp4axL8L)
\ No newline at end of file
diff --git a/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/101-need-for-prompting.md b/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/101-need-for-prompting.md
index 3443b35cd..380ebe40f 100644
--- a/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/101-need-for-prompting.md
+++ b/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/101-need-for-prompting.md
@@ -24,4 +24,7 @@ Prompts can help reduce inaccuracies and ambiguities in the AI's responses. By p
 In conclusion, the need for prompting stems from its role in guiding AI model behavior, improving text quality and relevance, eliciting a specific output, aligning AI and human intent, and reducing inaccuracies and ambiguity in generated content. By understanding and mastering the art of prompting, users can unlock the true potential of AI language models.

-- [@article@Prompting Basics](https://learnprompting.org/docs/basics/prompting)
\ No newline at end of file
+- [@article@Prompting Basics](https://learnprompting.org/docs/basics/prompting)
+- [@video@AI prompt engineering: A deep dive](https://youtu.be/T9aRN5JkmL8?si=3uW2BQuNHLcHjqTv)
+- [@video@What is Prompt Tuning?](https://www.youtube.com/watch?v=yu27PWzJI_Y)
+- [@guides@What is Prompt Engineering? A Detailed Guide For 2024](https://www.datacamp.com/blog/what-is-prompt-engineering-the-future-of-ai-communication)
\ No newline at end of file
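The prompting hunks above revolve around sending a plain-text instruction to a model and refining it from there. A minimal sketch of that basic workflow, assuming the `openai` Python package (v1+), an `OPENAI_API_KEY` in the environment, and an illustrative model name; the prompt string is the example from 100-basic-prompting.md:

```python
# Minimal sketch of basic prompting against a hosted LLM.
# Assumes the `openai` Python package (v1+) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; substitute any chat model you have access to
    messages=[
        {"role": "user", "content": "Write me an introductory guide about Prompt Engineering."},
    ],
)
print(response.choices[0].message.content)
```

Everything beyond this plain-text call (roles, delimiters, examples, structured output) is what the linked prompt-engineering resources cover.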
diff --git a/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/index.md b/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/index.md
index 2a62bbb3b..4d7072753 100644
--- a/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/index.md
+++ b/src/data/roadmaps/prompt-engineering/content/101-prompting-introduction/index.md
@@ -24,4 +24,5 @@ Hello, how are you?
 But it's one of the best practices to be clear and use delimiters to separate the content in prompt from the instructions. You will learn more about it in the "Best Practices" nodes of the roadmap.

-- [@article@Basic Prompting](https://learnprompting.org/docs/basics/intro)
+- [@article@Basic Prompting - Learn Prompting](https://learnprompting.org/docs/basics/intro)
+- [@guides@Basics of Prompting - Prompt Engineering Guide](https://www.promptingguide.ai/introduction/basics)
\ No newline at end of file
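The final hunk's context line recommends using delimiters to keep the instruction separate from the content it operates on. A minimal sketch of that practice; the `###` delimiter, the summarization task, and the extended example sentence are illustrative assumptions:

```python
# Minimal sketch of delimiting user-supplied content so the model cannot mistake
# it for part of the instruction. The "###" delimiter and the summarization task
# are illustrative choices, not a prescribed convention.
user_content = "Hello, how are you? I was wondering when my order will arrive."

prompt = (
    "Summarize the text delimited by ### in one sentence.\n"
    f"###\n{user_content}\n###"
)
print(prompt)
```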