Add content for real world under prompting techniques

pull/3947/head
Kamran Ahmed 2 years ago
parent 0555452bf2
commit 42debdeab0
  1. 1
      src/data/roadmaps/prompt-engineering/content/102-prompts/prompting-techniques/105-dual-prompt.md
  2. 18
      src/data/roadmaps/prompt-engineering/content/103-real-world/100-structured-data.md
  3. 21
      src/data/roadmaps/prompt-engineering/content/103-real-world/101-inferring.md
  4. 27
      src/data/roadmaps/prompt-engineering/content/103-real-world/102-writing-emails.md
  5. 152
      src/data/roadmaps/prompt-engineering/content/103-real-world/103-coding-assistance.md
  6. 47
      src/data/roadmaps/prompt-engineering/content/103-real-world/104-study-buddy.md
  7. 2
      src/data/roadmaps/prompt-engineering/content/103-real-world/105-designing-chatbots.md
  8. 9
      src/data/roadmaps/prompt-engineering/content/103-real-world/index.md

@ -9,7 +9,6 @@ Write me a guide about SEO.
```
However, this prompt may result in a generic guide without giving you what you may need.

By adopting a dual prompt, you will receive a more specific response that is tailored to your needs. For the above example, we could split our prompt into two prompts as follows:

@ -1,2 +1,20 @@
# Structured Data
Asking the model to generate structured data is a great way to utilize the power of LLMs.
For example, you might have an e-commerce application where you want to generate a search query from the user's natural language input. You can instruct the LLM to extract a JSON representation of the user's natural language text. Let's say that the user searches for `Birthday gift for my 18 months old daughter`. We could use the following prompt to generate the JSON object:
```
Print a JSON object containing `gender` ("male", "female"), `occasion` (one of "party", "birthday", "anniversary"), `age_years` (numeric value) from the text delimited by triple quotes:
"""Birthday gift for my 18 months old daughter"""
```
The output from the model would be:
```json
{
  "gender": "female",
  "occasion": "birthday",
  "age_years": 1.5
}
```
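Downstream code should parse and validate the model's output before using it, since LLMs can occasionally return malformed JSON. A minimal sketch of that step (the field checks mirror the constraints in the prompt above; the `model_output` string is the example response, not a live API call):

```python
import json

# Raw text returned by the model for the search query above
model_output = '{"gender": "female", "occasion": "birthday", "age_years": 1.5}'

# Parse the JSON and validate the fields before using them in a search query
data = json.loads(model_output)
assert data["gender"] in ("male", "female")
assert data["occasion"] in ("party", "birthday", "anniversary")
assert isinstance(data["age_years"], (int, float))

print(data["occasion"], data["age_years"])  # birthday 1.5
```

Wrapping `json.loads` in a `try`/`except json.JSONDecodeError` and re-prompting on failure is a common pattern in production.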

@ -1,2 +1,23 @@
# Inferring
Making use of LLMs to infer information from text is another common use case. Imagine you are running a famous e-commerce store with thousands of reviews. You want to group the reviews by the sentiment of the review. You can use LLMs to infer the sentiment of the review.
Here are some prompts showing the use of inference:
```
What is the sentiment of the text delimited by triple quotes? Your output must be a single word.
"""Horrible product. They charged me twice the price and it broke after a week. I will never buy from them again."""
```
The output from this prompt is `negative`.
Another example:
```
Identify a list of emotions that the writer of the following review is expressing. Include no more than 5 items in the list. Format your answers as a list of lower-case words separated by commas.
"""I am so happy with this product. It is the best thing I have ever bought. I will definitely buy from them again."""
```
Output: `happy, satisfied, pleased, content, enthusiastic`
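Once sentiment can be inferred per review, grouping thousands of reviews becomes a simple loop. A sketch of the surrounding application code, where `get_sentiment` is a hypothetical stand-in for sending the sentiment prompt above to a model API (hard-coded here so the sketch runs offline):

```python
from collections import defaultdict

def get_sentiment(review: str) -> str:
    """Placeholder for an LLM call using the sentiment prompt above.
    A real implementation would send the prompt to a model API."""
    # Hard-coded heuristic so the sketch runs without an API key
    return "negative" if "Horrible" in review else "positive"

reviews = [
    "Horrible product. They charged me twice the price and it broke after a week.",
    "I am so happy with this product. It is the best thing I have ever bought.",
]

# Group reviews by their inferred sentiment
grouped = defaultdict(list)
for review in reviews:
    grouped[get_sentiment(review)].append(review)

print(sorted(grouped))  # ['negative', 'positive']
```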

@ -1 +1,28 @@
# Writing Emails
Language Models, like ChatGPT, can be incredibly beneficial for writing emails due to their ability to generate coherent and contextually appropriate text. When composing an email, an LLM can provide assistance with various aspects, such as generating engaging introductions or addressing specific questions or concerns. For instance, if you're writing a sales email, you can prompt the LLM with the request to generate a compelling opening paragraph that captures the recipient's attention. Similarly, when responding to customer inquiries, you can ask the LLM to help craft clear and informative responses that address the query with precision.
For example, here is a prompt to generate an automated email response to a customer inquiry:
```
Write an email informing a customer that their inquiry has been received and will be addressed shortly. Include the order number and the customer's name in the email.
```
Here is the response generated by the LLM:
```
Subject: Confirmation - Inquiry Received - Order #123456
Dear [Customer's Name],
I'm writing to confirm that we have received your inquiry regarding your recent order. We appreciate you taking the time to reach out to us.
Your order number is #123456, and we understand the importance of your inquiry. Rest assured, our team is already reviewing your case and will address it promptly. We strive to provide a timely response and a satisfactory resolution to all customer inquiries.
If we require any additional information or clarification, we will reach out to you directly. In the meantime, please feel free to contact us if you have any further questions or concerns about your order.
We value your business and are committed to ensuring your complete satisfaction. Thank you for choosing our company, and we appreciate your patience as we work on addressing your inquiry.
Best regards,
AI Customer Service
```
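Notice that the response still contains the placeholder `[Customer's Name]`. In an automated pipeline you would typically substitute the concrete details into the prompt itself, so the model never has to emit placeholders. A minimal sketch (the helper name and fields are illustrative, not part of any particular API):

```python
def build_email_prompt(customer_name: str, order_number: str) -> str:
    # Embed the concrete order details directly in the instruction
    return (
        f"Write an email informing {customer_name} that their inquiry about "
        f"order #{order_number} has been received and will be addressed shortly."
    )

prompt = build_email_prompt("Jane Doe", "123456")
print(prompt)
```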

@ -1,2 +1,154 @@
# Coding Assistance
You can use ChatGPT for debugging, code generation, reformatting, commenting, and more.
## Code Generation
Instructing ChatGPT to generate code is as simple as this:
```
Act as a junior python programmer. Please write the code to generate madlibs from 3 arrays of 6 items each: prepositions, verbs and nouns, with the nouns also being used as direct objects. After writing it, execute the code.
```
The output will be something like:
```python
import random
prepositions = ['at', 'in', 'on', 'with', 'under', 'over']
verbs = ['run', 'jump', 'dance', 'sing', 'laugh', 'cry']
nouns = ['cat', 'dog', 'tree', 'flower', 'book', 'computer']
preposition = random.choice(prepositions)
verb = random.choice(verbs)
noun = random.choice(nouns)
madlib = f"I {verb} {preposition} the {noun}."
print(madlib)
```
## Commenting and Reformatting Code
You can ask ChatGPT to add comments to your code and reformat it for legibility. After your prompt instruction, paste in the code you'd like it to clean up:
```
Please add line comments to this Python code and reformat it for legibility:
import math
amplitude = 30
frequency = 2
num_points = 50
for i in range(num_points):
    y = int(round(amplitude * math.sin(2 * math.pi * frequency * i / num_points)))
    print(' ' * (amplitude + y) + '*')
```
The output will be something like:
```python
# Import the math module
import math
# Set the amplitude, frequency, and number of points in the wave
amplitude = 30
frequency = 2
num_points = 50
# Loop over each point in the wave
for i in range(num_points):
    # Calculate the y value of the wave using the sine function from the math module
    y = int(round(amplitude * math.sin(2 * math.pi * frequency * i / num_points)))
    # Print a space for padding, followed by an asterisk at the appropriate position on the wave
    print(' ' * (amplitude + y) + '*')
```
## Debugging
Not only can ChatGPT detect syntax errors in code, but it can also find logical errors that would crop up when code is executed. Below is an example of a Python script that eventually causes a division by zero error on line 4 due to a logical error on line 3. Try this simple prompt to find and fix the error:
```
Please debug this Python code:
1. pizza_cost = 100
2. partygoers = 10
3. while partygoers >= 0:
4.     cost_per_person = pizza_cost / partygoers
5.     print("the cost per person is $", "{:.2f}".format(cost_per_person))
6.     partygoers -= 2
```
The output will include the corrected while loop, which is changed to ensure the denominator variable stays positive:
```
3. while partygoers > 0:
```
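Applying the suggested fix, the complete corrected script (with the reference line numbers removed) runs to completion without a division-by-zero error:

```python
pizza_cost = 100
partygoers = 10
# Loop only while partygoers is positive, so we never divide by zero
while partygoers > 0:
    cost_per_person = pizza_cost / partygoers
    print("the cost per person is $", "{:.2f}".format(cost_per_person))
    partygoers -= 2
```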
For more nuanced errors, you may need to add more specifics: the error message, environment variables, included libraries or variations in inputs.
## Optimizing Code
If your script is taking too long to execute, ChatGPT can optimize it. Here's an example of optimizing an inefficient script that finds prime numbers:
```
Act like a very senior python developer. Please optimize this script:
for num in range(1, 101):
    if num > 1:
        for i in range(2, num):
            if (num % i) == 0:
                break
        else:
            print(num)
```
The output might be a script that uses the Sieve of Eratosthenes algorithm for finding prime numbers:
```python
numbers = list(range(2, 101))
for i in numbers:
    for j in range(i * 2, 101, i):
        if j in numbers:
            numbers.remove(j)
print(numbers)
```
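The list-removal version above produces correct results, but `j in numbers` and `numbers.remove(j)` are both linear-time operations, so it scales poorly. The textbook Sieve of Eratosthenes uses a boolean array instead; a sketch for comparison (this is an illustration of the classic algorithm, not the model's verbatim output):

```python
limit = 100
# is_prime[n] stays True until n is crossed out as a multiple of a smaller prime
is_prime = [True] * (limit + 1)
is_prime[0] = is_prime[1] = False

for i in range(2, int(limit ** 0.5) + 1):
    if is_prime[i]:
        # Cross out multiples of i, starting at i*i
        for j in range(i * i, limit + 1, i):
            is_prime[j] = False

primes = [n for n in range(2, limit + 1) if is_prime[n]]
print(primes)
```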
Note that we asked the bot to "act like a senior developer" to optimize the script. You can also dictate that it have a certain area of expertise (e.g., sorting algorithms) or number of years of experience. Alternatively, if you have a script that seems overly complicated, you can ask ChatGPT to write that script "as a very junior developer."
## Translating Between Programming Languages
A very useful feature ChatGPT offers is the ability to port programs from one language to another. A typical use for this might be upgrading code in an older language to a more modern one:
```
Act like an experienced developer with knowledge of both COBOL and Python. Please convert this COBOL program to Python:
IDENTIFICATION DIVISION.
PROGRAM-ID. CALCULATE-SS-TAX.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 SALARY PIC 9(7)V99.
01 SS-TAX PIC 9(7)V99.
PROCEDURE DIVISION.
DISPLAY "Enter employee's salary: ".
ACCEPT SALARY.
COMPUTE SS-TAX = SALARY * 0.062.
DISPLAY "Social Security tax is: $", SS-TAX.
STOP RUN.
```
The output might be something like this:
```python
salary = float(input("Enter employee's salary: "))
ss_tax = salary * 0.062
print("Social Security tax is: $", round(ss_tax, 2))
```
Learn more from the following resources:
- [LLM Coding Assistance](https://learnprompting.org/docs/basic_applications/coding_assistance)

@ -1,2 +1,49 @@
# Study Buddy
One of our favorite ways to use LLMs is as a study tool! In particular, it is useful for explaining confusing terms as well as quizzing you on test content.
## Explaining Terms
I was recently reading a textbook about Cacti, and saw this passage:
```
Because of adaptations to xeric conditions and the presence of **less derived woods** in Pereskioideae, the Cactaceae is one of the most interesting families both anatomically and morphologically.
```
I had no idea what "less derived woods" meant, and Google only showed research articles on it. I decided to use ChatGPT to find out. I gave it the following prompt, which is simply the passage from the book and my question.
```
Because of adaptations to xeric conditions and the presence of less derived woods in Pereskioideae, the Cactaceae is one of the most interesting families both anatomically and morphologically.
What do less derived woods mean?
```
The model responded with:
```
In this context, "less derived woods" refers to the ancestral or primitive form of wood found in the early evolution of plants. These woods lack specialized structures and features that evolved later in more derived or advanced plants. In the Pereskioideae subfamily of the Cactaceae family, the wood is less derived compared to other subfamilies and is more similar to the wood found in non-cactus plants. The less derived wood in Pereskioideae is thought to be an adaptation to the more humid and less xeric environments in which these plants grow compared to other cacti.
```
The model was able to answer my question quickly and I was able to continue with my reading.
## Quizzing Yourself
Another way to use LLMs as a study tool is to quiz yourself. If you have an upcoming calculus test, you could simply ask:
```
Generate 5 calculus I quiz questions for me:
```
And the model will generate 5 questions for you to answer.
```
1. What is the derivative of y = x^3?
2. Evaluate the indefinite integral of x^3 + 5x^2 + 7x + 4.
3. What is the product rule for derivatives?
4. Find the equation of the line tangent to y = x^2 at the point (4, 16).
5. What is the second derivative of y = x^4?
```
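You can even sanity-check your answers to quiz questions like these with a few lines of code. For question 4 above, the tangent line to y = x^2 at (4, 16) should have slope 8, which a central finite-difference estimate confirms (this check is an added illustration, not part of the model's quiz output):

```python
def f(x):
    return x ** 2

# Estimate the slope of y = x^2 at x = 4 with a central difference
h = 1e-6
slope = (f(4 + h) - f(4 - h)) / (2 * h)
print(round(slope, 3))  # 8.0
```

With slope 8 through (4, 16), the tangent line works out to y = 8x - 16.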
There are many other ways to use LLMs to study. Many learning companies like Duolingo and Quizlet are implementing LLMs into their products.
- [LLMs as a Study Buddy](https://learnprompting.org/docs/basic_applications/study_tool)

@ -1,2 +1,4 @@
# Designing Chatbots
Building chatbots to offer customer support, sales, or other services is a hot topic in the tech industry. LLMs make it possible to build chatbots that can respond to a wide variety of user inputs, and can be trained to respond to new inputs with minimal effort.

@ -1,2 +1,11 @@
# Real World Usage Examples
LLMs are used in a variety of ways. Here are some examples of how LLMs are used in the real world.
- Natural Language Processing
- Information Retrieval
- Question Answering
- Text Generation
- Text Summarization
- Learning Tools
- Chatbots
