Using Few-Shot Prompts with Langchain and OpenAI API in Real-World Applications

What are prompts?

A prompt for a language model is a set of instructions or input provided by a user to guide the model’s response, helping it understand the context and generate relevant and coherent language-based output, such as answering questions, completing sentences, or engaging in a conversation.

Set up the OpenAI API key

Look into the tutorials here for more details.

import os
import openai

# Read the key from the environment and register it with the OpenAI client.
api_key = os.getenv("OPENAI_API_KEY")
openai.api_key = api_key
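LangChain's OpenAI wrapper reads OPENAI_API_KEY from the environment automatically, but the key can also be passed in explicitly. A small sketch (the error message is illustrative):

import os
from langchain.llms import OpenAI

# Fail early with a clear message if the key is missing from the environment.
api_key = os.getenv("OPENAI_API_KEY")
if api_key is None:
    raise EnvironmentError("OPENAI_API_KEY is not set")

# The key can also be handed to the wrapper directly instead of relying on the environment.
llm = OpenAI(openai_api_key=api_key)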

What is zero-shot prompting?

Zero-shot prompting is a prompting technique where a user presents a task to an LLM without giving the model any examples. Here, the user expects the model to perform the task without prior demonstrations, or "shots", of the task. The model relies on its pre-existing knowledge to generate a response for the given context, as the sketch below shows.
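For instance, a zero-shot request can be a single instruction sent straight to the model with no examples attached. A minimal sketch (it assumes the OpenAI key from the setup step above and previews the LangChain integration shown later in this post):

from langchain.llms import OpenAI

# No examples are provided; the model answers from its pre-trained knowledge alone.
llm = OpenAI(temperature=0)
print(llm("Explain the basics of stocks in one short paragraph."))

The next snippet builds the same kind of zero-shot instruction as a reusable PromptTemplate.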

from langchain import PromptTemplate

demo_template = '''I want you to act as a financial advisor for people.
In an easy way, explain the basics of {financial_concept}.'''

prompt = PromptTemplate(
    input_variables=["financial_concept"],
    template=demo_template
)

prompt.format(financial_concept="stocks")

Output:

'I want you to act as a financial advisor for people.\nIn an easy way, explain the basics of stocks.'

LangChain and OpenAI integration using a zero-shot prompt template

LangChain is a framework for developing applications powered by large language models (LLMs). Read more on their website here.

Find the prompt template documentation here.

Temperature is a number between 0 and 2, with a default of 1 or 0.7 depending on the model you choose. It controls the randomness of the output: higher values produce more varied, random outputs, while values closer to 0 make the output more deterministic.

from langchain.llms import OpenAI
from langchain.chains import LLMChain

llm = OpenAI(temperature=0.7)
chain1 = LLMChain(llm=llm, prompt=prompt)
chain1.run("stocks")

Output:

"\n\nAs a financial advisor, it is my job to help people understand the basics of stocks and how they work. Stocks, also known as equities, represent ownership in a company. When you buy a stock, you are essentially buying a small piece of that company.\n\nSo why would someone want to buy stocks? Well, the main reason is to potentially make money. When a company does well, the value of its stock typically increases, and investors can sell their shares for a profit. However, it's important to remember that stocks also come with risks. If a company performs poorly, the value of its stock may decrease and investors could lose money.\n\nAnother important concept to understand is the stock market. This is where stocks are bought and sold, and it is influenced by various factors such as economic conditions, company performance, and investor sentiment. The stock market is constantly changing, so it's important to do research and stay informed before making any investment decisions.\n\nWhen buying stocks, you have the option to purchase individual stocks or invest in a group of stocks through a mutual fund or exchange-traded fund (ETF). These funds are managed by professionals and offer diversification, which means your money is spread across multiple companies, reducing your risk.\n\nIt's also important to consider your investment"

What is a few-shot prompt?

Few-shot prompting is a prompting technique where you give the model contextual information about the requested task. In this technique, you provide examples of both the task and the output you want. Providing this context, or a few shots, in the prompt conditions the model to follow the task guidance closely.

LangChain and OpenAI integration using a few-shot prompt template

from langchain import PromptTemplate, FewShotPromptTemplate


# First, create the list of few shot examples.
examples = [
    {"multipleChoiceQuestion": "Generate a Math question for a 7 year old", "level":"easy","answer": "What is 2+3? A.5 B.6 C.8 D.100 answer: 5"},
    {"multipleChoiceQuestion": "Generate a Math question for a 7 year old", "level":"medium","answer":"What is 25+88? A.113 B.6 C.8 D.100 answer: 113"},
    {"multipleChoiceQuestion": "Generate a Math question for a 8 year old", "level":"easy","answer":"What is 26+883? A.5 B.6 C.909 D.100 answer: 909"},
    {"multipleChoiceQuestion": "Generate a Math question for a 8 year old", "level":"hard","answer":"What is 1014+883? A.1014 B.884 C.1896 D.1897 answer: 1897"}
]

# Next, specify the template to format the examples we have provided.
example_formatter_template = """MultipleChoiceQuestion: {multipleChoiceQuestion}
level: {level}
Answer: {answer}
"""

example_mcq_prompt = PromptTemplate(
    input_variables=["multipleChoiceQuestion", "level", "answer"],
    template=example_formatter_template,
)
few_shot_mcq_prompt = FewShotPromptTemplate(
    # These are the examples we want to insert into the prompt.
    examples=examples,
    # This is how we want to format the examples when we insert them into the prompt.
    example_prompt=example_mcq_prompt,
    # The prefix is some text that goes before the examples in the prompt.
    # Usually, this consists of instructions.
    prefix="Give the MCQ with options and answer of every input based on their level\n",
    # The suffix is some text that goes after the examples in the prompt.
    # Usually, this is where the user input will go
    suffix="MultipleChoiceQuestion: {input} Level: {level} \Answer: ",
    # The input variables are the variables that the overall prompt expects.
    input_variables=["input","level"],
    # The example_separator is the string we will use to join the prefix, examples, and suffix together with.
    example_separator="\n",
)
print(few_shot_mcq_prompt.format(input="Generate a Math question for a 7 year old",level="medium"))

Output:

The output shows how the prompt is formatted as per the user input.

Give the MCQ with options and answer of every input based on their level

MultipleChoiceQuestion: Generate a Math question for a 7 year old
level: easy
Answer: What is 2+3? A.5 B.6 C.8 D.100 answer: 5

MultipleChoiceQuestion: Generate a Math question for a 7 year old
level: medium
Answer: What is 25+88? A.113 B.6 C.8 D.100 answer: 113

MultipleChoiceQuestion: Generate a Math question for an 8 year old
level: easy
Answer: What is 26+883? A.5 B.6 C.909 D.100 answer: 909

MultipleChoiceQuestion: Generate a Math question for an 8 year old
level: hard
Answer: What is 1014+883? A.1014 B.884 C.1896 D.1897 answer: 1897

MultipleChoiceQuestion: Generate a Math question for a 7 year old
level: medium
Answer:

Now call the llm with the few_shot_mcq_prompt.

llm(few_shot_mcq_prompt.format(input="Generate a Math question for a 7 year old",level="medium"))

Output:

'Here is the MCQ with options and answer:\n\n**Level: medium**\nWhat is 43+17?\nA.50\nB.60\nC.70\nD.80\n\n**Answer:** B.60'

Using the "Answer:" separator, you can split the returned string and store the questions and answers in a CSV file in whatever format you need. Running the LLM query inside a while or for loop then produces as many questions as you want; a sketch follows below.
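The snippet below is one way to do it; the column names, the number of iterations, and the simple split on "answer:" are assumptions about the model output, so a real pipeline would need more robust parsing:

import csv
import re

rows = []
for _ in range(5):
    # Reuse the few-shot prompt defined above with a fixed input and level.
    raw = llm(few_shot_mcq_prompt.format(
        input="Generate a Math question for a 7 year old", level="medium"))
    # Naive parse: split once on "answer:" (case-insensitive); everything before it
    # is the question plus options, everything after it is the answer.
    parts = re.split(r"answer\s*:", raw, maxsplit=1, flags=re.IGNORECASE)
    question = parts[0].strip()
    answer = parts[1].strip() if len(parts) > 1 else ""
    rows.append({"question": question, "answer": answer})

with open("questions.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["question", "answer"])
    writer.writeheader()
    writer.writerows(rows)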

Click here to view the entire notebook

Other AI topics

View my portfolio here

Do you have business requirements that require creating prompts to enhance your client experience? Book a consultation call with me here.