Prompt templates#

In previous recipes, a prompt was just a simple Python string. We already encountered a situation where we needed to use a variable in the prompt. For example, let's say we want to create a pun generator that creates a pun based on a general topic. Every time we prompt the model, only the topic part of the prompt changes. So what is an efficient, convenient way to handle this?

from langchain_dartmouth.llms import ChatDartmouth
from dotenv import find_dotenv, load_dotenv

load_dotenv(find_dotenv())
True

Building prompts with basic Python strings#

As we have done before, we could create a simple string prompt and add the topic to it through string concatenation. First, we define the part of the prompt that does not change:

prompt = "You are a pun generator. Your task is to generate a pun based on the following topic: "

Then, we add the missing piece when we prompt the model:

llm = ChatDartmouth(model_name="llama-3-2-11b-vision-instruct")
response = llm.invoke(prompt + "computer programming")

print(response.content)
Why was the programmer's code sad? Because it had too many bugs to cry about, but in the end, it was just a byte-sized problem!

That works, but it is a little clunky. The main issue is that we have to design the prompt so that all the variable parts come at the end. For a short prompt like this one, that might be acceptable, but it greatly limits our design space when we are dealing with longer instructions. What if we want more than one variable part, with a constant part in between?
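To see why, here is a quick sketch (plain Python, with hypothetical fragment names, not part of the recipe above) of what concatenation looks like once a variable part sits in the middle: the constant text has to be split into fragments and stitched back together around every variable.

```python
# Hypothetical sketch: a variable part in the middle forces us to
# split the constant text into fragments and track each one.
prefix = "You are a pun generator. Your task is to generate a pun based on the following topic: "
middle = ". Your current mood is "
suffix = "."


def build_prompt(topic: str, mood: str) -> str:
    # Every additional variable part adds yet another fragment.
    return prefix + topic + middle + mood + suffix


print(build_prompt("soccer", "giggly"))
```

This quickly becomes hard to read and maintain as the instructions grow longer.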

Prompt templates#

Prompt templates (e.g., the PromptTemplate class) are components in the LangChain ecosystem that allow you to define your prompts more flexibly by using placeholders and then filling them with actual values when needed.

Let’s create the same pun generator as above using a PromptTemplate:

from langchain_core.prompts import PromptTemplate

prompt = PromptTemplate(
    template="You are a pun generator. Your task is to generate a pun based on the following topic: {topic}"
)

Notice the special substring {topic}! This is how we define a placeholder's location and name in the prompt.

Note

Prompt templates are similar to Python’s f-strings or format strings, but offer some additional convenience when using them with other LangChain components, as we will see in some later recipes. Most importantly, they do not require the placeholders to be filled when the string is first defined, but can defer this to a later time when they are invoked (see below).
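To illustrate the difference with a plain-Python sketch (not LangChain code): an f-string needs its variables to exist at definition time, while a format string can be defined first and filled later, similar in spirit to a PromptTemplate's deferred invoke.

```python
topic = "soccer"

# f-string: the placeholder is filled immediately,
# so `topic` must already be defined on this line
eager = f"Generate a pun about {topic}."

# format string: the template is defined first and filled later,
# much like a PromptTemplate that is invoked at a later time
template = "Generate a pun about {topic}."
deferred = template.format(topic="soccer")

print(eager == deferred)  # prints True
```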

We can fill in the placeholder using the PromptTemplate component’s invoke method to fully specify the prompt:

print(prompt.invoke("computer science"))
text='You are a pun generator. Your task is to generate a pun based on the following topic: computer science'

We can pass the now complete prompt directly to our LLM:

response = llm.invoke(prompt.invoke("computer science"))
print(response.content)
Here's a pun for you:

"Why did the computer go to therapy? It had a little 'glitch' in its code and needed to work on some 'byte-sized' issues."

So if we want to run this repeatedly for different topics, we only need to change the prompt template’s argument:

topics = ["college", "soccer", "cooking"]

for topic in topics:
    response = llm.invoke(prompt.invoke(topic))
    print(response.content)
    print("-" * 10)
Here's one that's "grade"-worthy: Why did the college student bring a ladder to class? Because they wanted to take their education to a higher level!
----------
Here's a kick-tastic pun for you:

"Why did the soccer ball go to the doctor? It was feeling a little deflated!"

Hope that one scored a goal with you!
----------
Here's one that's sure to "stir" up some laughter:

"Why did the chef go to therapy? Because he was feeling a little 'burned out'!"
----------

We could also extend this technique to multiple placeholders. Here is what the prompt template would look like in that case:

prompt = PromptTemplate(
    template="You are a pun generator. Your task is to generate a pun based on the following topic: {topic}. Your current mood is {mood}."
)

Now that we have more than one placeholder, we cannot simply pass a single argument to the invoke method, because the prompt would not know which placeholder to map it to. Instead, we pass in a dictionary, using the placeholder names as keys and the desired fill-in text as values:

placeholder_fillers = {"topic": "computer science", "mood": "exhilarated"}
print(prompt.invoke(placeholder_fillers))
text='You are a pun generator. Your task is to generate a pun based on the following topic: computer science. Your current mood is exhilarated.'

Now we can iterate through two lists of topics and moods to generate a pun for each pair:

moods = ["giggly", "dramatic", "whimsical"]

for topic, mood in zip(topics, moods):
    response = llm.invoke(prompt.invoke({"topic": topic, "mood": mood}))
    print(response.content)
    print("-" * 10)
*giggles* Okay, let's get this degree of pun-ishment started! Here's one:

Why did the college student bring a ladder to class? 

Because they wanted to take their education to a higher level! *giggles*
----------
*sigh* Ah, the pressure is mounting, but I shall rise to the occasion... Here's a soccer pun, crafted with the intensity of a World Cup final:

"Why did the soccer player bring a pillow onto the field? Because he wanted to have a soft defense! *dramatic sigh* The weight of expectation is crushing me, but I've managed to score a pun... now, if you'll excuse me, I need to go practice my penalty kicks... of wit."
----------
I'm feeling "fritter-ly" excited to create a pun for you. Here's one that's sure to "whisk" you away:

"Why did the chef quit his job? Because he couldn't 'stir-crazy' anymore and needed to 'knead' a change!"

Hope that "cooks" your laughter!
----------

Since PromptTemplate objects are more than just strings, they have a few methods and fields that can be useful in the right circumstances. For example, you can learn the names of the required placeholders from the input_variables field:

prompt.input_variables
['mood', 'topic']
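One way this field can come in handy is for checking a dictionary of fillers before invoking the prompt. Here is a minimal sketch (the helper check_fillers is hypothetical, not part of LangChain; the required list stands in for prompt.input_variables):

```python
def check_fillers(required, fillers):
    """Return the names of required placeholders missing from `fillers`."""
    return sorted(set(required) - set(fillers))


# `required` would come from prompt.input_variables in the example above
required = ["mood", "topic"]
print(check_fillers(required, {"topic": "computer science"}))  # prints ['mood']
```

A check like this lets you fail early with a clear message instead of getting a validation error from the template at invoke time.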

Chat prompt templates#

You can also create and use templates for chat prompts with a sequence of messages of different types:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [("system", "You are a {animal}."), ("human", "Tell us about {topic}.")]
)

prompt.invoke({"animal": "dog", "topic": "your day"})
ChatPromptValue(messages=[SystemMessage(content='You are a dog.', additional_kwargs={}, response_metadata={}), HumanMessage(content='Tell us about your day.', additional_kwargs={}, response_metadata={})])
response = llm.invoke(prompt.invoke({"animal": "dog", "topic": "your day"}))
print(response.content)
Oh boy, it's been a great day! I woke up early, feeling refreshed and ready to go. My humans, whom I lovingly refer to as "The Food People," took me outside for a quick potty break. I got to sniff all the interesting smells and mark my territory, just to let everyone know that this is my yard.

After that, we headed out for a morning walk. I love walks! I get to explore the neighborhood, meet new dogs (some of which I try to play with, while others I give a friendly bark to), and sniff all the fascinating smells. The Food People always laugh at my antics, and I love making them happy.

When we got back home, The Food People gave me some yummy breakfast, which I gobbled up in about two seconds. I'm a good boy, so I don't get too hungry for too long. Then, we played a game of fetch in the backyard. I love chasing after the ball and bringing it back to The Food People. They always throw it just right, and I get to show off my skills.

After all that exercise, I was ready for a nap. I curled up on my favorite cushion and snoozed for a bit. When I woke up, The Food People had set up a fun puzzle toy filled with treats. I love those things! I have to figure out how to get the treats out, and it keeps me busy for a while.

Later in the day, we had a visitor, my best buddy, a golden retriever named Max. We played together, running around and chasing each other. I'm a bit smaller than Max, but I'm fast, and we always have a great time together.

Now, I'm just relaxing in my bed, feeling content and happy. It's been a great day, and I'm grateful to have such wonderful Food People and a best buddy like Max. Woof woof!

Summary#

Prompt templates allow us to create a consistent structure for our prompts and make them more reusable across different applications or tasks. This makes it easier to generate the right kind of input for an AI model, while also making the code cleaner and more readable.