
How to Compose Multiple Prompts in LangChain?

LangChain is a framework for building applications powered by Large Language Models (LLMs), such as chatbots that let humans interact with machines in natural language. Building these applications typically follows a pipeline of multiple steps, from importing libraries and preparing data through to testing. Prompts are a vital part of this process, and well-structured prompts enable the model to perform better.

This post illustrates the process of composing multiple prompts in LangChain.

How to Compose Multiple Prompts in LangChain?

Composing multiple prompts lets you reuse previously defined prompt pieces instead of repeating them in every request. Smaller templates are combined into one final prompt, and the model replies based on the assembled result.
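Before using LangChain's helpers, the underlying idea can be sketched with plain Python string formatting (the templates and values here are made up for illustration):

```python
# Two reusable prompt pieces, composed into one final prompt.
role = "You are a helpful {domain} assistant."
question = "Q: {user_question}\nA:"

# Composition: join the pieces, then fill in the placeholders.
combined = role + "\n\n" + question
prompt = combined.format(domain="geography",
                         user_question="What is the capital of France?")
print(prompt)
```

LangChain's pipeline templates automate exactly this kind of assembly, while keeping each piece independently reusable.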

To learn how to compose multiple prompts in LangChain, simply go through the listed steps:

Step 1: Install LangChain

Start composing multiple prompts in LangChain by installing the LangChain framework:

pip install langchain

Step 2: Using Prompt Template

After that, import the required classes, PipelinePromptTemplate and PromptTemplate, for working with multiple prompts:

from langchain.prompts.pipeline import PipelinePromptTemplate
from langchain.prompts.prompt import PromptTemplate

Define the final template, which combines three sub-prompts, and build a PromptTemplate from it using the from_template() method:

full_template = """{introduction}

{example}

{start}"""
full_prompt = PromptTemplate.from_template(full_template)

Next, configure the introduction prompt, which tells the model which persona to impersonate:

introduction_template = """You are impersonating {person}"""
introduction_prompt = PromptTemplate.from_template(introduction_template)

Define the example prompt, which gives the model a sample question-and-answer exchange to imitate:

example_template = """Here's an example of an interaction

Q: {example_q}
A: {example_a}"""
example_prompt = PromptTemplate.from_template(example_template)

Finally, define the start prompt, which poses the actual question in the same question-and-answer style:

start_template = """Now, do this for real!

Q: {input}
A:"""
start_prompt = PromptTemplate.from_template(start_template)

Step 3: Using Pipeline Prompt Template

Now combine the sub-prompts with PipelinePromptTemplate(), passing the final template along with a list of (name, prompt) pairs built from the templates configured in the previous step:

input_prompts = [
    ("introduction", introduction_prompt),
    ("example", example_prompt),
    ("start", start_prompt)
]
pipeline_prompt = PipelinePromptTemplate(
    final_prompt=full_prompt,
    pipeline_prompts=input_prompts
)

Step 4: Using Multiple Prompts

Check which input variables the composed prompt expects; these are the placeholders collected from all the sub-prompts:

pipeline_prompt.input_variables
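The variables the caller must supply are simply the placeholders of the sub-templates (the pipeline fills in introduction, example, and start itself). A minimal sketch of that collection logic, using only the standard library so it runs without LangChain installed (the ordering may differ from what LangChain reports):

```python
from string import Formatter

# The three sub-templates from the steps above.
templates = {
    "introduction": "You are impersonating {person}",
    "example": "Here's an example of an interaction\n\nQ: {example_q}\nA: {example_a}",
    "start": "Now, do this for real!\n\nQ: {input}\nA:",
}

# Collect every placeholder name appearing in the sub-templates.
variables = [
    name
    for tpl in templates.values()
    for _, name, _, _ in Formatter().parse(tpl)
    if name
]
print(variables)  # ['person', 'example_q', 'example_a', 'input']
```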

Finally, format the composed prompt with concrete values for each variable to produce the full prompt text, structured like a real conversation:

print(pipeline_prompt.format(
    person="Elon Musk",
    example_q="What's your favorite car",
    example_a="Tesla",
    input="What's your favorite social media site"
))
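Under the hood, each sub-prompt is formatted first and the results are substituted into the final template. A sketch of that assembly with plain str.format(), runnable without LangChain, which reproduces the same final prompt text:

```python
# The final template with one slot per sub-prompt.
full_template = "{introduction}\n\n{example}\n\n{start}"

# Format each sub-prompt with its own variables first.
introduction = "You are impersonating {person}".format(person="Elon Musk")
example = ("Here's an example of an interaction\n\n"
           "Q: {example_q}\nA: {example_a}").format(
               example_q="What's your favorite car",
               example_a="Tesla")
start = ("Now, do this for real!\n\n"
         "Q: {input}\nA:").format(input="What's your favorite social media site")

# Then substitute the formatted pieces into the final template.
final_prompt = full_template.format(
    introduction=introduction, example=example, start=start)
print(final_prompt)
```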

That is all about composing multiple prompts using the LangChain framework.

Conclusion

To compose multiple prompts in LangChain, install the LangChain framework and import the PromptTemplate and PipelinePromptTemplate classes. Configure the individual prompts with PromptTemplate and then combine them with PipelinePromptTemplate. Format the composed prompt with example values to produce the final prompt text. This post has illustrated the process of composing multiple prompts in LangChain.

About the author

Talha Mahmood

As a technical author, I am eager to learn about writing and technology. I have a degree in computer science which gives me a deep understanding of technical concepts and the ability to communicate them to a variety of audiences effectively.