
How to Use a Conversation Buffer Window in LangChain?

LangChain is a framework, commonly used from a Python notebook, for building applications on top of language models, such as chatbots. Rather than training models from scratch, it wires existing models together with prompts, memory, and chains so they can hold a conversation with humans in natural language.

This post will illustrate the process of using a conversation buffer window in LangChain.

How to Use a Conversation Buffer Window in LangChain?

The conversation buffer window keeps only the most recent exchanges of a conversation in memory so the model always has the latest context. A parameter k controls how many recent interactions LangChain retains; older messages slide out of the window as new ones arrive.
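To make the idea concrete, here is a minimal plain-Python sketch of the sliding window (an illustration only, not LangChain code; the deque stands in for the memory buffer):

from collections import deque

# keep only the k most recent exchanges, mimicking the buffer window
k = 1
window = deque(maxlen=k)
window.append(("hello", "How are you doing"))
window.append(("I am Good What about you", "not much"))
print(list(window))  # only the last k exchanges remain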

To learn the process of using the conversation buffer window in LangChain, simply go through the following guide:

Step 1: Install Modules

Start the process of using the conversation buffer window by installing the LangChain module with the required dependencies for building conversation models:

pip install langchain

After that, install the OpenAI module, which is used to access OpenAI's large language models from LangChain:

pip install openai

Now, set the OpenAI API key as an environment variable so LangChain can authenticate with OpenAI when building LLM chains:

import os
import getpass

os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API Key:")

Step 2: Using Conversation Buffer Window Memory

To use the conversation buffer window memory in LangChain, import the ConversationBufferWindowMemory library:

from langchain.memory import ConversationBufferWindowMemory

Configure the memory using the ConversationBufferWindowMemory() method with k as its argument; k sets how many of the most recent exchanges are kept. Then seed the memory with some conversation turns using input and output values:

memory = ConversationBufferWindowMemory(k=1)

memory.save_context({"input": "hello"}, {"output": "How are you doing"})

memory.save_context({"input": "I am Good What about you"}, {"output": "not much"})

Test the memory by calling the load_memory_variables() method; with k=1, only the most recent exchange is returned:

memory.load_memory_variables({})
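Since k=1, only the last exchange survives in the window, so the output should look roughly like this (exact formatting can vary across LangChain versions):

{'history': 'Human: I am Good What about you\nAI: not much'}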

To get the history as a list of message objects instead of a single string, configure ConversationBufferWindowMemory() with the return_messages argument:

memory = ConversationBufferWindowMemory(k=1, return_messages=True)

memory.save_context({"input": "hi"}, {"output": "whats up"})

memory.save_context({"input": "not much you"}, {"output": "not much"})

Now, call the load_memory_variables() method again to get the stored history as message objects:

memory.load_memory_variables({})
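This time the history comes back as a list of message objects rather than one string; the result should resemble the following (again, details can differ by version):

{'history': [HumanMessage(content='not much you'), AIMessage(content='not much')]}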

Step 3: Using Buffer Window in a Chain

Build the chain using the OpenAI and ConversationChain libraries, and attach the buffer window memory so the chain keeps the most recent messages of the conversation:

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
# build a conversation chain that keeps a sliding window of recent messages
conversation_with_summary = ConversationChain(
    llm=OpenAI(temperature=0),
    # keep only the two most recent exchanges as context
    memory=ConversationBufferWindowMemory(k=2),
    # verbose=True prints the full prompt sent to the model on each call
    verbose=True
)
conversation_with_summary.predict(input="Hi, what's up")

Now keep the conversation going by asking a question related to the model's previous output:

conversation_with_summary.predict(input="What's their issues")

With k=2, the chain keeps only the two most recent exchanges, which are used as the context for the next reply:

conversation_with_summary.predict(input="Is it going well")

Ask for a solution to the problems; as the conversation grows, the buffer window keeps sliding forward, dropping the earliest messages:

conversation_with_summary.predict(input="What's the solution")
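To confirm the window is sliding, you can inspect what the memory still holds after these calls; this optional check (not part of the original walkthrough) reuses the load_memory_variables() method shown earlier:

# optional check: with k=2, only the last two exchanges should remain
print(conversation_with_summary.memory.load_memory_variables({}))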

That is all about the process of using the conversation buffer window in LangChain.

Conclusion

To use the conversation buffer window memory in LangChain, install the modules and set up the environment using OpenAI's API key. After that, build the buffer memory with a value of k that keeps only the most recent messages of the conversation as context. The buffer memory can also be attached to chains so the LLM carries recent context across turns. This guide has elaborated on the process of using the conversation buffer window in LangChain.
