
How to Use a Conversation Token Buffer in LangChain?

LangChain enables developers to build applications on top of machine learning and deep learning models that are trained on datasets. These models can learn patterns from the data, or understand the structure of a dataset and its language, to extract information. Large Language Models (LLMs), which can understand and generate text in natural languages, can be configured and designed using the LangChain framework.

This guide will illustrate the process of using a conversation token buffer in LangChain.

How to Use a Conversation Token Buffer in LangChain?

The ConversationTokenBufferMemory class can be imported from the LangChain framework to keep the most recent messages of a conversation in buffer memory. A token limit can be configured to cap how much of the conversation the buffer holds, and the earliest messages are flushed automatically once that limit is exceeded.

To learn the process of using the conversation token buffer in LangChain, follow this guide:

Step 1: Install Modules

First, install the LangChain framework containing all the required modules using the pip command:

pip install langchain

Now, install the OpenAI module to build LLMs and chains using the OpenAI() class:

pip install openai
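Depending on the LangChain version, ConversationTokenBufferMemory may count tokens with the tiktoken package when used with OpenAI models; installing it up front is an optional precaution (not part of the original setup) that avoids a possible runtime error later:

pip install tiktoken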

After installing the modules, provide the OpenAI API key to set up the environment using the os and getpass libraries:

import os

import getpass

os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API Key:")

Step 2: Using Conversation Token Buffer Memory

Import the ConversationTokenBufferMemory class from the LangChain framework and then build the LLM using the OpenAI() class:

from langchain.memory import ConversationTokenBufferMemory

from langchain.llms import OpenAI

llm = OpenAI()

Configure the memory with a token limit; the buffer stores the conversation and flushes the oldest messages once that limit is exceeded. After that, save messages from the conversation so the most recent ones can be retrieved and used as context:

memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=10)

memory.save_context({"input": "hello"}, {"output": "How are you doing"})

memory.save_context({"input": "I am Good What about you"}, {"output": "not much"})
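To get a feel for how the limit is enforced, the token cost of a message can be checked with the get_num_tokens() helper available on LangChain LLM classes (a quick optional check, not part of the original walkthrough):

# The buffer prunes the oldest messages once the total token count
# exceeds max_token_limit, so a small limit keeps only the latest exchange
print(llm.get_num_tokens("I am Good What about you"))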

Load the data stored in the buffer memory using the load_memory_variables() method:

memory.load_memory_variables({})
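With a limit as small as 10 tokens, only the most recent exchange (or part of it) survives in the buffer. The exact text depends on the tokenizer, but the returned dictionary has roughly the following shape (an illustrative result, not guaranteed output):

# Illustrative output; earlier messages have already been flushed:
# {'history': 'Human: I am Good What about you\nAI: not much'}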

Step 3: Using Conversation Token Buffer Memory in a Chain

Build the chain by configuring ConversationChain() with multiple arguments, including a ConversationTokenBufferMemory instance:

from langchain.chains import ConversationChain

conversation_with_summary = ConversationChain(
    llm=llm,
    memory=ConversationTokenBufferMemory(llm=OpenAI(), max_token_limit=60),
    verbose=True,
)
conversation_with_summary.predict(input="Hi, what's up?")

Now, keep the conversation going by asking questions using prompts written in natural language:

conversation_with_summary.predict(input="Just working on the NLP project")

The model answers using only the recent context that still fits within the buffer's token limit:

conversation_with_summary.predict(input="Just working on designing LLMs")

The buffer keeps updating with each new input, as the oldest messages are flushed once the token limit is reached:

conversation_with_summary.predict(
    input="LLM using LangChain! Have you heard of it"
)
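After several turns, the flushing behavior can be verified by inspecting what the chain's memory still holds (an optional check, not shown in the original walkthrough):

# Only the turns that fit within the 60-token limit remain in the buffer:
print(conversation_with_summary.memory.load_memory_variables({}))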

That is all about using the conversation token buffer in LangChain.

Conclusion

To use the conversation token buffer in LangChain, install the modules and set up the environment using the API key from the OpenAI account. After that, import the ConversationTokenBufferMemory class from the LangChain framework to store the conversation in the buffer. The buffer memory can also be used in a chain, where older messages are flushed as new messages arrive in the chat. This post has elaborated on using the conversation token buffer memory in LangChain.
