How to use SQLDatabaseChain from LangChain with memory? – Langchain

by
Ali Hasan
artificial-intelligence langchain llama-cpp-python openai-api py-langchain

Quick Fix: To use SQLDatabaseChain from LangChain with memory, give your LLM chain a prompt template that contains a chat-history field and wire it up with ConversationBufferMemory() using a matching memory key. Alternatively, use an agent whose prompt suffix includes the chat history and pass it ConversationBufferMemory() with chat_memory=message_history.

The Problem:

Use SQLDatabaseChain from LangChain with memory to build a chain that can make queries against a database and maintain context across queries. The chain should be able to use the context to provide accurate answers to subsequent queries. Also, determine if it’s possible to achieve this without using an agent.
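For reference, a plain SQLDatabaseChain without memory looks roughly like the sketch below. This is only a baseline to illustrate the problem; the connection string and questions are hypothetical, and depending on your LangChain version the chain lives in langchain.chains or langchain_experimental.sql.

from langchain.llms import OpenAI
from langchain.sql_database import SQLDatabase
from langchain_experimental.sql import SQLDatabaseChain

# Hypothetical database; replace the URI with your own connection string.
db = SQLDatabase.from_uri("sqlite:///owners.db")
db_chain = SQLDatabaseChain.from_llm(OpenAI(temperature=0), db, verbose=True)

# Each call is independent: the chain has no memory of earlier questions.
db_chain.run("Which websites belong to the domain 'https://damon.name'?")
db_chain.run("What are the owners' emails?")  # loses the domain filter from the previous turn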

The Solutions:

Solution 1: Adding Memory to SQLDatabaseChain

To enable context retention, add a chat-history placeholder to the prompt template and pass the chain a ConversationBufferMemory() whose memory_key matches that placeholder. Here’s an example:

from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

# Prompt template with a placeholder for the running chat history.
template = """
You are a chatbot having a conversation with a human.

{chat_history}
Human: {human_input}
Chatbot:
"""

prompt = PromptTemplate(input_variables=["chat_history", "human_input"], template=template)

# memory_key must match the {chat_history} placeholder in the template.
memory = ConversationBufferMemory(memory_key="chat_history")

llm_chain = LLMChain(
    llm=OpenAI(),
    prompt=prompt,
    verbose=True,
    memory=memory
)
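As a quick sanity check, here is a hedged usage sketch (the questions are made up) showing the memory carrying context between calls:

# First turn is stored in the buffer automatically.
llm_chain.predict(human_input="My favourite database is PostgreSQL.")

# The follow-up can refer back to the first turn via {chat_history}.
llm_chain.predict(human_input="Which database did I say I like?")

print(memory.buffer)  # inspect the stored chat history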

Alternatively, you can use an agent whose prompt suffix includes the chat history. Here’s the code:

from langchain.agents import ZeroShotAgent
from langchain.memory import ConversationBufferMemory

prefix = """Have a conversation with a human, answering the following questions as best you can. You have access to the following tools:"""
suffix = """Begin!

{chat_history}
Question: {input}
{agent_scratchpad}"""

# `tools` is the list of tools available to the agent (e.g. a SQL tool), defined elsewhere.
prompt = ZeroShotAgent.create_prompt(
    tools,
    prefix=prefix,
    suffix=suffix,
    input_variables=["input", "chat_history", "agent_scratchpad"],
)

# `message_history` is an existing chat message history object, defined elsewhere.
memory = ConversationBufferMemory(memory_key="chat_history", chat_memory=message_history)
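To actually run this, build an LLMChain from the prompt, wrap it in a ZeroShotAgent, and pass the memory to the AgentExecutor. This follows the standard LangChain memory-in-agent pattern; the example question is hypothetical and `tools` is assumed to be defined as above:

from langchain.agents import AgentExecutor, ZeroShotAgent
from langchain.chains import LLMChain
from langchain.llms import OpenAI

llm_chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)
agent = ZeroShotAgent(llm_chain=llm_chain, tools=tools, verbose=True)

# The executor loads {chat_history} from memory on every call and saves each turn back.
agent_chain = AgentExecutor.from_agent_and_tools(
    agent=agent, tools=tools, memory=memory, verbose=True
)
agent_chain.run(input="Which owners registered a website last month?")  # hypothetical question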

By incorporating memory, the chain can retain context and provide more accurate responses based on previous interactions.

Solution 2: Combine historical questions into a single query

To fix the issue where the chain doesn’t use the context in memory to give a proper answer, you can try the following approach:

  1. Generate a single query by looking at all of the user’s historical questions at once.
  2. Pass both the historical questions and the current question as a single prompt to the LLM so it generates one SQL query.
  3. Ideally, the LLM then produces a query like the one below (see the sketch after this list):
    select email from owners JOIN websites ON owners.id = websites.owner_id WHERE domain = 'https://damon.name' LIMIT 5;
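Here is a minimal sketch of this idea. The prompt wording and the question history are hypothetical, and `db` is the SQLDatabase object from the earlier sketch:

from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

# Hypothetical history of user questions plus the current follow-up.
past_questions = [
    "Which websites does each owner have?",
    "Only look at the domain 'https://damon.name'.",
]
current_question = "Give me the owners' emails, at most 5."

# Prompt that asks for ONE query covering all requests at once.
sql_prompt = PromptTemplate(
    input_variables=["questions", "table_info"],
    template=(
        "Given the schema below, write a single SQL query that answers all of "
        "the user's requests together.\n\n"
        "Schema:\n{table_info}\n\n"
        "Requests:\n{questions}\n\n"
        "SQLQuery:"
    ),
)

sql_chain = LLMChain(llm=OpenAI(temperature=0), prompt=sql_prompt)
combined = "\n".join(past_questions + [current_question])
print(sql_chain.predict(questions=combined, table_info=db.get_table_info()))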
    

Q&A

How to add memory to SQLDatabaseChain from LangChain?

Use ConversationBufferMemory()

How to avoid using agent memory with SQLDatabaseChain from LangChain?

Add chat history template to LLM

How to make multiple user questions generate a single query using SQLDatabaseChain from LangChain?

Pass all historical questions as single prompt to LLM

Video Explanation:

The following video, titled "LangChain How To Add Memory To Agent & Chatbot - YouTube", provides additional insights and in-depth exploration related to the topics discussed in this post.
