Using Custom JSON Data for Context in LangChain and ConversationChain() with the ChatGPT OpenAI API

by
Ali Hasan
chatgpt-api langchain openai-api py-langchain python-3.x

The Solutions:

Solution 1: Using Langchain with Custom JSON Data

To use Langchain with your custom JSON dataset, you can follow these steps:

  1. Convert your JSON dataset into a list of dictionaries, where each dictionary represents a conversation turn with a "sender" key (either "agent" or "user") and a "content" key.
  2. Embed the turn contents and index them in a vector store, then create a ConversationalRetrievalChain instance from your ChatOpenAI model and the vector store's retriever.
  3. Build a chat history from the conversation turns; ConversationalRetrievalChain expects a list of (question, answer) tuples under the "chat_history" input key.
  4. Call the chain with your question and the chat history; it generates a response from the retrieved context, returned under the "answer" key.

Here’s an example code snippet:

import json

from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationalRetrievalChain
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS

# Load your JSON dataset
with open("custom_json.json", "r") as f:
    conversation_turns = json.load(f)

# Index the turn contents in a vector store so they can be retrieved as context
texts = [turn["content"] for turn in conversation_turns]
vectorstore = FAISS.from_texts(texts, OpenAIEmbeddings())

# Create a ConversationalRetrievalChain instance
qa = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(temperature=0.5, model="gpt-3.5-turbo"),
    vectorstore.as_retriever(),
)

# Build the chat history as (question, answer) tuples,
# assuming the turns alternate user/agent
chat_history = [
    (user_turn["content"], agent_turn["content"])
    for user_turn, agent_turn in zip(conversation_turns[::2], conversation_turns[1::2])
]

# Ask a question; the chain answers using the retrieved context and chat history
prompt = "What did the president say about Ketanji Brown Jackson?"
result = qa({"question": prompt, "chat_history": chat_history})

print(result["answer"])

Q&A

Can I use custom JSON data for context in Langchain and ConversationChain() in ChatGPT OpenAI?

Yes. You can load a JSON file and feed its contents to the chain as context for your questions in LangChain.
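As a minimal, self-contained sketch of the idea (the "sender"/"content" turn format follows the example above; the helper name is hypothetical), the JSON turns can be paired into the (question, answer) tuples that ConversationalRetrievalChain accepts as chat_history:

```python
import json

def turns_to_chat_history(turns):
    """Pair each user turn with the agent turn that answers it."""
    history = []
    pending_user = None
    for turn in turns:
        if turn["sender"] == "user":
            pending_user = turn["content"]
        elif turn["sender"] == "agent" and pending_user is not None:
            history.append((pending_user, turn["content"]))
            pending_user = None
    return history

sample = json.loads(
    '[{"sender": "user", "content": "Hello"},'
    ' {"sender": "agent", "content": "Hi, how can I help?"}]'
)
print(turns_to_chat_history(sample))
# [('Hello', 'Hi, how can I help?')]
```

The resulting list can be passed directly as the "chat_history" value when calling the chain.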

How do I provide the JSON dataset to my ChatOpenAI() and ConversationChain()?

You can use the predict_messages() method: convert each JSON turn into a HumanMessage (for "user" turns) or AIMessage (for "agent" turns) and pass the resulting list to your ChatOpenAI() model.
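A rough sketch of that mapping (pure Python so it runs without an API key; the turn format is assumed from the example above, and the LangChain wiring is shown in comments):

```python
import json

# "user" turns map onto HumanMessage and "agent" turns onto AIMessage in
# LangChain; here they are modeled as (role, content) tuples to stay
# self-contained.
ROLE_FOR_SENDER = {"user": "human", "agent": "ai"}

def turns_to_roles(turns):
    return [(ROLE_FOR_SENDER[t["sender"]], t["content"]) for t in turns]

turns = json.loads(
    '[{"sender": "user", "content": "Hi"},'
    ' {"sender": "agent", "content": "Hello!"}]'
)
print(turns_to_roles(turns))
# [('human', 'Hi'), ('ai', 'Hello!')]

# With LangChain installed, the equivalent wiring would be roughly:
#   from langchain.chat_models import ChatOpenAI
#   from langchain.schema import AIMessage, HumanMessage
#   messages = [HumanMessage(content=c) if r == "human" else AIMessage(content=c)
#               for r, c in turns_to_roles(turns)]
#   reply = ChatOpenAI().predict_messages(messages)
```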

Video Explanation:

The following video, titled "Build ChatGPT Chatbots with LangChain Memory: Understanding ...", explores the topics discussed in this post in more depth.


Memory is a crucial element in building chatbots that can maintain a coherent conversation with users. In this video, we will explore ...