The Problem:
When attempting to use a custom LLM with llama_index, an ImportError is raised indicating that CustomLLM cannot be imported from llama_index.llms. The error persists even when following the example in the documentation, and the llama_index version in use is current (0.7.1). Provide a workaround or solution that enables the use of a custom LLM in llama_index.
The Solutions:
Solution 1: Change Import Library
Change the import statements from:
```
from llama_index.llms import CustomLLM, CompletionResponse, LLMMetadata
```
To this:
```
from llama_index.llms.custom import CustomLLM
from llama_index.llms.base import CompletionResponse, LLMMetadata
```
Q&A
What’s the modification needed to load a custom LLM using llama_index?
Change the import to from llama_index.llms.custom import CustomLLM, and import CompletionResponse and LLMMetadata from llama_index.llms.base.