The issue you've encountered with `llm_completion_callback` ignoring the `prompt` keyword argument is indeed due to the implementation expecting the prompt to be passed as the first positional argument. This is evident in both the `wrapped_async_llm_predict` and `wrapped_llm_predict` functions within the `llm_completion_callback` wrapper, where `args[0]` is used to access the prompt. Consequently, when the prompt is passed as a keyword argument, `args` is empty and indexing it raises the `IndexError`.
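In condensed form, the pattern being described looks roughly like this (a sketch of the decorator's shape, not the library's exact source; event dispatch is elided):

```python
from functools import wraps

def llm_completion_callback():
    def wrap(f):
        @wraps(f)
        def wrapped_llm_predict(_self, *args, **kwargs):
            # The callback payload is built from the first positional
            # argument only; if the prompt was passed as a keyword
            # argument, `args` is empty and `args[0]` raises IndexError.
            payload = {"prompt": args[0]}
            # ... emit callback events with `payload`, then run the method ...
            return f(_self, *args, **kwargs)
        return wrapped_llm_predict
    return wrap
```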
To work around this issue, call the `complete` method with the prompt as the first positional argument:

```python
ollama_instance.complete("Your prompt here")
```
Avoid passing the prompt as a keyword argument, which is what triggers the `IndexError`:
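```python
ollama_instance.complete(prompt="Your prompt here")  # IndexError: tuple index out of range
```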
Your willingness to contribute a fix for this behavior is commendable. Since I can't accept pull requests, I recommend reaching out directly to the maintainers of the LlamaIndex repository. They can provide guidance on how they'd like to address this issue and whether your proposed solution aligns with their vision for the project.
Bug Description
LLM classes, like `Ollama`, expose a `complete` method for providing LLM completions; however, the `llm_completion_callback` wrapper around it ignores keyword arguments. That is, given an `Ollama` instance, calling `complete` with the prompt as a positional argument works, whereas passing it as the `prompt` keyword argument raises `IndexError: tuple index out of range` (see Steps to Reproduce below). This seems to be because in `llm_completion_callback` the callback only reads the prompt from the first positional argument. If I'm not misunderstanding anything, I'd be happy to contribute a PR for this callback that checks for the prompt in the kwargs before checking the positional arguments (sketched below). Let me know.
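Sketched, that kwargs-first check might look like the following inside the wrapper (illustrative only, not a final patch):

```python
def wrapped_llm_predict(_self, *args, **kwargs):
    # Proposed direction: prefer an explicit `prompt` keyword argument,
    # falling back to the first positional argument as before.
    prompt = kwargs["prompt"] if "prompt" in kwargs else args[0]
    ...
```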
Version
0.10.36
Steps to Reproduce
1. Install llama-index 0.10.36 together with the `llama-index-llms-ollama` integration.
2. Call `complete` on an `Ollama` instance with the prompt passed as a keyword argument, as in the snippet below.
3. The call raises `IndexError: tuple index out of range`.
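A minimal reproduction, assuming the `llama-index-llms-ollama` package and a local Ollama server (the model name is illustrative):

```python
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2")

# Works: the prompt reaches the callback wrapper via args[0].
llm.complete("Why is the sky blue?")

# Fails before any request is made: the wrapper indexes args[0],
# but the prompt is in kwargs, so args is empty.
llm.complete(prompt="Why is the sky blue?")
```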
Relevant Logs/Tracebacks