
[Bug]: tuple index out of range when calling LLM with prompt keyword arg #13056

Closed · erictse opened this issue Apr 23, 2024 · 3 comments · Fixed by #13405

Labels: bug (Something isn't working) · triage (Issue needs to be triaged/prioritized)

Comments


erictse commented Apr 23, 2024

Bug Description

This succeeds:

output = llm.complete(prompt)

but it fails when the prompt is passed as a keyword argument; see Steps to Reproduce below. Providing prompt as a keyword arg may simply be unsupported, but I didn't see any documentation or comments saying so.

Version

0.10.30

Steps to Reproduce

output = llm.complete(prompt=prompt)

Relevant Logs/Tracebacks

File "…/main.py", line 411, in …
    output = llm.complete(prompt=prompt)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "…/.venv/lib/python3.12/site-packages/llama_index/core/llms/callbacks.py", line 305, in wrapped_llm_predict
    prompt = kwargs.get("prompt", default=args[0])
                                          ~~~~^^^
IndexError: tuple index out of range
erictse added the bug and triage labels on Apr 23, 2024
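
The failure mode is easy to reproduce in isolation. Python evaluates call arguments before the call itself, so in the wrapper line above args[0] is evaluated against an empty tuple (the prompt landed in kwargs, not args) and raises IndexError before dict.get ever runs. A minimal standalone sketch of the same pattern (hypothetical code, not the actual llama_index source):

def wrapped_llm_predict(*args, **kwargs):
    # prompt="..." lands in kwargs, leaving args == (); evaluating
    # args[0] while building the .get() call raises IndexError.
    return kwargs.get("prompt", default=args[0])

wrapped_llm_predict(prompt="hello")  # IndexError: tuple index out of range

Note that even with a positional argument present, CPython's dict.get rejects a default= keyword with a TypeError, so the fallback expression is fragile in both directions.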

dosubot bot commented Apr 23, 2024

Hey there, @erictse! 👋 I'm Dosu, a bot crafted to assist with squashing bugs, answering queries, and guiding you towards becoming a contributor in the LlamaIndex universe. While we await the arrival of a human maintainer, I'm here to lend a robotic hand. I've caught your issue and am currently analyzing the circuits to provide you with a detailed response shortly. Stay tuned!

logan-markewich (Collaborator) commented

Yeah, it's technically not supported/advised. It might not be mentioned in the docs, but I don't think there's any example that uses it as a kwarg either 😅


dosubot bot commented Apr 23, 2024

It appears that the llm.complete method in LlamaIndex 0.10.30 expects the prompt to be passed as a positional argument, not as a keyword argument. That is why you're seeing the "tuple index out of range" error with prompt=prompt: the callback wrapper falls back to args[0], and the positional-argument tuple is empty when the prompt is passed by keyword. Neither the method's implementation nor the documentation supports prompt as a keyword argument, which aligns with the maintainer's feedback that it isn't advised or supported.

To avoid this error, you should call the complete method by directly passing the prompt as a positional argument, like so:

output = llm.complete(prompt)

If you need to pass additional options, they should be included as keyword arguments after the prompt:

output = llm.complete(prompt, formatted=False, additional_option="value")

This approach ensures compatibility with the current method signature and avoids the error you've encountered.
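
As a sketch of a more defensive wrapper (an illustration under assumed names, not necessarily how #13405 resolves it), the decorator could accept the prompt from either position:

def wrapped_llm_predict(*args, **kwargs):
    # Prefer the positional prompt; fall back to the keyword form.
    # The conditional short-circuits, so args[0] is never evaluated
    # on an empty tuple. Note dict.get takes its default positionally.
    prompt = args[0] if args else kwargs.get("prompt")
    if prompt is None:
        raise ValueError("a prompt is required")
    return prompt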

About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.
