Add batch mode to QueryPipeline #13203
Conversation
@logan-markewich changed functions to run batch concurrently with async and validate that size of batch inputs are all the same.
@xpbowler Ok, I think this is in decent shape! Do you want to add an example in the module_guides docs somewhere?
updated docs! lmk how it is |
@logan-markewich hmm, some of the llama-index-integrations failed the unit tests. I only changed the .md documentation since last time - did some requirements change?
@xpbowler ah, there were some nasty CI/CD issues that I fixed on Monday. I'll merge main again and see if it works
Description
Add batch mode to QueryPipeline.
Added:
- A `batch` parameter to `run`, `arun`, `_run`, `_arun`, `run_multi`, and `arun_multi`.
- `run_multi`/`arun_multi`: accept inputs wrapped in a `List[List[Any]]` rather than a `List`.
- `merge_dicts`, which re-formats the batch output of these two functions.
- `_run`/`_arun`: accept inputs wrapped in a `List`.

I didn't create a batch mode for the with_intermediates functions for now because I don't think it's very useful. The list comprehension functions might be confusing though; I think it might need some more work.
Type of Change