feat: Added setLlmTokenCountCallback API endpoint to register a callback for calculating token count when none is provided #2065
Conversation
…ack for calculating token counts when none is provided
…ll `agent.llm.tokenCountCallback` when token_count is not present
Codecov Report: All modified and coverable lines are covered by tests ✅

@@           Coverage Diff           @@
##             main    #2065   +/-  ##
=======================================
  Coverage   97.17%   97.18%
=======================================
  Files         248      248
  Lines       41880    41936     +56
=======================================
+ Hits        40698    40754     +56
  Misses       1182     1182
Looks good to me.
* @example
* // @param {string} model - name of model (i.e. gpt-3.5-turbo)
* // @param {string} content - prompt or completion response
* function tokenCallback(model, content) {
The value of `content` is extremely varied. I can see us fielding plenty of issues asking what it will be.
It must be a string. In the OpenAI cases the values are always strings. For LangChain they aren't always strings, but we don't assign token counts there. Looking at Bedrock, they are strings as well.
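Given the discussion above, a user-supplied callback can guard against non-string `content` before estimating. This is a hypothetical sketch, not the agent's behavior: the coercion strategy and the chars-per-token heuristic are illustrative assumptions.

```javascript
// Hypothetical sketch of a token-count callback that defends against
// `content` values that are not plain strings (e.g. structured messages).
function tokenCallback(model, content) {
  // Coerce anything that is not already a string
  const text = typeof content === 'string' ? content : JSON.stringify(content)
  // Rough ~4-characters-per-token heuristic -- an assumption only;
  // substitute a real tokenizer for the given model in practice
  return Math.ceil(text.length / 4)
}
```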
It shouldn't be the parsed body; it should be the key in the body that contains the content.
Description
Adds setLlmTokenCountCallback to register a callback that is invoked to calculate token counts when they are not present on events. The callback must be synchronous and return the token count.

Related Issues
Closes #2055
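The description above can be sketched as follows. The registration call uses the setLlmTokenCountCallback name this PR introduces; the callback body is a placeholder chars-per-token estimate, not a recommended implementation.

```javascript
// Synchronous callback returning a token count, per the PR description.
// The chars/4 estimate is an illustrative assumption; production code
// should use a tokenizer matched to the model in question.
function tokenCountCallback(model, content) {
  return Math.ceil(content.length / 4)
}

// Registration sketch -- assumes the agent is installed and configured:
// const newrelic = require('newrelic')
// newrelic.setLlmTokenCountCallback(tokenCountCallback)
```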