
feat: Added token_count to LlmEmbedding and LlmChatCompletionMessage for openai #2061

Merged: @bizob2828 merged 1 commit into newrelic:main from add-token-count on Mar 4, 2024

Conversation

bizob2828 (Member)

Description

To support calculating token counts downstream for Llm events that lack counts (e.g., streaming responses or responses recorded with content capture disabled), we have added token_count to LlmEmbedding and LlmChatCompletionMessage. This change only adds the new attribute; it does not remove the old attributes for capturing token counts on prompts and completions. That removal will be handled in #2057.
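
For context, a downstream consumer can compute these counts itself when the agent cannot read them from the provider response. A minimal sketch follows, assuming the agent's setLlmTokenCountCallback API and the js-tiktoken package; neither is part of this PR, so the names and signatures shown are assumptions rather than this change's implementation.

```js
'use strict'

const newrelic = require('newrelic')
const { encodingForModel } = require('js-tiktoken')

// Hypothetical callback the agent could invoke when an LlmEmbedding or
// LlmChatCompletionMessage event lacks a provider-reported token count
// (e.g. streamed completions, or content capture disabled).
newrelic.setLlmTokenCountCallback((model, content) => {
  try {
    // js-tiktoken maps an OpenAI model name to its tokenizer.
    const encoder = encodingForModel(model)
    return encoder.encode(content).length
  } catch (err) {
    // Unknown model: return 0 so no token_count is reported (assumption).
    return 0
  }
})
```

With a hook like this, token_count can still be populated on the new attribute even when the raw prompt/completion content is not recorded.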

Related Issues

Closes #2056


codecov bot commented Mar 4, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.16%. Comparing base (c5ab73c) to head (196d285).
Report is 2 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2061   +/-   ##
=======================================
  Coverage   97.16%   97.16%           
=======================================
  Files         247      247           
  Lines       41755    41762    +7     
=======================================
+ Hits        40573    40580    +7     
  Misses       1182     1182           
| Flag | Coverage Δ |
| --- | --- |
| integration-tests-16.x | 78.40% <ø> (+0.01%) ⬆️ |
| integration-tests-18.x | 78.37% <ø> (ø) |
| integration-tests-20.x | 78.38% <ø> (ø) |
| unit-tests-16.x | 90.62% <100.00%> (+<0.01%) ⬆️ |
| unit-tests-18.x | 90.60% <100.00%> (+<0.01%) ⬆️ |
| unit-tests-20.x | 90.60% <100.00%> (+<0.01%) ⬆️ |
| versioned-tests-16.x | 74.59% <100.00%> (-0.02%) ⬇️ |
| versioned-tests-18.x | 75.45% <100.00%> (-0.02%) ⬇️ |
| versioned-tests-20.x | 75.46% <100.00%> (-0.02%) ⬇️ |

Flags with carried forward coverage won't be shown.


@jsumners-nr (Contributor) left a comment


Looks good to me.

@bizob2828 merged commit 47a925e into newrelic:main on Mar 4, 2024
24 checks passed
@bizob2828 deleted the add-token-count branch on April 3, 2024 at 19:43
Development

Successfully merging this pull request may close these issues.

Update Llm events to store content as token_count on LlmEmbedding and LlmChatCompletionMessage