feat: Add LangChain error events capture #2040
Conversation
Force-pushed from 2953515 to 557373e (compare).
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff           @@
##            main    #2040    +/-  ##
=======================================
  Coverage  97.15%   97.16%
=======================================
  Files        248      248
  Lines      41600    41648    +48
=======================================
+ Hits       40418    40466    +48
  Misses      1182     1182
=======================================

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
The LlmTool and LlmChatCompletionSummary need an error attribute set to true when an error occurs. Also, we can cut a separate PR, but it looks like I missed calling out is_response on LlmChatCompletionMessage when the content is the response.
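For illustration, a minimal TypeScript sketch of the behavior the reviewer is asking for. The event shape, the wrapper, and the recordEvent helper are hypothetical stand-ins; the agent's real LlmTool and LlmChatCompletionSummary classes may differ.

```typescript
// Hypothetical sketch: flag LLM events with error = true when the
// instrumented LangChain run throws. The event shape and recordEvent
// are illustrative stand-ins, not the agent's real API.
interface LlmChatCompletionSummary {
  id: string
  'request.model': string
  duration: number
  error?: boolean // set to true when the run fails
}

// Stand-in for the agent's event recorder.
function recordEvent(type: string, attrs: object): void {
  console.log(type, JSON.stringify(attrs))
}

async function instrumentedRun(
  run: () => Promise<string>,
  summary: LlmChatCompletionSummary
): Promise<string> {
  try {
    return await run()
  } catch (err) {
    // The reviewer's point: on failure, mark the summary (and, for
    // tool invocations, the LlmTool event) before recording it.
    summary.error = true
    throw err
  } finally {
    // Record the event whether the run succeeded or failed.
    recordEvent('LlmChatCompletionSummary', summary)
  }
}
```

The same error flag would apply to the LlmTool event when a tool invocation fails.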
We can open another PR to capture the is_response on LlmChatCompletionMessage, as it's unrelated to this PR.
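As a sketch of that deferred change (hypothetical names, not the agent's actual message class): a message whose content is the model's output, rather than the prompt, would carry is_response = true.

```typescript
// Hypothetical sketch of the follow-up PR: an LlmChatCompletionMessage
// whose content is the model's response should set is_response = true.
interface LlmChatCompletionMessage {
  content: string
  role: 'user' | 'assistant'
  is_response?: boolean
}

function buildMessages(
  prompt: string,
  completion: string
): LlmChatCompletionMessage[] {
  return [
    { content: prompt, role: 'user' },
    // This message's content is the response, so flag it.
    { content: completion, role: 'assistant', is_response: true }
  ]
}
```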
This PR resolves issue #1969.