Improve token usage recording #4566
Codecov Report

✅ All tests successful. No failed tests found.

```
@@            Coverage Diff             @@
##           master    #4566      +/-   ##
==========================================
- Coverage   80.72%   80.72%   -0.01%
==========================================
  Files         156      156
  Lines       16496    16507      +11
  Branches     2806     2806
==========================================
+ Hits        13317    13325       +8
- Misses       2298     2299       +1
- Partials      881      883       +2
```
LGTM, see one suggestion
Co-authored-by: Ivana Kellyer <ivana.kellyer@sentry.io>
Update token usage recording to work whether the LLM calls them `prompt_tokens` or `input_tokens`. Same for `completion_tokens` and `output_tokens`. Also records cached and reasoning token usage. Because the signature of a helper function was changed, the other AI integrations have changes as well.

This PR does not change behavior; it just prepares the ground for future changes to the AI integrations.
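A minimal sketch of the normalization idea described above (the helper name `record_token_usage`, the key names, and the dict-based span are illustrative assumptions, not the actual sentry-python implementation):

```python
def _first(usage, *names):
    # Return the first non-None value among the given keys.
    for name in names:
        value = usage.get(name)
        if value is not None:
            return value
    return None


def record_token_usage(span_data, usage):
    """Record token counts under one set of keys, whichever
    naming scheme the LLM provider's usage payload uses."""
    normalized = {
        # OpenAI-style "prompt_tokens" vs. newer "input_tokens"
        "input_tokens": _first(usage, "prompt_tokens", "input_tokens"),
        # OpenAI-style "completion_tokens" vs. newer "output_tokens"
        "output_tokens": _first(usage, "completion_tokens", "output_tokens"),
        # cached and reasoning tokens, when the provider reports them
        "cached_tokens": _first(usage, "cached_tokens"),
        "reasoning_tokens": _first(usage, "reasoning_tokens"),
    }
    for key, value in normalized.items():
        if value is not None:
            span_data[key] = value
    return span_data


# OpenAI-style naming
print(record_token_usage({}, {"prompt_tokens": 10, "completion_tokens": 5}))
# Newer naming, with reasoning tokens
print(record_token_usage({}, {"input_tokens": 7, "output_tokens": 2, "reasoning_tokens": 3}))
```

Because every integration goes through one helper, future changes (new token categories, new provider field names) only touch this function, which is why the signature change ripples into the other AI integrations.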