feat: add LiteLLM callback handler for all providers #1275
Summary
Fixes #1028, #1079
Implements a LiteLLM callback handler that properly tracks LLM calls across all LiteLLM-supported providers including OpenAI and Anthropic.
Root Cause Analysis
#1079 (Anthropic models not tracked):
#1028 (Responses API not working):
- The `litellm.responses()` API wasn't supported because there was no callback handler for it.
- The responses API returns a different payload shape (`output` instead of `choices`) that wasn't handled.

Solution
Created `LiteLLMCallbackHandler` that:
- Implements LiteLLM's `CustomLogger` interface
- Works across providers (`anthropic/`, `openai/`, etc.)
- Handles both the chat completions (`choices`) and responses API (`output`) response formats
- Normalizes token usage via `input_tokens`/`output_tokens` mapping
- Emits `gen_ai.*` semantic convention attributes

Usage
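The format handling described above can be sketched in plain Python. This is a minimal illustration, not the PR's actual code: the helper names (`extract_completion`, `extract_usage`) are hypothetical, and the real handler subclasses LiteLLM's `CustomLogger` rather than operating on bare dicts.

```python
# Hypothetical sketch of normalizing LiteLLM payloads from both the
# chat-completions format ("choices") and the responses API format ("output").
# Helper names are illustrative, not the PR's actual implementation.

def extract_completion(response: dict) -> str:
    """Return the generated text regardless of response format."""
    if "choices" in response:
        # Chat-completions format: choices[0].message.content
        return response["choices"][0]["message"]["content"]
    if "output" in response:
        # Responses API format: output[0].content is a list of text parts
        parts = response["output"][0].get("content", [])
        return "".join(p.get("text", "") for p in parts)
    raise ValueError("unrecognized response format")

def extract_usage(response: dict) -> dict:
    """Map either usage naming scheme onto input_tokens/output_tokens."""
    usage = response.get("usage", {})
    return {
        "input_tokens": usage.get("input_tokens", usage.get("prompt_tokens", 0)),
        "output_tokens": usage.get("output_tokens", usage.get("completion_tokens", 0)),
    }
```

The key point is that the same accessor works whether the payload came from `litellm.completion()` or `litellm.responses()`, which is what closes both linked issues.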
Features
- One-line registration: `litellm.callbacks = [handler]`

Files Changed
- `agentops/integration/callbacks/litellm/__init__.py` (new)
- `agentops/integration/callbacks/litellm/callback.py` (new)
- `agentops/__init__.py` (export `LiteLLMCallbackHandler`)
- `tests/unit/integration/callbacks/litellm/test_litellm_callback.py` (new)

Test Plan