fix(ai): normalize messages in truncate_messages_by_size to support Pydantic (#5350) #5730
harryautomazione wants to merge 6 commits into getsentry:master
Conversation
…mand, parameter name, and attribute
- Change install command to include the [pydantic_ai] extra
- Change result_type to output_type in the example
- Change result.data to result.output in the example

This ensures the docstring matches the current Pydantic AI API and gives correct installation instructions. Fixes getsentry#5293
…gation
- Add CURRENT_LANGCHAIN_AGENT_NAME contextvar to track agent name across spans
- Set agent name in agent executor wrappers (invoke/stream)
- Propagate agent name to all child spans via _create_span
- Add test to verify agent name is set on all spans
Semver Impact of This PR: 🟢 Patch (bug fixes)

Changelog Preview — Bug Fixes 🐛: Anthropic
Cursor Bugbot has reviewed your changes and found 1 potential issue.
```diff
     truncation is based only on character count in that case.
     """
-    serialized_json = json.dumps(messages, separators=(",", ":"))
+    normalized_messages = _normalize_data(messages, unpack=False)
```
Normalization silently converts None values to string "None"
Medium Severity
_normalize_data converts None values to the string "None" (line 488's fallback: str(data)). OpenAI assistant messages with tool calls commonly have content: None. After this change, those become content: "None", altering the JSON representation from null to "None" — a semantic difference that also slightly skews the byte-size calculation used for truncation decisions.


What
- Normalize messages in truncate_messages_by_size using _normalize_data(messages, unpack=False) before serializing them with json.dumps.
- Pass normalized_messages onwards to _find_truncation_index to protect size lookups.

Why
- truncate_messages_by_size fails on Pydantic SDK objects (e.g., ResponseFunctionToolCall) #5350
- When messages contain Pydantic SDK objects (e.g., ResponseFunctionToolCall), direct serialisation crashes with TypeError: Object of type X is not JSON serializable.

How
- In sentry_sdk/ai/utils.py:truncate_messages_by_size(), apply _normalize_data() before json.dumps().

Testing
- Verified no json.dumps exception.
- pytest tests/integrations/openai/.

Checklist
Fixes #5350