
Conversation

@MirrorDNA-Reflection-Protocol

Fixes #4605.

Description

This PR improves the error messages shown when users attempt to use the LLM clients (OpenAI, Anthropic) without installing the necessary optional dependencies. Instead of a raw `ImportError` or `ModuleNotFoundError`, the client now raises a helpful message instructing the user to install the specific extra (e.g., `pip install "autogen-ext[openai]"`).

Changes

  • Wrapped the top-level optional imports in `autogen_ext/models/openai/_openai_client.py` in a `try`/`except ImportError` block.
  • Wrapped the top-level optional imports in `autogen_ext/models/anthropic/_anthropic_client.py` in a `try`/`except ImportError` block.
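The guard described above follows the standard optional-dependency pattern. A minimal, self-contained sketch is below; the helper name `require_optional` and the exact error wording are illustrative, not the PR's actual code, and the demonstration deliberately imports a nonexistent module so the fallback path is visible:

```python
import importlib


def require_optional(module_name: str, extra: str):
    """Import module_name, raising a helpful ImportError naming the extra.

    Re-raises with `from e` so the original traceback is preserved.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError as e:
        raise ImportError(
            f"{module_name} is not installed. "
            f'Install it with: pip install "autogen-ext[{extra}]"'
        ) from e


# Demonstration with a module that is certainly absent:
try:
    require_optional("definitely_not_installed_pkg", "openai")
except ImportError as err:
    print(err)  # message includes the pip install hint
```

In the actual client modules the same `try`/`except ImportError` wraps the top-level imports directly, so the actionable message surfaces as soon as the module is loaded.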

Verification

  • Static analysis: Verified that the `try`/`except` blocks correctly wrap the optional imports and raise the new `ImportError` with the custom message.
  • Note: Runtime verification was limited by a local environment constraint (Python 3.9 vs the repo's `>=3.10` requirement), but this is a standard pattern for optional dependencies.

@MirrorDNA-Reflection-Protocol

@microsoft-github-policy-service agree



Development

Successfully merging this pull request may close these issues.

Improve Import Error Messages for LLM Client Dependencies
