Update Python version requirement to <3.14#27

Open
Mojo-OG wants to merge 3 commits into LykosAI:main from Mojo-OG:patch-1
Conversation

@Mojo-OG Mojo-OG commented Mar 25, 2026

Python 3.13.12 support was added to StabilityMatrix within the last few releases, but ComfyUI-Inference-Core-Nodes fails to install into the package because the current dependency specification requires Python versions below 3.13.

I was able to install ComfyUI-Inference-Core-Nodes after locally changing the constraint in pyproject.toml to `<3.14`, and afterwards ran `install.py` from my Python 3.13.12 venv without any issues.
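
For reference, the edit described above amounts to a one-line constraint change. The sketch below assumes PEP 621-style metadata (if the project uses Poetry instead, the equivalent key is `python` under `[tool.poetry.dependencies]`), and the `>=3.10` lower bound is illustrative rather than the project's actual value:

```toml
[project]
# Before (illustrative lower bound): Python 3.13 was excluded.
# requires-python = ">=3.10,<3.13"

# After: allow all 3.13.x releases while still excluding 3.14.
requires-python = ">=3.10,<3.14"
```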
@gemini-code-assist

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request enhances the project's compatibility by expanding the supported Python version range. It specifically addresses an installation problem encountered with ComfyUI-Inference-Core-Nodes when running on Python 3.13.12, ensuring smoother integration and broader usability for users on newer Python environments.

Highlights

  • Python Version Compatibility: The Python version requirement has been updated to include versions less than 3.14, specifically addressing compatibility with Python 3.13.12.
  • Dependency Resolution: This change resolves an installation issue for ComfyUI-Inference-Core-Nodes which previously failed to install due to a stricter Python dependency.




@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates pyproject.toml to extend the supported Python version range to include Python 3.13. However, the review noted that the onnxruntime-gpu dependency in the cuda-12 optional-dependency group has not been updated with wheels for Python 3.13, which could lead to installation failures for users on that version. The review suggests adding the correct onnxruntime-gpu wheels for Python 3.13 on both Windows and Linux to ensure full support.

Mojo-OG added 2 commits March 25, 2026 17:25
Adds the PyPI onnxruntime-gpu packages for Windows and Linux when running Python 3.13.
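
One way the fix the review asks for could be expressed is with PEP 508 environment markers, so a 3.13-compatible onnxruntime-gpu build is selected per platform. The group name, version bound, and marker layout below are a hedged sketch, not the PR's actual diff (the project may instead pin direct wheel URLs or use Poetry-style sources):

```toml
[project.optional-dependencies]
cuda-12 = [
    # Assumption: onnxruntime-gpu 1.20+ ships CPython 3.13 wheels.
    # The marker restricts this entry to Windows and Linux, matching
    # the platforms the commit message mentions.
    "onnxruntime-gpu>=1.20; python_version >= '3.13' and (sys_platform == 'win32' or sys_platform == 'linux')",
    # Older interpreters keep an unversioned constraint here for illustration.
    "onnxruntime-gpu; python_version < '3.13'",
]
```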
Author

@Mojo-OG Mojo-OG left a comment


Gemini code assist feedback addressed
