
Conversation

@GregoryComer (Member)

Summary: After recent changes to convert_constant_dim_order_pass, there are a few lowering breakages when constant nodes don't have associated placeholders. I didn't realize this could happen, but apparently it can. This change skips the placeholder meta update in that case.

Differential Revision: D91334285
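
For illustration, here is a minimal sketch of the kind of guard this change adds. The helper name, signature, and lookup are assumptions modeled on torch.export-style graph signatures, not the actual pass code:

```python
import logging

logger = logging.getLogger(__name__)

def _update_placeholder_meta(exported_program, target):
    # Hypothetical lookup: find the input spec that lifts this constant.
    input_spec = next(
        (
            spec
            for spec in exported_program.graph_signature.input_specs
            if getattr(spec, "target", None) == target
        ),
        None,
    )
    if input_spec is None:
        # Some constants apparently have no associated placeholder (e.g.
        # not yet re-lifted), so skip the meta update instead of raising.
        logger.warning(f"Missing input spec for constant {target}")
        return
    # ... otherwise update the corresponding placeholder node's meta ...
```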


pytorch-bot bot commented Jan 23, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16841

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 07846a8 with merge base 61ae2f6:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

meta-cla bot added the CLA Signed label Jan 23, 2026

meta-codesync bot commented Jan 23, 2026

@GregoryComer has exported this pull request. If you are a Meta employee, you can view the originating Diff in D91334285.

GregoryComer added the release notes: none label Jan 23, 2026
@Gasoonjia (Contributor) left a comment:


Thanks for the update!
My main concern is whether this is the target behavior we want. In convert_constant_dim_order_pass we first make the constant contiguous, then update the placeholder metadata. If _update_placeholder_meta skips the update, will that introduce a metadata mismatch and lead to more issues?
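
To make the flow being discussed concrete, here is a hedged sketch of the two steps; the function name and attribute access are assumptions for illustration, not the actual pass code:

```python
# Sketch only: names and structure are assumed, not ExecuTorch's real code.
def _convert_constant(exported_program, target, placeholder_node):
    tensor = exported_program.constants[target]
    # Step 1: rewrite the constant's data to contiguous dim order.
    exported_program.constants[target] = tensor.contiguous()
    # Step 2: mirror the new layout in the placeholder's metadata so the
    # graph's view of the tensor matches the stored data. The question
    # above is whether skipping this step can leave the two out of sync.
    if placeholder_node is not None:
        placeholder_node.meta["val"] = placeholder_node.meta["val"].contiguous()
```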

```diff
 )
 if input_spec is None:
-    raise RuntimeError(f"Missing input spec for lifted tensor {target}.")
+    logger.warning(f"Missing input spec for constant {target}")
```
Contributor

Do you know when an exported program may not have an input_spec? Doesn't every program have one?

Member Author

I can track down a better repro. This showed up with an internal CV model. My guess is that there is some pass that is modifying the constant and it hasn't been fully re-lifted at this point. I'll see if I can find an exact case. There are several broken internal tests for export currently.
