Conversation

@JacobSzwejbka
Contributor

Summary: Zero initialization is non-standard for PyTorch models, and it is especially confusing with ET because ET greedily deduplicates weights: zero-initialized tensors of the same shape are byte-identical and collapse into a single stored copy. That means if you zero-initialize a transformer model, the .pte file will be a lot smaller than you would expect if you didn't know about the deduplication (a sketch below illustrates the effect).

Differential Revision: D91518961
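
A minimal sketch (my illustration, not code from this PR) of the effect described above, assuming the public torch.export / executorch.exir export path; exact byte counts will vary by version:

```python
# Hypothetical illustration: compare .pte sizes for a zero-initialized
# vs. a randomly initialized stack of identically shaped Linear layers.
import torch
import torch.nn as nn
from executorch.exir import to_edge


class TinyStack(nn.Module):
    def __init__(self, zero_init: bool):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(256, 256) for _ in range(8))
        if zero_init:
            for layer in self.layers:
                nn.init.zeros_(layer.weight)
                nn.init.zeros_(layer.bias)

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x


def pte_size(zero_init: bool) -> int:
    model = TinyStack(zero_init).eval()
    example = (torch.randn(1, 256),)
    program = to_edge(torch.export.export(model, example)).to_executorch()
    return len(program.buffer)


# Zero-initialized weights are byte-identical, so ET serializes one copy;
# randomly initialized weights are all distinct and all get serialized.
print("zero-init .pte bytes:  ", pte_size(True))
print("random-init .pte bytes:", pte_size(False))
```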

@JacobSzwejbka JacobSzwejbka requested a review from lucylq as a code owner January 26, 2026 22:35
@pytorch-bot

pytorch-bot bot commented Jan 26, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16886

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit a3b33f1 with merge base 5690d26:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed label Jan 26, 2026
@meta-codesync
Contributor

meta-codesync bot commented Jan 26, 2026

@JacobSzwejbka has exported this pull request. If you are a Meta employee, you can view the originating Diff in D91518961.

@github-actions

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@JacobSzwejbka JacobSzwejbka merged commit c4de50d into pytorch:main Jan 27, 2026
145 of 147 checks passed
@JacobSzwejbka JacobSzwejbka deleted the export-D91518961 branch January 27, 2026 19:13
Labels

CLA Signed, fb-exported, meta-exported
