
Conversation

@darkestpigeon (Contributor) commented Aug 20, 2024

What?

Fixes a bug in reshape, where a non-contiguous C layout is mistaken for a contiguous F layout (see issue #660).

Why

A simple illustration of why this is a problem:

import arraymancer

let x = arange[float](0, 6).reshape(2, 3).permute(1, 0) # has shape [3, 2] and is not C-contiguous (but is F-contiguous)
let y = x.reshape(3*2).reshape(3, 2) # the reshapes shouldn't change the contents

echo x
echo y
doAssert x == y

Right now this assertion fails.

How

The check for whether the tensor is F-contiguous is removed; a no-copy reshape is now performed only if the tensor is C-contiguous.
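
For context, a rough sketch of the dispatch this change leaves in place; the exact reshapeImpl signature, the storage assignment, and the size check are assumptions based on the review discussion below, not a verbatim copy of the source:

proc reshapeImpl[T](t: Tensor[T], new_shape: varargs[int]|Metadata|seq[int], result: var Tensor[T]) =
  # Sketch only, not the actual source.
  if t.is_C_contiguous:
    # Row-major storage already matches the new shape's element order,
    # so only the metadata (shape/strides) needs to change.
    reshape_no_copy(t, new_shape, result, rowMajor)
    result.storage = t.storage
  else:
    # F-contiguous and otherwise strided tensors are packed into a
    # fresh row-major buffer before reshaping.
    reshape_with_copy(t, new_shape, result)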

Summary by CodeRabbit

  • Refactor
    • Simplified tensor reshape implementation for non-C-contiguous tensors. These tensors now use copy-based reshape operations instead of a specialized no-copy approach. C-contiguous tensors retain their optimized, storage-efficient behavior.


@mratsim (Owner) commented Jan 2, 2026

/gemini review

@gemini-code-assist bot left a comment


Code Review

This pull request addresses a consistency bug in the reshape operation for F-contiguous tensors. The fix correctly removes the special handling for F-contiguous tensors, ensuring they are converted to a C-contiguous layout before reshaping, which prevents data corruption. My review highlights a potential pre-existing bug in the reshape_with_copy function that this change now makes more prominent. Addressing this related issue is recommended to prevent potential memory access errors.

I am having trouble creating individual review comments, so my feedback is included below.

src/arraymancer/tensor/private/p_shapeshifting.nim (lines 72-74), severity: high

Removing this special handling for F-contiguous tensors is the correct approach to fix the reshape inconsistency. Now, non-C-contiguous tensors (including F-contiguous ones) will be handled by reshape_with_copy, which forces a copy to a C-contiguous layout.

However, I've noticed a potential issue in reshape_with_copy that will now be triggered more often. The implementation of reshape_with_copy is:

proc reshape_with_copy*[T](t: Tensor[T], new_shape: varargs[int]|Metadata|seq[int], result: var Tensor[T]) =
  contiguousImpl(t, rowMajor, result)
  reshape_no_copy(t, new_shape, result, rowMajor)

contiguousImpl creates a new tensor in result with new storage, and its data is packed starting at offset 0. But reshape_no_copy is then called with the original tensor t and sets result.offset = t.offset. If t was a view with a non-zero offset, this would be incorrect for the new storage of result and could lead to memory errors.

A fix would be to call reshape_no_copy on result itself:

reshape_no_copy(result, new_shape, result, rowMajor)

Since this part of the code is not in the diff, I can't suggest a direct change, but I strongly recommend addressing this as part of this PR or in a follow-up, as it could cause critical issues.
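
To make the concern concrete, a hypothetical example (the slicing below is assumed to produce a view with a non-zero offset into the parent's storage, which is the situation described in the comment above):

import arraymancer

let base = arange[float](0, 12).reshape(3, 4)
let view = base[1..2, 1..3]   # shares base's storage; non-zero offset, not C-contiguous
# This reshape goes through reshape_with_copy: the elements are packed into
# fresh storage starting at offset 0, yet the view's original offset would
# then be written into the result's metadata, per the analysis above.
echo view.reshape(6)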

@mratsim (Owner) commented Jan 2, 2026

@coderabbitai full review

@coderabbitai bot commented Jan 2, 2026

✅ Actions performed

Full review triggered.

@coderabbitai bot commented Jan 2, 2026

📝 Walkthrough

The pull request removes the special-case Fortran-contiguous tensor handling from the reshape implementation. Non-C-contiguous tensors now uniformly use copy-based reshape instead of the previous no-copy path with column-major layout. C-contiguous tensors remain optimized with the existing no-copy mechanism.

Changes

Cohort / File(s): Fortran-contiguous tensor handling removal (src/arraymancer/tensor/private/p_shapeshifting.nim)
Summary: Deleted the is_F_contiguous branch in reshapeImpl, eliminating special-case no-copy reshape for Fortran-contiguous tensors. Non-C-contiguous tensors now proceed through the standard reshape_with_copy path.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 A hop through reshape's winding way,
Fortran's shortcut fades to gray,
Copy now leads all but one,
C-contiguous still runs and runs,
Simpler paths for tensors at play!

Pre-merge checks

✅ Passed checks (3 passed)
  • Description Check: Passed. Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check: Passed. The title 'Reshape consistency fix' directly relates to the main change: fixing a bug in reshape where F-contiguous tensors were incorrectly handled, improving consistency by removing special-case F-contiguous handling.
  • Docstring Coverage: Passed. No functions found in the changed files to evaluate docstring coverage; skipping docstring coverage check.

📜 Recent review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between caf0d8d and 1684c13.

📒 Files selected for processing (1)
  • src/arraymancer/tensor/private/p_shapeshifting.nim
💤 Files with no reviewable changes (1)
  • src/arraymancer/tensor/private/p_shapeshifting.nim


@mratsim merged commit 195c75d into mratsim:master on Jan 2, 2026
@mratsim (Owner) commented Jan 2, 2026

Thank you!

@mratsim mentioned this pull request on Jan 2, 2026