unfoldNd


Unfold and fold operations for N-dimensional tensors in PyTorch, generalizing torch.nn.Unfold and torch.nn.Fold to arbitrary dimensions. Current version: 0.2.3. Released irregularly, latest in Dec 2024.

pip install unfoldnd
error ImportError: cannot import name 'unfoldNd' from 'unfoldnd'
cause Mixing up package name (unfoldnd) with module name (unfoldNd).
fix
Run: from unfoldNd import unfoldNd
error TypeError: foldNd() missing 1 required positional argument: 'output_size'
cause foldNd requires output_size parameter; it is not optional.
fix
Add 'output_size=(D, H, W)' (or appropriate dimensions) to the foldNd call.
error RuntimeError: Expected 4D or 5D input tensor, but got 3D
cause unfoldNd expects explicit batch and channel dimensions in addition to the spatial ones.
fix
Ensure input tensor has batch and channel dimensions, e.g., reshape to (1, 1, *spatial_dims) if needed.
gotcha The package name on PyPI is 'unfoldnd' (all lowercase), but the import uses 'unfoldNd' (capital N). This mismatch often causes ImportError.
fix Use 'from unfoldNd import ...' after 'pip install unfoldnd'.
gotcha foldNd requires explicit 'output_size' argument; it cannot be inferred from the input tensor shape. Missing it raises a TypeError.
fix Always provide 'output_size' matching the spatial dimensions expected after folding.
deprecated Version 0.2.0 changed the API: 'foldNd' was added, and releases before 0.2.0 do not provide fold operations.
fix Update to >=0.2.0 and adjust if using fold operations.
gotcha The 'dilation' parameter must be a single int or tuple matching the dimension count. Using incompatible shapes causes a RuntimeError.
fix Ensure dilation, padding, stride are either int or tuples of length equal to the spatial dims of input.

Demonstrates 3D unfolding and folding with a 3x3x3 kernel.

import torch
from unfoldNd import unfoldNd, foldNd

x = torch.randn(1, 3, 10, 10, 10)  # batch 1, 3 channels, 10x10x10 volume (3 spatial dims)
patches = unfoldNd(x, kernel_size=3, dilation=1, padding=0, stride=1)
print(patches.shape)  # torch.Size([1, 81, 512]) -- 3 channels * 27 kernel elements, 8**3 positions

# Fold back with the same parameters (overlapping patches are summed, not averaged)
x_reconstructed = foldNd(patches, output_size=(10,10,10), kernel_size=3, dilation=1, padding=0, stride=1)
print(x_reconstructed.shape)  # torch.Size([1, 3, 10, 10, 10])