Torch Einops Utils
torch-einops-utils is a collection of personal utility functions for PyTorch and einops, providing convenient abstractions for common tensor manipulations. It is currently at version 0.0.30 and is updated frequently, so expect rapid, sometimes breaking, iteration.
Common errors
- ImportError: cannot import name 'EinopsToAndFrom' from 'torch_einops_utils'
  - Cause: the symbol does not exist in your installed version of `torch-einops-utils`, is misspelled, or the library is not installed correctly. This is common given the frequent `0.0.x` API changes.
  - Fix: check the exact spelling of the imported symbol. If it is correct, make sure your installed `torch-einops-utils` matches the version your code expects: `pip install --upgrade torch-einops-utils` or `pip install torch-einops-utils==<expected_version>`.
- TypeError: EinopsToAndFrom.__init__ missing 1 required positional argument: 'pattern_in'
  - Cause: classes like `EinopsToAndFrom` and `EinopsToNoOp` require specific `einops` patterns (e.g. `pattern_in`, `pattern_out`) and a callable or `nn.Module` instance at initialization.
  - Fix: instantiate the class with all required positional arguments. For `EinopsToAndFrom`, provide `pattern_in`, `pattern_out`, and the callable `fn`: `EinopsToAndFrom('b n d', 'b n d', nn.Identity())`.
Warnings
- breaking The API surface can change frequently and without explicit deprecation warnings across `0.0.x` releases, as this library is a collection of personal utilities in active development.
- gotcha This library heavily relies on `einops` syntax for defining tensor manipulations and assumes a strong understanding of PyTorch tensors.
- gotcha As a collection of personal utilities, its design choices may be opinionated or tailored to specific use cases, which can lead to unexpected behavior when used outside its intended scope.
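Given the breaking-change warning above, pinning the exact release in your dependency file is the safest option. A minimal `requirements.txt` fragment, assuming the 0.0.30 release mentioned above:

```text
# requirements.txt — pin the exact 0.0.x release to avoid silent API breaks
torch-einops-utils==0.0.30
```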
Install
- pip install torch-einops-utils
Imports
- EinopsToAndFrom
from torch_einops_utils import EinopsToAndFrom
- EinopsToNoOp
from torch_einops_utils import EinopsToNoOp
- Rearrange
from torch_einops_utils import Rearrange
- Reduce
from torch_einops_utils import Reduce
- rearrange_many
from torch_einops_utils import rearrange_many
- repeat_many
from torch_einops_utils import repeat_many
Quickstart
import torch
from torch import nn
from torch_einops_utils import EinopsToAndFrom
class Foo(EinopsToAndFrom):
    def __init__(self, fn):
        # EinopsToAndFrom takes an input pattern, an output pattern,
        # and a callable / nn.Module to run between them
        super().__init__('b n d', 'b n d', fn)

    def forward(self, x):
        # `fn` is normally invoked inside EinopsToAndFrom's own forward,
        # after rearranging with the input pattern and before rearranging
        # back with the output pattern. Here 'b n d' -> 'b n d' is a no-op
        # rearrangement, so calling `fn` directly on the input is equivalent.
        return self.fn(x)
# Example usage with a simple nn.Identity
model = Foo(nn.Identity())
x = torch.randn(1, 10, 32) # Batch, Sequence Length, Dimension
y = model(x)
print(f"Input shape: {x.shape}")
print(f"Output shape: {y.shape}")
# Example with a lambda function
dummy_fn = lambda z: z * 2 # Multiply by 2
model_lambda = Foo(dummy_fn)
y_lambda = model_lambda(x)
print(f"Output with lambda: {y_lambda.shape}")
print(f"First element value: {y_lambda[0,0,0]:.2f}")