{"id":23329,"library":"axial-positional-embedding","title":"Axial Positional Embedding","description":"Implementation of Axial Positional Embedding, as described in 'Axial Attention in Multidimensional Transformers' (Ho et al.). The library provides a simple way to add positional embeddings to transformer models in an axial (factorized) manner, reducing the number of parameters needed to cover long sequences. Current version 0.3.12, supports Python >=3.8.","status":"active","version":"0.3.12","language":"python","source_language":"en","source_url":"https://github.com/lucidrains/axial_positional_embedding","tags":["positional-encoding","transformer","axial-attention","pytorch"],"install":[{"cmd":"pip install axial-positional-embedding","lang":"bash","label":"Install from PyPI"}],"dependencies":[{"reason":"Core dependency; the library is built on PyTorch tensors and modules.","package":"torch","optional":false},{"reason":"Used for tensor rearrangement inside the positional embedding.","package":"einops","optional":false}],"imports":[{"note":"Common mistake: importing from a deep module path; the top-level import is correct.","wrong":"from axial_positional_embedding.axial_positional_embedding import AxialPositionalEmbedding","symbol":"AxialPositionalEmbedding","correct":"from axial_positional_embedding import AxialPositionalEmbedding"}],"quickstart":{"code":"import torch\nfrom axial_positional_embedding import AxialPositionalEmbedding\n\n# The embedding factorizes the maximum sequence length into axial dimensions\n# (here 16 * 16 = 256 positions), which keeps the parameter count small.\ndim = 128\npos_emb = AxialPositionalEmbedding(\n    dim = dim,\n    axial_shape = (16, 16),   # maximum sequence length = 16 * 16 = 256\n    axial_dims = (64, 64)     # optional; if given, the values must sum to dim\n)\n\ntokens = torch.randn(2, 256, dim)  # (batch, seq_len, dim), seq_len <= 256\ntokens = pos_emb(tokens) + tokens  # forward returns the positional embedding; add it to the tokens\nprint(tokens.shape)  # torch.Size([2, 256, 128])","lang":"python","description":"Basic usage: create an axial positional embedding whose axial_shape factorizes the maximum sequence length, then add the returned embedding to the token embeddings."},"warnings":[{"fix":"For image-shaped data in PyTorch's usual NCHW format, flatten the spatial dimensions into a sequence and move channels last before applying: x = x.permute(0, 2, 3, 1).reshape(b, h * w, c)","message":"Input tensor shape: AxialPositionalEmbedding operates on sequences of shape (batch, seq_len, dim), where seq_len must not exceed the product of axial_shape. It does not accept channel-first image tensors (batch, channels, height, width) directly.","severity":"gotcha","affected_versions":"all"},{"fix":"Ensure dim equals the last dimension of the input tensor; if axial_dims is specified explicitly, its values must sum to dim.","message":"Dimension mismatch: the 'dim' parameter must match the channel (last) dimension of the input tensor. Common mistake: constructing the layer with dim=128 while the input has 64 channels.","severity":"gotcha","affected_versions":"all"},{"fix":"Check the GitHub repository for recent activity; if the package no longer fits your needs, the axial positional embeddings built into Hugging Face Transformers' Reformer implementation are an alternative.","message":"The library does not actively deprecate features, but this standalone package may see infrequent updates.","severity":"deprecated","affected_versions":"0.3.x"}],"env_vars":null,"last_verified":"2026-05-01T00:00:00.000Z","next_check":"2026-07-30T00:00:00.000Z","problems":[{"fix":"Use: from axial_positional_embedding import AxialPositionalEmbedding","cause":"Incorrect import path; the class must be imported from the top-level package, not a submodule.","error":"ImportError: cannot import name 'AxialPositionalEmbedding'"},{"fix":"Ensure the input has shape (batch, seq_len, dim), with dim matching the layer's dim and seq_len no larger than the product of axial_shape.","cause":"Input shape mismatch: the positional embedding is built for a fixed dim and maximum sequence length, so it cannot broadcast against a tensor whose dimensions do not align.","error":"RuntimeError: The expanded size of the tensor must match the existing size"},{"fix":"Either omit axial_dims (each axis then uses the full dim and the embeddings are summed) or pass values that sum exactly to dim; exact assertion wording may vary by version.","cause":"axial_dims was specified explicitly but its entries do not add up to dim.","error":"AssertionError: axial dimensions must sum up to the target dimension"}],"ecosystem":"pypi","meta_description":null,"install_score":null,"install_tag":null,"quickstart_score":null,"quickstart_tag":null}