diffq-fixed

A differentiable quantization framework for PyTorch, forked to restore compatibility with Python 3.11 and newer. It provides differentiable quantization-aware training with custom quantizers and gradient approximations. Version 0.2.4 is current; there is no regular release cadence.

pip install diffq-fixed
error ModuleNotFoundError: No module named 'diffq_fixed'
cause The pip package is named 'diffq-fixed', but the module it installs is still 'diffq'; importing 'diffq_fixed' fails.
fix
Change import to: from diffq import DiffQuantizer
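A minimal side-by-side, with the failing import left as a comment:

# Fails: the distribution name 'diffq-fixed' is not the module name
# from diffq_fixed import DiffQuantizer

# Works: the module keeps the upstream name
from diffq import DiffQuantizer
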
error RuntimeError: Quantizer not initialized
cause Using the DiffQuantizer in a forward or training step before it has been initialized with the model.
fix
Ensure you pass the model to DiffQuantizer constructor: quantizer = DiffQuantizer(model, ...)
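A minimal sketch of the correct construction order, assuming diffq-fixed keeps upstream diffq's constructor (model as the first positional argument):

import torch
from diffq import DiffQuantizer

model = torch.nn.Linear(10, 5)
# Pass the model at construction time; the quantizer attaches to its parameters.
quantizer = DiffQuantizer(model)
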
gotcha Import path uses 'diffq' not 'diffq_fixed' even though the package is installed as 'diffq-fixed'.
fix Use 'from diffq import DiffQuantizer' instead of 'from diffq_fixed import ...'
breaking This fork may not be compatible with the original diffq's serialization format (model checkpoints); checkpoints saved with the original library could fail to load.
fix Re-train or re-quantize models using diffq-fixed from scratch, or check the release notes for any conversion tools.

Initialize a DiffQuantizer around a model and run a single training step with the differentiable model-size penalty.

import torch
from diffq import DiffQuantizer

model = torch.nn.Linear(10, 5)
quantizer = DiffQuantizer(model, init_bits=8)  # attaches trainable bit-depth parameters
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
quantizer.setup_optimizer(optimizer)  # register the quantizer's own parameters with the optimizer

penalty = 5  # weight of the model-size term in the loss
x = torch.randn(4, 10)
y = model(x)
loss = y.sum() + penalty * quantizer.model_size()  # model_size() is differentiable (in MB)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print("Training step completed")