{"id":24729,"library":"transformer-lens","title":"TransformerLens","description":"A library for training and analysing transformer models, focused on mechanistic interpretability. Provides tools to probe, edit, and visualise model internals. Current version 3.1.0, released with support for Python >=3.10, <4.0. Under active development.","status":"active","version":"3.1.0","language":"python","source_language":"en","source_url":"https://github.com/TransformerLensOrg/TransformerLens","tags":["transformers","mechanistic interpretability","neural networks","deep learning"],"install":[{"cmd":"pip install transformer-lens","lang":"bash","label":"Install from PyPI"}],"dependencies":[],"imports":[{"note":"HookPoint is defined in transformer_lens.components, not top-level.","wrong":"from transformer_lens import HookPoint","symbol":"HookPoint","correct":"from transformer_lens.components import HookPoint"},{"note":"Correct top-level import.","symbol":"HookedTransformer","correct":"from transformer_lens import HookedTransformer"}],"quickstart":{"code":"from transformer_lens import HookedTransformer\n\nmodel = HookedTransformer.from_pretrained(\"tiny-stories-1L-21M\")\nprompt = \"The capital of France is\"\nlogits = model(prompt)  # shape: (batch, pos, d_vocab)\nnext_token = logits[0, -1].argmax(dim=-1)  # most likely token at the final position\nprint(model.tokenizer.decode(next_token))","lang":"python","description":"Load a pretrained model, run a forward pass, and decode the most likely next token."},"warnings":[{"fix":"Change all imports to use underscores: 'import transformer_lens'.","message":"In version 3.0, the package was renamed from 'transformer-lens' (hyphen) to 'transformer_lens' (underscore) for imports. Old imports using 'transformer-lens' (with hyphen) will fail.","severity":"breaking","affected_versions":">=3.0.0"},{"fix":"Check the changelog for the new location of utility functions.","message":"The old 'utils' module (transformer_lens.utils) has been deprecated in favour of individual submodules. Functions like 'to_numpy' have moved.","severity":"deprecated","affected_versions":">=3.0.0"},{"fix":"Cache tensors are of shape (batch, pos, d_model). Print the cached tensor's .shape to verify before indexing.","message":"When using 'model.run_with_cache', the returned cache is a dictionary keyed by hook names, but the tensor dimensions can be counterintuitive (batch, pos, d_model). Ensure you permute correctly for visualisation.","severity":"gotcha","affected_versions":"all"}],"env_vars":null,"last_verified":"2026-05-01T00:00:00.000Z","next_check":"2026-07-30T00:00:00.000Z","problems":[{"fix":"Install with 'pip install transformer-lens' (hyphen) and import with 'import transformer_lens' (underscore).","cause":"Confusing the install name (hyphen) with the import name (underscore), or the package is not installed in the active environment.","error":"ModuleNotFoundError: No module named 'transformer_lens'"},{"fix":"Use specific submodules like 'transformer_lens.loading_from_pretrained' for loading utils.","cause":"The 'utils' module was removed in version 3.0.","error":"AttributeError: module 'transformer_lens' has no attribute 'utils'"},{"fix":"Print the keys of the cache dictionary to see the exact names: print(cache.keys())","cause":"The cache key may have a different naming format depending on the model and version.","error":"KeyError: 'blocks.0.hook_mlp' (when accessing cache)"}],"ecosystem":"pypi","meta_description":null,"install_score":null,"install_tag":null,"quickstart_score":null,"quickstart_tag":null}