cotengra - Hyper optimized tensor network contraction

0.7.5 · active · verified Fri Apr 17

cotengra is a Python library for hyper-optimized contraction of large tensor networks and einsum expressions. It provides advanced pathfinding algorithms, driven by hyper-optimization, that minimize computational cost (FLOPs and peak memory). The current version is 0.7.5, and the library is actively maintained, with regular releases focused on performance, new optimization strategies, and bug fixes.

Install

pip install cotengra

Imports

import cotengra as ctg

Quickstart

This quickstart demonstrates how to define a tensor network using an einsum expression and array shapes, search for a good `ContractionTree` with cotengra's hyper-optimizer, and then perform the contraction. It also shows how to get the resulting shape and the optimized path.

import cotengra as ctg
import numpy as np

# Define an einsum expression and corresponding tensor shapes
eq = 'ijkl,klmn,mnop->ijop'
shapes = [(2, 3, 4, 5), (4, 5, 6, 7), (6, 7, 8, 9)]
arrays = [np.random.rand(*s) for s in shapes]

# Build the inputs, output and index-size mapping the optimizer expects
lhs, output = eq.split('->')
inputs = lhs.split(',')
size_dict = {
    ix: d
    for term, shape in zip(inputs, shapes)
    for ix, d in zip(term, shape)
}

# Search for a contraction tree with cotengra's hyper-optimizer
# For real use, consider increasing max_time and max_repeats
opt = ctg.HyperOptimizer(max_time=5, max_repeats=16)
tree = opt.search(inputs, output, size_dict)

# Perform the contraction
result = tree.contract(arrays)

print("Contraction result shape:", result.shape)
print("Optimized path:", tree.get_path())
