Multiple Object Tracking Metrics (motmetrics)
motmetrics is a Python library providing a comprehensive suite of metrics for benchmarking multiple object tracking (MOT) algorithms. It simplifies tracker evaluation by handling the association between ground truth and hypothesis data and computing standard metrics such as MOTA, MOTP, and more. The current version is 1.4.0; releases are infrequent but the project is actively maintained, with a focus on maintenance and bug fixes.
Common errors
- AttributeError: module 'motmetrics' has no attribute 'create'
  cause: The `create` function for instantiating a metrics host does not live in the top-level `motmetrics` module.
  fix: Import it from the `metrics` submodule (`from motmetrics.metrics import create`), or access it as `mm.metrics.create()` after `import motmetrics as mm`.
- TypeError: 'numpy.ndarray' object is not callable
  cause: This typically occurs when calling `iou_matrix` or another distance function with the wrong bounding-box arguments, or when an array is mistakenly treated as a function.
  fix: Pass numpy arrays of shape `(N, 4)` (N bounding boxes in `xywh` format) as the first two arguments to `iou_matrix`; do not attempt to call an array object itself.
- ValueError: operands could not be broadcast together with shapes (X) (Y)
  cause: This commonly arises inside `acc.update()` when the number of ground truth IDs, the number of hypothesis IDs, and the dimensions of the distance matrix do not align.
  fix: For `acc.update(gt_ids, ts_ids, C)`, verify that the number of ground truth IDs matches the rows of the distance matrix (`gt_ids.shape[0] == C.shape[0]`) and the number of hypothesis IDs matches its columns (`ts_ids.shape[0] == C.shape[1]`). The distance matrix `C` must match the per-frame counts of ground truth and hypothesis objects exactly.
Warnings
- deprecated Older versions of motmetrics (prior to 1.4.0) may emit `FutureWarning`/`DeprecationWarning` messages about the deprecated `np.bool` alias when used with newer NumPy versions.
- gotcha Version 1.4.0 corrected how `motmetrics.utils.compare_to_groundtruth` handles Euclidean distances. Users explicitly relying on that distance metric with older motmetrics versions may see subtly different, and potentially incorrect, results.
- gotcha Bounding box formats are crucial. `iou_matrix` and related distance functions primarily expect bounding boxes in `[x, y, width, height]` format. Misinterpreting this (e.g., using `x1, y1, x2, y2`) will lead to incorrect distance calculations and subsequently wrong metric results.
- gotcha When processing data frame-by-frame, it's critical to ensure that object IDs within a single frame are unique for both ground truth and hypotheses. Duplicate IDs within a frame can lead to unexpected behavior or incorrect associations.
Install
-
pip install motmetrics
Imports
- MOTAccumulator
from motmetrics import MOTAccumulator
- iou_matrix
from motmetrics.distances import iou_matrix
- create
from motmetrics.metrics import create
Quickstart
import motmetrics as mm
import numpy as np

# Dummy data: ground truth and tracker hypotheses for two frames
# Row format: [id, x, y, width, height]
gt_frame1 = np.array([[1, 10, 10, 5, 5], [2, 20, 20, 5, 5]])
ts_frame1 = np.array([[1, 10, 10, 5, 5], [2, 20, 20, 5, 5]])
gt_frame2 = np.array([[1, 11, 11, 5, 5], [2, 21, 21, 5, 5]])
ts_frame2 = np.array([[1, 11, 11, 5, 5], [2, 21, 21, 5, 5]])

# Create an accumulator to store per-frame tracking events
acc = mm.MOTAccumulator(auto_id=True)

# Process frame 1: compute IoU distances (1 - IoU) between ground truth
# and hypotheses; bounding boxes are expected as [x, y, width, height]
C1 = mm.distances.iou_matrix(gt_frame1[:, 1:], ts_frame1[:, 1:], max_iou=0.5)
acc.update(gt_frame1[:, 0], ts_frame1[:, 0], C1)

# Process frame 2
C2 = mm.distances.iou_matrix(gt_frame2[:, 1:], ts_frame2[:, 1:], max_iou=0.5)
acc.update(gt_frame2[:, 0], ts_frame2[:, 0], C2)

# Compute and display metrics
mh = mm.metrics.create()
summary = mh.compute(acc, metrics=['mota', 'motp', 'num_frames'], name='dummy_tracking')
print(summary)