{"id":10286,"library":"tensorflow-model-analysis","title":"TensorFlow Model Analysis","description":"TensorFlow Model Analysis (TFMA) is a library for performing deep analysis and evaluation of TensorFlow models, especially useful for understanding model performance on different data slices. It is built on Apache Beam, enabling scalable analysis. The current version is 0.48.0, and it follows a regular release cadence, often aligning with TensorFlow and TFX releases.","status":"active","version":"0.48.0","language":"en","source_language":"en","source_url":"https://github.com/tensorflow/model-analysis","tags":["tensorflow","ml-evaluation","model-analysis","data-science","apache-beam"],"install":[{"cmd":"pip install tensorflow-model-analysis apache-beam[direct,gcp] tensorflow","lang":"bash","label":"Install TFMA with Beam dependencies"}],"dependencies":[{"reason":"Core machine learning framework that TFMA analyzes models from.","package":"tensorflow","optional":false},{"reason":"TFMA is built on Apache Beam; pipelines are executed by Beam. `[direct,gcp]` extras are common for local execution and cloud deployment.","package":"apache-beam","optional":false},{"reason":"Provides shared libraries for TFX components, including data parsing and handling crucial for TFMA.","package":"tfx-bsl","optional":false}],"imports":[{"symbol":"tfma","correct":"import tensorflow_model_analysis as tfma"}],"quickstart":{"code":"import tensorflow as tf\nimport tensorflow_model_analysis as tfma\nimport os\nimport shutil\n\n# 1. 
Create a simple Keras model and save it\n# This model expects a 'feature_1' input and outputs 'predictions'.\nclass SimplePredictionModel(tf.keras.Model):\n    def __init__(self):\n        super().__init__()\n        self.dense = tf.keras.layers.Dense(1, activation='sigmoid')\n\n    @tf.function(input_signature=[\n        tf.TensorSpec(shape=[None], dtype=tf.float32, name='feature_1')\n    ])\n    def serving_default(self, feature_1):\n        return {'predictions': self.dense(tf.expand_dims(feature_1, axis=-1))}\n\nmodel = SimplePredictionModel()\n# Initialize weights by calling the serving function once\n_ = model.serving_default(tf.constant([1.0, 2.0]))\n\nmodel_dir = '/tmp/tfma_quickstart_model'\nif os.path.exists(model_dir): shutil.rmtree(model_dir)\ntf.saved_model.save(model, model_dir, signatures={'serving_default': model.serving_default})\n\n# 2. Create dummy data as TFRecord (TFMA expects tf.train.Example protos)\ndata_path = '/tmp/tfma_quickstart_data.tfrecord'\nif os.path.exists(data_path): os.remove(data_path)\n\nexamples_proto = []\nfor i in range(10):\n    feature_val = float(i)\n    label_val = 1.0 if i % 2 == 0 else 0.0\n    example = tf.train.Example(features=tf.train.Features(feature={\n        'feature_1': tf.train.Feature(float_list=tf.train.FloatList(value=[feature_val])),\n        'label': tf.train.Feature(float_list=tf.train.FloatList(value=[label_val])),\n    }))\n    examples_proto.append(example.SerializeToString())\n\nwith tf.io.TFRecordWriter(data_path) as writer:\n    for ex in examples_proto:\n        writer.write(ex)\n\n# 3. 
Define EvalConfig\neval_config = tfma.EvalConfig(\n    model_specs=[tfma.ModelSpec(\n        signature_name='serving_default',\n        label_key='label',\n        prediction_key='predictions' # Key from model output dict\n    )],\n    metrics_specs=[\n        tfma.MetricsSpec(\n            metrics=[\n                tfma.MetricConfig(class_name='ExampleCount'),\n                # BinaryAccuracy thresholds the sigmoid output at 0.5;\n                # plain Accuracy would compare continuous scores to labels exactly.\n                tfma.MetricConfig(class_name='BinaryAccuracy')\n            ]\n        )\n    ],\n    slicing_specs=[\n        tfma.SlicingSpec(), # Overall slice\n        tfma.SlicingSpec(feature_keys=['feature_1']) # Slice by feature_1\n    ]\n)\n\n# 4. Run Model Analysis\noutput_dir = '/tmp/tfma_quickstart_output'\nif os.path.exists(output_dir): shutil.rmtree(output_dir)\n\nprint(f\"Running TFMA with model: {model_dir}, data: {data_path}, output: {output_dir}\")\nresults = tfma.run_model_analysis(\n    eval_shared_model=tfma.default_eval_shared_model(\n        eval_saved_model_path=model_dir, eval_config=eval_config),\n    data_location=data_path,\n    eval_config=eval_config,\n    output_path=output_dir,\n    # TFMA executes as an Apache Beam pipeline. For this local quickstart,\n    # the default DirectRunner is used. For cloud runners, pass a configured\n    # beam.options.pipeline_options.PipelineOptions via pipeline_options=...\n)\n\nprint(f\"TFMA analysis complete. Results written to: {output_dir}\")\n# To inspect results (e.g., in a Jupyter Notebook):\n# result = tfma.load_eval_result(output_dir)\n# tfma.view.render_slicing_metrics(result)\n","lang":"python","description":"This quickstart demonstrates how to set up a dummy TensorFlow model, create synthetic TFRecord data, define an `EvalConfig`, and run a basic model analysis with TFMA locally. It shows how to obtain overall and sliced metrics."},"warnings":[{"fix":"Always check the official `setup.py` or `pyproject.toml` for exact dependency ranges. 
Ensure your Python environment matches the `requires_python` range and that `tensorflow` and `apache-beam` versions are compatible.","message":"TFMA has strict requirements on Python, TensorFlow, and Apache Beam versions. Upgrading one without considering the others can lead to installation or runtime errors. For instance, Python 3.8 support was dropped in 0.45.0, and each release pins narrow `tensorflow` and `apache-beam` ranges; check the 0.48.0 release notes for the exact pins.","severity":"breaking","affected_versions":"0.45.0 and later for Python, 0.47.0 and later for TensorFlow/Beam"},{"fix":"Pre-process your data into `tf.train.Example` protos and write them to TFRecord files. Libraries like `tfx_bsl.public.tfxio` can assist with this.","message":"TFMA expects input data in `tf.train.Example` format, typically stored in TFRecord files. It cannot directly consume CSV, JSON, or other raw formats without prior conversion.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Inspect your `SavedModel` using `saved_model_cli show` to confirm signature names and input/output tensor keys. Ensure `label_key` matches your TFRecord feature name and `prediction_key` matches the relevant output key from your model's prediction dictionary.","message":"Incorrectly configuring `ModelSpec` parameters (`signature_name`, `label_key`, `prediction_key`) is a common source of errors. These must precisely match your `SavedModel`'s serving signature and the feature/output keys.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Install `apache-beam` with the necessary extras, for example: `pip install apache-beam[direct,gcp]`. Ensure your Beam pipeline options (`pipeline_options` in `tfma.run_model_analysis`) are correctly configured for your chosen runner.","message":"Apache Beam (which TFMA uses for its pipelines) requires specific extras for different runners (e.g., `[direct]` for local, `[gcp]` for Dataflow). 
Missing these can cause 'ModuleNotFoundError' or runtime errors when trying to use a runner.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-17T00:00:00.000Z","next_check":"2026-07-16T00:00:00.000Z","problems":[{"fix":"Verify that your `tf.train.Example` protos contain the features specified in your `eval_config.slicing_specs` and `model_specs.label_key`. Also, ensure the `signature_name` and `prediction_key` in `ModelSpec` are correct so the model can make predictions.","cause":"The input data or model setup isn't correctly providing features or predictions that TFMA can use for the defined slicing specifications. This often means the `label_key` or features defined in `slicing_specs` do not exist in the parsed `tf.train.Example` protos.","error":"ValueError: No examples found for slicing spec"},{"fix":"Install Apache Beam: `pip install apache-beam`. If planning to use cloud runners like Dataflow, include the relevant extras, e.g., `pip install apache-beam[gcp]`.","cause":"`apache-beam` is a mandatory dependency for TFMA and was not installed or is not accessible in the current environment.","error":"ModuleNotFoundError: No module named 'apache_beam'"},{"fix":"Examine your `SavedModel`'s serving signature outputs (e.g., using `saved_model_cli show --dir <model_path> --tag_set serve`) and ensure `prediction_key` matches one of the output tensor names. If the model outputs a single tensor (not a dict), you might need to omit `prediction_key` or ensure the model wraps it in a dict with a specific key.","cause":"The `prediction_key` provided in `tfma.ModelSpec` does not correspond to a key in the dictionary output by the model's serving signature.","error":"The `prediction_key` specified in `ModelSpec` did not match any predictions in the model output. 
Available predictions: [...]"},{"fix":"Ensure your `tf.train.Example` features are correctly defined (e.g., using `tf.train.FloatList` for floats, `tf.train.BytesList` for strings) and that your `ModelSpec` (or any custom `input_fn` if used) correctly specifies how to parse these features into tensors for the model.","cause":"This error often occurs when `tf.Example` protos are not correctly parsed before being passed to the model, or when the feature spec (often inferred from the data or a provided schema) does not align with the actual data types in your `tf.train.Example`.","error":"AttributeError: 'tensorflow.python.framework.ops.EagerTensor' object has no attribute 'decode_raw'"}]}