Fickling
Fickling is a static analyzer and interpreter for Python pickle data. It identifies dangerous modules, functions, and attributes used within pickle files, helping to catch arbitrary-code-execution payloads before untrusted data is ever unpickled. The current version is 0.1.10, and the project maintains an active release cadence, frequently shipping security updates and expanded blocklists.
Warnings
- breaking Python 3.9 is no longer supported; the minimum required version is now Python 3.10.
- breaking CLI exit codes have changed to follow ClamAV conventions: `0` for clean, `1` for unsafe, and `2` for errors.
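If you script around the CLI, these ClamAV-style exit codes are what you branch on. A minimal sketch of mapping them to labels (the `classify_exit` helper is hypothetical, not part of Fickling; the codes themselves are the documented convention):

```python
# ClamAV-style exit codes used by the fickling CLI:
#   0 = clean, 1 = unsafe, 2 = error
EXIT_LABELS = {0: "clean", 1: "unsafe", 2: "error"}

def classify_exit(code: int) -> str:
    """Translate a fickling CLI exit code into a human-readable label."""
    return EXIT_LABELS.get(code, "unknown")

print(classify_exit(0))  # clean
print(classify_exit(1))  # unsafe
```

In CI, this means a nonzero exit code no longer necessarily indicates a tool failure: `1` is a successful scan that found something, while `2` signals an actual error.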
- breaking Malformed pickle data now raises `fickling.errors.InterpretationError` instead of `ValueError`.
- gotcha The internal blocklist of unsafe modules, functions, and attributes has been expanded significantly across releases to address new bypasses. Older versions may fail to flag unpickling attacks that newer versions detect, so keep Fickling up to date.
- gotcha `torch` is an optional dependency. If you intend to analyze PyTorch models, you must install `fickling[pytorch]` or `torch` separately, as it is not included in the base installation.
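Because `torch` is optional, code that may handle PyTorch models can guard the import and degrade gracefully. A minimal sketch of that pattern (the `can_analyze_pytorch` helper is illustrative, not a Fickling API):

```python
# Guarded import: torch is only present with fickling[pytorch]
# or a separate `pip install torch`.
try:
    import torch  # noqa: F401
    HAVE_TORCH = True
except ImportError:
    HAVE_TORCH = False

def can_analyze_pytorch() -> bool:
    """Report whether PyTorch model analysis is available in this environment."""
    return HAVE_TORCH

print(f"PyTorch analysis available: {can_analyze_pytorch()}")
```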
Install
- pip install fickling
- pip install fickling[pytorch]
Imports
- analyze_pickle
from fickling.analysis import analyze_pickle
- interpret_pickle
from fickling.interpretation import interpret_pickle
- UnsafeError
from fickling.errors import UnsafeError
Quickstart
import os
import pickle

from fickling.analysis import analyze_pickle
from fickling.errors import UnsafeError


def analyze_and_report(data: bytes) -> None:
    """Run Fickling's static analysis on pickled bytes and print any violations."""
    try:
        results = analyze_pickle(data)
        if results.is_safe():
            print("Pickle is safe. No violations found.")
        else:
            print("Pickle is potentially unsafe. Violations:")
            for violation in results.violations:
                print(f"- {violation.severity.name}: {violation.message}")
    except UnsafeError as e:
        print(f"Analysis detected an unsafe pickle: {e}")
    except Exception as e:
        print(f"An unexpected error occurred during analysis: {e}")


# Create a benign pickle (for demonstration)
class MyObject:
    def __init__(self, name):
        self.name = name


print("\n--- Analyzing benign pickle ---")
analyze_and_report(pickle.dumps(MyObject("safe_data")))


# Example of a potentially unsafe pickle (e.g., using os.system).
# NOTE: Do NOT run this with untrusted data in production!
# This is purely illustrative of what Fickling detects.
class MaliciousObject:
    def __reduce__(self):
        return (os.system, ("echo malicious command executed!",))


print("\n--- Analyzing potentially unsafe pickle ---")
analyze_and_report(pickle.dumps(MaliciousObject()))
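The danger the quickstart illustrates comes from pickle's `__reduce__` protocol: unpickling calls whatever callable the payload names, with the arguments it supplies. A self-contained demonstration using the harmless builtin `len` as a stand-in for `os.system`:

```python
import pickle

class Payload:
    def __reduce__(self):
        # At unpickling time, pickle calls len("pwned") instead of
        # reconstructing a Payload instance -- a benign stand-in
        # for a call like os.system("...").
        return (len, ("pwned",))

data = pickle.dumps(Payload())
result = pickle.loads(data)
print(result)  # 5 -- loads() executed len(), no Payload was rebuilt
```

This is exactly why `pickle.loads` on untrusted bytes is code execution, and why Fickling analyzes the opcode stream statically instead of unpickling it.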