SparkMagic: Spark execution via Livy
SparkMagic is a Jupyter/IPython magic command extension for interactively working with Spark clusters via Livy (REST API). It supports multiple sessions, language interpreters (PySpark, Spark, Scala, SQL), and automatic visualization of DataFrames. Current version is 0.23.0 (released 2025-02-14) with active but low-frequency releases; the project is in maintenance mode.
pip install sparkmagic

Common errors

error: ModuleNotFoundError: No module named 'sparkmagic'
cause: sparkmagic is not installed in the current Python environment.
fix: Run `pip install sparkmagic` in the terminal and restart the kernel.

error: ImportError: cannot import name 'SparkMagic' from 'sparkmagic'
cause: Incorrect import path; sparkmagic provides a magic extension, not a directly importable class.
fix: Use `%load_ext sparkmagic.magics` in a notebook cell instead of a Python import.

error: TypeError: 'NoneType' object is not subscriptable
cause: Livy endpoint not configured or not responding; the extension returns None for session info.
fix: Check your Livy URL and ensure the Livy server is running. Use `%manage_spark` to set the endpoint.

Warnings
breaking: In version 0.20.0, the extension changed to support async kernel execution on ipykernel>=6. Custom magic commands or code that assumes synchronous execution may break.
fix: Update any custom code to handle async execution; test with simple cells first.
deprecated: Python 3.6 support was dropped in v0.20.0. Users on Python 3.6 must stay on v0.19.2 or lower.
fix: Upgrade to Python 3.7 or later.
gotcha: You must have a running Livy server reachable from the notebook server; the extension does not start Livy for you.
fix: Ensure Livy is installed and running, and configure the endpoint via %manage_spark or the config file.
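Before troubleshooting the magics themselves, it can save time to probe whether the Livy endpoint is reachable at all. This is a standard-library sketch: the GET /sessions route comes from Livy's REST API, while the helper names and the localhost URL are illustrative assumptions.

```python
import json
from urllib.error import URLError
from urllib.request import urlopen

def livy_sessions_url(endpoint: str) -> str:
    """Build the URL for Livy's GET /sessions route from a base endpoint."""
    return endpoint.rstrip("/") + "/sessions"

def check_livy(endpoint: str, timeout: float = 5.0):
    """Return the parsed /sessions response, or None if Livy is unreachable."""
    try:
        with urlopen(livy_sessions_url(endpoint), timeout=timeout) as resp:
            return json.loads(resp.read().decode())
    except (URLError, OSError):
        return None
```

If `check_livy("http://localhost:8998")` returns None, fix the server or the URL before debugging anything in the notebook.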
gotcha: The %manage_spark widget may not display correctly in JupyterLab without the appropriate JupyterLab extension (jupyterlab-sparkmagic).
fix: Install the JupyterLab extension: `jupyter labextension install jupyterlab-sparkmagic`
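Instead of setting the endpoint through the %manage_spark widget each time, it can be preconfigured in sparkmagic's config file at `~/.sparkmagic/config.json`. A minimal sketch, with key names taken from sparkmagic's example config and placeholder values:

```
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://localhost:8998",
    "auth": "None"
  },
  "livy_session_startup_timeout_seconds": 60
}
```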
Imports
- sparkmagic
import sparkmagic
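Direct imports of sparkmagic are rarely needed, since the package is driven through its magics. To confirm the install that the ModuleNotFoundError fix above depends on, a standard-library check can be used (the helper name is illustrative):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(package: str):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Prints e.g. "0.23.0" when sparkmagic is installed, None otherwise.
print(installed_version("sparkmagic"))
```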
Quickstart
# In a Jupyter notebook cell, load the magics and open the session-management widget:
%load_ext sparkmagic.magics
%manage_spark

# After creating a session named "mysession" in the widget, run code in it
# with the %%spark cell magic (the code below executes on the remote cluster):
%%spark -s mysession
df = spark.range(10)
print(df.count())
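Results can also be pulled back to the local kernel: the %spark magic's -c sql and -o flags run a SQL query remotely and copy the result into a local pandas DataFrame. The session and variable names below are illustrative; check `%spark?` for the full option list.

```
%%spark -c sql -s mysession -o local_df
SELECT id FROM range(10)
```

After the cell runs, local_df is a pandas DataFrame in the local notebook process, ready for plotting or inspection.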