Libify: Import Databricks Notebooks as Libraries/Modules
Libify (version 0.78) is a Python library that simplifies importing Databricks notebooks as modules. It supports nested notebook imports for complex workflows and runs on Databricks Runtime 5.5 and above. The library was last released in September 2020, indicating an infrequent release cadence and a focus on maintaining existing functionality rather than active development.
Common errors
- TypeError: 'module' object is not callable
  Cause: Attempting to call the imported notebook module directly instead of its functions/variables (e.g., `my_module()` instead of `my_module.function_name()`).
  Fix: Access specific attributes (functions, variables, classes) from the imported notebook using dot notation (e.g., `my_module.my_function()` or `my_module.my_variable`).
- AttributeError: module 'libify' has no attribute 'importer'
  Cause: The `libify` package is not correctly installed on the Databricks cluster, or the `import libify` statement failed. This can also happen if a file named `libify.py` in your workspace is shadowing the actual package.
  Fix: Verify that `libify` is installed on your cluster (see the installation steps) and check your workspace for conflicting files named `libify.py`.
- Notebook functions/variables are not accessible via the imported module (e.g., `AttributeError: 'module' object has no attribute 'my_function'`)
  Cause: The importee notebook either called `dbutils.notebook.exit()` prematurely, or `libify.exporter(globals())` was not the last and only statement in its final cell, so the global state was not captured.
  Fix: Ensure the importee notebook does not use `dbutils.notebook.exit()` and that `libify.exporter(globals())` is the *only* code in the *last* cell of the importee notebook.
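The first error above can be reproduced with a plain module object. This sketch uses `types.ModuleType` as a stand-in for the object `libify.importer` returns (an assumption made so the snippet runs outside Databricks); it shows why calling the module itself fails while dot-notation access works:

```python
import types

# Stand-in for an imported notebook module (illustration only; on Databricks
# this object would come from libify.importer).
my_module = types.ModuleType("my_module")
my_module.my_function = lambda: "result"

try:
    my_module()  # wrong: module objects are not callable
except TypeError as exc:
    print(f"TypeError: {exc}")

print(my_module.my_function())  # right: access attributes via dot notation
```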
Warnings
- gotcha The `dbutils.notebook.exit()` function must NOT be used in notebooks that are intended to be imported using `libify`. Its presence will prevent `libify` from properly exporting the notebook's global state.
- gotcha The `libify.exporter(globals())` call in the 'importee' notebook must be the absolute last cell and contain ONLY that specific code. Any other code or comments in that final cell will prevent correct module export.
- deprecated Databricks introduced 'Files in Repos' functionality (available since late 2021 / Feb 2023 update) which allows direct import of Python files as modules. This native feature might offer a more streamlined approach for new projects compared to `libify`, especially for standard Python modules.
- gotcha On Databricks Community Cloud, the installation and setup for `libify` needs to be rerun each time a cluster is created or restarted for imports to function correctly.
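For comparison with the 'Files in Repos' note above: that feature relies on ordinary Python module resolution, since Databricks places the repo root on `sys.path`. This runnable sketch simulates the same mechanism locally with a temporary directory and a hypothetical `helpers.py` (both names are assumptions for illustration):

```python
import pathlib
import sys
import tempfile

# Simulate a repo checkout: a plain .py file in a directory on sys.path
# becomes importable, which is all 'Files in Repos' needs -- no libify.
repo_root = pathlib.Path(tempfile.mkdtemp())
(repo_root / "helpers.py").write_text("def add(a, b):\n    return a + b\n")

sys.path.insert(0, str(repo_root))
import helpers  # direct import of the .py file as a module

print(helpers.add(5, 3))  # 8
```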
Install
- pip install libify
- Via the cluster UI: 1. Click the 'Clusters' icon in the sidebar. 2. Click the name of a running cluster. 3. Click the 'Libraries' tab. 4. Click 'Install New'. 5. Under 'Library Source', choose 'PyPI'. 6. Under 'Package', enter 'libify'. 7. Click 'Install'.
Imports
- libify
  import libify
- exporter
  import libify
  libify.exporter(globals())
- importer
  import libify
  imported_module = libify.importer(globals(), '/path/to/your/notebook')
Quickstart
# --- In the notebook to be imported (e.g., '/Users/my_user/my_functions') ---
def greet(name):
    return f"Hello, {name} from imported notebook!"

def add(a, b):
    return a + b

# The following must be the LAST cell in the notebook and contain ONLY this code.
import libify
libify.exporter(globals())
# --- In the importing notebook (e.g., '/Users/my_user/main_workflow') ---
import libify
# The path should be relative to the Databricks workspace root or absolute DBFS path
my_module = libify.importer(globals(), '/Users/my_user/my_functions')
# Now you can access functions/variables defined in 'my_functions' notebook
message = my_module.greet("World")
print(message)
result = my_module.add(5, 3)
print(f"The sum is: {result}")
# Expected output in the importing notebook:
# Hello, World from imported notebook!
# The sum is: 8
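For intuition only, the exporter/importer pattern amounts to repackaging a captured namespace as a module object whose attributes are reached via dot notation. The sketch below is an assumption-laden illustration of that general idea, not libify's actual implementation (`make_module` is a hypothetical helper):

```python
import types

# Conceptual sketch only -- NOT libify's real implementation. It repackages
# a dict of names (standing in for a notebook's globals()) as a module.
def make_module(name, namespace):
    mod = types.ModuleType(name)
    for key, value in namespace.items():
        if not key.startswith("_"):
            setattr(mod, key, value)
    return mod

# "Importee" side: a definition plus the captured namespace.
def greet(name):
    return f"Hello, {name} from imported notebook!"

my_module = make_module("my_functions", {"greet": greet})

# "Importer" side: access via dot notation, as in the Quickstart above.
print(my_module.greet("World"))  # Hello, World from imported notebook!
```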