Apache Airflow FTP Provider


The `apache-airflow-providers-ftp` package provides operators and hooks to interact with FTP and FTPS servers within Apache Airflow DAGs. It enables tasks like uploading, downloading, and deleting files from remote FTP/FTPS locations. The current version is 3.14.2, and provider packages release independently from core Airflow, typically for bug fixes, new features, or compatibility with new Airflow versions.

Install

pip install apache-airflow-providers-ftp

Imports

from airflow.providers.ftp.hooks.ftp import FTPHook, FTPSHook
from airflow.providers.ftp.operators.ftp import FTPFileTransmitOperator
from airflow.providers.ftp.sensors.ftp import FTPSensor

Quickstart

This quickstart defines an Airflow DAG that uploads a file to an FTP server using the `FTPFileTransmitOperator`. It first creates a local dummy file, then uploads it to the configured FTP server, and finally removes the local file. Before running it, configure a connection of type FTP in the Airflow UI with `conn_id='ftp_default'` and the appropriate host, login, and password.
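As an alternative to the UI, Airflow can also pick up a connection from an environment variable named `AIRFLOW_CONN_<CONN_ID>` holding a connection URI. A minimal sketch, where the host and credentials are placeholders:

```shell
# Define the ftp_default connection as a URI via environment variable
# (AIRFLOW_CONN_<CONN_ID> convention, uppercased).
# Host, username, and password here are placeholders.
export AIRFLOW_CONN_FTP_DEFAULT='ftp://your_username:your_password@ftp.example.com:21'
```

This must be set in the environment of the scheduler and workers for the DAG below to resolve `ftp_default` without a UI-defined connection.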

from __future__ import annotations

import pendulum

from airflow.decorators import dag
from airflow.operators.bash import BashOperator
from airflow.providers.ftp.operators.ftp import FTPFileTransmitOperator

# Ensure you have an FTP connection configured in Airflow UI (Admin -> Connections)
# with conn_id='ftp_default'.
# Set: Conn Id = ftp_default, Conn Type = FTP
# Host: your_ftp_host (e.g., 'localhost' or 'ftp.example.com')
# Port: 21 (FTPS via FTPSHook uses explicit TLS, also on port 21;
# SFTP on port 22 requires the separate SFTP provider)
# Login: your_username
# Password: your_password

@dag(
    dag_id="ftp_example_dag",
    start_date=pendulum.datetime(2023, 10, 26, tz="UTC"),
    catchup=False,
    schedule=None,
    tags=["ftp", "example", "provider"],
)
def ftp_dag():
    # Create a dummy local file to upload
    create_local_file = BashOperator(
        task_id="create_local_file",
        bash_command="echo 'Hello from Airflow FTP provider!' > /tmp/airflow_ftp_test.txt"
    )

    # Upload the file to FTP
    upload_file_to_ftp = FTPFileTransmitOperator(
        task_id="upload_file_to_ftp",
        ftp_conn_id="ftp_default",
        local_filepath="/tmp/airflow_ftp_test.txt",
        remote_filepath="/remote_airflow_test.txt",
        operation="put",  # 'put' uploads, 'get' downloads
        create_intermediate_dirs=True,
    )

    # Clean up the local dummy file
    clean_local_file = BashOperator(
        task_id="clean_local_file",
        bash_command="rm /tmp/airflow_ftp_test.txt"
    )

    create_local_file >> upload_file_to_ftp >> clean_local_file

ftp_dag()
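The `operation` parameter selects upload (`"put"`) or download (`"get"`). A minimal sketch of that dispatch, using a recording stand-in for `FTPHook` (whose `store_file`/`retrieve_file` methods take the remote path first); the `transfer` helper and `RecordingHook` class are illustrative, not part of the provider:

```python
from enum import Enum


class FTPOperation(str, Enum):
    """Operation names accepted by FTPFileTransmitOperator."""
    PUT = "put"
    GET = "get"


class RecordingHook:
    """Stand-in for FTPHook that records calls instead of
    talking to a server, so the dispatch logic can be exercised."""
    def __init__(self):
        self.calls = []

    def store_file(self, remote_path, local_path):
        self.calls.append(("store_file", remote_path, local_path))

    def retrieve_file(self, remote_path, local_path):
        self.calls.append(("retrieve_file", remote_path, local_path))


def transfer(hook, operation, local_filepath, remote_filepath):
    """Dispatch on the requested operation, mirroring the
    put/get branch the operator performs at execute time."""
    if operation == FTPOperation.PUT:
        hook.store_file(remote_filepath, local_filepath)
    elif operation == FTPOperation.GET:
        hook.retrieve_file(remote_filepath, local_filepath)
    else:
        raise ValueError(f"unsupported operation: {operation!r}")
```

Because `FTPOperation` subclasses `str`, plain strings such as `"put"` compare equal to the enum members, which is why the operator accepts either form.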
