{"id":1904,"library":"apache-airflow-providers-sftp","title":"Apache Airflow SFTP Provider","description":"This provider package enables Apache Airflow to interact with SFTP servers, facilitating secure file transfers and management within Airflow DAGs. It includes operators, hooks, and sensors for common SFTP operations such as getting, putting, and listing files. The current version is 5.7.2, and new versions are released regularly as part of the Apache Airflow provider release cycle.","status":"active","version":"5.7.2","language":"en","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/sftp","tags":["airflow","sftp","provider","file-transfer","ssh","etl"],"install":[{"cmd":"pip install apache-airflow-providers-sftp","lang":"bash","label":"Install core provider"}],"dependencies":[{"reason":"Minimum Airflow version supported by provider 5.x.x is 2.7.0+","package":"apache-airflow","optional":false},{"reason":"Required for underlying SSH capabilities, specifically >=4.0.0 for sftp provider >=5.0.0.","package":"apache-airflow-providers-ssh","optional":false},{"reason":"Core dependency for SSH/SFTP connections.","package":"paramiko","optional":false},{"reason":"Transitive dependency via the SSH provider, used for SSH tunneling.","package":"sshtunnel","optional":false},{"reason":"Optional extra for SFTP filesystem capabilities (`apache-airflow-providers-sftp[sshfs]`).","package":"sshfs","optional":true}],"imports":[{"symbol":"SFTPHook","correct":"from airflow.providers.sftp.hooks.sftp import SFTPHook"},{"symbol":"SFTPOperator","correct":"from airflow.providers.sftp.operators.sftp import SFTPOperator"},{"symbol":"SFTPOperation","correct":"from airflow.providers.sftp.operators.sftp import SFTPOperation"},{"symbol":"SFTPSensor","correct":"from airflow.providers.sftp.sensors.sftp import SFTPSensor"}],"quickstart":{"code":"import os\nfrom datetime import datetime\n\nfrom 
airflow import DAG\nfrom airflow.providers.sftp.operators.sftp import SFTPOperator, SFTPOperation\n\n# Connection id resolved from Airflow's connection store. The connection can be\n# defined in the Airflow UI or via an environment variable, e.g.:\n# export AIRFLOW_CONN_SFTP_DEFAULT='sftp://sftpuser:sftppassword@localhost:22/'\nSFTP_CONN_ID = 'sftp_default'\n\n# Optional override for the connection's host; port, username, and password\n# always come from the connection itself.\nREMOTE_HOST = os.environ.get('SFTP_REMOTE_HOST', 'localhost')\n\nwith DAG(\n    dag_id='sftp_example_dag',\n    start_date=datetime(2023, 1, 1),\n    schedule=None,  # `schedule_interval` is deprecated since Airflow 2.4\n    catchup=False,\n    tags=['sftp', 'example', 'file-transfer'],\n) as dag:\n    upload_file_task = SFTPOperator(\n        task_id='upload_local_to_sftp',\n        ssh_conn_id=SFTP_CONN_ID,\n        local_filepath='/tmp/local_file_to_upload.txt',\n        remote_filepath='/tmp/remote_uploaded_file.txt',\n        operation=SFTPOperation.PUT,\n        create_intermediate_dirs=True,\n        # Optional: remote_host=REMOTE_HOST overrides the host stored in the connection\n    )\n\n    download_file_task = SFTPOperator(\n        task_id='download_sftp_to_local',\n        ssh_conn_id=SFTP_CONN_ID,\n        local_filepath='/tmp/local_downloaded_file.txt',\n        remote_filepath='/tmp/remote_uploaded_file.txt',\n        operation=SFTPOperation.GET,\n    )\n\n    # For a real scenario, you'd create /tmp/local_file_to_upload.txt before running\n    # Example of creating the file:\n    # from airflow.operators.python import PythonOperator\n    # create_local_file 
= PythonOperator(\n    #    task_id='create_local_file',\n    #    python_callable=lambda: open('/tmp/local_file_to_upload.txt', 'w').write('Hello from Airflow!'),\n    # )\n    # create_local_file >> upload_file_task\n\n    upload_file_task >> download_file_task\n","lang":"python","description":"This quickstart demonstrates how to use the `SFTPOperator` to upload a file from a local path to an SFTP server and then download it back. It assumes an SFTP connection named `sftp_default` is configured in Airflow. For demonstration, the connection can be supplied via the `AIRFLOW_CONN_SFTP_DEFAULT` environment variable, though connections are typically managed in the Airflow UI."},"warnings":[{"fix":"Upgrade your Apache Airflow environment to at least version 2.7.0 before installing or upgrading this provider to 5.x.x. Check the provider's changelog for the exact version requirements.","message":"Provider version 5.x.x requires Apache Airflow 2.7.0+ (as of March 2026). Older provider versions had lower minimum Airflow requirements (e.g., provider 2.0.0 required Airflow 2.1.0+, provider 3.0.0 required Airflow 2.2.0+). Installing a newer provider on an older Airflow may automatically upgrade Airflow itself, potentially requiring a manual database migration.","severity":"breaking","affected_versions":">=5.0.0"},{"fix":"Instead of `SFTPHook.get_conn()`, use `SFTPHook.get_managed_conn()` when you need a `paramiko.SFTPClient` instance for direct interaction. Alternatively, pin the provider to `5.0.0` if direct `get_conn()` usage is critical and refactoring is not immediately feasible.","message":"As of `apache-airflow-providers-sftp` version 5.1.0, `SFTPHook.get_conn()` no longer directly returns a `paramiko.SFTPClient` instance. 
Code expecting this specific return type will break.","severity":"breaking","affected_versions":">=5.1.0"},{"fix":"Run `pip install 'apache-airflow-providers-ssh>=4.0.0'` (note the quotes, so the shell does not interpret `>=` as a redirect) alongside or before installing `apache-airflow-providers-sftp` 5.0.0+.","message":"For `apache-airflow-providers-sftp` versions 5.0.0 and above, the `apache-airflow-providers-ssh` dependency must be version 4.0.0 or higher. Older versions of the SSH provider will fail due to missing keyword arguments.","severity":"breaking","affected_versions":">=5.0.0"},{"fix":"Use `conn_timeout` instead of `timeout` when specifying connection timeouts in the SFTP connection's 'Extra' parameters.","message":"The `timeout` parameter in the SFTP connection's 'Extra' field is deprecated.","severity":"deprecated","affected_versions":"All recent versions"},{"fix":"Use `private_key_passphrase` instead of `private_key_pass` for consistency with `SSHHook` arguments.","message":"The `private_key_pass` parameter in the SFTPHook connection's 'Extra' field was deprecated.","severity":"deprecated","affected_versions":"Deprecated around versions 1.1.1-2.0.0; removed in later versions"},{"fix":"Set `create_intermediate_dirs=True` in the `SFTPOperator` to automatically create missing directories in the remote path during file transfer operations.","message":"By default, the `SFTPOperator` does not create intermediate directories on the remote host when transferring files. If the remote path's parent directory does not exist, the operation fails.","severity":"gotcha","affected_versions":"All versions"},{"fix":"For directory transfers, list the files in the directory (e.g., with `SFTPHook.list_directory`) and then iterate with `SFTPOperator` for each file, or use an external tool/script.","message":"`SFTPOperator` does not support transferring entire directories; it requires individual file paths. 
Although a feature request has been raised for this, directory transfer is not a built-in feature.","severity":"gotcha","affected_versions":"All versions up to 5.7.2"}],"env_vars":null,"last_verified":"2026-04-09T00:00:00.000Z","next_check":"2026-07-08T00:00:00.000Z"}