{"library":"apache-airflow-providers-ssh","title":"Apache Airflow SSH Provider","description":"This package provides operators, hooks, and sensors for interacting with SSH, SFTP, and SCP within Apache Airflow DAGs. It enables automation of tasks on remote servers via SSH protocol, including command execution and file transfers, supporting various authentication methods. The current version is 4.3.3, and its release cadence is tied to the broader Apache Airflow provider release schedule, with frequent updates.","status":"active","version":"4.3.3","language":"en","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/ssh","tags":["airflow","provider","ssh","sftp","automation","remote-execution"],"install":[{"cmd":"pip install apache-airflow-providers-ssh","lang":"bash","label":"Install stable version"}],"dependencies":[{"reason":"Core Airflow dependency. Provider version 4.3.x requires Apache Airflow >=2.11.0.","package":"apache-airflow","optional":false},{"reason":"Compatibility layer for common Airflow features.","package":"apache-airflow-providers-common-compat","optional":false},{"reason":"Used for asynchronous SSH operations.","package":"asyncssh","optional":false},{"reason":"SSHv2 protocol library for Python.","package":"paramiko","optional":false},{"reason":"Allows SSH tunneling.","package":"sshtunnel","optional":false}],"imports":[{"symbol":"SSHOperator","correct":"from airflow.providers.ssh.operators.ssh import SSHOperator"},{"symbol":"SSHHook","correct":"from airflow.providers.ssh.hooks.ssh import SSHHook"},{"note":"SFTPHook is part of the SSH provider package.","wrong":"from airflow.providers.sftp.hooks.sftp import SFTPHook","symbol":"SFTPHook","correct":"from airflow.providers.ssh.hooks.sftp import SFTPHook"}],"quickstart":{"code":"import os\nfrom datetime import datetime\nfrom airflow.models.dag import DAG\nfrom airflow.providers.ssh.operators.ssh import SSHOperator\n\n# For local testing, ensure you have an 
'ssh_default' connection configured in Airflow UI\n# or via an environment variable, e.g., AIRFLOW_CONN_SSH_DEFAULT=ssh://user@hostname:22/?key_file=/path/to/key\n# For this example, the 'ssh_default' connection id is assumed to exist.\n\n# Example of setting connection details via environment variable for local execution\n# os.environ['AIRFLOW_CONN_SSH_DEFAULT'] = 'ssh://your_user@your_host:22'\n# If using a private key file:\n# os.environ['AIRFLOW_CONN_SSH_DEFAULT'] = 'ssh://your_user@your_host:22?key_file=/path/to/your/private_key.pem'\n\nwith DAG(\n    dag_id='ssh_operator_quickstart',\n    start_date=datetime(2023, 1, 1),\n    schedule=None,  # 'schedule_interval' is deprecated; use 'schedule' on Airflow 2.4+\n    catchup=False,\n    tags=['ssh', 'example'],\n) as dag:\n    ssh_task = SSHOperator(\n        task_id='run_remote_command',\n        ssh_conn_id='ssh_default',  # Ensure this connection exists in Airflow\n        command='echo \"Hello from remote host $(hostname)\" && ls -l',\n        cmd_timeout=10,  # Timeout (in seconds) for the command execution\n    )\n","lang":"python","description":"This example demonstrates a basic DAG using the `SSHOperator` to connect to a remote server and execute a shell command. Before running, configure an 'SSH' connection in your Airflow UI (Admin -> Connections) with `Conn Id` set to `ssh_default`, providing `Host`, `Login (Username)`, and `Port`. For authentication, you can specify a `Password`, a `Key File` path, or `Private Key` content in the 'Extra' field as a JSON object (e.g., `{\"key_file\": \"/path/to/your/key.pem\"}`)."},"warnings":[{"fix":"Review the 4.0.0 changelog and update code to use the recommended methods and parameters (e.g., `conn_timeout` instead of `timeout`, the `hook` attribute instead of `get_hook()`). Ensure your Airflow environment is on version 2.9.0 or higher.","message":"Provider version 4.0.0 introduced significant breaking changes by removing many previously deprecated features. 
Key changes include removal of `SSHHook.timeout` (use `conn_timeout`), removal of `SSHHook.create_tunnel()` in favor of `get_tunnel()` (which takes different parameters), removal of `SSHOperator.get_hook()` (use the `hook` attribute), and removal of `SSHOperator.exec_ssh_client_command()` (call `ssh_hook.exec_ssh_client_command()` directly). The minimum supported Airflow version was also bumped to 2.9.0.","severity":"breaking","affected_versions":">=4.0.0"},{"fix":"Upgrade your Apache Airflow installation to at least 2.11.0 and ensure your Python environment is 3.10 or newer for provider versions 4.2.0+.","message":"Minimum Apache Airflow version requirements have consistently increased with provider updates: version 4.1.0 requires Airflow 2.10.0+, and versions 4.2.0 and 4.3.0 require Airflow 2.11.0+. Additionally, Python 3.9 support was dropped in provider version 4.1.1.","severity":"breaking","affected_versions":">=4.1.0"},{"fix":"Explicitly configure the `no_host_key_check` and `allow_host_key_change` parameters in your Airflow SSH connection (typically via the 'Extra' JSON) to match your security policy. Consider setting `host_key` in the connection extra to pin a specific host key.","message":"When configuring SSH connections, pay close attention to host key verification settings. By default, `no_host_key_check` is `true`, meaning the remote host key is not verified and new host keys are automatically added to `known_hosts`. However, `allow_host_key_change` is `false` by default, so connections fail if a previously seen host key changes. For production environments, enable strict host key checking by setting `no_host_key_check` to `false` and pinning the expected key.","severity":"gotcha","affected_versions":"all"},{"fix":"Ensure the provider package is installed in the same Python environment as your Airflow installation. Restart the Airflow scheduler and webserver components. 
Verify installation using `airflow providers list` or `pip list` in your Airflow environment.","message":"If the SSH connection type does not appear in the Airflow UI's 'Admin -> Connections -> Add new record' dropdown after installation, the package was likely installed into a different Python environment than Airflow, or the Airflow components have not been restarted since installation.","severity":"gotcha","affected_versions":"all"}],"env_vars":null,"last_verified":"2026-04-05T00:00:00.000Z","next_check":"2026-07-04T00:00:00.000Z"}