Export all airflow connections to new environment
This lets you export all connections so that they can be re-added elsewhere via the CLI (https://airflow.apache.org/docs/stable/howto/connection/index.html#creating-a-connection-from-the-cli). Run it in Python on the machine with your existing installation:
from airflow.hooks.base_hook import BaseHook

# Replace with the conn_ids you want to export.
for c in ['my', 'list', 'of', 'connections']:
    conn = BaseHook.get_connection(c)
    print(
        "airflow connections --add \\\n"
        "    --conn_id '{conn_id}' \\\n"
        "    --conn_type '{conn_type}' \\\n"
        "    --conn_host '{conn_host}' \\\n"
        "    --conn_schema '{conn_schema}' \\\n"
        "    --conn_login '{conn_login}' \\\n"
        "    --conn_password '{conn_password}' \\\n"
        "    --conn_port '{conn_port}' \\\n"
        "    --conn_extra '{conn_extra}'".format(
            conn_id=conn.conn_id,
            conn_type=conn.conn_type,
            conn_host=conn.host,
            conn_schema=conn.schema,
            conn_login=conn.login,
            conn_password=conn.password,
            conn_port=conn.port,
            conn_extra=conn.extra,
        )
    )
This also outputs the decrypted password. Tested and working on Airflow 1.10.12.
I wrote a quick Python script that generates a connection-commands.sh
file with ALL the connections in one Airflow environment; it can then be run in a second Airflow environment to import the connections.
from airflow.models import Connection
from airflow.utils.db import create_session

with create_session() as session:
    connections = session.query(Connection).order_by(Connection.conn_id).all()

connection_commands = []
conn: Connection
for conn in connections:
    conn_command_lines = [
        "airflow connections",
        "--add",
        f"--conn_id='{conn.conn_id}'",
        f"--conn_type='{conn.conn_type}'",
    ]

    # add `host`, if non-empty
    if conn.host:
        conn_command_lines.append(f"--conn_host='{conn.host}'")
    # add `port`, if non-empty
    if conn.port:
        conn_command_lines.append(f"--conn_port='{conn.port}'")
    # add `schema`, if non-empty
    if conn.schema:
        conn_command_lines.append(f"--conn_schema='{conn.schema}'")
    # add `login`, if non-empty
    if conn.login:
        conn_command_lines.append(f"--conn_login='{conn.login}'")
    # add `password`, if non-empty
    if conn.password:
        conn_command_lines.append(f"--conn_password='{conn.password}'")
    # add `extra`, if non-empty
    if conn.extra:
        conn_command_lines.append(f"--conn_extra='{conn.extra}'")

    # combine the command lines
    conn_command = " \\\n  ".join(conn_command_lines) + ";"
    connection_commands.append(conn_command)

with open("connection-commands.sh", mode="w") as f:
    f.write("#!/usr/bin/env bash")
    f.write("\n\n")
    for conn_command in connection_commands:
        f.write(conn_command)
        f.write("\n\n")
Output Format
The format of the generated connection-commands.sh file is as follows:
#!/usr/bin/env bash

airflow connections \
  --add \
  --conn_id='my_connection_1' \
  --conn_type='http' \
  --conn_host='https://example.com' \
  --conn_login='user' \
  --conn_password='password';

airflow connections \
  --add \
  --conn_id='my_connection_2' \
  --conn_type='http' \
  --conn_host='https://example.com' \
  --conn_login='user' \
  --conn_password='password';

...
WARNING: the airflow connections --add ...
command will only add a connection if the connection does not already exist; you will need to delete the connection first if you want to "update" it.
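One way around this (a sketch, not part of the original script) is to have the generator emit a delete before each add, so re-running the script behaves like an upsert. The `|| true` is there so the script keeps going when a connection does not exist yet; verify the delete syntax against your Airflow CLI version.

```python
def make_upsert_command(conn_id: str, add_command: str) -> str:
    """Prepend a delete (Airflow 1.10 CLI syntax) to an add command,
    so re-running the generated script updates existing connections
    instead of failing on duplicates."""
    delete_command = f"airflow connections --delete --conn_id='{conn_id}' || true;"
    return delete_command + "\n" + add_command


# Example: wrap one generated command from the script above.
print(make_upsert_command(
    "my_connection_1",
    "airflow connections --add --conn_id='my_connection_1' --conn_type='http';",
))
```

Apply it to each `conn_command` before writing connection-commands.sh.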
You can connect directly to the Airflow metadata DB, dump the connections, and load them into a separate database. If you want to automate something like this, you can get started by dumping them to a CSV file:
from airflow.utils import db
from airflow.models import Connection
import csv

outfile = open('myconnections.csv', 'w')
outcsv = csv.writer(outfile)

with db.create_session() as session:
    connections = session.query(Connection).all()

conn_list = [
    [getattr(c, column.name) for column in Connection.__mapper__.columns]
    for c in connections
]
outcsv.writerows(conn_list)
outfile.close()
After that, you can load it into a new DB manually or with a similar script.
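For the loading side, a minimal sketch of reading the dump back in: the column list below is an assumption about what Connection.__mapper__.columns yields on Airflow 1.10, so print the real column names in your environment before relying on it.

```python
import csv

# Assumed column order of the dump (check Connection.__mapper__.columns
# in your Airflow version -- this list is illustrative, not guaranteed).
COLUMNS = [
    "id", "conn_id", "conn_type", "host", "schema", "login",
    "password", "port", "is_encrypted", "is_extra_encrypted", "extra",
]


def read_connection_dump(path):
    """Parse myconnections.csv back into one dict per connection."""
    with open(path, newline="") as f:
        return [dict(zip(COLUMNS, row)) for row in csv.reader(f)]
```

On the target side you can then build `Connection(**{k: v for k, v in row.items() if k != "id"})` objects and `session.add()` them, letting the new DB assign fresh primary keys.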
IMPORTANT: if you have enabled encryption, the passwords stored for these connections will be encrypted; when you load them into the new DB, you must use the identical Fernet key, or otherwise you won't be able to decrypt them.
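Concretely, copy the fernet_key value from the source environment's airflow.cfg ([core] section) to the target before importing, e.g. via the environment variable Airflow reads for that setting (the value below is a placeholder):

```shell
# Set the SAME Fernet key on the target environment before importing.
# Use the key from the source environment's airflow.cfg ([core] fernet_key).
export AIRFLOW__CORE__FERNET_KEY='<fernet-key-from-source-environment>'
```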