Ansible copy ssh key from one host to another
This does the trick for me: it collects the public SSH keys from the nodes and distributes them across all nodes, so they can communicate with each other.
- hosts: controllers
  gather_facts: false
  remote_user: root
  tasks:
    - name: fetch all public ssh keys
      shell: cat ~/.ssh/id_rsa.pub
      register: ssh_keys
      tags:
        - ssh

    - name: check keys
      debug: msg="{{ ssh_keys.stdout }}"
      tags:
        - ssh

    - name: deploy keys on all servers
      authorized_key: user=root key="{{ item[0] }}"
      delegate_to: "{{ item[1] }}"
      with_nested:
        - "{{ ssh_keys.stdout }}"
        - "{{ groups['controllers'] }}"
      tags:
        - ssh
Info: This is for the user root
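If you need the same for a non-root user, a minimal sketch along the same lines (the deploy user name and its key path are assumptions, adjust them to your setup):

- hosts: controllers
  gather_facts: false
  remote_user: root
  tasks:
    - name: fetch the deploy user's public ssh key
      shell: cat /home/deploy/.ssh/id_rsa.pub
      register: ssh_keys

    - name: spread the key over all servers
      authorized_key: user=deploy key="{{ ssh_keys.stdout }}"
      delegate_to: "{{ item }}"
      with_items: "{{ groups['controllers'] }}"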
I created a parameterized role to make sure an SSH key pair is generated for a source user on a source remote host, and that its public key is copied to a target user on a target remote host.
You can invoke that role in a nested loop of source and target host lists, as shown at the bottom:
---
#****h* ansible/ansible_roles_ssh_authorize_user
# NAME
#   ansible_roles_ssh_authorize_user - Authorizes user via ssh keys
#
# FUNCTION
#
#   Copies user's SSH public key from a source user in a source host
#   to a target user in a target host
#
# INPUTS
#
#   * ssh_authorize_user_source_user
#   * ssh_authorize_user_source_host
#   * ssh_authorize_user_target_user
#   * ssh_authorize_user_target_host
#****
#****h* ansible_roles_ssh_authorize_user/main.yml
# NAME
#   main.yml - Main playbook for role ssh_authorize_user
# HISTORY
#   $Id: $
#****
- assert:
    that:
      - ssh_authorize_user_source_user != ''
      - ssh_authorize_user_source_host != ''
      - ssh_authorize_user_target_user != ''
      - ssh_authorize_user_target_host != ''
  tags:
    - check_vars

- name: Generate SSH Keypair in Source
  user:
    name: "{{ ssh_authorize_user_source_user }}"
    state: present
    ssh_key_comment: "ansible-generated for {{ ssh_authorize_user_source_user }}@{{ ssh_authorize_user_source_host }}"
    generate_ssh_key: yes
  delegate_to: "{{ ssh_authorize_user_source_host }}"
  register: source_user

- name: Install SSH Public Key in Target
  authorized_key:
    user: "{{ ssh_authorize_user_target_user }}"
    key: "{{ source_user.ssh_public_key }}"
  delegate_to: "{{ ssh_authorize_user_target_host }}"

- debug:
    msg: "{{ ssh_authorize_user_source_user }}@{{ ssh_authorize_user_source_host }} authorized to log in to {{ ssh_authorize_user_target_user }}@{{ ssh_authorize_user_target_host }}"
Invoking the role in a loop:
- name: Authorize User
  include_role:
    name: ssh_authorize_user
  vars:
    ssh_authorize_user_source_user: "{{ git_user }}"
    ssh_authorize_user_source_host: "{{ item[0] }}"
    ssh_authorize_user_target_user: "{{ git_user }}"
    ssh_authorize_user_target_host: "{{ item[1] }}"
  with_nested:
    - "{{ app_server_list }}"
    - "{{ git_server_list }}"
Take a look at the authorized_key module for information on how to manage your public keys.
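For example, installing and later revoking a key could look like this (the user name and key path are placeholders):

- name: authorize a public key for alice
  authorized_key:
    user: alice
    key: "{{ lookup('file', '/path/to/id_rsa.pub') }}"
    state: present

- name: revoke the same key later
  authorized_key:
    user: alice
    key: "{{ lookup('file', '/path/to/id_rsa.pub') }}"
    state: absent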
The most straightforward solution I can think of would be to generate a fresh key pair for your application, to be shared across all your app instances. This may have security implications (you are indeed sharing keys between all instances!), but it will simplify the provisioning process a lot.
You'll also need a deploy user on each app machine, to be used later during the deployment process. Your public key (or the Jenkins one) must be in each deploy user's authorized_keys.
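You can generate the shared pair once on the control machine before provisioning; a minimal sketch (kept as DSA only to match the file names below, ed25519 or RSA is a better choice nowadays):

- name: generate the shared application key pair locally
  hosts: localhost
  tasks:
    - name: create app_keys/id_dsa and app_keys/id_dsa.pub if missing
      command: ssh-keygen -t dsa -f app_keys/id_dsa -N ''
      args:
        creates: app_keys/id_dsa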
A sketch playbook:
---
- name: ensure app/deploy public key is present on git server
  hosts: gitserver
  tasks:
    - name: ensure app public key
      authorized_key:
        user: "{{ git_user }}"
        # authorized_key expects the key content, not a path, hence the lookup
        key: "{{ lookup('file', 'app_keys/id_dsa.pub') }}"
        state: present

- name: provision app servers
  hosts: appservers
  tasks:
    - name: ensure app/deploy user is present
      user:
        name: "{{ deploy_user }}"
        state: present

    - name: ensure you'll be able to deploy later on
      authorized_key:
        user: "{{ deploy_user }}"
        key: "{{ lookup('file', path_to_your_public_key) }}"
        state: present

    - name: ensure private key and public one are present
      copy:
        src: "app_keys/{{ item }}"
        dest: "/home/{{ deploy_user }}/.ssh/{{ item }}"
        owner: "{{ deploy_user }}"
        mode: 0600
      with_items:
        - id_dsa.pub
        - id_dsa
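Once this has run, you can check end to end that a later deployment will work; a minimal sketch (it assumes the first host of the gitserver group is the one to reach):

- name: verify the deploy user can reach the git server
  hosts: appservers
  tasks:
    - name: attempt a non-interactive ssh connection as the deploy user
      command: ssh -o BatchMode=yes "{{ git_user }}@{{ groups['gitserver'][0] }}" true
      become: yes
      become_user: "{{ deploy_user }}"
      changed_when: false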