Automatically Distribute SSH Public Keys Among Hosts Using Ansible for Passwordless Authentication


When configuring database replication between a master and multiple slaves, passwordless SSH access becomes crucial. The traditional approach of manually adding each slave's public key to the master's authorized_keys file doesn't scale well in dynamic environments where slaves might be added or removed frequently.

Ansible provides an elegant way to automate this. The playbook below collects the public key from every slave host and deploys all of them to the master: each slave records its key as a host fact, which the master play then reads through hostvars:

- name: Gather slave public keys
  hosts: databases_slave
  become: true
  tasks:
    - name: Get slave public key
      command: cat /var/lib/postgresql/.ssh/id_rsa.pub
      register: slave_pubkey
      changed_when: false

    # Each slave stores its own key as a fact, so the master play can read it via hostvars
    - name: Store slave public key
      set_fact:
        slave_key: "{{ slave_pubkey.stdout }}"

- name: Configure master authorized_keys
  hosts: databases_master
  become: true
  tasks:
    - name: Add all slave keys to master
      authorized_key:
        user: postgres
        state: present
        key: "{{ hostvars[item]['slave_key'] }}"
      with_items: "{{ groups['databases_slave'] }}"
      when: hostvars[item].slave_key is defined
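
Both plays assume an inventory that defines the two groups used above. A minimal YAML inventory could look like this (hostnames and addresses are placeholders):

all:
  children:
    databases_master:
      hosts:
        db-master-01:
          ansible_host: 10.0.0.10
    databases_slave:
      hosts:
        db-slave-01:
          ansible_host: 10.0.0.11
        db-slave-02:
          ansible_host: 10.0.0.12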

For a complete solution, you might want to ensure SSH keys exist on all slaves:

- name: Ensure SSH keys exist on slaves
  hosts: databases_slave
  become: true
  tasks:
    - name: Generate SSH key if missing
      openssh_keypair:
        path: /var/lib/postgresql/.ssh/id_rsa
        type: rsa
        size: 4096
        owner: postgres
        group: postgres
        mode: '0600'
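
On a freshly provisioned slave the .ssh directory may not exist yet, so it is worth creating it before generating the key. A minimal sketch, using the same path and ownership as above:

- name: Prepare .ssh directory on slaves
  hosts: databases_slave
  become: true
  tasks:
    # Restrictive permissions; sshd's StrictModes rejects group/world-writable paths
    - name: Ensure .ssh directory exists
      file:
        path: /var/lib/postgresql/.ssh
        state: directory
        owner: postgres
        group: postgres
        mode: '0700'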

An alternative method reads the key with the slurp module instead of running a command:

- name: Collect slave keys
  hosts: databases_slave
  become: true
  tasks:
    - name: Get public key
      slurp:
        src: /var/lib/postgresql/.ssh/id_rsa.pub
      register: key_content

    # Again, every slave keeps its own key as a host fact
    - name: Set fact with key
      set_fact:
        slave_key: "{{ key_content.content | b64decode }}"

- name: Apply keys to master
  hosts: databases_master
  become: true
  tasks:
    - name: Add keys to authorized_keys
      authorized_key:
        user: postgres
        key: "{{ hostvars[item]['slave_key'] }}"
      loop: "{{ groups['databases_slave'] }}"
      when: hostvars[item].slave_key is defined

When implementing this solution:

  • Ensure proper permissions on SSH directories and files
  • Consider using SSH certificates instead of raw keys for better manageability
  • Regularly rotate keys in production environments
  • Use separate SSH keys for different purposes (replication vs administration); see the sketch after this list
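
For the last point, a dedicated replication key can be generated alongside the default one and restricted when it is added on the master. This is only a sketch: the id_rsa_replication path is an arbitrary choice, and replication_key stands for a hypothetical fact collected the same way slave_key was collected earlier.

# Hypothetical: a key used only for replication, kept separate from id_rsa
- name: Generate a dedicated replication key on each slave
  openssh_keypair:
    path: /var/lib/postgresql/.ssh/id_rsa_replication
    type: rsa
    size: 4096
    owner: postgres
    group: postgres
    mode: '0600'

# On the master, restrict what the replication key is allowed to do
- name: Add replication keys with restrictive options
  authorized_key:
    user: postgres
    state: present
    key: "{{ hostvars[item]['replication_key'] }}"
    key_options: 'no-agent-forwarding,no-port-forwarding,no-X11-forwarding'
  loop: "{{ groups['databases_slave'] }}"
  when: hostvars[item].replication_key is defined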

If you encounter issues:

- name: Debug slave keys
  debug:
    var: hostvars[item].slave_key
  with_items: "{{ groups['databases_slave'] }}"
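
It can also help to confirm on the master that the keys actually landed. A quick read of the authorized_keys file, using the same paths as above, is enough:

- name: Inspect authorized_keys on the master
  hosts: databases_master
  become: true
  tasks:
    - name: Read postgres authorized_keys
      command: cat /var/lib/postgresql/.ssh/authorized_keys
      register: auth_keys
      changed_when: false

    - debug:
        var: auth_keys.stdout_lines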

The playbooks above use two plays and intermediate host facts. A more compact alternative is a single play that targets the master and delegates the key collection to each slave, so no facts are needed at all. Here's a complete implementation that dynamically collects and distributes the SSH public keys:

---
- name: Configure SSH keys for database replication
  hosts: databases_master
  become: true
  gather_facts: false
  vars:
    postgres_ssh_dir: /var/lib/postgresql/.ssh
  tasks:
    - name: Ensure .ssh directory exists
      file:
        path: "{{ postgres_ssh_dir }}"
        state: directory
        owner: postgres
        group: postgres
        mode: '0700'
  
    - name: Collect slave keys dynamically
      # slurp runs on each slave via delegate_to; the registered results stay on the master
      delegate_to: "{{ item }}"
      slurp:
        src: "{{ postgres_ssh_dir }}/id_rsa.pub"
      register: slave_keys
      loop: "{{ groups['databases_slave'] }}"
  
    - name: Add all slave keys to master
      authorized_key:
        user: postgres
        state: present
        key: "{{ item.content | b64decode }}"
      loop: "{{ slave_keys.results }}"
      when: item.content is defined

For production environments, consider these enhancements:

# In group_vars/databases_slave.yml
custom_ssh_key_path: /opt/custom_keys/postgresql_rsa.pub

# Modified task
- name: Collect slave keys from custom location
  delegate_to: "{{ item }}"
  slurp:
    src: "{{ custom_ssh_key_path | default(postgres_ssh_dir+'/id_rsa.pub') }}"
  register: slave_keys
  loop: "{{ groups['databases_slave'] }}"

Always:

  • Set proper permissions (600 for keys, 700 for .ssh directory)
  • Consider using ssh certificates instead of raw keys
  • Rotate keys regularly (see the sketch after this list); Ansible Vault can protect any private key material kept in the repository
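
Rotation can reuse the same distribution pattern: generate a fresh keypair on each slave, distribute it as shown earlier, then retire the old public key. The old_slave_key fact below is hypothetical, standing for wherever you keep the previous public key:

# Hypothetical rotation step: drop a retired key from the master
- name: Remove retired slave keys from authorized_keys
  authorized_key:
    user: postgres
    state: absent
    key: "{{ hostvars[item]['old_slave_key'] }}"
  loop: "{{ groups['databases_slave'] }}"
  when: hostvars[item].old_slave_key is defined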

Common issues and solutions:

# Test SSH connection manually
- name: Verify SSH connectivity
  hosts: databases_slave
  become: true
  become_user: postgres
  tasks:
    - name: Attempt passwordless SSH to the master as postgres
      command: ssh -o BatchMode=yes -o StrictHostKeyChecking=no postgres@{{ hostvars[groups['databases_master'][0]].ansible_host }} "echo OK"
      register: ssh_test
      changed_when: false
  
    - debug:
        var: ssh_test.stdout
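
To make the check fail loudly instead of just printing the result, an assert on the registered output can be appended to the same play (a minimal sketch):

    - name: Fail if the master did not answer
      assert:
        that:
          - "'OK' in ssh_test.stdout"
        fail_msg: Passwordless SSH from this slave to the master is not working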