Ansible SSH Configuration: Setting Default Username/Password for All Hosts


When working with Ansible inventories containing multiple hosts that share the same SSH credentials, repeating ansible_ssh_user and ansible_ssh_pass (or their current names, ansible_user and ansible_password) for each host is tedious and error-prone, and in large infrastructures it quickly becomes a maintenance headache.

Ansible provides group variables to define common parameters for all hosts in a group. Create a group_vars/all file (in YAML format) alongside your inventory to store the credentials:

---
ansible_connection: ssh
ansible_user: vagrant
ansible_password: vagrant

The inventory file then simplifies to:

[master]
192.168.1.10

[slave]
192.168.1.11
192.168.1.12

[app]
192.168.1.13

[all:children]
master
slave
app
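
With the group variables in place, it's worth verifying connectivity before running any playbooks. A minimal smoke test, assuming the inventory above is saved as a file named hosts:

ansible all -i hosts -m ping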

1. Using a Base Group

Create a base group containing all hosts and define variables there:

[base]
192.168.1.10
192.168.1.11
192.168.1.12
192.168.1.13

[base:vars]
ansible_connection=ssh
ansible_user=vagrant
ansible_password=vagrant
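
Every host in the group then inherits these variables. For example, an ad-hoc command against the whole base group (again assuming the inventory file is named hosts):

ansible base -i hosts -a "uptime"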

2. Directory Structure Approach

For complex setups, use this directory structure:

inventory/
├── production/
│   ├── group_vars/
│   │   └── all.yml
│   └── hosts
└── staging/
    ├── group_vars/
    │   └── all.yml
    └── hosts
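
Each environment then carries its own credentials, and you select one by pointing -i at the environment directory; Ansible picks up both the hosts file and its group_vars. Here site.yml is just a placeholder playbook name:

ansible-playbook -i inventory/production site.yml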

While convenient, storing passwords in plaintext is risky. Consider these alternatives:

  • SSH key authentication (preferred)
  • Ansible Vault for encrypted credentials
  • Dynamic inventories with cloud provider IAM

To use Ansible Vault:

1. Create an encrypted variables file:

ansible-vault create group_vars/all.yml

2. Add encrypted content:

---
ansible_user: vagrant
ansible_password: !vault |
          $ANSIBLE_VAULT;1.1;AES256
          66386439653236326331306261616162386435643735326532653630653163363361616438373761
          3262393938333661343839373434383065313238616235320a643561633338323937386261353230
          61396338353961353938633837396266643465653366393563333034323534313535383161323935
          6531353437623566630a306662613137323066373266356635373031613139343163656466333034
          6334

3. Run playbooks with:

ansible-playbook playbook.yml --ask-vault-pass
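
Two related commands are worth knowing: ansible-vault encrypt_string produces the kind of inline !vault value shown in step 2, which is convenient when only one variable needs encryption, and --vault-password-file avoids the interactive prompt (the path below is just an example):

# Encrypt a single value; paste the output into group_vars/all.yml
ansible-vault encrypt_string 'vagrant' --name 'ansible_password'

# Run non-interactively instead of using --ask-vault-pass
ansible-playbook playbook.yml --vault-password-file ~/.vault_pass.txt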

If connections fail:

  1. Verify the SSH service is running on the target hosts
  2. Check that firewall rules allow port 22
  3. Test a manual SSH connection first (see the commands after this list)
  4. Enable verbose output with -vvvv
  5. Verify Python is installed on the targets
  6. If you are using password authentication, confirm sshpass is installed on the control node; Ansible's ssh connection requires it for ansible_password to work
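
A minimal diagnostic sequence for points 3 and 4, using the host and user from the examples above:

# Confirm SSH works outside Ansible first
ssh vagrant@192.168.1.10

# Then rerun with maximum verbosity to see where the connection stalls
ansible all -i hosts -m ping -vvvv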

Besides the group_vars approach above, for simpler setups you can define the variables directly in the inventory file under an [all:vars] section:

[all:vars]
ansible_connection=ssh
ansible_user=vagrant
ansible_password=vagrant

[master]
192.168.1.10

[slave]
192.168.1.11
192.168.1.12

[app]
192.168.1.13

For organization-wide defaults, consider setting these in your ansible.cfg:

[defaults]
remote_user = vagrant
private_key_file = ~/.ssh/id_rsa
# Disables SSH host key verification; convenient in labs, risky in production
host_key_checking = False
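
To confirm which defaults are actually in effect (Ansible reads ANSIBLE_CONFIG, then ./ansible.cfg, ~/.ansible.cfg, and /etc/ansible/ansible.cfg, first match wins), you can dump only the settings that differ from the built-in defaults:

ansible-config dump --only-changed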

As noted earlier, storing passwords in plaintext isn't recommended. Using SSH keys instead of passwords, with ssh-agent handling key management, is the preferred approach.

Here's how your configuration would look using SSH keys:

# group_vars/all.yml
ansible_connection: ssh
ansible_user: vagrant
ansible_ssh_private_key_file: ~/.ssh/ansible_key

# Generate the key first:
# ssh-keygen -t rsa -b 4096 -f ~/.ssh/ansible_key -C "ansible"
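
To distribute the public key and avoid retyping its passphrase on every connection, something like the following works (host and user taken from the earlier examples):

# Install the public key on each managed host
ssh-copy-id -i ~/.ssh/ansible_key.pub vagrant@192.168.1.10

# Load the key into ssh-agent so the passphrase is entered once per session
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/ansible_key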

Remember that Ansible applies variables in this order (later entries override earlier ones):

  1. ansible.cfg defaults (e.g. remote_user)
  2. Inventory group vars
  3. Inventory host vars
  4. Playbook vars
  5. Command line values (-e, highest precedence)
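
That precedence means a command-line extra var wins over anything in group_vars or the inventory; for example, to override the connection user for a single run (admin here is just an illustration):

ansible-playbook playbook.yml -e "ansible_user=admin"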