How to Dynamically Assign Different Hosts to Included Playbooks in Ansible



When working with complex Ansible deployments, we often face scenarios where we need to:

  • Reuse playbooks with different host groups
  • Maintain variable context across playbook includes
  • Dynamically generate inventory during execution

As you've discovered, you can't set hosts from within an include statement, and include_tasks only handles task files, never whole playbooks. The working pattern is to import the playbooks at the play level and pass the target group as a variable:

---
- name: Include playbook for postgres
  ansible.builtin.import_playbook: playbook_1.yml
  vars:
    target_hosts: tag_postgres

- name: Include playbook for rabbitmq
  ansible.builtin.import_playbook: playbook_2.yml
  vars:
    target_hosts: tag_rabbitmq

Then in the included playbook (playbook_1.yml):

---
- name: Configure database servers
  hosts: "{{ target_hosts }}"
  tasks:
    - name: Ensure postgres is installed
      ansible.builtin.apt:
        name: postgresql
        state: present
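playbook_2.yml follows the same shape; here is a sketch (the package name and play details are assumptions, not from the original):

```yaml
---
# playbook_2.yml — the RabbitMQ counterpart (sketch)
- name: Configure message brokers
  hosts: "{{ target_hosts }}"
  become: yes
  tasks:
    - name: Ensure rabbitmq is installed
      ansible.builtin.apt:
        name: rabbitmq-server
        state: present
```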

For your AWS scenario, here's how to combine provisioning and configuration:

---
- name: Provision EC2 instances
  hosts: localhost
  tasks:
    - name: Launch web servers
      amazon.aws.ec2_instance:
        key_name: mykey
        instance_type: t2.micro
        image_id: ami-123456
        wait: true
        count: 3
        tags:
          Name: web_server
          Group: web

    - name: Refresh inventory
      meta: refresh_inventory

- name: Configure web servers
  hosts: tag_Group_web
  become: yes
  tasks:
    - name: Install nginx
      ansible.builtin.apt:
        name: nginx
        state: present

For more complex scenarios, consider these patterns:

Using Dynamic Includes

Because include_tasks can only pull in task files (not playbooks with their own hosts: line), target both groups and select the right task file per host:

---
- name: Dynamic task inclusion based on host group
  hosts: tag_postgres:tag_rabbitmq
  tasks:
    - name: Include the task file matching this host's group
      ansible.builtin.include_tasks: "{{ item.value }}"
      when: item.key in group_names
      with_dict:
        tag_postgres: postgres_tasks.yml
        tag_rabbitmq: rabbitmq_tasks.yml
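A hypothetical postgres_tasks.yml for this pattern contains only tasks, with no hosts: or play header:

```yaml
---
# postgres_tasks.yml (hypothetical task file — tasks only, no play header)
- name: Ensure postgres is installed
  ansible.builtin.apt:
    name: postgresql
    state: present

- name: Ensure postgres is started and enabled
  ansible.builtin.service:
    name: postgresql
    state: started
    enabled: true
```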

Shared Variables Between Playbooks

To share facts between playbooks targeting different hosts:

---
- name: Set shared variables
  hosts: localhost
  tasks:
    - ansible.builtin.set_fact:
        shared_value: "important_data"
        cacheable: yes

- name: Use shared variables  
  hosts: tag_servers
  tasks:
    - debug:
        msg: "Using shared value {{ hostvars['localhost']['shared_value'] }}"

Key recommendations:

  • Run meta: refresh_inventory after provisioning so new hosts appear in dynamic inventory
  • Consider using Ansible Tower/AWX for complex workflows
  • Test host patterns with ansible-inventory --graph
  • Use cacheable: yes (with fact caching enabled) for facts that must persist across runs

In Ansible, each play in a playbook requires explicit host targeting through the hosts directive. When including other playbooks, we need to consider how host targeting propagates through the inclusion chain.

Here's how to properly parameterize host targeting. Note that Ansible has no include_playbook module; whole playbooks are pulled in with import_playbook at the play level, outside any tasks: section:

---
- name: Include Postgres setup
  ansible.builtin.import_playbook: playbook_1.yml
  vars:
    target_hosts: "tag_postgres"

- name: Include RabbitMQ setup
  ansible.builtin.import_playbook: playbook_2.yml
  vars:
    target_hosts: "tag_rabbitmq"

Then in the included playbook (playbook_1.yml):

---
- name: Configure Postgres servers
  hosts: "{{ target_hosts | default('tag_postgres') }}"
  tasks:
    - name: Install Postgres
      ansible.builtin.apt:
        name: postgresql
        state: present
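Thanks to the default filter, this playbook runs standalone against tag_postgres, while an importing playbook can point it anywhere else. A sketch (the group name tag_postgres_staging is hypothetical):

```yaml
---
# Reusing playbook_1.yml for a different (hypothetical) host group
- name: Run the Postgres playbook against a staging group
  ansible.builtin.import_playbook: playbook_1.yml
  vars:
    target_hosts: tag_postgres_staging
```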

For dynamic AWS EC2 provisioning and subsequent configuration:

---
- name: Provision EC2 instances
  hosts: localhost
  connection: local
  tasks:
    - name: Launch Postgres instances
      amazon.aws.ec2_instance:
        key_name: "{{ ssh_key }}"
        instance_type: t3.medium
        image_id: ami-12345678
        count: 3
        tags:
          Name: "Postgres-{{ ansible_date_time.epoch }}"
          role: postgres
      register: postgres_instances
    
    - name: Launch RabbitMQ instances
      amazon.aws.ec2_instance:
        key_name: "{{ ssh_key }}"
        instance_type: t3.small  
        image_id: ami-87654321
        count: 2
        tags:
          Name: "RabbitMQ-{{ ansible_date_time.epoch }}"
          role: rabbitmq
      register: rabbitmq_instances
    
    - name: Refresh dynamic inventory so the new instances are visible
      ansible.builtin.meta: refresh_inventory

- name: Configure Postgres instances
  hosts: tag_role_postgres
  become: yes
  tasks:
    - name: Setup Postgres
      include_role:
        name: postgresql
        
- name: Configure RabbitMQ instances
  hosts: tag_role_rabbitmq
  become: yes
  tasks:
    - name: Setup RabbitMQ
      include_role:
        name: rabbitmq
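The tag_role_postgres and tag_role_rabbitmq groups above come from the aws_ec2 dynamic inventory plugin, which is configured in an inventory file rather than in a task. A minimal sketch (the file path and region are assumptions):

```yaml
# inventory/aws_ec2.yml (assumed path) — aws_ec2 inventory plugin configuration
plugin: amazon.aws.aws_ec2
regions:
  - us-east-1
hostnames:
  - private-dns-name
keyed_groups:
  # Builds groups like tag_role_postgres from the instance tag role=postgres
  - prefix: tag
    key: tags
```

Point ansible-playbook at this file (or a directory containing it) with -i, and meta: refresh_inventory will re-read it mid-run.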

To share variables between playbooks targeting different hosts:

---
- name: Shared variables setup
  hosts: localhost
  tasks:
    - name: Set shared facts
      ansible.builtin.set_fact:
        shared_variable: "value"
        cacheable: yes

- name: First deployment play  
  hosts: tag_postgres
  tasks:
    - name: Use shared fact
      ansible.builtin.debug:
        msg: "{{ hostvars['localhost']['shared_variable'] }}"

- name: Second deployment play
  hosts: tag_rabbitmq
  tasks:
    - name: Use shared fact  
      ansible.builtin.debug:
        msg: "{{ hostvars['localhost']['shared_variable'] }}"
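Within a single run, hostvars['localhost'] works regardless; cacheable: yes only persists facts across separate ansible-playbook runs when a fact cache is configured. A minimal ansible.cfg sketch (cache path and timeout are assumptions):

```ini
# ansible.cfg — enable a JSON-file fact cache so cacheable facts survive between runs
[defaults]
gathering = smart
fact_caching = jsonfile
fact_caching_connection = /tmp/ansible_facts
fact_caching_timeout = 86400
```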

Additional recommendations:

  • Use the aws_ec2 dynamic inventory plugin (successor to the legacy ec2.py script) for AWS environments
  • Leverage host patterns with tags for flexible targeting
  • Structure playbooks by functional components rather than by host groups
  • Consider using Ansible Controller (AWX/Tower) for complex multi-host workflows
  • Utilize the serial keyword for rolling updates across host groups
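The serial keyword from the last point can be sketched as follows (the service name is an assumption):

```yaml
---
# Rolling update: Ansible completes the play on one host before starting the next
- name: Rolling restart across web servers
  hosts: tag_Group_web
  become: yes
  serial: 1          # or a percentage, e.g. "25%"
  tasks:
    - name: Restart the application service
      ansible.builtin.service:
        name: myapp
        state: restarted
```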