Handling Date-Formatted Files in Ansible: Dynamic Path Manipulation Techniques


Working with date-formatted files is a common requirement in automation, especially for log rotation, backups, and other time-based file operations. While shell scripts can simply interpolate the output of date +%format, Ansible renders module arguments through Jinja2 templating rather than a shell, so it needs a different approach.

Direct shell-style command substitution such as $(date +%y_%m_%d) or backticks won't work in Ansible task parameters because:

  • Ansible renders task arguments with Jinja2 on the control node before the module runs
  • Backtick and $() syntax is not recognized by the Jinja2 templating engine
  • Module parameters such as path receive plain, already-evaluated strings; they are never passed through a shell
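
For illustration, a task written like this never runs the command at all; it just creates a file whose name literally contains the unevaluated text (hypothetical path):

- name: Broken shell-style substitution (do not use)
  file:
    path: "/path/somefile.$(date +%y_%m_%d)"
    state: touch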

Using Ansible Facts

- name: Create dated file using ansible_date_time
  file:
    path: "/path/somefile.{{ ansible_date_time.date }}"
    state: touch
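
The ansible_date_time fact exposes several other pre-formatted fields as well (time, iso8601, epoch, year, month, day, and more), which you can inspect with a quick debug task:

- name: Show a few useful date fact fields
  debug:
    msg:
      - "date: {{ ansible_date_time.date }}"
      - "time: {{ ansible_date_time.time }}"
      - "iso8601: {{ ansible_date_time.iso8601 }}"
      - "epoch: {{ ansible_date_time.epoch }}"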

Custom Date Formatting

- name: Create file with custom date format
  file:
    path: "/path/somefile.{{ '%Y-%m-%d' | strftime }}"
    state: touch
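
The strftime filter also accepts an epoch timestamp, which is handy when a name should reflect an existing file's modification time rather than "now" (paths here are illustrative):

- name: Stat the source file
  stat:
    path: /path/somefile
  register: src_stat

- name: Copy it to a name based on its mtime
  copy:
    src: /path/somefile
    dest: "/path/somefile.{{ '%Y-%m-%d' | strftime(src_stat.stat.mtime) }}"
    remote_src: yes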

For Archive Operations

- name: Unzip dated archive
  unarchive:
    src: "/path/zomezip.{{ '%y_%m_%d' | strftime }}.tar.gz"
    dest: /target/path
    remote_src: yes
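
If you are not sure today's archive exists yet, the find module can locate matching files first (directory and pattern are illustrative):

- name: Find today's archives
  find:
    paths: /path
    patterns: "*.{{ '%y_%m_%d' | strftime }}.tar.gz"
  register: todays_archives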

Variable Pre-calculation

- name: Set date variable
  set_fact:
    today_date: "{{ '%Y_%m_%d' | strftime }}"

- name: Use pre-calculated date
  file:
    path: "/path/backup_{{ today_date }}.log"
    state: touch
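
Pinning the value with set_fact keeps every later reference consistent, even if the play runs past midnight. A minimal sketch using the same variable for an archive name (assumes the community.general collection for the archive module):

- name: Archive with the same date stamp
  community.general.archive:
    path: /path/data
    dest: "/path/backup_{{ today_date }}.tar.gz"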

Timezone-aware Dates

- name: Get timezone-specific date
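  # the pipe lookup runs on the control node, not the managed host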
  set_fact:
    tz_date: "{{ lookup('pipe', 'TZ=Asia/Tokyo date +%Y-%m-%d') }}"

- name: Create Tokyo-time file
  file:
    path: "/path/tokyo_{{ tz_date }}.log"
    state: touch

  • Remember that strftime uses Python's formatting codes, not the shell date command's
  • For more complex date math, consider epoch-second arithmetic (see the sketch after this list) or the datetime module in a custom filter plugin
  • Test date formats thoroughly, as YAML can interpret certain patterns as numbers
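
As a sketch of that epoch arithmetic, yesterday's stamp can be built by subtracting 86400 seconds before formatting (path is illustrative):

- name: Remove yesterday's temporary file
  file:
    path: "/path/somefile.{{ '%y_%m_%d' | strftime(ansible_date_time.epoch | int - 86400) }}"
    state: absent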

When working with log rotations, backups, or time-stamped archives in Ansible, you'll frequently encounter files named with date patterns like backup_23_11_15.tar.gz. The immediate temptation is to use shell-style command substitution (date +%y_%m_%d) directly in playbooks, but this fails because module parameters are rendered by Jinja2 and are never passed through a shell.

Instead of shell commands, leverage Ansible's built-in strftime filter combined with the ansible_date_time fact:

- name: Create dated directory
  file:
    path: "/backups/archive_{{ ansible_date_time.date }}"
    state: directory

- name: Extract timestamped archive
  unarchive:
    src: "/path/to/backup_{{ '%y_%m_%d' | strftime }}.tar.gz"
    dest: "/restore_location/"
    remote_src: yes

For more complex date patterns, combine multiple filters:

- name: Process weekly backups
  file:
    path: "/storage/week_{{ ansible_date_time.isoweekday }}_of_{{ '%B' | strftime }}.log"
    state: touch

When working across timezones, explicitly set the reference time:

- name: UTC timestamped operation
  vars:
    utc_time: "{{ now(utc=true).strftime('%Y-%m-%dT%H:%M') }}"
  copy:
    src: "/tmp/datafile"
    dest: "/archive/{{ utc_time }}_dataset.json"

A few common pitfalls to watch for:

1. Missing Facts: Ensure gather_facts: true (the default) is in effect, or set ansible_date_time yourself, otherwise the fact-based expressions are undefined
2. Windows Compatibility: Use the win_file module with a format such as %Y%m%d (see the sketch at the end of this section)
3. Idempotency: state: touch always reports a change, so when creating files combine it with the stat module to check for existence first:

- name: Check if timestamped file exists
  stat:
    path: "/logs/app_{{ '%Y%j' | strftime }}.log"
  register: log_check

- name: Create log if missing
  file:
    path: "/logs/app_{{ '%Y%j' | strftime }}.log"
    state: touch
  when: not log_check.stat.exists
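
For the Windows case mentioned above, a minimal sketch (path is illustrative; win_file comes from the ansible.windows collection):

- name: Create dated file on Windows
  win_file:
    path: 'C:\logs\app_{{ "%Y%m%d" | strftime }}.log'
    state: touch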