In my workflow I run a playbook against all hosts in my inventory, but in the middle I need to execute one command on a different system (let's creatively call it the "central server") once for each host in the inventory. The bad part: that command cannot run in parallel, so I need to serialize it a bit. The initial version, which does no serialization at all, was:
- hosts: all
  remote_user: root
  tasks:
    - name: "Configure something on host"
      command: ...
    - name: "Configure something on central server for each host"
      command: some_command --host "{{ ansible_fqdn }}"
      delegate_to: centralserver.example.com
    - name: "Configure something else on host"
      command: ...
But "some_command" cannot run multiple times in parallel and I cannot fix that, so this is the first way I used to serialize it (so that it runs only once on the central server at any given time):
- hosts: all
  remote_user: root
  tasks:
    - name: "Configure something on host"
      command: ...

- hosts: all
  remote_user: root
  serial: 1
  tasks:
    - name: "Configure something on central server for each host"
      command: some_command --host "{{ ansible_fqdn }}"
      delegate_to: centralserver.example.com

- hosts: all
  remote_user: root
  tasks:
    - name: "Configure something else on host"
      command: ...
So I have split the previous single play into 3 plays, where the middle one is serialized by the "serial: 1" option. I did not use "forks: 1", because that value can only be set in ansible.cfg or on the ansible-playbook command line.
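As a side note, newer Ansible releases (2.9 and later) provide a per-task "throttle" keyword, which achieves the same serialization without splitting the play at all. A minimal sketch, reusing the placeholder host and command names from above:

```yaml
- hosts: all
  remote_user: root
  tasks:
    - name: "Configure something on host"
      command: ...
    - name: "Configure something on central server for each host"
      command: some_command --host "{{ ansible_fqdn }}"
      delegate_to: centralserver.example.com
      throttle: 1   # Ansible >= 2.9: run this task for at most one host at a time
    - name: "Configure something else on host"
      command: ...
```

The rest of the play still runs in parallel; only the delegated task is limited to one host at a time.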
Another way was to keep only one play in the playbook, run the task in question only once and iterate over the whole inventory:
- hosts: all
  remote_user: root
  tasks:
    - name: "Configure something on host"
      command: ...
    - name: "Configure something on central server for each host"
      command: some_command --host "{{ item }}"
      with_items: "{{ groups['all'] }}"
      run_once: true
      delegate_to: centralserver.example.com
    - name: "Configure something else on host"
      command: ...
In my case I needed the FQDN, so in the command I used the host variable {{ hostvars[item]['ansible_fqdn'] }} instead of the inventory name.
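Putting that together, the delegated task from the play above would look like this (a sketch; the loop items are inventory hostnames, and hostvars looks up each host's gathered facts):

```yaml
- name: "Configure something on central server for each host"
  command: some_command --host "{{ hostvars[item]['ansible_fqdn'] }}"
  with_items: "{{ groups['all'] }}"
  run_once: true
  delegate_to: centralserver.example.com
```

Note that this requires facts to have been gathered for every host in the loop, otherwise ansible_fqdn is undefined for hosts the play has not touched yet.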