Ansible comes in pretty handy when you're aiming to automate your infrastructure.
However, in my 2+ years of experience, combining Ansible with loops can quickly reach brain-racking complexity.
The following excerpt from my playbook demonstrates how json_query can be essential for navigating a JSON jungle.
There are other approaches, of course, but in my opinion json_query handles this most cleanly.
The task at hand:
- register the output of a looped module to a variable.
- check each entry under 'results' in that output for its 'failed' and 'changed' booleans, to decide what the next task should do.
Sounds simple, right? Not quite...
Trial #1:
```yaml
- name: PUT files in bucket
  amazon.aws.aws_s3:
    bucket: "{{ arch_bucket }}"
    object: "{{ item.path | basename }}"
    src: "{{ item.path }}"
    mode: put
    aws_ca_bundle: "/path/to/bundle.pem"
    aws_access_key: "{{ bucket_access_key }}"
    aws_secret_key: "{{ bucket_secret_key }}"
    s3_url: "{{ cloud_url }}"
    validate_certs: yes
  loop: "{{ files_to_archive }}"
  register: archive_files_op

- name: Delete files successfully archived
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ archive_files_op | json_query(del_query) }}"
  vars:
    del_query: "results[?changed == 'True' && failed == 'False'].item"
```
It turns out json_query doesn't like the booleans being compared as quoted strings: in JMESPath, true and false are literals, so 'True' and 'False' are just strings that never match them. Worse, inlining the query wraps single quotes inside single quotes, which breaks Jinja2 templating and an error is reported:
```
fatal: [target.node.com]: FAILED! => {"msg": "template error while templating string: expected token ',', got 'True'. String: {{ archive_files_op | json_query('results[?changed=='True' && failed=='False'].item.path') }}"}
```
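For context, this is roughly the shape of the registered variable the query runs against; real entries carry many more keys, and the paths here are made up:

```yaml
# Abridged, illustrative shape of archive_files_op (paths are made up).
# Ansible records one entry per loop iteration under 'results', and
# 'changed'/'failed' are genuine JSON booleans, not the strings 'True'/'False'.
archive_files_op:
  results:
    - item:
        path: /data/archive/file-a.log
      changed: true
      failed: false
    - item:
        path: /data/archive/file-b.log
      changed: false
      failed: false
```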
After hours of trying, this is what worked to my satisfaction:
```yaml
- name: PUT files in bucket
  amazon.aws.aws_s3:
    ..<snip-snip>..
  loop: "{{ files_to_archive }}"
  register: archive_files_op

- name: Delete files successfully archived
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ archive_files_op | json_query(del_query) }}"
  vars:
    del_query: "results[?!failed && changed].item"
```
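If you want to sanity-check the expression without touching S3, here is a minimal, self-contained sketch against mocked data. It assumes the json_query filter is available (it ships in the community.general collection and needs the jmespath Python library on the controller); the file paths are invented:

```yaml
# verify_query.yml -- standalone check of the JMESPath filter expression.
# Run with: ansible-playbook verify_query.yml
- hosts: localhost
  gather_facts: false
  vars:
    # Mocked register output; field names mirror Ansible's loop results.
    archive_files_op:
      results:
        - { item: { path: /tmp/a.log }, changed: true, failed: false }
        - { item: { path: /tmp/b.log }, changed: false, failed: false }
  tasks:
    - name: Show only the items that changed and did not fail
      ansible.builtin.debug:
        msg: "{{ archive_files_op | json_query('results[?!failed && changed].item') }}"
```

Bare identifiers in a JMESPath filter are truthy tests, so '?!failed && changed' keeps exactly the entries where 'failed' is false and 'changed' is true; here that should print only the first item.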
Happy automating!