Use Case
I want to deploy different versions of Splunk to different hosts in the same Ansible play, but not perform unnecessary downloads if there is already a local copy of the package available.
`run_once: true` would be undesirable, as the play would fail if we did not already have a copy of the other package available locally. The role also does not currently clean up old tarballs after installation from the default path (the user's home directory).
Suggested Implementation
1. Remove `run_once: true` from the download task.
2. Before the download task, check via `stat` whether the desired tarball has already been downloaded locally.
3. If the desired tarball does not exist locally, download it.
4. If the desired tarball exists locally, check for an existing local `.sha512` hash file for the tarball.
5. If a local `.sha512` file does not exist, download it.
6. Compare the SHA-512 hash of the local tarball to the expected hash in the `.sha512` file.
7. If they do not match, remove the existing tarball and download it again.
8. Compare the hash values again to ensure that they are the same.
   - If the hashes are the same, proceed.
   - If the hashes are still not the same (this should not happen), fail the play.
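The steps above could be sketched roughly as the following tasks. This is only an illustrative sketch; `splunk_package_url` and `splunk_package_path` are placeholder variable names, not the role's actual variables:

```yaml
# Sketch only: splunk_package_url and splunk_package_path are
# illustrative names, not variables defined by this role.
- name: Check whether the tarball already exists locally
  ansible.builtin.stat:
    path: "{{ splunk_package_path }}"
    checksum_algorithm: sha512
  delegate_to: localhost
  register: package_stat

- name: Download the .sha512 hash file if not already present
  ansible.builtin.get_url:
    url: "{{ splunk_package_url }}.sha512"
    dest: "{{ splunk_package_path }}.sha512"
  delegate_to: localhost

- name: Read the expected hash file ("<hash>  <filename>" format)
  ansible.builtin.slurp:
    src: "{{ splunk_package_path }}.sha512"
  delegate_to: localhost
  register: sha512_file

- name: Extract the expected hash value (first field of the file)
  ansible.builtin.set_fact:
    expected_sha512: "{{ (sha512_file.content | b64decode).split() | first }}"

- name: Download (or re-download) the tarball on miss or mismatch
  ansible.builtin.get_url:
    url: "{{ splunk_package_url }}"
    dest: "{{ splunk_package_path }}"
    force: true
  delegate_to: localhost
  when: not package_stat.stat.exists
        or package_stat.stat.checksum != expected_sha512

- name: Re-check the hash and fail the play if it still does not match
  ansible.builtin.stat:
    path: "{{ splunk_package_path }}"
    checksum_algorithm: sha512
  delegate_to: localhost
  register: package_recheck
  failed_when: package_recheck.stat.checksum != expected_sha512
```

In practice much of this collapses into `get_url`'s own `checksum` handling (see the comments below), but the explicit flow mirrors the steps listed above.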
On the download part, I think something that might be useful (at least it would be for me ^^) is being able to download directly from the remote hosts.
Ansible runs from my own computer and the connection goes through several layers of VPNs and proxies, so the playbook is REALLY slow (like 1h+ slow). I've solved my issue by manually downloading the package directly on the remote host at the expected path and adding `remote_src: true` on the unarchive task.
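That workaround could be expressed in the role as fetching on the managed host instead of the controller, then unarchiving in place. A rough sketch, assuming a placeholder `splunk_package_url` variable and an `/opt` install path:

```yaml
# Sketch of the workaround: download on the managed host, not the controller.
- name: Download the Splunk package directly on the remote host
  ansible.builtin.get_url:
    url: "{{ splunk_package_url }}"            # placeholder variable name
    dest: "/opt/{{ splunk_package_url | basename }}"

- name: Unarchive the package already present on the remote host
  ansible.builtin.unarchive:
    src: "/opt/{{ splunk_package_url | basename }}"
    dest: /opt
    remote_src: true   # src refers to a path on the remote host, not the controller
```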
@Jalkar Yeah, great suggestion. We can make the download method configurable. I'm aware of some customers whose production hosts aren't able to download things directly from the Internet, so the current implementation works for them, but in your case it sounds like the opposite is needed. Supporting both methods and making it configurable sounds like the way to go.
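One possible shape for that configurability, sketched with a hypothetical `splunk_download_on_target` boolean (not an existing role variable):

```yaml
# Hypothetical default, e.g. in defaults/main.yml:
#   splunk_download_on_target: false
- name: Download the package on the Ansible controller (current behaviour)
  ansible.builtin.get_url:
    url: "{{ splunk_package_url }}"        # placeholder variable name
    dest: "{{ splunk_package_path }}"
  delegate_to: localhost
  when: not splunk_download_on_target

- name: Download the package directly on each target host
  ansible.builtin.get_url:
    url: "{{ splunk_package_url }}"
    dest: "{{ splunk_package_path }}"
  when: splunk_download_on_target
```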
`get_url` has a `checksum` argument that can handle the checksum comparison/download logic for the Splunk packages.
The URL for grabbing the hash file is the package download URL with `.sha512` appended. Example: https://download.splunk.com/products/universalforwarder/releases/8.1.3/linux/splunkforwarder-8.1.3-63079c59e632-Linux-x86_64.tgz.sha512
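Since `get_url`'s `checksum` argument also accepts a URL pointing at a checksum file, the miss/mismatch/fail logic can collapse into a single task. A sketch (assuming Splunk's `.sha512` file is in the `<hash>  <filename>` format that `get_url` expects):

```yaml
# get_url skips the download if dest exists with a matching hash,
# re-downloads on mismatch, and fails the task if it still doesn't match.
- name: Download Splunk UF, verified against the published SHA-512
  ansible.builtin.get_url:
    url: "https://download.splunk.com/products/universalforwarder/releases/8.1.3/linux/splunkforwarder-8.1.3-63079c59e632-Linux-x86_64.tgz"
    dest: "{{ ansible_env.HOME }}/splunkforwarder-8.1.3-63079c59e632-Linux-x86_64.tgz"
    checksum: "sha512:https://download.splunk.com/products/universalforwarder/releases/8.1.3/linux/splunkforwarder-8.1.3-63079c59e632-Linux-x86_64.tgz.sha512"
```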