download_libs downloader using wget2 #7907
Conversation
I think it is finally passing tests. I've simplified the functions that invoke the downloader, so multiple files can be passed easily. wget2 and curl download faster in parallel; wget downloads in series.
@dimitre if this is good to merge - looks good to me :)
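The fallback order discussed in this thread (wget2, then curl, then wget) can be sketched as a small helper. This is a hypothetical sketch with made-up names (`pick_downloader` is not part of download_libs.sh), shown only to illustrate the selection logic:

```shell
#!/bin/bash
# Hypothetical sketch of the tool-selection logic discussed in this PR.
pick_downloader() {
    if command -v wget2 >/dev/null 2>&1; then
        echo wget2        # downloads its URL arguments in parallel
    elif command -v curl >/dev/null 2>&1; then
        echo curl         # also accepts multiple URLs in one invocation
    else
        echo wget         # downloads one URL after another, in series
    fi
}
```

Note the `>/dev/null 2>&1` redirect: `command -v` prints the resolved path on stdout, so redirecting only stderr would leak that path into the script's output.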
# echo " PARAMS $PARAMS"
if command -v wget2 2>/dev/null; then
Should just do this in downloader
https://github.com/openframeworks/openFrameworks/blob/master/scripts/dev/downloader.sh#L25
if wget2, else curl, else wget
if command -v wget2 >/dev/null 2>&1; then
    wget2 "$@" 2>/dev/null
elif command -v curl >/dev/null 2>&1; then
    curl -L -O --retry 20 -s "$@"
else
    wget -q "$@" 2>/dev/null
fi
Thanks, fixed the elif.
About doing that in downloader: I've decided to keep it outside, because as I understand it, downloader.sh is passed one URL at a time, and I need to pass multiple URLs at once. I found that even curl can accept multiple URLs.
Do you think it could work in downloader.sh without breaking anything else?
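For reference, curl does accept several URLs in one invocation, but it needs one `-O` per URL to save each file under its remote name. A hypothetical helper (not from this PR) that builds such an argument list:

```shell
#!/bin/bash
# Hypothetical helper: pair each URL with its own -O flag, since
# `curl url1 url2` with a single -O would write the second body to stdout.
curl_args() {
    local args=()
    for url in "$@"; do
        args+=(-O "$url")
    done
    printf '%s\n' "${args[@]}"
}
# Usage sketch: curl -L --retry 20 $(curl_args "$URL1" "$URL2")
```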
Closing this in favor of #8002
…loy logs (#8002)

* downloader in parallel (cherry picked from commit 6e8819c)
* Update download_libs.sh - parallel downloads (cherry picked from commit 2391c4f)
* wget2 changes, downloader fixes
* Updated downloader script to support parallel downloads, no-ssl, wget2, curl, wget; cleaned up download_libs with better logs and overwrite settings for the target download type
* downloader: fix up URLs, parallel
* Downloader: fix parallel curl, add commands
* Apothecary: fix up commands
* Download script osx: no need to send -a, was causing a double download
* Downloader updates: fixes command determination; now tracks which of curl/wget2/wget is installed in a variable; added wget2 params to override or continue the last download (-nc); 20 retries; console formatting; added an error message if no command is installed
* cURL CI test
* wget CI test
* wget2 CI test / SSL off
* download_libs CI: just BITS target
* CI: VS bits off for the moment
* CI: test downloader --silent
* Download libs: fix arch
* Download script v3.0.1: fixes for cURL in silent mode; cURL now has a progress bar; version in echo; wget2/wget verbose turned off; download_libs silent now passed as a bool
* download_libs: fixed Overwrite - overwrite/remove old libs was not working at all and not looping correctly for core libs; rewrote the loop; Overwrite now removes old libs only for the target type being downloaded (e.g. only osx when downloading osx) rather than all for that library; Overwrite for addons now works if folders are deleted
* downloader 3.1.0 wget/wget2: now checks the remote file vs local if found and optimises; won't download if they are the same
* download_libs 2.1: downloaded files (zip/tar.bz2) are now stored in libs/download/; creates libs/download for optimisation; log clean-up; no longer removing the downloaded zip; code to remove zips on start is also done but commented out
* download_libs: set PLATFORM to target if set; include overwrite off
* downloader 3.1.1: fixes for cURL; no-ssl fixed
* GitHub Actions workflow: fix concurrency so PRs are no longer cancelled by other PRs with a different hash; fall back to sha for non-PRs
* cURL optimisation with remote and local file check, using HEAD flags so it only queries the content size and time rather than fetching the whole file
* GitHub Actions: clean up cache
* Linux cache off
* apt-get broken for ssl-dev
* no cache on libssl
* cache: remove ssl/curl
* chmod +x for msys2
* cache disabled for linux
* linux cache?
* ubuntu-22.04
* Downloader 3.2.1: wget2 re-enabled as default if available

Co-authored-by: Dimitre <dimitre@users.noreply.github.com>
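The "HEAD flags" optimisation mentioned above queries only the response headers to learn the remote size, rather than fetching the whole file. A minimal sketch of the header-parsing half (a hypothetical helper, not the script's actual code; `curl -sIL "$url"` would supply the headers on stdin):

```shell
#!/bin/bash
# Parse Content-Length from HTTP response headers on stdin, as produced
# by `curl -sIL <url>`; -L means redirects may yield several header
# blocks, so the last Content-Length seen is the one that matters.
parse_content_length() {
    tr -d '\r' | awk 'tolower($1) == "content-length:" { size = $2 } END { print size }'
}
```

The result can then be compared against `stat` output for the local file to decide whether to skip the download.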
So all files can be downloaded faster, in parallel,
if wget2 is installed locally;
if not, it falls back to curl, then wget.
cc @danoli3