#] #] *********************
#] "$d_SysMaint"'internet & wifi/wget notes.txt'
# www.BillHowell.ca 01Sep2023 initial
# from 26Mar2021 split-up of "$d_SysMaint"'Linux/files online - [curl,lftp,wget] notes.txt'

#48************************************************48
#24************************24
# Table of Contents, generate with :
# $ grep "^#]" "$d_SysMaint"'internet & wifi/wget notes.txt' | sed "s/^#\]/ /"
#
#24************************24

08********08
#] 22Feb2024 download URL directory listing : $ wget --list-only (use curl???!!!)
	see "$d_bin""wget download.sh"
	--list-only is NOT an option in the LMDE version of wget - it is a curl option (see the man curl excerpt below)
	maybe just use lftp?

08********08
#] ?date? - use [wget for downloads, curl for attributes, etc]?
	https://www.howtogeek.com/447033/how-to-use-curl-to-download-files-from-the-linux-command-line/
	see also :
		"$d_bin""wget download.sh"
		"$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
		[lftp,curl,FileZilla,wget]...

08********08
#] 26Mar2021 First thing - I must be able to get a full directory listing of the website!

08********08
#] 26Mar2021 man wget
NAME
	Wget - The non-interactive network downloader.

SYNOPSIS
	wget [option]... [URL]...

DESCRIPTION
	GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

	Wget is non-interactive, meaning that it can work in the background while the user is not logged on. This allows you to start a retrieval and disconnect from the system, letting Wget finish the work. By contrast, most Web browsers require the user's constant presence, which can be a great hindrance when transferring a lot of data.

	Wget can follow links in HTML, XHTML, and CSS pages, to create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded files to point at the local files, for offline viewing.

	Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.

	Wget does not support Client Revocation Lists (CRLs), so the HTTPS certificate of the site you are connecting to might have been revoked by the site owner.

>> note: the "-l, --list-only" text below is actually from man curl, not man wget.
   wget has no --list-only option (in wget, -l is --level=depth, the recursion depth);
   the "(FTP POP3)" protocol tags and the "Added in 7.21.5" version line are curl man-page conventions.

-l, --list-only
	(FTP POP3)

	(FTP) When listing an FTP directory, this switch forces a name-only view. This is especially useful if the user wants to machine-parse the contents of an FTP directory since the normal directory view doesn't use a standard look or format. When used like this, the option causes an NLST command to be sent to the server instead of LIST.

	Note: Some FTP servers list only files in their response to NLST; they do not include sub-directories and symbolic links.

	(POP3) When retrieving a specific email from POP3, this switch forces a LIST command to be performed instead of RETR. This is particularly useful if the user wants to see if a specific message id exists on the server and what size it is.

	Note: When combined with -X, --request, this option can be used to send a UIDL command instead, so the user may use the email's unique identifier rather than its message id to make the request. Added in 7.21.5.
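>> hedged sketch, a few ways to pull a remote directory listing; ftp.example.com
   and /pub/ are placeholder names, not my site - adapt before use :

	# curl : name-only listing via NLST (the --list-only option excerpted above)
	$ curl --list-only 'ftp://ftp.example.com/pub/'

	# curl : full LIST output (permissions, sizes, dates) - simply omit --list-only
	$ curl 'ftp://ftp.example.com/pub/'

	# wget : no --list-only, but FTP retrievals generate a temporary .listing file,
	#        and --no-remove-listing tells wget to keep that file for inspection
	$ wget --no-remove-listing 'ftp://ftp.example.com/pub/'
	$ cat .listing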
>> 22Feb2024 --list-only doesn't work on my LMDE because it is a curl option, not a wget one -
   just use lftp???
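>> hedged sketch, lftp as the fallback for listings; the host, user, and directory
   names below are placeholders (lftp must be installed, e.g. $ apt install lftp) :

	# one-shot, non-interactive : name-only listing of a single directory
	$ lftp -c "open ftp://ftp.example.com; cls -1 /pub/"

	# recursive listing of a whole subtree, via lftp's built-in find
	$ lftp -c "open ftp://ftp.example.com; find /pub/"

	# same idea over sftp, for a site that only allows ssh access
	$ lftp -c "open sftp://user@example.com; cls -1 /www/"

# enddoc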