#]
#] *********************
#] "$d_SysMaint"'Linux/lftp notes.txt' - [upload, maintain] webSite, my main app for this!!
# 04Sep2022 initial
# see also :
# "$d_SysMaint"'Linux/permissions notes.link.txt' - list of some somewhat-related files

#48************************************************48
#24************************24
# Table of Contents, generate with :
# $ grep "^#]" "$d_SysMaint"'Linux/lftp notes.txt' | sed "s/^#\]/ /"
#
  *********************
  "$d_SysMaint"'Linux/lftp notes.txt' - main program for [upload, maintain] of webOnline
  18Sep2023 permissions : "$d_PROJECTS"'webHtmlOnly/'
  18Sep2023 rsync with ssh?
  18Sep2023 search "Linux lftp and --only-newer doesn't work"
  17Sep2023 https://linoxide.com/linux-how-to/lftp-commands/
  12Sep2023 search "Linux and how do I set permissions for a file extension on a web site?"
  04Sep2023 check links in d_TrNNs_ART -> most work well!
  03Sep2023 run dir_updateHtml_dWeb '230902 14h50m28'
  03Sep2023 create fileops.sh function to generate list of d_web html files
	webSite_errLinkLines() - extract linkErrorLines, and
	sub-select linkNotErrorOnly '!!linkError!!'
	this is also done by webSite_link_counts and Qnial operator :
	'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
  02Sep2023 run dir_update_dWeb() with updateLast argument
  02Sep2023 make-like exclusion of files that haven't changed since last update
  02Sep2023 "$d_PROJECTS"'webHtmlOnly/' - many files no extension...
  01Sep2023 "ProjMini" now missing from /billhowell.ca/webHtmlOnly
  01Sep2023 [fil, dir] permissions :
  01Sep2023 upload "$d_TrNNs_ART"
  01Sep2023 lftp errorMsg : `*.html': Invalid preceding regular expression
  01Sep2023 test dir_update_dWeb(), see if corrected html uploaded
  01Sep2023 rm all z_[Archive, Old] from "$d_PROJECTS"'webHtmlOnly/'
  31Aug2023 create dir "$d_PROJECTS"'webHtmlOnly/', rsync html files there, change
  31Aug2023 search "Linux lftp and can I use bash expressions after connecting?"
  31Aug2023 How do I change all paths in html files from local to online during lftp?
  30Aug2023 update d_TrNNs_ART
  10May2023 upload pPthL
  09Jan2023 doesn't upload new files? "$d_PROJECTS""bin - secure/lftp update specified dir.sh"
  04Sep2022 search 'lftp exclude directories'

#24************************24
#] Setup, ToDos,

#] permissions
$ sudo find "$d_web" -type d -print0 | xargs -0 chmod 755
$ sudo find "$d_web" -type f -print0 | xargs -0 chmod 644
$ sudo find "$d_PROJECTS"'webHtmlOnly/' -type d -print0 | xargs -0 chmod 755
$ sudo find "$d_PROJECTS"'webHtmlOnly/' -type f -print0 | xargs -0 chmod 644

FileZilla used to change permissions : /billhowell.ca/ [dir = 755, files = 644]
>> may have to allow additional permissions for [html, ndf, sh, txt]?
>> I haven't been able to be that specific with FileZilla

#] +-----+
#] timeLog of notes
#24************************24

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] ??Jan2024

08********08
#] 24Jan2024 pWebPageL_upload_online() : may be deleting files not in d_webOnly???
	man lftp :
		-e, --delete	delete files not present at the source
	dWeb_uploadNonWebPage_online() :
		mirror --reverse --delete --exclude-rx-from="$pExcludes" --only-newer --log="$pLogTmp" "$LOCALDIR" "$REMOTEDIR"
	pWebPageL_upload_online() :
		mirror --reverse --exclude-glob="*.*" --include-glob="*.html" --only-newer --log="$pLogTmp" "$LOCALDIR" "$REMOTEDIR"
	>> so ONLY dWeb_uploadNonWebPage_online deletes files from webSiteOnline that aren't on d_web
	BUT :
		Index of /economics, markets
		Parent Directory
		0_BillHowell market news/
		Fischer, David 1996 The Great Wave/
		Freeman 27Oct2000 The Quality Adjustment Method, How Statistical Fakery Wipes Out Inflation.html
		Long term market indexes & PPI 0582.html
		Nuclear for tar sands 23Sep05.html
		PE Schiller forward vs 10yr Tbills/
		[interest, exchange] rates/
		currency-crypto/
		market data/
	versus local >> MUCH larger! so most [dir, fils] have been deleted!!
	"$d_PROJECTS"'bin - secure/webSite update online lftp.sh'
	do full :
		# onLine update
		dWeb_uploadNonWebPage_online
		pWebPageL_upload_online
	then check results

08********08
#] 18Sep2023 dir_updateWebPage_dWeb - pwd in final report!
	"$d_PROJECTS"'bin - secure/dir_updateWebPage_dWeb log 230918 23h16m10s.txt' :
		get -O ftp://billhowellweb:ad91019a@billhowell.ca/billhowell.ca/ProjMajor/Sun pandemics, health/corona virus/References file:/home/bill/web/ProjMajor/Sun pandemics, health/corona virus/References/Seneff, Nigh 10May2021 Worse Than the Disease: Reviewing Some Possible Unintended Consequences of the mRNA Vaccines Against COVID-19.pdf

08********08
#] 18Sep2023 lftp -N, --newer-than=SPEC  download only files newer than specified time
	As other options fell flat, I could go back to this, which I did use for a short while with LOCAL files?
	eg notes below : 02Sep2023 make-like exclusion of files that haven't changed since last update
	>> but my earlier work was with while-loops, sed?
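A make-like alternative could keep a stamp of the last successful upload and feed it to mirror as a --newer-than SPEC, instead of the while-loop/sed logic. Sketch only; the stamp file name and SPEC format are assumptions (the date forms shown are the ones the superuser/stackoverflow answers below report working, e.g. "2015-02-03" and "now-7days") :

```shell
#!/bin/sh
# Sketch (hypothetical stamp file): derive a --newer-than SPEC for mirror
# from the time of the last successful upload.
pLastUpload="${TMPDIR:-/tmp}/lastUpload.txt"	# hypothetical stamp file
if [ ! -f "$pLastUpload" ]; then
	# no stamp yet : force a full upload by dating from the epoch
	date -d '1970-01-01 00:00:00' '+%Y-%m-%d %H:%M:%S' > "$pLastUpload"
fi
SPEC=$( cat "$pLastUpload" )
echo "would run : mirror --reverse --newer-than=\"$SPEC\" ..."
# after a successful upload, refresh the stamp for the next run :
date '+%Y-%m-%d %H:%M:%S' > "$pLastUpload"
```
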
	change :
		mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer "$LOCALDIR" "$REMOTEDIR"
	to :
		mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --newer-than=SPEC "$LOCALDIR" "$REMOTEDIR"
	>> but examples of SPEC?

+-----+
search "lftp --newer-than=SPEC examples"

+-----+
https://superuser.com/questions/548692/how-can-i-automate-ftp-downloads-based-on-date-without-bi-directional-syncing
How can I automate FTP downloads based on date without bi-directional syncing?
	Asked 10 years, 7 months ago; Modified 8 years, 7 months ago; Viewed 2k times
+--+
	This answer is probably coming too late for you, but I'll answer for anyone else who stumbles across this in their search for something similar.
	LFTP's mirror function includes a --newer-than= function.
	For files newer than last week : --newer-than=now-7days
	For files newer than a specified date : --newer-than=2015-02-03
	Here's an example :
		lftp -p -u , sftp://
		set mirror:use-pget-n 5
		mirror -L -c -P5 --newer-than=now-7days
	answered Feb 11, 2015 at 20:45, melcron
>> Howell: BINGO! --newer-than=2015-02-03

+-----+
https://stackoverflow.com/questions/26314881/lftp-download-only-files-created-the-same-day-i-execute-lftp
LFTP, download only files created the same day I execute LFTP
	Asked 8 years, 11 months ago; Modified 8 years, 9 months ago; Viewed 6k times
	It has this --newer-than=SPEC option to download only files newer than a specified time. For your specific needs, use --newer-than=now-1days. Now - 1 day should be yesterday, therefore lftp will download all the files newer than yesterday.
	Refer here for more info : http://lftp.yar.ru/lftp-man.html
	EDIT: While I was tweaking my script, I noticed there's an --only-newer option which downloads only newer files, which is also useful for your case but with slight changes.
	--only-newer checks the destination folder and downloads any files from source that are not in the destination folder, while --newer-than downloads any files that are newer than the time you specified, without checking the destination folder.
	edited Dec 2, 2014 at 9:08; answered Dec 2, 2014 at 8:21, Archie
>> promising : "--newer-than=now-1days"
	I can easily calculate that based on the date of the last backup
	What other SPECs are available?

08********08
#] 18Sep2023 fNam had caused stop in upload
	fNam had "between irrefutable evidence" :
		Bhakdi, Burkhardt 10Dec2021 On COVID vaccines- why they cannot work, and irrefutable evidence of their causative role in deaths after vaccination.pdf
	caused stop in upload :
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) eta:0s [Waiting for transfer t
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Waiting for transfer to compl
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Store failed - you have to re
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Logging in...]
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Waiting for response...]
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Not connected]
		`...ence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Logging in...]
		^Cidence of their causative role in deaths after vaccination.pdf' at 63997 (100%) [Waiting for response...]

08********08
#] 18Sep2023 search "lftp and how do I limit log output to exceptions?"
	mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer --log="$pLog" 2>>"$pLog" "$LOCALDIR" "$REMOTEDIR"

	15:42$ bash "$d_PROJECTS"'bin - secure/lftp update entire webSite.sh'
		mirror: Access failed: /home/bill/2: No such file or directory
		Unknown command `/home/bill/web/'.
	change :
		lftp $PROTOCOL://$URL <<- UPLOAD
		user $USER "$PASS"
		mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer --log="$pLog" 2>>"$pLog" "$LOCALDIR" "$REMOTEDIR"
		close
		UPLOAD
	to :
		lftp $PROTOCOL://$URL <<- UPLOAD
		user $USER "$PASS"
		debug -t -o debug.log 9
		mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer --log="$pLog" "$LOCALDIR" "$REMOTEDIR"
		close
		UPLOAD
	>> but what is the "9" for?
	YIKES!! still wayyyy too much verbiage - take out debug

+-----+
https://stackoverflow.com/questions/53047443/how-can-i-log-lftp-errors-in-bash-script
How can I log lftp errors in bash script
	Asked 4 years, 10 months ago; Modified 6 months ago; Viewed 6k times
	Just redirect and append standard output and standard error output to the logfile.
		lftp -u ${USER},${PASSWD} $1 2>>"$LOGFILE" >>"$LOGFILE"
>> Howell : nyet
+--+
	Increasing the debug level does not log all errors and warnings (e.g. clobber not set)! Redirecting stderr does. Even if OP could solve his problem with the debug, it is not a correct answer to this question. This should be the accepted answer.
	FYI: lftp sftp://$IP:$PORT -u"$USERNAME","$PASSWORD" -e "set xfer:log-file $download_log;get $FILE;" >> $download_log 2>&1 does the same as above with another style.
	- Michael P, May 13, 2020 at 7:12
>> Howell : nyet
+--+
	You can use this approach and be able to save the debug log, which you can refer to later when you want to check what happened.
	example with all variables :
		lftp -d -u ${ftpUser},${ftpPasswd} ${ftpUrl} << EOF
		set xfer:clobber on;
		set net:max-retries 2;
		debug -t -o debug.log 9
		set ssl:verify-certificate no;
		cd ${FOLDER_LOCATION}
		mget ${VERSION}*.zip
		bye
		EOF
	answered Mar 13 at 23:02, Kidane

08********08
#] 18Sep2023 permissions : "$d_PROJECTS"'webHtmlOnly/'
	I added :
		$ sudo find "$d_PROJECTS"'webHtmlOnly/' -type d -print0 | xargs -0 chmod 755
		$ sudo find "$d_PROJECTS"'webHtmlOnly/' -type f -print0 | xargs -0 chmod 644
	FileZilla used to change permissions : /billhowell.ca/ [dir = 755, files = 644]
	>> may have to allow additional permissions for [html, ndf, sh, txt]?
	>> I haven't been able to be that specific with FileZilla

08********08
#] 18Sep2023 rsync with ssh?
+-----+
https://stackoverflow.com/questions/11490145/why-lftp-mirror-only-newer-does-not-transfer-only-newer-file
Why lftp mirror --only-newer does not transfer "only newer" file?
	Asked 11 years, 2 months ago; Modified 9 months ago; Viewed 22k times
	I want to automate uploading files for my websites. But the remote server does not support ssh, so I tried the lftp command below instead of rsync.
		lftp -c "set ftp:use-mdtm no && set ftp:timezone -9 && open -u user,password ftp.example.com && mirror -Ren local_directory remote_directory"
	If local files are not changed, no files are uploaded by this command. But when I change a file and run the command, all files are uploaded. I know lftp/ftp's MDTM problem, so I tried "set ftp:use-mdtm no && set ftp:timezone -9", but all files are uploaded even though I changed only one file.
	Does anyone know why lftp mirror --only-newer does not transfer "only newer" files?
	lftp; asked Jul 15, 2012 at 7:07, Noah Kobayashi

08********08
#] 18Sep2023 search "Linux lftp and --only-newer doesn't work"
+-----+
https://stackoverflow.com/questions/11490145/why-lftp-mirror-only-newer-does-not-transfer-only-newer-file
Why lftp mirror --only-newer does not transfer "only newer" file?
	Asked 11 years, 2 months ago; Modified 9 months ago; Viewed 22k times
	I also tried this combination without success : "lftp -e ""mirror -c --reverse --only-newer --ignore-time"
	- Lukas Lukac, Oct 20, 2015 at 21:03

+-----+
https://stackoverflow.com/questions/9611271/why-does-the-lftp-mirror-command-chmod-files
Why does the lftp mirror command chmod files
	Asked 11 years, 6 months ago; Modified 2 years, 9 months ago; Viewed 11k times

Stumped for now...

18Sep2023 search "lftp option --only-newer doesn't work"
+-----+
https://github.com/lavv17/lftp/issues/14
mirror --only-newer doesn't work because of wrong timestamp #14
	dreaming-augustin, Jan 8 2012
	As reported several times on the mailing list and here http://linux.overshoot.tv/ticket/211 , the mirror option doesn't work because of a wrong offset between the local timestamp and the remote timestamp.
	The timestamps of the local files are correct (they report the proper timestamp GMT). However, lftp misrepresents the timestamps of the remote files. The remote server is set at GMT+1. However, since the connection to the server is via FTPES, the server properly displays the dates/times as UT (GMT), just as the protocol requires. The problem is that, for some reason, lftp still interprets those times as being GMT+1. Thus, the local files always appear to be newer than the remote files, rendering the mirror options --only-newer and --reverse useless : the whole file directory is always uploaded to the remote server as if every single file had been updated.
	If required, I can set up an account on the remote server in question to help you reproduce this behaviour.
>> This might be the best answer. Years ago, I tried a time offset, but maybe I should do it again?
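If the GitHub issue's diagnosis applies here, the compensation knob is ftp:timezone - and it is an lftp *setting* (a "set" line), not a flag on the mirror command. A minimal sketch that only generates the lftp command file, with the offset value and directories as assumptions (nothing is sent to a server) :

```shell
#!/bin/sh
# Sketch : build an lftp command file with ftp:timezone on a "set" line
# before mirror. Offset (GMT-3) and directories are placeholder assumptions.
LOCALDIR="/home/bill/web/"
REMOTEDIR="/billhowell.ca/"
pCmds="${TMPDIR:-/tmp}/lftp_cmds.txt"
cat > "$pCmds" <<EOF
set ftp:timezone GMT-3
mirror --reverse --delete --only-newer "$LOCALDIR" "$REMOTEDIR"
EOF
# would then be run as (not executed here) :
#	lftp -u "$USER","$PASS" "ftp://$URL" -f "$pCmds"
cat "$pCmds"
```
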
********
18Sep2023 search "lftp time offset between server and local"
+-----+
https://superuser.com/questions/1189234/add-time-offset-for-mirroring-via-lftp
Add time offset for mirroring via lftp
	Asked 6 years, 6 months ago; Modified 6 years, 6 months ago; Viewed 1k times
	If you look at the manual you've linked, lftp has an option that enables you to define the timezone for the remote site. It might be worth it to try to set it and see whether lftp correctly compensates.
		ftp:timezone (string)
			Assume this timezone for time in listings returned by the LIST command. This setting can be a GMT offset [+|-]HH[:MM[:SS]] or any valid TZ value (e.g. Europe/Moscow or MSK-3MSD,M3.5.0,M10.5.0/3). The default is GMT. Set it to an empty value to assume the local timezone specified by the environment variable TZ.
	In addition it has a switch to ignore the time (--ignore-time) which might enable you to make it rely only on file size, and some switches (--newer-than and --older-than) to define a time "that matters", so you might be able to set it up in a way that not all files are transferred.
	answered Mar 16, 2017 at 9:57, Seth

	Try :
		mirror ftp:timezone GMT-3 --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer "$LOCALDIR" "$REMOTEDIR"
	>> can't get the right [placement, format]

********
18Sep2023 search "lftp time offset"
	nothing useful

08********08
#] 17Sep2023 https://linoxide.com/linux-how-to/lftp-commands/
	# good description of basics

old_bag() {
	mirror --reverse --only-newer "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/"
	mirror --reverse --only-newer "/media/bill/HOWELL_BASE/Website/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/" "/billhowell.ca/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/"
	mirror --reverse --only-newer "/media/bill/HOWELL_BASE/Website/economics, markets/SP500/multi-fractal/" "/billhowell.ca/economics, markets/SP500/multi-fractal/"
	# dry-runs :
	mirror --reverse --only-newer --dry-run "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/"
	mirror --reverse --only-newer --dry-run "/media/bill/HOWELL_BASE/Website/economics, markets/SP500/multi-fractal/" "/billhowell.ca/economics, markets/SP500/multi-fractal/"
	# specialDir_update 'Personal/230115 Calgary - New Zealand/'
	# specialDir_update 'Mythology/'
	# specialDir_update 'economics, markets/market data/'   # doesn't recurse!
	# specialDir_update 'webWork files/'
}

08********08
#] 12Sep2023 search "Linux and how do I set permissions for a file extension on a web site?"
	see "$d_SysMaint"'Linux/permissions notes.txt'

08********08
#] 04Sep2023 check links in d_TrNNs_ART -> most work well!
	browser loads well : [txt, png, html, ]
	problems : [sh (?), ]
		.sh gives 2 choices : [open with, save file]
		open with : couldn't find bash, I need to verify that it CAN'T be run!!
	I did set geany as default txtEditor - works well

08********08
#] 03Sep2023 run dir_updateHtml_dWeb '230902 14h50m28'
	I comment-out the upload part of dir_updateHtml_dWeb()
	only 3 files updated!!??
	>> they haven't been changed in d_web since '230902 14h50m28'
	set date to '170101 12h00m00' to force all to be redone?
	>> OK, 369 files (not 377) processed
	check 'home.html'
	>> OK, works!!
	un-comment the upload part of dir_updateHtml_dWeb(), run again
	$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
		date: extra operand ‘%y%m%d %kh%Mm’
		Try 'date --help' for more information.
		open: Not connected
	>> why wouldn't it connect!!!????
	was missing :
		PROTOCOL="ftp"
		URL="billhowell.ca"
		USER="billhowellweb"
		PASS="ad91019a"
		REGEX="*"
	put it back, run again
	$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
		date: extra operand ‘%y%m%d %kh%Mm’
		Try 'date --help' for more information.
		mirror: : No such file or directory
	>> I had --log="$pLog_html_html"; I changed to pLog for all 3 dir_update[Non, Caption, ]Html_dWeb
		3 different pLogs defined
	This time it seemed to work - now to check webSite
	Same link problem :
		Forbidden
		You don't have permission to access this resource. Server unable to read htaccess file, denying access to be safe
	>> server must have execute access?
	change chmod rules, see "$d_SysMaint"'Linux/chown & chmod notes.txt'

	OPM Open Porous Media is ridiculously too large - move to d_PROJECTS, just leave part online
	ClimateGate emails also very large

+-----+
olde code "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
	# Procedures - To run :
	# 22Sep2020 - dry-run is giving NO indication of file transfers that will be made!?
	#	try -v for verbose option
	#	- still no output!?!?!?
	# 1. find all z_[Archive, Old] directories on d_webSite (no -maxdepth) :
	#	$ find "$d_webSite" -type d | grep "z_Old\|z_Archive" | sort
	#	add check to top of pLog : "$d_web""webWork files/$date_ymdhm rsync web_to_webOnline log.txt"
	#	manually delete remaining dirs, not sure why they weren't deleted by rsync?
	# 2a. Count d_webSite html files :
	#	$ find "$d_webSite" -type f \( -name "*.html" \) | wc -l
	#	webPage count from 'webSite webPageList.txt' (2022 dates below)
	#		date	05Jul
	#		.html	379
	#		webPage	225
	# 2b. [transfer, overwrite] of .html files
	#	>> 05Jul2022 ouch!! too many to check (counts above), some other day when I have much more time
	#	- all legitimate webPages should have been retained, but not overwritten!!

08********08
#] 03Sep2023 create fileops.sh function to generate list of d_web html files
	This is working well, still needs some code tweaks...
	see "$d_web"'webWork/[htmlL all.txt, htmlL [head, foot].txt, htmlL Menus.txt, htmlL non-webWork.txt, htmlL webPage Howell.txt]'
	which .ndf file generated this??
		"$d_Qndfs"'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
		webRawe_extract_pathsSubDirsFnames IS - all relevant webSite paths to p_[all, html]FileList
			sorted [fname, all] lists, auto-run on initial loaddef, manually invoke to update while working
	+-----+
	URLs - check and count, helps for debugging
		webURLs_extract IS - extract all link urls from a website [external, internal, menu, pagePosn]
			this is independent of other optrs, it creates d_temp files, to be further processed by urls_check
		urls_check IS OP linkType - create sublists of [internal, external] links classed as [fail, OK]
			check internals with path_exists "f, externals with curl
		webSite_link_counts IS - summarize the counts for links [external, internal, menu, tableOfContent]s
	>> but this file has disappeared??? last modified 30Aug2023!!
	$ find "$d_web"'Qnial/MY_NDFS/z_Old/' -type f -name "*webSite maintain [menu*" | sed 's|\/home\/bill\/web\/Qnial\/MY_NDFS\/z_Old\/||' | sort
	>> most recent : 210901 19h53m33s webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
	>> sheesh - I had already opened this, didn't notice the operators above, but the whole section is missing!
		URLs - check and count, helps for debugging
	Did I move it to file_ops.ndf?
	try : 210608 15h51m36s webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
	>> got it!
	looks like htmlPathsSortedByPath was generated separately somewhere - probably a bash cmd?
	webRawe_extract_pathsSubDirsFnames IS - all relevant webSite paths to p_[all, html]FileList
		webRawe_extract_pathsSubDirsFnames IS
		{ LOCAL fnames fnamePosns paths p_allFileList p_nonuniqueFileList subDirs ;
		NONLOCAL d_webRawe p_webPageList flag_webRawe_extract_pathsSubDirsFnames d_webWork
			allPathsSortedByFname allSubDirsSortedByFname allFnamesSortedByFname allPathsSortedByPath allSubDirsSortedBySubdir
			allMulplicateIndxs allMulplicateFnames allMulplicateSubDirs
			htmlPathsSortedByFname htmlSubDirsSortedByFname htmlFnamesSortedByFname htmlPathsSortedByPath htmlSubDirsSortedBySubdir ;
	bingo!! all html files :
		host link 'find "' d_webRawe '" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations|i9018xtp.default/extensions/" | sort -u >"' p_webPageList '" ' ;
		htmlPathsSortedByPath := strList_readFrom_path p_webPageList ;

	I generated "$pHtmlOnlyL", adding --invert-match items as I perused the file :
		find "$d_web" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive\|z_old\|z_archive\|System_maintenance\|Qnial_bag\|Cool emails\|Yoonsuck Choe - conf program book\|captions html\|Top 75 Immunotherapy startups_files\|OPM\/OPM\|code develop_test\|Qnial\/Manuals\|Weart 2003 - The Discovery of Global Warming\|Randi Foundation 2008-2011 ANthony Peratt's model of univese" | sort -u >"$pHtmlOnlyL"
	>> resulted in 377 html files, almost all are my own
	I added the function :
		webSite_get_pHtmlL() - generate list of Howell's html files on webSite
	to : "$d_Qndfs"'file_ops.ndf'
		of course, it's also in : "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
	also created in fileops.sh :
	webSite_get_internalLinks() - extract internal links (to within webSite)
	>> not ready yet

+-----+
olde code
	# exclude : \|i9018xtp.default\/extensions
	# find html files in d_web, exclude Grossberg 'captions html/' in d_TrNNs_ART
	# initial version : find "$d_web" -type f -name "*.html" | grep --invert-match "z_Old\|z_old\|z_Archive\|z_archive\|captions html" | sort >"$pHtmlOnlyL"
	# does d_TrNNs_ART ONLY!!!

#] webSite_errLinkLines() - extract linkErrorLines, and
#]	sub-select linkNotErrorOnly '!!linkError!!'
#]	this is also done by webSite_link_counts and Qnial operator :
#]	'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
webSite_errLinkLines() {
	grep '!!linkError!!' "$d_webWork"'webURLs_extract allLinkLines.txt' | sed 's|\(.*\)!!linkError!!\(.*\)">\(.*\)[A,a]>\(.*\)|!!linkError!!\2|g' >"$d_webWork"'webURLs_extract errLinkLines.txt'
	grep --invert-match "^!!linkError!!$" "$d_webWork"'webURLs_extract errLinkLines.txt' >"$d_webWork"'webURLs_extract linksNotErrorOnly.txt'
	grep "^!!linkError!!$" "$d_webWork"'webURLs_extract errLinkLines.txt' >"$d_webWork"'webURLs_extract linksErrorOnly.txt'
}

08********08
#] 02Sep2023 run dir_update_dWeb() with updateLast argument
	Hah! Funny, I have http://wwwBillHowell.ca/home/bill/web/ !!!
	>> filemanager crashed and env variables didn't work, try again
	>> the connection is kicking out!! Can I redirect screen output to pLog?
	>> hung up in LibTopoART_v0.97.0/data/AIL12-like_video_dataset/test/
		filMgr nemo wouldn't work at this point, then started working again
		1,083 jpg files!!
	added dir to excludes
	Ctrl-C, then retry [upload failed, filemanager lost d_[web, PROJECTS]] during html upload at :
	+-----+
	mirror: /home/bill/web/webOther/Neil Howell/Dads paintings/scan0240.jpg: Input/output error
	Transferring file `webOther/Neil Howell/Dads paintings/scan0388.jpg'
	mirror: /home/bill/web/webOther/Neil Howell/Dads paintings/scan0388.jpg: Input/output error
	chmod: Access failed: 550 ./035 1983 Joel Quarrington.jpg: No such file or directory
	chmod: Access failed: 550 ./036 1983 David Carroll.jpg: No such file or directory
	+-----+

+-----+
olde code
	if [ "$timeUnixPth" -gt "$timeUnixUpdate" ]; then
		sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$pHtmlOnly" >"$pTmp"
		mv "$pTmp" "$pHtmlOnly"
		pOnline=$( echo "$pHtmlOnly" | sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' )
		echo >>"$pLog" "   $pHtmlOnly"
		# echo >>"$pLog" "   $pOnline"
	fi

	updateLast='230902 9h19m03'
	timeUnixUpdate=$( date_ymdhms_to_timeUnix "$updateLast" )
	echo "updateLast = $updateLast; timeUnixUpdate = $timeUnixUpdate"

	# example : updates of Authors' Guide :
	# lftp billhowellweb@BillHowell.ca:/> mput -c -O "/billhowell.ca/Neural nets/Conference guides/Author guide website" "/media/bill/SWAPPER/Website/Neural nets/Conference guides/Author guide website/*"

	# Olde Code
	# lftp
	# open -u billhowellweb billhowell.ca
	# mirror --reverse --recursion=newer "$d_webSite/Mythology/" "/billhowell.ca"
	# close
	# NOTE: "lftp billhowellweb@BillHowell.ca:/> " is the prompt when in lftp
	# start :
	#	$ lftp
	# remote ftp logon :
	#	$ open -u billhowellweb billhowell.ca   % then type remote

	# test
	#	$ echo 'directory ' | sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g'
	#	directory
	#	>> OK

	pOnline=$( echo "$pHtmlOnly" | sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' )
	# echo >>"$pLog" "   $pOnline"

	# upload html files
	date_ymdhm=$( date +"%y%m%d %kh%Mm" )	# no space after '+', else "date: extra operand"
	echo >>"$pLog" "$date_ymdhm starting upload html..."
	LOCALDIR="$d_PROJECTS"'webHtmlOnly/'
	REMOTEDIR='/billhowell.ca/'
	if ! [ -d "$LOCALDIR" ]; then
		echo >>"$pLog" 'html upload: Cant cd to $LOCALDIR. Please make sure this local directory is valid'
	else
		lftp $PROTOCOL://$URL <<- UPLOAD
		user $USER "$PASS"
		mirror --reverse --exclude-glob="*.*" --include-glob="*.html" --only-newer --verbose=1 --log="$pLog_html" "$LOCALDIR" "$REMOTEDIR"
		close
		UPLOAD
	fi

	# +-----+
	# get last update from pLog in "$d_PROJECTS"'bin - secure/' :
	#	example : '230902 9h19m03 lftp update www-BillHowell-ca log.txt'
	#	see dir_updateNonHtml_dWeb examples at end of this file
	timeUnixUpdate=$( date_ymdhms_to_timeUnix "$updateLast" )
	echo >>"$pLog" "updateLast = $updateLast; timeUnixUpdate = $timeUnixUpdate"
	while IFS='' read -u 9 pHtmlOnly; do
		timeUnixPth=$( pinn_timeModTo_timeUnix "$pHtmlOnly" )
		if [ "$timeUnixPth" -gt "$timeUnixUpdate" ]; then
			sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$pHtmlOnly" >"$pTmp"
			mv "$pTmp" "$pHtmlOnly"
			echo >>"$pLog" "   $pHtmlOnly"
		fi
	done 9<"$pHtmlOnlyL"

08********08
#] 02Sep2023 make-like exclusion of files that haven't changed since last update
	Unix date-time code?
	search "Linux lftp and how do I set the date for newer?"
	man lftp - hard to pick this out, so many options
		-N, --newer-than=SPEC	download only files newer than specified time

+-----+
https://unix.stackexchange.com/questions/35069/is-it-possible-to-transfer-files-in-a-date-range-via-ftp
Is it possible to transfer files in a date range via FTP
	Asked 11 years, 5 months ago; Modified 11 years, 5 months ago; Viewed 5k times
	You can use lftp for that, utilizing its mirror command. Here's a snip from the manpage :
		mirror [OPTS] [source [target]]
			Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
		[cut...]
		-N, --newer-than=SPEC	download only files newer than specified time
		--on-change=CMD		execute the command if anything has been changed
		--older-than=SPEC	download only files older than specified time
		[...]
	Definitely have a look at the manual, as there are really many useful options to mirror - like --allow-chown, --allow-suid or --parallel[=N] for example. Lftp also works with other access protocols, like sftp, fish or http(s).
	edited Mar 27, 2012 at 6:22; answered Mar 26, 2012 at 23:37, rozcietrzewiacz

also
+-----+
https://stackoverflow.com/questions/11490145/why-lftp-mirror-only-newer-does-not-transfer-only-newer-file
Why lftp mirror --only-newer does not transfer "only newer" file?
	Asked 11 years, 1 month ago; Modified 9 months ago; Viewed 22k times
+--+
	On the following page http://www.bouthors.fr/wiki/doku.php?id=en:linux:synchro_lftp the authors state :
		When uploading, it is not possible to set the date/time on the files uploaded, that's why --ignore-time is needed.
	so if you use the flag combination --only-newer and --ignore-time you can achieve decent backup properties, in such a way that all files that differ in size are replaced.
	Of course it doesn't help if you really need to rely on time-synchronization, but if it is just to perform a regular backup of data, it'll do the job.
	edited Dec 1, 2022 at 19:04, A.L; answered Mar 10, 2013 at 9:42, p6majo

+-----+
olde code
	timeStat_to_timeUnix() {
		pinn="$1"
		timeStat=$( stat "$pinn" | grep 'Modify' | sed 's|Modify: ||' )
		year=$( echo "$lastUpdate" | cut --bytes=3,4 )
		mnth=$( echo "$lastUpdate" | cut --bytes=6,7 )
		dayy=$( echo "$lastUpdate" | cut --bytes=9,10 )
		hour=$( echo "$lastUpdate" | cut --bytes=12,13 )
		mint=$( echo "$lastUpdate" | cut --bytes=15,16 )
		scnd=$( echo "$lastUpdate" | cut --bytes=18,19 )
		#ymdhms_to_timeUnix "$year" "$mnth" "$dayy" "$hour" "$mint" "$scnd"
		date -d '2022-11-09T14:41:15.555641007Z' +%s
	}

	#] ymdhms_to_timeUnix() - use linux stat format : $ date -d "$timeStat" +%s
	# 02Sep2023 initial
	# abandoned, incomplete - superseded by the date -d "..." +%s approach above
	ymdhms_to_timeUnix() {
		year='20'"$1"
		mnth="$2"
		dayy="$3"
		hour="$4"
		mint="$5"
		scnd="$6"
		let "yearDays = (year - 1970) *36525"	# = 365.25 days/y, result is 0.01 days since 01Jan1970
		let "mnthUnix = yearDays + (mnth"
		let "dayyUnix = dayy"
		let "hourUnix = hour * 1000*60*60"
		let "mintUnix = mint * 1000*60"
		let "scndUnix = scnd * 1000"
		timeUnix=
	}

	# t_year_to_millisecs = 365.25*24*60*60*1000
	# year_to_minutes = 365.25*24*60
	# year_to_hours = 365.25*24
	# year_to_days = 365.25
	# year_to_weeks = 52.1786
	# year_to_months = 12.0

	$ let "yearUnix = 2023*365*24*60*60*1000"
	$ echo "$yearUnix"
	63797328000000

	year=2023
	$ let "yearUnix = year*365*24*60*60*1000"
	$ echo "$yearUnix"
	63797328000000

08********08
#] 02Sep2023 "$d_PROJECTS"'webHtmlOnly/' - many files no extension...
	# see "$d_SysMaint"'Linux/rm notes.txt' :
	# backer_htmlOnly() :
	#	rm "z_Archive$\|z_archive$\|z_Old$\|z_old$", also high-volume dirs
	# check result :
	#	$ find "$d_PROJECTS""webHtmlOnly/" -type d | grep "z_Archive$\|z_archive$\|z_Old$\|z_old$\|System_maintenance\/tex$\|ProjMajor\/OPM\/OPM\/matlab_mrst$\|ProjMajor\/OPM\/OPM/Octave$\|References\/Climategate $\|garrybog\/pine$" >"$d_temp"'rm webHtmlOnly z_[Archive, Old, high-volume, etc].txt'
	# rm dirs :
	#	$ find "$d_PROJECTS""webHtmlOnly/" -type d | grep "z_Archive$\|z_archive$\|z_Old$\|z_old$\|System_maintenance\/tex$\|ProjMajor\/OPM\/OPM\/matlab_mrst$\|ProjMajor\/OPM\/OPM/Octave$\|References\/Climategate $\|garrybog\/pine$" | tr \\n \\0 | xargs -0 -IFILE rm -r "FILE"
	# rm files without extension
	#	examples - see "$d_temp"'rm webHtmlOnly files without extension list.txt'
	#	eg '/home/bill/PROJECTS/webHtmlOnly/economics, markets/Econometric models/Indexes'
	# check result :
	#	$ find "$d_PROJECTS""webHtmlOnly/" -type f | grep "\/[A-Za-z0-9]*$" >"$d_temp"'rm webHtmlOnly files without extension list.txt'
	# rm files without extension :
	#	$ find "$d_PROJECTS""webHtmlOnly/" -type f | grep "\/[A-Za-z0-9]*$" | tr \\n \\0 | xargs -0 -IFILE rm -r "FILE"

08********08
#] 01Sep2023 "ProjMini" now missing from /billhowell.ca/webHtmlOnly
	why??? "ProjMini" is still in main d_web...
	dir_update_dWeb '/home/bill/web/ProjMini/TrNNs_ART/' '/billhowell.ca/ProjMini/TrNNs-ART/' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt'
	>> doesn't use "$d_PROJECTS"'webHtmlOnly/' for EITHER [, non-]html files!
	02Sep2023 html part set LOCALDIR="$d_PROJECTS"'webHtmlOnly/'
	>> nyet - is creating webHtmlOnly/ in www.BillHowell.ca
	rename 'webHtmlOnly/' to web?
	>> yikes! - rm good dirs
	d_TrNNs_ART is being updated
	dir_update_dWeb() - remove --delete in html part!
change :
	mirror --reverse --delete --exclude-glob="*.*" --include-glob="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
to :
	mirror --reverse --exclude-glob="*.*" --include-glob="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
now writing entire d_web into d_TrNNs_ART
	LOCALDIR="$d_PROJECTS"'web/'
	REMOTEDIR='/billhowell.ca/'

08********08
#] 01Sep2023 [fil, dir] permissions :
see "$d_SysMaint"'Linux/chmod notes.txt'

08********08
#] 01Sep2023 upload "$d_TrNNs_ART"
dir_update_dWeb '/home/bill/web/ProjMini/TrNNs_ART/' '/billhowell.ca/ProjMini/TrNNs-ART/' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt'
worked very nicely - all html uploads listed - needs make-like dating
+-----+
BUT - link of browser works, Menu doesn't?
	brow: http://www.billhowell.ca/ProjMini/TrNNs-ART/webWork/Menu%20zhome,%20all.html
	Menu: http://www.billhowell.ca/ProjMini/TrNNs_ART/webWork/Menu%20zhome,%20all.html
webPage link doesn't work :
	http://www.billhowell.ca/ProjMini/TrNNs_ART/[definitions,%20models]%20of%20consciousness.html
browser dir does work :
	http://www.billhowell.ca/ProjMini/TrNNs-ART/%5bdefinitions,%20models%5d%20of%20consciousness.html
Incorrect link :
	page: http://www.billhowell.ca/ProjMini/TrNNs_ART/John%20Taylors%20concepts.html
	dir : http://www.billhowell.ca/ProjMini/TrNNs-ART/Taylors%20consciousness.html
Diff link same webPage, pageLink doesn't work! :
	page: http://www.billhowell.ca/ProjMini/TrNNs_ART/Taylors%20consciousness.html
	dir : http://www.billhowell.ca/ProjMini/TrNNs-ART/Taylors%20consciousness.html
/billhowell.ca/ProjMini/TrNNs-ART : owner/group: billhowellweb billhowellweb
	dir  drwxr-xr-x
	file -rw-r--r--
/billhowell.ca/ProjMini/Puetz & Borchardt : owner/group: billhowellweb billhowellweb
	file -rw-r(w)---
very inconsistent!! I need to check my past notes on permissions [chown, chgrp, etc] for webSites
Probably can use FileZilla for this? if not I forget which software..
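The [dir = 755, files = 644] convention from the Setup section can at least be checked on the local tree before upload. A minimal sketch - the '/tmp/permcheck' paths are throwaway stand-ins (NOT the real $d_web), and the find expression relies on the implicit -a binding tighter than -o :

```shell
# sketch : list anything under a tree that deviates from [dir = 755, file = 644]
d_check=/tmp/permcheck              # stand-in dir, not the real $d_web
mkdir -p "$d_check/sub"
printf 'x\n' >"$d_check/sub/a.html"
chmod 755 "$d_check" "$d_check/sub"
chmod 644 "$d_check/sub/a.html"
# (dir AND NOT 755) OR (file AND NOT 644) - prints nothing when all complies
find "$d_check" -type d ! -perm 755 -o -type f ! -perm 644
```

A compliant tree prints nothing, so the command doubles as a yes/no compliance test in scripts.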
08********08
#] 01Sep2023 lftp errorMsg : `*.html': Invalid preceding regular expression
mirror: regular expression `*.html': Invalid preceding regular expression
change :
	mirror --reverse --delete --include="*.html" --exclude="*.*" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
to :
	mirror --reverse --delete --exclude="*.*" --include="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
dir_update_dWeb() properly converts via :
	sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$pHtmlOnly"
However, the html file is STILL NOT uploaded!?!?!?
test other upload by adding to '/home/bill/web/bin/0_test/lftp one file' :
	'20120 [before, after] running head-on into a semi-tractor trailor hauling propane.jpg'
>> worked - must be very careful to refresh FileZilla dir because it doesn't show!
So non-html uploads work well, but not html!!????
try to capture errors :
change :
	mirror --reverse --delete --exclude="*.*" --include="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
to :
	mirror >&2 --reverse --delete --exclude="*.*" --include="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
try again
change back :
	REMOTEDIR = /billhowell.ca/bin/0_test/lftp one file
to :
	REMOTEDIR = /billhowell.ca/bin/0_test/lftp one file/
>> NYET - worked for non-html, so that is NOT the problem!!
try glob patterns :
change :
	mirror --reverse --delete --exclude="*.*" --include="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
to :
	mirror --reverse --delete --exclude-glob="*.*" --include-glob="*.html" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
	Transferring file `home.html'
>> crap - it worked!!!??
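The root cause above: lftp's --include/--exclude options take a regular expression, while --include-glob/--exclude-glob take a shell glob. In a regex, a leading * has no preceding token to repeat, hence "Invalid preceding regular expression". A quick shell illustration of the difference (no lftp involved) :

```shell
name='home.html'
# shell glob : "*.html" means "anything ending in .html"
case $name in
	*.html) echo 'glob matches' ;;
esac
# the regex equivalent needs an escaped dot and a $ anchor instead of the
# glob's implicit whole-name match :
echo "$name" | grep -q '\.html$' && echo 'regex matches'
# but a bare "*.html" handed to a regex engine is INVALID : the leading *
# has nothing to repeat - exactly lftp's errorMsg
```

So the original command failed before transferring anything; switching to the -glob variants (or writing the pattern as a regex, e.g. `\.html$`) fixes it.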
08********08
#] 01Sep2023 test dir_update_dWeb(), see if corrected html uploaded
# test file
dir_update_dWeb '/home/bill/web/bin/0_test/lftp one file/' '/billhowell.ca/bin/0_test/lftp one file/' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt'
$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
	 LOCALDIR = /home/bill/web/bin/0_test/lftp one file/
	REMOTEDIR = /billhowell.ca/bin/0_test/lftp one file/
	Usage: cd remote-dir
	cd: Access failed: 550 /home/bill/web/bin/0_test/lftp one file: No such file or directory
	01/09/2023-09:50:03 Starting non-html upload...
	01/09/2023-09:50:06 Finished upload...
	/home/bill/web/bin/0_test/lftp one file/home.html
	http://www.BillHowell.ca/bin/0_test/lftp one file/home.html
	Usage: cd remote-dir
	cd: Access failed: 550 /home/bill/PROJECTS/webHtmlOnly: No such file or directory
	mirror: regular expression `*.html': Invalid preceding regular expression
	01/09/2023-09:50:06 Starting html upload...
	01/09/2023-09:50:08 Finished html upload...
>> oops, use FileZilla to upload "$d_PROJECTS"'webHtmlOnly/'
FileZilla STOOPED - why are non-html files included??
>> mostly executables, files without extension
"$d_bin" - many, manually removed
huge number of html file - add to "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt' :
	/home/bill/PROJECTS/webHtmlOnly/ProjMajor/OPM/OPM/Octave
	/home/bill/PROJECTS/webHtmlOnly/References/Climategate emails
OK - manually checked all dirs, removed non-html, excluded [Octave, Climategate emails]
+-----+
try again :
$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
	 LOCALDIR = /home/bill/web/bin/0_test/lftp one file/
	REMOTEDIR = /billhowell.ca/bin/0_test/lftp one file/
	Usage: cd remote-dir
	cd: Access failed: 550 /home/bill/web/bin/0_test/lftp one file: No such file or directory
	01/09/2023-11:26:23 Starting non-html upload...
	01/09/2023-11:26:26 Finished upload...
	/home/bill/web/bin/0_test/lftp one file/home.html
	http://www.BillHowell.ca/bin/0_test/lftp one file/home.html
	Usage: cd remote-dir
	cd: Access failed: 550 /home/bill/PROJECTS/webHtmlOnly: No such file or directory
	mirror: regular expression `*.html': Invalid preceding regular expression
	01/09/2023-11:26:26 Starting html upload...
	01/09/2023-11:26:28 Finished html upload...
>> WHAT!!?!???
/media/bill/9e05b040-df7b-4e71-936b-1f7c459b8842/web/bin/0_test/lftp one file
>> looks like I have to cold reboot?!?
+-----+
after cold reboot, FileZilla check if-exists http://www.BillHowell.ca/bin/0_test/lftp one file/
>> YES /billhowell.ca/bin/0_test/lftp one file
$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
	 LOCALDIR = /home/bill/web/bin/0_test/lftp one file/
	REMOTEDIR = /billhowell.ca/bin/0_test/lftp one file
	Usage: cd remote-dir
	cd: Access failed: 550 /home/bill/web/bin/0_test/lftp one file: No such file or directory
	01/09/2023-12:37:13 Starting non-html upload...
	01/09/2023-12:37:17 Finished upload...
	/home/bill/web/bin/0_test/lftp one file/home.html
	http://www.BillHowell.ca/bin/0_test/lftp one file/home.html
	Usage: cd remote-dir
	cd: Access failed: 550 /home/bill/PROJECTS/webHtmlOnly: No such file or directory
	mirror: regular expression `*.html': Invalid preceding regular expression
	01/09/2023-12:37:17 Starting html upload...
	01/09/2023-12:37:19 Finished html upload...
+-----+
Maybe trailing '/' problem?
- '/billhowell.ca/bin/0_test/lftp one file/'
change :
	dir_update_dWeb '/home/bill/web/bin/0_test/lftp one file/' '/billhowell.ca/bin/0_test/lftp one file/' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt'
to :
	dir_update_dWeb '/home/bill/web/bin/0_test/lftp one file/' '/billhowell.ca/bin/0_test/lftp one file' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt'
>> no help - besides, in lftp output above :
	 LOCALDIR = /home/bill/web/bin/0_test/lftp one file/
	REMOTEDIR = /billhowell.ca/bin/0_test/lftp one file
Maybe lftp can't take spaces in cd?
+----+
NO - I might have a mix-up in dirs for html-part of upload, change :
	LOCALDIR="$d_PROJECTS"'webHtmlOnly/'
	REMOTEDIR="$2" (= '/billhowell.ca/bin/0_test/lftp one file')
to (for overall webSite htmls) :
	LOCALDIR="$d_PROJECTS"'webHtmlOnly/'
	REMOTEDIR='/billhowell.ca/webHtmlOnly/'
to (for html upload test) :
	LOCALDIR='/home/bill/web/bin/0_test/lftp one file/'
	REMOTEDIR='/billhowell.ca/bin/0_test/lftp one file'
Comment out for BOTH [, non-]html uploads :
	cd $REMOTEDIR
	cd "$LOCALDIR"
loss of log output!? remove pLog :
change :
	mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer --verbose=1 --log="$pLog" "$LOCALDIR" "$REMOTEDIR"
to :
	mirror --reverse --delete --exclude-rx-from="$pExcludesNoHtml" --only-newer --verbose=1 "$LOCALDIR" "$REMOTEDIR"
+-----+
olde code

retain user $USER "$PASS"
remove :
	cd $REMOTEDIR
	cd "$LOCALDIR"
	echo " LOCALDIR = $LOCALDIR"
	echo "REMOTEDIR = $REMOTEDIR"
	echo ''
must not change dirs [, non-]html!
	LOCALDIR="$d_PROJECTS"'webHtmlOnly/'
	REMOTEDIR="$2"
	echo >>"$pLog" " LOCALDIR = $LOCALDIR"
	echo >>"$pLog" "REMOTEDIR = $REMOTEDIR"
	echo >>"$pLog" '+-----+'

08********08
#] 01Sep2023 rm all z_[Archive, Old] from "$d_PROJECTS"'webHtmlOnly/'
see "$d_SysMaint"'Linux/rm notes.txt'
final result :
$ find "$d_PROJECTS"'webHtmlOnly/' -type d | grep "z_Archive$\|z_archive$\|z_Old$\|z_old$" | tr \\n \\0 | xargs -0 -IFILE rm -r "FILE"
check result :
$ find "$d_PROJECTS"'webHtmlOnly/' -type d | grep "z_Archive$\|z_Old$" >"$d_temp"'rm 

08********08
#] 31Aug2023 create dir "$d_PROJECTS"'webHtmlOnly/', rsync html files there, change
>> oops - transferred ALL files
search "rsync and example to transfer only files of a pattern"
+-----+
https://unix.stackexchange.com/questions/421566/rsync-only-files-matching-pattern
Rsync only files matching pattern [duplicate]
Asked 5 years, 7 months ago  Modified 3 years, 7 months ago  Viewed 8k times
+--+
From your code, it looks like you are trying to copy from the remote host to your local machine. With that being said:
	rsync -avP --include='*000.csv' --exclude='*.*' user@host:~/data ./
You don't need r as the a option implies r.
edited Jan 24, 2020 at 8:31 roaima
answered Feb 3, 2018 at 2:45 Nasir Riley
>> Howell : try
	beval 'rsync '"$options"' --include="*.html" --exclude="*.*" "'"$d_src"'" "'"$d_out"'" >>"'"$p_log"'" '
This works, but includes :
	z_[Archive, Old, archive, old]
	files with no extension

08********08
#] 31Aug2023 search "Linux lftp and can I use bash expressions after connecting?"
+-----+
https://stackoverflow.com/questions/27635292/transfer-files-using-lftp-in-bash-script
Transfer files using lftp in bash script
Asked 8 years, 8 months ago  Modified 3 years, 7 months ago  Viewed 85k times
>> not helpful as I don't understand...
+-----+
https://stackoverflow.com/questions/41351405/how-to-put-if-statement-inside-a-lftp-block
How to put if statement inside a lftp block
Asked 6 years, 8 months ago  Modified 6 years, 8 months ago  Viewed 3k times
+--+
A command substitution will do:

#!/bin/bash
cd "$1" || exit
mode=$2
lftp -u found,e48RgK7s sftp://ftp.xxx.org << EOF
set xfer:clobber on
mget *.xml
$(
if [ "$mode" = "prod" ]; then
	echo "Production mode. Deleting." >&2    # this is logging (because of >&2)
	echo "mrm *.xml"                         # this is substituted into the heredoc
else
	echo "Non-prod mode. Keeping files" >&2
fi
)
EOF

Note that inside the substitution for the heredoc, we're routing log messages to stderr, not stdout. This is essential, because everything on stdout becomes a command substituted into the heredoc sent to lftp.
Other caveats to command substitution also apply: They run in subshells, so an assignment made inside the command substitution will not apply outside of it, and there's a performance cost to starting them.
A more efficient approach is to store your conditional components in a variable, and expand it inside the heredoc:

case $mode in
	prod)
		echo "Production mode. Deleting files" >&2
		post_get_command='mrm *.xml'
		;;
	*)
		echo "Non-production mode. Keeping files" >&2
		post_get_command=
		;;
esac
lftp ...

+-----+
olde code - draft fragments of the one-file-at-a-time upload loop :
		<"$pTmp"
		pOnline=$( echo "$pHtmlOnly" | sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' )
		echo "$pHtmlOnly"
		echo "$pOnline"
	done 9<"$pHtmlOnlyL"
)
EOF
# close
# UPLOAD
pHtmlOnlyL="$d_temp"'dir_update_dWeb pHtmlOnlyL.txt'
# find "$d_ProjMini"'TrNNs_ART/' -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive\|z_historical" | sort >"$pHtmlOnlyL"
# 31Aug2023 683 html files!
find "$LOCALDIR" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive\|z_historical" | sort >"$pHtmlOnlyL"
pTmp="$d_temp"'dir_update_dWeb html temp.txt'
USER="billhowellweb"
PASS="ad91019a"
REGEX="*"
while IFS='' read -u 9 pHtmlOnly; do
	sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$pHtmlOnly" >"$pTmp"
	pOnline=$( echo "$pHtmlOnly" | sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' )
	echo "$pHtmlOnly"
	echo "$pOnline"
	#lftp -u billhowellweb,ad91019a
done 9<"$pHtmlOnlyL"
date_ddmmmyyyy_hm=$(date +"%e%b%Y %kh%Mm")
# double quotes (not single) so that $date_ddmmmyyyy_hm expands :
echo >>"$pLog" "ended html: $date_ddmmmyyyy_hm"

# dir_update_dWeb_test - test one-by-one upload, with file change in the middle
# dWeb is a SUB-dir - doesn't include [http://www.BillHowell.ca/, /home/bill/web/]
dir_update_dWeb_test() {
	LOCALDIR="$1"
	REMOTEDIR="$2"
	USER="billhowellweb"
	PASS="ad91019a"
	REGEX="*"
	while IFS='' read -u 9 pHtmlOnly; do
		sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$pHtmlOnly" >"$pTmp"
		pOnline=$( echo "$pHtmlOnly" | sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' )
		echo "$pHtmlOnly"
		echo "$pOnline"
		#lftp -u billhowellweb,ad91019a
	done 9<"$pHtmlOnlyL"
	date_ddmmmyyyy_hm=$(date +"%e%b%Y %kh%Mm")
	echo >>"$pLog" "ended html: $date_ddmmmyyyy_hm"
	}
# LOCALDIR="$d_temp$1"
# # LOCALDIR="$d_temp$1"
# mirror --reverse --delete --only-newer --verbose=1 --include="*.html"

08********08
#] 31Aug2023 How do I change all paths in html files from local to online during lftp?
maybe more reliable to copy to a dirTmp, make changes, then upload??
I've probably already done this!!?? Old approach did it...
$d_web has 30.5 Gb
search "Linux lftp and how do I upload one file at a time?"
+-----+
https://unix.stackexchange.com/questions/637413/sending-only-one-file-through-lftp
Sending only one file through lftp
Asked 2 years, 6 months ago  Modified 2 years, 6 months ago  Viewed 517 times
I am trying to create a command line to send files to my server.
Everything is working great to send a folder, but it fails for a single file. The command I am running is
	lftp -e "set ftp:ssl-allow off; mirror --reverse --verbose --delete -i .htaccess ./ /my-folder/; bye" -u user:password host.com
But the command is sending everything and not only the .htaccess file, and I do not understand why. What is happening and how do I fix it?
lftp
edited Mar 3, 2021 at 20:44 G-Man Says 'Reinstate Monica'
asked Mar 3, 2021 at 16:22 Ajouve
+--+
There is an extraneous ./ in your mirror command.
	mirror --reverse --verbose --delete -i .htaccess ./ /my-folder/
	                                               ^ HERE
answered Mar 3, 2021 at 21:43 xhienne
+-----+
https://superuser.com/questions/323214/how-to-upload-one-file-by-ftp-from-command-line
How to upload one file by FTP from command line?
Asked 12 years ago  Modified 11 months ago  Viewed 455k times
You can also try lftp. Here is an example:
	lftp -e 'cd folder1/folder2; put /home/path/yourfile.tar; bye' -u user,password ftp.theserver.com
Refer here for more details and also refer to the LFTP Manual
edited Feb 4, 2021 at 15:43
answered Jan 10, 2013 at 9:06 divinedragon

08********08
#] 30Aug2023 update d_TrNNs_ART
how do I exclude dirs?
$ man lftp
	-x RX, --exclude=RX  exclude matching files
search "Linux lftp -- exclude directory?"
for d_TrNNS_ART is excluded :
	mirror --reverse --delete --exclude-rx-from="$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt' --only-newer --verbose=1 --log="$d_bin"'lftp update www-BillHowell-ca html log.txt' "$LOCALDIR" "$REMOTEDIR"
search "Linux lftp --exclude-rx-from"
+-----+
https://unix.stackexchange.com/questions/517435/lftp-exclude-specific-folder-only
LFTP exclude specific folder only
Asked 4 years, 3 months ago  Modified 2 months ago  Viewed 2k times
+-----+
https://stackoverflow.com/questions/69406250/skip-some-subdirectories-on-lftp
skip some subdirectories on lftp
Asked 1 year, 11 months ago  Modified 1 year, 7 months ago  Viewed 898 times
Here's the solution for everyone with the same problem. The command should be
	lftp -u $ftpuser,$ftppass -p 21 $ftpserver -e "set ftp:ssl-allow on; set ssl:check-hostname no; mirror -P $ftpconnections $ftproot $backupfolder/$date-$user/ --exclude popup_images/ --exclude info_images/ --exclude thumbnail_images/ --exclude-glob '*.log*' --exclude-glob '*.zip*'; quit"
So my problem was that I tried to exclude a server path instead of just the name of the folder.
answered Jan 27, 2022 at 13:01 Karsten

08********08
#] 10May2023 upload pPthL
maybe just use except & include with lftp mirror
see "$d_PROJECTS""bin - secure/lftp update specified dir.sh"
+-----+
https://unix.stackexchange.com/questions/93587/lftp-login-put-file-in-remote-dir-and-exit-in-a-single-command-proper-quoting
>> nyet, logs in every time
+-----+
https://unix.stackexchange.com/questions/213184/run-lftp-on-a-list-of-files
Run LFTP on a list of files
Asked 7 years, 10 months ago  Modified 7 years, 9 months ago  Viewed 5k times
+--+
How about something like this.
[root@localhost foo]# ls -l file*
-rw-r--r--. 1 root root 33 Jun 30 15:09 filelist
[root@localhost foo]# cat filelist
/tmp/file1
/tmp/file2
/tmp/file3
[root@localhost foo]# awk 'BEGIN { print "open localhost\nuser steve steve\n" } { print "get " $0 } END { print "exit" }' filelist | lftp
[root@localhost foo]# ls -l file*
-rw-r--r--. 1 root root 0 Jun 30 14:57 file1
-rw-r--r--. 1 root root 0 Jun 30 14:57 file2
-rw-r--r--. 1 root root 0 Jun 30 14:57 file3
-rw-r--r--. 1 root root 33 Jun 30 15:09 filelist
[root@localhost foo]#
answered Jun 30, 2015 at 22:11 steve
>> looks good, except it uses awk
+--+
To expand steve's answer, this script mirrors a list of files if needed while preserving directories.
#!/bin/bash
gawk 'BEGIN { print "open ftp://example.com\n user username password\ncd /remote/dir/\n" } { if (match ($0 ,/.+\//, m)) print "mirror -v -O localbasedir/" m[0] " -f " $0 } END { print "exit" }' filelist | lftp
answered Jul 21, 2015 at 12:14 MarZab
Any chance you could give some more detail about that GAWK string? – Robert Mark Bram Aug 3, 2016 at 6:06
lftp mirror can only mirror folders, so I take a list of paths, extract the folder from them, then construct a lftp script with mirror commands for each of the files with their folders ... this means lftp will only copy them from the ftp if they have changed since the last sync – MarZab Aug 4, 2016 at 5:27

08********08
#] 09Jan2023 doesn't upload new files? "$d_PROJECTS""bin - secure/lftp update specified dir.sh"
*.html was in excludes - I removed it
$ bash "$d_PROJECTS""bin - secure/lftp update specified dir.sh"
	 LOCALDIR = /home/bill/web/Mythology/
	REMOTEDIR = /billhowell.ca/Mythology/
	Transferring file `Cardona - gods are planets, planets are gods (plus Sun).html'
	09/01/2023-12:30:29 Starting upload...
	09/01/2023-12:30:34 Finished upload...
>> it worked

08********08
#] 04Sep2022 search 'lftp exclude directories'
After myriad fixes, 'lftp update www-BillHowell-ca.sh' works fairly well
many improvements to come : "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca.sh'
685 files to "remove" - not all of them!!!
	"$d_web"'webWork files/z_Archive/220905 12h37m webOnline file deletes.txt'
>> leave for next time
+-----+
https://stackoverflow.com/questions/44340636/including-and-excluding-files-with-lftp-mirror
Including and excluding files with lftp mirror
Asked 5 years, 3 months ago  Modified 5 years, 3 months ago
>> nothing good
+-----+
https://www.cyberciti.biz/faq/lftp-command-mirror-x-exclude-files-sub-directory-syntax/
lftp Mirror Command Exclude Matching Files [ Regex ]
Author: Vivek Gite  Last updated: September 13, 2012
You can use --exclude multiple times:
	mirror --exclude logs/ --exclude reports/ --exclude-glob *.bak --exclude-glob *~
$ man lftp
	--exclude-glob-from=FILE  load include/exclude patterns from the file, one per line
# enddoc