#] *********************
#] "$d_Qndfs""webSite/1_webSite QNial process instructions.txt" - fixes for webSite link problems
#] www.BillHowell.ca 01Jun2021 initial, gathered mostly from "$d_Qndfs"'0_website notes.txt'

bash-related!! :
    "$d_SysMaint"'webSite/0_website bash notes.txt' - bash-related webSite maintenance & tracking
    "$d_SysMaint"'webSite/1_website bash upload instructions.txt' - bash-related webSite instructions
    "$d_SysMaint"'internet & wifi/lftp notes.txt'
    "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" - main tool for webSite updates (not FileZilla)
QNial-related :
    "$d_Qndfs"'webSite/0_webSite QNial notes.txt' - QNial webSite notes for maintenance & tracking
    "$d_Qndfs""webSite/1_webSite QNial process instructions.txt" - fixes for webSite link problems

#****************************
# Table of Contents, generated with :
# $ grep "^#] " "$d_Qndfs/webSite/""1_webSite QNial process instructions.txt" | sed 's/^#\]/ /'
#
    *********************
    "$d_Qndfs""webSite/1_webSite QNial process instructions.txt" - fixes for webSite link problems

    +-----+
    Commandments to avoid time-costly problems!!!
    NEVER put [apo, quote, &??]s in [subDir, fname]s - instructions below for easy removal (impractical by hand)
    backup files!! - important safety measure during code development, make sure this works automatically
    restore files from backups - important safety measure during code development

    +-----+
    Overall process
    1. [add, change, delete, restructure] d_webRawe [page, file, subDir]
    2. Update menus and hand-check links in menus - these are NOT modified by my software!!
    3. 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
    4. individual clusterOps when things aren't working - select optr from webRaweSite_doAll below
    5. Overall doAll once everything is working well :
    6. before uploads online - make sure that permissions are public!!
    9. Upload to website - use "$d_PROJECTS"'Website secure/lftp update www-BillHowell-ca.sh'

    +-----+
    Replacements involving '!!linkError!!' that won't cause massive damage.

    +-----+
    Detailed instructions - [search, replace] '!!linkError!!s' due to an apostrophe in the [subDir, fname]
    0. Setup - define variables to make processing easier, reduce errors that can be catastrophic
    1. change the affected [subDir, fname]s in "$d_webRawe" (not "$d_webSite")
    2. find pList - files with fname
    3. prepare QNial "pList" variable assignment expression
       use htmlList of last step, regular expression search-replace with txt-editor geany
    4. replace erroneous [subDir, fname] in pList with QNial
       QNial is slow, but much safer with [apo, quote]s than [find, sed, grep], although the latter do work
    5. Re-list to temp2 file, but this time for d_webRawe where changes were made :

    +-----+
    Instructions for individual web-[page, site] updates (check at each step!)

    +-----+
    other instructions
    update a webPageRawe after editing - webPageRawe_update & webPageSite_update
    [find, grep] specific str
    search allFnamesSortedByFname for an fname
    search allFnamesSortedByFname for part of an fname
    [find, grep, sed] to replace a specific str, example : !!linkError!!
       be careful - can screw up many files if str is not unique!!!
    change strOld to strNew in pathList, for strPattern, automatic path backups to d_backup
    test for path existence, example failure

    +-----+
    for an example, see "$d_Qndfs"'0_website notes.txt' :

08********08
#] +-----+
#] Commandments to avoid time-costly problems!!!

#] NEVER put [apo, quote, &??]s in [subDir, fname]s - instructions below for easy removal (impractical by hand)
    >> usually happens absent-mindedly when creating [subDir, fname]s

#] backup files!!
    - important safety measure during code development; make sure this works automatically
#] especially for webPageRawe_update
# pathList_backupTo_dir htmlPathsSortedByPath (link d_webRawe 'z_Archive/')

#] restore files from backups - important safety measure during code development
# dirBackup_restoreTo_paths l (link d_webRawe 'z_Archive/201117 17h00m21s backups/')
#    htmlPathsSortedByPath
# flag_fnamesDated = l means that fnames are dated, for [one, several] files at a time
# flag_fnamesDated = o means that fnames are NOT dated, for [pathList, dir] processing, many paths

08********08
#] +-----+
#] Overall process
#] 1. [add, change, delete, restructure] d_webRawe [page, file, subDir]
#] 2. Update menus and hand-check links in menus - these are NOT modified by my software!!
#] 3. 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
    It's easiest to follow the code of webRaweSite_doAll, shown below with "^>> ".
    Descriptions of the webRaweSite_doAll steps [8, 9] are below :

+-----+
Re-try :
    qnial> bye
    $ qnial
    qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
    qnial> select one : clusterOps_Rsync clusterOps_WebRawe clusterOps_WebSite clusterOps_urlExternCheck
    qnial> webRaweSite_doAll

#] webSite_setChownPublic - before uploads online, make sure that permissions are public!!
    I ran from terminal - very fast!
    >> I added optr to web maintenance program

#] webSite_lftpUpload_online - upload to website, uses "$d_PROJECTS"'Website secure/lftp update www-BillHowell-ca.sh'
    [FileZilla, lftp, wget, curl] - which?
    FileZilla is [fastest, easiest] but may re-upload a huge pile! It is easy to make serious mistakes.
    My own previous notes in 'lftp update www-BillHowell-ca.sh' :
    # webSite_to_webOnln use "$d_SysMaint""Linux/curl - [exist, [up, down]load] [files, url] notes.txt"
    # 25May2021 need to modify code above!!!!
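A minimal sketch of the "call with arguments" idea for the upload script. ASSUMPTIONS : ftp.example.com is a placeholder host, and the 'mirror -R --only-newer' options (reverse mirror, skip files already up to date) are my suggestion - the real 'lftp update www-BillHowell-ca.sh' may differ. This only builds and displays the command (a dry run); nothing is uploaded :

```shell
# take the subDir to upload as an argument, rather than commenting code in & out
lftp_upload_cmd() {
    local subDir="$1"
    local d_webSite="/media/bill/WebSite/"
    # build (and just display) the lftp command - dry run only
    printf "lftp -e 'mirror -R --only-newer %s%s /%s; quit' ftp.example.com\n" \
        "$d_webSite" "$subDir" "$subDir"
}

lftp_upload_cmd "Neural nets/"
```

Uploading only the changed subDir avoids the multi-hour full-site transfers noted below.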
    https://www.baeldung.com/linux/curl-wget
09Jun2021 I went with lftp - VERY slow :
    $ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
        08/06/2021-22:43:21 Starting upload...
        09/06/2021-01:18:28 Finished upload...
        02:35:07 duration - not fast, but do-able
    It would take forever to do the full website.
09Jun2021 lftpUpload - rather than commenting out code, it would be far safer to call with arguments!

#] 8. individual clusterOps when things aren't working - select optr from webRaweSite_doAll below
#] 9. Overall doAll once everything is working well :

webRaweSite_doAll IS
    { NONLOCAL d_webRawe d_webSite ;
    % ;
    % 04Jun2021 ??circular updates [webRawe, webURLs_extract] ;
    write 'stepin : rsync website.sh' ;
        host 'bash "$d_bin""rsync website.sh"' ;
    clusterOps_WebRawe ;
    write 'stepin : webPageSite_update' ;
        webAllRawOrSite_update "webPageSite_update ;
    write 'stepin : webURLs_extract' ;
        webURLs_extract ;
    write 'stepin : webPageSite_update' ;
        webAllRawOrSite_update "webPageSite_update ;
    % write 'stepin : urls_check "extern"' ;
    %     urls_check 'extern' ;
    write 'stepin : webSite_link_counts' ;
        webSite_link_counts ;
    % write 'stepin : webSite_setChownPublic' ;
    %     webSite_setChownPublic ;
    % write 'stepin : webSite_lftpUpload_online' ;
    %     webSite_lftpUpload_online ;
    EACH write '' '' ;
    }

08********08
#] +-----+
#] Replacements involving '!!linkError!!' that won't cause massive damage.
    Just like what I have been doing - bypass the hand-fixing, as it causes brain damage.
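The subDir replacements of this section can also be sketched in bash. This is my own sed equivalent of the str_replaceIn_pathList mappings used below, acting on a list of target paths; the list file name and example paths are made up for the demo :

```shell
# write a demo target-path list (stands in for 'html target paths.txt')
f_targets="/tmp/html_target_paths.txt"
printf '%s\n' \
    '/media/bill/WebSite/!!linkError!!/national/nationalpost/page.html' \
    '/media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/notes.txt' \
    > "$f_targets"

# '|' as the sed delimiter, because the patterns are full of '/'
sed -i \
    -e 's|!!linkError!!/national/nationalpost/|Climate - Kyoto Premise fraud/|' \
    -e 's|!!linkError!!Software programming & code/bin/SSH/|bin/SSH/|' \
    "$f_targets"

cat "$f_targets"
```

As noted below, QNial is still safer when [apo, quote]s appear in the [subDir, fname]s; sed is fine for apostrophe-free patterns like these.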
0n2021 I need a lot more useful [detail, examples] here

Example - select paths from 'urls errors list.txt' :
    /media/bill/WebSite/!!linkError!!/national/nationalpost/
    /media/bill/WebSite/!!linkError!!/national/nationalpost/search/
    /media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/
    /media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/
    /media/bill/WebSite/!!linkError!!SP500 1928-2020 yahoo finance.dat
    p_fList_extractPathsWith_str.sh

linkErrorSubDir_targetPathsReplace IS OP linkErrorSubDir { LOCAL ; NONLOCAL d_webRawe ;

qnial> target_paths := strList_readFrom_path (link d_temp 'html target paths.txt')

use a text editor (geany) to convert to the form below, being careful to choose replacement subDirs :
qnial> EACH str_replaceIn_pathList
    (o d_webRawe '!!linkError!!/national/nationalpost/' 'Climate - Kyoto Premise fraud/' target_paths)
    (o d_webRawe '!!linkError!!/national/nationalpost/search/' 'Climate - Kyoto Premise fraud/' target_paths)
    (o d_webRawe '!!linkError!!Software programming & code/bin/SSH/' 'bin/SSH/' target_paths)
    (o d_webRawe '!!linkError!!Software programming & code/System_maintenance/' 'System_maintenance/' target_paths)

When working, convert to a generalized QNial optr?
    >> nyet - simple enough as is

08********08
#] +-----+
#] Detailed instructions - [search, replace] '!!linkError!!s' due to an apostrophe in the [subDir, fname]
    # requires some adaptation

#] 0. Setup - define variables to make processing easier, reduce errors that can be catastrophic
NOTICE : include ['!!linkError!!', subDir, fname] as needed to make specific to LINKS ONLY!!
    don't want regular text to be changed; don't bother with [#=; backtrack ;=#] - not necessary
    DON'T include "$d_webSite", as this is replaced with "$d_webRawe"!!
    DO include [start, trail]ing slashes for the subDir replacement term in the search term
    include trailing slash for subDir replacements ONLY

For [apo, quote, &]s - cannot use directly with unix :
    "!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF"
    >> yes, the apostrophe is a problem here
$ unix_noRootLinkErrorApoQuote='!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle'
$ unix_fixed='Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF'
$ echo "$unix_noRootLinkErrorApoQuote" ; echo "$unix_fixed"

Pay attention to [apo, quote, &]s in [subDir, fname]s :
qnial> qnial_error := (link '!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle' chr_apo 's Footprint on Terrestrial Climate.PDF')
qnial> qnial_replace := 'Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF'
qnial> EACH write qnial_error qnial_replace ;

#] 1. change the affected [subDir, fname]s in "$d_webRawe" (not "$d_webSite")
    This is done manually via the fileManager, before creating pList

#] 2. find pList - files with fname
$ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "$unix_noRootLinkErrorApoQuote" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""linkError fix temp1.txt"

#] 3. prepare QNial "pList" variable assignment expression
#]    use htmlList of last step, regular expression search-replace with txt-editor geany
geany regexpr multiline search-replace :
    search  : (/media/bill/WebSite/)(.*):(.*):(.*)
    replace : /media/bill/Dell2/Website - raw/\2
geany - add apos to the above list, pre-pended by 'qnial> pList := ' :
geany escSeq search-replace :
    search  : \n
    replace : ' '
qnial> pList := '

#] 4. replace erroneous [subDir, fname] in pList with QNial
#]    QNial is slow, but much safer with [apo, quote]s than [find, sed, grep], although the latter do work
qnial> str_replaceIn_pathList o d_webRawe qnial_error qnial_replace pList

#] 5. Re-list to temp2 file, but this time for d_webRawe where changes were made :
$ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "$unix_fixed" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""linkError fix temp2.txt"
    (NOTE : "$unix_fixed" must be in double quotes, or the variable won't expand)

08********08
#] +-----+
#] Instructions for individual web-[page, site] updates (check at each step!)
# for Quick [searches, fixes] for bad webSite links - example [QNial-bash] fixes
#    see "$d_Qndfs"'1_webSite QNial process instructions.txt'
# 24Oct2020 This is the THIRD stage, after [creating, editing] cart [webPages, Menus, headers, footers]

# 1. make structural changes to the website - adapt [directories, files, menus]
    a) menus! these are key changes affecting many webPages
        25May2021 this is not working properly; probably a "hidden step" is required? maybe manual?
    b) do NOT include d_webRawe = "/media/bill/Dell2/Website - raw/" in links (or directories will not be linked properly)
        qnial> pathList := host_result (link 'find "$d_webRawe" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" ')
        qnial> str_replaceIn_pathList l d_webRawe '/media/bill/Dell2/Website - raw/' '' pathList
    c) for analysis of problems, see section above :
        #] Quick [searches, fixes] for bad webSite links - example [QNial-bash] fixes

# 2. check for errors and required updates in test mode
    a) 'webSite header.ndf' - normally won't require alteration
    b) 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'

# 3. mount "WebSite" USB drive!!!

# 4. manually test "$d_bin""rsync website.sh" settings - see notes in "$d_SysMaint""Linux/rsync notes.txt"
    ONLY activate webRawe_to_webSite
    set options="test" and run manually, NOT from QNial!!
    set options="change" when ready to use this QNial program & make changes (eg run webRaweSite_doAll etc)
    true changes can take portions of an hour for a large-size transfer of many files
        25May2021 this took ~15 minutes (1 year since last rsync, many files)

# 5. manually test bash "$d_bin""webSite check for [z_Archive, z_Old].sh"
    if there are [z_Archive, z_Old], then this is a problem!

# Run a test for individual webPages :
# For manual running of webPageRawe_update :
#    FIRST - run the first lines of code below to make sure d_htmlBackup is present!!!
#    d_htmlBackup := link d_webRawe (link 'z_Archive/' timestamp_YYMMDD ' backups/') ;
#    IF (NOT path_exists "d d_htmlBackup) THEN host link 'mkdir "' d_htmlBackup '" ' ; ENDIF ;
# (replace with actual path) & run :
#    webPageRawe_update flag_backup p_webPage d_backup - for a single webPage. Example :
    p_webPage := link d_webRawe 'economics, markets/currency-crypto/Cryptos versus [currencies, 10 year [rates, bonds]].html'
    webPageRawe_update l p_webPage d_htmlBackup

# Update whole d_[webRawe, webSite] ONLY when everything looks good :
    25May2021 current problem - error messages for every z_[Archive, Old]
        perhaps caused by major directory revamps, 2 in the last year+
    webAllRawOrSite_update l "webPageRawe_update
    webAllRawOrSite_update l "webPageSite_update
    CHECK results

# Update whole d_webSite ONLY when everything is working OK :
#    webRaweSite_doAll
# test several webPages on d_webSite - links, images etc
# for uploads online - make sure that permissions are public!!
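The public-permissions check can be sketched in bash (an assumption on my part - the actual webSite_setChownPublic optr lives in the QNial web maintenance program, and a /tmp demo dir stands in for "$d_webSite") :

```shell
# build a demo webSite dir with one non-public file
d_demo="/tmp/webSite_perm_demo"
rm -rf "$d_demo" ; mkdir -p "$d_demo"
echo '<html></html>' > "$d_demo/index.html"
chmod 600 "$d_demo/index.html"    # simulate a private file

# a+rX : world-readable files, world-searchable dirs
# (capital X adds execute only to dirs & already-executable files)
chmod -R a+rX "$d_demo"
stat -c '%a' "$d_demo/index.html"
```

Run this style of sweep before every upload, so nothing arrives online as owner-only.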
# loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'

08********08
#] +-----+
#] Manually check links of [normal, conference guide] webPages of webSite

08Jun2021 Re-check links in d_webSite
    see "$d_Qndfs""webSite/1_webSite QNial process instructions.txt" - fixes for webSite link problems

08Jun2021 Go through webPages [menu, content, footer] :
NOTE : I only checked webPage menus at this time, not links in the [body, footer].
    My webSite link management software does a MUCH more thorough test, but I feel it's
    still good to go through the site manually as a spot check. Menus are a good focus,
    as problems here are the most serious.

+--+
Summary of Menu [error, ToDo]s :
    ** Randell Mills - hydrinos - wants to download ods document
    * Robert Prechter - Socionomics - bookmarked pandemic webPage, OK, OK online
    ** Stephen Puetz - Greatest of cycles - wants to download ods document, OK, OK online
    **!!! S&P500 P/E ratios vs Treasury rates - file not found (might be slash in fname?)
    ** QNial programming language - OK list of files, but doesn't goto webPage
    ** Solar modeling and forecasting - needs projects menu, OK, OK online
    ** Big Data, Deep Learning, Safety - goes directly to video, needs webPage, OK, OK online
    ** How Nazis saved (some) Norwegian lives - goes directly to video, needs webPage, OK, OK online
    ** Venus et Mars - Saint Valentin - goes directly to video, needs webPage, OK, OK online
    ** Google analytics - can't find, should be a header item! [normal, online]
Also - conference guides :
    header, footers are html, not execute embeds
    leave it...
    maybe the next round of webSite [software, site] upgrades in 6-12 months, maybe never
Header error :
    '09Jun2021 webSite status' doesn't work for the case that I saw
    Most menus don't have the link

+--+
main - OK, OK online
Neural nets
    root - OK, OK online
    MindCode - OK, OK online
    NN earlier work - OK, OK online
    Computational neuro-genetic models - OK, OK online
    Holidays : NNs & genomics - OK, OK online
    Paper reviews - OK, OK online
    Conference Guides - see special check below
Projects
    root seems fine
    PROJECTS major (1 active) - OK, OK online
    MindCode neural network - OK, OK online
    Bill Lucas - Universal Force - OK, OK online
    ** Randell Mills - hydrinos - wants to download ods document, OK, OK online
    IceBreaker unchained (WWII) - OK, OK online
    "Hope-to-do-soon" projects - OK, OK online
Failures of thinking :
    Lies, Damned Lies, and Scientists - OK, OK online
    Climate - Kyoto Premise fraud - OK, OK online
    Robert Prechter - Socionomics - OK, OK online
Economics & Markets :
    S&P500 1872-2020, 83y trend - OK, OK online
    ** Stephen Puetz - Greatest of cycles - wants to download ods document, OK, OK online
    * Robert Prechter - Socionomics - bookmarked pandemic webPage, OK, OK online
    **!!! S&P500 P/E ratios vs Treasury rates - file not found (might be slash in fname?)
Pandemics, health, Sun :
    Fun, crazy stuff - OK, OK online
    Influenza - OK, OK online
    Corona virus - OK, OK online
    Suicide - OK, OK online
Life & [Pre,]-history :
    Civilisations and sun - OK, OK online
    Stephen Puetz - Greatest of cycles - OK, OK online
    Steve Yaskell - sun & history - doesn't have projects menu, has hosted menu, OK, OK online
    Anthony Peratt - petroglyphs - OK, OK online
    Galactic rays and evolution - OK, OK online
Astronomy, Earth, Climate :
    Ivanka Charvatova - solar inertial motion - OK, OK online
    Climate and sun - OK, OK online
    Stephen Puetz - Greatest of cycles - OK, OK online
    ** Solar modeling and forecasting - needs projects menu, OK, OK online
    SAFIRE - electric sun experiment - OK, OK online
Software programming & code
    ** QNial programming language - OK list of files, but doesn't goto webPage
    Linux bash scripts - OK list of files, no webPage
    LibreOffice macros - OK list of files, no webPage
    [en, de]crypt instructions - OK
    System_maintenance - OK list of files, no webPage
    TradingView PinScripts - OK list of files, no webPage
Professional & Resume
    Resume - OK, OK online
    Education - OK, OK online
    Publications & reports - OK, OK online
Howell-produced videos - OK, OK online
    Birkeland rotation in galaxy - not dark matter? - OK, OK online (very slow)
    Past & future worlds (for schoolkids) - OK
    ** Big Data, Deep Learning, Safety - goes directly to video, needs webPage, OK, OK online
    Icebreaker Unchained (WWII) - download pdf, needs webPage, OK, OK online
    ** How Nazis saved (some) Norwegian lives - goes directly to video, needs webPage, OK, OK online
    ** Venus et Mars - Saint Valentin - goes directly to video, needs webPage, OK, OK online
* blogs - doesn't have theme link (not really needed)
    Howell's "Blog" - OK, OK online
    Howell's cool emails - OK, OK online
    Cool images (various sources) - OK, OK online
    Suspicious Observers comments - OK, OK online
Hosted sub-sites - OK, but some show projects menu!
    (confusing)
    Neil Howell's Art - OK, OK online
    Paul Vaughan - top Climate modeller - OK, OK online
    Steven Yaskell, sun & history - OK, OK online
    Steve Wickson - extinction events - OK, OK online

Go through Conference Guide webPages [menu, content, footer] :
$ ls -1 '/media/bill/Dell2/Website - raw/Neural nets/Conference guides'
Conference guides - main - OK, OK online
    Authors' guide - OK, OK online
    Publications Guide - OK, OK online
    Publicity Guide - OK, OK online
    Reviewers' Guide - OK, OK online
    Sponsors' Guide - OK, OK online
Authors' guide - OK
    Authors & Publish - chair page, blog all-OK, all-OK online
    Paper formatting - page, blog all-OK, all-OK online
    Initial paper submission - chair page, blog all-OK, all-OK online
    Final paper submission - chair page, blog all-OK, all-OK online
    Problematic papers - corrections - page, blog all-OK, all-OK online
    Author [PDF, CrossCheck] tests - page, blog all-OK, all-OK online
    IEEE PDF eXpress - paper format - chair page, blog all-OK, all-OK online
    IEEE electronic Copyright (eCf) - chair page, blog all-OK, all-OK online
    Attendee downloads of papers - page, blog all-OK, all-OK online
    Conference registration - page, blog all-OK, all-OK online
    Travel visas to Hungary - page, blog all-OK, all-OK online
    Conference presentations - page, blog all-OK, all-OK online
    HELP contacts - WCCI2020, system, all-OK, all-OK online
Non-Author actions
    Paper reviews - authors' perspective - page, blog all-OK, all-OK online
    IEEE CrossCheck text similarity - chair page, blog, all-OK online
    IEEE Xplore web-publish - chair page, blog all-OK, all-OK online
    IEEE Conference Application - chair OK, OK online
    IEEE Letter of Acquisition - chair OK, OK online
    IEEE Publication Form - chair OK, OK online
    Software systems - page OK, OK online
    ** Google analytics - can't find, should be a menu item!
    [normal, online]
Publications Guide - same menu as for Authors' Guide; I checked this too, OK, OK online
Publicity Guide - OK, OK online
    Responsibilities - OK, OK online
    Publicity channels - OK, OK online
    Planning - OK, OK online
    Website tie-ins & tracking - OK, OK online
    Mass emails : - OK, OK online
    IEEE-CIS ListServers - OK, OK online
    SENDER Instructions - OK, OK online
    EDITOR Instructions - OK, OK online
    OWNER Instructions - OK, OK online
Reviewers' Guide - just one page that redirects, OK, OK online
Sponsors' Guide - OK, OK online
    Call for Sponsors & Exhibitors - OK, OK online
    Why should you Sponsor IJCNN2019? - OK, OK online
    Sign up to be a Sponsor and/or Exhibitor - OK, OK online
    Instructions for confirmed Sponsors and Exhibitors - OK, OK online
    Venue & layout of conference - OK, OK online

Repeat NOTE : I only checked webPage menus at this time, not links in the [body, footer].
    My webSite link management software does a MUCH more thorough test, but I feel it's
    still good to go through the site manually as a spot check. Menus are a good focus,
    as problems here are the most serious.

08********08
#] +-----+
#] other instructions

08********08
#] update a webPageRawe after editing - webPageRawe_update & webPageSite_update
# webPageRawe_update l (link d_webRawe 'Neil Howell/_Neil Howell.html')
# webPageSite_update l (link d_webRawe 'Neil Howell/_Neil Howell.html')

08********08
#] [find, grep] specific str
# $ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive"

08********08
#] search allFnamesSortedByFname for an fname
# (= (link 'Howell 120729 Why was France' chr_apo 's King Louis XIV called the Sun King.pdf') allFnamesSortedByFname ) sublist allPathsSortedByFname
    Howell 120729 Why was France's King Louis XIV called the Sun King.pdf

08********08
#] search allFnamesSortedByFname for part of an fname
# ((link 'Howell 120729 Why was France' chr_apo 's King Louis XIV called the Sun King.pdf') subStr_in_str allFnamesSortedByFname ) sublist allPathsSortedByFname
    Howell 120729 Why was France's King Louis XIV called the Sun King.pdf

08********08
#] [find, grep, sed] to replace a specific str, example : !!linkError!!
#] be careful - can screw up many files if str is not unique!!!
# $ find "/media/bill/SWAPPER/Website - raw/" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | sed "s/!!linkError!!/[#=; backtrack ;=#]/g"

08********08
#] change strOld to strNew in pathList, for strPattern, automatic path backups to d_backup
# be careful - can screw up many files if str is not unique!!!
# str_replaceIn_pathList l d_webRawe '!!linkError!!' '' htmlPathsSortedByPath

# all webPages :
str_replaceIn_pathList l d_webRawe
    (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;')
    (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ')
    htmlPathsSortedByPath

# just test with one file :
str_replaceIn_pathList o d_webRawe
    (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;')
    (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ')
    (solitary (link d_webRawe 'Neural nets/Conference guides/Author guide website/Author guide.html'))

# confGuide webPages :
str_replaceIn_pathList l d_webRawe
    (link '[#!: path_executeEmbedsInsertIn_fHand (link d_webWork ' chr_apo 'confMenu_publications.html' chr_apo ') phraseValueList ;')
    (link '[#!: path_executeEmbedsInsertIn_fHand (link d_webWork ' chr_apo 'confMenu_authors.html' chr_apo ') phraseValueList ;')
    htmlConfGuidePages

qnial> str_replaceIn_pathList o d_webRawe (link '/120214 Venus et Mars, au dela d' chr_apo 'une histoire d amour/') '/120214 Venus et Mars, au dela d une histoire d amour/' pList

Re-check to temp2 file, but this time for d_webRawe where changes were made :
$ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '120214 Venus et Mars, au dela d' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""Venus et Mars line-links2.txt"
    >> still have 16 instead of the 19 before!!???
    >> but it's OK! the subDir has been corrected

08********08
#] test for path existence, example failure
# qnial> (find_Howell 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allPathsSortedByFname
?address

#] +-----+
#] for an example, see "$d_Qndfs"'0_website notes.txt' :
    0Jun2021 70 failed links I can handle manually
    >> quite possibly menu links?
    Check one-at-a-time and seek ways of fixing the computer coding

# enddoc