#]
#] *********************
#] "$d_Qndfs"'webSite/0_webSite QNial notes.txt' - QNial webSite notes for maintenance & tracking
# www.BillHowell.ca 06Oct2020 initial, based on earlier testing etc

bash-related!! :
"$d_SysMaint"'webSite/0_website bash notes.txt' - bash-related webSite maintenance & tracking
"$d_SysMaint"'webSite/1_website bash upload instructions.txt' - bash-related webSite instructions
"$d_SysMaint"'internet & wifi/lftp notes.txt'
"$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" - main tool for webSite updates (not FileZilla)

QNial-related :
"$d_Qndfs"'webSite/0_webSite QNial notes.txt' - QNial webSite notes for maintenance & tracking
"$d_Qndfs""webSite/1_webSite QNial process instructions.txt" - fixes for webSite link problems

+-----+
webOnline updates via FileZilla :
2May2021 - use [curl, wget] instead? for now still FileZilla
make SURE that FileZilla -> Menu -> View -> Directory listing filters :
set 'html_files' filter or not, depending on whether [html, all] files are to be uploaded
Problem with online html file permissions - must set execute for [owner, group, public]

+-----+
ToDos in the future, plus status of menu links :
see "$d_Qndfs""website updates [summary, ToDos, status].txt"

+-----+
qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webRaweSite_doAll o

+-----+
Big link problems
1. [apo, quote] in [subDir, fname]
where do these links appear?

#] find specific '!!linkError!!'
$ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Allegre' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive"

+-----+
for ToDos, see "$d_Qndfs""webSite/2_website updates [summary, ToDos, status].txt"

48************************************************48
#24************************24
# Table of Contents, generate with :
# $ grep "^#]" "$d_Qndfs"'webSite/0_webSite QNial notes.txt' | sed "s/^#\]/ /"
# ********************* "$d_Qndfs"'webSite/0_webSite QNial notes.txt' - QNial webSite notes for maintenance & tracking
find specific '!!linkError!!'
01Sep2021 Andy Hall's Electric Universe geology webPage - post to internet
08Jun2021 create home page "eye candy" for Icebreaker with Dad's painting
webRawePageL_to_online IS OP webRawePageL - process webPageL through to online upload
08Jun2021 check online links - add to d_webRawe checks
08Jun2021 Go through webPages [menu, content, footer] :
08Jun2021 upload d_webSite to one website
+-----+
Run the whole shebang except manual [edits, fixes, etc]
writeDoStep IS OP str - easy tracking of steps in webRaweSite_doAll
clusterOps_Rsync IS - rsync d_webRawe -> d_webSite
clusterOps_WebRawe IS - update optrs for d_webRawe
clusterOps_WebSite IS - update optrs for d_webSite
clusterOps_urlExternCheck IS - update optrs for d_webSite
08Jun2021 Re-check links in d_webSite
08Jun2021 check online links - add to d_webRawe checks
08Jun2021 webRaweSite_doAll
08Jun2021 webSite_link_counts -> writefile of 'Count of files on this webSite' not working
07Jun2021 back to upload website problems
07Jun2021 Failures from successive loaddefs - [boolean, path_exists] tests example
06Jun2021 backup problems
04Jun2021 path_exists, boolean [optr, test]
06Jun2021 continue path_exists fixes
04Jun2021 upload website
04Jun2021 webSite_link_counts - loaddef problem
03Jun2021 continued fixes of LinkErrors
02Jun2021 webSite_doAll - how much is left to fix?
01Jun2021 more fixes
pathLinn_str_extractPathsWithStrTo_pathLout() - given pathLinn with a list of paths, write those with str to pathLout
webSite_fixes_noChrBads() - website fixes of [changed, moved] subDirs with noChrBads like [apo, quote, &]
must change the inputs as required. QNial programs could do this instead
pathLinn_strInn_changeStr_strOut_pathLout() - given pathLinn with a list of paths, each containing strInn, replace strInn with strOut and write to pathLout
01Jun2021 70 failed links I can handle manually
find specific '!!linkError!!'
0. Setup - define variables to make processing easier, reduce errors that can be catastrophic
1. change the affected [subDir, fname]s in "$d_webRawe" (not "$d_webSite")
2. find pList - files with fname
3. prepare QNial "pList" variable assignment expression
use htmlList of last step, regular expression search-replace with txt-editor geany
4. replace erroneous [subDir, fname] in pList with QNial
QNial is slow, but much safer with [apo, quote]s than [find, sed, grep], although the latter do work
5. Re-list to temp2 file, but this time for d_webRawe where changes were made :
0. Setup - define variables to make processing easier, reduce errors that can be catastrophic
1. change the affected [subDir, fname]s in "$d_webRawe" (not "$d_webSite")
2. find pList - files with fname
3. prepare QNial "pList" variable assignment expression
use htmlList of last step, regular expression search-replace with txt-editor geany
4. replace erroneous [subDir, fname] in pList with QNial
QNial is slow, but much safer with [apo, quote]s than [find, sed, grep], although the latter do work
5. Re-list to temp2 file, but this time for d_webRawe where changes were made :
31May2021 same problem with 'urls errors list.txt' - !!linkError!!
31May2021 Major remaining issue is 'urls errors list.txt'
31May2021 continue fixes
30May2021 webURLs_extract
webRawLinks_remove_pct20 IS - generate htmlPathLists
29May2021 fname_get_sDirPath IS OP subDir fname
fname_get_subDirPath IS OP subDir fname - return path if fname in allFnamesSortedByFname, else error
webSite_readpathsSubDirsFnames IS - read stable [path, dir] lists
28May2021 !!linkError!! problem
search allFnamesSortedByFname for an fname
search allFnamesSortedByFname for part of an fname
27May2021 Again, back to the link problem with '/media/bill/Dell2/Website - raw/'
27May2021 Now back to the link problem with '/media/bill/Dell2/Website - raw/'
27May2021 problem in time-naming backups : webPageRawe_update versus webAllRawOrSite_update
26May2021 current priorities : loaddefs link d_Qndfs 'webSite/webSite header.ndf'
25May2021 yet another attempt to update website
25May2021 problems with menu changes with uploads
25May2021 fix Software Programming link, update
17Dec2020 problems with menu changes with uploads - added crypto page
14Dec2020 fileZilla update webPages & check previous problems
14Dec2020 rerun to check link status
14Dec2020 recurring problems that don't get fixed - why?
13Dec2020 fix menu errors of normalSite (not confGuideSite)
12Dec2020 check for failures
09Dec2020 STOP working on this! - simple patch, get onto MindCode etc!!!
07Dec2020 Now I need to : change strOld to strNew in pathList, for strPattern, automatic path backups to d_backup
08Dec2020 resume work
25Nov2020 'webSite [menuHeadFoot, link, TableOfContents, link] tools.html'
25Nov2020 index.html - put in a smaller sized image!
25Nov2020 lftp instead of fileZilla upload
25Nov2020 fixes :
25Nov2020 Problem is, my coding probably destroyed many images during initial code development.
24Nov2020 upload to webOnln via fileZilla
24Nov2020 webSite_doAll - add a check to see if [z_Archive, z_Old] dirs in d_webSite
24Nov2020 Add rsync to webSite_doAll
23Nov2020 Should webPageSite_update be doing internalLinks_return_relativePath?
23Nov2020 re-retry webSite_doAll - see if webPageRawes are updated
30Oct2020 Simple check of multiple " with actual path) & run :

# webPageRawe_update flag_backup p_webPage d_backup - for single webPage. Example :
p_webPage := link d_webRawe 'Electric Universe/'
webPageRawe_update l p_webPage d_htmlBackup

Adapt to current task :
qnial> d_htmlBackup := link d_webRawe (link 'z_Archive/' timestamp_YYMMDD ' backups/')
qnial> p_webPageRawe := link d_webRawe 'Electric Universe/Andrew Halls electric geology [thunderblog, video]s.html'
qnial> webPageRawe_update l p_webPage d_htmlBackup
>> OK - but none of my own website links were "relative" - leave it for now
qnial> webPageSite_update p_webPageRawe
>> OK - 'Andrew Halls electric geology [thunderblog, video]s.html' was created

check [menu, link]s :
unicode special characters
geany : ’ etc
firefox : Earth’s weather was like Jupiter’s
fix [apo, quote, hyphen –]s - left & right
[GNU Public, Creative Commons] Licences do not show images?
- creates a mess. Just get rid of it
For now, just upload with FileZilla and check
>> OOPS! I uploaded webPageRawe -> replaced with webPageSite
Menus don't work!!!

08********08
#] 08Jun2021 create home page "eye candy" for Icebreaker with Dad's painting
I created :
#] webRawePageL_to_online IS OP webRawePageL - process webPageL through to online upload
I commented out for now :
% write 'stepin : webPage_setChownPublic' ;
% webSite_setChownPublic ;
% write 'stepin : webPage_lftpUpload_online' ;
% webSite_lftpUpload_online ;

qnial> webRawePageL_to_online [link d_webRawe 'index.html']
>> nuts, must refresh fileLists with webRawe_extract_pathsSubDirsFnames
Added it to [webRawePageL_to_online, webRaweSite_doAll]
re-try
qnial> webRawePageL_to_online [link d_webRawe 'index.html']
d_webRawe 'index.html' - looks good
d_webSite 'index.html' - image [too large, doesn't show] in browser
Why doesn't the image display!!???
It does have the right link in the "updated" d_webRawe 'index.html'
Something is rotten in the State of Denmark
>> idiot! I didn't rsync it

+-----+
Maybe easier to upload manually with FileZilla
I'm afraid that [lftp, FileZilla] will be too slow, and will upload too much redundant stuff they shouldn't.
I need a much smaller image for 'index.html', click to see full image
/media/bill/Dell2/Website - raw/Icebreaker/images/Neil Howell - Hilter refuses to invade Great Britain, Operation Eagle denied half-sized.xcf
Seagate 1.8Tbyte drive 130909 to 200208 backups
/media/bill/Seagate Expansion Drive/160715 Toshiba_monthly_backup 07.9Gb/Projects/
141226 Hitler&Stalin Turning point Stalin supported Hitler
OK - so 'index.html' now includes what I have for Icebreaker
09Jun2021 Icebreaker needs a captured video of Part 1 of 6

08********08
#] 08Jun2021 check online links - add to d_webRawe checks
#] 08Jun2021 Go through webPages [menu, content, footer] :
NOTE : I only checked webPage menus at this time, not links in the [body, footer]
My webSite link management software programming does a MUCH more thorough test, but I feel it's still good to go through the site manually as a spot check.
Menus are a good focus, as problems here are most serious.
+--+
Summary of Menu [error, ToDo]s :
** Randell Mills- hydrinos - wants to download ods document
* Robert Prechter - Socionomics - bookmarked pandemic webPage, OK, OK online
** Stephen Puetz - Greatest of cycles - wants to download ods document, OK, OK online
**!!! S&P500 P/E ratios vs Treasury rates - file not found (might be slash in fname?)
** QNial programming language - OK list of files, but doesn't goto webPage
** Solar modeling and forecasting - needs projects menu, OK, OK online
** Big Data, Deep Learning, Safety - goes directly to video, needs webPage, OK, OK online
** How Nazis saved (some) Norwegian lives - goes directly to video, needs webPage, OK, OK online
** Venus et Mars - Saint Valentin - goes directly to video, needs webPage, OK, OK online
** Google analytics - can't find, should be a header item! [normal, online]
Also - conference guides : header, footers are html, not execute embeds
leave it...
maybe the next round of webSite [software, site] upgrades in 6-12 months
maybe never
Header error : '09Jun2021 webSite status' doesn't work for the case that I saw
Most menus don't have the link
+--+
main
OK, OK online
Neural nets
root OK, OK online
MindCode - OK, OK online
NN earlier work - OK, OK online
Computational neuro-genetic models - OK, OK online
Holidays : NNs & genomics - OK, OK online
Paper reviews - OK, OK online
Conference Guides - see special check below
Projects
root seems fine
PROJECTS major (1 active) - OK, OK online
MindCode neural network - OK, OK online
Bill Lucas - Universal Force - OK, OK online
** Randell Mills- hydrinos - wants to download ods document, OK, OK online
IceBreaker unchained (WWII) - OK, OK online
"Hope-to-do-soon" projects - OK, OK online
Failures of thinking :
Lies, Damned Lies, and Scientists - OK, OK online
Climate - Kyoto Premise fraud - OK, OK online
Robert Prechter - Socionomics - OK, OK online
Economics & Markets :
S&P500 1872-2020, 83y trend - OK, OK online
** Stephen Puetz - Greatest of cycles - wants to download ods document, OK, OK online
* Robert Prechter - Socionomics - bookmarked pandemic webPage, OK, OK online
**!!! S&P500 P/E ratios vs Treasury rates - file not found, (might be slash in fname?)
Pandemics, health, Sun :
Fun, crazy stuff - OK, OK online
Influenza - OK, OK online
Corona virus - OK, OK online
Suicide - OK, OK online
Life & [Pre,]-history :
Civilisations and sun - OK, OK online
Stephen Puetz - Greatest of cycles - OK, OK online
Steve Yaskell - sun & history - doesn't have projects menu, has hosted menu, OK, OK online
Anthony Peratt -petroglyphs - OK, OK online
Galactic rays and evolution - OK, OK online
Astronomy, Earth, Climate :
Ivanka Charvatova - solar inertial motion - OK, OK online
Climate and sun - OK, OK online
Stephen Puetz - Greatest of cycles - OK, OK online
** Solar modeling and forecasting - needs projects menu, OK, OK online
SAFIRE - electric sun experiment - OK, OK online
Software programming & code
** QNial programming language - OK list of files, but doesn't goto webPage
Linux bash scripts - OK list of files, no webPage
LibreOffice macros - OK list of files, no webPage
[en, de]crypt instructions - OK
System_maintenance - OK list of files, no webPage
TradingView PinScripts - OK list of files, no webPage
Professional & Resume
Resume - OK, OK online
Education - OK, OK online
Publications & reports - OK, OK online
Howell-produced videos - OK, OK online
Birkeland rotation in galaxy - not dark matter? - OK, OK online (very slow)
Past & future worlds (for schoolkids) - OK
** Big Data, Deep Learning, Safety - goes directly to video, needs webPage, OK, OK online
Icebreaker Unchained (WWII) - download pdf, needs webPage, OK, OK online
** How Nazis saved (some) Norwegian lives - goes directly to video, needs webPage, OK, OK online
** Venus et Mars - Saint Valentin - goes directly to video, needs webPage, OK, OK online
* blogs
doesn't have theme link (not really needed)
Howell's "Blog" - OK, OK online
Howell's cool emails - OK, OK online
Cool images (various sources) - OK, OK online
Suspicious Observers comments - OK, OK online
Hosted sub-sites
OK, but some show projects menu!
(confusing)
Neil Howell's Art - OK, OK online
Paul Vaughan - top Climate modeller - OK, OK online
Steven Yaskell, sun & history - OK, OK online
Steve Wickson - extinction events - OK, OK online
Neil Howell's Art - OK, OK online

Go through Conference Guide webPages [menu, content, footer] :
$ ls -1 '/media/bill/Dell2/Website - raw/Neural nets/Conference guides'
Conference guides - main - OK, OK online
Authors' guide - OK, OK online
Publications Guide - OK, OK online
Publicity Guide - OK, OK online
Reviewers' Guide - OK, OK online
Sponsors' Guide - OK, OK online
Authors' guide - OK
Authors & Publish - chair page, blog all-OK, all-OK online
Paper formatting - page, blog all-OK, all-OK online
Initial paper submission - chair page, blog all-OK, all-OK online
Final paper submission - chair page, blog all-OK, all-OK online
Problematic papers - corrections - page, blog all-OK, all-OK online
Author [PDF,CrossCheck] tests - page, blog all-OK, all-OK online
IEEE PDF eXpress - paper format - chair page, blog all-OK, all-OK online
IEEE electronic Copyright (eCf) - chair page, blog all-OK, all-OK online
Attendee downloads of papers - page, blog all-OK, all-OK online
Conference registration - page, blog all-OK, all-OK online
Travel visas to Hungary - page, blog all-OK, all-OK online
Conference presentations - page, blog all-OK, all-OK online
HELP contacts - WCCI2020, system, all-OK, all-OK online
Non-Author actions
Paper reviews - authors' perspective - page, blog all-OK, all-OK online
IEEE CrossCheck text similarity - chair page, blog, all-OK online
IEEE Xplore web-publish - chair page, blog all-OK, all-OK online
IEEE Conference Application - chair OK, OK online
IEEE Letter of Acquisition - chair OK, OK online
IEEE Publication Form - chair OK, OK online
Software systems - page OK, OK online
** Google analytics - can't find, should be a menu item! [normal, online]
Publications Guide - same menu as for Authors' Guide, I checked this too, OK, OK online
Publicity Guide - OK, OK online
Responsibilities - OK, OK online
Publicity channels - OK, OK online
Planning - OK, OK online
Website tie-ins & tracking - OK, OK online
Mass emails : - OK, OK online
IEEE-CIS ListServers - OK, OK online
SENDER Instructions - OK, OK online
EDITOR Instructions - OK, OK online
OWNER Instructions - OK, OK online
Reviewers' Guide - just one page that redirects, OK, OK online
Sponsors' Guide - OK, OK online
Call for Sponsors & Exhibitors - OK, OK online
Why should you Sponsor IJCNN2019? - OK, OK online
Sign up to be a Sponsor and/or Exhibitor - OK, OK online
Instructions for confirmed Sponsors and Exhibitors - OK, OK online
Venue & layout of conference - OK, OK online

Repeat NOTE : I only checked webPage menus at this time, not links in the [body, footer]
My webSite link management software programming does a MUCH more thorough test, but I feel it's still good to go through the site manually as a spot check.
Menus are a good focus, as problems here are most serious.

08********08
#] 08Jun2021 upload d_webSite to one website
2 items from : "$d_Qndfs""webSite/1_webSite instructions.txt" - fixes for webSite link problems

+-----+
6. before uploads online - make sure that permissions are public!!
Howell : from https://linuxize.com/post/chmod-recursive/ :
The most common scenario is to recursively change the website file’s permissions to 644 and directory’s permissions to 755.
I ran from terminal - very fast!
$ find "$d_webSite" -type d -print0 | xargs -0 chmod 755 $ find "$d_webSite" -type f -print0 | xargs -0 chmod 644 $ find "$d_webRawe" -type d -print0 | xargs -0 chmod 755 $ find "$d_webRawe" -type f -print0 | xargs -0 chmod 644 Check results : Yaskell directory now same as QNial with g:p access +-----+ 9. Upload to website - use "$d_PROJECTS"'Website secure/lftp update www-BillHowell-ca.sh' [FileZilla, lftp, wget, curl] - which? FileZilla is [fastest, easiest] but may re-upload a huge pile! It is easy to make serious mistakes My own preus notes in 'lftp update www-BillHowell-ca.sh' : # webSite_to_webOnln use "$d_SysMaint""Linux/curl - [exist, [up, down]load] [files, url] notes.txt" # 25May2021 need to modify code above!!!! https://www.baeldung.com/linux/curl-wget >> I went with lftp - VERY slow, $ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" 08/06/2021-22:43:21 Starting upload... 09/06/2021-01:18:28 Finished upload... 02:35:07 duration - not ffa, but do-able It would take forever to do full website, 09Jun2021 lftpUpload - rather than commenting out" code, it would be far safer to cl with arguments! +----+ BAD! see "$d_SysMaint""Linux/lftp [up,down]loads, mirror notes.txt" $ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" >> OOPS!! # webSite_to_webOnln use "$d_SysMaint""Linux/curl - [exist, [up, down]load] [files, url] notes.txt" # 25May2021 need to modify code above!!!! In sh script, I made sure that wwwBillHowell_update was selected : # Take your pick : # ls_remote # ls_remote_tests # delete_WebSite_diffs wwwBillHowell_update # wwwBillHowell_html_update +-----+ olde code - not used now with tod's changes #**************************** #] +-----+ #] Run the whole shebang except manual [edits, fixes, etc] IF flag_debug THEN write '+-----+' ; ENDIF ; IF flag_debug THEN write 'URLs - check and count, helps for debugging' ; ENDIF ; IF flag_debug THEN write 'loading writeDoStep' ; ENDIF ; #] writeDoStep IS OP str - easy tracking of steps in webRaweSite_doAll writeDoStep IS OP str { write (link timestamp_YYMMDD_HMS ' ' str) ; execute str ; } # loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' IF flag_debug THEN write 'loading clusterOps_Rsync' ; ENDIF ; #] clusterOps_Rsync IS - rsync d_webRawe -> d_webSite # 04Jun2021 initial from webRaweSite_doAll IF flag_break THEN BREAK ; ENDIF ; # comment-out steps to avoid clusterOps_Rsync IS { NONLOCAL d_webRawe d_webSite ; % ; EACH write '' 'clusterOps_Rsync : ' ; % rsync (ensure up-to-date-copies) from d_webRawe -> d_webSite ; % only non-html files are transferred with this command ; % html files are handled by 'webAllRawOrSite_update "webPageSite_update' below ; write 'stepin : rsync website.sh' ; host 'bash "$d_bin""rsync website.sh"' ; } # loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' IF flag_debug THEN write 'loading clusterOps_WebRawe' ; ENDIF ; #] clusterOps_WebRawe IS - update optrs for d_webRawe # Obviously, all [subDir, fname, menu] changes must be done BEFORE clusterOps_WebRawe # 04Jun2021 initial from webRaweSite_doAll IF flag_break THEN BREAK ; ENDIF ; # comment-out steps to avoid clusterOps_WebRawe IS { LOCAL d_backup ; NONLOCAL d_webRawe d_webSite ; % ; EACH write '' 'clusterOps_WebRawe : ' ; % ; % force 1st update of [subDir, fname] lists. 
; write 'stepin : webRawe_extract_pathsSubDirsFnames' ; webRawe_extract_pathsSubDirsFnames ; % ; % d_webRawe - update lists of [path, link, url] links as [OK, bad, unknown] ; write 'stepin : webPageRawe_update' ; d_backup := webAllRawOrSite_update "webPageRawe_update ; EACH write 'd_backup from webAllRawOrSite_update : ' d_backup ; write 'stepin : urls_check "intern"' ; urls_check 'intern' d_backup ; write 'stepin : urls_check "bkmkEx"' ; urls_check 'bkmkEx' d_backup ; write '' ; d_backup } # olde code % d_webRawe - initial cleanup of html files ; % reset [linkError, dotSlash] links to be ready for fixes ; % 04Jun2021 is this step even useful? dotSlash - not used any more? ; % write 'stepin : str_replaceIn_pathList' ; % str_replaceIn_pathList l d_webRawe '!!linkError!!' '' htmlPathsSortedByPath ; # loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' IF flag_debug THEN write 'loading clusterOps_WebSite' ; ENDIF ; #] clusterOps_WebSite IS - update optrs for d_webSite # 04Jun2021 initial from webRaweSite_doAll IF flag_break THEN BREAK ; ENDIF ; # comment-out steps to avoid clusterOps_WebSite IS { NONLOCAL d_webRawe d_webSite ; % ; EACH write '' 'clusterOps_WebSite : ' ; % work on d_webSite ; write 'Any "z_Archive|z_Old" subDirs in d_webSite will be listed below their respective commands below' ; write 'stepin : find z_Archive' ; host 'find "$d_webSite" -type d -name "z_Archive"' ; write 'stepin : find z_Old' ; host 'find "$d_webSite" -type d -name "z_Old"' ; % execute_embeds to fill out each webPage with calculated relative addressing ; write 'stepin : webPageSite_update' ; webAllRawOrSite_update "webPageSite_update ; % ??circular updates [webRawe, webURLs_extract] ; write '' ; } # loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' IF flag_debug THEN write 'loading clusterOps_urlExternCheck' ; ENDIF ; #] clusterOps_urlExternCheck IS - update optrs for d_webSite # 04Jun2021 initial from webRaweSite_doAll IF flag_break THEN BREAK ; ENDIF ; # comment-out steps to avoid clusterOps_urlExternCheck IS { NONLOCAL d_webRawe d_webSite ; % ; EACH write '' 'clusterOps_urlExternCheck : ' ; % write 'stepin : urls_check "extern"' ; urls_check 'extern' ; write '' ; } 08********08 #] 08Jun2021 Re-check links in d_webSite see "$d_Qndfs""webSite/1_webSite instructions.txt" - fixes for webSite link problems >> OK - as per menu checks below, it all looks good. I d't check links in webPages Added online checks and moved to : #] 08Jun2021 check online links - add to d_webRawe checks 08********08 #] 08Jun2021 webRaweSite_doAll There may be problems with clusterOps, but run the whole thing and see! +-----+ Try : qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webRaweSite_doAll >> OK, now it's working, after small fixes webSite_link_counts -> formatting of tables OK for terminal, but not for writefile +-----+ Re-try : qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webSite_link_counts #] 08Jun2021 webSite_link_counts -> writefile of 'Count of files on this webSite' not working see strings.ndf 08********08 #] 07Jun2021 back to upload website problems backup problems - does revamped coding work now? (properly?) 
webPageRawe_update doesn't work
Best to 1st try new optr :
webPageRawe_update_tests IS
{ LOCAL backtrack depther depther_global finn fout line path pathList paths pathsIndxs p_temp strList subDir ;
NONLOCAL d_temp d_webRawe d_webSite ;
% ;
webPageRawe_update l (link d_webRawe 'index.html') ;
}

+-----+
Try from cold-load of QNial
qnial> bye
$ qnial
qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webPageRawe_update_tests
>> It didn't work - was missing backup argument (put in & re-loaddefed)
webPageRawe_update_tests IS
{ LOCAL backtrack depther depther_global finn fout line path pathList paths pathsIndxs p_temp strList subDir ;
NONLOCAL d_temp d_webRawe d_webSite ;
% ;
d_backup := link d_webRawe (link 'z_Archive/' timestamp_YYMMDD ' backups/') ;
webPageRawe_update l (link d_webRawe 'index.html') d_backup ;
}
qnial> webPageRawe_update_tests
inside : webPageRawe_update
>> OK - nice to see "inside" message
>> index.html was updated, except for recurring problems with subDir not being "filled out" :
  • Cool emails
  • System_maintenance +-----+ Now do qnial> clusterOps_WebRawe +--+ clusterOps_WebRawe : stepin : webRawe_extract_pathsSubDirsFnames inside : webRawe_extract_pathsSubDirsFnames executing webRawe_extract_pathsSubDirsFnames This is done only by [initial loaddefs, webURLs_extract, webRawLinks_remove_pct20 webRaweSite_doAll] or when manually invoked after changes stepin : webPageRawe_update webAllRawOrSite_update error, unknown d_backup : /media/bill/Dell2/Website - raw/z_Archive/210607 11h25m08s webURLs_extract backups/ stepin : urls_check "intern" inside : urls_check "intern" ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/webWork files/urls intern fails.txt /media/bill/Dell2/Website - raw/z_Archive/210607 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/webWork files/urls intern OK.txt /media/bill/Dell2/Website - raw/z_Archive/210607 backups/ rm: cannot remove '/media/bill/Dell2/Website - raw/webWork files/urls intern fails.txt': No such file or directory rm: cannot remove '/media/bill/Dell2/Website - raw/webWork files/urls intern OK.txt': No such file or directory stepin : urls_check "bkmkEx" inside : urls_check "bkmkEx" ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/webWork files/urls bkmkEx fails.txt /media/bill/Dell2/Website - raw/z_Archive/210607 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/webWork files/urls bkmkEx OK.txt /media/bill/Dell2/Website - raw/z_Archive/210607 backups/ rm: cannot remove '/media/bill/Dell2/Website - raw/webWork files/urls bkmkEx fails.txt': No such file or directory rm: cannot remove '/media/bill/Dell2/Website - raw/webWork files/urls bkmkEx OK.txt': No such file or directory +--+ >> at least it ran! again "stepin" appears which is good >> big problem didn't create : 210607 11h25m08s webURLs_extract backups/ /media/bill/Dell2/Website - raw/z_Archive/210607 backups/ there is a [conflict, inconsistency] here in d_backup for 'urls [bkmkEx, intern] [fails, OK].txt' Change : +.....+ urls_check IS OP linkType { LOCAL p_link line flst fbad fOK fbok p_list p_bad p_OKK ; NONLOCAL d_backupDay d_temp d_webRawe d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist ; +.....+ To : +.....+ urls_check IS OP linkType d_backup { LOCAL p_link line flst fbad fOK fbok p_list p_bad p_OKK ; NONLOCAL d_temp d_webRawe d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist ; also search-replace d_backupDay -> d_backup +.....+ Re-try qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> clusterOps_WebRawe +--+ clusterOps_WebRawe : stepin : webRawe_extract_pathsSubDirsFnames inside : webRawe_extract_pathsSubDirsFnames executing webRawe_extract_pathsSubDirsFnames This is done only by [initial loaddefs, webURLs_extract, webRawLinks_remove_pct20 webRaweSite_doAll] or when manually invoked after changes stepin : webPageRawe_update webAllRawOrSite_update error, unknown d_backup : /media/bill/Dell2/Website - raw/z_Archive/210607 11h43m19s webURLs_extract backups/ stepin : urls_check "intern" stepin : urls_check "bkmkEx" +--+ >> looks good overall, but : >> BAD!! no backups even if correct '' was created!??! Why aren't the backups happening? - I just spent 2-4 days trying to get them to work!! 
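Aside : the backup step itself is only a couple of shell commands. A minimal bash sketch of what path_backupTo_dir effectively shells out to (names below are illustrative only, not the actual optr) :
   # dated backup dir under z_Archive, same YYMMDD style as timestamp_YYMMDD
   $ d_backup="$d_webRawe"'z_Archive/'"$(date +%y%m%d)"' backups/'
   $ mkdir -p "$d_backup"
   # copy one webPage into it, preserving [permission, timestamp]s
   $ cp -p "$d_webRawe"'index.html' "$d_backup"
So whatever is failing is in the QNial plumbing around these commands, not in the commands themselves.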
webPageRawe_update IF flag_backup THEN p_webPage path_backupDatedTo_dir d_backup ; ENDIF ; >> Do I really want dated backups when d_backup is already dated? This just complicated re-instatement, but I will have to modify the "recovery" optr in fileops.ndf : dirBackup_restoreTo_paths IS OP flag_fnamesDated d_backup pathList - restore paths from a backup >> still handy for d_backupDated files >> rename it > dirBackup_restoreDatedTo_pathL >> 07Jun2021 and create a non-dated path version when I have to use it pathList_backupTo_dir IS OP pathList dirToCreateBackupDir dname >> what a mess - do when I need it!! webPageRawe_update Change : +.....+ IF flag_backup THEN p_webPage path_backupDatedTo_dir d_backup ; ENDIF ; +.....+ To : +.....+ IF flag_backup THEN p_webPage path_backupTo_dir d_backup ; ENDIF ; +.....+ So why didn't the backups happen? Recent changes to [boolean, path_exists] are likely the problem!? path_backupTo_dir IS OP path dirBackup { LOCAL fname p_out ; IF (NOT AND (EACH path_exists ("p_old path) ("d_old dirBackup))) THEN EACH write '?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : ' path dirBackup '' ; ELSE fname := path_extract_fname path ; p_out := link dirBackup fname ; host link 'cp -p "' path '" "' p_out '" ' ; ENDIF ; } Change : +.....+ p_out := link dir fname ; +.....+ To : +.....+ p_out := link d_backup fname ; +.....+ >> That would have been a problem! +-----+ Re-try from cold-load of QNial qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> clusterOps_WebRawe >> Still no backups?! This is ridiculous. neither [webPageRawe_update, urls_check] sucessfully backup >> [boolean_test, path_exists_tests] both still work fine fileops.ndf -> path_exists I added write write result ; boolean result } lq_fileops qnial> lq_fileops qnial> clusterOps_WebRawe What's wrong with me? I've ignored a consistent error message : webAllRawOrSite_update error, unknown d_backup : /media/bill/Dell2/Website - raw/z_Archive/210607 15h11m32s webURLs_extract backups/ webAllRawOrSite_update : d_backup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' webURLs_extract backups/' ; host link 'mkdir "' d_backup '" ' ; Maybe I have to wait a bit for the creation of the dir? host 'sleep 1s' ; created as can be seen in zArchive : /media/bill/Dell2/Website - raw/z_Archive/210607 15h20m14s webURLs_extract backups stepin : webPageRawe_update webAllRawOrSite_update error, unknown d_backup : /media/bill/Dell2/Website - raw/z_Archive/210607 15h20m14s webURLs_extract backups/ qnial> path_exists "d_old '/media/bill/Dell2/Website - raw/z_Archive/210607 15h20m14s webURLs_extract backups/' l >> so the dir has been created and path_exists sees it So why the error result? Do I need to "sleep" more? I jacked it to 3s qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> clusterOps_WebRawe >there was NO sleep of 3s!! webAllRawOrSite_update error, unknown d_backup : webAllRawOrSite_update error, unknown d_backup : /media/bill/Dell2/Website - raw/z_Archive/210607 16h21m23s webURLs_extract backups/ IF (NOT path_exists "d_old d_backup) qnial> path_exists_tests >> results are fine! What is wrong? qnial> path_exists "d_old (link d_webRawe 'z_Archive/210607 16h21m23s webURLs_extract backups/') l >> it's fine, why the error? This is VERY [frustrate, time-consume]ing Try IF (NOT (path_exists "p_old webPage)) qnial> 1 take htmlPathsSortedByPath ?fill NUTTS! 
back to that problem, I forgot! +-----+ webRawe_extract_pathsSubDirsFnames doesn't work p_allFileList := link d_webWork 'webSite allFileList.txt' ; +--+ OOPS - this was done from within path_exists : -->[nextv] p_allFileList ?no_value >> big oops. what is happening? -->[nextv] d_webWork /media/bill/Dell2/Website - raw/webWork files/ p_allFileList WAS updated the last webRawe_extract_pathsSubDirsFnames +--+ -->[nextv] 1 take allPathsSortedByPath +------------------------------------------------------------------------------------------------------------- |/media/bill/Dell2/Website - raw/20120 [before, after] running head-on into a semi-tractor trailor hauling pro +------------------------------------------------------------------------------------------------------------- --------+ pane.jpg| --------+ >> OK, that works qnial> webRawe_extract_pathsSubDirsFnames_show1st +-------------------------+----------------------------------------------------------------------------------- |allPathsSortedByFname |/media/bill/Dell2/Website - raw/Lucas/math Howell/∫dAθpc, cos×sin^z.txt +-------------------------+----------------------------------------------------------------------------------- |allSubDirsSortedByFname |Lucas/math Howell/ +-------------------------+----------------------------------------------------------------------------------- |allFnamesSortedByFname |∫dAθpc, cos×sin^z.txt +-------------------------+----------------------------------------------------------------------------------- |allPathsSortedByPath |/media/bill/Dell2/Website - raw/20120 [before, after] running head-on into a semi-t +-------------------------+----------------------------------------------------------------------------------- |allSubDirsSortedBySubdir | +-------------------------+----------------------------------------------------------------------------------- |allMulplicateIndxs |995 6500 +-------------------------+----------------------------------------------------------------------------------- |allMulplicateFnames |005-0-59 Basic evolve 3.45624e+21 removed one deglace.txt +-------------------------+----------------------------------------------------------------------------------- |allMulplicateSubDirs |+---------------------------------------------+------------------------------------ | ||Climate and sun/Glaciation model 005/Results/|Qnial/MY_NDFS/Climate and Sun/Glacia | |+---------------------------------------------+------------------------------------ +-------------------------+----------------------------------------------------------------------------------- |htmlPathsSortedByFname |?address +-------------------------+----------------------------------------------------------------------------------- |htmlSubDirsSortedByFname |Lucas/math How +-------------------------+----------------------------------------------------------------------------------- |htmlFnamesSortedByFname |∫dAθpc, cos×sin^z.txt +-------------------------+----------------------------------------------------------------------------------- |htmlPathsSortedByPath |?address +-------------------------+----------------------------------------------------------------------------------- |htmlSubDirsSortedBySubdir| +-------------------------+----------------------------------------------------------------------------------- >> OK, so what's wrong with [all, html]SubDirsSortedBySubdir, and htmlPathsSortedByFname? 
Look at code : allFnamesSortedByFname allSubDirsSortedByFname allPathsSortedByFname := lists_sortupCullOn1st ( fnames subDirs allPathsSortedByPath) ; htmlFnamesSortedByFname htmlSubDirsSortedByFname htmlPathsSortedByFname := lists_sortupCullOn1st ( fnames subDirs htmlPathsSortedByPath) ; -->[nextv] EACH (gage shape) allPathsSortedByPath subDirs fnames 7299 7299 7299 -->[nextv] EACH (gage shape) htmlFnamesSortedByFname htmlSubDirsSortedByFname htmlPathsSortedByFname 6237 6237 6237 -->[nextv] EACH (gage shape) htmlPathsSortedByPath -->[nextv] p_webPageList ?no_value I put back into 'webSite header.ndf' : p_webPageList := link d_webWork 'webSite webPageList.txt' ; Re-try : qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> clusterOps_WebRawe -->[nextv] EACH (gage shape) allPathsSortedByPath subDirs fnames 7299 7299 7299 -->[nextv] EACH (gage shape) htmlPathsSortedByPath subDirs fnames ++----+----+ ||7299|7299| ++----+----+ -->[nextv] resume -->[stepv] nextv -->[nextv] EACH (gage shape) htmlPathsSortedByPath subDirs fnames 222 222 222 qnial> clusterOps_WebRawe Same problem : '210607 18h32m02s webURLs_extract backups' was created, but NO backups! qnial> EACH (gage shape) allPathsSortedByFname allSubDirsSortedByFname allFnamesSortedByFname 6237 6237 6237 qnial> EACH (gage shape) allPathsSortedByPath allSubDirsSortedBySubdir 7299 383 qnial> EACH (gage shape) allMulplicateIndxs allMulplicateFnames allMulplicateSubDirs 678 678 678 qnial> EACH (gage shape) htmlPathsSortedByFname htmlSubDirsSortedByFname htmlFnamesSortedByFname 206 206 206 qnial> EACH (gage shape) htmlPathsSortedByPath htmlSubDirsSortedBySubdir 222 59 >> GOOD! so all of these look OK now >> BUT, same problem : webAllRawOrSite_update error, unknown d_backup : /media/bill/Dell2/Website - raw/z_Archive/210607 18h32m02s webURLs_extract backups/ >> and as in the past, dir WAS created! WAIT!! IF the dir is created, then the error message shouldn't appear!!??? IF (NOT path_exists "d_old d_backup) THEN host link 'mkdir "' d_backup '"' ; ELSE noError := o ; EACH write '?webAllRawOrSite_update error, unknown d_backup : ' (link ' ' d_backup) ; ENDIF ; None of the html files were updated ... IDIOT!!!! conditional is WRONG!! It was removed. +-----+ Re-try : qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> clusterOps_WebRawe +--+ clusterOps_WebRawe : stepin : webRawe_extract_pathsSubDirsFnames inside : webRawe_extract_pathsSubDirsFnames executing webRawe_extract_pathsSubDirsFnames This is done only by [initial loaddefs, webURLs_extract, webRawLinks_remove_pct20 webRaweSite_doAll] or when manually invoked after changes stepin : webPageRawe_update stepin : urls_check "intern" stepin : urls_check "bkmkEx" +--+ >> No error message : 'webAllRawOrSite_update error, unknown d_backup' >> No backups either >> htmls NOT updated check : webPageRawe_update IS OP flag_backup p_webPage d_backup % backup file - mistakes can be VERY time-costly!! ; IF flag_backup THEN p_webPage path_backupTo_dir d_backup ; ENDIF ; >> maybe it's the flag_backup? qnial> webPageRawe_update_tests inside : webPageRawe_update >> OK, index.html was updated So is there something wrong with webAllRawOrSite_update? I just fixed one error with this : IF ("webPageRawe_update = optr_rawOrSite) THEN % create a new backup directory for every use of "webPageRawe_update, as damage can be VERY time-costly!!! 
; d_backup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' webURLs_extract backups/' ; host link 'mkdir "' d_backup '" ' ; ENDIF ; >> the actual backup is done file-by-file by webPageRawe_update IF (= "webPageRawe_update optr_rawOrSite) THEN webPageRawe_update l webPage d_backup ; ELSE webPageSite_update webPage ; ENDIF ; >> where l = flag_backup qnial> webRawe_extract_pathsSubDirsFnames_gageShape +-------------------------+----+ |allPathsSortedByFname |6237| +-------------------------+----+ |allSubDirsSortedByFname |6237| +-------------------------+----+ |allFnamesSortedByFname |6237| +-------------------------+----+ |allPathsSortedByPath |7299| +-------------------------+----+ |allSubDirsSortedBySubdir | 383| +-------------------------+----+ |allMulplicateIndxs | 678| +-------------------------+----+ |allMulplicateFnames | 678| +-------------------------+----+ |allMulplicateSubDirs | 678| +-------------------------+----+ |htmlPathsSortedByFname | 206| +-------------------------+----+ |htmlSubDirsSortedByFname | 206| +-------------------------+----+ |htmlFnamesSortedByFname | 206| +-------------------------+----+ |htmlPathsSortedByPath | 222| +-------------------------+----+ |htmlSubDirsSortedBySubdir| 59| +-------------------------+----+ qnial> webRawe_extract_pathsSubDirsFnames_showNth 200 +-------------------------+----------------------------------------------------------------------------------- |allPathsSortedByFname |/media/bill/Dell2/Website - raw/Lucas/0_Lucas notes.txt +-------------------------+----------------------------------------------------------------------------------- |allSubDirsSortedByFname |Lucas/ +-------------------------+----------------------------------------------------------------------------------- |allFnamesSortedByFname |0_Lucas notes.txt +-------------------------+----------------------------------------------------------------------------------- |allPathsSortedByPath |/media/bill/Dell2/Website - raw/Bill Howells videos/160901 Big Data, Deep Learning, +-------------------------+----------------------------------------------------------------------------------- |allSubDirsSortedBySubdir |Neural nets/Conference guides/2020 WCCI Glasgow/ +-------------------------+----------------------------------------------------------------------------------- |allMulplicateIndxs |1125 2254 +-------------------------+----------------------------------------------------------------------------------- |allMulplicateFnames |fires, Hoyte & Schatten p161 - Canadian forest fires.JPG +-------------------------+----------------------------------------------------------------------------------- |allMulplicateSubDirs |+-----------+-----------------+ | ||Cool stuff/|Hussar/FireFight/| | |+-----------+-----------------+ +-------------------------+----------------------------------------------------------------------------------- |htmlPathsSortedByFname |/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page +-------------------------+----------------------------------------------------------------------------------- |htmlSubDirsSortedByFname |Qnial/code develop_test/webSite/ +-------------------------+----------------------------------------------------------------------------------- |htmlFnamesSortedByFname |webSite test- page Howell - blog.html update.html +-------------------------+----------------------------------------------------------------------------------- |htmlPathsSortedByPath |/media/bill/Dell2/Website - raw/Qnial/Manuals/05mmdd 
NialLecture.html +-------------------------+----------------------------------------------------------------------------------- |htmlSubDirsSortedBySubdir|?address +-------------------------+----------------------------------------------------------------------------------- webPageRawe_update : pt a flag_break in : +--+ IF flag_backup THEN p_webPage path_backupTo_dir d_backup ; IF flag_break THEN BREAK ; ENDIF ; ENDIF ; +--+ qnial> fonn l qnial> clusterOps_WebRawe clusterOps_WebRawe : stepin : webRawe_extract_pathsSubDirsFnames inside : webRawe_extract_pathsSubDirsFnames executing webRawe_extract_pathsSubDirsFnames This is done only by [initial loaddefs, webURLs_extract, webRawLinks_remove_pct20 webRaweSite_doAll] or when manually invoked after changes stepin : webPageRawe_update stepin : urls_check "intern" stepin : urls_check "bkmkEx" >> YIKES! break never occurred! Why? +--+ IF (= "webPageRawe_update optr_rawOrSite) THEN webPageRawe_update l webPage d_backup ; ELSE webPageSite_update webPage ; ENDIF ; ENDIF ; +--+ >> "stepin" indicates that webPageRawe_update is being run W webAllRawOrSite_update IS OP optr_rawOrSite +--+ IF (= "webPageRawe_update optr_rawOrSite) THEN webPageRawe_update l webPage d_backup ; ELSE webPageSite_update webPage ; ENDIF ; +--+ >> flag_break should be set!? I put in to see 200+ trues : write flag_backup ; >> "stepin" indicates that webPageRawe_update is being run WRONG! - "inside" shows that!!! [webPageRawe_update, urls_check "intern", urls_check "bkmkEx"] are NOT called! Why? webAllRawOrSite_update Change : +.....+ IF (NOR ("webPageRawe_update "webPageSite_update EACHLEFT = optr_rawOrSite)) THEN write link '?webAllRawOrSite_update - unrecognized optr_rawOrSite : ' (string optr_rawOrSite) ; ELSE noError := o ; ENDIF ; +.....+ To : +.....+ IF (NOR ("webPageRawe_update "webPageSite_update EACHLEFT = optr_rawOrSite)) THEN write link '?webAllRawOrSite_update - unrecognized optr_rawOrSite : ' (string optr_rawOrSite) ; noError := o ; ENDIF ; +.....+ What about urls check? IF (NOR ('intern' 'extern' 'bkmkEx' EACHLEFT = linkType)) THEN write link '?urls_check error unknown linkType : ' linkType ; noError := o ; ENDIF ; >> this was correct urls_check IS OP linkType d_backup write 'stepin : urls_check "intern"' ; urls_check 'intern' ; write 'stepin : urls_check "bkmkEx"' ; urls_check 'bkmkEx' ; >> missing backup!! Oops, I'm screwed, as this is defined in webAllRawOrSite_update it should return d_backup, so should clusterOps_WebRawe >> done +-----+ Re-try : qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> clusterOps_WebRawe >> FINALLY!!! backups were made!!! what a mess I made... 08********08 #] 07Jun2021 Failures from successive loaddefs - [boolean, path_exists] tests example +-----+ identify conditions leading to failure failure if 'fileops.ndf' is reloaded does 'QNial setup.ndf' re-load the same? >> NO! both [boolean, path_exists] tests still work fine if fails, will lq_setup fix it? > yes So what causes the problem in 'filops.ndf"?? +-----+ LOCAL issue?? wasn't included : noError +-----+ Pre-defined optr symbols for loaddefs? 
'QNial setup.ndf' - these come before loaddefs of [strings, fileops].ndf : is_variableDefined IS OP symbolPhr { null } writefile_debug IS OP AAA { null } file_exists IS OP p_type p_name { null } array_findAll_subArray IS OP subArray array_to_search { null } 'fileops.ndf' : # from 'QNial setup.ndf' : boolean IS OP chrIntStr { null } # from further down in 'fileops.ndf' : strList_readFrom_path IS OP path { null } strList_writeTo_path IS OP strList p_out { null } csvTable_readFrom_path IS OP path { null } >> Ah hah?!? >> I moved boolean to 'QNial setup.ndf' Note : as a general principle, pre-defined symbols should be put into : 'QNial setup.ndf' - if the symbol is used in >1 ndf file - if the symbol is used ONLY in the ndf file of the pre-define +-----+ Re-try from cold-load of QNial qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> boolean_test qnial> path_exists_tests qnial> boolean_test >> OK, it works!!! 08********08 #] 06Jun2021 backup problems from 04Jun2021 upload website NYET - backup problems - revamped coding (much work on legacy path_exists!!!) urls_check works very well again webPageRawe_update doesn't work Comment out : % d_webRawe - initial cleanup of html files ; % reset [linkError, dotSlash] links to be ready for fixes ; % 04Jun2021 is this step even useful? dotSlash - not used any more? ; % write 'stepin : str_replaceIn_pathList' ; % str_replaceIn_pathList l d_webRawe '!!linkError!!' '' htmlPathsSortedByPath ; qnial> bye qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' loading : QNial setup.ndf ?undefined identifier: LINK 'grep "^#] " "$d_SysMaint""Linux/find notes.txt" | sed ' ?undefined identifier: LINK 'grep "^#] " "$d_SysMaint""Linux/sed summary.txt" | sed ' >> oops, must move AFTER loaddefs [strings.ndf, fileops.ndf] qnial> clusterOps_WebRawe stepin : webPageRawe_update >> there is NO "inside" statement qnial> write post (3 take htmlPathsSortedByPath) ?fill ?fill ?fill >> NUTS! recurrence of an old problem Take a break and mow the lawn 08********08 #] 04Jun2021 path_exists, boolean [optr, test] path_exists bash if dir path dir path exits already "d_old "p_old '-d' '-f' can create "d_new "p_new '-d' '-f' I did fileops.ndf manually Find others within d_Qndfs : $ find "$d_Qndfs" -type f -name "*.ndf" | grep --invert-match "z_Old\|z_Archive\|webSite maintain \|file_ops.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "path_exists" "FILE" | sed 's|/media/bill/Dell2/Website - raw/Qnial/MY_NDFS/||' | sort >"$d_temp""path_exists in Qndfs.txt" >> OK, I can live through this : 29 instances in 8-10 paths +-----+ #] 06Jun2021 continue path_exists fixes diff_Howell.ndf:53 economics, markets/yahoo finance news [search,NLP].ndf:184 economics, markets/winURL yahoo finance news download.ndf:155 economics, markets/options data [download, process].ndf:291 webSite/webSite header.ndf:39 review move comments and strip illegal characters.ndf:169 fit_linearRegress.ndf:101 workFlow loop.ndf:52: finns_OK optimize - particle swarm.ndf:75 >> done Find others outside of d_Qndfs : $ find "$d_webRawe" -type f -name "*.ndf" | grep --invert-match "z_Old\|z_Archive\|file_ops.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "path_exists" "FILE" | sed 's|/media/bill/Dell2/Website - raw/Qnial/MY_NDFS/||;s/\(.*)\):[0-9].*:\(.*\)/\1/' >"$d_temp""path_exists in Qndfs.txt" >> initially was WRONG!! 
I got nothing >> oops, I only searched what is screened out >> it was the sort!? >> I needed to put in '| sort - ' for sort from stdin >> NYET - didn't work either, in any case the line number fos? $ find "$d_webRawe" -type f -name "*.ndf" | grep --invert-match "z_Old\|z_Archive\|Qnial/MY_NDFS/" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "path_exists" "FILE" >"$d_temp""path_exists outside of Qndfs.txt" # | sed 's|/media/bill/Dell2/Website - raw/||;s/\(.*\):[0-9]*:\(.*\)/\1/' | sort - -u -maxdepth 6 +--+ Final versions : $ find "$d_Qndfs" -type f -name "*.ndf" | grep --invert-match "z_Old\|z_Archive" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "path_exists" "FILE" | sed 's|/media/bill/Dell2/Website - raw/Qnial/MY_NDFS/||;s/\(.*\):[0-9]*:\(.*\)/\1/' | sort - -u >"$d_temp""path_exists in Qndfs.txt" $ find "$d_webRawe" -type f -name "*.ndf" | grep --invert-match "z_Old\|z_Archive\|Qnial/MY_NDFS/" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "path_exists" "FILE" | sed 's|/media/bill/Dell2/Website - raw/||;s/\(.*\):[0-9]*:\(.*\)/\1/' | sort - -u >"$d_temp""path_exists outside of Qndfs.txt" >> 06Jun2021 can't get this to work! "path_exists outside of Qndfs.txt" path_exists in 'fileops.ndf' Change : +.....+ IF noError THEN result := boolean first host_result link '[ ' typet ' "' paty '" ] && echo l >"' p_hostCmdRslt '" || echo o >"' p_hostCmdRslt '" ' ; +.....+ To : +.....+ IF noError THEN result := boolean first host_result (link '[ ' typet ' "' paty '" ] && echo l >"' p_hostCmdRslt '" || echo o >"' p_hostCmdRslt '" ') ; +.....+ WOW!! I wasn't expecting this double link!! : +--+ -->[nextv] link '[ ' typet ' "' paty '" ] && echo l >"' p_hostCmdRslt '" || echo o >"' p_hostCmdRslt '" ' +-----------------+-----------------------------------+--+--+--+---+------------------------------------------ |" || echo o >"|/media/bill/ramdisk/hostCmdRslt.txt|" |[ |-f| "|/media/bill/Dell2/Website - raw/Qnial/MY_N +-----------------+-----------------------------------+--+--+--+---+------------------------------------------ ---------------+------------------+-----------------------------------+-----------------+--------------------- DFS/strings.ndf|" ] && echo l >"|/media/bill/ramdisk/hostCmdRslt.txt|" || echo o >"|/media/bill/ramdisk/h ---------------+------------------+-----------------------------------+-----------------+--------------------- --------------+--+ ostCmdRslt.txt|" | --------------+--+ -->[nextv] link link '[ ' typet ' "' paty '" ] && echo l >"' p_hostCmdRslt '" || echo o >"' p_hostCmdRslt '" ' [ -f "/media/bill/Dell2/Website - raw/Qnial/MY_NDFS/strings.ndf" ] && echo l >"/media/bill/ramdisk/hostCmdRslt.txt" || echo o >"/media/bill/ramdisk/hostCmdRslt.txt" +--+ lq_fileops - Screws up both [boolean, path_exists]_test, when they both ran fine after initial loaddefs ??????????????? >> I don't see what a double loading of lq_fileops would corrupt BOTH [boolean, path_exists]_test path_exists does call boolean +-----+ olde code # for MERGE_2_1 operator : # loaddefs link d_Qndfs 'matrix operations - symbolic & real-valued.ndf' # 28Oct2020 lists_sortupCullOn1st is in setup.ndf # 28Oct2020 "Conference guides" are not included as I don't want to risk corrupting them & I won't change them anyways # 29Oct2020 - there are 2729 relevant files (except 15 ^[.]files, 1 ^[#] files I can't easily grep-out) excluding "Conference guides" # There must be something more [simple, efficient] than 'rows transpose mix'? 
; # anachronistic webSite_sortCullGradeupOn1st_allPathsAndFnames IS webRawe_extract_pathsSubDirsFnames webSite_extractAll_pathsSubDirsFnames IS webRawe_extract_pathsSubDirsFnames # 25May2021 no longer there? # (link d_Qtest 'webSite/Website updates- tests.ndf') 08********08 #] 04Jun2021 upload website NYET - backup problems - revamped coding (much work on legacy path_exists!!!) urls_check works very well again webPageRawe_update doesn't work 08********08 #] 04Jun2021 webSite_link_counts - loaddef problem +-----+ webSite_link_counts total := sum link n_fails n_unkns n_OKKs ; tbl_tots := 4 2 reshape (sum n_fails) 'failed targetURLs' (sum n_unkns) 'unknown targetURLs' (sum n_OKKs ) 'OK targetURLs' total 'total' ; close fout ; ENDIF ; >> weird : close fout shouldn't be there!? I deleted it It was a mixup between [p_allLinksCnt, p_allLinkLineCnt] go with p_allLinkLineCnt to be symbol-consistent (this is what cost me 1/2 a day of frustration!) write (3 2 reshape _allLinkLineCnt 'count of all linkLines in webPages of this webSite (1)' p_allLinkErrorCnt 'count of all linkErrors on the webSite (2)' p_allLinkErrorNotOnlyCnt 'count of all linkErrorNotOnlys - with other text in link (3)' p_allLinkErrorOnlyCnt 'count of all linkErrorsOnlys - no other text in link (4)' ) ; +-----+ olde code p_log := link d_htmlBackup '0_webPageSite_update log.txt' ; flog := open p_log "a ; flog EACHRIGHT writefile '........' (link '# webPageSite_update for : ' fname) '' ; writefile flog 'diff results : ' ; close flog ; writeDoStep (link 'str_replaceIn_pathList l ' chr_apo d_webRawe chr_apo ' ' chr_apo '!!linkError!!' chr_apo ' ' chr_apo '' chr_apo ' ' 'htmlPathsSortedByPath' ) ; 08********08 #] 03Jun2021 continued fixes of LinkErrors +-----+ linkError onlys Most "empty links" were put in my webPages as a reminder that [thought, list]s are incomplete. In this case, empty links are a good thing for me, perhaps annoying to the reader. fileops.sh : [#=; backtrack ;=#]!!linkError!!"> # To see files containing linkErrors without [subDir, fname, bkmk]: # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '..linkError..">' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" $ bash "$d_bin""fileops.sh" >>nothing >> WRONG! Fixed [FILE, FILE765] mixup in pathLinn_str_extractPathsWithStrTo_pathLout : grep --with-filename --line-number "$2" "$FILE765" | sed "s|\($FILE765\)\(.*\)|\1|g" | sort -u >>"$3" So yesterday's analysis was trash, like : # grep '!!linkError!!' "$d_webRawe""webWork files/webURLs_extract allLinks.txt" | sed 's|\(.*\)!!linkError!!\(.*\)">\(.*\)[A,a]>\(.*\)|!!linkError!!\2|g' >"$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" fileops.sh # To count all linkErrors : see notes # wc -l "$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" # To see files containing linkErrors without [subDir, fname, bkmk]: pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!">' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # output [count, path] of pathLinn files containing linkErrors ONLY : pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" '!!linkError!!">' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" : >> only about ~40 linkErrorOnly occurences in 11 webPages compared to ~86 total linkErrors on website so ~5%, which isn't huge. 
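Aside : a minimal stand-alone bash sketch of what a pathLinn_str_extract...-style helper boils down to (function and variable names here are illustrative only, not the actual fileops.sh code) :
   # read file paths (one per line) from $1, keep those whose contents match $2, write them to $3
   pathLinn_grepStr_toPathLout() {
      local pathLinn="$1" str="$2" pathLout="$3"
      >"$pathLout"
      while IFS= read -r p ; do
         grep -q -- "$str" "$p" 2>/dev/null && echo "$p" >>"$pathLout"
      done <"$pathLinn"
   }
   # example call, same argument pattern as the real helper above :
   # pathLinn_grepStr_toPathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!">' "$d_temp""temp.txt"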
+-----+ Return to where I started yesterday before linkErrorOnly fiasco work : /media/bill/WebSite/!!linkError!!Cool emails/ /media/bill/WebSite/!!linkError!!corona virus/#Corona virus models /media/bill/WebSite/!!linkError!!corona virus/#Cosmic/Galactic rays at historical high in summer 2019 /media/bill/WebSite/!!linkError!!corona virus/#COVID-19 data and models /media/bill/WebSite/!!linkError!!corona virus/#Daily cases charts for countries, by region /media/bill/WebSite/!!linkError!!corona virus/#Howells blog posts to MarketWatch etc /media/bill/WebSite/!!linkError!!corona virus/#Jumping off the cliff and into conclusions /media/bill/WebSite/!!linkError!!corona virus/#New corona virus cases/day/population for selected countries /media/bill/WebSite/!!linkError!!corona virus/#Questions, Successes, Failures /media/bill/WebSite/!!linkError!!corona virus/#Spreadsheet for generating the charts /media/bill/WebSite/!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt >> all affected by subDir? >> corona virus links are WRONG!!!! >> Cool emails - not scoured by webRawe_extract_pathsSubDirsFnames because thy have many links that I don't want to cleanup! - maybe in a decade or so.... +--+ +-----+ $ bash "$d_bin""fileops.sh" using : # To see files containing '!!linkError!!Cool emails' : pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Cool emails' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # output [count, path] of pathLinn files containing '!!linkError!!Cool emails' : pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" '!!linkError!!Cool emails' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" +--+ /media/bill/Dell2/Website - raw/index.html:1 /media/bill/Dell2/Website - raw/page blogs.html:1 /media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:1 +--+ >> not enough to worry about,again, mailtos normal webPages $ grep -rc --with-filename '!!linkError!!Cool emails' "$d_webRawe""Cool emails" >> None have '!!linkError!!Cool emails' $ grep -rc --with-filename '!!linkError!!' "$d_webRawe""Cool emails" >> None have '!!linkError!!' 
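For reference, a stripped-down sketch of what pathLinn_str_extractPathsWithStrTo_pathLout does (the fileops.sh version pipes grep through sed and 'sort -u' ; the body below is a simplified stand-in, not the actual code) :
+--+
pathLinn_str_extractPathsWithStrTo_pathLout() {
	# $1 = pathLinn (file listing paths, one per line), $2 = str to look for, $3 = pathLout (output path list)
	: >"$3"
	while read -u 9 FILE765 ; do
		if grep -q -F -- "$2" "$FILE765" ; then echo "$FILE765" >>"$3" ; fi
	done 9< "$1"
}
+--+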
+-----+
/media/bill/WebSite/!!linkError!!corona virus/#Corona virus models
/media/bill/WebSite/!!linkError!!corona virus/#Cosmic/Galactic rays at historical high in summer 2019
/media/bill/WebSite/!!linkError!!corona virus/#COVID-19 data and models
/media/bill/WebSite/!!linkError!!corona virus/#Daily cases charts for countries, by region
/media/bill/WebSite/!!linkError!!corona virus/#Howells blog posts to MarketWatch etc
/media/bill/WebSite/!!linkError!!corona virus/#Jumping off the cliff and into conclusions
/media/bill/WebSite/!!linkError!!corona virus/#New corona virus cases/day/population for selected countries
/media/bill/WebSite/!!linkError!!corona virus/#Questions, Successes, Failures
/media/bill/WebSite/!!linkError!!corona virus/#Spreadsheet for generating the charts

/media/bill/WebSite/!!linkError!!corona virus/#Corona virus models
>> this link doesn't make sense - it doesn't have an fname
Get [count/path, pathL] $ bash "$d_bin""fileops.sh" using :
pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!corona virus/#Corona virus models' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt"
/media/bill/Dell2/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html
>> I edited the webPage and fixed the links
+-----+
/media/bill/WebSite/!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt
Get [count/path, pathL] $ bash "$d_bin""fileops.sh" using :
pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt"
/media/bill/Dell2/Website - raw/Neural nets/Conference guides/Publications website/IEEE CrossCheck.html:1
>> just one file
Correction in webPage : Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.html
03Jun2021 Stop for now - leave NN corrections to the next round in 3-6 months
+-----+
qnial> webRaweSite_doAll o ;   % o = flag_online, so online links are not verified ;
+-----+
?no_value - count of all links in webPages of this webSite (1)
why doesn't this work now? it was just fine until recently?!?!
'wc -l "' p_allLinks '" | sed "s|^\([0-9]*\)\(.*\)|\1|"'
$ wc -l "$d_webRawe""webWork files/webURLs_extract allLinkLines.txt" | sed "s|^\([0-9]*\)\(.*\)|\1|"
8799
>> works fine, I can't see the error in webSite_link_counts
p_allLinkLineCnt := first host_result (link 'wc -l "' p_allLinks '" | sed "s|^\([0-9]*\)\(.*\)|\1|"') ;
write (2 2 reshape
	p_allLinksCnt 'count of all links in webPages of this webSite (1)'
	p_allLinkErrorCnt 'count of all linkErrors on the webSite (2)'
	) ;
>> OK, wrong symbol. I fixed it
qnial> webRaweSite_doAll o
>> still same problem
210603 12h34m07s webURLs_extract
210603 12h34m16s urls_check 'extern' 'o'
webRaweSite_doAll Change :
+.....+
writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo ' ' chr_apo flag_online chr_apo) ;
+.....+
To :
+.....+
writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo) ;
+.....+
Retry : qnial> webRaweSite_doAll o
+-----+
only internal failure left, "urls inter fails.txt" :
/media/bill/WebSite/MindCode/References/Howell 150225 - MindCode Manifesto.odt
Where did this come from?
"webURLs_extract allLinks.txt" search "Howell 150225 - MindCode Manifesto.odt"
>> not here ...
worry about this later as the main document comes up fine. +-----+ pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Software programming & code/System_maintenance/' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" >> appears only in : /media/bill/Dell2/Website - raw/page Software programming.html:1>> This subDir problem should have been corrected, but it wasn't!! pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Software programming & code/System_maintenance/' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" 'Climate and sun/Laskar etal model for solar insolation in QNial programming language' >> is a subDir!!! I should copy it to QNial/MY_NDFS/!! That way the optrs will be picked out >> I did it! Plus other 'Climate and Sun' subDirs pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" /media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html:1 /media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:1 /media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html:1 /media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:1 I generalized webSite_fixes_noChrBads # fix the str_linkError webSite_fixes_noChrBads "yesBackup" '!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language' 'Qnial/MY_NDFS/Laskar etal model for solar insolation in QNial programming language/' Check all 4 webPages for corrections >> I will probably have to fix all other links to subDirs in '_Climate and sun.html' But - hopefully webSite_doAll will do it!!?? +-----+ After a few more corrections, improvements of webSite_link_counts outputs #************ # Actual work on fixing links of [corrupt, change, move]ed [subDir, fname]s # OK, I don't need more examples : just change the linkError str below : # find files with llinkError # pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" # fix the str_linkError webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/System_maintenance/' 'System_maintenance/' +-----+ olde code # /media/bill/WebSite/!!linkError!!/national/nationalpost/ # contained in only : # /media/bill/Dell2/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html # webSite_fixes_noChrBads "yesBackup" '!!linkError!!/national/nationalpost/' 'economics, markets/Nuclear for tar sands 23Sep05.html' # /media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/ # contained in : # ??? 
# webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/bin/SSH/' 'bin/SSH/' # /media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/ # contained in : # /media/bill/Dell2/Website - raw/page Software programming.html # /media/bill/Dell2/Website - raw/security/encryption-decryption instructions.html # webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/System_maintenance/' 'System_maintenance/' # /media/bill/WebSite/!!linkError!!SP500 1928-2020 yahoo finance.dat # contained in : # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!SP500 1928-2020 yahoo finance.dat' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # /media/bill/WebSite/!!linkError!!Table of Contents # contained in : # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Table of Contents' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # To count all linkErrors : see notes # wc -l "$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" # To see files containing linkErrors without [subDir, fname, bkmk]: # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!">' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # output [count, path] of pathLinn files containing linkErrors ONLY : # pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" '!!linkError!!">' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" # To see files containing '!!linkError!!Cool emails' : # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Cool emails' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # output [count, path] of pathLinn files containing '!!linkError!!Cool emails' : # pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" '!!linkError!!Cool emails' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" # pathLinn_str_extractCountPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!corona virus/#Corona virus models' "$d_temp""pathLinn_str_extractCountPathsWithStrTo_pathLout temp.txt" 08********08 #] 02Jun2021 webSite_doAll - how much is lt to fix? 
After 01-02Jun2021 fixes of [corrupt, chang, move, restructur]ed [subDir, fname]s (see below) qnial> webRaweSite_doAll o ; % o = flag_online, so online links are not verified ; >> hardly a dent in errors - [frustrating, slow, brutal] work p_nonuniqueFileList := link d_temp 'webRawe_extract_pathsSubDirsFnames nonuniqueFileList.txt' ; host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort >"' p_nonuniqueFileList '" ' ; >> some of these should perhaps NOT be excluded, bmust look into side effects : code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References /media/bill/WebSite/!!linkError!!Cool emails/ /media/bill/WebSite/!!linkError!!corona virus/#Corona virus models /media/bill/WebSite/!!linkError!!corona virus/#Cosmic/Galactic rays at historical high in summer 2019 /media/bill/WebSite/!!linkError!!corona virus/#COVID-19 data and models /media/bill/WebSite/!!linkError!!corona virus/#Daily cases charts for countries, by region /media/bill/WebSite/!!linkError!!corona virus/#Howells blog posts to MarketWatch etc /media/bill/WebSite/!!linkError!!corona virus/#Jumping off the cliff and into conclusions /media/bill/WebSite/!!linkError!!corona virus/#New corona virus cases/day/population for selected countries /media/bill/WebSite/!!linkError!!corona virus/#Questions, Successes, Failures /media/bill/WebSite/!!linkError!!corona virus/#Spreadsheet for generating the charts /media/bill/WebSite/!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt >> all affected by subDir? >> corona virus links are WRONG!!!! >> Cool emails - not scoured by webRawe_extract_pathsSubDirsFnames because thy have many links that I don't want to cleanup! - maybe in a decade or so.... +-----+ Completely different subject : # fileops.sh - count all linkErrors : pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" > doesn't work?? pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" 'linkError' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # To count all linkErrors : see notes # grep '!!linkError!!' "$d_webRawe""webWork files/webURLs_extract allLinks.txt" | sed 's|\(.*\)!!linkError!!\(.*\)">\(.*\)<\[Aa]>\(.*\)|!!linkError!!\2|g' >"$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" # grep '!!linkError!!' "$d_webRawe""webWork files/webURLs_extract allLinks.txt" | sed 's|\(.*\)!!linkError!!\(.*\)">\(.*\)[A,a]>\(.*\)|!!linkError!!\2|g' >"$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" # big problem with multiple linkErrors!! - must affect all other results too? # this was already solved - check QNial program!! # cmd := link 'grep -E -i ">"' p_temp1 '"' ; # grep -E -i "\(.*\)[A,a]>\(.*\)|!!linkError!!\2|g' >"$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" # grep '!!linkError!!' 
"$d_webRawe""webWork files/webURLs_extract allLinks.txt" | sed 's|\(.*\)!!linkError!!\(.*\)">\(.*\)<\[Aa]>\(.*\)|!!linkError!!\2|g' >"$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" many of all errors are empty links!!!! >> I must have erased tem at some time!!?? shit - this is a HUGE challenge! $ grep -E -i "^\x26\x26linkError\x26\x26&" "$d_webRawe""webWork files/webURLs_extract allLinks.txt" >"$d_webRawe""webWork files/webURLs_extract errOnlyLines.txt" >> no result $ grep -E -i "^\x26\x26linkError\x26\x26&" "$d_webRawe""webWork files/webURLs_extract allLinks.txt" >"$d_webRawe""webWork files/webURLs_extract errOnlyLines.txt" $ grep -E -i "^\d102\d102linkError\d102\d102&" "$d_webRawe""webWork files/webURLs_extract allLinks.txt" >"$d_webRawe""webWork files/webURLs_extract errOnlyLines.txt" # big problem with multiple linkErrors!! - must affect all other results too? # this was already solved - check QNial program!! # cmd := link 'grep -E -i ">"' p_temp1 '"' ; # grep -E -i "\(.*\)[A,a]>\(.*\)|!!linkError!!\2|g' >"$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" $ grep -i "^..linkError..$" "$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" >"$d_webRawe""webWork files/webURLs_extract errOnlyLines.txt" >> Bingo! 228 cases out of 806, ~1/4 of total $ wc -l "$d_webRawe""webWork files/webURLs_extract allLinks.txt" | sed 's|^\([0-9]*\)\(.*\)|\1|' $ wc -l "$d_webRawe""webWork files/webURLs_extract errLinkLines.txt" | sed 's|^\([0-9]*\)\(.*\)|\1|' $ wc -l "$d_webRawe""webWork files/webURLs_extract errOnlyLines.txt" | sed 's|^\([0-9]*\)\(.*\)|\1|' The count of alllinks is cumulating with each run! Must be zeroed! 08********08 01-02Jun2021 create fileops.sh to make work easier +-----+ Several [basic, general] optrs, plus : webSite_fixes_noChrBads() { if [ "$1" == "yesBackup" ]; then d_backup="$d_webRawe""z_Archive/$date_ymd pathL_strInn_replaceStr_strOut/" if ! [ -d "$d_backup" ]; then mkdir "$d_backup" fi fi d_webWork="$d_webRawe""webWork files/" pathLinn="$d_webWork""webSite webPageList.txt" pathLout1="$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" pathLout2="$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout empty temp.txt" strInn="$2" strOut="$3" #pathLinn_str_extractPathsWithStrTo_pathLout "$pathLinn" "$strInn" "$pathLout1" while read -u 9 FILE; do pathL_strInn_replaceStr_strOut "$FILE" "$strInn" "$strOut" "$d_backup" done 9< "$pathLout1" # can redo extraction as a check. It should give empty "$pathLout2" pathLinn_str_extractPathsWithStrTo_pathLout "$pathLinn" "$strInn" "$pathLout2" } test with one file situation (i had already been corrected) # /media/bill/Dell2/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html webSite_fixes_noChrBads "yesBackup" '!!linkError!!/national/nationalpost/' 'economics, markets/Nuclear for tar sands 23Sep05.html' Check updated file : /media/bill/Dell2/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html >> YIKES! - it is empty!!! "$d_webRa""z_Archive/210531 18h59m44s backups webPageRawe_update/" backup file is OK!! no !!linkErrors So why was file cleared? Re-test to see if it was due to early fileops.sh changes, or if the problem persists. >> not erased this time. 
+-----+ Testing : # testing : # 02Jun2021 # /media/bill/WebSite/!!linkError!!/national/nationalpost/ # webSite_fixes_noChrBads "yesBackup" '!!linkError!!/national/nationalpost/' 'economics, markets/Nuclear for tar sands 23Sep05.html' # /media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/ # dangerous - maybe MANY links!! first do : # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!/national/nationalpost/' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # nothing in temp file - why? Maybe the last search did it? # same danger (even more!) with : # /media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/ # check : # pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Software programming & code/System_maintenance/' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # Only 2 so run : # webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/System_maintenance/' 'System_maintenance/' Next : # /media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/ # dangerous - maybe MANY links!! first do : pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!/national/nationalpost/' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" # nothing in temp file - why? Maybe the last search did it? # same danger (even more!) with : # /media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/ pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Software programming & code/System_maintenance/' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" >> Only 2 : /media/bill/Dell2/Website - raw/page Software programming.html /media/bill/Dell2/Website - raw/security/encryption-decryption instructions.html OK, so run # webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/System_maintenance/' 'System_maintenance/' Check both files : >> YIKES!!! again, like previous case, it is zeroed-out!?? z_Archive/210602 pathL_strInn_replaceStr_strOut : ALL backup files are zeroed!! +-----+ fix the erasing-file problem!!!! Only function that over-writes : pathInn_strInn_changeStr_strOut() { # backup if $4 = valid dir if [ -d "$4" ]; then cp -up "$1" "$4" fi p_temp="$d_temp""pathInn_strInn_changeStr_strOut temp.txt" grep "$2" "$1" | sed "s|$2|$3|" >>"$p_temp" mv "$p_temp" "$1" } >> oops, there are only 3 arguments!!! change "$4" to "$3" pathL_strInn_replaceStr_strOut Change : +.....+ while read -u 9 FILE; do pathInn_strInn_changeStr_strOut "$1" "$2" "$3" "$4" done 9< "$1" +.....+ To : +.....+ while read -u 9 FILE; do pathInn_strInn_changeStr_strOut "$FILE" "$2" "$3" "$4" done 9< "$1" +.....+ Rerun : # /media/bill/Dell2/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html webSite_fixes_noChrBads "yesBackup" '!!linkError!!/national/nationalpost/' 'economics, markets/Nuclear for tar sands 23Sep05.html' >> bad!!! real mess linkError fix temp1.txt : /media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:1090:
>> pathLinn_str_extractPathsWithStrTo_pathLout : sed isn't working now. Why?
I differentiated $FILE in fileops.sh by adding not-so-random number suffixes
In webSite_fixes_noChrBads I uncommented :
	pathLinn_str_extractPathsWithStrTo_pathLout "$pathLinn" "$strInn" "$pathLout1"
Retry
# /media/bill/WebSite/!!linkError!!/national/nationalpost/
# contained in only :
# /media/bill/Dell2/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html
webSite_fixes_noChrBads "yesBackup" '!!linkError!!/national/nationalpost/' 'economics, markets/Nuclear for tar sands 23Sep05.html'
>> runs smoothly now
>> OK, no linkErrors
Retry
# /media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/
# contained in :
# ???
# webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/bin/SSH/' 'bin/SSH/'
$ bash "$d_bin""fileops.sh"
/media/bill/Dell2/Website - raw/bin/fileops.sh: line 82: : No such file or directory
>> I assume that the linkError had been resolved and no longer exists...
Retry
# /media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/
# contained in :
# /media/bill/Dell2/Website - raw/page Software programming.html
# /media/bill/Dell2/Website - raw/security/encryption-decryption instructions.html
# webSite_fixes_noChrBads "yesBackup" '!!linkError!!Software programming & code/System_maintenance/' 'System_maintenance/'
$ bash "$d_bin""fileops.sh"
/media/bill/Dell2/Website - raw/bin/fileops.sh: line 82: : No such file or directory
>> Oops - I had not commented-out the Retry for # /media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/
Retry only # /media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/
>> I assume that the linkError had been resolved and no longer exists...
These all seem OK. Now for new ones.
+-----+
"$d_Qndfs""2_website updates [summary, ToDos, status].txt"
see 01Jun2021 Overview of other situations to cluster cases and tackle together
+--+
These should be OK - but nothing in the directories!
/media/bill/WebSite/!!linkError!!Solar modeling and forecasting/
/media/bill/WebSite/!!linkError!!Solar modeling and forecasting/_Solar modeling & forecasting.html
>> neither d_web[Rawe,Site] have anything. I must have screwed up big time
>> get from backup drive Seagate4Tb180804 - works now on USB2, didn't work on USB3 at least for a bit(?)
/media/bill/Seagate4Tb180804/200824 SWAPPER monthly_backup/Website - raw/Solar modeling and forecasting/
>> it has the [subDir, file]s!! >> I copied it over
Oops - Charvatova has its own subDir "$d_webRawe""Charvatova solar inertial motion & activity/"
I deleted that, but I did copy over (probably duplicates) :
	Charbonneau 2002 - The rise and fall of the first sunspot model.pdf
	Howell - Solar presentations 13Oct06.pdf
Maybe I should add the HUGE Climate [active, static] directories to "Solar modeling and forecasting/" ?
>> worry about this later...
+--+
Conference Guide - is it updated or excluded?
>> leave this for last...
>> these are included and updated, but they don't have embedded menu etc inserts /media/bill/WebSite/!!linkError!!WCCI2020 mass email [SS,Comp,Tut,Wrkshp] 191025 Howell.html /media/bill/Dell2/Website - raw/Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.html >> exists in PubGuide /media/bill/WebSite/!!linkError!!Neural nets/Conference guides/Author guide website/N-19557 wrong paper [size, margin]s.pdf /media/bill/WebSite/!!linkError!!webWork files/confMenu_authors.html >> not present - is it somewhere else? backup drive, but confMenus aren't being used yet files shouldn't be linked! /media/bill/WebSite/!!linkError!!IJCNN2019 mass email 180925 v180925 Howell final.html /media/bill/WebSite/!!linkError!!IJCNN2019 mass email 181102 v181102 Howell with Plenaries, MDPI Genisama logos.html /media/bill/WebSite/!!linkError!!IJCNN2019 mass email 190108 v190102 Howell paper deadline, Plenary Speakers.html Again, I'll look at ConfGuides in the next round in 6 months or so.... I left it in "$d_Qndfs""2_website updates [summary, ToDos, status].txt" +--+ Hopeless - track these down if I can, or change link to something else : /media/bill/WebSite/!!linkError!!Table of Contents /media/bill/WebSite/!!linkError!!SP500 1928-2020 yahoo finance.dat # /media/bill/WebSite/!!linkError!!SP500 1928-2020 yahoo finance.dat # contained in : pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!SP500 1928-2020 yahoo finance.dat' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" >> ??it's empty (4 bytes) - no instances found? I doubt that, but leave it for now # /media/bill/WebSite/!!linkError!!Table of Contents # contained in : pathLinn_str_extractPathsWithStrTo_pathLout "$d_webRawe""webWork files/webSite webPageList.txt" '!!linkError!!Table of Contents' "$d_temp""pathLinn_str_extractPathsWithStrTo_pathLout temp.txt" >> Again!!! ??it's empty (4 bytes) - no instances found? I doubt that, but leave it for now 08********08 #] 01Jun2021 more fixes +-----+ 1. /media/bill/WebSite/!!linkError!!/national/nationalpost/ I need a much simpler approach for subDirs that have been [move, restructur]ed, and that don't involve [apo, quote, &]s in [subDir, fname]s see $ bash "$d_bin""fileops.sh" - collecton of handy bash scripts for handling files +--+ #] pathLinn_str_extractPathsWithStrTo_pathLout() - given pathLinn with a list of paths, #] write those with str to pathLout #] webSite_fixes_noChrBads() - website fixes of [changed, moved] subDirs with noChrBads like [apo, quote, &] #] must change the inputs as required. QNial programs could do this instead #] pathLinn_strInn_changeStr_strOut_pathLout() - given pathLinn with a list of paths, each containing strInn, #] replace strInn with strOut and write to pathLout +--+ Ouch! non-existant code for webPages (very old) /media/bill/Dell2/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html >> only file with the "!!linkError!!/national/nationalpost/" 08********08 #] 01Jun2021 70 failed links I can handle manually >> quite possibly menu links? Check one-at-a-time and seek ways of fixing computer coding +-----+ 1. /media/bill/WebSite/!!linkError!!Allegre's second thoughts.pdf where does this links appear? #] find specific '!!linkError!!' 
$ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Allegre' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" +--+ $ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Allegre' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" /media/bill/WebSite/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html:199: +--+ >> the apostrophe is a killer!! remove from fname >> I already did that long ago! just didn't change the link >> I fixed the link, no coding changes required +-----+ same apostrophe killers!! : /media/bill/WebSite/!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv +--+ directory is problem! changed to : 120214 Venus et Mars, au dela d une histoire d amour >> ouch, may have many consequences? : $ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '120214 Venus et Mars, au dela d' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" +--+ $ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '120214 Venus et Mars, au dela d' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" /media/bill/WebSite/Personal/130726 Deer collison/Car collision with a deer.html:129:The video is raw, unprocessed and completely amateur, as I have only started taking videos. At least it's a start - as my previous video posting, "14Feb12 Venus et Mars - au dela d'une histoire d'amour (video 240 Mbytes)", only consisted of still images (pictures taken from books, the internet, PowerPoint-like software, and graphics programs) and audio (music CDs, voice recordings).

/media/bill/WebSite/Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html:74:
/media/bill/WebSite/Bill Howells videos/Howell - videos.html:74:
/media/bill/WebSite/Bill Howells videos/Howell - videos.html:148:
/media/bill/WebSite/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:74:
/media/bill/WebSite/webWork files/Menu Howell videos.html:20:
/media/bill/WebSite/page Howell - blog.html:973:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:384:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:903:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:910:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html:384:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html:903:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html:910:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html:384:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html:903:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html:910:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:450:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:969:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:976:
>> the matched lines are the old '120214 Venus et Mars, au dela d...' video links plus surrounding webPage text
+--+
>> ouch - just as I thought : 19 links. Not easy to sed changes with apos either.
redo to temp file :
$ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '120214 Venus et Mars, au dela d' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""Venus et Mars line-links.txt"
do I have a [bash script, QNial pgm] to change a line in a file?
Better to search-replace in the file. QNial is safer with apostrophes.
pList done by hand, geany (search "/media/bill/WebSite/" replace "/media/bill/Dell2/Website - raw/") :
qnial> pList := '/media/bill/Dell2/Website - raw/Personal/130726 Deer collison/Car collision with a deer.html' '/media/bill/Dell2/Website - raw/Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html' '/media/bill/Dell2/Website - raw/Bill Howells videos/Howell - videos.html' '/media/bill/Dell2/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html' '/media/bill/Dell2/Website - raw/webWork files/Menu Howell videos.html' '/media/bill/Dell2/Website - raw/page Howell - blog.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html'
qnial> str_replaceIn_pathList o d_webRawe (link '/120214 Venus et Mars, au dela d' chr_apo 'une histoire d amour/') '/120214 Venus et Mars, au dela d une histoire d amour/' pList
Re-check to temp2 file, but this time for d_webRawe where changes were made :
$ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '120214 Venus et Mars, au dela d' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""Venus et Mars line-links2.txt"
>> still have 16 instead of 19 before!!???
>> but it's OK! the subDir has been corrected
+-----+
2. /media/bill/WebSite/!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
>> just a file, easy, NOTE the fname apostrophe!
$ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number 'Vaughan 120324 The Solar Cycle' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""Vaughan 120324 The Solar Cycle temp.txt"
>> 7 cases :
+--+
/media/bill/WebSite/Climate and sun/_Climate and sun.html:136:
/media/bill/WebSite/Paul L Vaughan/0_Paul L Vaughan.html:328:
/media/bill/WebSite/page Howell - blog.html:965:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:895:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html:895:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html:895:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:961:
+--+
pList done by [find above, hand, geany (search "/media/bill/WebSite/" replace "/media/bill/Dell2/Website - raw/")] :
geany regexpr
	search: (/media/bill/WebSite/)(.*):(.*):(.*)
	replace : /media/bill/Website - raw/\2
/media/bill/Website - raw/Climate and sun/_Climate and sun.html
/media/bill/Website - raw/Paul L Vaughan/0_Paul L Vaughan.html
/media/bill/Website - raw/page Howell - blog.html
/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html
/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html
/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html
/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html
geany add apos to above list pre-pended by 'qnial> pList := ' :
geany regexpr
	search: (.*)
	replace : '\1
qnial> pList := '/media/bill/Website - raw/Climate and sun/_Climate and sun.html' '/media/bill/Website - raw/Paul L Vaughan/0_Paul L Vaughan.html' '/media/bill/Website - raw/page Howell - blog.html' '/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html' '/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html' '/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html' '/media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html'
qnial> str_replaceIn_pathList o d_webRawe (link '!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF') 'Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF' pList
Re-check to temp2 file, but this time for d_webRawe where changes were made :
$ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number 'Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""Venus et Mars line-links2.txt"
>> still have 16 instead of 19 before!!???
>> but it's OK! the subDir has been corrected
NUTS!! pList was all in d_webSite, but these files weren't in d_webRawe : I copied them over
+-----+
3. /media/bill/WebSite/!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv
Why is this a problem? it looks legitimate!
#] 0. Setup - define variables to make processing easier, reduce errors that can be catastrophic
NOTICE : include ['!!linkError!!', subDir, fname] as needed to make specific to LINKS ONLY!!
	don't want regular text to be changed, don't bother with [#=; backtrack ;=#] - not necessary
	DON'T include "$d_webSite", as this is replaced with "$d_webRawe"!!
DO include [start,trail]ing slashes for subDir replacement term in the search term include trailing slash for subDir replacements ONLY cannot use directly with unix : "!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF" $ unix_noApoQuote='!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv' $ unix_fixed='Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv' $ echo "$unix_noApoQuote" ; echo "$unix_fixed" Pay attention to [apo, quote, &]s in [subDir, fname]s qnial> qnial_error := '!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv' qnial> qnial_replace := 'Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv' qnial> EACH write qnial_error qnial_replace ; #] 1. change the affected [subDir, fname]s in "$d_webRawe" (not $d_webSite") This is done manually via the fileManager, before creating pList #] 2. find pList - files with fname $ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "$unix_noApoQuote" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""linkError fix temp1.txt" /media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html:373:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html:373:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html:373:
/media/bill/WebSite/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html:439:
#] 3. prepare QNial "pList" variable assignment expression
#] use htmlList of last step, regular expression search-replace with txt-editor geany
geany regexpr multiline search-replace
	search: (/media/bill/WebSite/)(.*):(.*):(.*)
	replace : /media/bill/Website - raw/\2
geany add apos to above list pre-pended by 'qnial> pList := ' :
geany escSeq
	search: \n
	replace : ' '\n
qnial> pList := '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html'
>> did twice, same results, then fixed 'Dell2/' error in instructions
#] 4. replace erroneous [subDir, fname] in pList with QNial
#] QNial is slow, but much safer with [apo, quote]s than [find, sed, grep], although the latter do work
qnial> str_replaceIn_pathList o d_webRawe qnial_error qnial_replace pList
+--+
?str_replaceIn_pathList error, file unknown : /media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html
?str_replaceIn_pathList error, file unknown : /media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html
?str_replaceIn_pathList error, file unknown : /media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html
?str_replaceIn_pathList error, file unknown : /media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html
+--+
>> why the error? all of these files are present!
Lots of linkErrors but the pList files don't appear as links? Maybe I missed that step?
Redo step 3 >> nope, same results
?str_replaceIn_pathList error, file unknown : /media/bill/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html
/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html
>> Yikes!! wrong subDir - missing 'Dell2/' from pList, missing from the search-replace instructions
Change :
+.....+
	replace : /media/bill/Website - raw/\2
+.....+
To :
+.....+
	replace : /media/bill/Dell2/Website - raw/\2
+.....+
qnial> str_replaceIn_pathList o d_webRawe qnial_error qnial_replace pList
#] 5. Re-list to temp2 file, but this time for d_webRawe where changes were made :
$ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '$unix_fixed' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""linkError fix temp2.txt"
Frankly, I'm not sure if the `& caused a problem or what. Leave it for the next round
Also, as these look OK except maybe the & :
/media/bill/WebSite/!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv
/media/bill/WebSite/!!linkError!!Charvatova solar inertial motion & activity/Verification/
/media/bill/WebSite/!!linkError!!Software programming & code/bin/SSH/
/media/bill/WebSite/!!linkError!!Software programming & code/System_maintenance/
/media/bill/WebSite/!!linkError!!Solar modeling and forecasting/_Solar modeling & forecasting.html
>> This is going to be major work
+-----+
4. /media/bill/WebSite/!!linkError!!Climate and sun/Glaciation model 005
>> missing trailing `/? in the link
Also : /media/bill/WebSite/!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language
#] 0. Setup - define variables to make processing easier, reduce errors that can be catastrophic
NOTICE : include ['!!linkError!!', subDir, fname] as needed to make specific to LINKS ONLY!!
	don't want regular text to be changed, don't bother with [#=; backtrack ;=#] - not necessary
	DON'T include "$d_webSite", as this is replaced with "$d_webRawe"!!
	DO include [start,trail]ing slashes for the subDir replacement term in the search term
	include trailing slash for subDir replacements ONLY
For [apo,quote,&]s cannot use directly with unix : "!!linkError!!Climate and sun/Glaciation model 005"
>> not a problem here
$ unix_noRootLinkErrorApoQuote='!!linkError!!Climate and sun/Glaciation model 005'
$ unix_fixed='Climate and sun/Glaciation model 005/'
$ echo "$unix_noRootLinkErrorApoQuote" ; echo "$unix_fixed"
Pay attention to [apo, quote, &]s in [subDir, fname]s
qnial> qnial_error := '!!linkError!!Climate and sun/Glaciation model 005'
qnial> qnial_replace := 'Climate and sun/Glaciation model 005/'
qnial> EACH write qnial_error qnial_replace ;
Fear - this may replace legitimate links : check "linkError fix temp1.txt" very carefully!
#] 1. change the affected [subDir, fname]s in "$d_webRawe" (not "$d_webSite")
This is done manually via the fileManager, before creating pList
>> not required here
#] 2. find pList - files with fname
$ find "$d_webSite" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "$unix_noRootLinkErrorApoQuote" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""linkError fix temp1.txt"
#] 3. prepare QNial "pList" variable assignment expression
#] use htmlList of last step, regular expression search-replace with txt-editor geany
geany regexpr multiline search-replace
	search: (/media/bill/WebSite/)(.*):(.*):(.*)
	replace : /media/bill/Dell2/Website - raw/\2
geany add apos to above list pre-pended by 'qnial> pList := ' :
geany escSeq
	search: \n
	replace : ' '
qnial> pList := '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html convertBodyLinks.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html str_replaceIn_path.html' '/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/webSite test- page Howell - blog.html update.html'
>> still can't find pList paths!?
	/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite test- page Howell - blog.html convertBodyLinks.html
	/media/bill/Dell2/Website - raw/Qnial/code develop_test/webSite/
>> OK - now I see it, was missing 'webSite/'
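The geany steps above could also be scripted; a rough sketch only (it assumes none of the listed paths contain [apo, quote]s) :
+--+
# hypothetical helper - build the QNial pList elements from "linkError fix temp1.txt" :
#   swap the webSite root for the webRawe root, strip ':lineNumber:lineContent', apo-quote each path, join with spaces
$ sed -e 's|^/media/bill/WebSite/|/media/bill/Dell2/Website - raw/|' -e 's|:[0-9]*:.*$||' "$d_temp""linkError fix temp1.txt" | sort -u | sed "s|.*|'&'|" | tr '\n' ' '
+--+
Paste the output after 'qnial> pList := ' ; paths with [apo, quote]s would still have to be handled by hand.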
#] 4. replace erroneous [subDir, fname] in pList with QNial
#] QNial is slow, but much safer with [apo, quote]s than [find, sed, grep], although the latter do work
qnial> str_replaceIn_pathList o d_webRawe qnial_error qnial_replace pList
#] 5. Re-list to temp2 file, but this time for d_webRawe where changes were made :
$ find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '$unix_fixed' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" >"$d_temp""linkError fix temp2.txt"
>> "linkError fix temp2.txt" is empty? why - it was listed before?
>> (note : '$unix_fixed' in single quotes is searched as a literal string - it needs double quotes to expand)
>> OK - appears in 'Climate and sun/Glaciation model 005', but isn't fixed!! but why isn't it found?
I don't get it - just wait until after the next webSite_doAll
+-----+
5. /media/bill/WebSite/!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language
>> missing trailing `/? in the link
Skip this, maybe just due to files missing in d_webRawe?
economics, markets/SP500/multi-fractal/SP500 1928-2020 yahoo finance.dat
>> missing part of the subDir - but should still work
08********08
#] 31May2021 same problem with 'urls errorslist.txt' - !!linkError!!
>> seems to be [fname, subDirs] in the same subDir as the webPages
No sense trying to run webRaweSite_doAll - first fix 'index.html'
webPageSite_update
	depther_global := 0 ;
	IF (OR ('Menu' 'fin Head' 'fin Footer' 'fin footer' EACHLEFT subStr_in_str webPageRawe))
		THEN depther := depther_global ;
		ELSE depther := (gage shape (`/ findAll_Howell webPageRawe)) - (gage shape (`/ findAll_Howell d_webRawe)) ;
	ENDIF ;
	backtrack := link (depther reshape (solitary '../')) ;
>> if depther = 0, does that cause an error?
qnial> a := 0 reshape (solitary '../')
qnial> gage shape a
0
qnial> a := link (0 reshape (solitary '../'))
qnial> gage shape a
0
qnial> diagram a
+ | +
Look at :
	line := str_executeEmbeds line (("fname fname)("fout fout)("backtrack backtrack)) ;
qnial> abadabado := ' football '
 football
qnial> str_executeEmbeds '[#=; abadabado ;=#]!!linkError!!Bill Howells videos/' (("fname 'index.html')("fout 5)("abadabado abadabado))
 football !!linkError!!Bill Howells videos/
Now try a null :
qnial> abadabado := null
qnial> str_executeEmbeds '[#=; abadabado ;=#]!!linkError!!Bill Howells videos/' (("fname 'index.html')("fout 5)("abadabado abadabado))
!!linkError!!Bill Howells videos/
>> This worked fine, so that's probably NOT the problem
Try simply removing '!!linkError!!', don't replace it with '[#=; backtrack ;=#]'
qnial> str_replaceIn_path o (link d_webRawe 'z_Archive') '!!linkError!!' '' (link d_webRawe 'index.html')
>> OK
qnial> webPageSite_update o (link d_webRawe 'index.html')
>> that worked
webRaweSite_doAll replaced backtrack with '' :
	writeDoStep (link 'str_replaceIn_pathList l ' chr_apo d_webRawe chr_apo ' ' chr_apo '!!linkError!!' chr_apo ' ' chr_apo '' chr_apo ' ' 'htmlPathsSortedByPath' ) ;
retry
qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webRaweSite_doAll
d_webRawe
	Bill Howells videos
d_webSite
	Bill Howells videos
>> what is putting the links back in? - looks like webPageRawe_update > internalLinks_return_relativePath
qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webPageRawe_update o (link d_webRawe 'index.html')
-->[nextv] subDir fName
+--------------------------------++
|economics, markets/options/TSLA/||
+--------------------------------++
-->[nextv] subDirFname_get_subPath subDir fname
?subDirFname_get_subPath error, subDir not found : economics, markets/options/TSLA/
>> WRONG!! - this should be fine!!
-->[nextv] subDirFname_get_subPath_tests
NOTE :	there are several "FORECAST.NDF" files, so the best one must be chosen based on subDir
	if an incomplete subDir is provided (eg "options/TSLA/") a full subDir is returned
	when arbitrary selections are made, a "1st was chosen" message is written
+-+
|o|?subDirFname_get_subPath error, subDir not found :
|l|?subDirFname_get_subPath error, subDir not found : /media/bill/Dell2/Website - raw/
|l|Stalin supported Hitler/images/150213 Howell, Dad - Stalin picture.jpg
|l|Stalin supported Hitler/images/150213 Howell, Dad - Stalin picture.jpg
|l|Stalin supported Hitler/animations - Howell/Opening_Context/01.5 A_year_of_stunning_victories/01.5 A_year_o
|l|?subDirFname_get_subPath error, fname not found : 01.5 A_year_of_stunning_victories.ogv
|l|?subDirFname_get_subPath error, fname not found : 01.5 A_year_of_stunning_victories
|l|Qnial/MY_NDFS/Ord Diff Eq Integration/FORECAST.NDF
|l|Qnial/MY_NDFS/economics, markets/options data [download, process].ndf
|o|?subDirFname_get_subPath error, subDir not found : options/TSLA/
|l|economics, markets/options/TSLA/price TSLA [call,putt] strikeDate 210618.png
+-+
subDirFname_get_subPath Change :
>> subDirer isn't defined here!
+.....+
ELSEIF (= '' fname) THEN
	IF (= '' subDirer) THEN
	ELSEIF (subDirer in allSubDirsSortedBySubdir) THEN path := link subDirer fname ;
	ELSE subDirs := ((subDir EACHRIGHT subStr_in_str allSubDirsSortedBySubdir) sublist allSubDirsSortedBySubdir) ;
+.....+
To :
+.....+
ELSEIF (= '' fname) THEN
	IF (= '' subDir) THEN path := fault '?subDirFname_get_subPath error, [subDir,fname] both null' ;
	ELSEIF (subDir in allSubDirsSortedBySubdir) THEN path := link subDir fname ;
	ELSE subDirs := ((subDir EACHRIGHT subStr_in_str allSubDirsSortedBySubdir) sublist allSubDirsSortedBySubdir) ;
+.....+
qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webPageRawe_update o (link d_webRawe 'index.html')
>> OK now
qnial> webPageSite_update o (link d_webRawe 'index.html')
>> also OK!!!
qnial> webRaweSite_doAll
/media/bill/Dell2/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 210531 19h12m31s
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
|8814|count of all links in webPages of this webSite                 |
|1190|count of all [file, dir, url]s targeted by links on the webSite|
Counts below are the number of unique TARGETED [file, dir]s of links (eg 5+ links per target on average)
Failures :
|39|errors list |
|30|extern fails|
|1 |intern fails|
|0 |bkmkEx fails|
Unknowns - mailtos are [old, deactivated for privacy], they won't work now
	bookmarks not checked but their fileLinks have, easy to find location in same webPage
|75 |mailto list|
|309|bkmkIn list|
OKs - these links have been shown to work :
|275|extern OK|
|443|intern OK|
|18 |bkmkEx OK|
[fail, unknown, OK, total] counts :
|  70|failed links |
| 384|unknown links|
| 736|OK links     |
|1190|total        |
>> super-AWESOME!!!!
70 failed links I can handle manually
>> quite possibly menu links?
08********08
#] 31May2021 Major remaining issue is 'urls errors list.txt'
check a few of these
>> most are fine! they just have !!linkError!! for some reason?
appears in :
	subDir_extractLongest_subDirGlobalList removes
	internalLinks_return_relativePath removes, but add back if error
	webURLs_extract merely write failed link to p_errorsURLs
	webRaweSite_doAll replaces '!!linkError!!' with backtrack
webRaweSite_doAll - uncomment :
	% writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo) ;
qnial> webRaweSite_doAll
>> same problem, 36 in index.html
check in 'webSite urlList.txt' :
	1 Allegre's second thoughts.pdf
First step - get rid of '!!linkError!!' - why does this persist?
internalLinks_return_relativePath :
+--+
FOR midIndx WITH midIndxs DO
	bookmark := null ;
	lineList@midIndx := lineList@midIndx str_remove_subStr '../' ;
	lineList@midIndx := lineList@midIndx str_remove_subStr '[#=; backtrack ;=#]' ;
	lineList@midIndx := lineList@midIndx str_remove_subStr '!!linkError!!'
; lineList@midIndx := str_replace_subStr 'http://www.billhowell.ca' '' lineList@midIndx ; lineList@midIndx := str_replace_subStr 'http://www.BillHowell.ca' '' lineList@midIndx ; lineList@midIndx := str_replace_subStr d_webRawe '' lineList@midIndx ; lineList@midIndx := str_replace_subStr '%20' ' ' lineList@midIndx ; ... path := subDirFname_get_subPath subDir fname ; IF (isfault path) THEN path := link '!!linkError!!' subDir fname ; ENDIF ; lineList@midIndx := link backtrack path bookmark ; +--+ >> ?? this should be taking care of the problem? after : lineList@midIndx := lineList@midIndx str_remove_subStr '../' ; added : lineList@midIndx := lineList@midIndx str_remove_subStr './' ; subDirFname_get_subPath removed : IF (= './' (2 take subDir)) THEN subDirer := 2 drop subDirer ; ENDIF ; A key step is in web_doAll - run manually here and check result : wait! htmlPathsSortedByPath - does it list full path or subPath? >> full path, should work qnial> str_replaceIn_pathList o d_webRawe '!!linkError!!' '[#=; backtrack ;=#]' htmlPathsSortedByPath >> it didn't work? >> OOPS - look in p_temp, not at original file NYET - it's OK : Wow! total mismatch - fileList versus varList >> OK - it worked! >> but I need to change so that it overwites original file if flag_backup = o! Otherwise this is too confusing!! Also double [#=; backtrack ;=#][#=; backtrack ;=#] occurs - hopefully will be removed by internalLinks_return_relativePath str_replaceIn_pathList : +--+ IF flag_backup THEN d_backup := link d_backupRoot 'z_Archive/' timestamp_YYMMDD_HMS ' backups str_replaceIn_pathList/' ; host link 'mkdir "' d_backup '" ' ; ELSE d_backup := d_backupRoot ; ENDIF ; ... IF (NOT path_exists '-d' d_backup) THEN EACH write '?str_replaceIn_pathList error : could not create d_backup : ' d_backup ; ENDIF ; +--+ >> d_backupRoot is d_webRawe inn context (arg of optr) changed to : IF (AND flag_backup (NOT path_exists '-d' d_backup)) Try again qnial> str_replaceIn_pathList o d_webRawe '!!linkError!!' '[#=; backtrack ;=#]' htmlPathsSortedByPath now see if double [#=; backtrack ;=#][#=; backtrack ;=#] is removed by internalLinks_return_relativePath qnial> webPageSite_update o (link d_webRawe 'index.html') >> nyet?? IF flag_backup THEN host link 'mv "' p_temp '" "' webPageSite '"' ; ENDIF ; This is WRONG!! again, causes confusion! It was designed this way for early-stage debugging double [#=; backtrack ;=#][#=; backtrack ;=#] was removed, but backtrack was not put in!!?? webPageSite_update must put in backtracks!!?? backtrack := link (depther reshape (solitary '../')) ; line := str_executeEmbeds line (("fname fname)("fout fout)("backtrack backtrack)) ; That's supposed to work! Try again qnial> str_replaceIn_pathList o d_webRawe '!!linkError!!' '[#=; backtrack ;=#]' htmlPathsSortedByPath >> didn't work oops, should have been : qnial> webPageSite_update o (link d_webRawe 'index.html') >> stupid me - 'index.html' is in the root, so there are NO backtrack check '_Neil Howell.html' : >> looks good Retry qnial> webRaweSite_doAll same problem with 'urls errorslist.txt' - !!linkError!!, reduced from 95 to 90 >> seems to be [fname, subDirs] in same subDir as webPages No sense trying to run webRaweSite_doAll - first fix 'index.html' 08********08 #] 31May2021 continue fixes get rid of './' = dotSlash in subDirFname_get_subPath fix [bkmk[In,Ex]Page list pgPosn get rid of? or is there another case? 
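For reference on the '[#=; backtrack ;=#]' embeds discussed above - a minimal bash sketch (not the QNial depther code) of what the backtrack should expand to : one '../' per level of the webPage's subDir, and nothing for pages in the webSite root like 'index.html'. Variable names other than subDir and backtrack are just for the sketch.
+--+
subDir='economics, markets/options/TSLA/'                # example subDir, relative to d_webRawe, trailing '/'
depth=$(printf '%s' "$subDir" | tr -cd '/' | wc -c)      # 3 slashes = 3 levels deep
backtrack='' ; i=0
while [ "$i" -lt "$depth" ] ; do backtrack="$backtrack../" ; i=$((i+1)) ; done
echo "$backtrack"                                        # '../../../' ; a root page (depth 0) gives ''
+--+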
+-----+ subDirFname_get_subPath Change : +.....+ IF (1 < (gage shape subDir)) THEN IF (= './' (2 take subDir)) THEN dotSlash := './' ; subDirer := 2 drop subDirer ; ELSE dotSlash := null ; subDirer := subDir ; ENDIF ; ENDIF ; +.....+ To : +.....+ IF (= './' (2 take subDir)) subDirer := 2 drop subDirer ; ENDIF ; ... removed dotSlash elsehwhere in optr +.....+ subDirFname_get_subPath Change : +.....+ midIndxsLines_bads := 'http' 'mailto:' './' ; +.....+ To : +.....+ midIndxsLines_bads := 'http' 'mailto:' ; +.....+ +-----+ fix 'urls bkmk[In,Ex]Page list.txt' bkmkInPage - webSite_link_counts only? plus header webSite_link_counts : p_temp := link d_temp 'webSite_link_counts temp.txt' ; >> latter has : 18 /media/bill/Dell2/Website - raw/webWork files/urls pgPosn list.txt >> but nothing for 'urls [bkmk[In,Ex]Page list.txt' +--+ WHILE (~= ??eof (fname := readfile finn)) DO cmd := link 'wc -l "' (link d_webWork fname) '" | sed "s/\(^[0-9]\+\) \(.*\)/\1\\t\2\\t/" >>"' p_temp '" ' ; host cmd ; +--+ webURLs_extract - creates the files >> the last run didn't change the 'urls list.txt' files? rerun to see qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webURLs_extract +--+ loading subDirFname_get_subPath ?expecting then: ) ) SUBDIRER <***> := 2 DROP loading subDirFname_get_subPath_tests ?undefined identifier: := ( SUBDIRFNAME_GET_SUBPATH <***> '' '' ) loading internalLinks_return_relativePath ?undefined identifier: PATH := SUBDIRFNAME_GET_SUBPATH <***> SUBDIR FNAME ; loading webPageRawe_update ?undefined identifier: LINE := INTERNALLINKS_RETURN_RELATIVEPATH <***> BACKTRACK ' webURLs_extract +--+ loading subDirFname_get_subPath ?expecting then: ) ) SUBDIRER <***> := 2 DROP loading webSite_link_counts ?local in nonlocal list: P_ALLLINKS P_LINKTYPEL P_SUMMARY <***> ; +--+ >> trivial stupidity on my part rerun to see qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webURLs_extract >> still no 'urls bkmk[In,Ex]Page list.txt' >> not updated : 'urls pgPosn list.txt' Why aren't working? : fbki := open p_pgBkInURLs "w ; fbke := open p_pgBkExURLs "w ; >> oops, wrong symbols,use fbki := open p_BkmkInURLs "w ; fbke := open p_BkmkExURLs "w ; >> OK, looks good now qnial> webRaweSite_doAll >> delete [howell, pgPosn] cart [lists,files] >> change code to reset 'count of all links in webSite' >> fix count of unknown links (may be error of pgPosn?) webURLs_extract - this should "reset" p_allLinks, but instead appends? 
: host link 'cp -p "' p_temp1 '" "' p_allLinks '"' ; I added before that : p_temp1 := link d_temp 'webURLs_extract temp1.txt' ; % delete current p_temp1 to reset ; host link 'rm "' p_temp1 '" ' ; olde code % 18Nov2020 I won't deal with 'pgPosn' until later - removed for now ; % see olde code in link d_Qndfs 'website updates notes.txt' ; +-----+ 'urls errors list.txt' >> these are all d_webSite errors - wait for another stage of corrections rerun to see qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webURLs_extract qnial> webSite_link_counts qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' loading webRawe_extract_pathsSubDirsFnames ?expecting end of block: ' This is done only by [initial loaddefs, webURLs_extract, webRawLinks_remove_pct20 webRaweSite_doAll] ' >> trivial extra `; qnial> webURLs_extract >> OK qnial> webSite_link_counts still has 'howell list' ? - oops fails to add failures qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webURLs_extract 'webSite linkType pnames.txt' still has : urls howell list.txt >> I deleted that line, also 'urls pgPosn list.txt' qnial> webSite_link_counts rerun to see qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' loading webURLs_extract ?undefined identifier: P_ERRORSURLS P_EXTERNURLS P_HOWELLURLS <***> P_INTERNURLS ?undefined identifier: P_INTERNURLS P_MAILTOURLS P_PGBKINURLS <***> P_PGBKEXURLS EACHLEFT >> I fixed these qnial> webURLs_extract qnial> webSite_link_counts intern fails : all but one are './' links, can't tell if d_web[Rawe,Site] index.html - ~36 like this :
  • Bill Howells videos
  • bin - bash scripts for Linux
  • Charvatova solar inertial motion & activity
'urls errors list.txt' - looks like ALL are d_webSite
>> maybe some are from old duplicate files?
>> possibly due (some of them) to directory shuffling, but also subDir reconstruction isn't good?
>> many (almost half?) are conference guide links - is it being updated?
+-----+
urls_check add : % for p_BkmkExURLs, just check the root link, not the bookmark part (file exists) ; IF ('BkmkEx' = linkType) THEN p_link := first str_splitBy_chr `# p_link ; ENDIF ;
rerun to see
qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webURLs_extract
qnial> webSite_link_counts
>> didn't work. Why?
qnial> urls_check 'bkmkEx'
>> MUCH better!
Major remaining issue is 'urls errors list.txt'
08********08
#] 30May2021 webURLs_extract
incorporate new optr subDirFname_get_subPath
NUTS!!! the whole './' dotSlash was a waste of time - just use backtrack
+-----+
1st modify subDirFname_get_subPath to handle './' = dotSlash
>> Hmm, I might have lost many fnames with the last few webSite_doAll?
+-----+
webURLs_extract - # 30May2021 - p_webSiteURLlist only has d_webSite links, NOT d_webRawe (makes sense)
webRawe_extract_pathsSubDirsFnames used in :
webURLs_extract -> used by webRaweSite_doAll
webRawLinks_remove_pct20
webRaweSite_doAll
>> I should make this more efficient - maybe too many updates?
+-----+
urls_check handle internal bookmarks [internal, external] to webPage being checked
internal-to-webPage links start with `#, for now simply don't corrupt & keep as is, later see if I can check
Tricky for d_web[Rawe,Site], to update internal links
webRaweSite_doAll : replace '!!linkError!!' with '[#=; backtrack ;=#]'
webURLs_extract is for d_webSite only, changes to absolute addressing :
% Convert each relative link to absolute, sort unique ; cmd := link 'cat "' p_temp2 '" | sed "s!\[#=; backtrack ;=#\]!;s' d_webSite '!g" | sort -u >"' p_webSiteURLlist '"' ;
Questions : where do I convert d_webRawe back to relative??? example : after writeDoStep 'webAllRawOrSite_update l "webPageRawe_update' ; all links are relative with '[#=; backtrack ;=#\]', they are NOT absolute
internalLinks_return_relativePath does the switch back from absolute to relative
+----+
for internal-to-webPage links that start with `# either [subDirFname_get_subPath,internalLinks_return_relativePath] - do it with internalLinks_return_relativePath, avoids useless processing of subDirFname_get_subPath
plus urls_check - add provisions for the internal-to-webPage links
IF (`# = (first lineList@midIndx)) THEN null ; % no change required for internal-to-webPage link start with `# ; ELSE
>> this bypasses checks of [internal-to-webPage links, external-to-my-webPage bookmarks]
at some point I must check them using eg. in a webPage
in 'webSite header.ndf' : p_linkTypeL := link d_webWork 'webSite linkType pnames.txt' ;
I added :
urls bkmkInPg list.txt this used to be pgPosnURLs
urls bookmks exPage.txt
>> Nyet!
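For later, when the bookmark links do get checked - a rough bash sketch of the 'just check the root link, not the bookmark part' idea from urls_check above (hypothetical; p_file is not a variable in the .ndf files, and d_webSite is assumed to end with '/') :
+--+
p_link='Pandemics, health, and the Sun/_Pandemics, health, and the sun.html#Robert Prechter - Socionomics'
p_file="${p_link%%#*}"                     # strip everything from the first '#' on
if [ -f "$d_webSite$p_file" ]
   then echo "bkmkEx OK   : $p_link"
   else echo "bkmkEx fail : $p_link"
fi
+--+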
these won't be checked for now and therefore shouldn't be in [urls_check,webSite linkType pnames.txt] >> already covered by webURLs_extract Change : +.....+ ELSEIF (= `# (first linker)) THEN writefile fpos linker ; +.....+ To : +.....+ ELSEIF (= `# (first linker)) THEN writefile fpos linker ; ELSEIF (in `# linker) THEN writefile fbke linker ; +.....+ +-----+ loaddef to find coding errors qnial> bye qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' +--+ >>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf ?SCAN - missing ' at end of string: ' to e >>>>>> loading start : webSite header.ndf loading webURLs_extract ?expecting expression sequence: P_BKMKEXURL_FAILS P_BKMKEXURL_OK ; <***> ?tokens left: ; ENDIF ; <***> } loading webRaweSite_doAll ?expecting semicolon: CHR_APO D_WEBRAWE D_WEBSITE <***> errors found: 4 +--+ >> [simple,stupid] typos >> Now loads works qnial> webRaweSite_doAll (d_webSite stuff commented out) /media/bill/Dell2/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt >> Didn't help at all!! Makes sense - most were d_webSite, which I'm not doing yet urls errors list.txt - almost all are d_webSite, which I'm not doing yet urls intern fails.txt - all are './', which I failed to correct (might be d_webSite) bkmk[In,Ex]Page lists - not even created, don't worry now (lost pgPosn count) +-----+ What the heck, run d_webSite too! I uncommented that part in webRaweSite_doAll qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webRaweSite_doAll (d_webSite stuff included) >> OK, that's more like it, but not much has improved with the coding changes!! dotSlash './' still doesn't work p_pgBk[In,Ex]URLs have nothing - maybe not as easy as I thought >> Yikes! d_webSite still has '[#=; backtrack ;=#]' - backTrack isn't working,nor are the checks! >> Ouch! sed problem - many of these : ;s/media/bill/WebSite/Civilisations and sun/Howell - radioisotopes and history.jpg webURLs_extract - unfinished thought : Change : +.....+ cmd := link 'cat "' p_temp2 '" | sed "s!\[#=; backtrack ;=#\]!;s' d_webSite '!g" | sort -u >"' p_webSiteURLlist '"' ; +.....+ To : +.....+ cmd := link 'cat "' p_temp2 '" | sed "s!\[#=; backtrack ;=#\]!' d_webSite '!g" | sort -u >"' p_webSiteURLlist '"' ; +.....+ qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webRaweSite_doAll (d_webSite stuff included) +--+ +-----+-----------------------------+ |17628|count of all links in webSite| |507|intern fails| | |bkmkInPage list| +--+---------------+ | |bkmkExPage list| +--+ 210530 17h42m30s webAllRawOrSite_update l "webPageSite_update ?webAllRawOrSite_update - unrecognized optr_rawOrSite +-----+-----------------------------+ |17628|count of all links in webSite| +-----+-----------------------------+ >> wow! clearly this isn't being reset |507|intern fails| ;s/media/bill/WebSite/Paul L Vaughan/Vaughan 130224 - Solar Terrestrial Volatility Waves.pdf >> I screwed up my attempted fix! from previous correcctions : +--+ >> Ouch! 
sed problem - many of these : ;s/media/bill/WebSite/Civilisations and sun/Howell - radioisotopes and history.jpg webURLs_extract - unfinished thought : Change : +.....+ cmd := link 'cat "' p_temp2 '" | sed "s!\[#=; backtrack ;=#\]!;s' d_webSite '!g" | sort -u >"' p_webSiteURLlist '"' ; +.....+ To : +.....+ cmd := link 'cat "' p_temp2 '" | sed "s!\[#=; backtrack ;=#\]!' d_webSite '!g" | sort -u >"' p_webSiteURLlist '"' ; +.....+ +--+ Why hasn't 'index.html' changed? It was touched, but same problem check [webPageSite_update, webAllRawOrSite_update] webAllRawOrSite_update - not key to this? webPageSite_update - key features have not worked execute_embeds backtrack count of extern fails webPageSite_update : subDir fname := path_retrieve_subDirFname webPageRawe d_webRawe ; path_retrieve_subDirFname no longer exists!! Why no loaddef error msg? it is taken as a variable with no definition : ?no_value webPageSite : isn't defined!! >> not even set up to update? IF (NOT ("webPageRawe_update = optr_rawOrSite)) THEN write '?webAllRawOrSite_update - unrecognized optr_rawOrSite' ; ELSE olde code : % 30May2021 not required! : subDir fname := path_retrieve_subDirFname webPageRawe d_webRawe ; % modification ; webPageSite := link d_webSite (webPageRawe str_remove_subStr '[#=; backtrack ;=#]') ; % write webPageSite ; I added : webPageSite := link d_webSite (str_remove_subStr webPageRawe d_webRawe) ; str_executeEmbeds olde code : % remove [#= =#] brackets ; +-----+ I'm going to bed -just try another update qnial> bye $ qnial qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webRaweSite_doAll (d_webSite stuff included) qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, loaddefs - two oops : loading webPageSite_update +--+ ?undefined identifier: ( "fname FNAME <***> ) ( "fout loading webAllRawOrSite_update ?undefined identifier: ; ELSE WEBPAGESITE_UPDATE <***> FLAG_BACKUP WEBPAGE ; +--+ from 'strings.ndf', bring back deleted code ?& adapt? : subDir fname := path_retrieve_subDirFname webPageRawe d_webRawe ; qnial> webRaweSite_doAll (d_webSite stuff included) wrongsubDir - should have been fixed! file:///media/bill/WebSite/economics,%20markets/S&P%20500%20Shiller-forward%20PE%20versus%2010y%20Treasury%20bond%20rates.html file:///media/bill/WebSite/Solar%20modeling%20and%20forecasting/_Solar%20modeling%20&%20forecasting.html +-----+ olde code # loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' IF flag_debug THEN write 'loading webRawLinks_remove_pct20' ; ENDIF ; #] webRawLinks_remove_pct20 IS - generate htmlPathLists # 04Oct2020 initial # 18Nov2020 adapted to new coding from updates NOT done coding - have to change html files!!! 
# 30May2021 reove to '' - don't need now as covered by internalLinks_return_relativePath webRawLinks_remove_pct20 IS { NONLOCAL d_webRawe d_htmlBackup htmlPathsSortedByPath p_webPageList p_html_lines_pct20 p_html_files_pct20 ; % ; % backup files ; p_webPageList p_html_lines_pct20 p_html_files_pct20 EACHLEFT pathList_backupTo_dir d_htmlBackup ; % ; % webSite_readpathsSubDirsFnames in 'webSite header.ndf' lists webSite [all, html] files except : ; % "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations|i9018xtp.default/extensions/" ; webRawe_extract_pathsSubDirsFnames ; % ; % list all html file lines with %20 in ["' p_html_lines_pct20 '" ' ; % ' | sed ' chr_apo 's#%20# #g' ; % ; % [cull, sort] html files with %20 in p_html_lines_pct20 ; host link 'cat "' p_html_lines_pct20 '" | sed ' chr_apo 's#\(.*\)html:\(.*\)#\1html#' chr_apo ' | sort -u >"' p_html_files_pct20 '" ' ; } # webURLs_extract %EACH path_delete p_errorsURLs p_externURLs p_howellURLs p_internURLs p_mailtoURLs p_pgPosnURLs p_temp1 ; % ; % webSite_readpathsSubDirsFnames in 'webSite header.ndf' lists webSite [all, html] files except : ; % "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations|i9018xtp.default/extensions/" ; webRawe_extract_pathsSubDirsFnames ; % 24Nov2020 - 'Conference guides' are no longer excluded, so all that is included ; % it doesn't seem like all web[Page, Menu]s are listed? ; # old stuff p_html_url_clean := link d_temp 'html urls clean.txt' ; p_externURL_clean := link d_temp 'urls extern clean.txt' ; p_externURL_sort := link d_temp 'urls extern sorted.txt' ; p_internURL_clean := link d_temp 'urls intern clean.txt' ; p_internURL_sort := link d_temp 'urls intern sorted.txt' ; p_mailtoURL_clean := link d_temp 'urls mailto clean.txt' ; p_mailtoURL_sort := link d_temp 'urls mailto sorted.txt' ; pgBkInURL_clean := link d_temp 'urls pgPosn clean.txt' ; p_BkmkInURL_sort := link d_temp 'urls pgPosn sorted.txt' ; % move all "pgPosn" links in p_internURLs to p_pgPosn ; host link 'grep "#" "' p_internURLs '" >>"' p_pgPosn '"' ; host link 'grep --invert-match "#" "' p_internURLs '" >"' p_temp1 '"' ; host link 'mv "' p_temp1 '" "' p_internURLs '"' ; 08********08 #] 29May2021 fname_get_sDirPath IS OP subDir fname Must create allFnames_subDirLists in 'webSite header.ndf' qnial> allMulplicateFnames +------------------------------------------------------------------------------------------------------------- |/media/bill/Dell2/Website - raw/Lucas/math Howell/cos - 1 noo, iterative, non-feedback/d-dt Rpcs^-5_t__cos - +------------------------------------------------------------------------------------------------------------- -----+-------------------------------------------------------------------------------------------------------- 1.txt|/media/bill/Dell2/Website - raw/Lucas/math Howell/cos - 1 yes, iterative, non-feedback/d-dt Rpcs^-5_t__c -----+-------------------------------------------------------------------------------------------------------- ----------+--------------------------------------------------------------------------------------------------- os - 1.txt|/media/bill/Dell2/Website - raw/Projects - mini/History/Timechart of Military History - naval/0001i 
----------+--------------------------------------------------------------------------------------------------- --------------+----------------------------------------------------------------------------------------------- w/SCAN0000.rtf|/media/bill/Dell2/Website - raw/Projects - mini/History/Timechart of Military History - naval/0 --------------+----------------------------------------------------------------------------------------------- ------------------+------------------------------------------------------------------------------------------- 002iw/SCAN0000.rtf|/media/bill/Dell2/Website - raw/Projects - mini/History/Timechart of Military History - nav ------------------+------------------------------------------------------------------------------------------- ----------------------+--------------------------------------------------------------------------------------- al/0003iw/SCAN0000.rtf|/media/bill/Dell2/Website - raw/Projects - mini/History/Timechart of Military History - ----------------------+--------------------------------------------------------------------------------------- --------------------------+----------------------------------------------------------------------------------- naval/0004iw/SCAN0000.rtf|/media/bill/Dell2/Website - raw/Projects - mini/History/Timechart of Military Histo --------------------------+----------------------------------------------------------------------------------- ------------------------------+ ry - naval/0005iw/SCAN0000.rtf| ------------------------------+ >> why isn't 'FORECAST.NDF' in the list? $ find "$d_webRawe" -name '*.NDF' | grep 'FORECAST' /media/bill/Dell2/Website - raw/Qnial/MY_NDFS/Ord Diff Eq Integration/FORECAST.NDF /media/bill/Dell2/Website - raw/Projects - mini/Solar system/Laskar - Milankovic insolation program/Howell/Glaciation Milancovic -6 to 1 My 002/documentation/FORECAST.NDF use listOfLists_linkSortCullTo_aryCommonKeyIndexs in setup.ndf qnial> gage shape allMulplicateIndxs 651 >> ouch - many duplicate [fname, subDir]s New optr #] fname_get_subDirPath IS OP subDir fname - return path if fname in allFnamesSortedByFname, else error # 28May2021 initial IF flag_break THEN BREAK ; ENDIF ; fname_get_subDirPath IS OP subDir fname { LOCAL indx path subDir subDirs ; NONLOCAL d_webRawe allFnamesSortedByFname allMulplicateFnames allMulplicateSubDirs ; IF (fname in allFnamesSortedByFname) THEN IF (NOT isfault (indx := find_Howell fname allMulplicateFnames)) THEN subDirs := indx pick allMulplicateSubDirs ; IF (isfault (subDir := first ((subDir EACHRIGHT subStr_in_str subDirs) sublist subDirs))) THEN path := fault (link '?fname_get_subDirPath error, subDir not found in allMulplicateSubDirs: ' subDir) ; ELSE path := link subDir fname ; ENDIF ; ELSE path := link ((find_Howell fname allFnamesSortedByFname) pick allSubDirsSortedByFname) fname ; ENDIF ; ELSE path := fault (link '?fname_get_subDirPath error, fname not found : ' fname) ; ENDIF ; path } +-----+ # olde code IF flag_debug THEN write 'loading webSite_readpathsSubDirsFnames' ; ENDIF ; #] webSite_readpathsSubDirsFnames IS - read stable [path, dir] lists # 11Nov2020 initial # do this only after [create, move] cart [path, dir]s # 25May2021 not invoked at present? 
webSite_extract_pathsSubDirsFnames is used (above) webSite_readpathsSubDirsFnames IS { NONLOCAL d_webRawe p_webPageList allPathsSortedByFname allSubDirsSortedByFname allFnamesSortedByFname allPathsSortedByPath allSubDirsSortedBySubdir htmlPathsSortedByFname htmlSubDirsSortedByFname htmlFnamesSortedByFname htmlPathsSortedByPath htmlSubDirsSortedBySubdir htmlNormalPages htmlConfGuidePages ; % ; % There must be something more [simple, efficient] than 'rows transpose mix'? ; allPathsSortedByPath := strList_readFrom_path p_allFileList ; fnamePosns := (`/ EACHRIGHT (1 + last findall) allPathsSortedByPath) ; subDirs fnames := rows transpose mix (fnamePosns EACHBOTH [take, drop] allPathsSortedByPath) ; allFnamesSortedByFname allSubDirsSortedByFname allPathsSortedByFname := lists_sortupCullOn1st ( fnames subDirs allPathsSortedByPath) ; allSubDirsSortedByFname := allSubDirsSortedByFname EACHLEFT str_extractPast_strFront d_webRawe ; allSubDirsSortedBySubdir := cull sortup allSubDirsSortedByFname ; % ; % There must be something more [simple, efficient] than 'rows transpose mix'? ; htmlPathsSortedByPath := strList_readFrom_path p_webPageList ; fnamePosns := `/ EACHRIGHT (1 + last findall) htmlPathsSortedByPath ; subDirs fnames := rows transpose mix (fnamePosns EACHBOTH [take, drop] htmlPathsSortedByPath) ; htmlFnamesSortedByFname htmlSubDirsSortedByFname htmlPathsSortedByFname := lists_sortupCullOn1st ( fnames subDirs htmlPathsSortedByPath) ; htmlSubDirsSortedByFname := htmlSubDirsSortedByFname EACHLEFT str_extractPast_strFront d_webRawe ; htmlSubDirsSortedBySubdir := cull sortup htmlSubDirsSortedByFname ; htmlConfGuidePageFlags := 'Neural nets/Conference guides/' EACHRIGHT subStr_in_str htmlPathsSortedByPath ; htmlConfGuidePages := htmlConfGuidePageFlags sublist htmlPathsSortedByPath ; htmlNormalPages := (NOT htmlConfGuidePageFlags) sublist htmlPathsSortedByPath ; } # 25May2021 not used now ??? webSite_readpathsSubDirsFnames ; # EACH (gage shape) allPathsSortedByFname allSubDirsSortedByFname allFnamesSortedByFname allPathsSortedByPath allSubDirsSortedBySubdir # EACH (gage shape) htmlPathsSortedByFname htmlSubDirsSortedByFname htmlFnamesSortedByFname htmlPathsSortedByPath htmlSubDirsSortedBySubdir # olde code p_htmlPathsSiteList htmlPathsSiteSortedByPath ; host link 'find "' d_webSite '" -maxdepth 4 -type f -name "*.html" | grep --invert-match "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations|i9018xtp.default/extensions/" | sort -u >"' p_htmlPathsSiteList '" ' ; htmlPathsSiteSortedByPath := strList_readFrom_path p_htmlPathsSiteList ; 08********08 #] 28May2021 !!linkError!! problem Only [check_subDir, internalLinks_return_relativePath, webPageRawe_update] insert this into webPages? (I'm not sure) qnial> webPageRawe_update l webPage ------------------------------------------------------------- Break debug loop: enter debug commands, expressions or type: resume to exit debug loop executes the indicated debug command current call stack : webpagerawe_update internallinks_return_relativepath ------------------------------------------------------------- -->[stepv] nextv -->[nextv] line
  • Use the listing of the sub-directories of this root directory. This allows you to browse through ALL [files, directories] of this website. >> Ah hah! - double "!!linkError!!!!linkError!!" generated by bookmark-only situation, >> so BOTH [subDir, fname] fail -->[nextv] line >> not html! 'price TSLA [call,putt] strikeDate 210618.png' -->[nextv] find_Howell 'price TSLA [call,putt] strikeDate 210618.png' allFnamesSortedByFname 4404 >> no problem for ? : +--+ IF (isfault (find_Howell fname allFnamesSortedByFname)) THEN linkError := '!!linkError!!' ; ENDIF ; +--+ >> why did the flag_break even occur? -->[nextv] fname options/TSLA/price TSLA [call,putt] strikeDate 210618.png >> uh-oh, of course +--+ indxLastSlash := (last find_Howell `/ lineList@midIndx) + 1 ; IF (isfault indxLastSlash) THEN % fname only ; fname := lineList@midIndx ; subDir := '' ; ELSE % fname-subDir combo ; fname := indxLastSlash drop lineList@midIndx ; subDir := indxLastSlash take lineList@midIndx ; ENDIF ; +--+ >> test it -->[nextv] a := 'options/TSLA/price TSLA [call,putt] strikeDate 210618.png' options/TSLA/price TSLA [call,putt] strikeDate 210618.png -->[nextv] b := (last find_Howell `/ a) + 1 8 -->[nextv] b drop a TSLA/price TSLA [call,putt] strikeDate 210618.png -->[nextv] b take a options/ >> last find_Howell gives wrong slash? -->[nextv] b := (last findAll_Howell `/ a) + 1 13 >> OK -->[nextv] b drop a price TSLA [call,putt] strikeDate 210618.png -->[nextv] b take a options/TSLA/ >> OK qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webPage /media/bill/Dell2/Website - raw/index.html qnial> webPageRawe_update l webPage >> OK - mostly works. Some problems still in 'index.html' :
  • Use the listing of the sub-directories of this root directory. This allows you to browse through ALL [files, directories] of this website.
  • Cool emails
  • Solar modeling and forecasting Shower time, but start Failures : +---+------------+ |363|errors list | +---+------------+ |22 |extern fails| +---+------------+ |0 |howell list | +---+------------+ |27 |intern fails| +---+------------+ Unknowns - I havent written code to really show [OK, fail] : +--+-----------+ |75|mailto list| +--+-----------+ |15|pgPosn list| +--+-----------+ OKs - these links have been shown to work : +---+---------+ |239|extern OK| +---+---------+ |466|intern OK| +---+---------+ [fail, unknown, OK, total] counts : +----+-------------+ | 412|failed links | +----+-------------+ | 90|unknown links| +----+-------------+ | 705|OK links | +----+-------------+ |1207|total | +----+-------------+ >> still a BIG 'urls errors list.txt' : mostly bookmarks `# - I wonder if all were corrupted in earlier webSite_doAlls? subDirs without trailing `/ paths in d_webRawe root? - missing subDir.. see below : Wait a minute - most errors are for webSite, which wasn't updated! 1. Put './' back into Table of Contents (directory) of d_root 2. bookmark issue - do I have to restore a previous date of htmls? find '#Questions, Successes, Failures' in allFnamesSortedByFname from 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' 'Quick [searches, fixes] for bad webSite links - example [QNial-bash] fixes' #] search allFnamesSortedByFname for an fname qnial> (= '#Questions, Successes, Failures' allFnamesSortedByFname) sublist allPathsSortedByFname >> nothing #] search allFnamesSortedByFname for part of an fname qnial> ('#Questions, Successes, Failures' subStr_in_str allFnamesSortedByFname) sublist allPathsSortedByFname >> again, nothing qnial> ('#' subStr_in_str allFnamesSortedByFname) sublist allPathsSortedByFname >> again, nothing Is there a separate list of bookmarks somehow? p_errorsURLs = 'urls errors list.txt' webURLs_extract -> list of all '!!linkError!!' put into p_errorsURLs +--+ % move all "pgPosn" links in p_internURLs to p_pgPosnURLs ; host link 'grep "#" "' p_internURLs '" >>"' p_pgPosnURLs '"' ; host link 'grep --invert-match "#" "' p_internURLs '" >"' p_temp1 '"' ; host link 'mv "' p_temp1 '" "' p_internURLs '"' ; +--+ p_errorsURLs does NOT appear in other optrs What is p_pgPosnURLs? see 'webSite header.ndf' : p_pgPosnURLs := link d_webWork 'urls pgPosn list.txt' ; p_pgPosnURL_fails := link d_webWork 'urls pgPosn fails.txt' ; p_pgPosnURL_OK := link d_webWork 'urls pgPosn OK.txt' ; 'urls pgPosn list.txt' only lists 15 p_pgPosnURLs But conference guides have MANY!!?? 'webSite header.ndf' : # 28Oct2020 "Conference guides" are not included as I don't want to risk corrupting them & I won't change them anyways But webSite_extract_pathsSubDirsFnames doesn't show exclu: host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort >"' p_nonuniqueFileList '" ' ; webURLs_extract +--+ IF (subStr_in_str 'http://www.billhowell.ca/' (tolower linker)) THEN writefile fhow linker ; ELSEIF (subStr_in_str '!!linkError!!' 
linker) THEN writefile ferr linker ; ELSEIF (subStr_in_str 'http' linker) THEN writefile fext linker ; ELSEIF (subStr_in_str 'mailto:' linker) THEN writefile fmto linker ; ELSEIF (= `# (first linker)) THEN writefile fpos linker ; ELSE writefile fint linker ; ENDIF ; +--+ >> Wait a minute - most errors are for webSite, which wasn't updated! 08********08 #] 27May2021 Again, back to the link problem with '/media/bill/Dell2/Website - raw/' As per earlier question (below) : internalLinks_return_relativePath, could be : +-- ELSEIF (OR (subDir_ins := lineList@midIndx EACHRIGHT subStr_in_str allSubDirsSortedBySubdir)) THEN subDirHits := subDir_ins sublist allSubDirsSortedBySubdir ; % How to choose a unique hit? If one is a full hit - subDir, take it, if not, error ; subDirENDs := (gage shape lineList@midIndx) EACHRIGHT takeright subDirHits ; subDirHits := (lineList@midIndx EACHRIGHT = subDirENDs) sublist subDirHits ; IF (= 1 (gage shape subDirHits)) THEN lineList@midIndx := link backtrack ((first subDirHits) str_remove_subStr d_webRawe) ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; +--+ Look at qnial> allSubDirsSortedBySubdir none have '/media/bill/Dell2/Website - raw/' Just practice with one webPage : I plastered ?? with : IF flag_break THEN BREAK ; ENDIF ; qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> flag_break := l (fonn) qnial> webPage := link d_webRawe 'Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html' qnial> webPageRawe_update l webPage -->[nextv] i_fname 2620 -->[nextv] allFnamesSortedByFname@i_fname Dark matter video 1 - initial, simple.mpeg -->[nextv] i_fname pick allPathsSortedByFname /media/bill/Dell2/Website - raw/Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Dark matter video 1 - initial, simple.mpeg >> these all have '/media/bill/Dell2/Website - raw/' also : htmlPathsSortedByPath but : htmlSubDirsSortedBySubdir does NOT have d_webRawe >> maybe this should be the variable used? internalLinks_return_relativePath Change : +.....+ IF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRawe) ; % check for a valid fname at end of a midIndx ; +.....+ To : +.....+ IF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick htmlSubDirsSortedBySubdir) str_remove_subStr d_webRawe) ; % check for a valid fname at end of a midIndx ; +.....+ >> notice the "switch" to select the subDir Again : qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> flag_break := l (fonn) qnial> webPage := link d_webRawe 'Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html' qnial> webPageRawe_update l webPage -->[nextv] i_fname 2620 -->[nextv] allFnamesSortedByFname@i_fname Dark matter video 1 - initial, simple.mpeg -->[nextv] i_fname pick htmlSubDirsSortedBySubdir ?address >> oops, I should have known htmlSubDirsSortedByFname should be used? 
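As an aside, the [subDir, fname] split at the LAST '/' (the find_Howell versus findAll_Howell trap noted above) is direct in bash parameter expansion - a sketch only, reusing the same example link :
+--+
a='options/TSLA/price TSLA [call,putt] strikeDate 210618.png'
fname="${a##*/}"        # 'price TSLA [call,putt] strikeDate 210618.png'
subDir="${a%/*}/"       # 'options/TSLA/'
case "$a" in            # a link with no '/' at all is fname-only
   */*) : ;;
   *)   subDir='' ; fname="$a" ;;
esac
+--+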
qnial> EACH (gage shape) allFnamesSortedByFname allSubDirsSortedByFname 6231 6231 qnial> post allFnamesSortedByFname@1132 allSubDirsSortedByFname@1132 +-------------------------------+ |20406 puts XIU 24Apr2020.png | +-------------------------------+ |economics, markets/options/XIU/| +-------------------------------+ Change : +.....+ % don't modify midIndxs with midIndxsLines_bads ; IF (OR (`# chr_in_str lineList@midIndx) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ; ELSE % remove %20 from links, now that mailtos are no longer considered ; IF ('%20' subStr_in_str lineList@midIndx) THEN lineList@midIndx := str_replace_subStr '%20' ' ' lineList@midIndx ; ENDIF ; % check for a valid fname-only, assumes only one instance of fname ; IF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick htmlSubDirsSortedBySubdir) str_remove_subStr d_webRawe) ; % check for a valid fname at end of a midIndx ; ELSEIF (NOT isfault (i_fname := first find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ; ELSEIF (`/ = (last lineList@midIndx)) THEN % check if a full subDir without fname ; IF (NOT isfault (i_subDir := first find_Howell lineList@midIndx allSubDirsSortedBySubdir)) THEN lineList@midIndx := link backtrack ((i_subDir pick allSubDirsSortedBySubdir) str_remove_subStr d_webRawe) ; % try to recover partial subDirEnd. Relies on early-stage sortup. ; % This is very inefficient coding! ; ELSEIF (OR (subDir_ins := lineList@midIndx EACHRIGHT subStr_in_str allSubDirsSortedBySubdir)) THEN subDirHits := subDir_ins sublist allSubDirsSortedBySubdir ; % How to choose a unique hit? If one is a full hit - subDir, take it, if not, error ; subDirENDs := (gage shape lineList@midIndx) EACHRIGHT takeright subDirHits ; subDirHits := (lineList@midIndx EACHRIGHT = subDirENDs) sublist subDirHits ; IF (= 1 (gage shape subDirHits)) THEN lineList@midIndx := link backtrack ((first subDirHits) str_remove_subStr d_webRawe) ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ENDIF ; +.....+ To : +.....+ % don't modify midIndxs with midIndxsLines_bads ; IF (OR (`# chr_in_str lineList@midIndx) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ; % ; % check [fname, dir] ; ELSE % ; IF flag_break THEN BREAK ; ENDIF ; % process link ; IF (`/ = (last lineList@midIndx)) THEN % check if subDir is valid ; subDir := check_subDir subDir ; % ; % fname is part of midIndex ; ELSE % extract [subDir, fname] ; indxLastSlash := (last find_Howell `/ lineList@midIndx) + 1 ; IF (isfault indxLastSlash) THEN % fname only ; fname := lineList@midIndx ; subDir := '' ; ELSE % fname-subDir combo ; fname := indxLastSlash dropright lineList@midIndx ; subDir := indxLastSlash drop lineList@midIndx ; ENDIF ; % check if subDir is valid ; subDir := check_subDir subDir ; % check for a valid fname-only, assumes only one instance of fname ; IF (isfault (find_Howell fname allFnamesSortedByFname)) THEN lineList@midIndx := link '!!linkError!!' 
lineList@midIndx ; ELSE lineList@midIndx := link backtrack subDir fname ; ENDIF ; ENDIF ; +.....+ Also, move following to pre-checks % remove %20 from links, now that mailtos are no longer considered ; IF ('%20' subStr_in_str lineList@midIndx) THEN lineList@midIndx := str_replace_subStr '%20' ' ' lineList@midIndx ; ENDIF ; +.....+ And create new optr check_subDir IS OP subDir { LOCAL error subSubDir subDirs ; NONLOCAL allSubDirsSortedBySubdir ; IF flag_break THEN BREAK ; ENDIF ; subDirs := (0 link ((findall_Howell `/ subDir) + 1)) EACHLEFT drop subDir ; error := l ; FOR subSubDir WITH subDirs DO IF (~= null subSubDir) THEN IF (OR (subSubDir EACHRIGHT subStr_in_str allSubDirsSortedBySubdir)) THEN error := o ; EXIT 'found' ; ENDIF ; ENDIF ; ENDFOR ; IF error THEN subSubDir := link '!!linkError!!' subDir ; ENDIF ; subSubDir } >> OK, tests are find +.....+ trash subDirHits := subDir_ins sublist allSubDirsSortedBySubdir ; % How to choose a unique hit? If one is a full hit - subDir, take it, if not, error ; subDirENDs := (gage shape subDir) EACHRIGHT takeright subDirHits ; subDirHits := (subDir EACHRIGHT = subDirENDs) sublist subDirHits ; IF (= 1 (gage shape subDirHits)) THEN subDir := link backtrack ((first subDirHits) str_remove_subStr d_webRawe) ; ELSE subDir := link '!!linkError!!' subDir ; ENDIF ; +.....+ # test qnial> webPage := link d_webRawe 'economics, markets/currency-crypto/Cryptos versus [currencies, 10 year [rates, bonds]].html' qnial> webPageRawe_update l webPage +-----+ My own [cheap, crappy] animation for the spiral currents moving quickly though slowly revolving stars (mpeg format) shows the general idea (click image to view the video, right-click to download) :

    +-----+ >> It STILL doesn't work!!?? >> NOTICE! BOTH > check a different link where that is not an issue index.html - its fine except '!!linkError!!' takes the place of '[#=; backtrack ;=#]' Why?? internalLinks_return_relativePath Change : +.....+ % process link ; IF (`/ = (last lineList@midIndx)) THEN % check if subDir is valid ; subDir := check_subDir subDir ; fname := '' ; % ; % fname is part of midIndex ; ELSE % extract [subDir, fname] ; indxLastSlash := (last find_Howell `/ lineList@midIndx) + 1 ; IF (isfault indxLastSlash) THEN % fname only ; fname := lineList@midIndx ; subDir := '' ; ELSE % fname-subDir combo ; fname := indxLastSlash dropright lineList@midIndx ; subDir := indxLastSlash drop lineList@midIndx ; ENDIF ; % check if subDir is valid ; subDir := check_subDir subDir ; % check for a valid fname-only, assumes only one instance of fname ; IF (isfault (find_Howell fname allFnamesSortedByFname)) THEN lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ELSE lineList@midIndx := link backtrack subDir fname ; ENDIF ; ENDIF ; +.....+ To : +.....+ % process link ; IF (`/ = (last lineList@midIndx)) THEN % check if subDir is valid ; subDir := check_subDir subDir ; fname := '' ; % ; % fname is part of midIndex ; ELSE % extract [subDir, fname] ; indxLastSlash := (last find_Howell `/ lineList@midIndx) + 1 ; IF (isfault indxLastSlash) THEN % fname only ; fname := lineList@midIndx ; subDir := '' ; ELSE % fname-subDir combo ; fname := indxLastSlash dropright lineList@midIndx ; subDir := indxLastSlash drop lineList@midIndx ; ENDIF ; % check if subDir is valid ; subDir := check_subDir subDir ; % check for a valid fname-only, assumes only one instance of fname ; IF (isfault (find_Howell fname allFnamesSortedByFname)) THEN fname := link '!!linkError!!' fname ; ENDIF ; ENDIF ; lineList@midIndx := link backtrack subDir fname ; +.....+ Change : +.....+ liner := line str_remove_subStr '../' ; liner := liner str_remove_subStr './' ; liner := liner str_remove_subStr '[#=; backtrack ;=#]' ; liner := liner str_remove_subStr '!!linkError!!' ; midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight liner ; % most lines do not have [ webPage := link d_webRawe 'Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Dark matter video 1 - initial, simple.mpeg' qnial> webPageRawe_update l webPage >> It took a long time (2-3 minutes?) and it didn't do a damned thing!?!? >> Maybe hard drive was warming up? (unlikely, mht even be SSD) qnial> webPage := link d_webRawe 'index.html' qnial> webPageRawe_update l webPage >> It took a long time (1 minute?) and it doubled up the !!linkError!! I didn't check, but I doubt that it corrected any [subDir, fname]s >> linkError is supposed to be removed!! POISON!! - it truncated links!! : webPage := link d_webRawe 'index.html' /media/bill/Dell2/Website - raw/index.html qnial> webPageRawe_update l webPage 2nd test - use output of first qnial> webPageRawe_update l webPage >> No further degradations olde code # Actually not needed? taken care of by internalLinks_return_relativePath writeDoStep (link 'str_replaceIn_pathList l ' chr_apo d_webRawe chr_apo ' ' chr_apo '!!linkError!!' 
chr_apo ' ' chr_apo '[#=; backtrack ;=#]' chr_apo ' ' 'htmlPathsSortedByPath' ) ; 08********08 #] 27May2021 Now back to the link problem with '/media/bill/Dell2/Website - raw/' Again, first suspects : webPageRawe_update, str_replaceIn_pathList, internalLinks_return_relativePath Check webSite_doAll results with : "d_webRawe""Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html" +--+ d_webRawe
  • +--+
After last webSite_doAll : 210527 13h54m25s backups webPageRawe_update
Check webSite_doAll results with : "d_webRawe""Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html"
+--+
My own [cheap, crappy] animation for the spiral currents moving quickly though slowly revolving stars (mpeg format) shows the general idea (click image to view the video, right-click to download) :

    +--+ >> still a problem Just run manual correction, see what happens. qnial> pathList := host_result (link 'find "$d_webRawe" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" ') qnial> gage shape pathList 343 qnial> str_replaceIn_pathList l d_webRawe '/media/bill/Dell2/Website - raw/' '' pathList +--+ My own [cheap, crappy] animation for the spiral currents moving quickly though slowly revolving stars (mpeg format) shows the general idea (click image to view the video, right-click to download) :

    +--+ OK, let's see if it reverts again. qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> WebSite_doAll >> Yikes!! d_webSite had unmounted again (why?) >> it's not reconnecting!! just like my Sea4Tb180904 USB drive >> do I have a virus or something? But works in USB2.0 port +--+ My own [cheap, crappy] animation for the spiral currents moving quickly though slowly revolving stars (mpeg format) shows the general idea (click image to view the video, right-click to download) :

    +--+
>> Problem remains. What is putting '/media/bill/Dell2/Website - raw/' back in?
'urls intern list.txt' now has '/media/bill/WebSite//media/bill/Dell2/Website - raw/' for pretty much everything!
Again, first suspects : webPageRawe_update, str_replaceIn_pathList, internalLinks_return_relativePath
but mostly internalLinks_return_relativePath
08********08
#] 27May2021 problem in time-naming backups : webPageRawe_update versus webAllRawOrSite_update
does str_replaceIn_pathList do this?
+--+
"/media/bill/Dell2/Website - raw/economics, markets/options/TSLA/data/210527 TSLA gnuPlotScript prices, dateStriker 210611.plt", line 85: warning: Skipping data file with no valid points
"/media/bill/Dell2/Website - raw/economics, markets/options/TSLA/data/210527 TSLA gnuPlotScript prices, dateStriker 210625.plt", line 85: warning: Cannot find or open file "/media/bill/Dell2/Website - raw/economics, markets/options/TSLA/data/210430 TSLA calls for 210625.dat"
"/media/bill/Dell2/Website - raw/economics, markets/options/TSLA/data/210527 TSLA gnuPlotScript prices, dateStriker 210625.plt", line 85: warning: Cannot find or open file "/media/bill/Dell2/Website - raw/economics, markets/options/TSLA/data/210504 TSLA calls for 210625.dat"
+--+
>> same errors for [call, put]s cart [price, volume]s - probably I deleted files with 0 size?...
Re-check in "$d_web[Rawe,Site]""Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html" :
+--+
d_webRawe : Howell-produced videos Howell-produced videos
In d_webRawe
  • d_webSite
  • +--+ same result : My own [cheap, crappy] animation for the spiral currents moving quickly though slowly revolving stars (mpeg format) shows the general idea (click image to view the video, right-click to download) :

    +--+ Again, why is '/media/bill/Dell2/Website - raw/' re-injected???!! First suspects : webPageRawe_update, str_replaceIn_pathList, internalLinks_return_relativePath internalLinks_return_relativePath : +--+ ELSEIF (`/ = (last lineList@midIndx)) THEN % check if a full subDir without fname ; IF (NOT isfault (i_subDir := first find_Howell lineList@midIndx allSubDirsSortedBySubdir)) THEN lineList@midIndx := link backtrack ((i_subDir pick allSubDirsSortedBySubdir) str_remove_subStr d_webRawe) ; % try to recover partial subDirEnd. Relies on early-stage sortup. ; % This is very inefficient coding! ; ELSEIF (OR (subDir_ins := lineList@midIndx EACHRIGHT subStr_in_str allSubDirsSortedBySubdir)) THEN subDirHits := subDir_ins sublist allSubDirsSortedBySubdir ; % How to choose a unique hit? If one is a full hit - subDir, take it, if not, error ; subDirENDs := (gage shape lineList@midIndx) EACHRIGHT takeright subDirHits ; subDirHits := (lineList@midIndx EACHRIGHT = subDirENDs) sublist subDirHits ; IF (= 1 (gage shape subDirHits)) THEN lineList@midIndx := link backtrack ((first subDirHits) str_remove_subStr d_webRawe) ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; +--+ >> this could be the problem IF d_webRawe appears in allSubDirsSortedBySubdir urls intern list.txt -> has all d_webRaws qnial> OR ('/media/bill/Dell2/Website - raw/' EACHRIGHT subStr_in_str allSubDirsSortedBySubdir) o oops - webSite_extract_pathsSubDirsFnames not loaded at start done by [webRawLinks_remove_pct20, webURLs_extract, webSite_doAll] activate at loaddefs qnial> OR ('/media/bill/Dell2/Website - raw/' EACHRIGHT subStr_in_str allSubDirsSortedBySubdir) o >> OK, visually checking allSubDirsSortedBySubdir confrms that d_webRawe is ont in list of links. Now what? internalLinks_return_relativePath, could be : +-- ELSEIF (OR (subDir_ins := lineList@midIndx EACHRIGHT subStr_in_str allSubDirsSortedBySubdir)) THEN subDirHits := subDir_ins sublist allSubDirsSortedBySubdir ; % How to choose a unique hit? If one is a full hit - subDir, take it, if not, error ; subDirENDs := (gage shape lineList@midIndx) EACHRIGHT takeright subDirHits ; subDirHits := (lineList@midIndx EACHRIGHT = subDirENDs) sublist subDirHits ; IF (= 1 (gage shape subDirHits)) THEN lineList@midIndx := link backtrack ((first subDirHits) str_remove_subStr d_webRawe) ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; ELSE lineList@midIndx := link '!!linkError!!' lineList@midIndx ; ENDIF ; +--+ >> but first, retry webSite_doAll and recheck "$d_web[Rawe,Site]""Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html" qnial> webSite_doAll again, tons of : +--+ 210527 12h53m44s webAllRawOrSite_update l "webPageRawe_update ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/Bill Howells videos/160901 Big Data, Deep Learning, and Safety/0_Big Data, Deep Learning, and Safety.html /media/bill/Dell2/Website - raw/z_Archive/210527 12h53m44s backups webPageRawe_update/ +--+ >> obviously didn't work, 210527 12h53m44s backups webPageRawe_update created (?) I need to stop execution if there areto of the above? -> too complex for now only have 210527 12h53m28s backups str_replaceIn_pathList no 210527 12h53m44s backups webPageRawe_update >> dir not created? 
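A quick shell check (manual, outside the QNial run) of which dated backup dirs actually exist under z_Archive :
$ ls -d "$d_webRawe"'z_Archive/'*' backups '* 2>/dev/null
>> only '... backups str_replaceIn_pathList' dirs show up, no '... backups webPageRawe_update' - consistent with the mkdir never running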
webAllRawOrSite_update : +--+ IF flag_backup THEN d_htmlBackup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' backups ' (string optr_rawOrSite) '/' ; IF (NOT path_exists d_htmlBackup) THEN host link 'mkdir "' d_htmlBackup '" ' ; ENDIF ; ENDIF ; +--+ IF (= "webPageRawe_update optr_rawOrSite) THEN webPageRawe_update flag_backup webPage ; ELSE webPageSite_update flag_backup webPage ; ENDIF ; ENDIF ; +--+ Does this use d_htmlbackup? : str_replaceIn_pathList +--+ str_replaceIn_pathList IS OP flag_backup d_backupRoot strOld strNew pathList { LOCAL d_backup pinn ; % backups are automatically done, except for testing purposes!! ; IF flag_backup THEN d_backup := link d_backupRoot 'z_Archive/' timestamp_YYMMDD_HMS ' backups str_replaceIn_pathList/' ; host link 'mkdir "' d_backup '" ' ; ELSE d_backup := d_backupRoot ; ENDIF ; +--+ >> Nope : (only used in ['webSite/webSite header.ndf', webSite_doAll] ) Why isn't d_htmlbackup created? qnial> d_htmlBackup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' backups ' (string optr_rawOrSite) '/' ?undefined identifier: ( STRING OPTR_RAWORSITE <***> ) '/' >> Ah hah! an oops again! writeDoStep (link 'str_replaceIn_pathList l ' chr_apo d_webRawe chr_apo ' ' chr_apo '!!linkError!!' chr_apo ' ' chr_apo '[#=; backtrack ;=#]' chr_apo ' ' 'htmlPathsSortedByPath' ) ; Change : +.....+ 'htmlPathsSortedByPath' +.....+ To : +.....+ htmlPathsSortedByPath +.....+ qnial> loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webSite_doAll d_htmlbackup STILL not ced! (why?) +--+ 210527 13h20m58s webURLs_extract ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/webWork files/urls errors list.txt /media/bill/Dell2/Website - raw/z_Archive/210527 13h20m15s backups webPageRawe_update/ +--+ >> looks like I forgot to save file. First do a test qnial> d_htmlBackup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' backups ' (string "webPageRawe_update) '/' /media/bill/Dell2/Website - raw/z_Archive/210527 13h38m31s backups webPageRawe_update/ qnial> IF (NOT path_exists d_htmlBackup) THEN host link 'mkdir "' d_htmlBackup '" ' ; ENDIF ; >> didn't work!? qnial> path_exists '/media/bill/Dell2/Website - raw/z_Archive/210527 13h38m31s backups webPageRawe_update/' ?op_parameter >> oops -> path_exists IS OP typer path qnial> path_exists '-d' '/media/bill/Dell2/Website - raw/z_Archive/210527 13h38m31s backups webPageRawe_update/' o >> OK now webAllRawOrSite_update Change : +.....+ IF (NOT path_exists d_htmlBackup) THEN host link 'mkdir "' d_htmlBackup '" ' ; ENDIF ; +.....+ To : +.....+ IF (NOT path_exists '-d' d_htmlBackup) THEN host link 'mkdir "' d_htmlBackup '" ' ; ENDIF ; +.....+ >> and I fixed other places in same file! Retry: qnial> d_htmlBackup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' backups ' (string "webPageRawe_update) '/' /media/bill/Dell2/Website - raw/z_Archive/210527 13h38m31s backups webPageRawe_update/ qnial> IF (NOT path_exists '-d' d_htmlBackup) THEN host link 'mkdir "' d_htmlBackup '" ' ; ENDIF ; created : 210527 13h52m13s backups webPageRawe_update >> OK, success. I put apos back (used as host command?) : 'htmlPathsSortedByPath' Rerun qnial> webSite_doAll >> OK!!! 
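At the shell level, the corrected check above amounts to roughly this (sketch only; mkdir -p would make the existence test unnecessary) :
+--+
d_htmlBackup="$d_webRawe"'z_Archive/'"$(date +'%y%m%d %Hh%Mm%Ss')"' backups webPageRawe_update/'
if [ ! -d "$d_htmlBackup" ] ; then mkdir "$d_htmlBackup" ; fi
+--+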
none of the errors like : ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/Dell2/Website - raw/webWork files/urls errors list.txt /media/bill/Dell2/Website - raw/z_Archive/210527 13h20m15s backups webPageRawe_update/ Now back to the link problem with '/media/bill/Dell2/Website - raw/' Again, first suspects : webPageRawe_update, str_replaceIn_pathList, internalLinks_return_relativePath Check results with : +--+ d_webRawe
  • d_webSite
  • +--+
08********08
#] 26May2021 current priorities :
replace missing files (eg Wickson webPage!)
find linkError problem
+-----+
replace missing files (eg Wickson webPage!)
compare earlier backup with current - lists of all website files
bash "$d_bin""find_diff.sh" - shows path-diff in two directory listings
find_diff "$d_webRawe" "$d_webSite"
example line : find "$1" -name "*" | grep --invert-match "References\|z_Old\|z_old\|z_Archive\|.Trash-1000" | sed "s#\($1\)\(.*\)#\2#" | sort >"$d_temp""find_diff1.txt"
+--+
628d627
< bin/find_diff.sh
9131,9136c9130
< webWork files/210525 13h02m rsync webRawe_to_webSite log.txt
< webWork files/210525 14h31m rsync webRawe_to_webSite log.txt
< webWork files/210525 15h39m rsync webRawe_to_webSite log.txt
< webWork files/210525 17h03m rsync webRawe_to_webSite log.txt
< webWork files/210525 19h16m rsync webRawe_to_webSite log.txt
< webWork files/210525 20h01m rsync webRawe_to_webSite log.txt
---
> webWork files/0_website notes.txt
9166d9159
< webWork files/urls extern list.txt
9170d9162
< webWork files/urls intern list.txt
+--+
>> I'm OK, why did I think that I am missing files? maybe d_webSite wasn't mounted?
rerun webSite_doAll
Failures :
+---+------------+
|33 |errors list |
+---+------------+
|22 |extern fails|
+---+------------+
|0 |howell list |
+---+------------+
|439|intern fails|
+---+------------+
Unknowns - I haven't written code to really show [OK, fail] :
+---+-----------+
|75 |mailto list|
+---+-----------+
|344|pgPosn list|
+---+-----------+
OKs - these links have been shown to work :
+---+---------+
|239|extern OK|
+---+---------+
|65 |intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+----+-------------+
| 494|failed links |
+----+-------------+
| 419|unknown links|
+----+-------------+
| 304|OK links |
+----+-------------+
|1217|total |
+----+-------------+
>> almost ALL internal links fail!!!! !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! but hard to see?
still a problem with Wickson? I don't understand. maybe QNial dirs messed up? seems that some files were moved?!?
rerun bash "$d_bin""find_diff.sh" - shows path-diff in two directory listings
>> same result - d_webRawe only one file in Wickson, d_webSite has 4 !!??
/media/bill/Dell2/Website - raw/Wickson website
>> oops, mixed up [Wickson, Yaskell]
>> Wickson menu link is OK
file:///media/bill/WebSite/Steven%20H%20Yaskell/0_Steven%20H%20Yaskell.html
>> But it is onLine!??
Check backup drive
200824 SWAPPER monthly_backup has it! (most recent backup on hand)
>> I pasted it back
Need to check other directories
+-----+
set find_diff.sh to check d_webRawe against "/media/bill/Seagate4Tb180804/200824 SWAPPER monthly_backup/Website - raw/"
>> 764 file differences (yiyi!) what a mess! I have to go through this!!
>> hopeless given directory changes, just take care of d_web[Rawe, Site]
No ONLINE permission to access some directory listings?
Go through webPages [menu, content, footer] :
main OK
Neural nets root - no, from at least one subdir : file:///media/bill/WebSite//media/bill/Dell2/Website%20-%20raw/Neural%20nets/Neural%20Networks.html
subdirs OK
Projects root seems fine
file:///media/bill/WebSite/Pandemics,%20health,%20and%20the%20Sun/_Pandemics,%20health,%20and%20the%20sun.html#Robert%20Prechter%20-%20Socionomics,%20the%20first%20quantitative%20sociology?
>> get rid of page bookmark!
	file:///media/bill/WebSite/economics,%20markets/S&P%20500%20Shiller-forward%20PE%20versus%2010y%20Treasury%20bond%20rates.html
	>> doesn't work
	file:///media/bill/WebSite/Civilisations%20and%20sun/_Civilisations%20and%20the%20sun.html
	>> doesn't work (maybe after I've copied files over?)
	file:///media/bill/WebSite/Steven%20H%20Yaskell/0_Steven%20H%20Yaskell.html
	file:///media/bill/WebSite/Solar%20modeling%20and%20forecasting/_Solar%20modeling%20&%20forecasting.html
Software     OK
Profession   OK
Publication  OK
videos       NOT STEM for kids, Big Data.., Icebreaker, etc
             yes root directory, from root directory online : only Birkeland rotation,
blogs        OK, but should maybe show directories?
hosted       some show projects menu! (confusing)
             no - Yaskell as noted previously
	file:///media/bill/WebSite/Steven%20H%20Yaskell/0_Steven%20H%20Yaskell.html
Neil
	file:///media/bill/WebSite//media/bill/Dell2/Website%20-%20raw/Neil%20Howell/_Neil%20Howell.html
	from :

Most failed internal links have : /media/bill/WebSite//media/bill/Dell2/Website - raw/
I thought I nuked this yesterday - try again
Take a look at a couple to see what needs to be done (why is this happening again?)
d_webRawe :
	Howell-produced videos
	Howell-produced videos
d_webRawe
d_webSite
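A bash sketch of the cleanup attempted with str_replaceIn_pathList in the next commands : strip the stray absolute prefix from d_webRawe html files, with a dated backup copy of each file first. This is illustrative only (d_backup naming is hypothetical); the backup copies are flat, so same-named files in different subDirs would collide - the QNial route keeps per-path backups.
+.....+
#!/bin/bash
# strip the stray absolute prefix from links inside d_webRawe html files,
# after copying each file to a dated backup dir
d_backup="$d_webRawe""z_Archive/$(date '+%y%m%d %Hh%Mm%Ss') backups prefix-strip/"
mkdir -p "$d_backup"
find "$d_webRawe" -type f -name "*.html" \
 | grep --invert-match "z_Old\|z_Archive" \
 | while IFS= read -r fpath ; do
	cp --preserve=timestamps "$fpath" "$d_backup"
	sed --in-place "s#/media/bill/Dell2/Website - raw/##g" "$fpath"
done
+.....+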
Maybe the search-replace didn't work? Maybe I just did the double-d_webRawe?
qnial> pathList := host_result (link 'find "$d_webRawe" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" ')
qnial> str_replaceIn_pathList l d_webRawe '/media/bill/Dell2/Website - raw/' '' pathList
+-----+
Redo webSite_doAll
sh: 1: cannot create /media/bill/Dell2/Website - raw/z_Archive/210526 backups/0_webPageSite_update log.txt: Directory nonexistent
d_webSite - full path was stuck in again!
	/media/bill/Dell2/Website - raw/Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html
>> This must be a programming change?
210526 14h51m09s str_replaceIn_pathList l '/media/bill/Dell2/Website - raw/' '!!linkError!!' '[#=; backtrack ;=#]' htmlPathsSortedByPath
210526 14h51m22s webAllRawOrSite_update l "webPageRawe_update
210526 14h51m59s webAllRawOrSite_update l "webPageSite_update
sh: 1: cannot create /media/bill/Dell2/Website - raw/z_Archive/210526 backups/0_webPageSite_update log.txt: Directory nonexistent
... many others ...
after "executing webSite_extract_pathsSubDirsFnames" :
?noexpr
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] :
	/media/bill/Dell2/Website - raw/webWork files/webURLs_extract allLinks.txt
	/media/bill/Dell2/Website - raw/z_Archive/210526 backups/
Bad backups by 'webAllRawOrSite_update l "webPageSite_update' :
+--+
#] loaddefs link d_Qndfs 'webSite/webSite header.ndf'
	d_htmlBackup := link d_webRawe (link 'z_Archive/' timestamp_YYMMDD ' backups/') ;
+-+
no definition of d_htmlBackup :
	webPageRawe_update IS OP flag_backup webPage
	webPageSite_update IS OP flag_backup webPageRawe
+--+
webAllRawOrSite_update IS OP flag_backup optr_rawOrSite
{ LOCAL flog p_log webPage ;
  NONLOCAL d_htmlBackup d_webRawe d_webSite htmlPathsSortedByPath ;
  % ;
  % ONLY for webPageRawe_update - create a new backup directory for every use of webSite_convert, as damage can be VERY time-costly ;
  IF (= "webPageRawe_update optr_rawOrSite) THEN
	d_htmlBackup := link d_webRawe 'z_Archive/' timestamp_YYMMDD_HMS ' backups ' (string optr_rawOrSite) '/' ;
  ...
  THEN d_htmlBackup := link d_webRawe (link 'z_Archive/' timestamp_YYMMDD ' backups/') ;
>> NUTS! conflicting definitions
+--+
>> These definitions of d_htmlBackup are different
   Makes sense, as individual file updates don't require the 210526 backups
   webAllRawOrSite_update redefines to (example) :
	210526 14h51m09s backups str_replaceIn_pathList
	210526 14h51m22s backups webPageRawe_update
Change :
+.....+
	IF (= "webPageRawe_update optr_rawOrSite)
+.....+
To :
+.....+
	IF (OR ("webPageRawe_update "webPageSite_update EACHLEFT = optr_rawOrSite))
+.....+
+-----+
Is the code putting '/media/bill/Dell2/Website - raw/' back into links?  webPageRawe_update?
I added to internalLinks_return_relativePath :
	% 26May2021 I also must remove d_web[Rawe, Site] ;
	IF ('/media/bill/Dell2/Website - raw/' subStr_in_str (str_toLowerCase lineList@midIndx)) THEN
		lineList@midIndx := str_replace_subStr '/media/bill/Dell2/Website - raw/' '' lineList@midIndx ;
	ENDIF ;
+-----+
Redo webSite_doAll
Two types of failures :
+--+
210526 20h06m, webSite check for [z_Archive, z_Old]
	find "/media/bill/WebSite/" -type d -name "z_Archive|z_Old"
210526 20h06m08s webSite_extract_pathsSubDirsFnames
	executing webSite_extract_pathsSubDirsFnames
	This is done only on initial loaddefs, or when manually invoked after changes
210526 20h06m10s str_replaceIn_pathList l '/media/bill/Dell2/Website - raw/' '!!linkError!!'
'[#=; backtrack ;=#]' htmlPathsSortedByPath
210526 20h06m25s webAllRawOrSite_update l "webPageRawe_update
sh: 1: cannot create /media/bill/Dell2/Website - raw/z_Archive/210526 20h06m25s backups webPageRawe_update/0_webPage_update log.txt: Directory nonexistent
sh: 1: cannot create /media/bill/Dell2/Website - raw/z_Archive/210526 20h06m25s backups webPageRawe_update/0_webPage_update log.txt: Directory nonexistent
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] :
	/media/bill/Dell2/Website - raw/webWork files/confFoot.html
	/media/bill/Dell2/Website - raw/z_Archive/210526 20h06m25s backups webPageRawe_update/
+--+
210526 20h07m05s webURLs_extract
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] :
	/media/bill/Dell2/Website - raw/webWork files/urls errors list.txt
	/media/bill/Dell2/Website - raw/z_Archive/210526 20h06m25s backups webPageRawe_update/
+--+
>> ?? problem in time-naming backups : webPageRawe_update versus webAllRawOrSite_update

08********08
#] 25May2021 yet another attempt to update website
bad menu links :
	file:///media/bill/WebSite/economics,%20markets/S&P%20500%20Shiller-forward%20PE%20versus%2010y%20Treasury%20bond%20rates.html
	file:///media/bill/WebSite//media/bill/Dell2/Website%20-%20raw/Civilisations%20and%20sun/Howell%20-%20radioisotopes%20and%20history.jpg
	file:///media/bill/WebSite/Steven%20H%20Yaskell/0_Steven%20H%20Yaskell.html
	file:///media/bill/WebSite/Solar%20modeling%20and%20forecasting/_Solar%20modeling%20&%20forecasting.html
	file:///media/bill/WebSite//media/bill/Dell2/Website%20-%20raw/Paul%20L%20Vaughan/Vaughan%20120324%20Solar-Terrestrial%20Resonance,%20Climate%20Shifts,%20&%20the%20Chandler%20Wobble%20Phase%20Reversal.pdf
	file:///media/bill/WebSite/Climate%20and%20sun/!!linkError!!Paul%20L%20Vaughan/Vaughan%20120324%20The%20Solar%20Cycle's%20Footprint%20on%20Terrestrial%20Climate.PDF
need to replace : /media/bill/WebSite//media/bill/Dell2/Website%20-%20raw/
also - find has to avoid z_[Archive, Old]
qnial> pathList := host_result (link 'find "$d_webRawe" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" ')
qnial> str_replaceIn_pathList l d_webRawe '/media/bill/WebSite//media/bill/Dell2/Website - raw/' '' pathList
Actually I have to modify webSite_extract_pathsSubDirsFnames?
> nope, it already does it
+-----+
still a problem !! :
	file:///media/bill/WebSite/Solar%20modeling%20and%20forecasting/_Solar%20modeling%20&%20forecasting.html
I have to find and delete d_webRawe files

08********08
#] 25May2021 problems with menu changes with uploads
Menu Software programming.html is not being updated in d_webSite
>> do I have to copy it over manually?
>> are d_webSite menus even used?
+-----+
must remove "/media/bill/Dell2/Website - raw/" from all links in html files!!
+-+
BACKUPS AREN'T WORKING!!!!
	/media/bill/Dell2/Website - raw/z_Archive/210525 backups/
should be
	/media/bill/Dell2/Website - raw/webWork files/z_Archive/210525 backups/
also
	/media/bill/Dell2/Website - raw/z_Archive/210525 17h03m58s backups webPageRawe_update/0_webPage_update log.txt
should be
	/media/bill/Dell2/Website - raw/z_Archive/210525 17h03m58s backups webPageRawe_update/0_webPage_update log.txt
NYET - don't load up d_webWork!!! revert back to original, and put z_Archive in d_webRawe
+--+
fix - use as example :
"FILE" | sed "s/!!linkError!!/[#=; backtrack ;=#]/g" $ find "$d_webRawe" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" | sed "s#/media/bill/Dell2/Website - raw/##g" test : $ find "$d_webRawe""Paul L Vaughan/" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" | tr \\n \\0 | xargs -0 -IFILE sed "s#/media/bill/Dell2/Website - raw/##g" "FILE" >"$d_temp""html fixes/"FILE".txt" use QNial fileops.ndf str_replaceIn_path IS OP flag_backup d_backup strOld strNew path OR str_replaceIn_pathList IS OP flag_backup d_backupRoot strOld strNew pathList qnial> pathList := host_result (link 'find "$d_webRawe""Paul L Vaughan/" -type f -name "*.html"') qnial> str_replaceIn_pathList l (link d_webRawe 'z_Archive/210525 backups/') '/media/bill/Dell2/Website - raw/' '' pathList >> oops >> 25May2021 needed to fix str_replace_subStr - was igoring null replacement, and returning original str qnial> str_replaceIn_pathList l d_webRawe '/media/bill/Dell2/Website - raw/' '' pathList +--+ test a directory with several html files : qnial> pathList := host_result (link 'find "$d_webRawe""Lies, Damned Lies, and Scientists/" -type f -name "*.html"') qnial> str_replaceIn_pathList l d_webRawe '/media/bill/Dell2/Website - raw/' '' pathList +--+ OK - now for all of d_webRawe qnial> pathList := host_result (link 'find "$d_webRawe" -type f -name "*.html"') qnial> str_replaceIn_pathList l d_webRawe '/media/bill/Dell2/Website - raw/' '' pathList ********************** /media/bill/Dell2/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt webSite stats for : www.BillHowell.ca : 210525 17h06m17s Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] : +----+-----------------------------+ |8788|count of all links in webSite| +----+-----------------------------+ 1217 = count of all [file, dir, url]s targeted by links on the webSite Counts below are the number of unique TARGETED [file, dir]s of links (eg 5+ links per target on average) Failures : +---+------------+ |33 |errors list | +---+------------+ |22 |extern fails| +---+------------+ |0 |howell list | +---+------------+ |439|intern fails| +---+------------+ Unknowns - I havent written code to really show [OK, fail] : +---+-----------+ |75 |mailto list| +---+-----------+ |344|pgPosn list| +---+-----------+ OKs - these links have been shown to work : +---+---------+ |239|extern OK| +---+---------+ |65 |intern OK| +---+---------+ [fail, unknown, OK, total] counts : +----+-------------+ | 494|failed links | +----+-------------+ | 419|unknown links| +----+-------------+ | 304|OK links | +----+-------------+ |1217|total | +----+-------------+ 08********08 #] 25May2021 fix Software Programming link, update +----+ qnial> loaddefs link d_Qndfs 'webSite/webSite header.ndf' >>> loading start : webSite header.ndf +-----+ Global variables mkdir: cannot create directory ‘/media/bill/Dell2/Website - raw/z_Archive/210525 backups/’: No such file or directory +-----+ Find all d_webRawe html files related to webSite [URLs, convert, update] loading countPathsDirs loading webSite_extract_pathsSubDirsFnames executing webSite_extract_pathsSubDirsFnames This is done only on initial loaddefs, or when manually invoked after changes +---------------------+----+----+ |pathDir[pass,sortBy] |all |html| +---------------------+----+----+ |PathsSortedByFname |6171|6171| +---------------------+----+----+ |SubDirsSortedByFname |6171|7078| +---------------------+----+----+ |FnamesSortedByFname | 381| 203| 
+---------------------+----+----+
|PathsSortedByPath    | 203| 203|
+---------------------+----+----+
|SubDirsSortedBySubdir| 219| 57 |
+---------------------+----+----+
?noexpr
loading webSite_readpathsSubDirsFnames
executing webSite_readpathsSubDirsFnames
...
>> table looks wrong?!?!

08********08
#] 17Dec2020 problems with menu changes with uploads - added crypto page
http://www.BillHowell.ca/economics, markets/currency-crypto/Cryptos versus [currencies, 10 year [rates, bonds]].html
http://www.BillHowell.ca/economics,%20markets/currency-crypto/Cryptos%20versus%20[currencies,%2010%20year%20[rates,%20bonds]].html
http://www.BillHowell.ca/Software%20programming%20&%20code/bin/encrypt%20-%20keys%20setup.sh
http://www.BillHowell.ca/Software%20programming%20&%20code/bin/encrypt-close.sh
http://www.BillHowell.ca/Software%20programming%20&%20code/bin/encrypt-open.sh

08********08
#] 14Dec2020 fileZilla update webPages & check previous problems
ALL Directory filters lost!!! Bullshit arrangement!
	NYET - it's OK, must highlight filter first
[Author, Pub]guide menus OK
Still a problem with :
	Neural Nets :
		MindCode earlier work
		Paper reviews
	Software :
		Linux bash scripts - still shows Wickson
		rest - all OK
Projects - all OK
Resume - all OK
Publications - OK
Videos - all OK
Blogs - all OK
didn't check rest

08********08
#] 14Dec2020 rerun to check link status
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webSite_doAll
A bit better, but not as much as I hoped. Current results :
Failures :
+--+------------+
|24|errors list |
+--+------------+
[fail, unknown, OK, total] counts :
+----+-------------+
|  63|failed links |
+----+-------------+
| 373|unknown links|
+----+-------------+
| 671|OK links     |
+----+-------------+
|1107|total        |
+----+-------------+
Yesterday's :
Failures :
+--+------------+
|43|errors list |
+--+------------+
[fail, unknown, OK, total] counts :
+----+-------------+
|  82|failed links |
+----+-------------+
| 373|unknown links|
+----+-------------+
| 657|OK links     |
+----+-------------+
|1112|total        |
+----+-------------+
Counts :
Today :
+----+-----------------------------+
|4593|count of all links in webSite|
+----+-----------------------------+
1107 = count of all [file, dir, url]s targeted by links on the webSite
Yesterday :
	4598 = count of all links in webSite
	1112 = count of all [file, dir, url]s targeted by links in the webSite
Many problems seem to have come back, and don't seem correct!!?!! :
Old problems :
!!linkError!!
!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv !!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/ !!linkError!!Climate and sun/Glaciation model 005 !!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language !!linkError!!Neural nets/Conference guides/Author guide website/N-19557 wrong paper [size, margin]s.pdf !!linkError!!Neural nets/Conference guides/Publicity website/INNS mass email instructions.odt !!linkError!!Pandemics, health, and the Sun/Howell - corona virus 2020.html !!linkError!!Personal/130728 Car collision with a deer.html !!linkError!!Software programming & code/bin/SSH/ !!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf !!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf !!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf !!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html !!linkError!!Software programming & code/System_maintenance/ !!linkError!!Table of Contents !!linkError!!webAnalytics/ !!linkError!!webWork files/confMenu_authors.html apo in fnames : !!linkError!!Allegre's second thoughts.pdf !!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv !!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF These links should be OK!?? Are they coming from 'code develop_test' or something? : !!linkError!!Charvatova solar inertial motion & activity/Verification/ !!linkError!!Cool emails/ !!linkError!!LibreOffice/ This is VERY frustrating!!!! I am missing something important 'Qnial.html' Why did these revert? I just changed them, but my program corrupted them!!! Maybe too deep? (5 levels -maxdepth 4) $ find "/media/bill/SWAPPER/Website - raw/" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:20:
  • $ find "/media/bill/SWAPPER/Website - raw/" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:46:
	Array Theory and the Design of Nial.pdf

08********08
#] 14Dec2020 recurring problems that don't get fixed - why?
1. menus are wrong
2. links to non-existent files, links that don't stay fixed
+-----+
First - list multiple occurrences of fnames
setup.ndf - add optr :
aryList_extractMulplicate_subArys IS OP selectOp aryList
{ LOCAL indices iPairs matches subAryList indxHits ;
	indices := tell (gage shape aryList) ;
	iPairs := (front indices) EACHBOTH pair (rest indices) ;
	subAryList := selectOp EACHRIGHT apply aryList ;
	matches := (front subAryList) EACHBOTH = (rest subAryList) ;
	indxHits := cull link (matches sublist iPairs) ;
	indxHits EACHLEFT pick aryList
}
webSite_extract_pathsSubDirsFnames
Change :
+.....+
	host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort -u >"' p_allFileList '" ' ;
+.....+
To :
+.....+
	p_nonuniqueFileList := link d_temp 'webSite_extract_pathsSubDirsFnames nonuniqueFileList.txt' ;
	host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort >"' p_nonuniqueFileList '" ' ;
	allMulplicateFnames := aryList_extractMulplicate_subArys "path_extract_fname (strList_readFrom_path p_nonuniqueFileList) ;
	host link 'sort -u "' p_nonuniqueFileList '" >"' p_allFileList '" ' ;
+.....+
qnial> webSite_extract_pathsSubDirsFnames
qnial> EACH write allMulplicateFnames
	/media/bill/SWAPPER/Website - raw/Lucas/math Howell/cos - 1 noo, iterative, non-feedback/d-dt Rpcs^-5_t__cos -1.txt
	/media/bill/SWAPPER/Website - raw/Lucas/math Howell/cos - 1 yes, iterative, non-feedback/d-dt Rpcs^-5_t__cos -1.txt
	/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0001iw/SCAN0000.rtf
	/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0002iw/SCAN0000.rtf
	/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0003iw/SCAN0000.rtf
	/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0004iw/SCAN0000.rtf
	/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0005iw/SCAN0000.rtf
>> Too good to be true - hardly ANY duplicate fnames in the whole webSite? almost impossible
>> In any case, duplicate fnames, on this basis, aren't the problem.
>> none of the duplicates above are important to the links.
>> I don't trust this
+-----+
1. menus are wrong, re-check & make list :
PROJECTS
	Climate - Kyoto Premise fraud
	Neural Nets - OK
	Software prog - OK
	Pf&Resume - OK
	Pub&Report - OK
	Videos - OK
	Blogs - OK
	Howell blog - OK
	Cool stuff - OK
	Cool images - OK
	SuspObs comments - OK
	Crazy themes - OK
	Hosted - OK
I only see a problem with "Climate - Kyoto Premise fraud"
	file:///media/bill/HOWELL_BASE/Website/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html
	>> extra space?
'Menu projects.html'
>> I just can't see a problem with the link!!??!!!
leave it for now
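Each '!!linkError!!' target below is checked with the same long find | xargs | grep pipeline. A small bash helper would shorten those checks; linkErr_grep is a hypothetical name (not an existing script) and it assumes d_webRawe is set in the shell. grep --fixed-strings treats the target as a literal string, which sidesteps the [apostrophe, bracket] escaping headaches noted in this log.
+.....+
#!/bin/bash
# recurring check : which html files (outside z_Old, z_Archive, code develop_test)
# still carry a given '!!linkError!!' string?
linkErr_grep() {
	find "$d_webRawe" -type f -name "*.html" | tr \\n \\0 \
	 | xargs -0 grep --with-filename --line-number --fixed-strings "$1" \
	 | grep --invert-match "z_Old\|z_Archive\|code develop_test"
}
linkErr_grep '!!linkError!!Cool emails/'
+.....+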
+-----+
2. links to non-existent files, links that don't stay fixed
work again with 'urls errors list.txt'
	p_allLinks := link d_webWork 'webURLs_extract allLinks.txt' ;
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:20:
  • Background material for the scenes This shows a directory of Scenes, each listing [scripts with additional references (albeit vastly incomplete), images used but NOT videos due to copyrights].
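The same check-then-fix cycle repeats for each target below : grep for the bad HREF, then swap '!!linkError!!' for the '[#=; backtrack ;=#]' embed in the files that still carry it. A minimal bash sketch of one cycle follows; the bad/good strings are illustrative (the 'Cool emails/' target is taken from a later check in this entry), and for targets containing [&, #, bracket, apostrophe] characters the QNial str_replaceIn_pathList route is safer than sed.
+.....+
#!/bin/bash
# find html files still carrying one specific bad HREF, then fix it in place
bad='!!linkError!!Cool emails/'
good='[#=; backtrack ;=#]Cool emails/'
find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 \
 | xargs -0 grep --files-with-matches --fixed-strings "$bad" \
 | grep --invert-match "z_Old\|z_Archive\|code develop_test" \
 | while IFS= read -r fpath ; do
	sed --in-place "s|${bad}|${good}|g" "$fpath"
done
+.....+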
  • $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Charvatova solar inertial motion & activity/Verification/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html:93:Supporting documents, spreadsheets etc
    Changed to : Supporting documents, spreadsheets etc $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html:419:
  • >> Changed to : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Cool emails/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/index.html:101:
  • Cool emails >> Changed to also : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!International Neural Network Society.JPG' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/webWork files/fin organisations.html:13: >> changed to : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!LibreCalc bank account macro system.txt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/page Howell - blog.html:77:
  • $ find "$d_PROJECTS" -maxdepth 4 -type f -name 'LibreCalc bank account macro system.txt' /media/bill/PROJECTS/Investments/LibreCalc bank account macro system.txt >> I moved this to : Software programming & code/LibreOffice macros/ >> changed to : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Menu.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Software for the Guides.html:68:Authors' Guide Menu
    >> changed to : >> also : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Publications website/IEEE CrossCheck, chair.html:179:
  • The Publications Chair responds to author inquiries about their CrossCheck rejection (see the Publications Chair explanation of CrossCheck results and analysis), including a generic comment, the CrossCheck print-out pdf, CrossCheck analysis comments, and offers to respond to any questions that they may have. >> changed to : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Publicity website/INNS mass email instructions.odt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/IEEE ListServe publicity subscriptions.html:140:
	INNS mass email instructions.odt
>> changed to :
>> also : (Oops! I can't find this document - may have been superseded by the IEEE ListServe approach) :
	INNS mass emailers - easy [setup, approach] for mass emails.odt
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. USA ' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Sponsors website/Instructions.html:34:
>> changed to :
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Social media/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/page projects.html:61:
>> change to :
>> also :
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Software programming & code/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> Wow! lots of problems. Lines with other errors :
	!!linkError!!Software programming & code/bin/SSH/
	!!linkError!!Software programming & code/Qnial/MY_NDFS/???
	!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
	!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf
	!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
	!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
	!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
	!!linkError!!Software programming & code/System_maintenance/
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Attendee downloads - summary.html:239:
  • Click to see a directory of sftp-related bash scripts that were used for the analysis of the sftp site.
    /media/bill/SWAPPER/Website - raw/page Software programming.html:59:
>> change to :
For the following errors, I mostly replaced '!!linkError!!' with '[#=; backtrack ;=#]', some comments as file not found :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:44:
  • V7 QNial Dictionary.html, /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:45:
  • Design of QNial V7.pdf /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:46:
  • Array Theory and the Design of Nial.pdf /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:123:Besides augmenting [strings.ndf, fileops.ndf, other] files (describved in a section below) - I've added many operators that have come in handy for my [work, projects]. Here is a [random, scattered, incomplete] selection of my major QNial-based projects : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:142:
  • fileops.ndf - /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:145:
  • Linux computer startup - [open, configure] cart [windows, workspaces], and start applications, which is now bash-only based. This doesn't sound like much, but is a huge time-saver for me, especially as I occasionally shut down my system, and regularly [clear, start] different workspaces. /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:146:
  • Linux drive backups - This has evolved through several approaches over the years, and is currently Linux bash based, rather than still using QNial. Yeah, I know, many good backup programs are free (eg basic system with Linux), so why do I still waste my time on this? Good question now that huge drive capacity is cheap. But I still want the simple [flexibility, adaptability] of my system. /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:148:
  • [Check, correct] website links - (All-new as of Oct2020.) All [internal, external (other peoples' webSites)] relevant links are checked to allow easy identification of problems and their correction. /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:167:A near-full list of my bash scripts is also available (see also my Linux [command, script] web-page).
    $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Table of Contents' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html:15:

    Table of Contents

    >> change to (ignore link-back - maybe later) :

    Table of Contents

    $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!/webAnalytics' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:93:
  • Google Analytics - This provides a directory listing of snapshots of the Guides' web-page usage (the file prefixes denote YYMMDD - [year, month, day]). Clearly, only a fraction (~200) of the IJCNN2019 submitting co-authors used the Authors' Guide. My guess is that they were mostly students. As the Authors' Guide was NOT updated for WCCI2020 and the mass emails didn't announce it's availability (other than broken links in the last mass email), there has been no usage of the Authors' Guide for WCCI2020 as of 18Jan2020. >> change to : whatever ... $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> crap, lots of errors (12), many as I was too [lazy, rushed] to put them in to start with : +-----+ /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/IEEE CrossCheck.html:212:What is CrossCheck? From an Elsevier webpage :
    >> changed to : I simply removed the empty link /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Paper formatting blog.html:725:As an example, I have attached paper 19557 with red-background margins and the standard [header, footer]s. Note that the paper [size, margin, font]s are all wrong. This is hard to see just by looking at it.
    >> changed to : /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Paper reviews.html:21:
  • Great paper, wrong conference - Even great papers are be filtered out in the screening process if they do not fit the conference themes and topics, albeit [WCCI,IEEE-CEC, IJCNN, IEEE-FUZZ] has been tolerant of innovative papers having some relevance, and that may differ from mainstream thinking. In such cases, the Chairs will often suggest alternate conference better suited to the paper. This can happen with papers from other areas of Computational Intelligence (such as Evolutionary Computation, Fuzzy Systems, and ???), but such papers can be accepted if they have a neural network component. >> changed to : I simply removed the empty links /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:93:
  • Google Analytics - This provides a directory listing of snapshots of the Guides' web-page usage (the file prefixes denote YYMMDD - [year, month, day]). Clearly, only a fraction (~200) of the IJCNN2019 submitting co-authors used the Authors' Guide. My guess is that they were mostly students. As the Authors' Guide was NOT updated for WCCI2020 and the mass emails didn't announce it's availability (other than broken links in the last mass email), there has been no usage of the Authors' Guide for WCCI2020 as of 18Jan2020. >> changed to (I thought that I had already fixed this!): /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author [PDF, CrossCheck]-like tests.html:45:
  • a bash script to carry out the operations. You will have to adapt [directories, files] to your own system. >> changed to : /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author [PDF, CrossCheck]-like tests.html:46:
  • a LaTeX template for the overlay >> changed to : /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Publications website/PubChair guide.html:140:The IEEE-MCE webpage Quick links to Required Forms is a good reference to keep in mind.
>> changed to (I deleted the link as lost - let the users search) :
	The IEEE-MCE webpage "Quick links to Required Forms" (oops - I've lost the link)
/media/bill/SWAPPER/Website - raw/page Software programming.html:59:
    >> changed to : /media/bill/SWAPPER/Website - raw/page blogs.html:17:
  • >> changed to : /media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/General Relativity is a turkey.html:96:As this issue is already covered in my webPage "General Relativity is a turkey?", please refer to it.
    >> changed to : I deleted the whole paragraph, as it was for the Quantum mechanics webPage. +-----+ These are already flagged for apos : /media/bill/SWAPPER/Website - raw/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html:91: /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:28:
  • nothing showing from find (null stdout) - some because of 'code develop_test' exclusion?, others already fixed? : !!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv !!linkError!!Climate and sun/Glaciation model 005 !!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language !!linkError!!LibreOffice/ !!linkError!!N-19557 wrong paper [size, margin]s.pdf !!linkError!!Pandemics, health, and the Sun/Howell - corona virus 2020.html !!linkError!!Personal/130728 Car collision with a deer.html !!linkError!!Publicity website/INNS mass emailers - easy [setup, approach] for mass emails.odt !!linkError!!Puetz greatest of cycles/ !!linkError!!Qnial !!linkError!!/bash scripts/pdf edits/pdf insert [ISBN, copyright] by author, single paper.sh apo (apostrophe) problems : !!linkError!!Allegre's second thoughts.pdf !!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv !!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html !!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc !!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF !!linkError!!Software programming & code/bin/bin - Howell's web-page.html 08********08 #] 13Dec2020 fix menu errors of normalSite (not confGuideSite) Add projects menu menu to all 'hosted sites' >> manually done Move [Prechter, Puetz] menu items in projects menu >> done Add 'normalStatus.html' to all normalPages >> created optr normalGuide_header qnial> webSite_extract_pathsSubDirsFnames qnial> EACH normalGuide_header htmlNormalPages >> seems OK +-----+ corrupted - scrapped menus? NYE - should work? S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html The presence of `& may have caused the problem - this will be a problem elsewhere for <TITLE>? qnial> EACH write (('&' EACHRIGHT substr_in_str htmlFnamesSortedByFname) sublist htmlFnamesSortedByFname) ; + _Charvatova - solar inertial motion & activity.html + _Solar modeling & forecasting.html n/a Long term market indexes & PPI 0582.html + page Publications & reports.html + Past & future worlds.html OK S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html +--+ + had already been fixed OK fixed now n/a not a normal wage, ignore +-----+ 403 You don't have permission to access this resource. 150525 Icebreaker unchained%20 - We should have lost World War II/ - screwed path >> ?? can't see a problem? will have to find corrupted menu link? Neural networks - no menus or files for [MindCode, NN earlier work, Paper reviews] >> can't see a problem with : [MindCode, MindCode earlier work (renamed), Paper reviews] >> wait & see?? +-----+ Bad menus - several PROJECTS!! - should be OK? maybe weren't updated? below _Lies, damned lies, and scientists.html OK? page Howell - Hope-to-do projects, active and planned.html - no PROJECTS menu OK? Neural nets/MindCode/ - no html file? but doesn't need one! OK? Software programming & code/bin/ - gives Wickson html!!!?? "Menu Lies, Damned Lies, and Scientists.html' changed to : Problems with Science </TD><TD><A HREF="[#=; backtrack ;=#]Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html"> Lies, Damned Lies, and Scientists </A></TD><TD><A HREF="[#=; backtrack ;=#]Lies, Damned Lies, and Scientists/General Relativity is a turkey.html"> General Relativity turkey? 
</A></TD><TD><A HREF="[#=; backtrack ;=#]Lies, Damned Lies, and Scientists/Quantum Mechanics is a fools paradise.html"> Quantum Mechanics fools paradise? </A></TD><TD><A HREF="[#=; backtrack ;=#]Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html"> _Kyoto Premise fraud? </A></TD><TD><A HREF="[#=; backtrack ;=#]Pandemics, health, and the Sun/corona virus/Howell - corona virus.html"> Corona virus screwups? </A></TD></tr> index.html - Howell's photo >> I changed it to the new photo qnial> webPageSite_update l (link d_webRawe 'index.html') >> OK +-----+ qnial> webSite_doAll extra <BR> between last menu and 'normalStatus.html' for some webPages >> leave it for next round of fixes 'page Howell - Hope-to-do projects, active and planned.html' >> missing projects menu qnial> webPageSite_update l (link d_webRawe 'page Howell - Hope-to-do projects, active and planned.html') >> OK eliminate redundant menu from Lies, Damned Lies series : [#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'Menu Lies, Damned Lies, and Scientists.html') phraseValueList ; qnial> webPageSite_update l (link d_webRawe 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html') qnial> webPageSite_update l (link d_webRawe 'Lies, Damned Lies, and Scientists/General Relativity is a turkey.html') qnial> webPageSite_update l (link d_webRawe 'Lies, Damned Lies, and Scientists/Quantum Mechanics is a fools paradise.html') economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html >> why isn't projects menu showing? I just put it in?!?!! qnial> webPageSite_update l (link d_webRawe 'economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') >> nyet - problem with link!?!? I change menu projects : <TD><A HREF="[#=; backtrack ;=#]economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html"> S&P500 P/E ratios vs Treasury rates </A></TD> qnial> webPageSite_update l (link d_webRawe 'economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') ?webPageSite_update file unknown error : /media/bill/SWAPPER/Website - raw/economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rate.html missing subDir : qnial> webPageSite_update l (link d_webRawe 'economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') Actual link : file:///media/bill/HOWELL_BASE/Website/economics,%20markets/S&P%20500%20Shiller-forward%20PE%20versus%2010y%20Treasury%20bond%20rates.html >> WRONG!! shouldn't be there!! >> What the sam hill is going on? I'm worried that webPage[Rawe,Site] updates aren't being done to all?!?? >> Oops - do this first? qnial> webPageRawe_update l (link d_webRawe 'economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') qnial> webPageSite_update l (link d_webRawe 'economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') >> Still not updating properly - puts in the wrong link. Why!!?? Is it the `&? >> Nope - others work Others with looks-to-be-the-same problem : Climate - Kyoto Premise fraud file:///media/bill/HOWELL_BASE/Website/Climate%20-%20Kyoto%20Premise%20fraud/_Kyoto%20Premise%20-%20%20the%20scientists%20arent%20wearing%20any%20clothes.html General relativity is a turkey? 
	file:///media/bill/HOWELL_BASE/Website/Pandemics,%20health,%20and%20the%20Sun/_Pandemics,%20health,%20and%20the%20sun.html#Robert%20Prechter%20-%20Socionomics,%20the%20first%20quantitative%20sociology?
Quantum mechanics is a fools paradise?
	file:///media/bill/HOWELL_BASE/Website/Lies,%20Damned%20Lies,%20and%20Scientists/General%20Relativity%20is%20a%20turkey,%20Quantum%20Mechanics%20is%20a%20fools%20paradise.html
	>> Ah Hah! still the wrong fname!
Robert Prechter - Socionomics
	file:///media/bill/HOWELL_BASE/Website/Lies,%20Damned%20Lies,%20and%20Scientists/General%20Relativity%20is%20a%20turkey,%20Quantum%20Mechanics%20is%20a%20fools%20paradise.html
	>> Ah Hah! still the wrong fname!
	but this works (supposedly the same link!!) :
	file:///media/bill/HOWELL_BASE/Website/Pandemics,%20health,%20and%20the%20Sun/_Pandemics,%20health,%20and%20the%20sun.html#Robert%20Prechter%20-%20Socionomics,%20the%20first%20quantitative%20sociology?
General Relativity is a turkey, Quantum Mechanics is a fools paradise.html
	>> not in 'Menu projects.html'
	>> so why does it keep showing up?
I'm lost and confused - go to 'errors list'
many still have `# - these should have been put into 'pgPosn list'
internalLinks_return_relativePath
Change :
+.....+
	IF (OR (= `# (first lineList@midIndx)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
+.....+
To :
+.....+
	IF (OR (`# chr_in_str lineList@midIndx) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
+.....+
webURLs_extract - don't change now, wait and see the effect of the change above
Better look at remaining problems :
$ grep --invert-match '#' "$d_webRawe""webWork files/urls errors list.txt"
!!linkError!!
!!linkError!!Allegre's second thoughts.pdf
!!linkError!!/bash scripts/pdf edits/pdf insert [ISBN, copyright] by author, single paper.sh
!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv
!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv
!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
!!linkError!!bin/blog-format.sh
!!linkError!!bin/pdf edits
!!linkError!!bin/SSH/
!!linkError!!Charvatova solar inertial motion & activity/Verification/
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Climate and sun/Glaciation model 005
!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language
!!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html
!!linkError!!Cool emails/
!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt
!!linkError!!Google analytics
!!linkError!!Howell - Are we ready for global cooling.pdf
!!linkError!!International Neural Network Society.JPG
!!linkError!!LibreCalc bank account macro system.txt
!!linkError!!LibreOffice/
!!linkError!!LibreOffice/LibreCalc bank account macro system.txt
!!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
!!linkError!!Menu.html
!!linkError!!N-19557 wrong paper [size, margin]s.pdf
!!linkError!!National Post.jpg
!!linkError!!Nial Systems Limited.JPG
!!linkError!!Pandemics, health, and the Sun/Howell - corona virus 2020.html
!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
!!linkError!!Personal/130728 Car collision with a deer.html
!!linkError!!Publicity website/INNS mass emailers - easy [setup, approach] for mass emails.odt
!!linkError!!Publicity website/INNS mass email instructions.odt
!!linkError!!Puetz greatest of cycles/
!!linkError!!Qnial
!!linkError!!Randell Mills - hydrinos/
!!linkError!!Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. USA <rlegoo@association-resources.com>
!!linkError!!Social media/
!!linkError!!Software programming & code/
!!linkError!!Software programming & code/bin/bin - Howell's web-page.html
!!linkError!!Software programming & code/Qnial/MY_NDFS/???
!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
!!linkError!!Software programming & code/System_maintenance/
!!linkError!!Table of Contents
+--+
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Personal/130728 Car collision with a deer.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive"
>> no result?
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Personal/130728 Car collision with a deer.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive"
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html convertBodyLinks.html:516:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html convertBodyLinks.html:651:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html:516:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html:651:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html str_replaceIn_path.html:516:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html str_replaceIn_path.html:651:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html update.html:581:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html update.html:716:<LI><FONT SIZE=4><A
HREF="!!linkError!!Personal/130728 Car collision with a deer.html"> >> Shoose! these don't count!! >> remove 'code develop_test' $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Personal/130728 Car collision with a deer.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK, none show webSite_extract_pathsSubDirsFnames Change : +.....+ host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort -u >"' p_allFileList '" ' ; +.....+ To : +.....+ host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort -u >"' p_allFileList '" ' ; +.....+ $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> I chopped off apo $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela ' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK, nothing $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/blog-format.sh' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Attendee downloads - summary.html:240: <LI> Nifty bash script to help with <A HREF="!!linkError!!bin/blog-format.sh">re-formatting emails to html blog format</A>.<BR> Change : +.....+ <LI> <A HREF="!!linkError!!bin/SSH/">Click to see a directory of sftp-related bash scripts</A> that were used for the analysis of the sftp site. <BR> <LI> Nifty bash script to help with <A HREF="!!linkError!!bin/blog-format.sh">re-formatting emails to html blog format</A>.<BR> +.....+ To : +.....+ <LI> <A HREF="[#=; backtrack ;=#]Software programming & code/bin/SSH/">Click to see a directory of sftp-related bash scripts</A> that were used for the analysis of the sftp site. 
<BR> <LI> Nifty bash script to help with <A HREF="[#=; backtrack ;=#]Software programming & code/bin/conference guides - format html.sh">re-formatting emails to html blog format</A>.<BR> +.....+ $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> nothing? Are many '!!linkError!!' somehow lingering AFTER solved? Just do a full update, and come back : qnial> bye qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' qnial> webSite_doAll +-----+ Check those already fixed : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/blog-format.sh' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK (trivial) $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/blog-format.sh' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK +-----+ Check new ones : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/pdf edits' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:96: <LI> <B><A HREF="!!linkError!!bin/pdf edits"> >> changed to "[#=; backtrack ;=#]Software programming & code/bin/pdf edits/" $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Charvatova solar inertial motion & activity/Verification/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html:93:<A HREF="!!linkError!!Charvatova solar inertial motion & activity/Verification/">Supporting documents, spreadsheets etc</a><BR> >> changed to : <A HREF="[#=; backtrack ;=#]Charvatova solar inertial motion & activity/Charvatova related files/Howell - solar inertial motion - NASA-JPL versus Charvatova.pdf"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" 
/media/bill/SWAPPER/Website - raw/Galactic rays and evolution/_Galactic rays and evolution - life, the mind, civilisation, economics, financial markets.html:70: <A HREF="!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf"> /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:100: <A HREF="!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf"> >> both Changed to : <A HREF="[#=; backtrack ;=#]Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Climate and sun/Glaciation model 005' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK, no shows $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> OK, no shows $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Cool emails/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/page blogs.html:17: <LI><A HREF="!!linkError!!Cool emails/"> /media/bill/SWAPPER/Website - raw/index.html:101: <LI> <A HREF="!!linkError!!Cool emails/">Cool emails</a> >> both changed to : <A HREF="[#=; backtrack ;=#]Cool emails/"> Problem - the linkErrors are being ignored, but should remove linkErrod process link anyways, or this becomes all manual $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Publications website/IEEE CrossCheck, chair.html:179:<LI>The Publications Chair responds to author inquiries about their CrossCheck rejection (see the <A HREF="!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt">Publications Chair explanation of CrossCheck results and analysis</A>), including a generic comment, the CrossCheck print-out pdf, CrossCheck analysis comments, and offers to respond to any questions that they may have. 
>> Changed to :<A HREF="[#=; backtrack ;=#]Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt">
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Google analytics' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:93: <LI> <B><A HREF="!!linkError!!Google analytics">Google Analytics</a></b> - This provides a directory listing of snapshots of the Guides' web-page usage (the file prefixes denote YYMMDD - [year, month, day]). Clearly, only a fraction (~200) of the IJCNN2019 submitting co-authors used the Authors' Guide. My guess is that they were mostly students. As the Authors' Guide was NOT updated for WCCI2020 and the mass emails didn't announce it's availability (other than broken links in the last mass email), there has been no usage of the Authors' Guide for WCCI2020 as of 18Jan2020.
>> changed to (after moving directories) : <A HREF="[#=; backtrack ;=#]/webAnalytics">Google Analytics</a>
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Howell - Are we ready for global cooling.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/page Publications & reports.html:63: <LI>Bill Howell <A HREF="!!linkError!!Howell - Are we ready for global cooling.pdf">"Are we ready for global cooling?" </A>- A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)<BR><BR>
>> Changed to : <A HREF="[#=; backtrack ;=#]Climate - Kyoto Premise fraud/Howell - Are we ready for global cooling 14Mar06 longer version.pdf">
This is [long, boring] work!
Again - Problem - the linkErrors are being ignored, but the linkError marker should be removed and the link processed anyway, or this becomes all manual
>> NYET - the code already does that. Most problems are [mis-spelt, renamed] files?
Just re-do and update webOnline
qnial> webSite_doAll
08********08
#] 12Dec2020 check for failures
/media/bill/SWAPPER/Website - raw/z_Archive/201211 18h09m50s backups webPageRawe_update
corrupted <TITLE> - scrapped menus? NYET - should work?
S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html 403 You don't have permission to access this resource.
150525 Icebreaker unchained%20 - We should have lost World War II/ - screwed path
Neural networks - no menus or files for [MindCode, NN earlier work, Paper reviews]
Bad menus - several PROJECTS!! - should be OK? maybe weren't uploaded
_Lies, damned lies, and scientists.html
S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
page Howell - Hope-to-do projects, active and planned.html - no PROJECTS menu
Neural nets/MindCode/ - no html file?
Software programming & code/bin/ - gives Wickson html!!!??
Hosted websites - add projects menu
"Problems with Science" menu not in PROJECTS?
Lies, Damned Lies, and Scientists/General Relativity is a turkey, Quantum Mechanics is a fools paradise.html
GR turkey, QM fools paradise - no menu or page?
I need a message insert file for Howell side of website (like confGuides)!!
08********08
#] 09Dec2020 STOP working on this! - simple patch, get onto MindCode etc!!!
qnial> str_executeEmbeds '<TITLE> Howell : [#=; fname ;=#] ' (("fname 'This is a test.html')("fout 5)("backtrack './../../'))
Howell : This is a test.html
>> This works fine! So why not within webSite_doAll? Forget it - some other year...
Simple ConfGuide fix :
qnial> str_replaceIn_pathList l d_webRawe ' Howell : [#=; fname ;=#] ' ' [Author, Committee]s guides to [IEEE-CIS, INNS] conferences ' htmlPathsSortedByPath
qnial> webSite_doAll
>> confGuides webPage titles OK
>> Nuts - NapierU logo for 2020 missing.
>> extra space in 'IMG SRC='
qnial> webSite_doAll
Upload - FileZilla is uploading ALL of Dad's paintings AGAIN! - bullshit! I switched to lftp - but now it's uploading everything, probably due to a FileZilla fuckup. VERY slow!!
$ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
For now - just lftp the html files only!
>> Nope, can't get the log OR the excludes to work!!???
FileZilla - set up a different download manager for html-only
08********08
#] 07Dec2020 Now I need to :
1. check my entire website as others can't seem to see content!!??
2. test ConfGuide updates selectively, webPage by webPage
3. modify [bash, QNial] to do confGuides?
4. [update, upload] all webPages
+--+
1. check my entire website as others can't seem to see content!!??
http://www.billhowell.ca/Neural%20nets/Neural%20Networks.html
go to village office - ask Kate Brandt
no problems there
wait : look at /media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
Website stats for : www.BillHowell.ca 201125 12h50m26s
Failures :
+--+------------+
|79|errors list |
+--+------------+
|22|extern fails|
+--+------------+
|38|howell list |
+--+------------+
|27|intern fails|
+--+------------+
At least fix "howell list" :
'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' :
>> I need to [update, test] this optr to remove 'http://www.BillHowell.ca'
internalLinks_return_relativePath Change :
+.....+
% modify lineList#midIndxs for legit [fname, subDir] -> expand to relative paths ;
FOR midIndx WITH midIndxs DO
% don't modify lines with midIndxsLines_bads ;
IF (OR (= `# (first lineList@midIndx)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
% check for a valid fname-only, assumes only one instance of fname ;
ELSEIF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRawe) ;
+.....+
To :
+.....+
% modify lineList#midIndxs for legit [fname, subDir] -> expand to relative paths ;
FOR midIndx WITH midIndxs DO
% remove any http://www.BillHowell.ca BEFORE checking midIndxsLines_bads (http etc) ;
IF ('http://www.billhowell.ca' subStr_in_str (str_toLowerCase lineList@midIndx)) THEN lineList@midIndx := 24 drop lineList@midIndx ; ENDIF ;
% don't modify midIndxs with midIndxsLines_bads ;
IF (OR (= `# (first lineList@midIndx)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
ELSE
% remove %20 from links, now that mailtos are no longer considered ;
IF ('%20' subStr_in_str lineList@midIndx) THEN lineList@midIndx := str_replace_subStr '%20' ' ' lineList@midIndx ; ENDIF ;
% check for a valid fname-only, assumes only one instance of fname ;
IF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRawe) ;
+.....+
To re-try each
time to resolve linkErrors, Change : +.....+ midIndxsLines_bads := 'http' 'mailto:' '!!linkError!!' './' ; +.....+ To : +.....+ midIndxsLines_bads := 'http' 'mailto:' './' ; +.....+ plus add line : liner := liner str_remove_subStr '!!linkError!!' ; +-----+ 2. test ConfGuide updates selectively, webPage by webPage http://www.billhowell.ca/Neural%20nets/Conference%20guides/Author%20guide%20website/Conference%20registration%20blog.html qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' >> had to fix a few coding glitches qnial> webPageRawe_update o (link d_webRawe 'Neural nets/Conference guides/Author guide website/Conference registration blog.html') >> looks great! (p_log) Now update webPage[Rawe, Site] for real qnial> webPageRawe_update l (link d_webRawe 'Neural nets/Conference guides/Author guide website/Conference registration blog.html') qnial> webPageSite_update l (link d_webRawe 'Neural nets/Conference guides/Author guide website/Conference registration blog.html') >> IJCNN2019 sponsor logos don't show!! >> Oops - I didn't change the "message" for each Menu.html of the confGuides!! >> webWork files : OK, I changed all of the "messages" +-----+ 3. modify [bash, QNial] to do confGuides? Screw it - leap of faith, just update full deal 4. [update, upload] all webPages qnial> webSite_doAll Failures : +--+------------+ |74|errors list | +--+------------+ |22|extern fails| +--+------------+ |0 |howell list | +--+------------+ |27|intern fails| +--+------------+ >> all 'howell list' are now OK >> same number of other failures as previous MOST 'intern fails' are of form : ./SP500 1872-2020 TradingView, 1928-2020 yahoo finance.ods ./SP500 1872-2020 TradingView download.dat ./SP500 1928-2020 yahoo finance.dat ./Table of Contents ./WCCI2020 mass email [SS,Comp,Tut,Wrkshp] 191025 Howell.html ./Website tie-ins.html >> easy to fix (... someday...) Check webPageSites - start with confGuides >> everything looks OK EXCEPT I forgot the for confGuides!! Edit [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack)) [#!: path_insertIn_fHand (link d_webWork 'confHead.html') fout ; >> phraseValueList doesn't include fname. Is it available within webPageSite_update? >> So [#!: path_insertIn_fHand is just a guide (throw-away line, as coding isn't executed) webPageSite_update Change : +.....+ % so here executeEmbeds if present, with (phrase values) pairList ; IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ; +.....+ To : +.....+ % so here executeEmbeds if present, with (phrase values) pairList ; IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fname fname)("fout fout)("backtrack backtrack)) ; ENDIF ; +.....+ confHead.html Change : +.....+ <TITLE>??? +.....+ To : +.....+ Howell : +.....+ Let's see if that works : qnial> webSite_doAll >> didn't work, because 'Howell : [#=; fname ;=#]' is in : [#!: path_insertIn_fHand (link d_webWork 'confHead.html') fout ; >> This isn't executed, seeing as it isn't in the webPageRawe file Two options - 1. nested substitution 2. Change : +.....+ [#!: path_insertIn_fHand (link d_webWork 'confHead.html') fout ; +.....+ To : +.....+ [#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ; Howell : [#=; fname ;=#] [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ; +.....+ Take the second - easier for now. 
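For cross-checking from the bash side, an untested sketch of the same multi-file replace idea (fixed-string replace over a list of html files, dated backup of each file first). 'htmlPaths.txt' and the backup dir name are examples only - the real runs below use str_replaceIn_pathList :
+.....+
# sketch only - literal (non-regex) replace of strOld with strNew in each file of a list,
# copying each file to a dated backup dir first (rough bash analogue of str_replaceIn_pathList)
strOld='confHead.html'
strNew='fin Head_one.html'
d_backup="/media/bill/SWAPPER/Website - raw/z_Archive/$(date '+%y%m%d %Hh%Mm%Ss') backups bash_replace/"
mkdir -p "$d_backup"
while IFS= read -r pinn ; do
	if [ ! -f "$pinn" ] ; then echo "?file unknown : $pinn" ; continue ; fi
	cp -p "$pinn" "$d_backup"
	content="$(cat "$pinn")"
	# quoting "$strOld" inside ${...} keeps [apo, &, .] literal - no sed metachar headaches
	# caveat : $(cat ...) drops trailing blank lines (one final newline is added back by printf)
	printf '%s\n' "${content//"$strOld"/$strNew}" > "$pinn"
done < 'htmlPaths.txt'
+.....+
Even so, QNial stays the tool of record for these edits - the bash version is just for spot checks.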
#] change strOld to strNew in pathList, for strPattern, automatic path backups to d_backup # be careful - can screw up many files if str is not unique!!! # chopped-up line : str_replaceIn_pathList l d_webRawe (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'confHead.html' chr_apo ') fout ;') (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' index.html ' chr_newline '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ' ) htmlPathsSortedByPath # all webPages : str_replaceIn_pathList l d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') htmlPathsSortedByPath # just test with one file str_replaceIn_pathList o d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand '(link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') (solitary (link d_webRawe 'Neural nets/Conference guides/Author guide website/Author guide.html')) >> Result +--+ [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack)) [#!: path_insertIn_fHand(link d_webWork 'fin Head_one.html') fout ; Howell : [#=; fname ;=#] [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ; [#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'confMenu_authors.html') phraseValueList ; +--+ >> OK - just add some tab str_replaceIn_pathList o d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') (solitary (link d_webRawe 'Neural nets/Conference guides/Author guide website/Author guide.html')) +--+ [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack)) [#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ; Howell : [#=; fname ;=#] [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ; [#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'confMenu_authors.html') phraseValueList ; +--+ >> beautiful Take a break - Friends of Science AGM Full-meal deal : str_replaceIn_pathList l d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin 
Head_two.html' chr_apo ') fout ; ') htmlPathsSortedByPath qnial> webSite_doAll >> Oops - didn't work, as the line below was "erased!?" Howell : [#=; fname ;=#] >Why - this shouldn't happen? [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack)) [#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ; Howell : [#=; fname ;=#] [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ; [#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'confMenu_authors.html') phraseValueList ; qnial> str_executeEmbeds ' Howell : [#=; fname ;=#] ' (("fname 'This is a test.html')("fout 5)("backtrack './../../')) Howell : This is a test.html >> This works fine! So why not within webSite_doAll? 08********08 07Dec2020 /media/bill/SWAPPER/Website - raw/webWork files/ fin confHead.html fin confFoot_authors.html fin confFoot.html /media/bill/SWAPPER/Website - raw/webWork files/confMenu_overall.html /media/bill/SWAPPER/Website - raw/webWork files/confMenu_authors.html YIKES!!!, will moz-do-not-send="true" fail, as it must NOT occur before SRC=. however, this is for emails, not webPages!! : Conference Guide Next : Publications Guide, then Publicity, Reviewers, Sponsors /media/bill/SWAPPER/Website - raw/webWork files/confMenu_publications.html #] 08Dec2020 resume work /media/bill/SWAPPER/Website - raw/webWork files/confMenu_publicity.html /media/bill/SWAPPER/Website - raw/webWork files/confMenu_sponsors.html 08********08 #] 25Nov2020 'webSite [menuHeadFoot, link, TableOfContents, link] tools.html' I botched it together. qnial> webPageRawe_update l (link d_webRawe 'Software programming & code/Qnial/webSite [menuHeadFoot, link, TableOfContents, link] tools.html') qnial> webPageSite_update l (link d_webRawe 'Software programming & code/Qnial/webSite [menuHeadFoot, link, TableOfContents, link] tools.html') 08********08 #] 25Nov2020 index.html - put in a smaller sized image! I made a 200*200 pixel image of the big chart : http://www.billhowell.ca/Civilisations%20and%20sun/Howell%20-%20radioisotopes%20and%20history.jpg qnial> webPageRawe_update l (link d_webRawe 'index.html') qnial> webPageSite_update l (link d_webRawe 'index.html') >> image doesn't show!? >> OKnow file:///media/bill/HOWELL_BASE/Website/Civilisations and sun/Howell - radioisotopes and history 200 by 200 pixels.jpg qnial> webPageRawe_update l (link d_webRawe 'Civilisations and sun/_Civilisations and the sun.html') qnial> webPageSite_update l (link d_webRawe 'Civilisations and sun/_Civilisations and the sun.html') 08********08 #] 25Nov2020 lftp instead of fileZilla upload see "$d_SysMaint""internet & wifi/lftp notes.txt $ bash "$d_PROJECTS""bin-secure/lftp update www-BillHowell-ca.sh" bash: /media/bill/PROJECTS/bin-secure/lftp update www-BillHowell-ca.sh: No such file or directory >> kills me - I can't seem to get the file!!!???!!! >>bin - secure : Permissions, file access was null, set to read & write $ bash "$d_PROJECTS""bin-secure/lftp update www-BillHowell-ca.sh" bash: /media/bill/PROJECTS/bin-secure/lftp update www-BillHowell-ca.sh: No such file or directory qnial> a := EACH string 'lftp update www-BillHowell-ca.sh' +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ |l|f|t|p| |u|p|d|a|t|e| |w|w|w|-|B|i|l|l|H|o|w|e|l|l|-|c|a|.|s|h| +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ qnial> a EACHLEFT in chrs_fnames llllllllllllllllllllllllllllllll >> OK - so no weird chrs Why can't this file be accessed?? 
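Untested bash sanity checks for this kind of 'No such file' puzzle (the dir is the one typed in the failing command above) :
+.....+
# sketch only - confirm the exact [dir, fname] on disk before blaming weird chrs
d='/media/bill/PROJECTS/bin-secure/'    # as typed in the failing command
f='lftp update www-BillHowell-ca.sh'
if [ -e "$d$f" ] ; then echo "exists : $d$f" ; else echo "NOT found : $d$f" ; fi
# list candidate dirs under the parent, non-graphic chrs escaped - catches [space, hyphen] surprises
ls -b '/media/bill/PROJECTS/' | grep -i 'secure'
# flag any chr in the fname outside the expected set [alphanumeric, space, . , _ -]
printf '%s\n' "$f" | grep -n '[^A-Za-z0-9 .,_-]' || echo 'no weird chrs in fname'
+.....+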
>> shit, I miss-spelled and couldn't see the obvious -> spaces around the hypen search "Linus lftp and how do I ensure that only new versions are uploaded?" Hmm, no direct answer backup "$d_webSite/Mythology/" $ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" 02--02 mirror: Access failed: /Mythology: No such file or directory mkdir: Access failed: 550 Mythology/: File exists (/billhowell.ca/Mythology/) lftp: MirrorJob.cc:242: void MirrorJob::JobFinished(Job*): Assertion `transfer_count>0' failed. /media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 9: 13719 Aborted lftp $PROTOCOL://$URL <<-UPLOAD user $USER "$PASS" cd $REMOTEDIR mirror --reverse --recursion=newer "$d_webSite/Mythology/" "/billhowell.ca/Mythology/" close UPLOAD /media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 33: /home/user/script.log: No such file or directory 02--02 >> hmm, I have to put in full path for lftp? Change : +.....+ mirror --reverse --recursion=newer "$d_webSite/Mythology/" "/billhowell.ca/Mythology/" +.....+ To : +.....+ mirror --reverse --recursion=newer "/media/bill/HOWELL_BASE/Website/Mythology/" "/billhowell.ca/Mythology/" +.....+ $ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" 02--02 mirror: Access failed: /media/bill/HOWELL_BASE/Website/Mythology/Mythology: No such file or directory mkdir: Access failed: 550 Mythology/: File exists (/billhowell.ca/Mythology/) lftp: MirrorJob.cc:242: void MirrorJob::JobFinished(Job*): Assertion `transfer_count>0' failed. /media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 9: 13990 Aborted lftp $PROTOCOL://$URL <<-UPLOAD user $USER "$PASS" cd $REMOTEDIR mirror --reverse --recursion=newer "/media/bill/HOWELL_BASE/Website/Mythology/Mythology/" "/billhowell.ca/Mythology/" close UPLOAD /media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 33: /home/user/script.log: No such file or directory 02--02 >> failed as directory exists >> I need to check a small directory with an updated webPage /media/bill/SWAPPER/Website - raw/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/ backup "PE Schiller forward vs 10yr Tbills/", wwwBillHowell_update() Change : +.....+ mirror --reverse --recursion=newer "/media/bill/HOWELL_BASE/Website/Mythology/" "/billhowell.ca/Mythology/" +.....+ To : +.....+ mirror --reverse --only-newer "/media/bill/HOWELL_BASE/Website/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/" "/billhowell.ca/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/" +.....+ $ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" >> OK, it looked like ONLY the html file was uploaded backup & try : mirror --reverse --only-newer "/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/" "/billhowell.ca/economics, markets/SP500/multi-fractal/" >> OK, again it looked like ONLY the html file was uploaded >> Check online file : looks good, has images now Try a dry-run of the whole webSite mirror --reverse --only-newer --dry-run "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/" Screw the [get, permissions] - just run with >> Howell's command in bash file "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" : mirror --reverse --only-newer --log=$LOG "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/" problems with LO file, doesn't accept spaces? 
Change to : LOG="lftp update www-BillHowell-ca log.txt"
I had to remove --log=$LOG :
mirror --reverse --only-newer --log=$LOG "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/"
lftp is VERY slow compared to fileZilla!!
>> go back to fileZilla... and fight with settings
08********08
#] 25Nov2020 fixes :
Neil Howell (lost images - many webPages affected)
Kyoto fraud - incorrect backtrack for bodylinks...
Conference guides not covered by webSite_extract_pathsSubDirsFnames
05----05
'_Kyoto Premise - the scientists arent wearing any clothes.html' - fname problems?
qnial> a := EACH string 'Landsea, The hurricane expert who stood up to UN junk science.pdf'
qnial> chrs_fnames:= link chrs_alphaNumeric (EACH string ` `. `, `_ `-)
(boxed chr display of chrs_fnames : 0-9, a-z, A-Z, space, and [. , _ -])
qnial> a EACHLEFT in chrs_fnames
(all l - every chr is in chrs_fnames)
>> OK - so no weird chrs
>> This one is OK in webSite, otherwise wrong backtrack like the others
qnial> a := EACH string 'Akasofu, Little Ice Age is still with us.pdf'
(boxed chr display of the string)
qnial> a EACHLEFT in chrs_fnames
(all l - every chr is in chrs_fnames)
>> OK - so no weird chrs
So why did these generate !!linkError!! ??
Also - all backtracks in bodyLinks are wrong - Why? eg :
"../Climate - Kyoto Premise fraud/Abdussamatov, look to Mars for the truth on global warming.pdf"
"../Howell - Are we ready for Global Cooling, comments 14Mar06.pdf"
>> should have 2 ../
second example missing subDir
Check other bodyLinks in d_webSite (I have already, but see again)
OK index.html
most Past & future worlds.html - only one failure :
file:///media/bill/HOWELL_BASE/Website/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
OK Peratt - Auroral phenomena and petroglyphs.html
Maybe the fname-only case needs to add 1? nyet? I don't have an answer.
05----05
Neil Howell images don't show
Of course - the current webPage doesn't have > What happened? Look for recent version
OK - update webPage without text wrap
NO wrapping : https://www.angelfire.com/nm/thehtmlsource/jazzup/image/stoptextwrap.html


    >> wow! easy, and after years of searching, only one guy makes it clear! #] 25Nov2020 Problem is, my coding probably destroyed many images during initial code development. Look at 'z_Old/201027 13h46m19s backups' versions I have to [list, check, fix] all of them!! (probably 10 or so) A index.html x Climate - Kyoto Premise fraud x Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html A _Pandemics, health, and the sun.html 1 Howell - corona virus.html x Howell - corona virus of countries, by region.html A Howell - influenza virus.html x _Climate and sun.html D 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html x _Lies, damned lies, and scientists.html x S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html R Conference Guide' webPage x = doesn't need correction or doesn't have images (checked in html file) c = corrected images n = # of image links fixed (not all) A = all image links had to be re-inserted R = all Conference guide webPages need a revamp of the header! D - desroyed by over-writing with another file!!! qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' >>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf >>>>>> loading start : webSite header.ndf <<<<<< loading ended : webSite header.ndf <<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf qnial> webSite_doAll 05----05 Kyoto fraud - incorrect backtrack for bodylinks... looks hard &g continnuation of many past bats on same issue leave this for after-taxes 05----05 Conference guides not covered by webSite_extract_pathsSubDirsFnames Nuts, I had left 'Conference guides' --invertmatch of webSite_extract_pathsSubDirsFnames qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' >>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf >>>>>> loading start : webSite header.ndf <<<<<< loading ended : webSite header.ndf <<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf qnial> webSite_extract_pathsSubDirsFnames >> 'webSite webPageList.txt' now has 187 webPages Redo webSite_doAll un-(commenting out) : writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo) ; qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' >>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf >>>>>> loading start : webSite header.ndf <<<<<< loading ended : webSite header.ndf <<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf qnial> webSite_doAll /media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt webSite stats for : www.BillHowell.ca : 201125 11h00m33s Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] : +----+-----------------------------+ |7004|count of all links in webSite| +----+-----------------------------+ 1147 = count of all [file, dir]s targeted by links on the webSite Counts below are the number of unique TARGETED [file, dir]s of links (eg 3+ links per target on average) Failures : +--+------------+ |79|errors list | +--+------------+ |22|extern fails| +--+------------+ |38|howell list | +--+------------+ |27|intern fails| +--+------------+ Unknowns - I havent written code to really show [OK, 
fail] : +---+-----------+ |71 |mailto list| +---+-----------+ |277|pgPosn list| +---+-----------+ OKs - these links have been shown to work : +---+---------+ |239|extern OK| +---+---------+ |394|intern OK| +---+---------+ [fail, unknown, OK, total] counts : +----+-------------+ | 166|failed links | +----+-------------+ | 348|unknown links| +----+-------------+ | 633|OK links | +----+-------------+ |1147|total | +----+-------------+ >> looks nice! 08********08 #] 24Nov2020 upload to webOnln via fileZilla working well now? I have to break for a month - income taxes, visits etc! Problem with useless re-uploads (file dating problem!) 08********08 #] 24Nov2020 webSite_doAll - add a check to see if [z_Archive, z_Old] dirs in d_webSite check for [z_Archive, z_Old] : find: ‘’: No such file or directory >> OK, this works However, now the webPageSites are not being updated! I can't update webOnln via fileZilla until it works Sigh - what now? 201124 11h39m47s webAllRawOrSite_update l "webPageSite_update 201124 11h39m47s webURLs_extract >> my guess - flag_break screws the optr? webAllRawOrSite_update Change : +.....+ ELSE webPageSite_update webPage ; +.....+ To : +.....+ ELSE webPageSite_update flag_backup webPage ; +.....+ Wait! - I havent updated webSite_extract_pathsSubDirsFnames >> But this will add in 'Conference Guides' >> What the hell, just do it? I can always revert back webSite_doAll - I added : writeDoStep 'webSite_extract_pathsSubDirsFnames' ; >> NUTS! webSite webPageList.txt was updated, but has NONE of the 'Conference Guides' webPages!!??!! qnial> gage shape htmlFnamesSortedByFname 106 qnial> webSite_extract_pathsSubDirsFnames qnial> gage shape htmlFnamesSortedByFname 106 >> not working. Why? qnial> webSite_doAll /media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt webSite stats for : www.BillHowell.ca : 201124 12h08m34s Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] : Failures : +---+------------+ |54 |errors list | +---+------------+ |17 |extern fails| +---+------------+ |27 |howell list | +---+------------+ |651|intern fails| +---+------------+ Unknowns - I havent written code to really show [OK, fail] : +---+-----------+ |48 |mailto list| +---+-----------+ |109|pgPosn list| +---+-----------+ OKs - these links have been shown to work : +---+---------+ |115|extern OK| +---+---------+ |3 |intern OK| +---+---------+ [fail, unknown, OK, total] counts : +----+-------------+ | 749|failed links | +----+-------------+ | 157|unknown links| +----+-------------+ | 118|OK links | +----+-------------+ |1024|total | +----+-------------+ >> STILL no update of webPageSites!!??!! >> webSite_doAll still comments out : % writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo) ; >> '0_webPage_update log.txt' - shows the changes that SHOULD have been made OUCH! I've overwritten webPageRawes with wbPageSite style >> I must restore, and lose a greatal of stuff? 
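Before (or instead of) the QNial restore below, an untested bash sketch of the same restore step - copy each html file back from the dated backup dir if a copy is there (backup fnames assumed non-dated, 'htmlPaths.txt' is an example list file) :
+.....+
# sketch only - restore webPageRawes from the dated backup dir made by webPageRawe_update
d_backup='/media/bill/SWAPPER/Website - raw/z_Old/201123 22h24m20s backups webPageRawe_update/'
while IFS= read -r webPage ; do
	fname="$(basename "$webPage")"
	if [ -f "$d_backup$fname" ] ; then
		cp -p "$d_backup$fname" "$webPage"
	else
		echo "?no backup for : $webPage"
	fi
done < 'htmlPaths.txt'
+.....+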
qnial> dirBackup_restoreTo_paths o (link d_webRawe 'z_Old/201123 22h24m20s backups webPageRawe_update/') htmlPathsSortedByPath >> o for non-dated fnames webPageSite_update Change : +.....+ host link 'mv "' p_temp '" "' webPageRawe '"' ; +.....+ To : +.....+ host link 'mv "' p_temp '" "' webPageSite '"' ; +.....+ webPageSite := link d_webSite subDir fname ; %write fname ; IF (path_exists '-f' p_temp) THEN host (link 'diff --width=85 "' webPageSite '" "' p_temp '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ; host link 'echo "" >>"' p_log '"' ; IF flag_backup THEN host link 'mv "' p_temp '" "' webPageSite '"' ; ENDIF ; ELSE host link 'echo ?webPageSite_update error, p_temp not created >>"' p_log '"' ; ENDIF ; Reverse yesterday's change? back to : +.....+ WHILE (NOT isfault (line := readfile finn)) DO % webPageRawe_update MUST have already processed links ; % to '[#=; backtrack ;=#], followed by a [legit, full] subir ; % so here executeEmbeds if present, with (phrase values) pairList ; IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ; +.....+ To : +.....+ WHILE (NOT isfault (line := readfile finn)) DO % process links ; IF ('> Hmm, this is a problem, I think? Better to leave it for now. But the old way worked, with str_executeEmbeds BEFORE the internalLinks_return_relativePath lines. Try what I have (again, no urls_check 'extern') /media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt webSite stats for : www.BillHowell.ca : 201124 13h33m00s Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] : Failures : +--+------------+ |54|errors list | +--+------------+ |17|extern fails| +--+------------+ |27|howell list | +--+------------+ |7 |intern fails| +--+------------+ Unknowns - I havent written code to really show [OK, fail] : +---+-----------+ |48 |mailto list| +---+-----------+ |105|pgPosn list| +---+-----------+ OKs - these links have been shown to work : +---+---------+ |115|extern OK| +---+---------+ |330|intern OK| +---+---------+ [fail, unknown, OK, total] counts : +---+-------------+ |105|failed links | +---+-------------+ |153|unknown links| +---+-------------+ |445|OK links | +---+-------------+ |703|total | +---+-------------+ >> Looks nice, but did it work? index.html all main menu items work saw some !!linkError!! in .html file Climate - Kyoto Premise fraud all main menu items work Projects menus - all work EXCEPT : file:///media/bill/HOWELL_BASE/Website/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html new links to Financial Post Deniers work (3 tested) page blogs.html all subMenus work bodyLink to Howell's blog doesn't work (strange, as the menu does) Directory of files works (as do other webPageSites) +--+ Conference guides NO [main, subDir] menus work Directory of files doesn't work, no [GNU, Creative Commons] images How come this didn't give rise to a huge number of 'intern fails'? >> see Authors'guide below - only this web-page sucks? Authors' guide ALL main menu links work ALL 'Non-author actions' menus work ALL bodylinks that I looked at worked >> I am STUNNED that this works!!! PubChair guide as with Authors' guide, ALL menu links work nicely I didn't check bodyLinks >> Again, I am STUNNED that the conference guide links work, even if it was only a little bit!!! 
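For reference, an untested bash sketch of an 'extern' link check along the same lines as urls_check (the ramdisk list file names are examples; only the HTTP status line is kept) :
+.....+
# sketch only - hit each external url, split into [OK, fails] lists by HTTP status
p_list='/media/bill/ramdisk/urls extern list.txt'
p_OKK='/media/bill/ramdisk/urls extern OK.txt'
p_bad='/media/bill/ramdisk/urls extern fails.txt'
: > "$p_OKK" ; : > "$p_bad"
while IFS= read -r url ; do
	[ -z "$url" ] && continue
	status="$(curl --silent --head --location --max-time 15 --output /dev/null --write-out '%{http_code}' "$url")"
	if [ "$status" = '200' ] ; then
		echo "$status OK  $url" >> "$p_OKK"
	else
		echo "$status !!! $url" >> "$p_bad"
	fi
done < "$p_list"
+.....+
Some servers refuse HEAD requests, so a 405 here is not always a dead link.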
For fun, check 'extern' links : qnial> urls_check 'extern' 05-----05 Olde Code # olde code webPageSite_update : I don't need to back up webPageSite, as it is entirely based on webPageRawe. Only webPageRawe needs backup. Keep it in for now, just in case. IF flag_backup THEN IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup ; ELSE webPage path_backupDatedTo_dir d_htmlBackup ; ENDIF ; ENDIF ; # 23Nov2020 WHILE (NOT isfault (line := readfile finn)) DO % process links ; IF ('> Why "/home/bill/ ?? backer() { # $p_excl - must avoid transferring [z_Archive, z_Old, References] p_excl="$1" d_src="$2" d_out="$3" becho "rsync $2" becho "to $3" rsync "$options" --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log" bash "$d_bin""du_diff.sh" "$d_src" "$d_out" } Change : +.....+ bash "$d_bin""du_diff.sh" "$d_src" "$d_out" +.....+ To : +.....+ bash "$d_bin""du_diff.sh" "$d_src" "$d_out" >>"p_log" +.....+ rsync /media/bill/PROJECTS/bin/ to /media/bill/SWAPPER/Website - raw/Software programming & code/bin/ rsync: link_stat "/home/bill/ --stats --itemize-changes -rltgu " failed: No such file or directory (2) rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1196) [sender=3.1.2] NYET!!! I don't want du_diff.sh!! - dirSizes only! just remove this! Try save options DIRECTLY in backer() Change : +.....+ rsync "$options" --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log" +.....+ To : +.....+ rsync --stats --itemize-changes -rltgu --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log" +.....+ $ bash "$d_bin""rsync website.sh" rsync /media/bill/PROJECTS/bin/ to /media/bill/SWAPPER/Website - raw/Software programming & code/bin/ rsync /media/bill/PROJECTS/Lucas/ to /media/bill/SWAPPER/Website - raw/Lucas/ rsync /media/bill/PROJECTS/MindCode/ to /media/bill/SWAPPER/Website - raw/MindCode/ rsync /media/bill/PROJECTS/Qnial/ to /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/ rsync /media/bill/PROJECTS/System_maintenance/ to /media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/ rsync /media/bill/SWAPPER/Website - raw/ to /media/bill/HOWELL_BASE/Website/ >> Wow! seems to work >> 'Climate - Kyoto Premise fraud' - has been rsync'd!!! Change : +.....+ rsync "$options" --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log" ... # options=" --dry-run --itemize-changes -rltgu " # report what will be done, but no transfers options=" --stats --itemize-changes -rltgu " +.....+ To : +.....+ if [ "$options" == 'test' ]; then rsync --dry-run --itemize-changes -rltgu --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log" else rsync --stats --itemize-changes -rltgu --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log" fi ... # options="test" # report what will be done, but no transfers options="change" +.....+ 08********08 #] 23Nov2020 Should webPageSite_update be doing internalLinks_return_relativePath? That is the job of webPageRawe_update BEFORE calling webPageSite_update, all links should have '[#=; backtrack ;=#], followed by a [legit, full] subir Can I Change : +.....+ WHILE (NOT isfault (line := readfile finn)) DO % process links ; IF (' webSite_doAll Holy shit! It works now and I don't know why!!?? 
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
Website stats for : www.BillHowell.ca 201123 22h31m47s
Summary of the number of links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+--+------------+
|54|errors list |
+--+------------+
|18|extern fails|
+--+------------+
|27|howell list |
+--+------------+
|34|intern fails|
+--+------------+
Unknowns - I havent written code to really show [OK, fail] :
+---+-----------+
|48 |mailto list|
+---+-----------+
|105|pgPosn list|
+---+-----------+
OKs - these links have been shown to work :
+---+---------+
|114|extern OK|
+---+---------+
|303|intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+---+-------------+
|133|failed links |
+---+-------------+
|153|unknown links|
+---+-------------+
|417|OK links |
+---+-------------+
|703|total |
+---+-------------+
errors list (!!linkError!!) - will need real work?
extern fails : amazon is (7/18), obviously I must not use their links in the future
   none seem critical, but it is work to find alt links & fix
howell list - special correction to 'Conference guides'
intern fails : Almost all (27/34) are due to 'Climate - Kyoto Premise fraud' pdfs, which have not been rsync'd yet.
   Four are due to './' mistakes - easy to correct.
08********08
#] 23Nov2020 re-retry webSite_doAll - see if webPageRawes are updated after this - I'm going to eat popcorn & break for the night, fail or succeed
qnial> webSite_doAll
>> same problems
>> I should have run internalLinks_return_relativePath_tests first
Ah hah! The simple fname fails now (again - it keeps doing that, and I keep fixing it)
# internalLinks_return_relativePath_test example #1 : FAILED - result does NOT match standard
t_input, t_standard, t_result =
|../../../| (rest of the test output table truncated)
#] 30Oct2020 Simple check of multiple "
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html:177:
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html update.html:177:
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html str_replaceIn_path.html:177:
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html convertBodyLinks.html:177:
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html:178:
>> ALL of these should work!
>> NO, NOT for webPageSite_update!!?? It only inserts backtrack?
02--02
Howell 920000 Front - Introduction.pdf
qnial> (= 'Howell 920000 Front - Introduction.pdf' allFnamesSortedByFname ) sublist allPathsSortedByFname
>> again - no hits? It IS in : '/media/bill/HOWELL_BASE/Website/Pandemics, health, and the Sun/corona virus'
try links via geany of 'webSite urlList.txt' :
Howell 920000 Front - Introduction.pdf
/media/bill/HOWELL_BASE/Website/Lies, Damned Lies, and Scientists/Howell 920000 Front - Introduction.pdf
>> This is in that directory, so why the failure?
02--02
NUTS! I haven't updated using webSite_extract_pathsSubDirsFnames?
>> that shouldn't be the problem!
I guess the good news is that many of the intern fails have 2-3 fails per path. So it may not be as bad as it looks
Why isn't internalLinks_return_relativePath working now? My changes yesterday must have screwed it up. Hard to track though, even with notes.
From 20Nov2020 : internalLinks_return_relativePath :
Change :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell (fname := (1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
>> This looks suspicious?
To :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell (1 + (last findAll_Howell `/ lineList@midIndx)) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
>> Why did I drop lineList@midIndx? Might be a leftover?
Re-[loaddefs, internalLinks_return_relativePath_test] :
>> Now #2 also fails (!!linkError!!)
>> problem remains - it seems that the step above is being ignored
To :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
Now Change (add "first") To :
+.....+
ELSEIF (NOT isfault (i_fname := first find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
08********08
#] 23Nov2020 Check [webPageSite, webPageOnln] etc for errors
First - fix some problems, but not ones that require webAllRawOrSite_update, which I will do after
>> OK for now - I only fixed one or two below
qnial> webSite_doAll
>> OUCH!! backtrack problem again - often the wrong number of '../'
>> Conference guides are included!??! - possibly because I changed --max-depth to 4
I have now REMOVED the max-depth!!
the invert-match should exclude ["z_Old, z_Archive, System_maintenance] anyways, but I removed Conference Guides exclusion. I might asl have it in as well - changes for next round... --invert-match "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " Big issue is the failed backtracks... Re-try : qnial> webSite_doAll Wait a minutewebPageSites were updated, but not webPageRawes. Is this normal? >> NO - webPageRawe_update is supposed to process all webPageRawes!! Why didn't it? webPageRawe_update IS OP flag_backup webPage IF flag_backup THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ; webSite_doAll IS { str_replaceIn_pathList l d_webRawe '!!linkError!!' '[#=; backtrack ;=#]' htmlPathsSortedByPath webAllRawOrSite_update l "webPageRawe_update ; webAllRawOrSite_update l "webPageSite_update ; webURLs_extract ; EACH urls_check 'intern' 'extern' ; webSite_link_counts ; } >so this should be changing every webPageRawe! last updates Sat 21Nov2020 Not this? : subDir fname := path_retrieve_subDirFname webPage d_webRawe ; fileops.ndf : path_retrieve_subDirFname IS OP path dirBase { LOCAL fname fPath subDir ; NONLOCAL webSiteAllPathList ; IF (NOT (dirBase subStr_in_str path)) THEN fault '?path_retrieve_subDirFname error, dirBase not in path' ELSE fname := path_extract_fname path ; IF (isfault fname) THEN fault '?path_retrieve_subDirFname error, fname' ELSE subDir fname := ((path str_extractPast_strFront dirBase) str_remove_subStr fname) fname ENDIF ENDIF } OUCH!! changed around arguments? : str_replaceIn_path IS OP flag_backup d_backup strOld strNew path str_replaceIn_pathList IS OP flag_backup d_backupRoot strOld strNew pathList ELSE str_replaceIn_path o '' strOld strNew pinn ; >> No, this is OK webPageRawe_update Change : +.....+ IF flag_backup THEN IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup 'Rawe' ; ELSE webPage path_backupDatedTo_dir d_htmlBackup ; ENDIF ; ENDIF ; +.....+ To : +.....+ IF flag_backup THEN IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup ; ELSE webPage path_backupDatedTo_dir d_htmlBackup ; ENDIF ; ENDIF ; +.....+ webPageRawe_update Change : +.....+ webPageRawe_update IS OP flag_backup webPage ... IF flag_backup THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ; +.....+ To : +.....+ webPageRawe_update IS OP flag_backup flag_update webPage ... 
IF flag_update THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ; +.....+ I will also thatve to change str_replaceIn_path IS OP flag_backup d_backup strOld strNew path { LOCAL finn fout flag_chg line p_temp ; % ; % backup unless a test, or called by [str_replaceIn_dir, str_replaceIn_pathList] which already backup ; IF flag_backup THEN path path_backupDatedTo_dir d_backup ; ENDIF ; p_temp := link d_temp 'str_replaceIn_path temp.txt' ; flag_chg := o ; % ; finn := open path "r ; fout := open p_temp "w ; WHILE (NOT isfault (line := readfile finn)) DO IF (strOld subStr_in_str line) THEN IF (= o flag_chg) THEN flag_chg := l ; %write path ; ENDIF ; EACH write '' line ; line := str_replace_subStr strOld strNew line ; write line ; ENDIF ; writefile fout line ; ENDWHILE ; EACH close finn fout ; % ; IF flag_backup THEN host link 'mv "' p_temp '" "' path '"' ; ENDIF ; } str_replaceIn_pathList IS OP flag_backup d_backupRoot strOld strNew pathList { LOCAL d_backup pinn ; % backups are automatically done, except for testing purposes!! ; IF flag_backup THEN d_backup := link d_backupRoot 'z_Archive/' timestamp_YYMMDD_HMS ' backups str_replaceIn_pathList/' ; host link 'mkdir "' d_backup '" ' ; ENDIF ; % ; IF (NOT path_exists '-d' d_backup) THEN EACH write '?str_replaceIn_pathList error : could not create d_backup : ' d_backup ; ENDIF ; FOR pinn WITH pathList DO IF (NOT path_exists ("r pinn)) THEN EACH write '' '?str_replaceIn_pathList error, file unknown : ' pinn '' ; % keep flag_backup = o (false) because all targeted files are backed up above ; ELSE str_replaceIn_path flag_backup d_backup strOld strNew pinn ; ENDIF ; ENDFOR ; } 5-----5 webPageSite : file:///media/bill/HOWELL_BASE/Website/index.html no images.. file:///media/bill/HOWELL_BASE/Website/page%20Publications%20&%20reports.html no menu - execute embed only file:///media/bill/HOWELL_BASE/Website/Neil%20Howell/_Neil%20Howell.html no images, menu, etc Projects subMenus 'S&P500 1872-2020, 83y trend' shows 'file:///media/bill/HOWELL_BASE/Website/economics,%20markets/SP500/multi-fractal/1872-2020%20SP500%20index,%20ratio%20of%20opening%20price%20to%20semi-log%20detrended%20price.html' >> wrong link! 'Pandemics, health, Sun : ' 'Fun, crazy stuff' -> change this 'Astro-correlates of health' 'Anthony Peratt -petroglyphs' -

    as Title 5-----5 Online checks Home page - no images? Olde Code % Delete past versions of p_temp, so update failures will result in a diff error message ; IF (path_exists '-f' p_temp) THEN host link 'rm "' p_temp '"' ; ENDIF ; 08********08 #] 22Nov2020 Continue to setup 'rsync website.sh' 1. d_PROJECTS -> rsync script -> d_webRawe (see p_log="$d_bin""rsync log PROJECTS_to_webRawe.txt") 2. d_webRawe -> rsync script -> d_webSite (see p_log="$d_bin""rsync log webRawe_to_webSite.txt" ) 3. check that there are no z_[Archive, Old] in d_webSite must be very careful NOT to delete these from [PROJECTS, d_webRawe]!!! 4. d_webSite -> fileZilla ftp -> BillHowell.ca +-----+ Step 1 : d_PROJECTS -> rsync script -> d_webRawe (see p_log="$d_bin""rsync log PROJECTS_to_webRawe.txt") rsync NO webPages, as this will be done by : link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' >> most had already been transferred perhaps several months ago. backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/bin/ to /media/bill/SWAPPER/Website - raw/Software programming & code/bin/ Number of files: 381 (reg: 360, dir: 21) Number of created files: 1 (reg: 1) sent 51,063 bytes received 115 bytes 102,356.00 bytes/sec total size is 30,488,826 speedup is 595.74 backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/Lucas/ to /media/bill/SWAPPER/Website - raw/Lucas Number of files: 21 (reg: 20, dir: 1) Number of created files: 0 sent 959 bytes received 12 bytes 1,942.00 bytes/sec total size is 1,104,218 speedup is 1,137.20 backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/Qnial/ to /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/ Number of files: 1,214 (reg: 1,143, dir: 71) Number of created files: 0 sent 32,806 bytes received 94 bytes 65,800.00 bytes/sec total size is 31,690,557 speedup is 963.24 backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/System_maintenance/ to /media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/ Number of files: 1,089 (reg: 921, dir: 168) Number of created files: 0 sent 33,654 bytes received 183 bytes 67,674.00 bytes/sec total size is 401,253,276 speedup is 11,858.42 All looks OK? - won't really know until [testing, usage] +-----+ Step 2. d_webRawe -> rsync script -> d_webSite (see p_log="$d_bin""rsync log webRawe_to_webSite.txt" ) backer_rsync() - 201122 18h20m rsync of /media/bill/SWAPPER/Website - raw/ to /media/bill/HOWELL_BASE/Website/ Number of files: 10,638 (reg: 9,424, dir: 1,213, link: 1) Number of created files: 319 (reg: 304, dir: 15) sent 210,681,294 bytes received 12,066 bytes 32,414,363.08 bytes/sec total size is 10,026,762,188 speedup is 47.59 +-----+ Step 3. 
check that there are no z_[Archive, Old] in d_webSite d_webRawe NEEDs z_[Archive, Old], but I just want to see the status : $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type d -name "*z_Archive" /media/bill/SWAPPER/Website - raw/Lucas/math Howell/z_Archive /media/bill/SWAPPER/Website - raw/Lucas/math Lucas/z_Archive /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/z_Archive /media/bill/SWAPPER/Website - raw/z_Archive /media/bill/SWAPPER/Website - raw/Hussar/SummerDaze/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/bin/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/bin/email scripts/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/bin/backupper/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/Linux/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/z_Archive /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/Puetz & Borchardt/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/Lies, damned lies, and scientists/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/Voja - education/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/Colin James - KanbanNN, 4VL logic/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/JC-NPS/z_Archive /media/bill/SWAPPER/Website - raw/Projects - mini/History/z_Archive /media/bill/SWAPPER/Website - raw/economics, markets/z_Archive /media/bill/SWAPPER/Website - raw/economics, markets/SP500/z_Archive /media/bill/SWAPPER/Website - raw/economics, markets/SP500/Fibonacci mirror/z_Archive $ find "/media/bill/HOWELL_BASE/Website/" -maxdepth 4 -type d -name "*z_Archive" /media/bill/HOWELL_BASE/Website/Projects - mini/Puetz & Borchardt/z_Archive $ find "/media/bill/HOWELL_BASE/Website/" -maxdepth 4 -type d -name "*z_Old" /media/bill/HOWELL_BASE/Website/Projects - mini/History/Temple - Egyptian Dawn/z_Old /media/bill/HOWELL_BASE/Website/Software programming & code/System_maintenance/FireFox/z_Old /media/bill/HOWELL_BASE/Website/economics, markets/SP500/z_Old >> I deleted these! +-----+ Step 4. d_webSite -> fileZilla ftp -> BillHowell.ca I left it running while I went to visit Adrian. Seems to have uploaded 08********08 #] 22Nov2020 Trivial change to symbols to align better webPageRawe d_webRawe 10:00 'urls extern fails.txt' - fixes 3/4 are a Financial Post series on climate deniers,eg : HTTP/1.1 404 Not Found !!! http://www.canada.com/nationalpost/financialpost/comment/story.html?id=2271ac23-6895-4789-9da0-6b28968b8d15 I should remove all and simply link to Lawrence Soloman's book 5 are amazon.com links - leave for now 05-----05 webURLs_extract - p_internURL_fails : pgPosns [extract to p_pgPosnURLs, remove from p_internURL_fails] Untested : % move all "pgPosn" links in p_internURLs to p_pgPosnURLs ; host link 'grep "#" "' p_internURLs '" >>"' p_pgPosnURLs '"' ; host link 'grep --invert-match "#" "' p_internURLs '" >"' p_temp1 '"' ; host link 'mv "' p_temp1 '" "' p_internURLs '"' ; >> watch out for problems with this!! 
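The same three steps straight in bash (still untested; p_internURLs is the ramdisk file above, the [pgPosn, temp] file names are guesses) :
+.....+
# sketch only - move all "#" (pgPosn) links out of the intern list
p_internURLs='/media/bill/ramdisk/urls intern list.txt'
p_pgPosnURLs='/media/bill/ramdisk/urls pgPosn list.txt'
p_temp1='/media/bill/ramdisk/urls temp1.txt'
grep '#' "$p_internURLs" >> "$p_pgPosnURLs"
# grep -v returns non-zero when nothing is left, so an all-pgPosn list is NOT clobbered
grep --invert-match '#' "$p_internURLs" > "$p_temp1" && mv "$p_temp1" "$p_internURLs"
+.....+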
05-----05 fix anomalous first line of webPageSites : d_webRawe 'page Publications & reports.html' gives "' reports.html " at top of web-page d_webRawe 'Solar modeling and forecasting/_Solar modeling & forecasting.html' gives "' forecasting.html " at top of web-page (like earlier cases) >> looks OK now, I think I fixed it yesterday 05-----05 failed links - Maybe I need to rsync files?? : Randell Mills : /media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/ /media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/Howell - review of Holverstott 2016 Hydrino energy.pdf Social media : /media/bill/HOWELL_BASE/Website/Social media/Howell 110902 - Systems design issues for social media.pdf /media/bill/HOWELL_BASE/Website/Social media/Howell 111006 - Semantics beyond search.pdf /media/bill/HOWELL_BASE/Website/Social media/Howell 111230 - Social graphs, social sets, and social media.pdf Paul Vaughan : /media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 Solar-Terrestrial Resonance, Climate Shifts, & the Chandler Wobble Phase Reversal.pdf /media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF >> Yes, seems like problems after moving directories around. [Randell Mills, Social media] were moved into mini-projects subDir for both webPage[Rawe, Site] >> WAIT to setup rsync to update directorie's contents >> Do manually BEFORE setting up rsync!! >> I must correct webPagesRawe with links 05-----05 More problems : mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500 >> for some later date - maybe monthly rsyn-to-web 05-----05 Olde code 02--02 from link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' # subStr_in_str 'http://www.billhowell.ca/' 'HTTP/1.1 400 Bad Request !!! 
http://www.billhowell.ca/Bill Howells videos/150525 Icebreaker unchained - We should have lost World War II/' # cmd := link 'grep "Chandler Wobble" "' p_internURLs '"' qnial> host cmd /media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 Solar-Terrestrial Resonance, Climate Shifts, & the Chandler Wobble Phase Reversal.pdf qnial> cmd := link 'grep "" "' p_internURLs '"' >> OK, lists all files qnial> cmd := link 'grep "#" "' p_internURLs '"' qnial> host cmd ?Invalid argument qnial> cmd := link 'grep "\#" "' p_internURLs '"' grep "\#" "/media/bill/ramdisk/urls intern list.txt" qnial> host cmd ?Invalid argument 08********08 21Nov2020 22:07 02--02 intern fails : Howell_photo_Nov05_light.jpg /media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 Solar-Terrestrial Resonance, Climate Shifts, & the Chandler Wobble Phase Reversal.pdf /media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF /media/bill/HOWELL_BASE/Website/Personal/121225 Howells recent changes - career and location.pdf /media/bill/HOWELL_BASE/Website/Projects - mini/Howells projects.ods /media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/ /media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/Howell - review of Holverstott 2016 Hydrino energy.pdf /media/bill/HOWELL_BASE/Website/Social media/Howell 110902 - Systems design issues for social media.pdf /media/bill/HOWELL_BASE/Website/Social media/Howell 111006 - Semantics beyond search.pdf /media/bill/HOWELL_BASE/Website/Social media/Howell 111230 - Social graphs, social sets, and social media.pdf 02--02 errors : Ignoring internal page links, eg : !!linkError!!corona virus/#Corona virus models and Conference Guide links, eg : !!linkError!!Neural nets/Conference guides/Author guide website/Author guide.html >> I removed these "Conference guides" paths from p_allFileList, but they will be regenerated if I don't change : Note that many use small caps, not title case : /media/bill/SWAPPER/Website - raw/Software programming & code/bin/conference guides - format html.sh /media/bill/SWAPPER/Website - raw/Software programming & code/bin/conference guides - remove emails.sh >> most important is to modify webSite_extract_pathsSubDirsFnames so they won't be included. >> STOP!! wrong - these are shell scripts, numbskull. look at later These are left : !!linkError!! !!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/ !!linkError!!Calendar.odp !!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf !!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf !!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html !!linkError!!Cool emails/ !!linkError!!diversity_member/people.odt !!linkError!!Howell - Are we ready for global cooling.pdf !!linkError!!influenza/Howell - influenza virus.html#Astronomical correlates of pandemics !!linkError!!influenza/Howell - influenza virus.html#Howell - USA influenza [cases, deaths] alongside [sunspots, Kp index, zero Kp bins] !!linkError!!influenza/Howell - influenza virus.html#Influenza pandemics - Tapping, Mathias, and Surkan (TMS) theory !!linkError!!influenza/Howell - influenza virus.html#Is the effectiveness of vaccines over-rated? 
!!linkError!!influenza/Howell - influenza virus.html#Quite apart from the issue of the benefits of vaccines
!!linkError!!influenza/Howell - influenza virus.html#Rebuttals of the [solar, disease] correlation
!!linkError!!International Neural Network Society.JPG
!!linkError!!LibreCalc bank account macro system.txt
!!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
!!linkError!!National Post.jpg
!!linkError!!Nial Systems Limited.JPG
!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
!!linkError!!Puetz greatest of cycles/
!!linkError!!Software programming & code/
!!linkError!!Software programming & code/bin/bin - Howell's web-page.html
!!linkError!!Software programming & code/Qnial/MY_NDFS/???
!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
!!linkError!!Software programming & code/Qnial/MY_NDFS/MindCode/
!!linkError!!Software programming & code/Qnial/MY_NDFS/video production/
!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
!!linkError!!Software programming & code/System_maintenance/
!!linkError!!S&P 500 Shiller-forward PE versus 10y US Treasury bond rates.jpg
>> some had been corrected BEFORE I reverted to the older '201117 17h00m21s backups' webPages
>> I will have to repeat that work
02--02 extern fails :
3/4 are a Financial Post series on climate deniers, eg :
HTTP/1.1 404 Not Found !!! http://www.canada.com/nationalpost/financialpost/comment/story.html?id=2271ac23-6895-4789-9da0-6b28968b8d15
I should remove all and simply link to Lawrence Solomon's book
5 are amazon.com links - leave for now
08********08 21Nov2020
05-----05 21:37
qnial> webSite_doAll
>> no help for [extern, intern] links, and I should have known that
So why aren't they being [save, count]ed ? Just run : webURLs_extract
>> ramdisk has good files : urls intern list.txt urls extern list.txt
So how did I [lose, delete] them?
urls_check : EACH path_delete p_list p_bad p_OKK ;
>> brilliant, delete the p_lists before they can be used (idiot!)
urls_check Change :
+.....+
EACH path_delete p_list p_bad p_OKK ;
+.....+
To :
+.....+
EACH path_delete p_bad p_OKK ;
+.....+
qnial> webSite_link_counts
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 21Nov2020
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+--+------------+
|56|errors list |
+--+------------+
|44|extern fails|
+--+------------+
|0 |howell list |
+--+------------+
|10|intern fails|
+--+------------+
Unknowns - I haven't written code to really show [OK, fail] :
+--+-----------+
|2 |mailto list|
+--+-----------+
|80|pgPosn list|
+--+-----------+
OKs - these links have been shown to work :
+---+---------+
|88 |extern OK|
+---+---------+
|238|intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+---+-------------+
|110|failed links |
+---+-------------+
| 82|unknown links|
+---+-------------+
|326|OK links     |
+---+-------------+
|518|total        |
+---+-------------+
AWESOME!!! Stupid mistakes cost me 3 days. Getting tired.
[errors, intern fails] are key - now it is FAR EASIER to start work with confidence!!!
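05-----05 Side note - quick cross-check of the summary counts
# The [fail, OK] counts above can be sanity-checked straight from the list files themselves.
# A bash sketch only - assumes the webWork fnames shown above, with one link per line, so wc -l is the count :
d_webWork='/media/bill/SWAPPER/Website - raw/webWork files/'
for f in 'urls intern fails.txt' 'urls intern OK.txt' 'urls extern fails.txt' 'urls extern OK.txt' ; do
	printf '%6d  %s\n' "$(wc -l < "$d_webWork$f")" "$f"
done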
05-----05 17:53 qnial> webSite_doAll 02--02 no counts for intern or extern webPageSite : OK index.html Maybe the problem is in webURLs_extract : EACH path_delete p_errorsURLs p_externURLs p_howellURLs p_internURLs p_mailtoURLs p_pgPosnURLs p_temp1 ; >> comment and re-run >> this did NOT fix the problem!?? webURLs_extract Change : +.....+ ELSEIF (in `# linker) THEN writefile fpos linker ; +.....+ To : +.....+ ELSEIF (= `# (first linker)) THEN writefile fpos linker ; +.....+ >> all intern links will have '[#=; backtrack ;=#]' >> The new approach won't capture [extern, intern] links to positions within the webPage >> That can be done by later processing of !!linkError!! 02--02 no Past & future worlds.html backtrack fails in some bodyLinks, ' future worlds.html @pageTop The following were problems earlier : mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page subMenu Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases) How many fnames have `&? qnial> ('&' EACHRIGHT subStr_in_str allFnamesSortedByFname) sublist allFnamesSortedByFname >> HUGE number of aa files qnial> EACH write (('&' EACHRIGHT subStr_in_str htmlFnamesSortedByFname) sublist htmlPathsSortedByFname) qnial> EACH write (('&' EACHRIGHT subStr_in_str htmlFnamesSortedByFname) sublist htmlPathsSortedByFname) /media/bill/SWAPPER/Website - raw/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html >> OK /media/bill/SWAPPER/Website - raw/Solar modeling and forecasting/_Solar modeling & forecasting.html >> OK /media/bill/SWAPPER/Website - raw/economics, markets/Long term market indexes & PPI 0582.html >> ' PPI 0582.html /media/bill/SWAPPER/Website - raw/page Publications & reports.html >> ' reports.html /media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html >> ' future worlds.html /media/bill/SWAPPER/Website - raw/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html >> ' P 500 Shiller-forward PE versus 10y Treasury bond rates.html This really looks like a problem with the amperhsand!! But not all fail? 05-----05 Later look at more problems : subMenu Projects failed links - Randell Mills, S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500 I didn't recheck Projects subMenu 24************************24 #] 20Nov2020 subDirs sometimes work, sometimes don't! This is new. 05-----05 All failures were at 3 subDirs down test#1 qnial> a := '170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html' qnial> (a EACHRIGHT subStr_in_str htmlPathsSortedByPath) sublist htmlPathsSortedByPath +------------------------------------------------------------------------------------------------------------- |/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & +------------------------------------------------------------------------------------------------------------- ------------------+ future worlds.html| ------------------+ >> OK, plus the final web-page works >> So it is internalLinks_return_relativePath_test t_standards that are wrong? >> nyet, although I did make changes so results conform to the correct backtracks subDirs are being truncated by dropping the front? 
I must first make it work in : internalLinks_return_relativePath_test internalLinks_return_relativePath : Change : +.....+ ELSEIF (NOT isfault (i_fname := find_Howell (fname := (1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ; +.....+ >> This looks suspicious? To : +.....+ ELSEIF (NOT isfault (i_fname := find_Howell (1 + (last findAll_Howell `/ lineList@midIndx)) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ; +.....+ >> Why did I drop lineList@midIndx? Might be a leftover? Re-[loaddefs, internalLinks_return_relativePath_test] : >> Now #2 also fails (!!linkError!!) >> problem rains - it seems that the step above is being ignored To : +.....+ ELSEIF (NOT isfault (i_fname := find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ; +.....+ qnial> EACH (gage shape) allFnamesSortedByFname allPathsSortedByFname 2799 2799 qnial> find_Howell ((1 + (last findAll_Howell `/ '/Past & future worlds.html' )) drop '/Past & future worlds.html') allFnamesSortedByFname 1953 This is a big problem, probably with a simple solution. setup.ndf : #] array_findAll_subArray IS OP subArray array - addr of ALL subArray in array, error if not found array_findAll_subArray IS OP subArray array_to_search { LOCAL i_ins ; i_ins := subArray EACHRIGHT = array_to_search ; i_adds := i_ins sublist (tell gage shape array_to_search) ; IF (isfault i_adds) THEN fault '?array_findAll_subArray : item not found' ELSE i_adds ENDIF } findAll_Howell IS array_findAll_subArray Change : +.....+ i_adds := i_ins sublist (tell gage shape array_to_search) ; +.....+ To : +.....+ i_adds := i_ins sublist (tell (gage shape array_to_search)) ; +.....+ [bye, qnial,loaddefs] qnial> find_Howell '/Past & future worlds.html' allFnamesSortedByFname ?find_Howell : item not found qnial> find_Howell 'Past & future worlds.html' allFnamesSortedByFname 1953 >> As always expected qnial> (1953 pick allPathsSortedByFname) str_remove_subStr d_webRaw Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html >> OK, as it should be I still can't see a problem qnial> find_Howell (solitary 'Past & future worlds.html') allFnamesSortedByFname ?find_Howell : item not found >> Ouch! I should have seen this qnial> find_Howell (first solitary 'Past & future worlds.html') allFnamesSortedByFname 1953 >> OK qnial> 1953 pick allPathsSortedByFname /media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html >> OK flag_break it -->[nextv] link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ../../../../economics, markets/SP500/multi-fractal/1872-2020 SP500 index semi-log detrended 1871-1926 & 1926-2020, TradingView.png >> OK, this is perfect, but this is NOT the first test!!!??? IF (= '
  • ' line) THEN break ; ENDIF ;
internalLinks_return_relativePath Change :
+.....+
liner := line str_remove_subStr '[#=; backtrack ;=#]' ;
+.....+
To :
+.....+
liner := liner str_remove_subStr '[#=; backtrack ;=#]' ;
+.....+
>> Oh boy - I must also correct the t_standards, which depend on the backtrack provided, and not a calculated one
>> Nasty little bug!! hard to see...
# internalLinks_return_relativePath_test example #2 : FAILED - result does NOT match standard
t_input, t_standard, t_result =
+------------+--------+-+-------------------------------------------------------------------------------------------------------------------------+
|../../../../||
+------------+--------+-+-------------------------------------------------------------------------------------------------------------------------+
  • ........
#] 30Oct2020 Simple check of internal path to fname-only link.
This is the remaining test error, assuming that all t_standards are correct, which may not be so - the t_standards have repeatedly been wrong in the past.
>> Nope - the calculation is correct, t_standard was wrong.
I added :
i_test := i_test + 1 ;
backtrack := '../../../../' ;
t_name := link '# internalLinks_return_relativePath_test example #' (string i_test) ;
t_input := backtrack ' WIDTH=90% NAME="1872-2020 SP500 index semi-log detrended"
    ' ; t_standard := ' WIDTH=90% NAME="1872-2020 SP500 index semi-log detrended"
    ' ; t_result := internalLinks_return_relativePath t_input ; test_comment t_name t_input t_standard t_result ; EACH write '........' '30Oct2020 Simple check of "Bill Howells videos >> should be OK? check current index.html : >> Yes, it works webPage has [' reports.html, etc] - this problem is back 02--02 Re-do previous corrections : #] from 'fileops.ndf' : pathList_backupTo_dir htmlPathsSortedByPath (link d_webRaw 'z_Archive/') >> OK #] from 'fileops.ndf' : str_replaceIn_pathList l d_webRaw '!!linkError!!' '' htmlPathsSortedByPath >> OK? 02--02 from 17Nov2020 Redo 6. : problem with extral junk line at top of webPage >> I suspect a recurring problem with the ampersand `& in the fname? 'page Software programming.html' >> OK already 'Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html' Change : +.....+ _Charvatova - solar inertial motion [#!: writefile fout '<TITLE> path ' activity.html +.....+ To : +.....+ _Charvatova - solar inertial motion +.....+ 'Solar modeling and forecasting/_Solar modeling & forecasting.html' Change : +.....+ _Solar modeling [#!: writefile fout '<TITLE> path ' forecasting.html +.....+ To : +.....+ Solar modeling and forecasting.html +.....+ Later look at more problems : mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page subMenu Projects failed links - Randell Mills, S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500 Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases) 05-----05 Red herring - depther : qnial> d_webRaw /media/bill/SWAPPER/Website - raw/ qnial> d_webSite /media/bill/HOWELL_BASE/Website/ >> so for my current setup, depther = 0 for for all "webRoot" files But : qnial> null reshape 5 5 qnial> null reshape (solitary '../') o---+ |../| +---+ qnial> 0 reshape (solitary '../') So the trick may be to convert a null to 0 - but better yet to set backtrack := './'? webPageSite_update, Change : +.....+ depther_global := 0 ; IF (OR ('Menu' 'fin Head' 'fin Footer' 'fin footer' EACHLEFT subStr_in_str webPage)) THEN depther := depther_global ; ELSE depther := (gage shape (`/ findAll_Howell webPage )) - (gage shape (`/ findAll_Howell d_webRaw)); ENDIF ; backtrack := link (depther reshape (solitary '../')) ; +.....+ To : +.....+ depther := (gage shape (`/ findAll_Howell webPage )) - (gage shape (`/ findAll_Howell d_webRaw)); IF (= 0 depther) THEN backtrack := '' ; ELSE backtrack := link (depther reshape (solitary '../')) ; ENDIF +.....+ Or : +.....+ depther := (gage shape (`/ findAll_Howell webPage )) - (gage shape (`/ findAll_Howell d_webRaw)); backtrack := link (depther reshape (solitary '../')) ; +.....+ >> I don't think that the 'Or:' will work, but I can learn from the results >> Nah - menuHeadFoots are special - they always need to start from root? For [GNU, Creative Commons]? 
>> Keep the same for now 05-----05 Olde Code 02--02 from fileops.ndf : # find "$d_Qndfs" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'str_replaceIn_path' "FILE" >> jillions in fileops.ndf and it's [backups, z_Archives] # find "$d_Qndfs" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'replaceStringIn_path' "FILE" >> many z_Archive, backups] plus : /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:629: replaceStringIn_path IS str_replaceIn_path # find "$d_Qndfs" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'str_executeEmbeds' "FILE" >> many hits # sed - selection of separator character based on search-replace text # (`/ `# `! `| `@ EACHLEFT subStr_in_str (link strOld strNew)) sublist (`/ `# `! `| `@) ) # tests # $ strOld='pinn_writeExecute_pout' # $ strNew='str_executeEmbeds' # $ path='/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html' # $ cat "$path" | sed "s#$strOld#$strNew#" # qnial> str_replaceSubStrIn_path o 'pinn_writeExecute_pout' 'str_executeEmbeds' (link d_webRaw 'Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html') # EACHRIGHT str_replaceIn_path EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pathList) # find "$d_webRaw" -maxdepth 3 -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number ':&file-insert &:' "FILE" | sed 's#:.*##' | sort -u # str_replaceIn_pathList d_webRaw 'pinn_writeExecute_pout' 'str_executeEmbeds' htmlPathsSortedByFname # newly created operator # str_changeIn_webRaw '!!linkError!!' '[#=; backtrack ;=#]' >> this ran very well, no error outputs, and very quickly >> now to check a couple of files : /media/bill/SWAPPER/Website - raw/page projects.html:62:
  • >> OK /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:99: >> This was NOT changed!!?? Maybe I had already changed the file above? >> !!linkError!!S&P 500 Shiller-forward PE versus 10y US Treasury bond rates.jpg not changed in 'S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html' Why aren't the changes "holding"? # I already had an operator - adapt it # str_replaceIn_pathList d_webRaw '!!linkError!!' '[#=; backtrack ;=#]' pathList # tests # check EACHALL : # a b c d := 1 2 3 4 # EACH link (((solitary (1 2 3)) cart (tell 10))) # qnial> str_replaceIn_dir l ??? # EACHRIGHT pass EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pathList) # olde code result := EACHRIGHT pass EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pList))) ; % result := EACHRIGHT str_replaceIn_path EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pList))) ; 24************************24 #] 20Nov2020 from yesterday : I have to restore last good version of d_webRaw Double-shit - I stupidly dated the files. NOT easy to restore!!! Change path_backupDatedTo_dir To path_backupTo_dir Last set without dates : in '/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups' just drop ~17 chars - do a few single file renames backupDir_restore_dated optr create dirBackupDated_restoreTo_paths use link d_webRaw 'z_Archive/201117 17h00m21s backups/' Problematic : qnial> dirBackupDated_restoreTo_paths (link d_webRaw 'z_Archive/201119 20h02m56s backups/') htmlPathsSortedByPath cp -p "/media/bill/SWAPPER/Website - raw/z_Archive/201119 20h02m56s backups/201119 20h02m56s 0_Big Data, Deep Learning, and Safety.html" "/media/bill/SWAPPER/Website - raw/Bill Howells videos/160901 Big Data, Deep Learning, and Safety/0_Big Data, Deep Learning, and Safety.html" >> no - this is OK! Even though the cp look legitimate, copies don't occur, and there are no error outputs. Why? Long day - take a break. >Eureka! Of cou the file dates don't change - they are preserved!! OK - re-try solution to problem : qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' >>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf >>>>>> loading start : webSite header.ndf <<<<<< loading ended : webSite header.ndf ?expecting end of block: PATH_BACKUPDATEDTO_DIR D_HTMLBACKUP ; <***> P_LIST P_BAD P_OKK ?undefined identifier: ; EACH URLS_CHECK <***> 'intern' 'extern' ; <<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf errors found: 2 >> I just can't see how this occurs. ?expecting end of block: PATH_BACKUPDATEDTO_DIR D_HTMLBACKUP ; <***> P_LIST P_BAD P_OKK comment out & re-try, as the problem is probably rooted elsewhere. 02--02 loading urls_check ?expecting end of block: ; 02--02 >> up, migrating glitch Perhaps an undefined variable? 
chr_apos balance no curly brain optr From link "$d_Qroot""help - [develop, debug, error list, etc]/0_QNial error list.txt" : #] ?expecting end of block: ; - 14Jan2020 I checked for missing ';', unbalanced ' - no workee >> I had defined an operator with arguments, but it wasn't supposed to have them (IS, not IS OP) I commented out several expressions : 02--02 %NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ; % ; % check each link, save [OK,fail]s ; %p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ; % backup files ; %p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ; %p_list p_bad p_OKK EACHLEFT path_delete ; 02--02 >>OK, now loads when it shouldn't because loaren't defined!?!?! Now try : 02--02 NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ; % ; % check each link, save [OK,fail]s ; %p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ; % backup files ; %p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ; %p_list p_bad p_OKK EACHLEFT path_delete ; 02--02 >> still holding ... 02--02 NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ; % ; % check each link, save [OK,fail]s ; p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ; % backup files ; p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ; %p_list p_bad p_OKK EACHLEFT path_delete ; 02--02 >> still holding ... 02--02 NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ; % ; % check each link, save [OK,fail]s ; p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ; % backup files ; p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ; p_list p_bad p_OKK EACHLEFT path_delete ; 02--02 Jackass! shuld be : EACH path_delete p_list p_bad p_OKK ; >> Now everything loaddefs 05-----05 Now try webSite_doAll Let's see if the new coding averts another disaster, eg webPageSite_update : IF flag_backup THEN IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup ; ELSE webPage path_backupDatedTo_dir d_htmlBackup ; ENDIF ; ENDIF ; >> Wait a minute, I don't need to back up webPageSite, as it is entirely based on webPageRaw. Only webPageRaw needs backup. Keep it in for now, just in case. webSite_doAll : >> urls don't work, I'll have to che updates later. 
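02--02 For reference - what the flag_backup branch above amounts to in bash terms.
# A sketch only (the real work is done by the QNial optrs path_backupTo_dir and path_backupDatedTo_dir);
# the two paths are just examples, and the 17-character 'YYMMDD HHhMMmSSs ' prefix matches the dated backups used in these notes :
p_page='/media/bill/SWAPPER/Website - raw/page Howell - blog.html'
d_bkup='/media/bill/SWAPPER/Website - raw/z_Archive/201120 backups/'
# plain backup - same fname, keeps only the latest copy :
cp -p "$p_page" "$d_bkup"
# dated backup - prefix the fname, so a restore only has to drop the first 17 characters :
cp -p "$p_page" "$d_bkup$(date '+%y%m%d %Hh%Mm%Ss') $(basename "$p_page")"
# note that cp -p preserves the original file dates - which is why restored files don't show up as newly modified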
02--02
/media/bill/ramdisk/urls intern list.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt
?urls_check file unknown error : p_list
/media/bill/ramdisk/urls extern list.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt
?urls_check file unknown error : p_list
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt': No such file or directory
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 20Nov2020
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+-+------------+
|0|errors list |
+-+------------+
| |extern fails|
+-+------------+
|0|howell list |
+-+------------+
| |intern fails|
+-+------------+
Unknowns - I haven't written code to really show [OK, fail] :
+--+-----------+
|2 |mailto list|
+--+-----------+
|80|pgPosn list|
+--+-----------+
OKs - these links have been shown to work :
++---------+
||extern OK|
++---------+
||intern OK|
++---------+
[fail, unknown, OK, total] counts :
+--+-------------+
| 0|failed links |
+--+-------------+
|82|unknown links|
+--+-------------+
| 0|OK links     |
+--+-------------+
|82|total        |
+--+-------------+
02--02 Fix : ?urls_check file unknown error : p_list
NUTS!!! more important - backups of webAll[Raw, Site] are dated still!?!?
OOPS!! I haven't even written path_backupTo_dir. delete this in fileops.ndf : path_backupTo_dir IS path_backupDatedTo_dir
>> I created path_backupTo_dir
Back to fixing : ?urls_check file unknown error : p_list
No internal [> check d_webRaw index.html - no [>
example : ?dirBackup_restoreTo_paths : missing backup fname : webSite [menuHeadFoot, link, TableOfContents, link] tools.
>> it doesn't have dates, >> First, I have to re-check code, not as advanced as dirBackupDated_restoreTo_paths qnial> dirBackup_restoreTo_paths o (link d_webRaw 'z_Archive/201117 17h00m21s backups/') htmlPathsSortedByPath ?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s home.html ?dirBackup_restoreTo_paths : missing backup fname : test- home.html ?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s Canadian Solar Workshop 2006 home page.html ?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s CSWProgram.html ?dirBackup_restoreTo_paths : missing backup fname : test- Canadian Solar Workshop 2006 home page.html ?dirBackup_restoreTo_paths : missing backup fname : test- CSWProgram.html ?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s Authors Guide BLOG home.html ?dirBackup_restoreTo_paths : missing backup fname : test- Authors Guide BLOG home.html ?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m25s Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?dirBackup_restoreTo_paths : missing backup fname : test- email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?dirBackup_restoreTo_paths : missing backup fname : test- Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?dirBackup_restoreTo_paths : missing backup fname : webSite [menuHeadFoot, link, TableOfContents, link] tools.html >> OK - these files shouldn't be in that dir qnial> webSite_doAll >> same er 02--02 /media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt ?urls_check file unknown error : p_list ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201120 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201120 backups/ rm: cannot remove '/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt': No such file or directory rm: cannot remove '/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt': No such file or directory /media/bill/ramdisk/urls extern list.txt /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt ?urls_check file unknown error : p_list wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt': No such file or directory wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt': No such file or directory wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt': No such file or directory wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt': No such file or directory 02--02 These look good : 'webSite urlList.txt' 'index.html' Might be a fname problem for p_list? Ah hah! d_webSite does not have relative links webPageSite_update backtrack := link (depther reshape (solitary '../')) ; >> why has this stopped working? 
IF flag_backup THEN IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup ; ELSE webPage path_backupDatedTo_dir d_htmlBackup ; ENDIF ; ENDIF ; >> This seems to be doing the right thing THEN line := internalLinks_return_relativePath backtrack ' internalLinks_return_relativePath_test >> [1,3,5,6] out of 12 failed test# 1
  • >> no subDir test# 3
  • >> no subDir test# 5
  • gnuplot.sh is the tiny bash script used to select gnuplot scripts. My other bash scripts can be found here.
  • gnuplot.sh is the tiny bash script used to select gnuplot scripts. My other bash scripts can be found here. >> incomplete subDir test# 6
  • QNial programming language - Quenns University Nested Interactive Array Language (QNial) is my top prefered programming language for modestly complex to insane programming challenges, along with at least 3 other people in the world. Bash scripts make a great companion to QNial. semi-log formula.ndf is the tiny "program" used to set up the semi-log line fits. More generally : here are many of my QNial programs. Subdirectories provide programs for various projects etc.
  • QNial programming language - Quenns University Nested Interactive Array Language (QNial) is my top prefered programming language for modestly complex to insane programming challenges, along with at least 3 other people in the world. Bash scripts make a great companion to QNial. semi-log formula.ndf is the tiny "program" used to set up the semi-log line fits. More generally : here are many of my QNial programs. Subdirectories provide programs for various projects etc. >> This has the right subDirs subDirs sometimes work, sometimes don't! This is new. All failures were at 3 subDirs down Quit for the night. 05-----05 Oddball stuff z_Old a file (conflicting vhanges) $ diff "$d_Qtest""Website updates- tests.ndf" "$d_Qtest""Website updates- tests modfied.ndf" IF flag_debug THEN write 'loading dirBackupDated_restoreTo_paths' ; ENDIF ; #] dirBackup_restoreTo_paths IS OP flag_backupDated d_backup pathList - restore paths listed in a backup (FLAT) dir # 20Nov2020 initial,based on dirBackup_restoreTo_paths # pathList fnames - may be a partial list of fnames in d_backup # error output if an fname in pathList is not in d_backup # webSite work : Most often, pathList will be webPageList, not p_htmlFileList IF flag_break THEN BREAK ; ENDIF ; dirBackupDated_restoreTo_paths IS OP d_backup pathList { LOCAL cmd path pathList pinn pinnList pinnDropList ; IF (NOT path_exists ("w d_backup)) THEN EACH write '?path_backupDatedTo_dir dir unknown error, d_backup : ' d_backup '' ; ELSE % this returns the fname only ; cmd := link 'ls -1 "' d_backup '"' ; pinnList := host_result cmd ; fnameDropList := 17 EACHRIGHT drop pinnList ; fnamePathList := EACH path_extract_fname pathList ; FOR i WITH (tell (gage shape fnamePathList)) DO i_fnameDrop := find_Howell fnamePathList@i fnameDropList ; IF (isfault i_fnameDrop) THEN write link '?dirBackupDated_restoreTo_paths : missing backup fname : ' fnamePathList@i ; ELSE % write link 'cp -p "' (link d_backup pinnList@i_fnameDrop) '" "' pathList@i '" ' ; host link 'cp -p "' (link d_backup pinnList@i_fnameDrop) '" "' pathList@i '" ' ; ENDIF ; ENDFOR ; ENDIF ; } # tests with write rather than host : # dirBackupDated_restoreTo_paths (link d_webRaw 'z_Archive/201119 20h02m56s backups/') htmlPathsSortedByPath >> works great! # olde code cmd := link 'ls -1 "' d_backup '"' ; pinnList := host_result cmd ; FOR path WITH pathList DO write path ; fname := path_extract_fname path ; IF (path_exists (pinn := link d_backup fname)) THEN host link 'cp -p "' pinn '" "' path '" ' ; ELSE write link '?dirBackup_restoreTo_paths : missing backup fname : ' fname ; 24************************24 #] 19Nov2020 fix 'urls errors list.txt' # track down specific !!linkError!! 
# host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Civilisations and sun/Howell \- Mega-Life\, Mega-Death and the Sun II\, towards a quasi\-predictive model of the rise and fall of civilisations\.pdf" "FILE" | grep --invert-match "z_Old|z_Archive" ' 02--02 /media/bill/SWAPPER/Website - raw/Galactic rays and evolution/_Galactic rays and evolution - life, the mind, civilisation, economics, financial markets.html:69: /media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups/_Climate and sun.html:99: /media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups/_Galactic rays and evolution - life, the mind, civilisation, economics, financial markets.html:69: /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:99: 02--02 >> why the z_Archive?? try without -f # host link 'find "' d_webRaw '" -maxdepth 4 -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Civilisations and sun/Howell \- Mega-Life\, Mega-Death and the Sun II\, towards a quasi\-predictive model of the rise and fall of civilisations\.pdf" "FILE" | grep --invert-match "z_Old|z_Archive" ' >> Yikes! didn't do the trick. II to separate greps for [z_old, z_Archive] >> change fname to Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations 070128.pdf Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf !!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html Oops - this is a link : _Kyoto Premise - the scientists aren't wearing any clothes.html~ >> I have no idea of where this is!? >> In any case, noerror in file itself, try : # host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Climate \- Kyoto Premise fraud/_Kyoto Premise \- the scientists aren' chr_apo 't wearing any clothes\.html" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" ' /media/bill/SWAPPER/Website - raw/page projects.html:62:
  • Changed (got rid of chr_apo) Also Change : +.....+ !!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc +.....+ To : +.....+ [#=; backtrack ;=#]Lucas/ +.....+ !!linkError!!Howell - Are we ready for global cooling.pdf # host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Howell \- Are we ready for global cooling\.pdf" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" ' /media/bill/SWAPPER/Website - raw/page Publications & reports.html:62:
  • Bill Howell "Are we ready for global cooling?" - A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)

    Change :
+.....+
!!linkError!!Howell - Are we ready for global cooling.pdf
+.....+
To :
+.....+
[#=; backtrack ;=#]Howell - Are we ready for global cooling.pdf
+.....+
find 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allFnamesSortedByFname
qnial> find_Howell 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allPathsSortedByFname
?tokens left: FIND 'Howell - Are we ready for global cooling.pdf'
qnial> fnd `# 'my name # is Sue'
?undefined identifier: FND <***> `# 'my name # is Sue'
qnial> find `# 'my name # is Sue'
8
>> what's wrong with the expression?
qnial> (find_Howell 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allPathsSortedByFname
?address
>> It isn't in allPathsSortedByFname
>> Did I even write this? problematic, leave for now & add to ToDos
!!linkError!!International Neural Network Society.JPG
qnial> (find_Howell 'International Neural Network Society.JPG' allFnamesSortedByFname) pick allPathsSortedByFname
?address
!!linkError!!Neural nets/Conference guides/Author guide website/IEEE electronic Copyright form.html
>> This and others should be OK
do a mass remove of !!linkError!!
host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | sed "s/!!linkError!!//g" '
>> NUTS!! screen output only
>> NUTS! what a mess! it will re-generate the same problem!! And I have to change the original.
In the replacement text: '&\/\n' ; - Should have been :
host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | sed "s/!!linkError!!/[#=; backtrack ;=#]/g" '
I need to write a QNial optr to do this via p_temp
20:02 OK - now webSiteURLs_doAll
>> YIKES! huge delay to check urls, it looks like a failure of :
extern [fails, OK] - no fails (eg 404 code)? impossible as I didn't fix them!??
intern [fails = 1, OK = 0] impossible as intern list had [19, 343] at last successful run. intern list only 86 bytes, 1 line
mailto, pgPosn weren't touched!
>> What went wrong - check a few files! Try :
qnial> urls_check 'intern'
nah, interns in webRaw don't have '[#=; backtrack ;=#]'!??? what a mess!!
>> SHIT - I goofed up with `;
Change :
+.....+
# str_changeIn_webRaw '!!linkError!!' '[#=; backtrack ;=#]'
+.....+
To :
+.....+
# str_changeIn_webRaw '!!linkError!!' '[\#=; backtrack ;=\#]'
+.....+
I have to restore the last good version of d_webRaw
Double-shit - I stupidly dated the files - not easy to restore!!!
Change path_backupDatedTo_dir To path_backupTo_dir
Last set without dates : in '/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups'
24************************24
#] 19Nov2020 fix 'urls errors list.txt'
Examples :
!!linkError!!corona virus/#Corona virus models
>> put every link with `# into 'urls pgPosn list.txt'
>> It should already be doing that!
: ELSEIF (subStr_in_str '#' linker) THEN writefile fpos linker ; >> changed to : ELSEIF (in `# linker) THEN writefile fpos linker ; !!linkError!!Neural nets/Conference guides/ >> subDir problem should now be OK !!linkError!!Software programming & code/Qnial/MY_NDFS/MindCode/ >> link is wrong : moved dir, or [changed, wrong] fname !!linkError!!International Neural Network Society.JPG !!linkError!!LibreCalc bank account macro system.txt !!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf !!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf Some we [lost, not created] : !!linkError!!LibreCalc bank account macro system.txt Some [files, dir]s will NOT have been rsync'd : Qnial, etc etc >finish an rsync program that I had started I need [find-grep-sed]s to [locate, change] affected files, en mass and one by one as I work on them. 1. remove ALL !!linkError!! and re-run, that should reduce list by 20-40% 2. [find, fix] individual errors 24************************24 #] 19Nov2020 fix urls smmary table # a := tell 5 # b := reverse tell 10 qnial> (a (EACHLEFT EACHRIGHT subStr_in_str b)) sublist b ?first arg of sublist not boolean qnial> (a EACHLEFT EACHRIGHT subStr_in_str b) sublist b ?first arg of sublist not boolean qnial> (a EACHALL subStr_in_str b) sublist b ?first arg of sublist not boolean qnial> (a ITERATE subStr_in_str b) ?first arg of sublist not boolean qnial> EACH = (cart a b) oooooooool oooooooolo oooooooloo oooooolooo oooooloooo qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b +-+-+-+-+-+ |0|1|2|3|4| +-+-+-+-+-+ # a := 'jim' 'nancy' 'john' 'betty' 'harry' # b := 'fred' 'jim' 'harry' 'john' 'dan' 'nancy' 'betty' 'floyd' qnial> (cols (EACH = (cart a b))) EACHLEFT sublist b ++------------05-----05-------------++-----------+------------+05-----05 ||+----05-----05|+---+|05-----05-----+||+---05-----05|+----05-----05||+---+| |||fred|nancy|||dan|||harry|floyd||||jim|betty|||john|harry||||dan|| ||+----05-----05|+---+|05-----05-----+||+---05-----05|+----05-----05||+---+| ++------------05-----05-------------++-----------+------------+05-----05 qnial> (cols (EACH = (cart a b))) EACHLEFT sublist a +05-----05-------+------++-------+-------++-------+ ||+---+|05-----05|+----+||05-----05|05-----05||05-----05| |||jim|||harry|||john||||nancy|||betty||||harry|| ||+---+|05-----05|+----+||05-----05|05-----05||05-----05| +05-----05-------+------++-------+-------++-------+ qnial> (rows (EACH = (cart a b))) EACHLEFT sublist a +-------++-------++------+ |05-----05||05-----05||+----+| ||nancy||||betty||||john|| |05-----05||05-----05||+----+| +-------++-------++------+ qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b 05-----05-------+------+-------+-------------+ |+---+|05-----05|+----+|05-----05|05-----05-----+| ||jim|||nancy|||john|||betty|||harry|harry|| |+---+|05-----05|+----+|05-----05|05-----05-----+| 05-----05-------+------+-------+-------------+ # b := 'fred' 'jim' 'harry' 'john' 'dan' 'nancy' 'betty' 'floyd' 'eloise' qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b 05-----05-------+------+-------+-------+ |+---+|05-----05|+----+|05-----05|05-----05| ||jim|||nancy|||john|||betty|||harry|| |+---+|05-----05|+----+|05-----05|05-----05| 05-----05-------+------+-------+-------+ # b := 'fred' 'dan' 'betty' 'jim' 'harry' 'john' 'dan' 'nancy' 'betty' 'floyd' 'harry' 'eloise' qnial> (rows (EACH = (cart 
a b))) EACHLEFT sublist b 05-----05-------+------+-------------+-------------+ |+---+|05-----05|+----+|05-----05-----+|05-----05-----+| ||jim|||nancy|||john|||betty|betty|||harry|harry|| |+---+|05-----05|+----+|05-----05-----+|05-----05-----+| 05-----05-------+------+-------------+-------------+ qnial> (rows (EACH subStr_in_str (cart a b))) EACHLEFT sublist b 05-----05-------+------+-------------+-------------+ |+---+|05-----05|+----+|05-----05-----+|05-----05-----+| ||jim|||nancy|||john|||betty|betty|||harry|harry|| |+---+|05-----05|+----+|05-----05-----+|05-----05-----+| 05-----05-------+------+-------------+-------------+ # b := 'fred' 'dan1' 'betty1' 'jim' 'harry1' 'john' 'dan2' 'nancy' 'betty2' 'floyd' 'harry2' 'eloise' qnial> b := 'fred' 'dan1' 'betty1' 'jim' 'harry1' 'john' 'dan2' 'nancy' 'betty2' 'floyd' 'harry2' 'eloise' +----+----+------+---+------+----+----05-----05------05-----05------+------+ |fred|dan1|betty1|jim|harry1|john|dan2|nancy|betty2|floyd|harry2|eloise| +----+----+------+---+------+----+----05-----05------05-----05------+------+ qnial> (rows (EACH subStr_in_str (cart a b))) EACHLEFT sublist b 05-----05-------+------+---------------+---------------+ |+---+|05-----05|+----+|+------+------+|+------+------+| ||jim|||nancy|||john|||betty1|betty2|||harry1|harry2|| |+---+|05-----05|+----+|+------+------+|+------+------+| 05-----05-------+------+---------------+---------------+ 05-----05 olde code n_fails := sum (fails_i sublist counts) ; n_OKKs := sum (unkns_i sublist counts) ; n_unkns := sum (OKKs_i sublist counts) ; # find "$d_Qndfs" -maxdepth 3 -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "merge_2_1" "FILE" 24************************24 #] 18Nov2020 I adapted these. They work. #] webURLs_extract IS - extract all link urls from a website [external, internal, menu, pagePosn] #] urls_check IS OP linkType - create sublists of [internal,xternal] links classed as [fail, OK] #] check internal with path_exists '-f', externals with curl Backups gave errors (that's OK) 02--02 qnial> urls_check 'intern' ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ +-- qnial> urls_check 'extern' ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ 02--02 HTTP/1.1 404 Not Found - was classified as OK for external link I changed to : IF (OR ('200' '300' '301' '302' EACHLEFT in_string curlHdr)) I need to make an easy list of code explanations Adapt& run website_link_counts : qnial> see "MERGE_2_1 merge_2_1 IS 
OPERATION A { 0 catenate ( A @ 0 A @ 1 ) } qnial> website_link_counts ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/webSite linkType fnames.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ ?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/webSite linkType counts table.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/ ?f_html_reformat file unknown error : p_linkTypeL Ouput file : ?invalid left arg in split ?invalid left arg in split OK - the code mostly works, now to look at failures subDirs - are failing,perhaps because of '-f' option for p_exists? 05-----05 remove olde code from urls_check : allLinks := strList_readFrom_path p_webSiteURLlist ; % construct path for each link ; p_list p_clean p_sort := EACH execute ( EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_clean' '_sort') )) ) ; IF (NOT AND (EACHRIGHT file_exists ("r p_list) ("w p_clean))) THEN write fault link '?urls_check file unknown error : p_list or d_clean' ; ELSE flst := open p_list "r ; fcln := open p_clean "w ; write (2 2 reshape flst p_list fcln p_clean) ; WHILE (~= ??eof (line := readfile flst)) DO %write line ; % ; IF (OR (= 'howell' linkType) (= 'extern' linkType)) THEN % 08Oct2020 I can't think of any cleaning required - really just copy ; % in the case of 'howell', the source files will have to be corrected to be internal links only ; writefile fcln line ; % ; ELSE % extract text of line ; colons := findall `: line ; p_path := (first colons) take line ; IF (= 'pgPosn' linkType) THEN d_base := p_path ; ELSE d_base := ((last findall `/ p_path) + 1) take p_path ; ENDIF ; lineTxt := rest drop (second colons) line ; lineTxt := (find `" lineTxt) take lineTxt ; %write link d_base lineTxt ; % crawl up path_link directory reversals, at the same time truncating path_src ; WHILE (~= null lineTxt) DO IF (= './' (2 take lineTxt)) THEN lineTxt := 2 drop lineTxt ; %d_base := d_base ; ELSEIF (= '../' (3 take lineTxt)) THEN lineTxt := 3 drop lineTxt ; d_base := ((last front findall `/ d_base) + 1) take d_base ; ELSE EXIT 'urls_internal_check' ; ENDIF ; ENDWHILE ; writefile fcln (link d_base lineTxt) ; % ; ENDIF ; ENDWHILE ; EACH close flst fcln ; host link 'sort -u "' p_clean '" >"' p_sort '" ' ; host 'sleep 1s' ; ENDIF ; # olde code IF (= 'pgPosn' linkType) THEN p_link pagePosn := string_cut_by_char `# p_link ; IF (NOT file_exists '-f' p_link) THEN write fault link '?urls_check file unknown error : p_link' ; ELSE ???????????????????? % check if pagePosition is setup ; IF (= null (host_result link 'grep ' chr_apo '' chr_apo ' "' p_link '" ')) THEN writefile fbad p_link ; ELSE writefile fOKK p_link ; ENDIF ; ENDIF ; # old code writefile fout (first host_result link 'wc -l "' (link d_webRaw fname) '" ') ; % tbl_rows := link table_rows (host_result (link 'wc -l "' (link d_webRaw fname) '" '))) ; write table_rows ; % picture merge_2_1 tbl_linkCnt tbl_tots ; # p_htmlPathsSiteList htmlPathsSiteSortedByPath ; cmd := link 'grep -E -i --with-filename --line-number ">"' p_webSiteURLlist '" ' ; 24************************24 #] 17Nov2020 ALL webPages : +----+ 7. check all links with 'website urls.ndf' & grep !!linkError!! see link d_Qndfs 'website urls notes.txt' 24************************24 #] 17Nov2020 ALL webPages : 1. backups qnial> pathList_backupTo_dir htmlPathsSortedByPath d_htmlBackup >> OK 2. 
pathList_change_headFoot htmlPathsSortedByPath >> This was already done. >> Check - OK 3. Make sure that [data, optrs] are up-to-date qnial> lq_fileops qnial> loaddefs link d_Qtest 'Website updates- tests.ndf' 4. d_webRaw - update links, ensure [full, proper] subDir & backtrack >> first time use!!! qnial> webAllRawOrSite_update l "webPageRaw_update >> seems OK, no faults 5. d_webSite execute embeds [menu, Head, Foot, body], provide proper relative links >> first time use!!! qnial> webAllRawOrSite_update l "webPageSite_update 6. check 5 random webPageSites for results : check all main menu items check all subMenu items, but for common subMenus, just do once check footer - [GNU, Creative Commons] images, List of webPages selected : 'page Software programming.html' 'Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html' 'Neil Howell/_Neil Howell.html' 'Professional & Resume/_Resumes, work experience.html' 'Solar modeling and forecasting/_Solar modeling & forecasting.html' 'Solar modeling and forecasting/_Solar modeling & forecasting.html' : [Puetz, Randell Mills] links STILL fail!! : exasperating!, >> these were NOT updated!!! files last changed 27Oct2020!?!?!? d_webRaw update 14:41 ?? - this wasn't done either!!?? 201117 16h25m37s backups - three backups since 16:09, this is the most recent No use looking at the other webPageSites. For some reason, webAllRawOrSite_update doesn't work. What am I missing? Redo - this time do pathList_change_headFoot OK pathList_change_headFoot htmlPathsSortedByPath OK webAllRawOrSite_update l "webPageRaw_update NO webAllRawOrSite_update l "webPageSite_update >> so the last step above doesn't work. Why? >> The processing for both was far too fast. Something's wrong check for file updates beyond 16:50 : NO webAllRawOrSite_update l "webPageRaw_update >> however, there is a new timed backup dir >> So why doesn't webAllRawOrSite_update work? webAllRawOrSite_update Change : +.....+ apply optr_rawOrSite flag_backup webPage ; +.....+ To : +.....+ apply optr_rawOrSite (flag_backup webPage) ; +.....+ Re-try qnial> loaddefs link d_Qtest 'Website updates- tests.ndf' qnial> webAllRawOrSite_update l "webPageRaw_update >> OK, at least one webPage updated (several more also) >> runs much slower (only takes 5-10s?) qnial> webAllRawOrSite_update l "webPageSite_update >> OK, at least one webPage updated (several more also) >> runs much slower (only takes 5-10s?) So that was the problem. 05-----05 Redo 6. check 5 random webPageSites for results : check all main menu items check all subMenu items, but for common subMenus, just do once check footer - [GNU, Creative Commons] images, List of webPages selected : 'page Software programming.html' mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page, but goes to right page perhaps this is due to an apostrophe in the fname or whatever? 
mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do subMenu - QNial link doesn't go to my new web-page footer - dir list and [GNU, Creative Commons] images : OK 'Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html' mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do same as for previous webPage subMenu Charvatova - gives "' activity.html " at top of web-page (like earlier case) subMenu Projects failed links - Randell Mills, S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500 Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases) SAFIRE - electric sun experiment - add as well to [economics, markets] subMenu footer - dir list and [GNU, Creative Commons] images : OK 'Neil Howell/_Neil Howell.html' mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do subMenu Hosted sites - OK footer - dir list and [GNU, Creative Commons] images : OK 'Professional & Resume/_Resumes, work experience.html' mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do subMenu Professional - OK footer - dir list and [GNU, Creative Commons] images : OK 'Solar modeling and forecasting/_Solar modeling & forecasting.html' Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases) mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do I didn't recheck Projects subMenu footer - dir list and [GNU, Creative Commons] images : OK +----+ 7. check all links with 'website urls.ndf' & grep !!linkError!! 24************************24 #] 17Nov2020 New menus items : 05-----05 'Menu projects.html' - rearrange and add missing webPages: Lies, Damned Lies Pandemics (general link) influenza corona virus suicide economics & markets SP500-Schiller vs T-bill multifractal Fiboci mirror Fed Rese control zone Problematic web-pages -are due to [move, dirChanges] Randell Mills Puetz & Borchardt /media/bill/SWAPPER/Website - raw/Projects - mini/Puetz & Borchardt/Howell - comments on Puetz UWS, the greatest of cycles, human implications.odt /media/bill/SWAPPER/Website - raw/Projects - mini/Randell Mills/Howell - review of Holverstott 2016 Randell Mills hydrino energy.pdf 05-----05 'Menu neural nets.html' Neural nets main link OK - leave the rest of the [subDir, fname] corrections for later, after the url checks 24************************24 #] 17Nov2020 Test webPages : Post-backups, check all links! 13:43 Is str_splitLftRgtTo_midIndxs_StrList still a problem? midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight liner ; put '[#=; backtrack ;=#]' into 5 files : 'fin [Footer, footer [Neil Howell,Paul Vauhan,Steven Wickson,Steven Yaskell].html' qnial> loaddefs link d_Qtest 'Website updates- tests.ndf' qnial> webPageRaw_update_test l (do all as the files aren't being over-written) qnial> webPageSite_update_test l Oh shit! 
'whole-line-embed footer.txt' Change : +.....+ [#!: path_insertIn_fHand (link d_webWork 'fin Footer.html') fout ; +.....+ To : +.....+ [#!: path_executeEmbedsInsertIn_fHand (link d_webRaw 'webWork files/fin Footer.html') phraseValueList ; +.....+ qnial> webPageRaw_update_test l (do all as the files aren't being over-written) qnial> webPageSite_update_test l Can't just do that. I have to repeat pathList_change_headFoot pathList_change_headFoot htmlPathsSortedByPath webPageRaw_update_test : I don't think that I need to do this, as subDirs already put in? qnial> webPageSite_update_test l >> OK, [GNU, Creative Commons] images ap >> BUT! stupid `; problem is back pathList_change_headFoot - remove `; ??? Change : +.....+ cmd := link 'cat "' p_temp3 '" | sed "s|\(.*\)\(.*\)|<TITLE> ' fname ' ; |" >"' p_temp4 '"' ; +.....+ To : +.....+ cmd := link 'cat "' p_temp3 '" | sed "s|\(.*\)\(.*\)|<TITLE> ' fname ' |" >"' p_temp4 '"' ; +.....+ >> I thought that I had alrdone this - probably directly in the files? qnial> pathList_change_headFoot htmlPathsSortedByPath qnial> webPageSite_update_test l >> I STILL get a '` 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html ; This appears in 'pathList_change_headFoot temp4.txt' : Steven Wickson.html ; >> OOPS! forgot qnial> lq_fileops OK AWESOME! - basics seem to work well... (maybe older files will have problems?) 05-----05 12:57 With the 3-problem fixes below, test webPage[Raw, Site]s qnial> lq_fileops qnial> loaddefs link d_Qtest 'Website updates- tests.ndf' qnial> webPageRaw_update_test l qnial> webPageSite_update_test l 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html page Howell - blog.html Howell - corona virus.html _Lies, damned lies, and scientists.html Howell - influenza virus.html >> NUTS!!! I over-wrote the webPageRaws!!! (fuck-up) fix 'webPageSite_update' Change back to previous : +.....+ host link 'cp "' p_temp '" "' webPage '"' ; +.....+ To : +.....+ host link 'cp "' p_temp '" "' p_webSite '"' ; +.....+ Copy-back from backup dir qnial> webPageSite_update_test l >> no extraneous `; >> test- files NOT stored on webSite (where are they stored? >> failed [GNU, Creative Commons] images Damned failed [GNU, Creative Commons] images '1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' : Perhaps these have moved. Is p_allFileList up to date? GNU Public License Creative Commons License /media/bill/SWAPPER/Website - raw/Creative commons.png - correct subDir fname (both web[Raw, Site]) /media/bill/SWAPPER/Website - raw/gnu-head-mini.png - correct subDir fname (both web[Raw, Site]) /media/bill/SWAPPER/Website - raw/logo Bill OReilly.png /media/bill/SWAPPER/Website - raw/logo Creative Commons.png webPageRaw_update - check to make sure that '"' is used as StrRght, not '">' THEN line := internalLinks_return_relativePath backtrack '> these are OK Problematic web-pages -are due to [move, dirChanges] Randell Mills Puetz & Borchardt 08:51 (earlier) 05-----05 First run all webPageRaw_update_test with save flag_backup = l, check embeds loaddefs link d_Qtest 'Website updates- tests.ndf' webPageRaw_update_test l >> oops - missing headFoot fileops.ndf inserted : d_webWork := link d_webRaw 'webWork files/' ; >> now it looks good! >> must have had this in global varSpace before... 
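02--02 Quick pre-check before the webPageSite runs below - do the webWork embed files all exist?
# A bash sketch only; the fnames are the ones referenced in these notes, d_webWork as set just above :
d_webWork='/media/bill/SWAPPER/Website - raw/webWork files/'
for f in 'whole-line-embed header.txt' 'whole-line-embed footer.txt' 'fin Head_one.html' 'fin Head_two.html' 'fin Footer.html' ; do
	if [ -f "$d_webWork$f" ] ; then echo "OK       $f" ; else echo "MISSING  $f" ; fi
done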
05-----05 Now, run all webPageSite_update_test with save flag_backup = l, check links qnial> webPageSite_update_test l subDirfname = economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html >> `; appears at top of webPage? >> [menuHeadFoot, body]-links look good (except known problems links) >> [GNU, Creative Commons] images don't appear subDirfname = page Howell - blog.html >> same as 1st, but GNU image does appear subDirfname = Pandemics, health, and the Sun/corona virus/Howell - corona virus.html >> same as 1st subDirfname = Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html >> same as 1st, also 'Problems with Science ' subMenu links don't work : Lies, Damned Lies, and Scientists GR turkey, QM fools paradise subDirfname = Pandemics, health, and the Sun/influenza/Howell - influenza virus.html >> same as 1st 05-----05 Fix the `; problem Can't see where this comes from check 'pathList_change_headFoot temp[1-5].txt' temp1 - no temp2 - no temp3 - no temp4 - no So it must be the temp5 step pathList_change_headFoot Change : +.....+ cmd := link 'cat "' p_temp4 '" | sed "s|\(\[#!: str_executeEmbeds (link d_webWork ' chr_apo 'Menu\)\(.*.html' chr_apo ')\)\(.*\)|\[#!: path_executeEmbedsInsertIn_fHand (link d_webWork ' chr_apo 'Menu\2 phraseValueList ; |" >"' p_temp5 '"' ; +.....+ To : +.....+ cmd := link 'cat "' p_temp4 '" | sed "s|\(\[#!: str_executeEmbeds (link d_webWork ' chr_apo 'Menu\)\(.*.html' chr_apo ')\)\(.*\)|\[#!: path_executeEmbedsInsertIn_fHand (link d_webWork ' chr_apo 'Menu\2 phraseValueList |" >"' p_temp5 '"' ; +.....+ >> removed `;, but I can't see how this will help >> problem is header, not the footer whole-line-embed header.txt [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack)) [#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ; [#!: writefile fout ' path ' ; [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ; >> but this is similar to the footer, which doesn't have this problem... [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack)) [#!: path_insertIn_fHand (link d_webWork 'fin Footer.html') fout ; >> So maybe the problem is from : [#!: writefile fout ' path ' ; >> except the `; comes BEFORE the menus This is driving me nuts... outputwebPageRaw : has ' QNial.html ; ' That`; will be a problem 'whole-line-embed header.txt' Change : +.....+ QNial.html ; +.....+ To : +.....+ QNial.html +.....+ 05-----05 Fix the [GNU, Creative Commons] images 'fin [Footer, footer [Neil Howell,Paul Vauhan,Steven Wickson,Steven Yaskell].html' Change : +.....+ Creative Commons License +.....+ To : +.....+ GNU Public License Creative Commons License +.....+ >> Hopefully, that will do it. 05-----05 Redirect 'test- *' file creation to z_Archive This was done even today! it's not an olde thing Not in : 'fileops.ndf' 'Website updates- tests.ndf' 'Website updates.ndf' 'Website header.ndf' 'strings.ndf' Where the heck is it? check all backup optrs path_backupDatedToSameDir IS OP path - backup dated version of a file in same directory path_backupDatedTo_dir IS OP path dirBackup - backup dated version of a file to a specified FLAT dir dirBackup_restoreTo_paths IS OP d_backup p_pathList - restore paths listed in a backup (FLAT) dir path_backupDated_delete IS OP path - rename a file with date precursor Maybe it's in MY_NDFS somewhere other than the ndfs above? 
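(One way to answer that question from the shell, using the same find/xargs/grep idiom as the other searches in these notes; d_Qroot as defined elsewhere, the search itself is a hypothetical sketch.)
+.....+
# List every .ndf line that builds a 'test- ' filename, to pin down which optr
# is writing the stray test copies.
find "$d_Qroot" -maxdepth 3 -name "*.ndf" | tr \\n \\0 \
  | xargs -0 -IFILE grep --with-filename --line-number "'test- " "FILE"
+.....+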
Oop the ORIGINAL filenames have 'test- ' problem is that they are saved to d_webRaw, not d_Qtest Look in 'Website updates.ndf' >> I can't see where the problem occurs... ??? webPageSite_update Change : +.....+ p_webSite := link d_webSite subDir fname ; write link 'subDirfname = ' subDir fname ; IF (path_exists '-f' p_temp) THEN host (link 'diff --width=85 "' p_webSite '" "' p_temp '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ; host link 'echo "" >>"' p_log '"' ; IF flag_backup THEN host link 'mv "' p_temp '" "' p_webSite '"' ; ENDIF ; ELSE host link 'echo ?webPageSite_update error, p_temp not created >>"' p_log '"' ; ENDIF ; +.....+ To : +.....+ p_webSite := link d_webSite subDir fname ; write link 'subDirfname = ' subDir fname ; IF (path_exists '-f' p_temp) THEN host (link 'diff --width=85 "' p_webSite '" "' p_temp '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ; host link 'echo "" >>"' p_log '"' ; IF flag_backup THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ; ELSE host link 'echo ?webPageSite_update error, p_temp not created >>"' p_log '"' ; ENDIF ; +.....+ >> Hopefully, that will do it. 24************************24 #] 16Nov2020 webPageSite_update_test - check actual change in webPageSite Note: there STILL seems to be a subDir problem with menus? Just run & check qnial> webPageSite_update_test l main menu - all links are OK Projects : all are EXCEPT (as before) Randell Mills- hydrinos Puetz - The Greatest of cycles bodyLinks : All seem OK (not many) external : I noticed that this didn't work Ben Davidson of Suspicious Observers Try all 5 test webpages 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' 'page Howell - blog.html' 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html' 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html' 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html' First I have to update the whole-line-embeds. 
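(Before updating them, it helps to see which raw webPages still carry whole-line embed directives at all; a hedged sketch in the same find/grep style - d_webRaw as defined elsewhere, the check itself is not from the original notes.)
+.....+
# List every '[#!: ' whole-line embed directive remaining in the raw webPages,
# excluding archived copies, so the pages needing the new embed form stand out.
find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 \
  | xargs -0 -IFILE grep --with-filename --line-number '\[#!: ' "FILE" \
  | grep --invert-match "z_Archive"
+.....+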
>> OK, all tests look good Backup all targeted webPages : qnial> pathList_backupTo_dir htmlPathsSortedByPath (link d_webRaw 'z_Archive/') cp: cannot stat '/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html': No such file or directory cp: cannot stat '/media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/test- _Lies, damned lies, and scientists.html': No such file or directory cp: cannot stat '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/test- Howell - corona virus.html': No such file or directory cp: cannot stat '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/test- Howell - influenza virus.html': No such file or directory cp: cannot stat '/media/bill/SWAPPER/Website - raw/test- page Howell - blog.html': No such file or directory >> I deleted the test- webPages 05-----05 qnial> pathList_change_headFoot htmlPathsSortedByPath test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html cat: '/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html': No such file or directory test- _Lies, damned lies, and scientists.html cat: '/media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/test- _Lies, damned lies, and scientists.html': No such file or directory test- Howell - corona virus.html cat: '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/test- Howell - corona virus.html': No such file or directory test- Howell - influenza virus.html cat: '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/test- Howell - influenza virus.html': No such file or directory test- page Howell - blog.html cat: '/media/bill/SWAPPER/Website - raw/test- page Howell - blog.html': No such file or directory >> only failures are noed above, basically only test files >> - so all normal webPages were processed (very quickly!) 24************************24 #] 16Nov2020 webPageSite_update, [str, path]_executeEmbeds qnial> webPageSite_update_test o headFoots OK Menus : backtrack remains a string!? Home small or sleeping : bodyLinks : don't even appear Break for brunch qnial> webPageSite_update_test o ------------------------------------------------------------- Break debug loop: enter debug commands, expressions or type: resume to exit debug loop executes the indicated debug command current call stack : webpagesite_update_test webpagesite_update_output webpagesite_update str_executeembeds path_executeembedsinsertin_fhand path_executeembeds str_executeembeds ------------------------------------------------------------- -->[stepv] nextv ?.. IF ( not or ( isfault Midindxs ) ( = Null Midindxs ) ) THEN Midlinks := EACH execute Midlinks ; (...) 
-->[nextv] str phraseValueList +----------------------------------------------------------+----------------------------+ | |+------+-------------------+| | ||fout 6|backtrack ?no_value|| | |+------+-------------------+| +----------------------------------------------------------+----------------------------+ >> Oops, n value for backtrack - co Change : 05-----05 path_executeEmbeds IS OP path phraseValueList { LOCAL backtrack finn fout line p_temp ; % ; % Create p_temp (str), and delete past versions so update failures will result in a diff error message ; p_temp := executeEmbedsGet_pathTemp ; IF (path_exists '-f' p_temp) THEN host link 'rm "' p_temp '"' ; ENDIF ; % ; finn := open path "r ; fout := open p_temp "w ; WHILE (NOT isfault (line := readfile finn)) DO IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ; writefile fout line ; ENDWHILE ; EACH close finn fout ; p_temp } +.....+ To : +.....+ path_executeEmbeds IS OP path phraseValueList { LOCAL backtrack finn fout line p_temp ; % ; % Create p_temp (str), and delete past versions so update failures will result in a diff error message ; p_temp := executeEmbedsGet_pathTemp ; IF (path_exists '-f' p_temp) THEN host link 'rm "' p_temp '"' ; ENDIF ; % ; finn := open path "r ; fout := open p_temp "w ; WHILE (NOT isfault (line := readfile finn)) DO IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line phraseValueList ; ENDIF ; writefile fout line ; ENDWHILE ; EACH close finn fout ; p_temp } +.....+ Retry qnial> webPageSite_update_test o >> still: backtrack rather than n*'../' bodyLinks lack subDir! str_executeEmbeds doesn't put the midLinks back into strList!! I must have dropped it during copy-paste? Change : +.....+ midLinks := EACH execute midLinks ; +.....+ To : +.....+ strList#midIndxs := EACH execute midLinks ; +.....+ Retry qnial> webPageSite_update_test o >> OK!! '../../' substituted for 'backtrack' >> bodyLinks 05-----05
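(For reference, the backtrack string is just one '../' per subDir level below d_webRaw; a minimal bash sketch of that arithmetic - hypothetical, the real depther logic lives in the QNial code, and the page path is only an example.)
+.....+
# Example page two subDirs deep under d_webRaw -> backtrack should be '../../'
d_webRaw="/media/bill/SWAPPER/Website - raw/"
page="/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/Howell - influenza virus.html"
rel="${page#"$d_webRaw"}"                      # path relative to d_webRaw
depth=$(awk -F'/' '{print NF-1}' <<< "$rel")   # number of subDir levels
backtrack=""
for ((i=0; i<depth; i++)); do backtrack+="../" ; done
echo "$backtrack"                              # -> ../../
+.....+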
  • K.F. Tapping, R.G. Mathias, D.L. Surkan, " Pandemics and solar activity" . This is an unpublished, expanded paper of the 2001 paper by the same authors, which is listed in the references below. 05-----05 becomes 05-----05
  • K.F. Tapping, R.G. Mathias, D.L. Surkan, Pandemics and solar activity" . This is an unpublished, expanded paper of the 2001 paper by the same authors, which is listed in the references below. 05-----05 >> '' subStr_in_str str) THEN BREAK ; ENDIF ; >> this worked within str_executeEmbeds -->[nextv] " subDirfname = Pandemics, health, and the Sun/influenza/Howell - influenza virus.html >> so why isn't it in the output file? In webPageSite_update : 05-----05 WHILE (NOT isfault (line := readfile finn)) DO % first executeEmbeds if present, with (phrase values) pairList ; IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ; % now process links ; IF ('> This is BACKWARDS!! Change to : 05-----05 WHILE (NOT isfault (line := readfile finn)) DO % process links ; IF ('' subStr_in_str str) THEN BREAK ; ENDIF ; Re-Try Here is the problem : ? 05-----05 -->[nextv] line " -->[nextv] ?.. ( '[nextv] " ?.. Line := str_executeembeds Line ( ( "fout Fout ) ( "backtrack Backtrack ) ) -->[nextv] " 05-----05 >> missing subDir!?, but at least the '../../' is written to the output now. internalLinks_return_relativePath isn't putting in the subDir - why? Move break to internalLinks_return_relativePath : IF ('' subStr_in_str line) THEN BREAK ; ENDIF ; Nuts! - elimiating line because of `# 05-----05 ?.. ( or ( Midindxslines_bads EACHLEFT substr_in_str Linelist @ I ) ) -->[nextv] ?.. or ( Midindxslines_bads EACHLEFT substr_in_str Linelist @ I ) -->[nextv] l l ?.. Null 05-----05 Remove '#' from midIndxsLines_bads Change : +.....+ IF (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@i)) THEN +.....+ To : +.....+ IF (OR (= `# (first lineList@i)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@i))) THEN +.....+ ?.. Linelist @ I := link Backtrack ( ( I_fname pick Allpathssortedbyfname ) str_remove_substr D_webraw ) -->[nextv] +-+--------++-+--------------------+-+-+-++ | ||"|| +-+--------++-+--------------------+-+-+-++ >> '../../' should NOT be in there - how did this happen? I added : liner := line str_remove_subStr '[#=; backtrack ;=#]' ; After much floundering : ?.. Linelist @ I := link Backtrack ( ( I_fname pick Allpathssortedbyfname ) str_remove_substr D_webraw ) -->[nextv] i_fname 2477 -->[nextv] +-+--------++-+----------------------------------------------------------------------------------------------- | ||"|| ---------+-+-+-++ OK, now subDirs are added 24************************24 #] 16Nov2020 webPageSite_update, [str, path]_executeEmbeds current issues : '[#=; backtrack ;=#]' rather than n*'../' bodyLinks lack subDir! 'strings.ndf' Change : +.....+ strList#midIndxs := EACH execute midLinks ; +.....+ To : +.....+ midLinks := EACH execute midLinks ; +.....+ qnial> flag_break := l l qnial> webPageSite_update_test o subDirfname = Pandemics, health, and the Sun/influenza/Howell - influenza virus.html >> the break in str_executeEmbeds never occurs, WHY? midIndxs strList := str_splitLftRgtTo_midIndxs_StrList '[#=; ' ' ;=#]' strNew ; -->[nextv] midIndxs strList ++----------------------------------------------------------------+ ||+---------------------------05-----05---------05-----05------------+| ||| || ||+---------------------------05-----05---------05-----05------------+| ++----------------------------------------------------------------+ >> why is midIndxs null? Run str_splitLftRgtTo_midIndxs_StrList tests, something must have changed? 
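(As a shell-side sanity check on what the splitter should be finding, the '[#=; ... ;=#]' spans in a page can be listed directly - a hypothetical one-liner, not part of the QNial test suite; the page path is just an example.)
+.....+
# Show every within-line embed span in one webPage, e.g. '[#=; backtrack ;=#]'
page="/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/Howell - influenza virus.html"
grep -o '\[#=; [^]]*;=#\]' "$page"
+.....+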
I have to create a test in link d_Qtest 'strings- tests.ndf' Check str_splitWith_subStr_test All 12 work EXCEPT the 2 with '[#=; '!! qnial> str_splitLftRgtTo_midIndxs_StrList_test #05-----05 str_splitLftRgtTo_midIndxs_StrList_test, Mon Nov 16 10:30:23 2020 # string_splitWith_string_test example 1 : FAILED - result does NOT match standard 05-----05-----+------------------------------------------------------------------+ |[#=; | ;=#]|| 05-----05-----+------------------------------------------------------------------+ +---------05-----05---------05-----05--------------------------------------+ || +---------05-----05---------05-----05--------------------------------------+ ++------------------------------------------------------------------------+ ||+---------05-----05---------05-----05--------------------------------------+| ||||| ||+---------05-----05---------05-----05--------------------------------------+| ++------------------------------------------------------------------------+ qnial> loaddefs link d_Qtest 'strings- tests.ndf' >>> loading start : strings- tests.ndf <<< loading ended : strings- tests.ndf qnial> str_splitLftRgtTo_midIndxs_StrList_test #05-----05 str_splitLftRgtTo_midIndxs_StrList_test, Mon Nov 16 10:31:04 2020 # string_splitWith_string_test example 1 : FAILED - result does NOT match standard 05-----05-----+------------------------------------------------------------------+ |[#=; | ;=#]|| 05-----05-----+------------------------------------------------------------------+ +-+------------------------------------------------------------------------+ |2|+---------05-----05---------05-----05--------------------------------------+| | |||| | |+---------05-----05---------05-----05--------------------------------------+| +-+------------------------------------------------------------------------+ ++------------------------------------------------------------------------+ ||+---------05-----05---------05-----05--------------------------------------+| ||||| ||+---------05-----05---------05-----05--------------------------------------+| ++------------------------------------------------------------------------+ >> Interesting, missing midIndxs. what happened? str_splitWith_subStr_test : 8 tests are OK break in str_splitLftRgtTo_midIndxs_StrList_test str_splitLftRgtTo_midIndxs_StrList Change : +.....+ IF (OR EACH OR (null EACHRIGHT = i_heads i_tails)) +.....+ To : +.....+ IF (OR (null EACHRIGHT = i_heads i_tails)) +.....+ I think the change was to make weird heasdTails work str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str { LOCAL i i_heads i_tails midIndxs splits valids ; IF flag_break THEN BREAK ; ENDIF ; splits := link (strRgt EACHRIGHT str_splitWith_subStr (str_splitWith_subStr strLft str) ) ; midIndxs := tell (gage shape splits) ; i_heads := (strLft EACHRIGHT = splits) sublist midIndxs ; i_tails := (strRgt EACHRIGHT = splits) sublist midIndxs ; IF (OR (null EACHRIGHT = i_heads i_tails)) THEN null (fault '?str_splitLftRgtTo_midIndxs_StrList error : OR[i_heads, i_tails] is null') ELSE valids := 3 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ; valids splits ENDIF } str_splitLftRgtTo_midIndxs_StrList Change : +.....+ valids := 3 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ; +.....+ To : +.....+ valids := 1 + (((i_heads + 2) EACHLEFT in i_tails) sublist i_heads) ; +.....+ >> OK, this works. But it must have "damaged" the other application. Add more str_splitLftRgtTo_midIndxs_StrList_test & re-test ['> perhaps because midInxs is a list?! 
>> OK, that was that problem. Now to change str_splitLftRgtTo_midIndxs_StrList for ['> OK, now all 5 str_splitLftRgtTo_midIndxs_StrList_test are OK Double-check other link d_Qtest 'strings- tests.ndf' : I changed outputs of all tests to use : test_comment t_name t_input t_standard t_result ; >> I need to fix the format! qnial> strings_alltest link d_Qtest '201116 12h25m44s alltest strings.txt' : +----+ Summary of test results : /media/bill/PROJECTS/Qnial/code develop_test/201116 12h25m44s alltest strings.txt, date= 201116 12h25m # str_to_unicodeList_test example #1 : OK - result matches standard # str_to_unicodeList_test example #2 : OK - result matches standard # str_to_unicodeList_test example #3 : OK - result matches standard # str_to_unicodeList_test example #4 : OK - result matches standard # str_to_unicodeList_test example #5 : OK - result matches standard # str_to_unicodeList_test example #6 : OK - result matches standard # string_sub_test example #1 : OK - result matches standard # string_sub_test example #2 : OK - result matches standard # string_sub_test example #3 : OK - result matches standard # string_sub_test example #4 : OK - result matches standard # string_sub_test example #5 : OK - result matches standard # string_sub_test example #6 : OK - result matches standard # string_sub_test example #7 : OK - result matches standard # string_sub_test example #8 : OK - result matches standard # string_sub_test example #9 : OK - result matches standard # string_sub_test example #10 : OK - result matches standard # string_sub_test example #11 : OK - result matches standard # string_sub_test example #12 : OK - result matches standard # string_sub_test example #13 : OK - result matches standard # str_splitWith_subStr_test example #1 : OK - result matches standard # str_splitWith_subStr_test example #2 : OK - result matches standard # str_splitWith_subStr_test example #3 : OK - result matches standard # str_splitWith_subStr_test example #4 : OK - result matches standard # str_splitWith_subStr_test example #5 : OK - result matches standard # str_splitWith_subStr_test example #6 : OK - result matches standard # str_splitWith_subStr_test example #7 : OK - result matches standard # str_splitWith_subStr_test example #8 : OK - result matches standard # str_splitLftRgtTo_midIndxs_StrList_test example #1 : OK - result matches standard # str_splitLftRgtTo_midIndxs_StrList_test example #2 : OK - result matches standard # str_splitLftRgtTo_midIndxs_StrList_test example #3 : OK - result matches standard # str_splitLftRgtTo_midIndxs_StrList_test example #4 : OK - result matches standard # str_splitLftRgtTo_midIndxs_StrList_test example #5 : OK - result matches standard +----+ 24************************24 #] 15Nov2020 [str, path]_executeEmbeds 06:20 [develop, test] new code with 'Howell - influenza virus.html' Key upgrades to : #] str_executeEmbeds IS OP str phraseValueList - execute embeds in line, return a str #] path_executeEmbeds IS OP path phraseValueList - execute embeds in a file 'Howell - influenza virus.html' New optrs : #] executeEmbedsGet_pathTemp IS - generate a unique p_temp fname #] path_executeEmbedsInsertIn_fHand IS OP path phraseValueList - execute embeds, insert in fileHandle 14:34 webPageSite_update_test with 'Howell - influenza virus.html' 02--02 qnial> webPageSite_update_test o ?type error in fault ?type error in fault ?type error in fault ?type error in fault ?type error in fault p_webSite = /media/bill/HOWELL_BASE/Website/Pandemics, health, and the Sun/influenza/Howell - 
influenza virus.html subDir = Pandemics, health, and the Sun/influenza/ 02--02 >> NO [embeds, links] survived, >> later : nyet - the backtracks survived as original, no processing the whole-line-embeds disappeared entirely 'Website updates.ndf' Change : +.....+ line := str_executeEmbeds line fout backtrack ; +.....+ To : +.....+ line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; +.....+ >> No faults, but no embeds either... internal links no longer in webPage Check within-line-embeds '[#=; ' break in str_executeEmbeds 02--02 qnial> webPageSite_update_test o >> no break occurred, same error? 24************************24 #] 14Nov2020 fix up tests - a bit confusing 21:27 'Website updates.ndf', Change : +.....+ line := str_executeEmbeds line fout backtrack ; +.....+ To : +.....+ line := str_executeEmbeds line (("fout fout) ("backtrack backtrack)) ; +.....+ > I probably need a list of paired (("phrase value)...) 16:57 str_executeEmbeds isrepeating the footer part multiple times? looking at a typical full-line embed, which is not set up properly : 02--02 [#!: str_executeEmbeds (link d_webWork 'fin Head_one.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; 02--02 Note the recursive use of embedding. [#!: strList_writeTo_path (str_executeEmbeds (link d_webWork 'fin Head_one.html')) (p_temp := link d_temp 'executeEmbed temp.txt') ; path_insertIn_fHand p_temp fout ; ???????????? 16:52 I fixed path_retrieve_subDirFname $ find "$d_Qroot" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'path_retrieve_subDirFname' "FILE" /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:30: path_retrieve_subDirFname IS OP path dirBase - returns (subDir fname) from a path /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:245:IF flag_debug THEN write 'loading path_retrieve_subDirFname' ; ENDIF ; /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:247:#] path_retrieve_subDirFname IS OP path dirBase - returns (subDir fname) from a path /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:252: path_retrieve_subDirFname IS OP path dirBase /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:256: THEN fault '?path_retrieve_subDirFname error, dirBase not in path' /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:260: THEN fault '?path_retrieve_subDirFname error, fname' #] 14Nov2020 11:49 $ find "$d_Qroot" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'd_webDone' "FILE" 05-----05 /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:181:# Note that menuHeadFoot are NOT [copied, converted] to d_webDone, as they are useless there. /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:192: NONLOCAL d_htmlBackup d_webRaw d_webDone ; /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:265:# Note that menuHeadFoot are NOT [copied, converted] to d_webDone, as they are useless there. 
/media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:276: NONLOCAL d_htmlBackup d_webRaw d_webDone ; /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:358: d_htmlBackup := link d_webDone 'z_Archive/' timestamp_YYMMDD_HMS ' backups/' ; /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:20: - for testing only (depends on [pinn, d_webRaw, d_webDone]) /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:22: depther inputs (eg n times '../') are arbitrary - for testing only (depends on [pinn, d_webRaw, d_webDone]) /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:60:#] - for testing only (depends on [pinn, d_webRaw, d_webDone]) /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:234:#] depther inputs (eg n times '../') are arbitrary - for testing only (depends on [pinn, d_webRaw, d_webDone]) /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:400: NONLOCAL d_htmlBackup d_webRaw d_webDone ; /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:419: NONLOCAL d_htmlBackup d_webRaw d_webDone ; 05-----05 >> replace d_webDone with d_webSite ***** #] 13Nov2020 thorough check of [menuHeadFoot, body]links several menus are not working - even more subMenus fail (maybe most?) bodylinks are NOT working, the subDirs are missing! I revamped webPageRaw_update so that no [test, backup] files appear in d_webRaw - they are in d_htmlBackup. The subDirs are still missing - may not be a problem as lon this is done for the webSite? BUT - the code indicates that the full path SHOULD appear!!??? 02--02 lineList@i := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ; ... lineList@i := link backtrack ((i_subDir pick allSubDirsSortedBySubdir) str_remove_subStr d_webRaw) ; 02--02 Run a test webPageDone_update to see what happens. 05-----05 Olde code # 13Nov2020 no longer used dw_base := d_webRaw ; ds_base := d_webDone ; ****** #] 11Nov2020 update to website, check menuHeadFoots webSite_update etc, etc >> webSite_update seems to work. From a quick check - most menu items seem to work. A few exceptions were noted : 02--02 top menu - all looks good subMenu Neural Networks : Neural Nets - no works MindCode - OK subMenu Projects : MindCode neural network - OK Randell Mills- hydrinos - no works Puetz - The Greatest of cycles - no works rest of subMenus look good 02--02 >> a thorough check is needed, including bodyLinks! Do a more thorough check Friday! ****** #] 11Nov2020 webPage_update_test continued Idiot - I chopped the " and not the >, so now changed to : THEN line := internalLinks_return_relativePath backtrack 'this Directory's listing. Check d_website version : No update? - oops, at present, this goes to 'test- ' versions. Leave it for debugging >> test versions don't have menuHeadFoots Now, run ALL webPage_update_test : check log file, and output files >> still a problem with ./ eg : You can see these files via this Directory's listing. >> but only for [1,2]? backup '1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' by dating then rename 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' to '1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' >> This reduces the [known, good] changes from future diff outputs, making it easier to focus on real problems. 05-----05 Why is ./ still a problem? 
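(A quick way to see where the './' links actually sit before deciding how to treat them - a hypothetical grep, not from the notes; HREF is just the usual html attribute.)
+.....+
# List body links that start with './' across the raw webPages, excluding archives
find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 \
  | xargs -0 -IFILE grep -i --with-filename --line-number 'HREF="\./' "FILE" \
  | grep --invert-match "z_Archive"
+.....+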
: I added './' to midIndxsLines_bads Removed from internalLinks_return_relativePath : % check if ./ ; ELSEIF (= './' lineList@i) THEN null ; Re-run webPage_update_test for only : 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' >> 05-----05 menuHeadFoot disappearance : str_executeEmbeds 02--02 -->[nextv] str fout backtrack +------------------------------------------------------------------------------------------------------------- |[#!: pinn_writeExecute_pout (link d_webWork 'fin Head_one.html') stdTmp d_webRaw d_webSite ; path_ins +------------------------------------------------------------------------------------------------------------- ------------------------------+-+---------+ ertIn_fHand stdTmp fout ; |4|../../../| ------------------------------+-+---------+ -->[nextv] ?undefined identifier: PINN_WRITEEXECUTE_POUT <***> ( LINK D_WEBWORK 02--02 >> Arrrggghhh! I forgot to change the embeds!! Create a new fileops.ndf optr : str_replaceIn_pathList IS OP flag_backup d_backup strOld strNew f_Pattern pathList str_replaceIn_pathList IS OP flag_backup d_backup strOld strNew f_Pattern pathList I did a huge amount of fixing : created str_replaceIn_pathList converted od optr name : str_replaceIn_pathList d_webRaw 'pinn_writeExecute_pout' 'str_executeEmbeds' htmlPathsSortedByFname 24************************24 #] 10Nov2020 webPage_update_test - continue to change, debug Run (single example for development) : webPage_update_test oops - blew up 02--02 qnial> webPage_update_test increasing call stack to 200 increasing call stack to 300 ... Segmentation fault 02--02 >> probably a path error or infinite loop [webPage, pout] ARE checked to stop progress, so maybe an infinite loop? 24************************24 #] 10Nov2020 05-----05 Problem with subDir faults - internalLinks_return_relativePath_test tests #[5 6] More efficient to create a new list allEndSubDirsSortedByFullSubdir Key changes to internalLinks_return_relativePath_test >> now all tests work nicely! 05-----05 test webPage_update # Set of tests : # htmlFname := 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' # htmlFname := 'page Howell - blog.html' # htmlFname := 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html' # htmlFname := 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html' # htmlFname := 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html' # webPage_update (d_webRaw link htmlFname) 'page Howell - blog.html'
>> IMG SRC problem - it goes to ">, but should only go to the next " Change : +.....+ IF ('' line ; +.....+ To : +.....+ IF ('> one step, but NO backtracks & subDir not provided for > ALL internalLinks_return_relativePath_test are OK. >> But now for ALL test-output files : no menuHeadFoot ' '[nextv] line
    ?.. Midindxs Linelist := str_splitlftrgtto_midindxs_strlist Strleft Strright Liner -->[nextv] lineList ?no_value -->[nextv] ++-------------------------------------------------------------------------+ ||?str_splitLftRgtTo_midIndxs_StrList error : OR[i_heads, i_tails] is null| ++-------------------------------------------------------------------------+ fix str_splitLftRgtTo_midIndxs_StrList to use " = strRgt Change : +.....+ valids := 1 + (((i_heads + 2) EACHLEFT in i_tails) sublist i_heads) ; +.....+ To : +.....+ valids := 1 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ; +.....+ Also - remove " from strRght Change str_splitLftRgtTo_midIndxs_StrList : +.....+ valids := 2 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ; +.....+ To : +.....+ valids := 3 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ; +.....+ Remove " from strLeft strRght in 'Website updates- tests.ndf' Re-run : internalLinks_return_backupSubDirFnames_test : 02--02 # internalLinks_return_backupSubDirFnames_test example #4 : FAILED - result does NOT match standard
  • gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.
  • gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing. 02--02 >> now the ./ doesn't work - oops, I had forgotten to add it to internalLinks_return_backupSubDirFnames internalLinks_return_relativePath_test : 02--02 # internalLinks_return_relativePath_test example #9 : FAILED - result does NOT match standard WIDTH=90% NAME="1872-2020 SP500 index semi-log detrended"

    02--02 >> most files are OK >> how did this get the wrong file???? Oops - I mixed up the standard with another example >> Now ALL tests are OK If possible, I need to consolidate internalLinks_return_backupSubDirFnames - This is a "shorthand" version for editing, '[#=; backtrack ;=#]' used by webPage_convertBodyLinks internalLinks_return_relativePath - This is the "full" version for updating the webSite used by webPage_update For : internalLinks_return_relativePath_BodyLinks_test t_input := '[#=; backtrack ;=#]' blah blah OK - more changes to consolidate [internalLinks_return_backupSubDirFnames, internalLinks_return_relativePath] Re-run : internalLinks_return_relativePath_BodyLinks_test : 02--02 [3,5,6,8,9] failed, but [3,5,6,9] are due to incorrect standards, given my current "full path" approach - I fixed the standard response [3,5,6,9] - now the full '> All are OK now internalLinks_return_relativePath_test : >> [1-12] all are OK 05-----05 Olde code # OUTMODED! now outputs go into sa d_webRaw subDir as the webPage IF flag_debug THEN write 'loading webPage_update_test' ; ENDIF ; #] webPage_update_test IS - Check for proper processing of embedded executables # These tests cannot be run from d_Qtests : I must create "test" output files as standards in d_webSite # uncomment to run one test (insert line below, remove extra line for next test definition below) webPage_update_test IS OP webPage { LOCAL comments flog i_test p_inn p_log p_std ; flag_screenOut := o ; p_log := link d_Qtest 'webPage_update_test log.txt' ; path_backupDated_delete p_log ; flog := open p_log "a ; flog EACHRIGHT writefile '********' (link 'webPage_update_test, ' timestamp_DDMMMYYYY_HMS) '' ; close flog ; i_test := 0 ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_test_output flag_screenOut i_test p_inn p_std comments ; } # webPage_update_test_bag webPage_update_test IS { LOCAL comments flog i_test p_inn p_log p_std ; flag_screenOut := o ; p_log := link d_Qtest 'webPage_update_test log.txt' ; path_backupDated_delete p_log ; flog := open p_log "a ; flog EACHRIGHT writefile '********' (link 'webPage_update_test, ' timestamp_DDMMMYYYY_HMS) '' ; close flog ; i_test := 0 ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- page Howell - blog.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- page Howell - blog.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_test_output flag_screenOut i_test p_inn p_std comments ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' 
; webPage_update_test_output flag_screenOut i_test p_inn p_std comments ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- Howell - corona virus.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- Howell - corona virus.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_test_output flag_screenOut i_test p_inn p_std comments ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- _Lies, damned lies, and scientists.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- _Lies, damned lies, and scientists.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_test_output flag_screenOut i_test p_inn p_std comments ; webPage_update_output i_test p_inn p_std comments ; } # olde version IF flag_debug THEN write 'loading internalLinks_return_backupSubDirFnames' ; ENDIF ; #] internalLinks_return_backupSubDirFnames IS OP strLeft strRight line - returns backupFnameSubDir #] for an an internal link, where SubDir goes into d_webSite # Fixing links is noble, but perhaps equally important is [de-list, label]ing those that are flawed? # and labelling "bad links" for manfixes later ; # subDirs may be vulnerable to duplicates in different directories? Only the first match is used? # EACH non-http link should have EITHER '[#=; backtrack ;=#]' or '!!linkError!!', but not both. # I need to do more for paths with `#, as some van be "rescued". later ... internalLinks_return_backupSubDirFnames IS OP strLeft strRight line { LOCAL i lineList fixIndxs_in fname midIndx midIndxs midLinks n_midLinks path subDir ; NONLOCAL allPathsSortedByPath allFnamesSortedByFname ; % ; midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight line ; midLinks := lineList#midIndxs ; n_midLinks := gage shape midLinks ; fixIndxs_in := n_midLinks reshape l ; % most lines do not have [ gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.' # str_remove_subStr '
  • gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.' 'hello' # (solitary 'Past & future worlds.html') EACHLEFT find_howell Allfnamessortedbyfname # loaddefs link d_Qndfs 'Website updates.ndf' # olde code removed (path_extract_dir pinn) (path_extract_fname pinn) #] 07Nov2020 : pouter := pout ; IF (~= stdTmp pouter) THEN pouter := link d_out subDir ; ENDIF ; depther_global := depther ; continue := l ; IF continue THEN ENDIF ; 24************************24 #] 09Nov2020 So, backtracks seem OK, but fname only, no paths!!! re-run internalLinks_return_relativePath_test : >> They ALL failed!!! (arghhh!) quick tes qnial> 1028 pick allPathsSortedByFname /media/bill/SWAPPER/Website - raw/Software programming & code/bin/dir_size sum, net test.txt >> OK run test on one file >> internalLinks_return_relativePath flag_break is not encountered for correct fname, no subDir >> I had problem before!! reverted to fail qnial> internalLinks_return_relativePath_test #05-----05 internalLinks_return_relativePath_test, Mon Nov 9 17:08:26 2020 # internalLinks_return_relativePath_test example #1 : FAILED - result does NOT match standard t_input, t_standard, t_result = +---------02--02--------------------------------------------------------------+ ||
  • | +---------02--02--------------------------------------------------------------+
  • ?op_parameter Oops!! - left-over for some reason? Change : +.....+ t_result := internalLinks_return_relativePath l t_input ; +.....+ To : +.....+ t_result := internalLinks_return_relativePath t_input ; +.....+ Are '../../../' being removed? No - I forgot to add that coding. I applied remove to entire line of internalLinks_return_relativePath : line := line str_remove_subStr '../' ; midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight line ; changed lineliner so it can be modified Re-try internalLinks_return_relativePath_test : >> Nuts - I got the midIndx wrong - it does refer to path pathsIndxs := midIndxs + 3 ; paths := lineList#pathsIndxs ; pathList := null ; I significantly simplified internalLinks_return_relativePath I am missing a test for bads, eg from internalLinks_return_backupSubDirFnames : IF (OR (midIndxsLines_bads EACHLEFT subStr_in_str path)) THEN fixIndxs_in@i := o ; ENDIF ; I adapted this] Re-try internalLinks_return_relativePath_test : 24************************24 #] 09Nov2020 webPage_update_test After MANY, many changes, run test : >> OOPS : 1. 'A HREF="' etc removed 2. only '../' appears 3. p_log doesn't work 1. internalLinks_return_relativePath - removed : % remove [#= =#] brackets ; shaper := gage shape midIndxs ; lineList#(midIndxs - 1) := shaper reshape (solitary null) ; lineList#(midIndxs + 1) := shaper reshape (solitary null) ; 2. in webPage_update : 02--02 depther_global := 0 ; IF (OR ('Menu' 'fin Head' 'fin Footer' 'fin footer' EACHLEFT subStr_in_str pinn)) THEN depther := depther_global ; ELSE depther := (gage shape (`/ findAll_Howell pinn )) - (gage shape (`/ findAll_Howell d_webRaw)); ENDIF ; 02--02 >> but d_Qtests is used, NOT d_webRaw U could put in something like : 02--02 ELSEIF (d_Qtest subStr_in_str pinn) THEN depther := (gage shape (`/ findAll_Howell pinn )) - (gage shape (`/ findAll_Howell d_webRaw)); 02--02 >> but this is just complicates everything. Just leave as is and rely on actual webPage updates to check backtracks 3. flag_test is currently set in webPage_update_test_output, so it's OK : webPage_update l p_inn p_temp_webPageUpdate ; Re-run webPage_update_test_output with log output and flag_tst := o This allows diff to compare. Interesting - stray `> gave double-HREF!?!? In addition to the video of the presentation, "Howell 161220 Big Data, Deep Learning, Safety.ogv">, I have also In addition to the video of the presentation, "Howell 161220 Big Data, Deep Learning, Safety.ogv!!linkError!!">, I have also posted >> worry about this later - good that '!!linkError!!' flags it. 05-----05 Try all tests qnial> webPage_update_test diff: /media/bill/PROJECTS/Qnial/code develop_test/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html: No such file or directory was missing 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' >> copied from p_temp_webPageUpdate Redo tests, but with flag_screenOut # htmlFname := 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' # webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname) >> seems OK >> retried after failures below - still seems to work # htmlFname := 'page Howell - blog.html' # webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname) >> total failure of backtracks >> wait - root wbPages won't have backtracks, but then we sould see a full oath!! 
# htmlFname := 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html' # webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname) >> total failure of backtracks >> [#=; backtrack ;=#] not in webPage? (did I run webPage_convertBodyLinks?) >> wait - root wbPages won't have backtracks, but then we sould see a full oath!! # htmlFname := 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html' # webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname) >> backtrack OK, but fname only, no paths!!! # htmlFname := 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html' # webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname) >> backtrack OK, but fname only, no paths!!! 05-----05 So, backtracks seem OK, but fname only, no paths!!! stop now - supper and preps for FireBoD meeting 24************************24 09Nov2020 05-----05 Problem with backtrack (was due to faulty code, see below) : # tests - playing with backtrack # backtrack := link chr_apo '../../../' chr_apo ' ' chr_apo 'banana' chr_apo # backtrack := link 'pass ' chr_apo '../../../' chr_apo # backtrack := link chr_apo '../../../' chr_apo # execute backtrack # olde code %strList#midIndxs := (chr_apo EACHRIGHT link strList#midIndxs) EACHLEFT link chr_apo ; # tests - t_input t_input := '
  • ' 1 (link 'pass ' chr_apo '../../../' chr_apo) ;
  • ?.. Strlist # Midindxs := EACH execute Strlist # Midindxs -->[nextv] backtrack pass '../../../' -->[nextv] execute backtrack ../../../ -->[nextv] each execute midLinks +----------------+ |pass '../../../'| +----------------+ -->[nextv] execute link 'pass ' chr_apo '../../../' chr_apo ../../../ >> why isn't this working? -->[nextv] backtrack pass '../../../' -->[nextv] execute backtrack ../../../ # 02--02 t_input := '
  • ' 1 (link chr_apo '../../../' chr_apo) ;
  • >> close, but no cigar 02--02 t_input := '
  • ' 1 '5' ; >> works fine, ts just a string that should remain a string (without quotes) that is a problem >> after Change : +.....+ strList#midIndxs := EACH execute strList#midIndxs ; +.....+ To : +.....+ strList#midIndxs := EACH execute midLinks ; +.....+ ?.. Strlist # Midindxs := EACH execute Midlinks -->[nextv] +--------------------------05-----05-----------05-----05---------------------------+ |
  • | +--------------------------05-----05-----------05-----05---------------------------+ 02--02 Now try, post-correction : t_input := '
  • ' 1 '../../../' ; # str_executeEmbeds_test example #1 : OK - result matches standard t_input, t_standard, t_result = +------------------------------------------------------------------------+-+---------+ |
  • |1|../../../| +------------------------------------------------------------------------+-+---------+
  • 05-----05 Remove : # loaddefs link d_Qtest 'Website updates- tests.ndf' IF flag_debug THEN write 'loading webPage_update_output' ; ENDIF ; # webPage_update_output IS - write results to log file # 07Nov2020 initial webPage_update_output IS OP i_test p_inn p_std comments { LOCAL flog p_temp_BodyLinks ; NONLOCAL d_webRaw d_webSite ; p_temp_webPageUpdate := link d_temp 'webPageUpdate temp.txt' ; p_log := link d_Qtest 'webPage_update_test log.txt' ; flog := open p_log "a ; flog EACHRIGHT writefile '........' (link '# webPage_update example #' (string i_test)) (link 'webPage_update_test for : "' (path_extract_fname p_inn) '"') ; flog EACHRIGHT writefile comments ; writefile flog 'diff results : ' ; close flog ; % pinn_executeEmbeddedTo_pout has the [dir, file] existence checks, so no need here ; pinn_executeEmbeddedTo_pout p_inn p_temp_webPageUpdate d_webRaw d_webSite ; host (link 'diff --width=85 "' p_std '" "' p_temp_webPageUpdate '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ; host link 'echo "" >>"' p_log '"' ; % 07Nov2020 manually move p_temp_webPageUpdate to p_std as required ; } # loaddefs link d_Qtest 'Website updates- tests.ndf' IF flag_debug THEN write 'loading webPage_update_test' ; ENDIF ; # webPage_update_test IS - change links in the body of html to the relative format # This doesn't help at all, as I can't test the links unless a file is in d_webSte!! # 07Nov2020 test created webPage_update_test IS { LOCAL comments flog i_test p_inn p_log p_std ; p_log := link d_Qtest 'webPage_update_test log.txt' ; path_backupDated_delete p_log ; flog := open p_log "a ; flog EACHRIGHT writefile '********' (link 'webPage_update_test, ' timestamp_DDMMMYYYY_HMS) '' ; close flog ; i_test := 0 ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- page Howell - blog.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- page Howell - blog.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_output i_test p_inn p_std comments ; } # webPage_update_test_bag % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- page Howell - blog.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- page Howell - blog.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_output i_test p_inn p_std comments ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_output i_test p_inn p_std comments ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- Howell - corona virus.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- Howell - corona virus.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' 
; webPage_update_output i_test p_inn p_std comments ; % ; i_test := i_test + 1 ; p_inn := link d_Qtest 'test- _Lies, damned lies, and scientists.html convertBodyLinks.html' ; p_std := link d_Qtest 'test- _Lies, damned lies, and scientists.html update.html' ; comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ; webPage_update_output i_test p_inn p_std comments ; 24************************24 #] 08Nov2020 test & fix pinn_writeExecute_pout 05-----05 # Header example : [#!: pinn_writeExecute_pout (link d_webWork 'fin Head_one.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; Howell web - Past and Future Worlds [#!: pinn_writeExecute_pout (link d_webWork 'fin Head_two.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; [#!: pinn_writeExecute_pout (link d_webWork 'Menu.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; [#!: pinn_writeExecute_pout (link d_webWork 'Menu Howell videos.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; # For all webPages, should change to : [#!: menuHeadFoot_writeTo_fout (link d_webWork 'fin Head_one.html') fout backtrack ; Howell web - Past and Future Worlds [#!: menuHeadFoot_writeTo_fout (link d_webWork 'fin Head_two.html') fout backtrack ; [#!: menuHeadFoot_writeTo_fout (link d_webWork 'Menu.html') fout backtrack ; [#!: menuHeadFoot_writeTo_fout (link d_webWork 'Menu Howell videos.html') fout backtrack ; 24************************24 07Nov2020 Abandoned as useless : created : webPage_update_output IS - write results to log file pinn_writeExecute_pout - I commented out : % 0Nov2020 comment out : IF (~= stdTmp pouter) THEN pouter := link d_out subDir ; ENDIF ; pouter >> pouter is last line of optr webPage_update_test on 'test- page Howell - blog.html convertBodyLinks.html' : >> WRONG!!!, needs FULL subDir!!! most links are the same... what is wrong? nyet : I think its because pinn_writeExecute_pout is baffled by having dwebRaw & d_webSite in same dir? nyet - useless : I created webPage_update_output to handle test webPage updates. I removed the 'pouter' return of pinn_executeEmbeddedTo_pout!! (idiot) 02--02 Set of tests : see link d_Qndfs 'file_ops.ndf' >> the full path of a file is NOT provided, nor are the '../' Where is "backtrack"? It's in pinn_executeEmbedsTo_pout I have to find the full path to [fnames, subDirs] in pinn_executeEmbedsTo_pout Major adaptation from internalLinks_return_backupSubDirFnames 17:38 OK - loaddefs works, but probably not pinn_executeEmbedsTo_pout Break for the day, do dishes. 24************************24 07Nov2020 05-----05 See how many linkErrors remain : $ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" | grep --invert-match "z_Archive" 02--02 These ones are known problems (probably never did create the files?) : /media/bill/SWAPPER/Website - raw/page Howell - blog.html:75:
  • 02--02 These ones shouldn't be a problem, but... : fname problems? : /media/bill/SWAPPER/Website - raw/page Howell - blog.html:763:
  • >> just recopy path /media/bill/SWAPPER/Website - raw/page Howell - blog.html:888:
  • >> oops, apo /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/Howell - pandemics and disease blog.html:45:

    >> oops "2020" /media/bill/SWAPPER/Website - raw/page Publications & reports.html:61:
  • Bill Howell "Are we ready for global cooling?" - A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)

    /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:177: /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:179: /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:181: /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:194: /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:196: /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:204: >> I either fixed the fname, or re-[copy, paste]ed the fname ?subDir not working? : /media/bill/SWAPPER/Website - raw/page Software programming.html:56:
    /media/bill/SWAPPER/Website - raw/page Software programming.html:12:
  • /media/bill/SWAPPER/Website - raw/index.html:100:
  • Cool emails /media/bill/SWAPPER/Website - raw/index.html:122:
  • Puetz greatest of cycles /media/bill/SWAPPER/Website - raw/index.html:130:
  • System_maintenance /media/bill/SWAPPER/Website - raw/page blogs.html:15:
  • /media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:18:
  • >> I corrected "Software programming & code/Qnial programming language/" -> "Software programming & code/Qnial/" >> I corrected several other subDir mistakes, including the requirement for a FULL subDir >> Is the "multiple subD" issue manifesting? eg [Cool emails, Scenes,...] /media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/General Relativity is a turkey, Quantum Mechanics is a fools paradise.html:13:I first posted this theme in my review "???". >> OK - never put in a link. I think this was 'Howell - review of Holverstott 2016 Hydrino energy.pdf' /media/bill/SWAPPER/Website - raw/index.html:111:
  • Linux bash scripts >> OK - wrong subDir I changed to 'Software programming & code/bin/' 02--02 05-----05 05-----05 Rerun qnial> loaddefs link d_Qtest 'Website updates- tests.ndf' 05-----05 qnial> webSite_convertBodyLinks 02--02 ?webPage_convertBodyLinks file unknown error, one of : /media/bill/SWAPPER/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html /media/bill/ramdisk/convertBodyLinks temp.txt diff: /media/bill/SWAPPER/Website - raw/z_Archive/201107 09h58m28s backups/Nuclear for tar sands 23Sep05.html: No such file or directory diff: /media/bill/SWAPPER/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html: No such file or directory 02--02 >> If p_htmlFileList is updated each loaddef, why does 'Nuclear for tar sands 23Sep05.html' even appear? Are z_archive files included (shouldn't be)? >> 'Nuclear for tar sands 23Sep05.html' WAS in p_htmlFileList. Why? Check 'webWork files/201107 09h58m28s webSite_convertBodyLinks log.txt' 02--02 These changes didn't work? for now - manually fix in d_webRaw "active" files by adding [#=; backtrack ;=#]: webPage : "Past & future worlds.html" diff results :
  • webPage : "index.html" diff results :
  • Cool emails
  • Linux bash scripts
  • Puetz - greatest of cycles
  • System_maintenance webPage : "page blogs.html" diff results :
  • webPage : "Howell - corona virus.html" diff results : webPage : "page Software programming.html" diff results :

  • >> Check the new d_webRaw html files : Maybe the problem is with : "reaching up" the directory path? - maybe look at later.. "Howell - review of Holverstott 2016 Hydrino energy". webPage : "page Howell - blog.html" diff results :
  • webPage : "Howell - pandemics and disease blog.html" diff results :

    >> OK - they are all good. 02--02 Other problems to fix : >> why wasn't !!linkError!! (or [#=; backtrack ;=#]) inserted? >> I manually put in [#=; backtrack ;=#] 02--02 05-----05 $ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" | grep --invert-match "z_Archive" /media/bill/SWAPPER/Website - raw/page Howell - blog.html:75:
  • >> leave it as a known problem, I added a comment that the link won't work /media/bill/SWAPPER/Website - raw/page Publications & reports.html:61:
  • Bill Howell "Are we ready for global cooling?" - A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)

    >> I added [#=; backtrack ;=#] 24************************24 #] 06Nov2020 webSite_convertBodyLinks link d_webRaw 'webWork files/201106 17h48m29s webSite_convertBodyLinks log.txt' >> most have no diff (no changes) those that do show NO [#=; backtrack ;=#] >> I must have the diff files switched? - yes, this was fixed take changes "permanent" At point, webSite_convertBodyLinks does NOT change the original files. check "_Climate and sun.html" : But first, change original file & re-run (I haven't set up a full webSite test). Change : +.....+ webPage_convertBodyLinks l o webPage ; +.....+ To : +.....+ webPage_convertBodyLinks l l webPage ; +.....+ >> Mostly looks good, and at least subDirs shortened to last subDir and have inserted [#=; backtrack ;=#] >> OOPS, all files gave diff error, eg : diff: /media/bill/ramdisk/convertBodyLinks temp.txt: No such file or directory This moves p_temp_BodyLinks IF flag_move THEN host link 'mv "' p_temp_BodyLinks '" "' webPage '"' ; ENDIF ; so the appropriate diff is between [(link d_htmlBackup fname), webPage] However, that won't help now that the original htmls have been changed. It's still possible by restoring 24************************24 #] 06Nov2020 05-----05 >> OCH!! tons of '!!linkError!!' This result is wrong - I must have a coding error that generates this problem? 1. reinstate "$d_webRaw""z_Archive/201105 18h37m51s backups/" 2. manual fix of links on webSite 3. redo all 'Website updates- tests.ndf' - find problem 4. fix problem with webPage_convertBodyLinks 05-----05 1. reinstate "$d_webRaw""z_Archive/201105 18h37m51s backups/" qnial> dirBackup_restoreTo_paths (link d_webRaw 'z_Archive/201105 18h37m51s backups/') p_webPageList $ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" >"$d_webRaw""webWork files/5 linkerrors.xt" >> 506 lines affected! - many subDirs that don't with `/? - many moved web-[page, dir]s - maybe [TableOfContent, menuHeadFoot]?? Too many z_Archive : $ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" | grep --invert-match "z_Archive" >"$d_webRaw""webWork files/5 linkerrors.xt" >> Now down to "only" 124 problems in 17 html files. 05-----05 2. manual fix of links on webSite Fix the 17 html files NOW! reduce extra work later. see commentary in : "$d_webRaw""webWork files/201106 5 linkerrors.txt" >> Maybe "???". 05-----05 delete redundant directories [d_webSite, www.BillHowell.ca] - video [production, active] - already gone - another subDir can't remember... 05-----05 3. redo all 'Website updates- tests.ndf' - find problem internalLinks_return_backupSubDirFnames_test >> ALL failed! this is an issue! internalLinks_return_backupSubDirFnames IS OP strLeft strRight line >> no longer uses d_webRaw >> OK, now all tests are OK 05-----05 4. fix problem with webPage_convertBodyLinks webPage_convertBodyLinks_test qnial> webPage_convertBodyLinks_test diff: /media/bill/PROJECTS/Qnial/code develop_test/test- HELP.html convertBodyLinks.html: No such file or directory >> What? that file exists!??? OOPS - extra space was removed >> only ONE test was logged : webPage_convertBodyLinks_test for : "test- page Howell - blog.html" >> OOPS! did I set webPage_convertBodyLinks to repeatedly delete p_log? No - only when test is initiated, so thatOK. So why only one test was logged? -> the first test Yet obviously from the error message, 'test- HELP.html' was run. 
p_log is opened in "a (append) mode; no host commands have `>, just '>>'. I fixed a problem - should open p_log in BOTH [webPage_convertBodyLinks_test, webPage_convertBodyLinks_output] (the append-and-diff logging pattern is sketched in bash after the diff listing below) webPage_convertBodyLinks_test >> OK, now all tests run, only one has diff results (i.e. no changes to the others) ........ # webPage_convertBodyLinks example #4 webPage_convertBodyLinks_test for : "test- _Lies, damned lies, and scientists.html" diff results :
  • Title, table of contents, copyright
  • Introduction
  • Intro - The impossible conclusion, Scientists can't think
  • logical, and scientific thinking by essentially all scientists
  • profile
  • challenged, and when they crumble
  • and Scientific Thinking?
  • B2 - Cheating theory and Game theory
  • B3 - Pre-and-post-Science Philosophies for Thinking
  • C1 - A Brave new world
  • C2 - The rise & fall of Enlightenment
  • C3 - Suggestions for science, policy, and society
  • D0 - Conclusions
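For orientation, the append-and-diff logging pattern used in the 05Nov2020 code below (a new timestamped backup dir per run, then append only the changed lines of each webPage to p_log) is roughly the following in bash - a hedged sketch only, all paths are hypothetical stand-ins :
+.....+
#!/usr/bin/env bash
# hedged sketch of the backup-then-log-diffs pattern; all paths are hypothetical stand-ins
webPage='/tmp/webRaw/page example.html'
d_htmlBackup="/tmp/webRaw/z_Archive/$(date +'%y%m%d %Hh%Mm%Ss') backups"
p_log='/tmp/webWork/convertBodyLinks log.txt'
mkdir -p "$d_htmlBackup" "$(dirname "$p_log")"   # new backup dir for every run - damage can be VERY time-costly
cp "$webPage" "$d_htmlBackup/"                   # keep the original before conversion overwrites it
# ... conversion rewrites "$webPage" in place ...
printf '........\nwebPage : "%s"\ndiff results : \n' "$(basename "$webPage")" >>"$p_log"
diff "$d_htmlBackup/$(basename "$webPage")" "$webPage" | grep '^>' | sed 's/^> //' >>"$p_log"   # append only the changed lines
echo "" >>"$p_log"
+.....+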
  • 24************************24 #] 05Nov2020 webSite_convertBodyLinks Code modified to [capture process p_log, backup files, log diffs] 02--02 % create a new backup directory for every use of webSite_convert, as damage can be VERY time-costly ; d_htmlBackup := link d_webRaw 'z_Archive/' timestamp_YYMMDD_HMS ' backups/' ; host link 'mkdir "' d_htmlBackup '" ' ; % ; flog := open p_log "w ; flog EACHRIGHT writefile '********' (link 'webSite_convertBodyLinks, ' timestamp_DDMMMYYYY_HMS) '' ; close flog ; FOR webPage WITH (strList_readFrom_path p_webPageList) DO fname := path_extract_fname webPage ; p_backup := link d_htmlBackup fname ; flog := open p_log "a ; flog EACHRIGHT writefile '........' (link 'webPage : "' fname '"') 'diff results : ' ; close flog ; % webPage_convertBodyLinks o webPage d_webRaw ; % host (link 'diff --width=85 "' p_backup '" "' webPage '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ; host link 'echo "" >>"' p_log '"' ; ENDFOR ; 02--02 05-----05 qnial> webSite_convertBodyLinks 2_website htmlFileLs.txt : /media/bill/SWAPPER/Website - raw/Projects - mini/Diversity - ssh site/diversity_public/home.html /media/bill/SWAPPER/Website - raw/Projects - mini/Solar system/Cdn Solar Forecasting/Canadian Solar Workshop 2006 home page.html /media/bill/SWAPPER/Website - raw/Projects - mini/Solar system/Cdn Solar Forecasting/CSWProgram.html /media/bill/SWAPPER/Website - raw/Projects - mini/wordpress site/Authors Guide BLOG home.html /media/bill/SWAPPER/Website - raw/security/encryption-decryption instructions.html /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html /media/bill/SWAPPER/Website - raw/Solar modeling and forecasting/_Solar modeling & forecasting.html /media/bill/SWAPPER/Website - raw/Steven H Yaskell/0_Steven H Yaskell.html /media/bill/SWAPPER/Website - raw/webWork files/4_test Kyoto Premise - the scientists arent wearing any clothes (copy).html /media/bill/SWAPPER/Website - raw/webWork files/fin organisations.html 201105 17h55m50s webSite_convertBodyLinks log.txt : webPage : "home.html" webPage : "Canadian Solar Workshop 2006 home page.html" webPage : "CSWProgram.html" webPage : "Authors Guide BLOG home.html" webPage : "encryption-decryption instructions.html" webPage : "email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html" webPage : "Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html" webPage : "QNial.html" webPage : "_Solar modeling & forecasting.html" webPage : "0_Steven H Yaskell.html" webPage : "4_test Kyoto Premise - the scientists arent wearing any clothes (copy).html" Only one NOT in 'webSite_convertBodyLinks log.txt : ' : /media/bill/SWAPPER/Website - raw/webWork files/fin organisations.html >> worry about later 05-----05 Now to uncomment : % webPage_convertBodyLinks o webPage d_webRaw ; % host (link 'diff --width=85 "' p_backup '" "' webPage '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ; qnial> webSite_convertBodyLinks >> OOPS! backups are NOT being made!! 
>> The actual backups are done by webPage_convertBodyLinks, whick I hadn't activated Try with flag_backup, but not yet flag_move qnial> webSite_convertBodyLinks >> NO diffs? Maybe the last time I over-wrote the webPages, it was done properly? 05-----05 Big mistake with my find in this section : ignore it all $ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" >"$d_webRaw""webWork files/5 linkerrors.xt" >> using "!!linkError!!" gave jillions of : grep: invalid max count >> OCH!! tons of '!!linkError!!' This result is wrong - I must have a coding error that generates this problem? Many are the result of a lack of `/ at the end of a subDir Maybe due to eliminating d_webaw as "base dir"!!!???!!! 467 cases in total - I have to sed these out and re-translate!! find ONLY in recent backup dir $ find "$d_webRaw""z_Archive/201105 18h37m51s backups/" -maxdepth 0 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" >"$d_webRaw""webWork files/5 linkerrors.xt" Nyet - reinstate "$d_webRaw""z_Archive/201105 18h37m51s backups/" fix problem with webPage_convertBodyLinks break for the day! 24************************24 #] 02Nov2020 test "real" webPages to make sure they are saved, and that other problems don't arise. Adjust link d_Qtest 'Website updates.ndf' and cold loaddef : Uncomment in webPage_convertBodyLinks : % host link 'mv "' p_temp_webPage_convertEncoding '" "' webPage '"' ; Set flag_backup := l qnial> bye / qnial qnial> loaddefs link d_Qtest 'Website updates.ndf' define pinn - use one of already-tested webPages, example : qnial> pinn := link d_webRaw 'page Howell - blog.html' qnial> webPage_convertBodyLinks l pinn d_webRaw analyse the results - check diff output - open both the [raw, bodyLinked] versions, visually compare by searching 'A >"' p_log '"')) ) EACHLEFT link chr_apo ; write writefileList ; EACH host writefileList ; 05-----05 qnial> pinn := link d_webRaw 'page Howell - blog.html' qnial> webPage_convertBodyLinks l pinn d_webRaw # $ diff --width=85 "$d_webRaw""page Howell - blog.html" "$d_Q_tests""test- page Howell - blog.html convertBodyLinks.html" --suppress-common-lines Issues : +-+ +-+ 05-----05 qnial> pinn := link d_webRaw 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' qnial> webPage_convertBodyLinks l pinn d_webRaw # $ diff --width=85 "$d_webRaw""economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html" "$d_Q_tests""test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html" --suppress-common-lines Issues : +-+ +-+ 05-----05 qnial> pinn := link d_webRaw 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html' qnial> webPage_convertBodyLinks l pinn d_webRaw # $ diff --width=85 "$d_webRaw""Pandemics, health, and the Sun/corona virus/Howell - corona virus.html" "$d_Q_tests""webPage_convertEncoding temp.txt" --suppress-common-lines Issues : +-+ +-+ 24************************24 #] 02Nov2020 link_fixErrs - remove ../ and fix many links loaddefs link d_Qtest 'Website updates- tests.ndf' internalLinks_return_backupSubDirFnames_test >> OK, seems to work fine now on all tests webPage_convertBodyLinks_test >> I no longer have diff results? lost the code I threw quick code up - doesn't see, to be working?... 
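As an aside, the [flag_backup, flag_move] behaviour noted above (webPage_convertBodyLinks backs up the original, and only overwrites it when flag_move is set) amounts to the following pattern - a hedged bash sketch, every name here is a hypothetical stand-in, not the QNial code itself :
+.....+
#!/usr/bin/env bash
# hedged sketch of the flag_backup / flag_move pattern; names and paths are hypothetical
flag_backup=true
flag_move=false
webPage='/tmp/webRaw/page example.html'
p_temp_BodyLinks='/tmp/ramdisk/convertBodyLinks temp.txt'
d_htmlBackup="/tmp/webRaw/z_Archive/$(date +'%y%m%d %Hh%Mm%Ss') backups"
if $flag_backup ; then
    mkdir -p "$d_htmlBackup"
    cp "$webPage" "$d_htmlBackup/"        # keep the original before anything touches it
fi
# ... conversion writes its output to "$p_temp_BodyLinks" ...
if $flag_move ; then
    mv "$p_temp_BodyLinks" "$webPage"     # only now is the original overwritten
fi
+.....+
With flag_move off the webPage itself is never changed, so a diff between the backup copy and webPage would be empty - one possible explanation for the "NO diffs" result above.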
05-----05 First test : p_inn := link d_Qtest 'test- Howell - corona virus.html' ; >> looks good, with notable issues : 02--02 >> Most (10) bodyLinks work well, eg : 89,90c87,88 < Anglophone < Scandanavia --- > Anglophone > Scandanavia 02--02 >> Some IMG work, eg : 256c254 < --- > 272c270 < --- > 02--02 >> Why didn't these work? IMG may not be working? Maybe the files were renamed? >> Yikes! It appears that I overwrote the test file rather than the processed file. >> Many already have !!linkError!! although the program should handle these. 179c177 < --- > 181c179 < --- > 02--02 >> Two cases where subDirs are included when that is not necessary?

    02--02 >> Missing lines : "webPage_convertEncoding temp.txt" --suppress-common-lines 66,67d65 < < 02--02 >> Is there a problem with the initial removal of [#=; backtrack ;=#]? But that didn't seem to be a problem with most? >> There is no diff output? But it works well from bash!! Still, the output file is better than what I had, so I will rename it and copy over the existing standard. Fix diff problem! >> Nuts, I shortened the filename : p_std := link d_Qtest 'test- Howell - corona virus.html webPage_convertBodyLinks.html' ; >> I removed these. Try the same test again. >> OK (no change in output), but diff STILL doesn't work.
05-----05 Second test : $ diff --width=85 "$d_Qroot""code develop_test/'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html" "$d_temp""webPage_convertEncoding temp.txt" --suppress-common-lines NO initial conversion?? No faults, either. I was missing flag_backup (o) : webPage_convertBodyLinks o p_inn p_std ; Rerun : >> Works beautifully, including partial subDirs. still no QNial diff, though.
05-----05 Third test : 'test- page Howell - blog.html' As with previous tests : run once, over-write p_std, then manually check. (diff will give no feedback at this stage). Issues : +-+ 02--02 >> Awesome - most link conversions worked!
05-----05 Fourth test : 'test- HELP.html' This is a 'Conference Guide' webPage! As with previous tests : run once, over-write p_std, then manually check. (diff will give no feedback at this stage). Issues : Not many BodyLinks, but at least the mailto:s look OK. The Conference Guide menu is part of the raw files. A great deal of work to fix all that up to the current approach. Forget it.
05-----05 Fifth & final test : 'test- Menu.html' This is a 'Conference Guide' menu! As with previous tests : run once, over-write p_std, then manually check. (diff will give no feedback at this stage). Issues : No bodyLinks. All this shows is that it's a waste of time to process menuHeadFoot files...
05-----05 olde code % Doesn't work!?
; midIndxs midLinks := EACH link fixIndxs (EACH solitary fixLinks) ; midIndxsLines_ignoreBads_test ; link_fixErrs_test ; 24************************24 #] 02Nov2020 reve old code IF flag_debug THEN write 'loading midIndxsLines_ignoreBads_test' ; ENDIF ; #] midIndxsLines_ignoreBads_test IS - midIndxsLines_ignoreBads_test IS { LOCAL are_good ; % ; EACH write_testStr '#05-----05' (link 'midIndxsLines_ignoreBads_test, ' timestamp) ; i_test := 0 ; % ; i_test := i_test + 1 ; t_name := link '# midIndxsLines_ignoreBads_test example #' (string i_test) ; t_input := (22 4 3 12) ('Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' 'https://www.mackinac.org/SP1998-01' '#Key [results, comments]') ; t_standard := (22 12) (solitary 'Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf') ; t_result := midIndxsLines_ignoreBads t_input ; test_comment t_name t_input t_standard t_result ; % ; i_test := i_test + 1 ; t_name := link '# midIndxsLines_ignoreBads_test example #' (string i_test) ; t_input := 22 (solitary 'https://www.mackinac.org/SP1998-01') ; t_standard := null null ; t_result := midIndxsLines_ignoreBads t_input ; test_comment t_name t_input t_standard t_result ; } IF flag_debug THEN write 'loading link_fixErrs_test' ; ENDIF ; #] link_fixErrs_test IS - link_fixErrs_test IS { LOCAL are_good ; % ; EACH write_testStr '#05-----05' (link 'link_fixErrs_test, ' timestamp) ; i_test := 0 ; % ; i_test := i_test + 1 ; t_name := link '# link_fixErrs_test example #' (string i_test) ; t_input := (solitary 22) (solitary 'Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf') ; t_standard := (solitary 22) (solitary 'Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf') ; t_result := link_fixErrs t_input ; test_comment t_name t_input t_standard t_result ; EACH write '........' '30Oct2020 Simple case of fname-only' '' ; % ; i_test := i_test + 1 ; t_name := link '# link_fixErrs_test example #' (string i_test) ; t_input := (22 4 3) ('#Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' 'Civilisations and sun/' '[#=; backtrack ;=#]Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.') ; t_standard := (solitary 4) (solitary 'Civilisations and sun/') ; t_result := link_fixErrs t_input ; test_comment t_name t_input t_standard t_result ; EACH write '........' '30Oct2020 Initially returned null. ../ problem?' '' ; % ; i_test := i_test + 1 ; t_name := link '# link_fixErrs_test example #' (string i_test) ; t_input := (22 4 3) ('Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' 'https://www.mackinac.org/SP1998-01' '#Key [results, comments]') ; t_standard := null null ; t_result := link_fixErrs t_input ; test_comment t_name t_input t_standard t_result ; EACH write '........' '30Oct2020 Initially returned null. ../ problem?' '' ; } # find_Howell '#Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' allFnamesGradeupList 24************************24 #] 01Nov2020 link_fixErrs - remove ../ and fix many links good progress, not done initial revamp... 
05-----05 link d_Qndfs 'Website updates.ndf' # old code IF (NOT err_fixed) THEN % Later, tackle common problems [./, ???, etc] ; null ; ENDIF ; IF (NOT isfault (find_Howell errLinks@i allFnamesSortedByFname)) THEN fixIndxs@i := solitary errIndxs@i ; fixLinks@i := solitary errLinks@i ; err_fixed := l ; ENDIF ; % check if a legitimate subDir without fname ; IF (NOT err_fixed) THEN subLink := str_remove_subStr errLinks@i '../' ; IF (NOT isfault (find_Howell subLink allSubDirsSortedBySubdir)) THEN fixIndxs@i := solitary errIndxs@i ; fixLinks@i := solitary subLink ; err_fixed := l ; ENDIF ; ENDIF ; IF (NOT OR (isfault midIndxs) (= null midIndxs)) IF flag_debug THEN write 'loading link_fixErrs' ; ENDIF ; #] link_fixErrs IS OP errIndxs errLinks - returns fixed links, or the original "bad link" # Fixing links is noble, but perhaps more important is labelling those that are flawed? EACH non-http link should have EITHER '[#=; backtrack ;=#]' or '!!linkError!!', but not both. link_fixErrs IS OP errIndxs errLinks { LOCAL i err_fixed fixIndxs fixLinks fixLink subLink ; NONLOCAL allFnamesSortedByFname allSubDirsSortedBySubdir ; fixIndxs fixLinks := errIndxs errLinks ; % 31Oct2020 IMPORTANT - use [strList, NOT indxs] for shape-related optrs, as atomic indxs screw up ; err_fixed := o ; FOR i WITH (tell (gage shape errLinks)) DO % check if legitimate fname-only. If so - don't label as error. ; IF (NOT isfault (find_Howell errLinks@i allFnamesSortedByFname)) THEN err_fixed := l ; ENDIF ; % check if a legitimate subDir without fname ; IF (NOT err_fixed) THEN IF (`/ = (last fixLinks@i)) THEN subLink := str_remove_subStr errLinks@i '../' ; IF (NOT isfault (find_Howell subLink allSubDirsSortedBySubdir)) THEN fixLinks@i := subLink ; err_fixed := l ; ENDIF ; ENDIF ; ENDIF ; IF (NOT err_fixed) THEN % label this link as an error as it has failed all tests ; fixLinks@i := link '!!linkError!!' fixLinks@i ; ENDIF ; ENDFOR ; fixIndxs fixLinks } #] midIndxsLines_ignoreBads IS OP webPage webSite - remove "bad links" that should not be processed # 29Oct2020 initial IF flag_break THEN BREAK ; ENDIF ; 0 midIndxsLines_bads := 'http' '#' 'mailto:' '!!linkError!!' ; midIndxsLines_badShapes := EACH (gage shape) midIndxsLines_bads ; midIndxsLines_ignoreBads IS OP midIndices midList { LOCAL are_good result ; NONLOCAL midIndxsLines_bads midIndxsLines_badShapes ; IF (= null midIndices) THEN result := null ; ELSE takeArgs := midIndxsLines_badShapes cart midList ; are_good := NOT EACH OR (cols EACHALL OR (midIndxsLines_bads EACHLEFT EACHRIGHT = (EACH take takeArgs))) ; result := are_good EACHRIGHT sublist midIndices midList ; ENDIF ; result } # 'mailto:IEEE%20WCCI%202020%20HELP%20daemon%20?subject=IEEE%20WCCI%202020%20HELP%20:%20testing&body=Approximately 10 minutes after sending this email, you should receive two emails :%0D%0A 1. a confirmation that you have sent the email%0D%0A 2. the email that was forwarded by the daemon : addressed to me, and cc-d to you. %0D%0A Normally you do NOT receive the forwarded email, just the confirmation.%0D%0A%0D%0AYou can type in the email body below, and add cc: recipients, but extra To: recipients are ignored. 
DO NOT change the Subject:, or your email goes straight to trash!') are_good := NOT ( = (EACH take takeArgs))) ; zzTHEN zz fixIndxs fixLinks := errIndxs errLinks ; zz % 31Oct2020 IMPORTANT - use [strList, NOT indxs] for shape-related optrs, as atomic indxs screw up ; zz err_fixed := o ; IF (NOT err_fixed) THEN % label this link as an error as it has failed all tests ; fixLinks@i := link '!!linkError!!' fixLinks@i ; ENDIF ; fixIndxs fixLinks % remove [#=; backtrack ;=#] if present' ; % find [[dir, fname]-only, '!!linkError!!'] ; fnames := (`/ EACHRIGHT (1 + last findall) midLinks) EACHBOTH drop midLinks ; fnames := (`/ EACHRIGHT (1 + last findall) midLinks) EACHBOTH drop midLinks ; link_errs := EACH isfault fnames ; link_goos := EACH NOT link_errs ; % goos means good [indx, link]s, aligns coding ; errIndxs errLinks := link_errs EACHRIGHT sublist midIndxs midLinks ; gooIndxs gooLinks := link_goos EACHRIGHT sublist midIndxs fnames ; % link_fixErrs returns (fixIndxs fixLinks), including the original "bad links" ; % This avoids loss of text and hints at how to manually fix the links ; % ordering of the midLinks is unimportant, as long as [Indx, Link]s are properly paired ; fixIndxs fixLinks := link_fixErrs errIndxs errLinks ; midIndxs midLinks := gooIndxs gooLinks EACHBOTH link fixIndxs fixLinks ; midLinks := '[#=; backtrack ;=#]' EACHRIGHT link midLinks ; # old code fnameIndxs := fnames EACHLEFT find_Howell allFnamesSortedByFname ; fnameSubDirs := fnamesIndxs EACHLEFT pick allSubDirs ; midLinks := midLinks EACHLEFT str_remove_subStr d_webRaw ; link_errs := EACH isfault fnames ; IF (OR link_errs) THEN link_goos := EACH NOT link_errs ; % goos means good [indx, link]s, aligns coding ; errIndxs errLinks := link_errs EACHRIGHT sublist midIndxs midLinks ; gooIndxs gooLinks := link_goos EACHRIGHT sublist midIndxs midLinks ; % link_fixErrs returns fixed errLinks, or the original "bad links" ; % This avoids loss of text and hints at how to manually fix the links ; fixIndxs fixLinks := link_fixErrs errIndxs errLinks ; % ordering of the midLinks is unimportant, as long as [Indx, Link]s are properly paired ; midIndxs midLinks := gooIndxs gooLinks EACHBOTH link fixIndxs fixLinks ; ENDIF ; midIndxs midLinks := gooIndxs gooLinks EACHBOTH link fixIndxs fixLinks ; midIndxs midLinks := link_fixErrs midIndxs midLinks ; 24************************24 #] 31Oct2020 link_fixErrs loaddefs link d_Qtest 'Website updates- tests.ndf' qnial> link_fixErrs_test Tricky solitary issues... allSubDirsList - WRONG!, it includes filenames. [webSite_extractAll_pathsSubDirsFnames, webSite_extractHTML_pathsSubDirsFnames] : rewritten, corrected, update 'Website updates- tests.ndf' Now to test : qnial> internalLinks_return_backupSubDirFnames_test Many, many, many fixes (round in circles). Only remaining problem : 05-----05 # internalLinks_return_backupSubDirFnames_test example #2 : FAILED - result does NOT match standard t_input, t_standard, t_result = +---------02--02-----------------------------------------------------+----------------------------------+ ||
  • |/media/bill/SWAPPER/Website - raw/| +---------02--02-----------------------------------------------------+----------------------------------+
  • ........ #] 30Oct2020 Simple check of multiple "Table of Contents
  • >> no result 02--02 While most results are provided in sections above, links to data [spreadsheets, text files] and software [???, source code] are listed below along with brief comments. A full listing of files (including other SP500 web-pages) can be seen via this Directory's listing. Hopefully this will help those who want to do something different, as the programs etc may help with [learning, debugging].
  • gnuplot I've used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directory's listing.
  • gnuplot.sh is the tiny bash script used to select gnuplot scripts. My other bash scripts can be found here.
  • QNial programming language - Quenn's University Nested Interactive Array Language (Q'Nial) is my top prefered programming language for modestly complex to insane programming challenges, along with at least 3 other people in the world. Bash scripts make a great companion to QNial. semi-log formula.ndf is the tiny "program" used to set up the semi-log line fits. More generally : here are many of my QNial programs. Subdirectories provide programs for various projects etc. >> Oops - entire paragraphs removed! no file (need to add) >> I need to fix this 02--02 All ' loaddefs link d_Qtest 'Website updates- tests.ndf' currently tests only 'test- page Howell - blog webPage_convertBodyLinks.html' qnial> webPage_convertBodyLinks_test 05-----05 -->[stepv] nextv ?.. Fnames := ( `/ EACHRIGHT ( 1 + last findall ) Linelist # Indices ) EACHBOTH drop Linelist # Indices -->[nextv] +---------+ |SP1998-01| +---------+ ?.. Fnamesubdirs := ( Fnames EACHLEFT find Fhtmlgradeuplist ) EACHLEFT pick Htmlfilesgradeuplist -->[nextv] ?address ?.. Fnamesubdirsweb := Fnamesubdirs cart ( solitary Website ) -->[nextv] +---------------------------------------------+ |+--------+----------------------------------+| ||?address|/media/bill/SWAPPER/Website - raw/|| |+--------+----------------------------------+| +---------------------------------------------+ 05-----05 >> big screwup, I need to exclude ^[http, #, mailto:] I added : indiceslineList_bads := 'http' '#' 'mailto:' ; indiceslineList_badShapes := EACH (gage shape) indiceslineList_bads ; indiceslineList_removeHttpHashMailto IS OP indices lineList { LOCAL are_good ; NONLOCAL indiceslineList_bads indiceslineList_badShapes ; IF flag_break THEN BREAK ; ENDIF ; IF (= null indices) THEN null ELSE takeArgs := indiceslineList_badShapes cart lineList ; are_good := NOT EACH OR (cols EACHALL OR (indiceslineList_bads EACHLEFT EACHRIGHT = (EACH take takeArgs))) ; are_good EACHRIGHT sublist indices lineList ENDIF } Retry qnial> fonn qnial> webPage_convertBodyLinks_test All of a sudden, path_retrieve_subDirFname_test fails!? Glad that I set up the test!!! Actually, I changed path_retrieve_subDirFname to be more generic, so I must change the test. >> OK, works revamped internalLinks_return_backupSubDirFnames : internalLinks_return_backupSubDirFnames IS OP strLeft strRight line webSite { LOCAL fnames fnamesIndices fnameSubDirs fnameSubDirsWeb midlList midlIndices ; NONLOCAL htmlFnamesGradeupList htmlSubDirsList ; IF flag_break THEN BREAK ; ENDIF ; midlIndices lineList := str_splitLftRgtTo_midlIndices_StrList '' line ; IF (~= null midlIndices) THEN midlList := midlIndices EACHLEFT choose lineList ; midlIndices midlList := midlIndicesLines_removeBads midlIndices (midlIndices choose lineList) ; IF (~= null midlList) THEN fnames := (`/ EACHRIGHT (1 + last findAll_Howell) midlList) EACHBOTH drop midlList ; IF (isfault fnames) THEN line ELSE fnamesIndices := fnames EACHLEFT find htmlFnamesGradeupList ; fnameSubDirs := fnamesIndices EACHLEFT pick htmlSubDirsList ; midlList := '[#=; backtrack ;=#]' EACHRIGHT link fnameSubDirs ; lineList#midlIndices := internalLinks_return_backupSubDirFnames midlIndices lineList webSite ; link lineList ENDIF ELSE line ENDIF ; ELSE line ENDIF } # loaddefs link d_Qndfs 'Website updates.ndf' # loaddefs link d_Qtest 'Website updates- tests.ndf' # webPage_convertBodyLinks_test >> Oops all lines with '> Oh, OK. This is an old filename still in the webPage. So the line should be returned! >> I put in findAll_Howell, and cfor an error ?.. 
Linelist # Midlindices := internallinks_return_backupsubdirfnames Midlindices Linelist Website -->[nextv] +----------------------+---------+-------------02--02+ |
  • ||| +----------------------+---------+-------------02--02+ >> OOPS again Change : +.....+ lineList#midlIndices := internalLinks_return_backupSubDirFnames midlIndices lineList webSite ; +.....+ To : +.....+ lineList#midlIndices := midlList ; +.....+ Dark matter video 1 - initial, simple.mpeg >> NUTS!! I have to use allFnamesGradeupList etc, because links aren't restricted to html files. >> Have to bye & start - somehow new coding didn't take effect I got internallinks_return_backupsubdirfnames to work Still no '17) errors reported like : 05-----05 /media/bill/PROJECTS/Qnial/code develop_test/test- page Howell - blog.html WRONG! non-null diff result : 42d41 <
  • 48c47 <
  • --- >
  • 55c54 < If we take an "Electric Universe" perspective, then perhaps shifts in the galactic currents could be expected to "light up" or "extinguish" stars to various degrees as the currents shift and move. In other words, the "lit-up regions" motions may relate more to drifts of galactic currents than to the motions of the stars themselves? My own [cheap, crappy] animation for the spiral currents moving through stationary stars is shown in my video (mpeg format) : Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Dark matter video 1 - initial, simple.mpeg

  • >> drop the entire line :
  • >> but this line appears (modified) :
  • >> but this line was properly translated :
So - just mv : link d_temp 'webPage_convertEncoding temp.txt' back to the test standard file : link d_Qtest 'test- page Howell - blog webPage_convertBodyLinks.html' Rerun : qnial> webPage_convertBodyLinks_test #05-----05 str_to_unicodeList_test, Thu Oct 29 19:56:31 2020 # webPage_convertBodyLinks example #1 /media/bill/PROJECTS/Qnial/code develop_test/test- page Howell - blog.html OK - diff is null, so the standard result was generated. Looks great! But there may be errors that I am not picking up There are three other files - do tomorrow as I'm too blah to do a good job now. 05-----05 Remove old code fileops.c IF flag_debug THEN write 'loading path_retrieve_subDirFname' ; ENDIF ; #] path_retrieve_subDirFname IS OP path dirBase - returns fnameSubDir for an fname in dirBase # 28Oct2020 fix links in the body of the dirBase # 29Oct2020 make more generic - remove conditions for [http, #, [#=; backtrack ;=#]] # This is vulnerable to duplicate filenames in different directories!!! Only the first match is used. # I need to do more for paths with `#, as some do require processing. later ... path_retrieve_subDirFname IS OP path dirBase { LOCAL fname fPath subDirFname ; NONLOCAL webSiteAllPathList ; fname := path_extract_fname path ; IF (isfault fPath) THEN path ELSE subDirFname := str_extractPast_strFront path dirBase ; link '[#=; backtrack ;=#]' subDirFname ENDIF } # for tests, see link d_Qtest 'file_ops- test.ndf' # old code IF (chr_in_str `# path) THEN path ELSEIF (= 'http' ( 4 take path)) THEN path ELSE fname := path_extract_fname path ; fPath := first ((fname EACHRIGHT str_in_path webSiteAllPathList) sublist webSiteAllPathList) ; % write 'fPath = ' fPath ; IF (isfault fPath) THEN path ELSE subDirFname := str_extractPast_strFront fPath dirBase ; link '[#=; backtrack ;=#]' subDirFname ENDIF ENDIF 05-----05 str_splitLftRgtTo_midIndxs_StrList MUST return a list of solitary indices, not a list of numbers. Otherwise, a list of one integer causes faults!! #] 31Oct2020 NYET - I reverted this, and simply used : FOR i WITH (tell (gage shape errLinks)) DO 02--02 str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str L midIndxs := EACH solitary (tell (gage shape splits)) ; >> This will affect many operators!!!
$ find "$d_Qndfs" -maxdepth 3 -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "str_splitLftRgtTo_midIndxs_StrList" "FILE" /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1585:IF flag_debug THEN write 'loading str_splitLftRgtTo_midIndxs_StrList' ; ENDIF ; /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1587:#] str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str - split str by paired [left, right]-end-marks /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1590:# 19Oct2020 initial, based on str_splitLftRgtTo_midIndxs_StrList /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1597: str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1604: THEN fault '?str_splitLftRgtTo_midIndxs_StrList error : OR[i_heads, i_tails] is null' /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1611: str_splitLftRgtTo_Indxs_StrList IS str_splitLftRgtTo_midIndxs_StrList /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:151: midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight line ; /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:332: THEN indices lineList := str_splitLftRgtTo_midIndxs_StrList 'mailto:' '">' line ; /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:862: indicesMidls strList := str_splitLftRgtTo_midIndxs_StrList '[#=; ' ' ;=#]' line ; $ find "$d_Qndfs" -maxdepth 1 -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "str_splitLftRgtTo_Indxs_StrList" "FILE" /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1611: str_splitLftRgtTo_Indxs_StrList IS str_splitLftRgtTo_midIndxs_StrList OK - easy to change,then test 24************************24 #] 2Oct2020 create filename-only sorted lists for p_[all, html]FileList I already did this for my symbols system see link d_QNial_mine 'Website header.ndf' webSite_sortCullGradeupOn1st_allPathsAndFnames IS { LOCAL fnameList ; NONLOCAL d_webRaw allFilesList allFilesGradeupList fnameGradeupList p_allFileList ; host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*" | grep --invert-match "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations|i9018xtp.default/extensions/" | sort -u >"' p_allFileList '" ' ; % ; allFilesList := strList_readFrom_path p_allFileList ; fnameList := (`/ EACHRIGHT (1 + last findall) allFilesList) EACHBOTH drop allFilesList ; fnameGradeupList allFilesGradeupList := lists_sortupCullOn1st (fnameList allFilesList) ; } Seems to work fine 24************************24 #] 28Oct2020 webPage_convertBodyLinks see link d_Qtest 'file_ops- test.ndf' I did my first full-file test. 05-----05 for link d_Qtest 'test- Howell - corona virus webPage_convertBodyLinks.html' : line 325-328, with multiple Howell - Pandemics and the sun Howell - Selected pandemics & epidemics.pdf Hoyte & Schatten year - solar influence on climate & natural systems, graphs.pdf Tapping, Mathias, Surkan - Pandemics & solar activity Only a few of '> same problems of [missing, incomplete] subDir OK, Change : +.....+ lineList#indices := EACH path_retrieve_subDirFname fnameSubDir ; +.....+ To : +.....+ lineList#indices := EACH path_retrieve_subDirFname fnameSubDir webSite ; +.....+ The test were not processed to replace %20 with space - check current files in dwebRaw Not working for directory links.. none of the links has [#=; backtrack ;=#] ??? 
fileops.ndf : 05-----05 path_retrieve_subDirFname IS OP path dirBase { LOCAL fname fPath subDirFname ; NONLOCAL webSiteAllPathList ; IF (chr_in_str `# path) THEN path ELSEIF (= 'http' ( 4 take path)) THEN path ELSE fname := path_extract_fname path ; fPath := first ((fname EACHRIGHT subStr_in_str webSiteAllPathList) sublist webSiteAllPathList) ; % write 'fPath = ' fPath ; IF (isfault fPath) THEN path ELSE subDirFname := str_extractPast_strFront fPath dirBase ; link '[#=; backtrack ;=#]' subDirFname ENDIF ENDIF } 05-----05 >> [#=; backtrack ;=#] should be there! >> so the > not because of `[ as that isn't in the relevant code >> I didn't see any cases due to `# webPage_convertBodyLinks_test : webPage_convertBodyLinks p_inn d_Qtest ; % output goes to p_temp_webPage_convertEncoding ; >> is d_Qtest the problem? >> try d_webRaw >> still doesn't work It seems like this isn't working - fails and simply writes the line : IF (subStr_in_str ' View -> Directory listing filters -> check ONLY "Temporary & backup files" for local filters -> Edit filter rules -> "Temporary & backup files" : Filename ends with : [~, .bak.] Filename contains : [References, z_Archive, z_Old, z_References] check ALL : Conditions are case sensitive, Filter applies to : Files, Directories Click OK to retain changes Remove remaining transfer queue : Menu -> Edit -> Clear private data -> check Clear transfer queue box click in transfer [queued files, failed transfers, successful transfers] windows and [clear, delete] lists after all done Re-instate transfer only newer files : Menu -> Edit -> settings -> Transfers -> File exists action : Downloads -> Overwrite file if source file newer Uploads -> Overwrite file if source file newer +---+ >> OK, now to test Holidays - neural networks and genomics.html massive link screwup (missing ) http://www.billhowell.ca/Projects%20-%20mini/Puetz%20&%20%20Borchardt/Howell%20-%20comments%20on%20Puetz%20UWS,%20the%20greatest%20of%20cycles,%20human%20implications.odt http://www.billhowell.ca/Software%20programming%20&%20code/Qnial/ >> should link to web-page! Mostly the site looks really good! Leave the remaining correctipns for later... 24************************24 #] 27Oct2020 after [restructure, rename]ing of d_web[Raw, Site] : webSite_list_htmlFiles qnial> bye qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update >> all seemed to work submenus : Home n/a Neural Nets none work (blue font in manu) Projects most work still including COVID-19, but NOT [MindCode, Lucas, Puetz, Randall] Software programming & code none work Professional & Resume Resume works, still not Education Publications & reports n/a Howell-produced videos none work (blue font in manu) Blogs all work Cool stuff n/a - just lists directory content Crazy themes and stories all work except still not deer Hosted sub-sites all work except Wickson Neil Howell's Art all work except Wickson Neural Nets none work (blue font in manu) I can't see why this won't work ??? Projects most work still including COVID-19, but NOT [MindCode, Lucas, Puetz, Randall] fix filenames of [MindCode, Puetz, Randell Mills] don't know why Lucas doesn't work removed COVID-19 - need a submenu for pandemics (later) Software programming & code none work might have been weird special character? Howell-produced videos none work (blue font in manu) I can't see why this won't work ??? Hosted sub-sites all work except Wickson I can't see why this won't work ??? 
qnial> webSite_convert ; webSite_update submenus : Home n/a Neural Nets same - none work (blue font in manu) Projects many work still, but NOT [MindCode, Lucas, Puetz, Randall, Icebreaker] Software programming & code none work still Professional & Resume Resume works, still not Education Publications & reports n/a Howell-produced videos none work (blue font in manu) Blogs all work Cool stuff n/a - just lists directory content Crazy themes and stories all work except still not deer Hosted sub-sites all work except Wickson Neil Howell's Art all work except Wickson >> Seems to be a problem with directories that are NOT directly under the menu host? Hosted sub-sites all work except Wickson Neil Howell's Art all work except Wickson added space after 'Steven' (doesn't make sense, try anyways) Neural Nets same - none work (blue font in manu) Howell-produced videos none work (blue font in manu) no idea of what the problem is maybe add space before \n>? Projects many work still, but NOT [MindCode, Lucas, Puetz, Randall, Icebreaker] I give up for now. Just refresh, and take a big break to do income taxes. qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update Software programming & code -> removed weird character in d_webSite >> NUTS!! It did work. The menue selections were blue because I hadn't tried them yet! >> Now all work. Neural Nets >> All work except [Neural Nets, MindCode] Status submenus : Home n/a Neural Nets All work except [Neural Nets, MindCode] Projects most work still, but NOT [MindCode, Puetz, Randall] Software programming & code all work Professional & Resume all work Publications & reports n/a Howell-produced videos all work Blogs all work Cool stuff n/a - just lists directory content Crazy themes and stories all work Hosted sub-sites all work Neil Howell's Art all work MindCode - change filename 10 Howell - MindCode Manifesto.odt Howell - comments on Puetz UWS, the greatest of cycles, human implications.odt Howell - review of Holverstott 2016 Randell Mills hydrino energy.pdf qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update >> Several errors : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?webPage_convert file unknown error, OR [d_htmlBackup, webPage] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?webPage_convert file unknown error, OR [d_htmlBackup, webPage] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html ?webPage_convert file unknown error, OR [d_htmlBackup, webPage] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : >> /media/bill/SWAPPER/Website - raw/webWork files/4_test Kyoto Premise - the scientists arent wearing any clothes (copy).html ?pinn_writeExecute_pouter file unknown error, OR 
[pinn, pouter] : >> This was deleted, removed from '2_website p_webPageList.txt' 05-----05 OK - I should now remove write each file for [webSite_convert, webSite_update] so it's much easier to see the errors!!! Both are working very well now. qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update >>> loading start : file_ops.ndf <<< loading ended : file_ops.ndf >>> loading start : Website updates.ndf >>>>>> loading start : Website header.ndf <<<<<< loading ended : Website header.ndf <<< loading ended : Website updates.ndf ?webPage_convert file unknown error, OR [d_htmlBackup, webPage] : /media/bill/SWAPPER/Website - raw/z_Archive/201027 13h47m09s backups/ /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?webPage_convert file unknown error, OR [d_htmlBackup, webPage] : /media/bill/SWAPPER/Website - raw/z_Archive/201027 13h47m09s backups/ /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?webPage_convert file unknown error, OR [d_htmlBackup, webPage] : /media/bill/SWAPPER/Website - raw/z_Archive/201027 13h47m09s backups/ /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : /media/bill/SWAPPER/Website - raw/webWork files/fin footer.html /media/bill/ramdisk/stdTmp.txt ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html /media/bill/HOWELL_BASE/Website/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and De code Base64 Files, instructions.html /media/bill/HOWELL_BASE/Website/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html ?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] : /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html /media/bill/HOWELL_BASE/Website/Software programming & code/Qnial/QNial - Howells web-page.html >> This is much more useful. 
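The '?webPage_convert ... file unknown error' messages above come from list entries whose files have since been [deleted, renamed] - they were removed from '2_website p_webPageList.txt' by hand. A minimal bash sketch for pruning such stale entries automatically (the list path here is a hypothetical stand-in) :
+.....+
#!/usr/bin/env bash
# hedged sketch - drop list entries whose files no longer exist; list path is hypothetical
p_webPageList='/tmp/webWork/2_website p_webPageList.txt'
while IFS= read -r p ; do
    [ -f "$p" ] && printf '%s\n' "$p"     # keep only paths that still exist on disk
done <"$p_webPageList" >"$p_webPageList.tmp"
mv "$p_webPageList.tmp" "$p_webPageList"
+.....+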
Status submenus : Home n/a Neural Nets All work except [Neural Nets] Projects most work still, but NOT [Puetz, Randall] Software programming & code all work Professional & Resume all work Publications & reports n/a Howell-produced videos all work Blogs all work Cool stuff n/a - just lists directory content Crazy themes and stories all work Hosted sub-sites all work Neil Howell's Art all work 24************************24 #] 27Oct2020 Fix backups in fileops.ndf, pinn_writeExecute_pout Doesn't work for first level of webSite directory : backtrack -> should change insertion for depther = [0,-1] >>> continue := l ; depther := (gage shape (array_findAll_subArray `/ subDir)) - 1 ; IF (0 < depther) THEN backtrack := link (depther reshape (solitary '../')) ; ELSEIF (0 = depther) THEN backtrack := '' ; ELSEIF (-1 = depther) THEN backtrack := '' ; ELSE write '?pinn_writeExecute_pout error : depther out of range : ' depther ; continue := o ; ENDIF ; IF continue THEN <<< Now to do : qnial> webSite_convert >> Seems OK qnial> webSite_update >> Seems OK 05-----05 Problematic menus : Menu.html works for all except [Home, Hosted Web-pages, Neural nets] Menu blogs.html works for all Menu crazy themes and stories.html doesn't work for [Deer collision, ] Menu earth, sun, astro, history.html ?? not implemented in main menu (Home) Menu hosted subsites.html only works for Neil Howell, none of top menu work Menu Howell videos.html none of [top, video] menus work Menu Lies, Damned Lies, and Scientists.html Menu neural nets.html Menu professional and resume.html Menu projects.html Menu software programming.html 05-----05 Menu.html in sub-pages : Neural Nets none work (blue font in manu) Projects all work Software programming & code all work Professional & Resume none work (blue font in manu) Publications & reports all work Howell-produced videos none work (blue font in manu) Blogs all work Cool stuff n/a - just lists directory content Crazy themes and stories all work Hosted sub-sites all work Neil Howell's Art none work (blue font in manu) From now on, just assume blue font means broken link... as a first guess. Also - footer images [GNU, Creative Commons] don't work if top menu doesn't. 05-----05 pinn_writeExecute_pout - I probably had it right the first time... depther := (gage shape (array_findAll_subArray `/ subDir)) ; 24************************24 #] 26Oct2020 path_insertIn_fHand [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout I seem to remember removing an "inner path_insertIn_fHand" at sometime to help with debugging? In any case, I need that now! [#!: pinn_writeExecute_pout path d_inn d_out ; path_insertIn_fHand d_out fout ; 05-----05 Added to 'webPage_convertEncoding IS OP webPage' : sed_insertFix2 := link ';s|\[#!: path_insertIn_fHand (link d_webWork \(.*\)) fout ' '|[#!: pinn_writeExecute_pout (link d_webWork \1) stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; |' ; 05-----05 qnial> webPage_convertEncoding (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') >> At end of file : [#!: pinn_writeExecute_pout (link d_webWork 'fin Footer.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; qnial> webPage_convert o (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') >> looks good... 
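For reference, the depther / backtrack logic in pinn_writeExecute_pout (27Oct2020 above) boils down to counting the `/ separators in subDir and prepending that many '../'. A hedged bash sketch of the same arithmetic, using the final "- 1" offset that these notes settled on after trial and error (subDir here is only an illustrative example) :
+.....+
#!/usr/bin/env bash
# hedged sketch of the depther / backtrack calculation; subDir is an illustrative example
subDir='Pandemics, health, and the Sun/corona virus/'
depther=$(( $(printf '%s' "$subDir" | tr -cd '/' | wc -c) - 1 ))    # count the slashes, then the "- 1" offset
backtrack=''
if [ "$depther" -gt 0 ] ; then
    for (( i = 0 ; i < depther ; i++ )) ; do backtrack+='../' ; done
fi
echo "depther = $depther, backtrack = '$backtrack'"                 # here : depther = 1, backtrack = '../'
+.....+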
qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') 05-----05 OK - convert the example to change the original file : qnial> webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') STUPID!!! I change links to webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') >> OK, everything now looks fine EXCEPT backtracks in the menus, which were NOT converted. Menus- I have the WRONG symbols in all but [Menu, Menu blogs] [#!; -> change to [#!; [#!: -> writeExecute [#=; -> menuHeadFoots 05-----05 Do whole website (should test more, but I'm gettign sick of this). qnial> webSite_convert >> Seemed to go well qnial> webSite_update >> Oops - menus etc , I removed : # old code % first update (execute embedded) menuHeadFoots, as they are used by the webPages ; % MHFs are saved in d_webWork or other directory of d_webRaw ; menuHeadFootList := list_readFrm_path p_menuHeadFootList ; FOR MHF WITH menuHeadFootList DO write MHF ; webPage_update MHF ; ENDFOR ; 05-----05 Problematic menus : Menu.html works for all except [Home, Hosted Web-pages, Neural nets] Menu blogs.html works for all Menu crazy themes and stories.html doesn't work for [Deer collision, ] Menu earth, sun, astro, history.html ?? not implemented in main menu (Home) Menu hosted subsites.html only works for Neil Howell, none of top menu work Menu Howell videos.html none of [top, video] menus work Menu Lies, Damned Lies, and Scientists.html Menu neural nets.html Menu professional and resume.html Menu projects.html Menu software programming.html >> OK, this doesn't work. Now to find the [problem, solution] I have one too many backtrack, so take out the "+ 1" that I added earlier today! Change : +.....+ depther := (gage shape (array_findAll_subArray `/ subDir)) + 1 ; +.....+ To : +.....+ depther := gage shape (array_findAll_subArray `/ subDir) ; +.....+ 05-----05 Re-do whole webSite qnial> webSite_convert >> Seemed to go well qnial> webSite_update >> YIKES, none of the menues work!! It may not be useful to keep running webSite_convert, unless problems pop up. By now all targeted files have been converted! Problematic menus : I looks like links come up too far (missing a .../ !?? which I just took out!? Change back to : depther := (gage shape (array_findAll_subArray `/ subDir)) + 1 ; lq_fileops qnial> webSite_update file:///media/bill/HOWELL_BASE/Website/page%20projects.html Neural Nets menu item : file:///media/bill/Neural%20nets/Neural%20Networks.html Now I'm up two levels, so I need to fix. Check d_webRaw first >> Idiot. No menus there! Of course. Try this, though it should crash in d_webRoot? depther := (gage shape (array_findAll_subArray `/ subDir)) - 1 ; >> Seems to work well?!!! 24************************24 #] 25Oct2020 backtracks were not executed. Why, all of a sudden, don't they work? fileops.ndf flag_break : pinn_writeExecute_pout IS OP pinn d_inn d_out >> nuts : 1. subDir wasn't added! It is needed to go down from [d_webRaw, d_webSite] 2. menuHeadFoots must be executed as well! 
(I stupidly removed that code - but for conversions) 24************************24 #] 25Oct2020 Corrections cycle in d_webRaw # Fix previous conversions of web-[pages, site] # 25Oct2020 - It's much easier just to : 1. dirBackup_restoreTo_paths webpages from an earlier date 2. add corrections to webPage_convertEncoding 3. webSite_convert 05-----05 #] 25Oct2020 19:45 1. qnial> dirBackup_restoreTo_paths (link d_webRaw 'z_Archive/201025 19h11m26s backups/') p_webPageList cp: cannot stat '/media/bill/SWAPPER/Website - raw/z_Archive/201025 19h11m26s backups/201022 18h08m34s Howell - influenza virus.html': No such file or directory >> I deleted that file (again? - 3rd or 4th time) 2. add corrections to webPage_convertEncoding - sed_footLevels added [F,f] & capitalized Footer : sed_footLevels := ';s|fin [F,f]ooter[1-9]\.html|fin Footer\.html|' ; 3. webSite_convert >> seems OK? 05-----05 WebSite update - one-way flow of html files, so backups are not an issue 1. webSite_update >> None of menus work!!? - shouldn't have added 1? >> Ah Hah! - backtracks were not executed. Why all of a sudden they don't work? 24************************24 25Oct2020 05-----05 webPage_update IS OP webPage qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html') >> It didn't work!! 05-----05 Now for the "Big Test" : qnial> webSite_update I need to update [menu, header, footer]s FIRST, then the others! eg split p_htmlFileList into [[menu, header, footer]s, regular webPages] CRAP! ALL html webPages have been destroyed! huge work to put back BUT - they look correct!!!? 05-----05 All the way back to : # dirBackup_restoreTo_paths (link d_webRaw 'z_Archive/201025 18h31m43sbackups/') p_webPageList >> seemed OK? qnial> webSite_convert ?path_backupTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/201022 18h08m34s Howell - influenza virus.html /media/bill/SWAPPER/Website - raw/z_Archive/201025 19h11m26s backups/ ?webPage_convertEncoding file unknown error, webPage : /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/201022 18h08m34s Howell - influenza virus.html [#!: path_insertIn_fHand (link d_webWork 'fin footer.html') fout Binary file (standard input) matches >> NUTS! - back to the same old shit! I'm going around in circles 24************************24 24Oct2020 05-----05 webSite_update d_webRaw Crap - I didn't fix up [webPage, webSite] update operators! Time to go to bed! 05-----05 So now try : webSite_convert d_webRaw >> didn't work at all?? forgot flag_overwrite webPage dirBackup webSite_convert d_webRaw >> all webPages returned ?webPage_convertEncoding file unknown error, webPage : /media/bill/SWAPPER/Website - raw/ A few corrections of cockups and it ran well. Now to check a random selection of updated webPages in d_webRaw 05-----05 webPage_convertEncoding - outputs message at end of output file : Binary file (standard input) matches Why?? - maybe missing apos at [start, end] of sed? I added them in >> didn't help, sed didn't seem to run? >> I was already using quote at [start, end] - which is good, so that wasn't the problem By now, the message is in the original file, so remove it and see what happens. >> OK, it's no longer there! So I'm ready to do entire website? Scary... 
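Before a run over the entire webSite, the next note makes an rsync backup first; a minimal sketch of that step (the destination directory is a hypothetical stand-in, and the excludes simply mirror the usual [z_Archive, z_Old] filters) :
+.....+
#!/usr/bin/env bash
# hedged sketch of a pre-run rsync backup; destination directory is a hypothetical stand-in
d_webRaw='/media/bill/SWAPPER/Website - raw/'
d_backup="/tmp/webRaw rsync backups/$(date +'%y%m%d %Hh%Mm%Ss')/"
mkdir -p "$d_backup"
rsync -av --exclude 'z_Archive/' --exclude 'z_Old/' "$d_webRaw" "$d_backup"
+.....+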
I need a backup did rsync backup - looks good 05-----05 Test again webPage_convert # webPage_convert o (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_Qndfs 'z_Archive/201024 backups/') >> OK # /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html \.\./ 3; :&file-insert &: 5; \[#=; backtrack ;=#\] 0; # webPage_convert o (link d_webRaw 'Climate and sun/_Climate and sun.html') (link d_Qndfs 'z_Archive/201024 backups/') >> mailtos not fixed, file notgenerated OK, now it's working well - BUT : 1. get strange "Binary file inpute" message at end of new file - ?? I don't know what the issue is?? 2. have [Menu1, fin footer1, etc] - I added [sed_menuLevels sed_footLevels] Try again : # webPage_convert o (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_Qndfs 'z_Archive/201024 backups/') >> OK for [Menus, foot] but still get error at end of output : Binary file (standard input) matches Try a save, and test all [links, mailtos] : # webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_webRaw 'z_Archive/201024 backups/') >> Oops, sometimes "fin [F,f]ooter..." -> BUT, I only want to change fin Footer!! as the others aren't indexed A few links to my website are broken - fix later stage >> For some reson, the "junk message" doesn't show? (I don't get it) >> file overwritten - Yes, as it should >> path_backupTo_dir webPage dirBackup ; did NOT work!?!? flag_break := l ?.. host link 'mv "' P_temp_webpage_fixmailtos '" ' " Webpage '" ' -->[nextv] webPage /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html -->[nextv] ?invalid host command man mv >> specify directory ONLY! nyet - I had swith apo & quote NUTS!! - I specified the WRONG d_bazckup!!! # webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_webRaw 'z_Archive/201024 backups/') >> OK - now it goes to the right directory >> still have issue with : Binary file (standard input) matches 05-----05 Testing 'webPage_convert' # webPage_convert o (link d_Qtest 'test- HELP.html') (link d_webRaw 'z_Archive/201024 backups/') 05-----05 Test pathListFile_findCountsBY_strList # list_readFrm_path p_htmlFileList # pathListFile_findCountsBY_strList p_htmlFileList ('"mailto' '../' ':&file-insert &:' '[#=; backtrack ;=#]') >> OK, works, but I doubt the "backtrack" counts per file # pListFiles_findCountsBY_strList p_htmlFileList ('"mailto' '\.\./' ':&file-insert &:' '\[#=; backtrack ;=#\]') p_findCountsBY_strList # good example of a file with embeddeds : /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html \.\./ 2; :&file-insert &: 5; \[#=; backtrack ;=#\] 0; 05-----05 # p_temp := link d_temp 'update_encoding temp.txt' # example with ' ' # generate_menus_levels dw_base # cmd := link find "' strOld '" -maxdepth 3 -name "' pname '" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '&file-insert &:' "FILE" | sed 's/:&file-insert &:*/:&menu-insert &:/' | sort -u # code thoughts ELSEIF (in_string ':&title-insert &:' line) THEN % insert the web-page title construct ; IF (~= null (line := strings_between '"' '"' line)) THEN line := execute line ; write line ; insertInPath_fHand line fout ; ENDIF ; # "$d_Qndfs"'webSite/0_webSite QNial notes.txt' # enddoc