#]
#] *********************
#] "$d_bin"'webSite update notes.txt' - LOCAL d_web update, notes
# www.BillHowell.ca 01Sep2023 initial, previous notes in QNial, etc ("see also" below Setup)
# view in text editor, using constant-width font (eg courier), tabWidth = 3
#] +-----+
#] see also :
#]    "$d_bin"'webSite update [old ToDo, templt, geany regexpr]s.txt'
#]    "$d_bin"'webSite maintenance specific files.link.txt'

#48************************************************48
#24************************24
# Table of Contents, generate with :
# $ grep "^#]" "$d_bin"'webSite update notes.txt' | sed "s/^#\]/ /" >"$d_bin"'webSite update TbleOfContents.txt'
#
#24************************24
#] +-----+
#] ToDos active :
14Sep2023 pOvrClassL_get_pClassL pOvrClassL pHtmlClassAll_L - generate a pList of all classes
16Sep2023 must also trap [no-tab, empty] strP in pHum_sed_pCde?
   ...later... don't think it affects the current situation (no empty pCde),
   but may make code more bullet-proof in general
17Sep2023 I need a script to better handle dirChanges across processes
   Already done? : 'lftp update www-BillHowell-ca.sh'
20Sep2023 format 'status & updates' MenuTops [All_, TrNN]
   I need to split [projmajor, projmini, pandemics, etc] - by color?
20Sep2023 MenuTop [copyright, help] are incomplete, fix classes
23Sep2023 probably deleted, must get from backup :
   /home/bill/web/eir3.gif
   /home/bill/web/../eir_subscribe_button.gif
   /home/bill/web/eirtoc/2000/eirtoc_2742.html
23Sep2023 /0_test/fileops/povrL_pStrP_replace/fileops dif.txt, OK but error msg - later
26Sep2023 webSite_getCheck_links() : further split Hrf into [html, txt, doc, ods, etc]
27Sep2023 steps of pWebPageL_pStrP_replaceGetBad() - checks should be logged by func!!
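The Table of Contents recipe above can be smoke-tested on a throw-away file (hypothetical /tmp path; the real command targets 'webSite update notes.txt') :

```shell
# hypothetical /tmp stand-in for the notes file
pNotes='/tmp/tocDemo.txt'
printf '#] header one\nbody text\n#] header two\n' >"$pNotes"
# keep only the "#]" header lines, replace the marker with indentation :
grep "^#]" "$pNotes" | sed "s/^#\]/   /"
```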
#] +-----+
#24************************24

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] ??Sep2023

#08********08
#] 30Sep2023 fix webSite_get_links_run() in fileops.sh
+-----+
olde code
   if [ -f "$pMixTmp901" ];then
      rm "$pMixTmp901"
   fi
   nItr=0
   bolLnk=1
   while [ 1 -eq "$bolLnk" ]; do
      nItr=$(( nItr + 1 ))
#      echo "$linMix901" >>"$pMixTmp901"
#      diff "$pMixTmp901" "$pLnkTmp901" --suppress-common-lines | grep ">" | sed 's/> //' >"$pDifTmp901"
#      grep "$sedExpr901" "$pDifTmp901" >"$pLnkWWk901"
#      cp "$pLnkWWk901" "$pBadTmp901"
#      rm "$pMixTmp901"
#      if [ -s "$pLnkWWk901" ]; then
#         bolLnk=1
#      else
         bolLnk=0
#      fi
#      if ! [ 6 -le "$nItr" ]; then
#         break
#      fi
   done

#08********08
#] 27Sep2023 pHtmlPathAll_L upload onLine - online webPages, this MUST be run ONLY from :
# "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
It worked GREAT!!
Many links don't work for captioned-images, but it's a start.
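A hedged sketch (hypothetical /tmp path, not the real 901 temp files) of how the loop above could be bounded: keep iterating while the work file is non-empty, with a hard cap of 6 passes :

```shell
# hypothetical /tmp stand-in for pLnkWWk901
pLnkWWk='/tmp/linkDemo_pLnkWWk.txt'
printf 'a\nb\n' >"$pLnkWWk"
nItr=0
# loop while the work file is non-empty ([ -s ]), at most 6 passes
while [ -s "$pLnkWWk" ] && [ "$nItr" -lt 6 ]; do
   nItr=$(( nItr + 1 ))
   # real code would extract the next level of links here ;
   # the demo just drains one line per pass :
   sed -i '1d' "$pLnkWWk"
done
echo "nItr = $nItr"
```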
+-----+
olde code useless - done by dWebPage_update_dWebOnly in "$d_bin_secure" :
# update_dWebHtmlOnly(no args) - rm html in dWebOnly to make sure it's "clean", cp current
# otherwise old html will be re-sent to webSiteOnline
# 17Sep2023 initial
update_dWebHtmlOnly() {
   date_ymdhms=$(date +"%0y%0m%0d %0kh%0Mm%0Ss")
   echo >>"$p_log" "$date_ymdhms update_dWebHtmlOnly"
   d_webOnly="$d_PROJECTS"'webHtmlOnly'
   pWebPageL="$d_webWork"'pHtmlPathAll_L.txt'
   find "$d_webOnly" -type f -print0 | xargs -0 rm -f
   # transfer only webPages from d_web to d_webOnly
   while IFS='' read -u 9 pWebPage; do
      pWebOnly=$( echo "$pWebPage" | sed 's|/home/bill/web/|/home/bill/PROJECTS/webHtmlOnly/|' )
#      echo "$pWebOnly"
      cp -p "$pWebPage" "$pWebOnly"
   done 9<"$pWebPageL"
}
# pHtmlPathExclL - not necessary, as pHtmlPathAll_L is a specified list of files
   pTmp="$d_temp"'dWeb_update_dOnline html temp.txt'

#08********08
#] 27Sep2023 update_dWebHtmlOnly(no args) - [edit, run]
update_dWebHtmlOnly(no args) - rm html in dWebOnly to make sure it's "clean", cp current
I added :
   # transfer only webPages from d_web to d_webOnly
   while IFS='' read -u 9 pWebPage; do
      pWebOnly=$( echo "$pWebPage" | sed 's|/home/bill/web/|/home/bill/PROJECTS/webHtmlOnly/|' )
#      echo "$pWebOnly"
      cp -p "$pWebPage" "$pWebOnly"
   done 9<"$pWebPageL"
19:16$ bash "$d_bin"'fileops run webSite.sh'
   cp: cannot create regular file '/home/bill/PROJECTS/webHtmlOnly/Neural nets/callerID-SNNs/callerID-SNNs.html': No such file or directory
   cp: cannot create regular file '/home/bill/PROJECTS/webHtmlOnly/Neural nets/callerID-SNNs/nomenclature.html': No such file or directory
>> oops, dir is lacking
19:17$ bash "$d_bin"'fileops run webSite.sh'
>> perfect?
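The "cannot create regular file" failures above happen because cp does not create missing parent dirs. A minimal sketch, with hypothetical /tmp stand-ins for [d_web, d_webOnly], of the mkdir -p step that prevents them :

```shell
# hypothetical demo tree under /tmp, standing in for [d_web, webHtmlOnly]
pWebPage='/tmp/webDemo/web/Neural nets/demo.html'
mkdir -p "$( dirname "$pWebPage" )" && echo 'demo' >"$pWebPage"
pWebOnly=$( echo "$pWebPage" | sed 's|/tmp/webDemo/web/|/tmp/webDemo/webHtmlOnly/|' )
mkdir -p "$( dirname "$pWebOnly" )"   # the step that was missing
cp -p "$pWebPage" "$pWebOnly"         # now succeeds - no "cannot create regular file"
```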
#08********08 #] 27Sep2023 iterate : new "$d_webWork"'pStrPAll_L change.txt' +-----+ webSite_check_internalLinks(no args) check if-valid internal links /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p030tbl01.02 complementary streams: What- [rapid, stable] learn invariant object categories, Where- [labile spatial, action] actions.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p035fig01.22 Presentation [normal, silence, noise replaced].png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p128fig04.04 reflectance changes at contours: fill-in color contours.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p392fig11.30 How multiple scales vote for multiple depths, scale-to-depth and depth-to-scale maps.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p428fig12.25 ARTSPEECH: auditory-articulatory feedback loop & imitative map, [auditory, motor] dimensionally consistent, motor theory of speech.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p505fig13.32 Behavioral contrast: [response suppression, antagonist rebound] both calibrated by shock levels.png /home/bill/web/Neural nets/TrNNs_ART/videoProdn/Grossberg's Consciousness: video script.html /home/bill/web/webWork/pMenuTopHelp TrNNs_ART.html >> not bad, most are missing /images- captioned/ Two need fixing : /home/bill/web/Neural nets/TrNNs_ART/videoProdn/Grossberg's Consciousness: video script.html /home/bill/web/webWork/pMenuTopHelp TrNNs_ART.html pWebPageL_str_extractPthLWithStr_pout '/videoProdn/Grossberg's Consciousness: video script.html' >> not easy to search this one /home/bill/web/Neural nets/TrNNs_ART/videoProdn/Grossbergs Consciousness: video script.html >> it was in pWebPageL_str_extractPthLWithStr_pout '/web/webWork/pMenuTopHelp TrNNs_ART.html' /home/bill/web/Neural nets/TrNNs_ART/webWork/bash script: put [caption, reference]s on [figure, table]s.html:27:copyrights /web/webWork/pMenuTopHelp TrNNs_ART.html +-----+ re-run pWebPageL_pStrP_replaceGetBad(no 
args) clean [pth, link], errorOutputs webSite_check_internalLinks(no args) check if-valid internal links /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p030tbl01.02 complementary streams: What- [rapid, stable] learn invariant object categories, Where- [labile spatial, action] actions.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p035fig01.22 Presentation [normal, silence, noise replaced].png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p128fig04.04 reflectance changes at contours: fill-in color contours.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p392fig11.30 How multiple scales vote for multiple depths, scale-to-depth and depth-to-scale maps.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p428fig12.25 ARTSPEECH: auditory-articulatory feedback loop & imitative map, [auditory, motor] dimensionally consistent, motor theory of speech.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p505fig13.32 Behavioral contrast: [response suppression, antagonist rebound] both calibrated by shock levels.png /home/bill/web/Neural nets/webWork/pMenuTopHelp TrNNs_ART.html >> OK - all remaining errors are due to known missing /images- captioned/ +-----+ olde code # str_strP_replace - standard function of mine #str_strP_replace 1 1 '' 'https://www.BillHowell.ca/webSites - other people/ /home/bill/web/webOther/' #str_strP_replace 1 1 'www.BillHowell.ca/webSites - other people/Neil Howell/_Neil Howell.html">' 'www.BillHowell.ca /home/bill/web/webOther/' # webSite_get_internalLinksWithFnamLinNum # (no args) extract internal links #08********08 #] 27Sep2023 pWebPageL_pStrP_replaceGetBad(no args) clean [pth, link], errorOutputs see - I wrote a script to capture checks in a logFile : "$d_webWork"'pWebPageL_pStrP_replaceGetBad_logCheck.txt' 14:02$ bash "$d_bin"'fileops run webSite.sh' 27Sep2023 improve pInn_archiveLocal_pDateMod, two-step process to have [fixed, longTerm] fnames 1. 
backup dated 'archive' in z_Archive (z_Archive will have a z_Archive!) 2. save current file to fixed named 'archive' in z_Archive >> OK works Make new "$d_webWork"'pStrPAll_L change.txt' from "$d_webWork"'pWebSiteLinkFailL.txt' except - have to actually re-generate /images- captioned/ : /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p030tbl01.02 complementary streams: What- [rapid, stable] learn invariant object categories, Where- [labile spatial, action] actions.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p035fig01.22 Presentation [normal, silence, noise replaced].png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p128fig04.04 reflectance changes at contours: fill-in color contours.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p392fig11.30 How multiple scales vote for multiple depths, scale-to-depth and depth-to-scale maps.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p428fig12.25 ARTSPEECH: auditory-articulatory feedback loop & imitative map, [auditory, motor] dimensionally consistent, motor theory of speech.png +-----+ Search for some via "$d_bin"'fileops run webSite.sh', manually fix each : pWebPageL_str_extractPthLWithStr_pout '\/home\/bill\/web\/???' /home/bill/web/ProjMajor/Sun pandemics, health/corona virus/Fauci covid emails/0_Howell - Fauci corona virus emails, text [rendition, analysis].html:339:
   • QNial programming language - Queen's University of Kingston, Ontario, Canada, "Q'Nial Nested Interactive Array Language" is my top preferred programming language for modestly complex to insane programming challenges, along with perhaps 3 other people in the world (if they are still alive).
pWebPageL_str_extractPthLWithStr_pout 'SP500 1928-2020 yahoo finance.dat'
   /home/bill/web/economics, markets/market data/SLregress/200912 semi-log/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html:157:
  • Yahoo finance data (23Feb2023 the text file has been lost, but the data is in the same spreadsheet as per the TradingView data). I was happy to have another "somewhat independent" data source, even if they are both from the same S&P or other source. This really helps as a check on my data treatment (see the section above "Comparison of [TradingView, Yahoo finance] data"). pWebPageL_str_extractPthLWithStr_pout '\/home\/bill\/web\/web\/Neural nets\/TrNNs_ART\/For whom the bell tolls\.html' /home/bill/web/Neural nets/TrNNs_ART/references- non-Grossberg.html:54:
  • Terrence J. Sejnowski 21Aug2023 "Large Language Models and the Reverse Turing Test", Neural Computation (2023) 35 (3): 309–342 (33 pages) https://direct.mit.edu/neco/issue (also copy in case original link fails) pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/Taylors concepts.html' /home/bill/web/Neural nets/TrNNs_ART/Introduction.html:127:
  • pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/bash script: thematic, list of txt files to search.txt' /home/bill/web/Neural nets/TrNNs_ART/[definitions, models] of consciousness.html:237:16Jul2023 I am currently lacking a coherent overall webPage for Grossberg's Consciousness. In the meantime refer to the very detailed listing of consciousness and other themes as a starting point to peruse for Grossberg's ideas. This webPage is a compilation of themes extracted from files listing [chapter, section, figure, table, comment]s.
   /home/bill/web/Neural nets/TrNNs_ART/Grossbergs [core, fun, strange] concepts.html:119:
   /home/bill/web/Neural nets/TrNNs_ART/videoProdn/Grossbergs Consciousness: video script.html:75:Viewers may list their own comments in files (one or more files from different people, for example), to include in Files listing [chapter, section, figure, table, selected Grossberg quotes, my comments]s. These files of lists are my basis for providing much more detailed information. While this is FAR LESS HELPFUL than the text of the book or its index alone, it can complement the book index, and it has the advantages that :
   /TrNNs_ART/webWork/bash script: thematic, list of txt files to search.txt
pWebPageL_str_extractPthLWithStr_pout '/IllusionOfTheYear/Peter Veto - The Phantom Wiggle.ogv'
   /home/bill/web/Neural nets/TrNNs_ART/IllusionOfTheYear/0_IllusionOfTheYear.html:133:
   /TrNNs_ART/IllusionOfTheYear/2021 Peter Veto - The Phantom Wiggle.ogv
pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/machine consciousness, the need notes.html'
   /home/bill/web/Neural nets/TrNNs_ART/machine consciousness, the need.html:160:Howell - the need for machine consciousness
    I changed this to : machine consciousness - see for example :
    Howell 30Dec2011, page 39 "Part VI - Far beyond current toolsets"
    /TrNNs_ART/videoProdn/Grossbergs Consciousness: video script.html change to : Rather than just watch this video, you can follow it by reading the script and following its links, once I write it... pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/copyright home.html' /home/bill/web/Neural nets/TrNNs_ART/webWork/bash script: put [caption, reference]s on [figure, table]s.html:26: /home/bill/web/Neural nets/TrNNs_ART/webWork/pMenuTopStatus TrNNs_ART.html:26: /TrNNs_ART/webWork/pMenuTopCopyright TrNNs_ART.html pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/Menu zhome, Grossberg.html' /home/bill/web/Neural nets/TrNNs_ART/webWork/bash script: put [caption, reference]s on [figure, table]s.html:21: /home/bill/web/Neural nets/TrNNs_ART/webWork/pMenuTopStatus TrNNs_ART.html:21: /TrNNs_ART/webWork/pMenuTopMenu TrNNs_ART.html pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/pMenuTopCopyright TrNN_ART.html' /home/bill/web/Neural nets/TrNNs_ART/webWork/pMenuTopCopyright TrNNs_ART.html:1: /home/bill/web/Neural nets/TrNNs_ART/webWork/pMenuTopCopyright TrNNs_ART.html:27: pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/status Grossbergs overview.html' /home/bill/web/Neural nets/TrNNs_ART/webWork/bash script: put [caption, reference]s on [figure, table]s.html:24: /TrNNs_ART/webWork/pMenuTopStatus TrNNs_ART.html To fix now : y /home/bill/web/??? 
y /home/bill/web/economics, markets/market data/SLregress/200912 semi-log/SP500 1928-2020 yahoo finance.dat y /home/bill/web/Neural nets/TrNNs_ART/bash script: thematic, list of txt files to search.txt y /home/bill/web/Neural nets/TrNNs_ART/IllusionOfTheYear/Peter Veto - The Phantom Wiggle.ogv y /home/bill/web/Neural nets/TrNNs_ART/machine consciousness, the need notes.html y /home/bill/web/Neural nets/TrNNs_ART/Taylors concepts.html y /home/bill/web/Neural nets/TrNNs_ART/videoProdn/Grossberg's Consciousness: video script.html y /home/bill/web/Neural nets/TrNNs_ART/webWork/copyright home.html y /home/bill/web/Neural nets/TrNNs_ART/webWork/Menu zhome, Grossberg.html y /home/bill/web/Neural nets/TrNNs_ART/webWork/pMenuTopCopyright TrNN_ART.html y /home/bill/web/Neural nets/TrNNs_ART/webWork/status Grossbergs overview.html y /home/bill/web/web/Neural nets/TrNNs_ART/For whom the bell tolls.html #pWebPageL_str_extractPthLWithStr_pout '\/home\/bill\/web\/???' #pWebPageL_str_extractPthLWithStr_pout 'SP500 1928-2020 yahoo finance.dat' #pWebPageL_str_extractPthLWithStr_pout '\/home\/bill\/web\/web\/Neural nets\/TrNNs_ART\/For whom the bell tolls\.html' #pWebPageL_str_extractPthLWithStr_pout '\/TrNNs_ART\/Taylors concepts\.html' #pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/bash script: thematic, list of txt files to search.txt' #pWebPageL_str_extractPthLWithStr_pout '/IllusionOfTheYear/Peter Veto - The Phantom Wiggle.ogv' #pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/machine consciousness, the need notes.html' #pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/copyright home.html' #pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/Menu zhome, Grossberg.html' #pWebPageL_str_extractPthLWithStr_pout '/TrNNs_ART/webWork/pMenuTopCopyright TrNN_ART.html' +-----+ olde code pWebPageL_pStrP_replaceGetBad() if [ -s "$pLog" ]; then pInn_archiveLocal_pDateMod "$pLog" fi echo >>"$pLog" '24************************24' echo >>"$pLog" 
'dWeb_replaceGetBad_pPthLinkClass(no args)' echo >>"$pLog" "$date_ymdhms" echo >>"$pLog" '' echo >>"$pLog579" '' #08********08 #] 26Sep2023 new "$d_webWork"'pStrPAll_L change.txt' fix [TrNN, Charvatova] dirs +-----+ problematic links /home/bill/web/!!linkError!!/national/nationalpost/ /home/bill/web/!!linkError!!/national/nationalpost/search/ /economics, markets/market data/SLregress/200912 semi-log/SP500 1928-2020 yahoo finance.dat I changed the fNam strange hyphen : /TrNNs_ART/IllusionOfTheYear/2021 2nd prize, Michael Cohen- The Changing Room Illusion.ogv 2021 2nd prize, Michael Cohen– The Changing Room Illusion.ogv /Sun Charvatova/Radioisotopes/Howell Aug08 - Charvatovas hypothesis & Isotopic solar proxies.pdf plus other Charvatova missing images!! : /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p030tbl01.02 complementary streams: What- [rapid, stable] learn invariant object categories, Where- [labile spatial, action] actions.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p035fig01.22 Presentation [normal, silence, noise replaced].png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p428fig12.25 ARTSPEECH: auditory-articulatory feedback loop & imitative map, [auditory, motor] dimensionally consistent, motor theory of speech.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p505fig13.32 Behavioral contrast: [response suppression, antagonist rebound] both calibrated by shock levels.png don't know why the lnk error: /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p128fig04.04 reflectance changes at contours: fill-in color contours.png /home/bill/web/Neural nets/TrNNs_ART/images- captioned/p392fig11.30 How multiple scales vote for multiple depths, scale-to-depth and depth-to-scale maps.png /TrNNs_ART/machine consciousness, the need notes.html I lost the pdf : /TrNNs_ART/Sejnowski 21Aug2022 Large Language Models and the Reverse Turing Test.pdf /TrNNs_ART/For whom the bell tolls.html +-----+ 27Sep2023 continued - finish edits 
of "$d_webWork"'pStrPAll_L change.txt'
OK - ready to do change

#08********08
#] 26Sep2023 pLnkBad.txt - fix these links (140), then pWebPageL_pStrP_replaceGetBad()
cp pLnkBad.txt to "$d_webWork"'pStrPAll_L change.txt'
check "$d_webWork"'pStrPAll_L change.txt' for non-tabbed lines :
17:54$ sed s'|\x9|xyZyx|' "$d_webWork"'pStrPAll_L change.txt' | grep --invert-match 'xyZyx'
>> It's fine!
18:11$ bash "$d_bin"'fileops run webSite.sh'
   /home/bill/web/bin/fileops.sh: line 1968: syntax error near unexpected token `<'
   /home/bill/web/bin/fileops.sh: line 1968: `search :
  • \n\t\t\t\t (.*)<\/a>' /home/bill/web/bin/fileops run webSite.sh: line 126: webSite_check_internalLinks: command not found webSite_check_internalLinks() - check if-valid internal links (to within webSite) : 18:14$ bash "$d_bin"'fileops run webSite.sh' >> 277 failed links, mostly [TrNN, Charvatova], plus : +-----+ /home/bill/web/??? /home/bill/web/Cool stuff/Prechter 1999 Landslide elections & Stock Prices.jpg /home/bill/web/economics, markets/market data/SLregress/200912 semi-log/SP500 1928-2020 yahoo finance.dat /home/bill/web/economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html /home/bill/web/!!linkError!!/national/nationalpost/ /home/bill/web/!!linkError!!/national/nationalpost/search/ /home/bill/web/ProjMajor/Electric Universe/Anderson - Electric scarring/Howell 120903 Paul Anderson's Electric scarring of the Earth.pdf /home/bill/web/web/Neural nets/TrNNs_ART/Sejnowski 21Aug2022 Large Language Models and the Reverse Turing Test.pdf /home/bill/web/webWork/help home.html /home/bill/web/webWork/webSite summary of [file, link] cart [types, counts].txt +-----+ +-----+ problematic links Howell's page Howell's page ./Dark matter video 1 - initial, simple.mpeg ./Dark matter video 1 - initial, simple.mpeg Hussar on the map of South-Central Alberta.jpg Hussar on the map of South-Central Alberta.jpg https://thefederalist.com/2020/04/18/10-deadliest-pandemics-in-history-were-much-worse-than-coronavirus-so-far/ https://thefederalist.com/2020/04/18/10-deadliest-pandemics-in-history-were-much-worse-than-coronavirus-so-far/ link link mailto:kozmoklimate@gmail.com mailto:kozmoklimate@gmail.com missing link missing link #08********08 #] 26Sep2023 webSite_getCheck_links() - revamp with diff 13:55$ bash "$d_bin"'fileops run webSite.sh' ~ >> It works beautifully!! 
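Aside on the bash tests that keep biting in these scripts (demo with a hypothetical /tmp file): [ -z ] checks whether a STRING is empty, so it is the wrong test for a file; [ -s ] checks whether a FILE exists and is non-empty :

```shell
# hypothetical /tmp path; variable $p holds a pathname, never an empty string
p='/tmp/zsDemo.txt'
: >"$p"                                     # create a zero-length file
[ -z "$p" ] || echo 'string is non-empty'   # always taken: $p holds a path
[ -s "$p" ] || echo 'file is empty'         # taken: the file has zero length
```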
'webSite_getCheck_links log.txt' :
   nTotHrfImg = nLnkHrf + nLnkImg | 3707 = 2698 + 1009
   nTotIntExtBad = nLnkInt + nLnkExt + nLnkBad | 3707 = 1289 + 2278 + 140
>> This will make it MUCH easier to [check, correct] links!!
at present, only ONE [HREF, IMG] link each is extracted per line so it will miss many links
26Sep2023 webSite_getCheck_links() : further split Hrf into [html, txt, doc, ods, etc]

#08********08
#] 24Sep2023 webSite_getCheck_links() - revamp
see "$d_bin"'fileops notes.txt'
+-----+
25Sep2023 revamped webSite_getCheck_links()
17:17$ bash "$d_bin"'fileops run webSite.sh'
   /home/bill/web/bin/fileops.sh: line 2242: [: f: binary operator expected
   xargs: IFILE: No such file or directory
   /home/bill/web/bin/fileops.sh: line 2253: [: f: binary operator expected
   grep: /media/bill/ramdisk/webSite_getCheck_links pMixTmp.txt: No such file or directory
   /home/bill/web/bin/fileops.sh: line 2277: [: -z: integer expression expected
   mv: cannot stat '/media/bill/ramdisk/webSite_getCheck_links pLnkTmp.txt': No such file or directory
   wc: '/home/bill/web/ WorkpLnkHrfTmp.txt': No such file or directory
change :
   if [ -z "$pAllTmp901" ]; then bolLnk=0
   else bolLnk=1
   fi
to :
   if [ -s "$pAllTmp901" ]; then bolLnk=1
   else bolLnk=0
   fi
change :
   if ! [ 6 -le -z "$nItr" ]; then
to :
   if ! [ 6 -le "$nItr" ]; then
change :
   cat "$pHtmPathAllL901" | tr \\m \\0 | xargs -0 IFILE grep ' # [\t]*
   • # old format with QNial - remove, deal with later :
# shouldn't be there - fix script
/economics, markets/[interest, exchange] rates/[interest, exchange] rate models.html
--> /home/bill/web/ProjMini/hydrogen/
--> /home/bill/web/home.html (internal links are being revamped, most don't work)
/home/bill/web/home.html (internal links are being revamped, most don't work)
problem with `& :
   /web/Sun Charvatova/ activity/ activity/ activity/
   /web/ProjMajor/Charvatova solar inertial motion
probably deleted, must get from backup :
   /home/bill/web/eir3.gif
   /home/bill/web/../eir_subscribe_button.gif
   /home/bill/web/eirtoc/2000/eirtoc_2742.html
fileops.sh - webSite_getCheck_internalLinks()
change :
   ;s||\1.html|Ig
to :
   ;s|
> not found - I probably deleted, must retrieve from backup
+-----+
olde code
# example notes :
# pHtmlPathExclL.txt -
# "230817 change pHtmlPathExclL" :
#    ProjMini/TrNNs_ART/captions html /Neural nets/TrNNs_ART/captions html/
# I also changed all dirs in pHtmlPathExclL to [start, end] with `/

#08********08
#] 20Sep2023 continue - check all menu links and pages they touch
20Sep2023 MenuTop [copyright, help] are incomplete
STOP! this is a manual waste of time
use webSite_[get, check]_internalLinks - this checks ALL internal links
+-----+
olde code
combine :
webSite_get_internalLinks() {
   pHtmlPathAllL901="$d_webWork"'pHtmlPathAll_L.txt'
   pWebSiteLinkL901="$d_webWork"'pWebSiteLinkL.txt'
   cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*|\1.html|Ig' | sed 's|
||g' | grep "^/home/bill/web/" | sed 's|\".*||' | sed 's|#.*||' | sed 's|^\/home\/bill\/web\/$||' | sed 's|^\/home\/bill\/web\/???$||' | grep --invert-match "^$" | sort -u >"$pWebSiteLinkL901"
}
# webSite_check_internalLinks() - check if-exist internal links (to within webSite)
# QNial version
#    urls_check IS OP linkType d_backup - create sublists of [internal, external] links
# >> I'm just checking if a path exists
# 14Sep2023 initial from QNial
webSite_check_internalLinks() {
   pWebSiteLinkL366="$d_webWork"'pWebSiteLinkL.txt'
   pWebLinkFailL366="$d_webWork"'pWebLinkFailL.txt'
   if [ -f "$pWebLinkFailL366" ]; then
      rm "$pWebLinkFailL366"
   fi
   while IFS='' read -u 366 pWebSiteLink366; do
      if ! [[ -f "$pWebSiteLink366" || -d "$pWebSiteLink366" ]]; then
         echo "$pWebSiteLink366" >>"$pWebLinkFailL366"
      fi
   done 366<"$pWebSiteLinkL366"
}

#08********08
#] 20Sep2023 format 'status & updates' MenuTops [All_, TrNN]
I need to split [projmajor, projmini, pandemics, etc]
... done for now, much detail to fix

#08********08
#] 18Sep2023 dir_updateWebPage_dWeb()
23:16$ bash "$d_PROJECTS"'bin - secure/lftp update entire webSite.sh'
   /home/bill/PROJECTS/bin - secure/lftp update entire webSite.sh: line 150: : No such file or directory
~ 23:28$
>> seems Ok
NUTS!! by mistake, I ran 'lftp update www-BillHowell-ca.sh'
20Sep2022 fix 'pMenuTopStatus.html'
>> done
fix 'pHtmlClassAll_L.txt' - today's changes

    Projects

callerID_SNNs change : projmini to : neural
TrNN_ART change : projmini to : TrNN_ART

#08********08
#] 18Sep2023 lftp adjustments for uploads, see :
"$d_SysMaint"'Linux/lftp notes.txt' - [upload, maintain] webSite, my main app for this!!
dir_updateNonHtml_dWeb()
18:12$ bash "$d_PROJECTS"'bin - secure/lftp update entire webSite.sh'
~ 23:04$
>> done

#08********08
#] 17Sep2023 dir_updateNonHtml_dWeb()
First FileZilla-transfer dirs that have moved (I had to first delete these in /Neural nets/) :
   /ProjMini/Transformer NNs/ -> /Neural nets/Transformer NNs/
   /ProjMini/TrNNs_ART/ -> /Neural nets/TrNNs_ART/
>> there are probably others...
"$d_bin"'fileops run webSite.sh' dir_updateNonWebPage_dWeb
moved : "$d_web"'NRCan reports/Social media/' (?)
to : "$d_web"'Neural nets/references/' (?)
moved most of : "$d_ProjMajor"'OPM/'
to : "$d_PROJECTS"'ProjMajor/OPM/'
>> my files retained, will be lftp'd
14:48$ find '/home/bill/web/Neural nets/References/' -type f -name "Howell*"
   /home/bill/web/Neural nets/References/Howell 110903 - Confabulation Theory, Plausible next sentence survey.doc
   /home/bill/web/Neural nets/References/Howell 111006 - Semantics beyond search.doc
   /home/bill/web/Neural nets/References/Howell 111117 - How to set up & use data mining with Social media.doc
   /home/bill/web/Neural nets/References/Howell 110902 - Systems design issues for social media.doc
   /home/bill/web/Neural nets/References/Howell 111230 - Social graphs, social sets, and social media.doc
I can't FileZilla rm '/billhowell.ca/NRCan reports/' because online files have non-ASCII hyphens from MicroSoft word. :
   /home/bill/web/Neural nets/References/Howell 111006 - Semantics beyond search.doc
   /home/bill/web/Neural nets/References/Howell 110902 - Systems design issues for social media.doc
   /home/bill/web/Neural nets/References/Howell 111230 - Social graphs, social sets, and social media.doc
>> I would have to do that with ssh?
>> tons of other files probably have similar problems!
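A hedged sketch (hypothetical /tmp dir) of how to list filenames containing non-ASCII bytes, eg the MicroSoft-Word hyphens that block the FileZilla rm above; assumes GNU grep built with -P (PCRE) support :

```shell
# hypothetical demo dir; needs GNU grep (-P for \x escapes)
d='/tmp/nonAsciiDemo'
mkdir -p "$d"
: >"$d/plain - name.txt"      # ordinary ASCII hyphen
: >"$d/bad – name.txt"        # en-dash : non-ASCII, the FileZilla blocker
find "$d" -type f | grep -P '[^\x00-\x7F]'    # lists only the en-dash file
```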
FileZilla used to change dirs 755 files 644 +-----+ olde code "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' # update entire webSite - this excludes html files # updateLast="$1" # LOCALDIR="$2" # REMOTEDIR="$3" # pExcludesNoHtml="$4" #dir_update_dWeb '230902 9h19m03' '/home/bill/web/ProjMini/TrNNs_ART/' '/billhowell.ca/ProjMini/TrNNs-ART/' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt' #dir_update_dWeb '230902 14h50m28' '/home/bill/web/' '/billhowell.ca/' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt' # test file #dir_update_dWeb '/home/bill/web/bin/0_test/lftp one file/' '/billhowell.ca/bin/0_test/lftp one file' "$d_PROJECTS"'bin - secure/lftp update www-BillHowell-ca excludes.txt' # dir_updateNonHtml_dWeb() updateLast LOCALDIR REMOTEDIR pExcludesNoHtml #webSite_get_internalLinksWithFnamLinNum "$d_web"'webWork/pHtmlOnlyL.txt' "$d_web"'webWork/pWebSiteLinkLWithFnamLinNum.txt' #webSite_get_internalLinks "$d_web"'webWork/pHtmlOnlyL.txt' "$d_web"'webWork/pWebSiteLinkL.txt' #webSite_get_internalLinksWithFnamLinNum "$d_web"'webWork/pHtmlOnlyL.txt' "$d_web"'webWork/pWebSiteLinkLWithFnamLinNum.txt' #webSite_get_internalLinks "$d_web"'webWork/pHtmlOnlyL.txt' "$d_web"'webWork/pWebSiteLinkL.txt' # $ sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$d_web"'home.html' >"$d_temp"'dir_updateHtml_dWeb sed test.txt' # >> works, so why don't dir_update[, Html]_dWeb work? # pHtmlOnlyL="$d_web"'webWork/pHtmlOnlyL.txt' # find - use above... # echo '/home/bill/web/home.html' | sed 's|\/home\/bill\/web\/|/home/bill/PROJECTS/webHtmlOnly/|' # +-----+ # problem of dir mix-up!!!!!!!!!!!!!??????????????????????????? 
timeUnixUpdate=$( date_ymdhms_to_timeUnix "$updateLast" ) echo >>"$pLog" "updateLast = $updateLast; timeUnixUpdate = $timeUnixUpdate" pTmp="$d_temp"'dir_updateHtml_dWeb html temp.txt' while IFS='' read -u 9 pHtmlOnly; do timeUnixPth=$( pinn_timeModTo_timeUnix "$pHtmlOnly" ) if [ "$timeUnixPth" -gt "$timeUnixUpdate" ]; then sed 's|\/home\/bill\/web\/|http://www.BillHowell.ca/|g' "$pHtmlOnly" >"$pTmp" # must mv to "$d_PROJECTS"'webHtmlOnly/' for lftp upload pWebHtmlOnly=$( echo "$pHtmlOnly" | sed 's|\/home\/bill\/web\/|/home/bill/PROJECTS/webHtmlOnly/|' ) mv "$pTmp" "$pWebHtmlOnly" echo >>"$pLog" " $pWebHtmlOnly" fi done 9<"$pHtmlOnlyL" # update Grossberg imageCaptions - #dir_updateCaptionHtml_dWeb '230902 14h50m28' #08********08 #] 17Sep2023 online webPages: pHtmlPathAll_L upload onLine - this MUST be run ONLY from : # "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' 17:31$ bash "$d_bin"'fileops run webSite.sh' many error msgs to terminal like (small sample) : +-----+ mv: cannot move '/media/bill/ramdisk/dir_updateHtml_dWeb html temp.txt' to "/home/bill/PROJECTS/webHtmlOnly/ProjMajor/Electric Universe/References/Randi Foundation 2008-2011 ANthony Peratt's model of univese.php_files/11x11progress.html": No such file or directory mv: cannot move '/media/bill/ramdisk/dir_updateHtml_dWeb html temp.txt' to '/home/bill/PROJECTS/webHtmlOnly/ProjMajor/Sun pandemics, health/corona virus/Fauci covid emails/0_Howell - Fauci corona virus emails [question, impression]s.html': No such file or directory +-----+ >> I copy-pasted to log file >> I should redirect bash errors to logFile, eg something like 2>&??? >> seem to be special dirs : Electric Universe Sun Charvatova Sun civilisations Sun climate Sun model, forecast Sun pandemics, health >> I didn't change their names!!! probably don't exist in "$d_PROJECTS"'webHtmlOnly/' /Electric Universe/References/Randi Foundation *.html - aren't webPages! I renamed the dirs above in d_webHtmlOnly but home.html doesn't work! 
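Re the "2>&???" note above - stderr can be appended to the logFile with 2>> , or folded into an existing stdout redirect with 2>&1 . Minimal sketch with a hypothetical /tmp log :

```shell
# hypothetical log path ; '/no/such/file' is deliberately missing
pLog='/tmp/stderrDemo.log'
: >"$pLog"
cp '/no/such/file' '/tmp/' 2>>"$pLog" || true   # cp's error line lands in the log
wc -l <"$pLog"      # → 1
```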
Same problems as before /Projects - major/ still 03Sep2023 on website /SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html was anything updated? >> yes d_web seems fine >> yes d_webHtmlOnly seems fine >> so NO uploads? >> funny - I had uncommented the upload part Rerun >> nuts, still lack of access I must also update dir_updateNonHtml_dWeb() as per "$d_bin"'fileops run webSite.sh' 17Sep2023 I need a script to better handle dirChanges across processes +-----+ olde code # 17Sep2023 ONLY rm old html, as dir_updateHtml_dWeb() changes links with copy-over!!! "$d_bin"'fileops run webSite.sh' update_dWebHtmlOnly() pTmp="$d_temp"'update_dWebHtmlOnly pTmp.txt' while IFS='' read -u 9 pHtmlPath; do pWebHtml=$( echo "$pHtmlPath" | sed 's|\/home\/bill\/web\/|\/home\/bill\/PROJECTS\/webHtmlOnly\/|' ) # echo "$pWebHtml" cp -p "$pHtmlPath" "$pWebHtml" done 9<"$d_webWork"'pHtmlPathAll_L.txt' #08********08 #] 17Sep2023 webSite upload online "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' dir_updateHtml_dWeb '230826 12h00m00' "$d_webWork"'pHtmlPathAll_L.txt' mv: cannot move '/media/bill/ramdisk/dir_updateHtml_dWeb html temp.txt' to '/home/bill/PROJECTS/webHtmlOnly/ProjMajor/Sun pandemics, health/corona virus/Howell - corona virus of countries, by region.html': No such file or directory ... tons of these >> all dirs have changed since a week or so ago... +-----+ +-----+ search "Linux rm files with a given extension" +-----+ How can I recursively delete all files of a specific extension in the current directory? Asked 9 years, 10 months ago Modified 2 years, 3 months ago Viewed 1.1m times find . -name "*.bak" -type f -print0 | xargs -0 /bin/rm -f edited Feb 8, 2015 at 16:13 muru answered Apr 4, 2014 at 19:10 lokers +-----+ try : # manually - update dirs in d_out="$d_PROJECTS"'webHtmlOnly/' # remove all files in d_out (sometimes more than *.html slip through! 
$ find "$d_PROJECTS"'webHtmlOnly/' -type f -print0 | xargs -0 rm -f # cp all of pHtmlPathAll_L.txt to appropriate dirs of d_out update_dWebHtmlOnly() { find "$d_PROJECTS"'webHtmlOnly/' -type f -print0 | xargs -0 rm -f pTmp="$d_temp"'update_dWebHtmlOnly pTmp.txt' while IFS='' read -u 9 pHtmlPath; do pWebHtml=$( echo "$pHtmlPath" | sed 's|\/home\/bill\/web\/|\/home\/bill\/PROJECTS\/webHtmlOnly\/|' ) echo "$pWebHtml" # cp -p "$pHtmlPath" "$pWebHtml" done 9<"$d_webWork"'pHtmlPathAll_L.txt' } update_dWebHtmlOnly +-----+ olde code no longer useful, remove : # povrL_pStrP_fixLinks (no args) - for webPages except confGuides # no args - [pHtmlPathAll_L, pStrPLinkL] set by povrL_pStrP_fixLinks #povrL_pStrP_fixLinks # povrL_pStrP_fixLinks() (no args) - fix bad internal links (to within webSite) # forces use of standard [povrL, pStrP for links], avoid cross-over with general pStrP stuff # tests: "$d_bin"'0_test/fileops/dWeb_change_badPths/0_dWeb_change_badPths notes.txt' # 05Sep2023 initial povrL_pStrP_fixLinks() { povrL616="$d_webWork"'pHtmlPathGenrL.txt' pStrP616="$d_webWork"'pStrP webSiteHtml.txt' pTemp616="$d_temp"'povrL_pStrP_fixLinks povr tmp.txt' pTmSt616="$d_temp"'povrL_pStrP_fixLinks pStrP tmp.txt' bolArXiv616=1 bolChrCd616=1 if [ -s "$povrL616" ]; then pinn_backupDatedTo_zArchive "$povrL616" while IFS='' read -u 9 povr616; do fnamer616=$( pth_get_fnam "$povr616" ) sed "s|\/home\/bill\/web\/\(.*\)fname|/home/bill/web/\1$fnamer616|" "$povr616" >"$pTmSt616" mv "$pTmSt616" "$povr616" povr_pStrP_replace "$bolArXiv616" "$bolChrCd616" "$povr616" "$pStrP616" done 9<"$povrL616" else echo 'povrL_pStrP_fixLinks - missing povrL' fi } "$d_bin""rsync directories.sh" ***** change : backer_htmlOnly() { dater=$(date +"%y%m%d %kh%Mm") d_src="$d_web" d_out="$d_PROJECTS"'webHtmlOnly/' becho "+---------------------------------------------------------+" becho "backer_rsync() - $dater rsync of htmlOnly $d_src to $d_out, " becho "" beval 'rsync '"$options"' --include="*.html" 
--exclude="*.*" "'"$d_src"'" "'"$d_out"'" >>"'"$p_log"'" ' becho "" # 17Sep2023 NON! use pHtmlPathAll_L exclusively!!! # rm dirs recursively : "z_Archive$\|z_archive$\|z_Old$\|z_old$" # also high-volume dirs, particularly : # /home/bill/PROJECTS/webHtmlOnly/System_maintenance/tex/ # /home/bill/PROJECTS/webHtmlOnly/ProjMajor/OPM/OPM/matlab_mrst/ # /home/bill/PROJECTS/webHtmlOnly/ProjMajor/OPM/OPM/Octave/ # /home/bill/PROJECTS/webHtmlOnly/References/Climategate emails/documents/briffa-treering-external/belfast/garrybog/pine/ beval 'find "$d_PROJECTS""webHtmlOnly/" -type d | grep "z_Archive$\|z_archive$\|z_Old$\|z_old$\|System_maintenance\/tex$\|ProjMajor\/OPM\/OPM\/matlab_mrst$\|ProjMajor\/OPM\/OPM/Octave$\|References\/Climategate $\|garrybog\/pine$" | tr \\n \\0 | xargs -0 -IFILE rm -r "FILE" ' # rm files that have no extension : beval 'find "$d_PROJECTS""webHtmlOnly/" -type f | grep "\/[A-Za-z0-9]*$" | tr \\n \\0 | xargs -0 -IFILE rm -r "FILE" ' # 02Sep2023 not yet ready! bash "$d_bin""du_diff.sh" "$d_src" "$d_out" becho "" dater=$(date +"%y%m%d %kh%Mm") becho "backer_rsync() - $dater end of operation $dater" becho "" } ***** to : ***** backer_webPageOnly() { dater=$(date +"%y%m%d %kh%Mm") d_src="$d_web" d_out="$d_PROJECTS"'webHtmlOnly/' becho "+---------------------------------------------------------+" becho "backer_rsync() - $dater rsync of htmlOnly $d_src to $d_out, " becho "" # see if a pLst can work?? 
beval 'rsync '"$options"' --include-from="'"$d_webWork"'pHtmlPathAll_L.txt'" "'"$d_src"'" "'"$d_out"'" >>"'"$p_log"'" ' becho "" becho "" dater=$(date +"%y%m%d %kh%Mm") becho "backer_rsync() - $dater end of operation $dater" becho "" } ***** #08********08 #] 17Sep2023 do it all: povrL_pStrP_replace pHtmlPathAll_L pStrPAll_L change "$d_bin"'fileops run webSite.sh' povrL_pStrP_replace 1 1 "$d_webWork"'pHtmlPathAll_L.txt' "$d_webWork"'pStrPAll_L change.txt' 14:32$ bash "$d_bin"'fileops run webSite.sh' /home/bill/web/Bill Howells book [note, review]s/Wilson 1977 Cosmic trigger, Howells review.html ... many more 14:36$ ran fine, pHtmlPathExclL.txt changes caused misses. Change back to : /Electric Universe/References/Randi Foundation 2008-2011 /References/Weart 2003 - The Discovery of Global Warming/ /Solar system/Cdn Solar Forecasting/ /Sudbury Neutrino Observatory (SNO)/ /Top 75 Immunotherapy startups_files/ In any case, a very [rare, quick] check of links gives some confidence. so proceed towards upload online. #08********08 #] 17Sep2023 I moved [Transformer NNs, TrNNs_ART] dirs into 'Neural Nets' Ouch, a lot of work, but better now than later, while I'm making changes and [process, software]s are fresh in my mind. Simply add dirChanges to "$d_webWork"'pStrPAll_L change.txt', and re-run test5 : /ProjMajor/Transformer NNs/ /Neural nets/Transformer NNs/ /ProjMajor/TrNNs_ART/ /Neural nets/TrNNs_ART/ super-easy 11:22$ bash "$d_bin"'fileops run webSite.sh' /home/bill/web/ProjMini/TrNNs_ART/Introduction.html /home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopMenu TrNNs_ART.html /home/bill/web/Bill Howells videos/Howell - videos.html /home/bill/web/ProjMajor/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html /home/bill/web/ProjMajor/Sun pandemics, health/_Pandemics, health, and the sun.html >> doesn't work - menus all bad now, try all webSite! 
update pHtmlPathAll_L via 'fileops run webSite.sh' dWeb_get_dWebPageL(no args) pHtmlClassAll_L has to be changed, but has to be done manually! update all menus +-----+ Steps below taken directly from "$d_bin"'fileops run webSite.sh' +--+ # Manually edit BEFORE dWeb_get_dWebPageL, iterate : # pHtmlPathExclL.txt - "230817 change pHtmlPathExclL" : ProjMini/TrNNs_ART/captions html /Neural nets/TrNNs_ART/captions html/ I also changes all dirs in pHtmlPathExclL to [start, end] with `/ >> oops, did AFTER dWeb_get_dWebPageL when I saw mistakes +--+ # dWeb_get_dWebPageL(no args) - generate pHtmlPath[All_, TrNN]L of Howell's webPages # does povrL_idx_strTst_cut check of htmls, add to pHtmlPathExclL try again : 11:58$ bash "$d_bin"'fileops run webSite.sh' +--+ "230817 change [pStrPAll_L, pHtmlClassAll_L, pHeader template TrNN.txt]" : /ProjMini/Transformer NNs/ /Neural nets/Transformer NNs/ /ProjMini/TrNNs_ART/ /Neural nets/TrNNs_ART/ # Manually edit AFTER doing dWeb_get_dWebPageL (these are NOT updated by script) : OK 'pStrPAll_L change.txt' - OK pHtmlClassAll_L - class must be MANUALLY [add, modify]ed for each line # pHeader templates - manually [check, edit] x "$webWork"'pHeader template.txt' OK "$d_web"'Neural nets/TrNNs_ART/webWork/pHeader template TrNN.txt' # new webPage manually add to pHtmlClassAll_L.txt : x automatically added by pHtmlPathAll_L.txt, but check OK done x not affected by "230817 change" * ignored, didn't check # comment line +-----+ povr_pStrP_replace_run() test : povrL_pStrP_replace 1 1 "$d_webWork"'pHtmlPathAll_L target1 230915.txt' "$d_webWork"'pStrPAll_L change.txt' 13:26$ bash "$d_bin"'fileops run webSite.sh' /home/bill/web/ProjMini/TrNNs_ART/Introduction.html pLog : povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) 230917 13h26m12s povrL_pStrP_replace error: povr doesnt exist : /home/bill/web/ProjMini/TrNNs_ART/Introduction.html >> hah! 
I forgot to change test[1, 5] cp edit] new files : pHtmlPathAll_L target[1, 5] 230917.txt try again : povrL_pStrP_replace 1 1 "$d_webWork"'pHtmlPathAll_L target1.txt' "$d_webWork"'pStrPAll_L change.txt' 13:50$ bash "$d_bin"'fileops run webSite.sh' /home/bill/web/Neural nets/TrNNs_ART/Introduction.html junk pLog messages - no loger need : 230917 13h50m30s povr_strP_replace error, strP is [empty, no-tab] : 17Sep2023 I will have to look for old noteszc@z trials povr_strP_replace() removed : else # comment out except for [test, debug] echo >>"$pLog" "$date_ymdhms povr_strP_replace error, strP is [empty, no-tab] :" echo >>"$pLog" " $strP____448" Now check Introduction.html for substitutions : >> OK, a few errors as before (John Taylor?), leave for now... +-----+ Next test of 5 files : povrL_pStrP_replace 1 1 "$d_webWork"'pHtmlPathAll_L target5.txt' "$d_webWork"'pStrPAll_L change.txt' ******* +-----+ Introduction.html : +--+ old errors : bad links, pStrP style : /home/bill/web/ProjMini/TrNNs_ART/John Taylors concepts.html /home/bill/web/ProjMini/TrNNs_ART/Taylors consciousness.html /home/bill/web/ProjMini/TrNNs_ART/Introduction.html#Credibility from non-[bio, psycho]logical applications of Grossberg's ART /home/bill/web/ProjMini/TrNNs_ART/Introduction.html#Credibility from non-[bio, psycho]logical applications of Grossberg's ART >> bad bookmark? no link? : That paralleled their use in very widespread applications in [science, engineering, etc]. +--+ >> same result. I only spot-checked because previous step focussed on this file. 
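Re the "strP is [empty, no-tab]" pLog noise above (and the 16Sep2023 ToDo to trap such lines) : a minimal sketch of the guard, skipping blank and tab-less lines before sed ever sees them. Variable names follow povr_strP_replace(); the sample pStrP content and output file are hypothetical.

```shell
# sample pStrP with one good pair, one empty line, one no-tab line
pStrP='/tmp/demo pStrP.txt'
pOut='/tmp/demo strP out.txt'
printf 'old str\tnew str\n\nno-tab line\n' >"$pStrP"
tab=$(printf '\t')
: >"$pOut"
while IFS= read -r strP; do
	case "$strP" in
		*"$tab"*)	# valid "strOld<tab>strNew" pair
			strOld=${strP%%"$tab"*}
			strNew=${strP#*"$tab"}
			printf 'replace [%s] -> [%s]\n' "$strOld" "$strNew" >>"$pOut"
			;;
		*)	;;	# [empty, no-tab] strP : skip (optionally log to pLog)
	esac
done <"$pStrP"
cat "$pOut"	# prints: replace [old str] -> [new str]
```

Only the tab-bearing line survives, so the "comment out except for [test, debug]" echo would never fire on junk lines.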
+-----+ pMenuTopMenu TrNNs_ART.html : old error : missing content in webPage /home/bill/web/ProjMini/TrNNs_ART/Pribram 1993 quantum fields and consciousness proceedings.html >> same +-----+ Howell - videos.html : +--+ old errors : >> all OK +--+ >> all OK - spot check +-----+ _Kyoto Premise - the scientists arent wearing any clothes.html +--+ old errors : bad links : /home/bill/web/ProjMajor/Climate - Kyoto Premise fraud/home/bill/web/ProjMajor/Climate - Kyoto Premise fraud/Lindzen - Don't Believe the Hype - Al Gore is wrong.pdf +--+ >> OK its the same +-----+ _Pandemics, health, and the sun.html : +--+ old errors : >> none - all internal links are OK +--+ >> OK its the same +-----+ additional issues I forgot that I changed a function to use : # sed use as, eg \([$chrL_path]+\) # [] - will be a problem!! sometimes used in my fNams chrL_fNam="A-Za-z0-9._-()<>&%#" chrL_path="$chrL_fNam"'\/' # usually for website management, I start only with base path, worry about bookmarks later chrL_fNamWithBookmk="A-Za-z0-9._-()<>&%#" chrL_pathWithBookmk="$chrL_fNamWithBookmk"'\/' It was : webSite_get_internalLinks(no args) - extract internal links (to within webSite) But this does not affect current work. +-----+ olde code "$d_bin"'fileops run webSite.sh' : # pinnL_idx_strTst_cut pinnL idx strTst - remove non-webPages from pinnL, save to pCutL # when line(idx) ! 
== " /home/bill/web/economics, markets/market data/SLregress/200912 semi-log/1872-2020 SP500 index semi-log detrended 1871-1926 & 1926-2020, TradingView.xcf /home/bill/web/eir3.gif /home/bill/web/../eir_subscribe_button.gif /home/bill/web/eirtoc/2000/eirtoc_2742.html /home/bill/web/!!linkError!!/national/nationalpost/ /home/bill/web/!!linkError!!/national/nationalpost/search/ /home/bill/web/webOther/Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF Reinstated files from archive (outdated, but hold a place) : /webWork/webSite summary of [file, link] cart [types, counts].txt /webWork/webSite summary of [fail, unknown, OK,total] links.txt Seem fine? : /home/bill/web/CompLangs/LibreOffice/LibreCalc bank account macro system.txt /ProjMajor/History/Ukraine-Russia/emails, blogs on Ukraine Russia.html /ProjMajor/History/Ukraine-Russia/news items except [ukrinform, kyivindependent].html /ProjMajor/History/Ukraine-Russia/ukrinform.net news log.html /ProjMini/TrNNs_ART/Pribram 1993 quantum fields and consciousness proceedings.html Junk : /home/bill/web/ProjMini/hydrogen/ --> #08********08 #] 15Sep2023 update all webPage Menus # pOvrClassL_put_pHead - replace existing header with new pOvrClassL_put_pHead "$d_webWork"'pHtmlClassAll_L test1.txt' pOvrClassL_put_pHead "$d_webWork"'pHtmlClassAll_L test5.txt' pOvrClassL_put_pHead "$d_webWork"'pHtmlClassAll_L.txt' >> I checked all tests, corrected the template >> I then did all webPages $ bash "$d_webWork"'fileops run webSite.sh' cp: cannot stat '/home/bill/web/ProjMajor/Sun climate/Sun Charvatova/_Charvatova - solar inertial motion & activity.html': No such file or directory grep: /home/bill/web/ProjMajor/Sun climate/Sun Charvatova/_Charvatova - solar inertial motion & activity.html: No such file or directory cat: '/home/bill/web/ProjMajor/Sun climate/Sun Charvatova/_Charvatova - solar inertial motion & activity.html': No such file or directory mv: cannot move 
'/media/bill/ramdisk/pOvrClassL_put_pHead pOvr temp.txt' to '/home/bill/web/ProjMajor/Sun climate/Sun Charvatova/_Charvatova - solar inertial motion & activity.html': No such file or directory >> many bad paths like this Must fix pOvrClassL_put_pHead() : y first created "$d_webWork"'pHeader template TrNN.txt' script must y extract TrNN classes - > "$d_ProjMini"'TrNNs_ART/webWork/pHtmlClassTrNNL (dont edit - pOvrClassL_put_pHead).txt' replace TrNN headers (2nd replacement in pOvrClassL_put_pHead) test : $ pOvrClassL="$d_webWork"'pHtmlClassAll_L.txt' $ pOvrClassTrNNL="$d_ProjMini"'TrNNs_ART/webWork/pHtmlClassTrNNL (dont edit - pOvrClassL_put_pHead).txt' $ cat "$pOvrClassL" | grep '\/ProjMini\/TrNNs_ART\/' >"$pOvrClassTrNNL" >> OK doing TrNN : $ bash "$d_webWork"'fileops run webSite.sh' cp: cannot stat '': No such file or directory grep: : No such file or directory cat: '': No such file or directory cp: cannot stat '': No such file or directory grep: : No such file or directory cat: '': No such file or directory cp: cannot stat '': No such file or directory grep: : No such file or directory cat: '': No such file or directory .... >many error like this - maybe the whole subSite? 15Sep2023 Also - TrNN webPages are still coded to online webSite! +-----+ povrs in pHtmlClassAll_L are being updated, but NOT pHtmlClassTrNNL!? nothing new in : pLogger="$d_webWork""$date_ymdhms pOvrClassL_put_pHead logger.txt" 13:19$ date_ymdhms=$(date +"%y%m%d %kh%Mm%Ss") ~ 13:24$ pLogger="$d_webWork""$date_ymdhms pOvrClassL_put_pHead logger.txt" ~ 13:24$ echo 'test' >"$pLogger" ~ 13:25$ echo "$date_ymdhms" 230915 13h24m38s ~ 13:25$ echo "$pLogger" /home/bill/web/webWork/230915 13h24m38s pOvrClassL_put_pHead logger.txt ~ 13:26$ >> even terminal output isn't creating pLogger 13:26$ echo 'hello tst' >"$d_temp"'hello test.txt' ~ 13:29$ >> this did write to "$d_temp"'hello test.txt' so what are the problems? pOvrClassL_put_pHead() #close 8 # 09Sep2023 causes bash error (why?) 
but still works 230915 9h05m13s pOvrClassL_put_pHead logger.txt >> note the space instead of 0 for the hour 13:32$ date_ymdhms=$(date +"%y%m%d %0kh%0Mm%0Ss") ~ 13:34$ echo "$date_ymdhms" 230915 13h34m08s ~ 13:34$ 13:34$ date_ymdhms=$(date +"%0y%0m%0d %0kh%0Mm%0Ss") ~ 13:35$ echo "$date_ymdhms" 230915 13h35m23s ~ 13:35$ >> note that the leading (padding) 0 format works Log file shows that +-----+ pHtmlClassAll_L MenuTops pOvrClassL = /home/bill/web/webWork/pHtmlClassAll_L test5.txt pHeadTpltL = /home/bill/web/webWork/pHeader template.txt pOvrClass = /home/bill/web/home.html home pOvrr = /home/bill/web/home.html class = home pthCmt = ********************** OH NUTS! - I forgot to change "pHtmlClassAll_L test5.txt' to 'pHtmlClassAll_L.txt' ********************** OK - now it works, including TrNN, with same list of bad files file:///home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopCopyright%20TrNNs_ART.html#Grossberg >> MenuTop link doesn't work, others do I don't have "' at top of MenuTops? TrNN Menu only shows Grossberg - add others! still problem of http://www.Billhowell.ca +-----+ olde code # ********************* # 09Aug2023 OLD VERSION! # bash "$d_bin"'webSite update.sh' - ??? 
# +-----+ # ToDos : # 23Feb2023 plan forward # [test, fix] pHtml_cutPutEmbeds() - [cut old, insert new] embedExecute html files # webLocal_update() - copy all webPages to webWork so all viewing is local (no backups) # webSite_update_lftp() - lftp to [upload, delete extra] online [file, dir]s # pHtml_cutPutEmbeds() - generalize, then move to fileops.sh # +-----+ # URL [check, fix, change] : # pHtmlL_Howell_create() - Howell webPageL # pLinkL_HrefImgExtern_create() - external [HREF, IMG] links # QNialEmbeds_To_htmlEmbeds() - use pLsts with iterative tests # check_subDirL() - diff [webDirL from filemanager, webLink intern subDirL].txt # want to know any 'webLink intern subDirL' that are NOT in 'webDirL from filemanager' # +--------------------+ # Make changes # urls_update() - combined [dir, special] pStrPL changes, apply to webPages # webSite_update_lftp() - lftp to [upload, delete extra] online [file, dir]s # 09Aug2023 not created yet # ********************* # OLD VERSION! # bash "$d_bin"'webSite update.sh' - ??? 
# +-----+ # ToDos : # 23Feb2023 plan forward # [test, fix] pHtml_cutPutEmbeds() - [cut old, insert new] embedExecute html files # webLocal_update() - copy all webPages to webWork so all viewing is local (no backups) # webSite_update_lftp() - lftp to [upload, delete extra] online [file, dir]s # pHtml_cutPutEmbeds() - generalize, then move to fileops.sh # +-----+ # URL [fix, change, check] : # pHtmlL_Howell_create() - Howell webPageL # QNialEmbeds_To_htmlEmbeds() - use pLsts with iterative tests # check_subDirL() - diff [webDirL from filemanager, webLink intern subDirL].txt # want to know any 'webLink intern subDirL' that are NOT in 'webDirL from filemanager' # +-----+ # create_pStrPL_dirChanges() - known recent movements of dirs # create_pStrPL_dirChanges() - known recent movements of dirs # special pStrpL - are often created manually # combine_pStrLP - cat [dirChanges, special] pStrPL changes, all my webPages # +--------------------+ # Make changes # urls_update() - combined [dir, special] pStrPL changes, apply to webPages # pHtml_cutPutEmbeds() - [cut old, insert new] embedExecute html files # webSite_update_lftp() - lftp to [upload, delete extra] online [file, dir]s # +-----+ # alter line sequence in text file # povrL_idx_strTst_cut() - remove non-webPages from pinnL, save to pCutL # eg for non-webPages when line(idx) ! == " >> arXiv these files, not needed >> actually, these live on in links, see next step As a quick fudge, remove all lines that don't start with /home/bill/web (including all external) change : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*||g' | sed 's|#.*||' | sort -u >"$pWebSiteLinkL901" to : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*||g' | grep "^/home/bill/web/" | sed 's|#.*||' | sort -u >"$pWebSiteLinkL901" >> down from 701 to 556 lines >> example problems /home/bill/web/??? 
/home/bill/web/20120 [before, after] running head-on into a semi-tractor trailor hauling propane.jpg" NAME="Bill HOWELL photo change : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*||g' | grep "^/home/bill/web/" | sed 's|#.*||' | sort -u >"$pWebSiteLinkL901" to : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*||g' | grep "^/home/bill/web/" | sed 's|\".*||' | sed 's|#.*||' | sort -u >"$pWebSiteLinkL901" >> down from 701 to 540 lines change : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*||g' | grep "^/home/bill/web/" | sed 's|\".*||' | sed 's|#.*||' | sort -u >"$pWebSiteLinkL901" to : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*||g' | grep "^/home/bill/web/" | sed 's|\".*||' | sed 's|#.*||' | sed 's|^\/home\/bill\/web\/$||' | sed 's|^\/home\/bill\/web\/\?\?\?$||' | sort -u >"$pWebSiteLinkL901" eventually : cat "$pHtmlPathAllL901" | tr \\n \\0 | xargs -0 -IFILE grep -i '\/home\/bill\/web\/' "FILE" | sed 's|.*.*|\1|Ig;s|.*|\1.html|Ig' | sed 's| /home/bill/web|\n/home/bill/web|g' | sed 's|
    ||g' | grep "^/home/bill/web/" | sed 's|\".*||' | sed 's|#.*||' | sed 's|^\/home\/bill\/web\/$||' | sed 's|^\/home\/bill\/web\/???$||' | grep --invert-match "^$" | sort -u >"$pWebSiteLinkL901"
>> down from 701 to 535 lines
#08********08
#] 14Sep2023 scan all webPages for broken links
nothing in fileops.sh -> get code from QNial & port
"$d_Qndfs"'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
Components that look like I should [adapt, port] to bash :
*********************
loaddefs link d_Qndfs 'webSite/webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
+-----+
Find all d_webRawe html files related to webSite [URLs, convert, update]
	countPathsDirs IS - read stable [path, dir] lists
+-----+
URLs - check and count, helps for debugging
	webURLs_extract IS - extract all link urls from a website [external, internal, menu, pagePosn]
		this is independent of other optrs, it creates d_temp files, to be further processed by urls_check
	urls_check IS OP linkType - create sublists of [internal, external] links classed as [fail, OK]
		check internal with path_exists "f, externals with curl
	webSite_link_counts IS - summarize the counts for links [external, internal, menu, tableOfContent]s
+-----+
Instructions for individual web-[page, site] updates (check at each step!)
+-----+
Check : 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
	webURLs_extract IS - extract all link urls from a website
fileops.sh webSite_get_internalLinks() - extract internal links (to within webSite)
>> the bash version is supposed to find links & save to a file
>> I should "hard code" [pHtmlPathAll_L.txt,
>> webSite_get_internalLinks has not yet been tested!
fileops.sh povrL_pStrP_fixLinks() (no args) - fix bad internal links (to within webSite)
>> don't mess with QNial version (much more complicated) unless I run into problems
Instructions for individual web-[page, site] updates (check at each step!)
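The long sed chain above (701 -> 535 lines) can be cross-checked with a simpler extraction pass : `grep -o` pulls just the quoted HREF/IMG targets, then the same [bookmark, quote, empty] cleanups apply. A sketch only — the sample html and output path are hypothetical, and this ignores unquoted attributes that the real webPages may contain.

```shell
# stand-in webPage with one internal link, one external, one bookmark
pHtml='/tmp/demo page.html'
pLnk='/tmp/demo links.txt'
printf '%s\n' \
	'<A HREF="/home/bill/web/home.html#top">home</a>' \
	'<A HREF="https://example.org/x">ext</a>' >"$pHtml"
# -o : print only the matched part; then strip attr wrapper and #bookmarks
grep -oi 'href="[^"]*"' "$pHtml" \
	| sed 's|^[Hh][Rr][Ee][Ff]="||; s|"$||; s|#.*||' \
	| grep '^/home/bill/web/' | sort -u >"$pLnk"
cat "$pLnk"	# prints: /home/bill/web/home.html
```

Comparing its output against pWebSiteLinkL901 would flag whatever the `.*.*` sed patterns are silently mangling.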
DON'T fly-by-seat-of-pants!!! This is too complex at my age : "$d_Qndfs"'1_webSite instructions.txt' >> forget it for now. >> Maybe in a year or so, after I've "perfected" (hah!) "$d_webWork"'fileops run webSite.sh' +-----+ olde code # dWeb_addHdrFtr_dTmp() - prepare files in dWeb for upload # html files MUST be pre-processed, others just copied # see "$d_PROJECTS""bin - secure/lftp update specified dir.sh" dWeb_addHdrFtr_dTmp() { dWeb="$1" dTmp1="$d_temp"'dWeb1/' dTmp2="$d_temp"'dWeb2/' pthL="$d_temp"'dWeb_addHdrFtr_dTmp pthL.txt' pInL="$d_temp"'dWeb_addHdrFtr_dTmp pInL.txt' if ! [ -d "$dTmp1" ]; then mkdir "$dTmp1" fi if ! [ -d "$dTmp2" ]; then mkdir "$dTmp2" fi find "$dWeb" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" | sort -u >"$pthL" while read pinn; do pout=$( echo "$pinn" | sed "s|$dWeb|$dTmp1|" ) echo "$pout" cat "$d_web"'webWork/head_one.html' >"$pout" grep "^" "$pinn" >>"$pout" cat "$d_web"'webWork/head_two TrNNs&ART.html' >>"$pout" grep "^pInsert] " "$pinn" | sed 's|^pInsert] ||' >"$pInL" while read pInsert; do cat "$pInsert" >>"$pout" done < "$pInL" cat "$d_web"'webWork/head status TrNNs&ART.html' >>"$pout" grep --invert-match "<TITLE>\|pInsert] " "$pinn" >>"$pout" cat "$d_web"'webWork/footer TrNNs&ART.html' >>"$pout" done < "$pthL" echo '+-----+' # change pth to URL find "$dTmp1" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive" | sort -u >"$pthL" while read pinn; do pout=$( echo "$pinn" | sed "s|$dTmp1|$dTmp2|" ) echo "$pout" sed 's|/home/bill/web|http://www.BillHowell.ca|' "$pinn" >"$pout" done < "$pthL" echo '+-----+' find "$dWeb" -type f | grep --invert-match "z_Old\|z_Archive\|html" | sort -u >"$pthL" while read pinn; do pout=$( echo "$pinn" | sed "s|$dWeb|$dTmp2|" ) echo "$pout" cp -p "$pinn" "$pout" # don't want changes to -p!? 
done < "$pthL" } # echo '/home/bill/web/ProjMini/TrNNs-ART/Introduction.html' | sed 's|/home/bill/web|http://www.BillHowell.ca|' # $ dTmp1="$d_temp"'dWeb1/' # echo '/home/bill/web/ProjMini/TrNNs-ART/Introduction.html' | sed "s|/home/bill/web/ProjMini/TrNNs-ART/|$dTmp1|" # dWeb="$d_web"'ProjMini/TrNNs-ART/' # find "$dWeb" -type f | grep --invert-match "z_Old\|z_Archive\|html" | sort -u >"$d_temp"'dWeb_addHdrFtr_dTmp pthL.txt' # find "$dWeb" -type f \( -name "*.html" \) | grep --invert-match "z_Old\|z_Archive" | sort -u >"$d_temp"'dWeb_addHdrFtr_dTmp pthL.txt' #08********08 #] 14Sep2023 now do all webPages problematic links that I noticed with a quick overview : file:///home/bill/web/Wilson%201977%20Cosmic%20trigger,%20Howells%20review.html #08********08 #] 14Sep2023 now test povrL_pStrP_replace 0 1 "$d_webWork"'pHtmlPathAll_L test5.txt' ... page blogs.html >> all OK (my internals) page crazy themes and stories.html >> all OK (my internals) page hosted subsites.html >> all OK (my internals) page Howell - blog.html >> most are good except some like : file:///home/bill/web/CompLangs/LibreOffice%20macros/LibreCalc%20bank%20account%20macro%20system.txt file:///home/bill/web/CompLangs/LibreOffice%20macros/Howell_codification_macLib/ #08********08 #] 14Sep2023 test: povrL_pStrP_replace 0 1 "$d_webWork"'pHtmlPathAll_L test1.txt' ... povrL_pStrP_replace 0 1 "$d_webWork"'pHtmlPathAll_L test1.txt' "$d_webWork"'pStrPAll_L change.txt' again, no changes to home.html >> why, tests work ...? 
pLog="$d_bin"'fileops log.txt' povrL_pStrP_replace error: OR[povrL, pStrP] doesnt exist povrL : /home/bill/web/webWork/pHtmlPathAll_L test1.txt pStrP : /home/bill/web/webWork/pStrPAll_L change.txt shit change [povrL, pStrP]___384 $ bash "$d_webWork"'fileops run webSite.sh' 230914 16h03m38s povrL_pStrP_replace error: OR[povrL, pStrP] doesnt exist povrL : /home/bill/web/webWork/pHtmlPathAll_L test1.txt pStrP : /home/bill/web/webWork/pStrPAll_L change.txt >> pHtmlPathAll_L test1.txt is saved as pHtmlPathAll_L_test1.txt $ bash "$d_webWork"'fileops run webSite.sh' /home/bill/web/home.html 230914 16h07m20s povr_pStrP_replace error: OR[povr, pStrPL] doesnt exist : povr : /home/bill/web/home.html pStrP : /media/bill/ramdisk/povrL_pStrP_replace pStrP escaped.txt >> not used : pStrpPtmp384="$d_temp"'povrL_pStrP_replace pStrP escaped.txt' >> get rid of it - arXiv is offered get rid of tab in povrL_pStrP_replace() [ -s "$povr384" ]; $ bash "$d_webWork"'fileops run webSite.sh' >> terminal doesn't show home.html (return again to this error) povrL_pStrP_replace error: povr doesnt exist : /home/bill/web/home.html >> this is bullshit, why the error? >> oops, I had povr_pStrP_replace "$bolArXiv384" "$bolChrCd384" "$povr384" "$povr384" instead of povr_pStrP_replace "$bolArXiv384" "$bolChrCd384" "$povr384" "$pStrP___384" change to while IFS='' read -u 961 povr961; do povr_strP_replace "$bolArXiv961" "$bolChrCd961" "$povr961" "$strP____961" done 961<"$povrL___961" on ramdisk 'povr_pStrP_replace povr temp.txt' has size zero, so the error is correct, but not for povrL_pStrP_replace()! NUTS!! home.html had been zero'd out, but nemi had to be reloaded to see this!!! now pMenuTops are fixed still problems like <HR><H4><A HREF="/home/bill/web/Projects - major/Stalin supported Hitler/">Icebreaker Unchained : we should have lost WWII</a></h4> Missing again from 'pStrPAll_L change.txt' !!?? 
/Projects - major/ /ProjMajor/ /Projects - mini/ /ProjMini/ run again, test links in home.html with browser : >> looks good, but problems with : file:///home/bill/web/Projects%20-%20major/Stalin%20supported%20Hitler/ file:///home/bill/web/Projects%20-%20major/Icebreaker/ ... >> multiple replacements not done!! povr_strP_replace() change : sed "s|$strOld448|$strNew448|" "$povTmp448" >"$povTm_448" to : sed "s|$strOld448|$strNew448|g" "$povTmp448" >"$povTm_448" >> halleluyah! +-----+ olded code #pStrpPtmp384="$d_temp"'povrL_pStrP_replace pStrP escaped.txt' #08********08 #] 14Sep2023 pOvrClassL_get_pClassL - after fixes combining [bolPovEs, bolPstrEs] -> bolChrCd "$d_webWork"'fileops run webSite.sh' - process to update [, sub]webSites again : STILL includes ghost classes!??? >> I don't get it, later.. #08********08 #] 14Sep2023 some are wrong!! "$d_bin"'0_test/fileops/fileops test.sh' see "$d_bin"'fileops notes.txt' A whole day to fix up changes from #08********08 #] 12Sep2023 povrL_pStrP_replace 0 1 pHtmlPathAll_L pStrPAll_L change.txt' via "$d_bin"'fileops run webSite general.sh' geany normal search \/ replace / pStrPAll_L change.txt first do testing!!! povrL_pStrP_replace 0 1 "$d_webWork"'pHtmlPathAll_L test1.txt' "$d_webWork"'pStrPAll_L change.txt' >> no links work - home.html hasn't been changed! >> William Astle must have reset my webPage to earlier version, all changes lost >> add all old changes back to 'pStrPAll_L change.txt' I lost the old lists From broken links in pHtmlPathAll_L_test5.txt webPages : /Menu.html /pMenuTopMenu.html /status.html /pMenuTopStatus.html /copyright.html /pMenuTopCopyright.html /help.html /pMenuTopHelp.html /Projects - major/ /ProjMajor/ /Projects - mini/ /ProjMini/ /SP500/multi-fractal/ /market data/SLregress/200912 semi-log/ /Software programming & code/ /CompLangs/ >> but all of the MenuTops WERE listed - the changes aren't working and should be easy Bad external links -> MOST external links now fail! 
:
	https://www.mackinac.org/SP1998-01
	https://www.tradingview.com/chart/SPX500/4mxAC2o8-Stock-prices-have-reached-a-permanently-high-plateau/
	https://suspicious0bservers.org/
Why isn't povrL_pStrP_replace() working? This is crucial, and was meant to be a no-brainer!
	povr_pStrP_replace "$bolArXiv384" "$bolChrCd384" "$povr384" "$pStrpPtmp384"
povr_pStrP_replace() {
	bolArXiv607="$1" # 1= archive povr
	bolChrCd607="$2" # 1= escape (code) human <--> sed search-replace
	povr_607="$4" # file to overwrite after sed search-replace
	pStrP607="$5" # str[Old, New] search-replace pairs
>> NUTS!! I didn't renumber the args!!
Try again
>> still home.html hasn't been changed!
povr_pStrP_replace()
	povr_strP_replace 0 0 0 "$povr_tmp607" "$strP607"
	>> still has pinn_backupDatedTo_zArchive, etc but that's OK, I should retain
povrL_pStrP_replace() keep as is
povr_pStrP_replace()
	if [ 1 -eq "$bolChrCd607" ]; then
		pHum_sed_pCde "$povr_607" "$povr_tmp607"
		pHum_sed_pCde "$pStrP607" "$pStrPtmp607"
>> nyet: this last line is WRONG!? must code "$pStrPtmp607"
+-----+
14Sep2023 I need to go through "$d_bin"'0_test/fileops/fileops test.sh'
+-----+
olde code
# webSite_make_Menu() - build one Menu for my entire webSite
webSite_make_Menu() {
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu neural nets.html' "$d_webWork"'Menu projects.html' "$d_webWork"'Menu software programming.html' "$d_webWork"'Menu professional and resume.html' "$d_webWork"'Menu Howell videos.html' "$d_webWork"'Menu blogs.html' "$d_webWork"'Menu hosted subsites.html' >"$d_webWork"'Menu zhome, all.html'
}
pStrPMenuTopL change.txt - no longer used, to z_history
#08********08
#] 13Sep2023 pOvrClassL_get_pClassL() STILL adds "ghost" classes!???
STILL includes ghost classes!???
>> I don't get it, later..
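The "didn't renumber the args" bug above is cheap to catch with a fail-fast arg-count guard at the top of the function. A sketch only — the ...607 names follow povr_pStrP_replace() in the notes, but the body here is a placeholder echo, not the real sed logic.

```shell
povr_pStrP_replace() {
	# guard against the "$4"/"$5"-of-4-args class of bug
	if [ "$#" -ne 4 ]; then
		echo >&2 "povr_pStrP_replace error: expected 4 args, got $#"
		return 1
	fi
	bolArXiv607="$1"	# 1= archive povr
	bolChrCd607="$2"	# 1= escape (code) human <--> sed search-replace
	povr_607="$3"		# file to overwrite after sed search-replace
	pStrP607="$4"		# str[Old, New] search-replace pairs
	echo "ok: $povr_607 $pStrP607"	# placeholder for the real work
}
povr_pStrP_replace 1 1 '/tmp/povr.html' '/tmp/pStrP.txt'
```

Had the guard been in place, the original `"$4"`/`"$5"` indexing would have died loudly instead of silently leaving home.html unchanged.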
+-----+
olde code
# 09Sep2023 currently missing classes in d_TrNNs_ART, confGuides
#08********08
#] 14Sep2023 replace [bolPovEs, bolPStPEs] with bolChrCd, modify code
see "$d_bin"'fileops notes.txt'
#08********08
#] 13Sep2023 pHtmlClassAll_L.txt missing webWork webPages!
pHtmlPathExclL.txt :
	/home/bill/web/webWork/confFoot_authors.html
	/home/bill/web/webWork/confFoot.html
	/home/bill/web/webWork/confHead.html
	/home/bill/web/webWork/confStatus.html
	/home/bill/web/webWork/footer Neil Howell.html
	/home/bill/web/webWork/footer normal.html
	/home/bill/web/webWork/footer organisations.html
	/home/bill/web/webWork/footer Paul Vauhan.html
	/home/bill/web/webWork/footer Steven Wickson.html
	/home/bill/web/webWork/footer Steven Yaskell.html
>> these are OK, doesn't include pMenuTop files
pHtmlClassAll_L.txt
>> no webWork files, problem
pHtmlPathAll_L.txt
>> has TrNN_ART, but nothing else
	/home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopCopyright TrNN_ART.html
	/home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopHelp TrNNs_ART.html
	/home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopMenu TrNNs_ART.html
	/home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopStatus TrNNs_ART.html
I must have forgotten to put '<!-- /home/bill/web/' into these files?
>> nope, they all have it
tired, stop here for the day...
#08********08
#] 13Sep2023 upload ConfGuide, callerID-SNNs
then go on to peer review!
14Sep2023 CIQ-SNN (Sick Sin) CallerID Quantum SNN - ignoring entanglement, many similarities
	maybe even entanglement? I'm not a fan of Quantum Mechanics (QM), but it may be fun to play with.
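Re the "I must have forgotten to put '<!-- /home/bill/web/' into these files?" check above : `grep -L` lists files that *lack* a match, so one pipeline audits the whole tree for the marker that dWeb_get_dWebPageL() keys on. Sketch with `/tmp` stand-in files, not the real d_web.

```shell
# stand-in dir : a.html has the marker comment, b.html does not
d='/tmp/demo marker check/'
mkdir -p "$d"
printf '<!-- /home/bill/web/ProjMini/a.html -->\n' >"$d"'a.html'
printf '<html>no marker</html>\n' >"$d"'b.html'
# -L : print names of files with NO matching line
find "$d" -type f -name '*.html' -print0 \
	| xargs -0 grep -L '<!-- /home/bill/web/'
```

Running that over d_web would have confirmed ">> nope, they all have it" in one command instead of opening files by hand.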
14Sep2023 pOvrL_pStrPL_replace - should either [en, de]code [neither, both] [pOvr, pStrP] #08********08 #] 13Sep2023 pHtmlClass[Conf, Genr, MenuTop, TrNN, WebW]L.txt class must be manually added to each line of pHtmlClass[All_, TrNN]L.txt first - add I moved all (except pMenuTop files) of pHtmlClassTrNNL.txt to pHtmlClassAll_L.txt I assigned classes : Grossberg to all 'Grossberg*.html' files GrossVideo to Grossberg video related TrNN_ART to others generate a fresh list of classes "$d_webWork"'fileops run webSite.sh' pOvrClassL_get_pClassL "$d_webWork"'pHtmlClassAll_L.txt' "$d_webWork"'pClassL.txt' >> nuts, still get old classes geany search for them n Bill Howells book [note, review]s n Bill Howells videos n economics, markets Hosted subSites Howell blogs Neural nets no-class Nuclear Personal Professional Projects ProjMajor ProjMini Software programming >> stop looking The problem is that pClassL.txt is being deleted! if [ -f "$pClassL" ]; then rm "$pClassL" fi >> what's wrong with this? replace -f with -s!!! >> nyet - still doesn't clean it out I commented out the section to add classes - it did rm pClassL.txt >> so much for that theory Problem is that the old classes ARE being added!??? why? Might be because I didn't escape `/ in pHtmlClassAll_L.txt? I backed up file and did that with geany un-commented section that adds class search for \tmarket class works - OK search for 'Neural nets' fails - OK but pOvrClassL_get_pClassL() STILL adds "ghost" classes!??? class=$( echo "$pOvrClass" | sed 's|.*\x9||' ) >> I don't understand, leave it for later... 13Sep2023 pOvrClassL_get_pClassL() STILL adds "ghost" classes!??? cp backup to pClassL.txt and continue with next steps have to solve this problem later... 
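On the [-f, -s] question above: `-f` is true for any existing regular file, even zero-length, while `-s` additionally requires nonzero size. A throwaway demo on a temp file:

```shell
# -f vs -s : existence versus nonempty existence
tmp=$(mktemp)                  # mktemp creates a zero-length file
[ -f "$tmp" ] && echo "f-yes"  # true even for an empty file
[ -s "$tmp" ] || echo "s-no"   # -s is false while the file is empty
echo data >"$tmp"
[ -s "$tmp" ] && echo "s-yes"  # true once there is content
rm "$tmp"
```

So `if [ -f "$pClassL" ]; then rm "$pClassL"; fi` does already delete an existing pClassL.txt; swapping in `-s` only changes behaviour for the zero-length case, which fits the later observation that the old classes were being re-added rather than left over.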
#08********08 #] 13Sep2023 put povrL_idx_strTst_cut in dWeb_get_dWebPageL(), adds to pHtmlPathExclL $ bash "$d_webWork"'fileops run webSite.sh' /home/bill/web/bin/fileops.sh: line 754: warning: command substitution: ignored null byte in input /home/bill/web/bin/fileops.sh: line 754: warning: command substitution: ignored null byte in input >> everything worked, including additions to pHtmlPathExclL, except for error >> 'pMenuTopStatus TrNNs_ART.html' missing from list, but file is OK # still has an error, but works anyway. Deal with this later... # seems to be problem with TrNN_ART webWork status file >> ah-hah! first line of ?? change : http://www.BillHowell.ca/ to : /home/bill/web/ but do I really want to exclude TrNN_ART/WebWork? >> NO - I took out that line, pHtmlPathGenrL no longer used +-----+ olde code pinnL_idx_strTst_cut() if [ -f "$pcut333" ];then rm "$pcut333" fi dWeb_get_dWebPageL() pHtmlPathGenrL769="$d_webWork"'pHtmlPathGenrL.txt' # no longer want : # grep --invert-match "ProjMini\/TrNNs_ART\/webWork" "$pHtmlPathAll_L769" | sort -u >"$pHtmlPathGenrL769" #08********08 #] 13Sep2023 update html files ONLY - fix recent errors # update html files ONLY - strP replacements may not change file last modified date-times? # updateLast="$1" # get last update from pLog in "$d_PROJECTS"'bin - secure/' # pHtmlOnlyL="$2" dir_updateHtml_dWeb '230826 12h00m00' "$d_web"'webWork/pHtmlPathGenrL.txt' $ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' /home/bill/PROJECTS/bin - secure/lftp update specified dir.sh: line 159: /home/bill/web/webWork/pHtmlPathGenrL.txt: No such file or directory date: extra operand ‘%y%m%d %kh%Mm’ Try 'date --help' for more information. 
>> ugly $ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' stat: cannot statx '/home/bill/web/ProjMajor/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html': No such file or directory sed: can't read /home/bill/web/ProjMajor/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html: No such file or directory stat: cannot statx '/home/bill/web/ProjMajor/Civilisations and sun/_Civilisations and the sun.html': No such file or directory sed: can't read /home/bill/web/ProjMajor/Civilisations and sun/_Civilisations and the sun.html: No such file or directory ... tons of these >> I must update "$d_web"'webWork/pHtmlPathGenrL.txt' I changed many pthNames $ bash "$d_webWork"'fileops run webSite.sh' /home/bill/web/bin/fileops.sh: line 2002: : No such file or directory grep: : No such file or directory /home/bill/web/bin/fileops.sh: line 2009: : No such file or directory /home/bill/web/bin/fileops.sh: line 2010: : No such file or directory grep: : No such file or directory Several more iterations of variable corrections : $ bash "$d_webWork"'fileops run webSite.sh' grep: sort: No such file or directory >> has accidentally removed a pipe $ bash "$d_webWork"'fileops run webSite.sh' ~ >> runs OK, check results >> includes the non-webPage html files should include pHtmlPathCutL.txt in pHtmlPathExclL.txt should do the webPage check in webSite_get_pHtmlL() rename webSite_get_pHtmlL() to webSite_get_webPageL() +-----+ olde code # 13Sep2023 comment-out several specialized lists so general processing applies #pHtmlPathConfL769="$d_webWork"'pHtmlPathConfL.txt' #pHtmlPathWebWL769="$d_webWork"'pHtmlPathWebWL.txt' # grep "Neural nets\/Conference guides" "$pHtmlPathAll_L769" | sort -u >"$pHtmlConfL769" #grep "ProjMini\/TrNNs_ART" "$pHtmlPathAll_L769" | grep --invert-match "webWork" | sort -u >"$pHtmlTrNNL769" #grep "webWork" "$pHtmlPathAll_L769" | sort -u >"$pHtmlWebWL769" # 
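The `grep: sort: No such file or directory` symptom above is the classic sign of a dropped pipe: without `|`, grep takes `sort` as a second input filename. A reproduction sketch on throwaway data:

```shell
# With the pipe, sort is a command; without the pipe, grep reads "sort" as a
# second input file and fails (assuming no file named "sort" in the fresh dir).
d=$(mktemp -d); cd "$d"
printf 'b\na\n' >list.txt
grep 'a' list.txt | sort              # intended pipeline
grep 'a' list.txt sort 2>&1 || true   # dropped pipe: "grep: sort: No such file or directory"
```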
webSite_make_Menu_bag() - build Menus for 1st level down from top-level of my webSite
# 04Sep2023 will likely discontinue this. It's easier to have one Menu for all
webSite_make_Menu_bag() {
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu blogs.html' >"$d_webWork"'Menu home blogs.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu hosted subsites.html' >"$d_webWork"'Menu home hosted subsites.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu Howell videos.html' >"$d_webWork"'Menu home Howell videos.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu Lies, Damned Lies, and Scientists.html' >"$d_webWork"'Menu home Lies, Damned Lies, and Scientists.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu neural nets.html' >"$d_webWork"'Menu home neural nets.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu professional and resume.html' >"$d_webWork"'Menu home professional and resume.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu projects.html' >"$d_webWork"'Menu home projects.html'
	cat "$d_webWork"'Menu home.html' "$d_webWork"'Menu software programming.html' >"$d_webWork"'Menu home software programming.html'
	}
# webTrNNsART_menus() - separate Menus for d_TrNNs_ART, to focus readers, avoid confusion
webTrNNsART_menus() {
	d_TrNN_ART="$d_web"'ProjMini/TrNNs_ART/webWork/'
	cat "$d_TrNN_ART"'Menu home.html' "$d_TrNN_ART"'Menu TrNNs&ART theme.html' "$d_TrNN_ART"'Menu Grossberg.html' "$d_TrNN_ART"'Menu [definitions, models] of consciousness.html' "$d_TrNN_ART"'Menu Let the machines speak.html' >"$d_TrNN_ART"'Menu zhome, all.html'
	cat "$d_TrNN_ART"'Menu home.html' "$d_TrNN_ART"'Menu TrNNs&ART theme.html' >"$d_TrNN_ART"'Menu zhome, TrNNs&ART theme.html'
	cat "$d_TrNN_ART"'Menu home.html' "$d_TrNN_ART"'Menu Grossberg.html' >"$d_TrNN_ART"'Menu zhome, Grossberg.html'
	cat "$d_TrNN_ART"'Menu home.html' "$d_TrNN_ART"'Menu [definitions, models] of consciousness.html' >"$d_TrNN_ART"'Menu zhome, [definitions, models] of consciousness.html'
	cat
"$d_TrNN_ART"'Menu home.html' "$d_TrNN_ART"'Menu Let the machines speak.html' >"$d_TrNN_ART"'Menu zhome, Let the machines speak.html' } # webTrNNsART_status() - construct webPageStatus files from top-current level # 10Aug2023 drop this for now : alphabetical order, just have webPage-specific # webTrNNsART_status() { d_TrNN_ART="$d_web"'ProjMini/TrNNs_ART/webWork/' cat "$d_TrNN_ART"'status ART assess theories of consciousness.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, ART assess theories of consciousness.html' cat "$d_TrNN_ART"'status ART augmentation of other research.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, ART augmentation of other research.html' cat "$d_TrNN_ART"'status [definitions, models] of consciousness.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, [definitions, models] of consciousness.html' cat "$d_TrNN_ART"'status Grossberg.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, Grossberg.html' cat "$d_TrNN_ART"'status [definitions, models] of consciousness.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, [definitions, models] of consciousness.html' cat "$d_TrNN_ART"'status home.html' "$d_TrNN_ART"'status [definitions, models] of consciousness.html' >"$d_TrNN_ART"'status zhome, [definitions, models] of consciousness.html' cat "$d_TrNN_ART"'status Let the machines speak.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, Let the machines speak.html' cat "$d_TrNN_ART"'status TrNNS&ART theme.html' "$d_TrNN_ART"'status home.html' >"$d_TrNN_ART"'status zhome, TrNNS&ART theme.html' } # pHtml_cutPutEmbeds() - list of html codes, see : # "$d_webWork"'embeds QNial sort -u.txt' # "$d_webWork"'embeds html list.txt' # list of html files, see : # "$d_webWork"'urls webWork [head, foot] pLst all.txt' # "$d_webWork"'urls webWork Menu pLst all.txt' # example : <!-- [#!:http://www.BillHowell.ca/webWork files/head_one.html:!#] --> #08********08 #] 12Sep2023 
continue pStrP fix MenuTop files 'pStrPGenrlL change.txt' now has : /Climate and sun/ /Sun climate/ /Civilisations and sun/ /Sun civilisations/ /Solar modeling and forecasting/ /Sun model, forecast/ /Charvatova solar inertial motion & activity/ /Sun Charvatova/ /Pandemics, health, and the Sun/ /Sun pandemics, health/ /ProjMini/Puetz \& Borchardt/ /ProjMini/PuetzUWS/ /home/bill/web/webWork/copyright.html /home/bill/web/webWork/pMenuTopCopyright.html /home/bill/web/webWork/help.html /home/bill/web/webWork/pMenuTopHelp.html /home/bill/web/webWork/Menu.html /home/bill/web/webWork/pMenuTopMenu.html /home/bill/web/webWork/status.html /home/bill/web/webWork/pMenuTopStatus.html 'pStrPMenuTopL change.txt' : #Bill Howells book [note, review]s #reviews #Bill Howells videos #videos #economics, markets #market #Hosted subSites #hosted #Howell blogs #myBlogs #Neural nets #neural #Nuclear #ProjMini #Personal #personal #Professional #professional #Projects #projMini #ProjMajor #projMajr ProjMini #projMini #Software programming #computer pHtmlClass[Conf, Genr, TrNN, WebW]L - changes : \tBill Howells book [note, review]s \treviews \tBill Howells videos \tvideos \teconomics, markets \tmarket \tHosted subSites \thosted \tHowell blogs \tmyBlogs \tNeural nets \tneural \tNuclear \tProjMini \tPersonal \tpersonal \tProfessional \tprofessional \tProjects \tprojMini \tProjMajor \tprojMajr \tProjMini \tprojMini \tSoftware programming \tcomputer After all that - must regenerate all webPage headers DON'T separate [Conf, TrNN, WebW] from Genr? just use different classes for [Conf, TrNN]? >> keep confGuides separate for now #08********08 #] 12Sep2023 pStrP fix MenuTop files # pOvrClassL_get_pClassL - generate a list of all classes pOvrClassL_get_pClassL "$d_webWork"'pHtmlClassGenrL.txt' "$d_webWork"'pClassL.txt' don't change ConfGuides, except backtrack (below) with [TrNN, CID_SNN] - [menu, status copyright]s are independent? >> later ... 
d_TrNN_ART insert <!-- /home/bill/web/ at top of page <!-- Howell end head --> after menu >> done Do this now (test first!) : 'pStrPConflL change.txt' \[#\=; backtrack ;\=\#] /home/bill/web/Neural nets/Conference guides/ >> done - works well Now to update online!! NOT "$d_bin"'webSite update.sh'!! - now done step-by-step [manual, auto] by "$d_webWork"'fileops run webSite general.sh' use "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' dir_updateHtml_dWeb '170101 12h00m00' "$d_web"'webWork/pHtmlClassConfL.txt' >> update entire webSite!! hopefully most links will work for now? NUTS - can't find many paths! : +-----+ $ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' stat: cannot statx '/home/bill/web/Bill Howells book [note, review]s/Wilson 1977 Cosmic trigger, Howells review.html'$'\t''reviews': No such file or directory sed: can't read /home/bill/web/Bill Howells book [note, review]s/Wilson 1977 Cosmic trigger, Howells review.html reviews: No such file or directory stat: cannot statx '/home/bill/web/Bill Howells book [note, review]s/Wilson 1990 Quantum consciousness, Howells review.html'$'\t''reviews': No such file or directory sed: can't read /home/bill/web/Bill Howells book [note, review]s/Wilson 1990 Quantum consciousness, Howells review.html reviews: No such file or directory stat: cannot statx '/home/bill/web/Bill Howells videos/160901 Big Data, Deep Learning, and Safety/0_Big Data, Deep Learning, and Safety.html'$'\t''videos': No such file or directory sed: can't read /home/bill/web/Bill Howells videos/160901 Big Data, Deep Learning, and Safety/0_Big Data, Deep Learning, and Safety.html videos: No such file or directory ... 
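The stat/sed failures above follow from feeding the two-column class list (path<TAB>class) to code that expects bare paths, so each whole line, tab and all, is treated as one filename. A hypothetical pre-processing step (file names here are stand-ins, not the real lists):

```shell
# Strip the class column before handing the list to path-consuming code.
pClassL=$(mktemp)    # stand-in for pHtmlClassConfL.txt : path<TAB>class lines
pPathL=$(mktemp)     # bare-path version
printf '/home/bill/web/a.html\treviews\n/home/bill/web/b.html\tvideos\n' >"$pClassL"
cut -f1 "$pClassL" >"$pPathL"   # cut's default delimiter is TAB
cat "$pPathL"
```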
+-----+
change :
	dir_updateHtml_dWeb '230903 15h02m00' "$d_web"'webWork/pHtmlPathClasL.txt'
to :
	dir_updateHtml_dWeb '230903 15h02m00' "$d_web"'webWork/pHtmlPathGenrL.txt'
BUT - frigging mess
I deleted paths with class name from "$d_PROJECTS"'webHtmlOnly/'
I saw no evidence that the corrupt "Class" files of zero length were uploaded (relief!)
I just moved callerID_SNN dir, fix pHtmlPathGenrL.txt
Try again... only Conference Guides :
	povrL_pStrP_replace 0 0 0 "$d_webWork"'pHtmlPathConfL.txt' "$d_webWork"'pStrPConflL change.txt'
$ bash "$d_PROJECTS"'bin - secure/lftp update specified dir.sh'
>> just leave it run and check
12Sep2023 permissions are still a problem online - can lftp do this?
#08********08
#] 11Sep2023 NOT legitimate cuts (or maybe replaced by other files)? :
keep /home/bill/web/webWork/status.html
cut OK /home/bill/web/webWork/style quotations.html
was missing a space change :
	<!-- /home/bill/web/webWork/status general.html -->
to :
	<!-- /home/bill/web/webWork/status general.html -->
#08********08
#] 10Sep2023 debug pinnL_idx_strTst_cut_pinn - remove pinn from pinnL, in fileops.sh
11Sep2023
$ bash "$d_webWork"'fileops run webSite general.sh'
	mv: cannot stat '': No such file or directory
>> problem must be in pinnL_idx_strTst_cut_pinn
	pinnL_idx_strTst_cut_pinn "$d_webWork"'pHtmlOnlyL_test1.txt' 1 '<!-- /home/bill/web/'
change :
	mv "$pinnL_tmp333" "$pinnL333"
to:
	if [ -s "$pinnL_tmp333" ]; then
		mv "$pinnL_tmp333" "$pinnL333"
	fi
$ bash "$d_webWork"'fileops run webSite general.sh'
	strTst333 = <!-- /home/bill/web/
	line333 = <!-- /home/bill/web/home.html -->
>> make fixes, improvements
$ bash "$d_webWork"'fileops run webSite general.sh'
	strTst333 = <!-- /home/bill/web/
	strTst_len333 = 21
	line333 = <!-- /home/bill/web/home.html -->
	line333 = <!-- /home/bill/web/
	/home/bill/web/bin/fileops.sh: line 755: : No such file or directory
>> I had accidentally deleted line : pinnL_tmp333="$d_temp"'pinnL_idx_strTst_cut webPageL.txt'
pinnL_idx_strTst_cut
"$d_webWork"'pHtmlOnlyL_test5 with bad.txt' 1 '<!-- /home/bill/web/' $ bash "$d_webWork"'fileops run webSite general.sh' sed: can't read /home/bill/web/footer Steven Yaskell.html: No such file or directory >> does the right thing, but error >> why? I don't even use sed!!! oops - footer is in webWork $ bash "$d_webWork"'fileops run webSite general.sh' ~ >> OK, now it works pinnL_idx_strTst_cut "$d_webWork"'pHtmlOnlyL.txt' 1 '<!-- /home/bill/web/' $ bash "$d_webWork"'fileops run webSite general.sh' /home/bill/web/bin/fileops.sh: line 752: warning: command substitution: ignored null byte in input /home/bill/web/bin/fileops.sh: line 752: warning: command substitution: ignored null byte in input >> anomalous paths (might have been made into webPage by mistake) : +-----+ /home/bill/web/economics, markets/Freeman 27Oct2000 The Quality Adjustment Method, How Statistical Fakery Wipes Out Inflation.html /home/bill/web/economics, markets/Nuclear for tar sands 23Sep05.html /home/bill/web/Neural nets/References/Schmidhuber 26Mar2022 Neural nets learn to program neural nets with with fast weights (1991).html /home/bill/web/Neural nets/References/Schmidhuber 29Dec2022 Annotated history of modern AI and deep neural networks.html /home/bill/web/Neural nets/References/Scientific Integrity and the History of Deep Learning The 2021 Turing Lecture, and the 2018 Turing Award.html /home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html /home/bill/web/Personal/210611 Hoffart lawsuit/230306 emto Christine Smith, Aviva Trial Lawyers (draft).html /home/bill/web/ProjMajor/History/Ukraine-Russia/kyivindependent.com news log.html /home/bill/web/ProjMajor/History/Ukraine-Russia/ukrinform.net news log.html /home/bill/web/ProjMajor/Sun model, forecast/Sudbury Neutrino Observatory (SNO)/collaboration.html /home/bill/web/ProjMajor/Sun model, forecast/Sudbury Neutrino Observatory (SNO)/Computer-generated images 
of SNO events.html /home/bill/web/ProjMajor/Sun model, forecast/Sudbury Neutrino Observatory (SNO)/publications.html /home/bill/web/ProjMajor/Sun model, forecast/Sudbury Neutrino Observatory (SNO)/SNO contacts.html /home/bill/web/ProjMajor/Sun model, forecast/Sudbury Neutrino Observatory (SNO)/SNO detector description.html /home/bill/web/ProjMajor/Sun model, forecast/Sudbury Neutrino Observatory (SNO)/solar neutrino problem.html /home/bill/web/ProjMini/Transformer NNs/230604 KEEP survey ChatGPT and AI Usage (Students).html /home/bill/web/ProjMini/Transformer NNs/230604 KEEP survey ChatGPT and AI Usage (Teachers).html /home/bill/web/ProjMini/TrNNs_ART/webWork/copyright Grossberg.html /home/bill/web/webOther/Steven H Yaskell/0_Copyright ending.html +-----+ backup is VERY useful, some htmls must be converted to webPages, others not : pHtmlCutL 230911 14h17m17s.txt legitimate conversions needed for dirs : /home/bill/web/Neural nets/Conference guides/ /home/bill/web/ProjMini/TrNNs_ART/ legitimate cuts : /home/bill/web/Qnial/MY_NDFS/email/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html /home/bill/web/Qnial/MY_NDFS/iconv - Unicode to ASCII/IJCNN ICONV Bad Adress email file.html /home/bill/web/Qnial/MY_NDFS/uni2ascii/uni2ascii - convert UTF-8 Unicode to various 7-bit ASCII.html /home/bill/web/References/Charbonneau - Dynamo models of thesolar cycle - resources/lrsp-2005-2Resources/index.html /home/bill/web/References/Climate/[Armstrong Jun07 - Gore bet challenge.html, others] /home/bill/web/References/economics, markets/Campbell, Grossman, Turner 04Sep2019 monthly British stock market, 1829-1929.html /home/bill/web/References/Mathematics/Functional Integration.html /home/bill/web/References/Neural Nets/[Herrera, Alfredo at Nortel - FPGA_computational_engines/Language Barrier.html, others] /home/bill/web/References/Toynbee VII/Toynbee studyofhistory VI Distintegration of Civilisations 5018264mbp_djvu.html /home/bill/web/webWork/[confFoot*, 
footer*, head_one ??, NOT legitimate cuts (or maybe replaced by other files) : /home/bill/web/webWork/status.html /home/bill/web/webWork/style quotations.html next : 11Sep2023 NOT legitimate cuts (or maybe replaced by other files)? : /home/bill/web/webWork/status.html /home/bill/web/webWork/style quotations.html 11Sep2023 legitimate conversions needed for dirs : /home/bill/web/Neural nets/Conference guides/ /home/bill/web/ProjMini/TrNNs_ART/ +-----+ olde code 10Sep2023 #pQndfCoreL_pOptrNamPL_replace #CAREFUL!! comment out 'mv' line to test first!!! #dir_renameFileL "$d_web"'ProjMini/TrNNs-ART/images- Grossberg 2021/' 1 "p\([0-9]\{3\}\)fib\([0-9]\{2\}\)\." "s|p\([0-9]\{3\}\)fib\([0-9]\{2\}\)\.|p\1fig\2.|g" </tr><TR> <TD> overall </td><TD><A HREF='/home/bill/web/bin/fileops.sh'> fileops.sh</a> </td><TD> see level of code </td> </tr><TR> <TD> overall development </td><TD><A HREF='/home/bill/web/bin/fileops notes.txt'> fileops notes.txt</a> </td><TD> usually none specific to webSite </td> </tr><TR> <TD> overall approach </td><TD><A HREF='/home/bill/web/bin/fileops run commentary, overall.html'> fileops run commentary, overall.html</a> </td><TD> just for overall reference #08********08 #] 11Sep2023 create "intermediate-level" documentation for 'fileops run webSite general.sh' see "$d_webWork"'fileops run commentary.html' #08********08 #] 10Sep2023 pinnL_idx_strTst_cut_pinn - remove pinn from pinnL, in fileops.sh #] when line(idx) ! == "<!-- /home/bill/web/*" post webSite_get_pHtmlL() - ensure that only Howell webPages are in pHtmlOnlyL.txt cull pHtml from list if the first line of each webPage DOES NOT start with : '<!-- /home/bill/web/' use grep to get line? 
$ sed -n '1{p;q}' "$d_web"'economics, markets/0_BillHowell market news/210421 BillHowell.ca market news.html'
	<!-- /home/bill/web/economics, markets/0_BillHowell market news/210421 BillHowell.ca market news.html -->
>> Perfect, very fast
try new function in fileops.sh, called by "$d_webWork"'fileops run webSite general.sh' :
	pinnL_idx_strTst_cut_pinn "$d_webWork"'pHtmlOnlyL.txt' 1 '<!-- /home/bill/web/'
$ bash "$d_webWork"'fileops run webSite general.sh'
	/home/bill/web/bin/fileops.sh: line 734: : No such file or directory
	mv: cannot stat '': No such file or directory
>> oops, forgot the mix of random 3-digit numCodes for LOCAL symbols, try again :
$ bash "$d_webWork"'fileops run webSite general.sh'
	/home/bill/web/bin/fileops.sh: line 747: warning: command substitution: ignored null byte in input
	/home/bill/web/bin/fileops.sh: line 747: warning: command substitution: ignored null byte in input
	mv: cannot stat '': No such file or directory
>> FUNNY - ALL html files were cut! work on this tomorrow...
#08********08
#] 10Sep2023 webSite_get_pHtmlL() - minimize non-Howell pHtmls with crude filter
first thing is to avoid problem dirs where you know there are many problems
make use of an exclude file? nyet -> use grep
" man grep
	-f FILE, --file=FILE
		Obtain patterns from FILE, one per line. If this option is used multiple times
		or is combined with the -e (--regexp) option, search for all patterns given.
		The empty file contains zero patterns, and therefore matches nothing.
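The pieces above (the sed first-line probe, the `-s` guard on mv) combine into a sketch of the cull, with stand-in names and throwaway files rather than the real fileops.sh code:

```shell
# Keep only files whose first line starts with the '<!-- /home/bill/web/'
# marker; guard the final mv so an empty temp list never clobbers the original.
d=$(mktemp -d)
printf '<!-- /home/bill/web/a.html -->\nbody\n' >"$d/a.html"   # Howell webPage
printf '<html>\nbody\n' >"$d/b.html"                           # foreign html
pinnL="$d/pHtmlOnlyL.txt"           # stand-in list of candidate pages
pinnL_tmp="$d/pHtmlOnlyL_tmp.txt"
ls "$d"/*.html >"$pinnL"
strTst='<!-- /home/bill/web/'
while IFS= read -r p; do
   line=$(sed -n '1{p;q}' "$p")    # first line only, then quit
   case "$line" in
      "$strTst"*) echo "$p" >>"$pinnL_tmp" ;;   # keep
   esac
done <"$pinnL"
if [ -s "$pinnL_tmp" ]; then mv "$pinnL_tmp" "$pinnL"; fi
cat "$pinnL"
```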
OK, let's try it "$d_webWork"'pHtmlExclL.txt' change : find "$d_web" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive\|z_old\|z_archive\|z_history\|\/Forms\|System_maintenance\|Qnial\/Manuals\|Qnial\/code develop_test\|Qnial\/Qnial_bag\|Cool emails\|Yoonsuck Choe - conf program book\|ProjMini\/TrNNs_ART\/captions html\|Top 75 Immunotherapy\|OPM\/OPM\|bin\/0_test\|References\/Weart 2003\|Electric Universe\/References\/Randi Foundation 2008-2011\|References\/Niroma" | sort -u >"$pHtmlOnlyL769" to : find "$d_web" -type f -name "*.html" | grep --invert-match "$pHtmlExclL769" | sort -u >"$pHtmlOnlyL769" test to see if grep file reads must escape `/ $ cat "$d_webWork"'pHtmlExclL.txt' | sort bin/0_test Cool emails economics, markets Electric Universe/References/Randi Foundation 2008-2011 /Forms/ OPM/OPM ProjMini/TrNNs_ART/captions html Qnial/code develop_test /Qnial/Manuals/ Qnial/Qnial_bag References/Niroma References/Weart 2003 /System_maintenance/ Top 75 Immunotherapy Yoonsuck Choe - conf program book z_Archive z_archive z_history z_Old z_old For now, I won't add all the "nonHowell" html files, eg : NotHowell: keep the file! but : remove from pHtmlClassGenrL.txt, add to script exclusions : /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/Canadian Solar Workshop 2006 home page.html /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/CSWProgram.html /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/test- Canadian Solar Workshop 2006 home page.html ... 
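The exclusion-file approach above can be exercised on throwaway data before pointing it at d_web (names here are stand-ins):

```shell
# One exclusion pattern per line, applied via grep --file + --invert-match.
d=$(mktemp -d)
excl="$d/pHtmlExclL.txt"
printf 'z_Archive\nCool emails\n' >"$excl"
mkdir -p "$d/web/z_Archive" "$d/web/keep"
touch "$d/web/z_Archive/old.html" "$d/web/keep/page.html"
find "$d/web" -type f -name "*.html" | grep --invert-match --file="$excl" | sort -u
```

Each line of the file is a BRE, so `.` and bracket expressions in paths can over-match; adding `-F` (fixed strings) alongside `--file` would make plain-substring matching explicit.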
$ pHtmlExclL="$d_webWork"'pHtmlExclL.txt' $ find "$d_webWork" -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort /home/bill/web/webWork/confFoot_authors.html /home/bill/web/webWork/confFoot.html /home/bill/web/webWork/confHead.html /home/bill/web/webWork/confStatus.html /home/bill/web/webWork/copyright.html /home/bill/web/webWork/footer Neil Howell.html /home/bill/web/webWork/footer normal.html /home/bill/web/webWork/footer organisations.html /home/bill/web/webWork/footer Paul Vauhan.html /home/bill/web/webWork/footer Steven Wickson.html /home/bill/web/webWork/footer Steven Yaskell.html /home/bill/web/webWork/head_one.html /home/bill/web/webWork/help.html /home/bill/web/webWork/Menu.html /home/bill/web/webWork/status.html /home/bill/web/webWork/style quotations.html >> wow, not that many html files (mostly .txt) $ find "$d_web"'System_maintenance/' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort ~ >> nothing, so \/System_maintenance\/ Now, in "$pHtmlExclL", change : \/System_maintenance\/ to : /System_maintenance/ $ find "$d_web"'System_maintenance/' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort >> still seems to work Need better test use initial (escaped) version of "$pHtmlExclL" $ find "$d_web"'economics, markets' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort /home/bill/web/economics, markets/0_BillHowell market news/210421 BillHowell.ca market news.html /home/bill/web/economics, markets/currency-crypto/Cryptos versus [currencies, 10 year [rates, bonds]].html /home/bill/web/economics, markets/Fischer, David 1996 The Great Wave/0_Fischer - The Great pricing Waves 1200-1990 AD.html /home/bill/web/economics, markets/Freeman 27Oct2000 The Quality Adjustment Method, How Statistical Fakery Wipes Out Inflation.html /home/bill/web/economics, markets/[interest, exchange] rates/[interest, exchange] rate models.html /home/bill/web/economics, markets/Long term market indexes 
& PPI 0582.html
	/home/bill/web/economics, markets/market data/SLregress/200912 semi-log/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html
	/home/bill/web/economics, markets/Nuclear for tar sands 23Sep05.html
	/home/bill/web/economics, markets/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
>> seems that the search is "normal", not regexpr. I am NOT using the option -e
try -e option to see:
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep -e --file="$pHtmlExclL" --invert-match | sort
+--+
	/home/bill/web/economics, markets/0_BillHowell market news/210421 BillHowell.ca market news.html
	/home/bill/web/economics, markets/0_BillHowell market news/z_Archive/230219 20h48m 210421 BillHowell.ca market news.html
	/home/bill/web/economics, markets/0_BillHowell market news/z_Archive/230223 16h32m 210421 BillHowell.ca market news.html
	...
+--+
>> almost all .html files in dir returned. It seems that the `, in 'economics, markets'
	prevented a [match, exclusion] because of regexpr.
try escaped `, in 'economics\, markets' in "$pHtmlExclL"
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep -e --file="$pHtmlExclL" --invert-match | sort
>> same big list, so what is -e grep option doing? I'm lost.
Going back to :
	grep has no -e option
	"$pHtmlExclL" has economics\, markets NOT economics, markets
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort
>> nothing
Now try :
	grep has no -e option
	"$pHtmlExclL" has economics, markets NOT economics\, markets
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort

	                    | grep -e option
	pHtmlExclL          | no       yes
	--------------------+------------------------------
	economics, markets  | nothing  huge list
	economics\, markets | nothing  huge list

double-check all tests :
yes -e, not `\ in pHtmlExclL
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep -e --file="$pHtmlExclL" --invert-match | sort
>> nothing
yes -e, yes `\
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep -e --file="$pHtmlExclL" --invert-match | sort
>> huge list
not -e, not `\
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort
>> nothing
not -e, yes `\
$ find "$d_web"'economics, markets' -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort
>> nothing
Can't make sense of it.
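One plausible reading of the confusion above, offered as an assumption rather than a verified diagnosis of the original runs: `-e` takes the NEXT argument as its pattern, so in `grep -e --file="$pHtmlExclL" --invert-match` the string `--file=...` becomes a literal pattern (which matches nothing) instead of an option, and `--invert-match` then passes every line, i.e. the "huge list". Also, `,` is not a BRE metacharacter, so escaping it changes nothing. A demo on stdin:

```shell
excl=$(mktemp)
echo 'economics, markets' >"$excl"
paths='/web/economics, markets/a.html
/web/other/b.html'
# correct: --file is an option, patterns are read from the file
printf '%s\n' "$paths" | grep --file="$excl" --invert-match
# mis-parse: -e consumes "--file=..." as a literal pattern; nothing matches,
# so --invert-match lets every line through
printf '%s\n' "$paths" | grep -e --file="$excl" --invert-match
```

This reading predicts "huge list" whenever `-e` is present, which matches some but not all of the rows logged above.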
+-----+
Arbitrary, and we'll see how it works :
	Don't escape [,/] in [pHtmlExclL, 'economics, markets']
pHtmlExclL add : /Solar system/Cdn Solar Forecasting/ (and see if any get through)
$ cat "$d_webWork"'pHtmlExclL.txt' | sort
	bin/0_test
	Cool emails
	economics, markets
	Electric Universe/References/Randi Foundation 2008-2011
	/Forms/
	OPM/OPM
	ProjMini/TrNNs_ART/captions html
	Qnial/code develop_test
	/Qnial/Manuals/
	Qnial/Qnial_bag
	References/Niroma
	References/Weart 2003
	/Solar system/Cdn Solar Forecasting/
	/System_maintenance/
	Top 75 Immunotherapy
	Yoonsuck Choe - conf program book
	z_Archive
	z_archive
	z_history
	z_Old
	z_old
Try with actual
$ pHtmlExclL="$d_webWork"'pHtmlExclL.txt'
$ pHtmlOnlyL_test="$d_webWork"'pHtmlOnlyL_test.txt'
$ pHtmlGenrL_test="$d_webWork"'pHtmlGenrL_test.txt'
$ find "$d_web" -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort >"$pHtmlOnlyL_test"
>> doesn't look bad, actually used pHtmlGenrL_test for output, which should have gone to
	pHtmlOnlyL769="$d_webWork"'pHtmlOnlyL.txt'
	so I back-changed its name
OK, update webSite_get_pHtmlL() to use "$pHtmlExclL"
	find "$d_web" -type f -name "*.html" | grep --file="$pHtmlExclL" --invert-match | sort -u >"$pHtmlOnlyL769"
	...leave the rest of the code alone...
"$d_bin"'fileops run webSite general.sh' uncomment :
	webSite_get_pHtmlL # lists of [mainWeb, confGuides, TrNNs_ART, webWorks]
$ bash "$d_bin"'fileops run webSite general.sh'
#08********08
#] 10Sep2023 continue: one-by-one edit of files listed in pHtmlClassGenrL.txt
done: see notes below
+-----+
Special: Werbos QuickView that I can't open, not web-page, but may be worth fixing :
	/home/bill/web/References/Neural nets/Werbos - EnergySustainability July03 QuickView Plus.html
NotHowell: keep the file!
but : remove from pHtmlClassGenrL.txt, add to script exclusions : /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/Canadian Solar Workshop 2006 home page.html /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/CSWProgram.html /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/test- Canadian Solar Workshop 2006 home page.html /home/bill/web/ProjMini/Solar system/Cdn Solar Forecasting/test- CSWProgram.html /home/bill/web/ProjMini/wordpress site/Authors Guide BLOG home.html /home/bill/web/ProjMini/wordpress site/test- Authors Guide BLOG home.html /home/bill/web/Qnial/MY_NDFS/email/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html /home/bill/web/Qnial/MY_NDFS/email/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html /home/bill/web/Qnial/MY_NDFS/iconv - Unicode to ASCII/IJCNN ICONV Bad Adress email file.html /home/bill/web/Qnial/MY_NDFS/uni2ascii/uni2ascii - convert UTF-8 Unicode to various 7-bit ASCII.html /home/bill/web/References/Charbonneau - Dynamo models of thesolar cycle - resources/lrsp-2005-2Resources/index.html /home/bill/web/References/Climate/Armstrong Jun07 - Gore bet challenge.html /home/bill/web/References/Climate/Biello - With Warming Climate, Only the Earlier Bird Catches the Worm, SciAm 04May06.htm /home/bill/web/References/Climate/Gregory Aug07 - Climate Change Science.html /home/bill/web/References/Climate/Gregory Aug07 - Lockwood Paper critique.html /home/bill/web/References/Climate/HTML Quick List - HTML Code Tutorial.html /home/bill/web/References/Climate/IJCNN - historical Earth Sciences special sessions.html /home/bill/web/References/Climate/MacRae - Drive-by shootings in Kyotoville 10Sep05.html /home/bill/web/References/Climate/OIQ - delocalisation de genie en Chine et Inde 02Feb07.html /home/bill/web/References/Climate/Veizer & Shaviv - celestial driver of Phanerozoic climate.html /home/bill/web/References/economics, markets/Campbell, Grossman, Turner 04Sep2019 monthly 
British stock market, 1829-1929.html /home/bill/web/References/Mathematics/Functional Integration.html /home/bill/web/References/Neural nets/Herrera, Alfredo at Nortel - FPGA_computational_engines/Language Barrier.html /home/bill/web/References/Neural Nets/Immerkær 1997 - Least Median Squared.html /home/bill/web/References/Neural nets/lamarck.html /home/bill/web/References/Neural nets/Manuel, Alfonseca - NNs in APL.html /home/bill/web/References/Neural nets/Wan & Miller - FIR NNs/FIRNet usage.html /home/bill/web/References/Neural nets/Wan & Miller - FIR NNs/Jeff Miller OhioSU - FIR NNs coding.html /home/bill/web/References/Toynbee VII/Toynbee studyofhistory VI Distintegration of Civilisations 5018264mbp_djvu.html moved : /home/bill/web/ProjMini/PuetzUWS/220616/Howell - PuetzUWS [time, price] multifractal of detrended SPX 1871-2020.html /home/bill/web/ProjMini/Puetz/Howell - PuetzUWS [time, price] multifractal of detrended SPX 1871-2020.html lost /home/bill/web/References/Climate/_Climate and sun.html I deleted the files : KEEP questionnaire coding - unreadable /home/bill/web/ProjMini/Transformer NNs/230604 KEEP survey ChatGPT and AI Usage (Students).html /home/bill/web/ProjMini/Transformer N/230604 KEEP survey ChatGPT and AI Usage (Teachers).html lost the mail html file - webPage complete /home/bill/web/References/110313 ABC news - tsunami effect in Japan_files/jquery.html projMini just crappy-html /home/bill/web/References/Climate/Davidson, Ben 06Jul2017 The Charlemagne event.html junk page content, moved /home/bill/web/References/Climate/Le Mouël, Lopes, Courtillot 22Feb2019 A Solar Signature in Many Climate Indices.html /home/bill/web/References/Climate/locusts in Australia.html #08********08 #] 09Sep2023 one-by one edit of files listed in pHtmlClassGenrL.txt # [header, status] save via nemo filemanager, # OK - got fixed: I screwed up permissions in d_webWork, and may have lost all of the files!!! 
from "$d_SysMaint"'Linux/chown & chmod notes.txt' :
   for my local [file, dir]s, change LOCAL d_web permissions to :
   $ sudo find "$d_webWork" -type d -print0 | xargs -0 chmod 755
   $ sudo find "$d_webWork" -type f -print0 | xargs -0 chmod 644
   >> OK, looks fine now... thank goodness
   (don't use nemo, use bash for permissions!!!)
OK, so now work on html files :
missing files :
   /home/bill/web/economics, markets/[interest, exchange] rates/[interest, exchange] rate models.html   economics, markets
   /home/bill/web/Personal/210611 Hoffart lawsuit/230306 emto Christine Smith, Aviva Trial Lawyers (draft).html   personal
Renamed dir, made changes to pHtmlClassGenrL.txt, [copyright, help, Menu, status] later :
   Climate and sun   Sun climate
   Civilisations and sun   Sun civilisations
   Solar modeling and forecasting   Sun model, forecast
   Charvatova solar inertial motion & activity   Sun Charvatova
   Pandemics, health, and the Sun   Sun pandemics, health
stopped at :
   /home/bill/web/Personal/Crazy ideas/Crazy ideas.html   personal
+-----+
olde code
# 09Sep2023 I need to remove .html from title
title=$( grep '<TITLE>' "$pOvrr" )
if [[ "$title" ]]; then
   title=$( pth_get_fnam "$pOvrr" )
   title='<TITLE>'"$title"'</TITLE>'
fi
sed "s|<TITLE>.*</TITLE>|$title|;s|<:class:>|$class|" "$pHeadTpltL" >>"$pOvrTmp"
#08********08
#] 09Sep2023 augment [Menu, status, copyright, help] files
OK - formatted
get list of all classes, fileops.sh see :
   pOvrClassL_get_pClassL() - extract a list of classes from pOvrClassL
fileops run.sh :
   pOvrClassL_get_pClassL "$d_webWork"'pHtmlClassGenrL.txt' "$d_webWork"'pClassL.txt'
sorted list :
   Bill Howells book [note, review]s
   Bill Howells videos
   economics, markets
   home
   Hosted subSites
   Howell blogs
   Neural nets
   no-class
   Nuclear (change to ProjMini)
   Personal
   Professional
   Projects
   ProjMajor
   ProjMini
   Software programming
standard order (by my priorities) :
   home
   Neural nets
   Projects
   ProjMajor *
   ProjMini
   Software programming
   economics, markets
   Bill Howells videos
   Howell blogs
   Bill Howells book [note, review]s *
   Hosted
subSites
   Professional
   Personal
in "$d_webWork"'pHtmlClassGenrL.txt' : no-class
I allocated these to current classes, change to :
   \tNeural nets   \tneural
   \tProjects   \tproject
   \tProjMajor   \tprojMajr
   \tProjMini   \tprojMini
   \tNuclear   \tprojMini
   \tSoftware programming   \tcomputer
   \teconomics\, markets   \tmarket
   \tBill Howells videos   \tvideos
   \tHowell blogs   \tmyBlogs
   \tBill Howells book \[note\, review\]s   \treviews
   \tHosted subSites   \thosted
   \tProfessional   \tcareer
   \tPersonal   \tpersonal
yikes - still getting from :
   pOvrClassL_get_pClassL "$d_webWork"'pHtmlClassGenrL.txt' "$d_webWork"'pClassL.txt'
   >> why?
   Bill Howells book [note, review]s
   Bill Howells videos
   career
   computer
   economics, markets
   home
   hosted
   Hosted subSites
   Howell blogs
   market
   myBlogs
   neural
   Neural nets
   no-class
   Nuclear
   Personal
   personal
   Professional
   project
   Projects
   ProjMajor
   projMajr
   ProjMini
   projMini
   reviews
   Software programming
   videos
need script to change classes for [pHtmlClassGenrL, Menu, status, copyright, help] files
+-----+
QNial - tblOfContents for [Menu, status, copyright, help] files
   NYET - bash fileops.sh!!
Crap - TableOfContents doesn't work, can't put heading in between?
nuts - was it permissions? (must be executable)
Good enough for now.
#08********08
#] 09Sep2023 pOvrClassL_put_pHead() - test1 Menu specialised, bookmarks?
got test[1,5] working EXCEPT extra line inserted after header
   pOvrClassL_put_pHead "$d_webWork"'pHtmlClassGenrL.txt' "$d_webWork"'pHeader template.txt'
$ bash "$d_bin"'fileops run.sh'
+-----+
for each of the files below, the same five messages repeat - cp: cannot stat, grep: (twice),
cat:, and mv: cannot move '/media/bill/ramdisk/pOvrClassL_put_pHead pOvr temp.txt' to the file -
all "No such file or directory" :
   /home/bill/web/References/Neural nets/Herrera, Alfredo at Nortel - FPGA_computational_engines/Language Barrier.html
   /home/bill/web/References/Neural nets/lamarck.html
   /home/bill/web/References/Neural nets/Manuel, Alfonseca - NNs in APL.html
   /home/bill/web/References/Neural nets/Pribram 1993 Rethinking neural networks, Quantum fields and biological data, TableOfContents.html
   /home/bill/web/References/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html
   /home/bill/web/References/Neural nets/sejnowski ica.html
   /home/bill/web/References/Neural nets/Wan & Miller - FIR NNs/FIRNet usage.html
   /home/bill/web/References/Neural nets/Wan & Miller - FIR NNs/Jeff Miller OhioSU - FIR NNs coding.html
   /home/bill/web/References/Neural nets/Werbos - EnergySustainability July03 QuickView Plus.html
   /home/bill/web/References/Neural nets/Werbos EnergySustainabilityJuly%2Edoc.html
+-----+
>> lots of problems!
>> augment [Menu, status, copyright, help] files, then one-by-one edit of pHtmlClassGenrL.txt
#08********08
#] 08Sep2023 pOvrClassL_put_pHead() - finish, test
more or less works now
   have to clean up status general.html test5
   next - see how a variety of htmls work
+-----+
olde code
   pHeadTpltTmp="$d_temp"'pOvrClassL_put_pHead header template temp.txt'
#08********08
#] 08Sep2023 continue pHtmlClassL_put_pSeddClass()
I give up for now...
(even lost all 'pHtmlClassGenrL.txt' html files - had to get from backup)
- replace ALL status phrases in pHtmlL (over-write)
I used the hand-coded file : cp 'povrL_put_pHead classe.txt' -> 'pHtmlClassGenrL.txt'
+-----+
- replace ALL status phrases in pHtmlL (over-write)
use test file set!! (has copies in "$d_web"'z_Archive/230908 17h22m recover test files/')
$ bash "$d_bin"'fileops run.sh'
   /home/bill/web/home.html
   sed: -e expression #1, char 142: Invalid back reference
>> still zeros out "$d_web"'home.html' Why????
povr_strP_replace() commented out :
   #if [ 1 -eq "$bolPovEs448" ]; then
   #   pCde_sed_pHum "$povTm_448" "$povr__448"
   #else
   #   mv "$povTm_448" "$povr__448"
   #fi
put in echos to see
   echo "$strOld448"
   echo "$strNew448"
$ bash "$d_bin"'fileops run.sh'
+-----+
/home/bill/web/home.html
\/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html
mv: cannot stat '/media/bill/ramdisk/povr_pStrP_replace povr temp.txt': No such file or directory
+-----+
>> YIKES! why is "$strNew448" null?
sed: -e expression #1, char 142: Invalid back reference
>> must be this
   \/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html
   /home/bill/web/Bill Howells videos/\1\/2/.htmlztbzBill Howells videos
pHtmlL_putClass_pStrPL() change :
   povrL_pStrP_replace 0 0 0 "$pHtmlL289" "$pStrPL289"
to :
   povrL_pStrP_replace 0 1 0 "$pHtmlL289" "$pStrPL289"
>> same problem as above
but povr_strP_replace temp.txt is strHum_sed_strCde() - convert str format [human -> code] of "$d_webWork"'povrL_put_pHead header.txt'
>> !!What???? why was this just created in d_temp?
this is ONLY in povrL_put_pHead()
Run povr_strP_replace() directly
   bolArXiv448="$1"
   bolPovEs448="$2"
   bolStrEs448="$3"
   povr__448="$4"
   strP__448="$5"
$ povr_strP_replace 0 0 0 "$d_webWork"'/home/bill/web/home.html' '\/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html /home/bill/web/Bill Howells videos/\1\/2/.htmlztbzBill Howells videos'
   povr_strP_replace: command not found
>> which command not found? povr_strP_replace???
>> maybe escape search & replace terms?
>> wait, there is a tab, so that should be OK? NYET - I was missing the tab for some reason?
>> put this in 'fileops run.sh'
$ bash "$d_bin"'fileops run.sh'
   ~
>> OK, it ran. check - nuts /home/bill/web/home.html was empty
$ bash "$d_bin"'fileops run.sh'
   \/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html
   /home/bill/web/Bill Howells videos/\1\/2/.htmlztbzBill Howells videos
>> OK this time I got BOTH [search, replace] terms
>> but home.html was DELETED (not even just empty)
>> ah-hah - probably moved to d_temp
povr_strP_replace() uncomment :
   #if [ 1 -eq "$bolPovEs448" ]; then
   #   pCde_sed_pHum "$povTm_448" "$povr__448"
   #else
   #   mv "$povTm_448" "$povr__448"
   #fi
$ bash "$d_bin"'fileops run.sh'
   ~
>> runs simply, but DELETES home.html
>> why??? Again, see about manually escaping replace term
povr_strP_replace 0 0 0 '/home/bill/web/home.html' '\/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html \/home\/bill\/web\/Bill Howells videos\/\\1\\/2\/\.htmlztbzBill Howells videos'
>> again, runs nicely :
   \/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html
   \/home\/bill\/web\/Bill Howells videos\/\\1\\/2\/\.htmlztbzBill Howells videos
   ~
>> I DIDN'T wipe out home.html
IDIOT! I'm not trying to change home.html
need to change what? I'm confused
   pHtmlGenrL.txt -> pHtmlClassGenrL.txt
povr_strP_replace 0 0 0 '/home/bill/web/webWork/pHtmlGenrL.txt' '\/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html \/home\/bill\/web\/Bill Howells videos\/\\1\\/2\/\.htmlztbzBill Howells videos'
$ bash "$d_bin"'fileops run.sh'
   \/home\/bill\/web\/Bill Howells videos\/\(.*\)\/\([A-Za-z0-9 -_,()\[\]]*\)\.html
   \/home\/bill\/web\/Bill Howells videos\/\\1\\/2\/\.htmlztbzBill Howells videos
   ~
Hah! I still had change :
   povrL_pStrP_replace 0 1 0 "$pHtmlL289" "$pStrPL289"
to :
   povrL_pStrP_replace 0 0 0 "$pHtmlL289" "$pStrPL289"
>> again, no change to the file. sed doesn't work the way I set it up
I give up for now. Use the hand-coded file : cp 'povrL_put_pHead classe.txt' -> 'pHtmlClassGenrL.txt'
#08********08
#] 08Sep2023 backup recovery of all of "$d_web"'pHtmlGenrL.txt' files!!!
created "$d_bin"'backup recover.sh' - recover a list of files from backup drive
>> It worked like a charm, probably lost 2 days of work
#08********08
#] 08Sep2023 povrL_put_pHead() - put classes in povrL_put_pHead classe.txt
Some of these are over-lapping, some miss items in other dirs!!
geany regexpr search-replace :
   (\/home\/bill\/web\/home.html)\t.*   \1\thome page
   (.*)\/web\/([A-Za-z0-9 _,()\[\]]*)\.html\t.*   \1/web/\2\3.html\tPersonal
   (.*)\/web\/Bill Howells videos\/(.*)\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/Bill Howells videos/\2/\3.html\tBill Howells videos
   (.*)\/web\/CompLangs\/(.*)\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/CompLangs/\2/\3.html\tSoftware programming
   (.*)\/web\/economics, markets\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/economics, markets/\2.html\teconomics, markets
   (.*)\/web\/Neural nets\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/Neural nets/\2.html\tNeural nets
   (.*)\/web\/Neural nets\/.*\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/Neural nets/\2/\3.html\tNeural nets
   (.*)\/web\/Personal\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/Personal/\2.html\tPersonal
   (.*)\/web\/Professional\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/Professional/\2.html\tProfessional
   (.*)\/web\/ProjMajor\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/ProjMajor/\2.html\tProjMajor
   (.*)\/web\/ProjMini\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/ProjMini/\2.html\tProjMini
   (.*)\/web\/Qnial\/(.*)\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/Qnial/\2/\3.html\tSoftware programming
   (.*)\/web\/References\/Climate\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/References/Climate/\2.html\tProjMajor
   (.*)\/web\/References\/economics, markets\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/References/economics, markets/\2.html\teconomics, markets
   (.*)\/web\/References\/Neural nets\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/References/Neural nets/\2.html\tNeural nets
   (.*)\/web\/webOther\/(.*)\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/webOther/\2/\3.html\tHosted subSites
don't work - manual for now, special search-replace later :
   /home/bill/web/References/Climate/Le Mouël, Lopes, Courtillot 22Feb2019 A Solar Signature in Many Climate Indices.html   ProjMajor
   /home/bill/web/References/Mathematics/Functional Integration.html   Neural nets
   (.*)\/web\/References\/Toynbee VII\/([A-Za-z0-9 -_,()\[\]]*)\.html\t.*   \1/web/References/Toynbee VII/\2.html\tProjMajor
no-class
none left...
+-----+
I put the above in fileops pHtmlL_put_pSedStatus(), with sed grammar (mostly ( to \( etc)
Worried about \t in [search, replace] terms!
backups of "$d_webWork"'[pHtmlGenrL.txt, pSedStatsL general.txt]'
# example for general webSite, not d_[TrNNs_ART, conference guides]
povr_pStrP_replace 0 0 0 "$d_webWork"'pHtmlGenrL.txt' "$d_webWork"'pSedStatsL general.txt'
$ bash "$d_bin"'fileops run.sh'
   sed: -e expression #1, char 79: invalid reference \3 on `s' command's RHS
   /home/bill/web/bin/fileops.sh: line 565: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 566: : No such file or directory
   ...
>> problem in "$d_webWork"'pSedStatsL general.txt' - was missing ending tabs
>> fix webSite_get_pHtmlL() to automatically add \t to line endings!!!
>> "$d_webWork"'pHtmlGenrL with classes.txt'
>> still need "classless" 'pHtmlGenrL.txt' for other work
>> pHtmlGenrL.txt is now empty (thanks for backup!) - I regenerated it
+-----+
at this point, regenerate html pth listings to generate "Class" pths :
   webSite_get_pHtmlL # lists of [mainWeb, confGuides, TrNNs_ART, webWorks]
$ bash "$d_bin"'fileops run.sh'
   /home/bill/web/bin/fileops.sh: line 1946: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 1949: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 1950: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 1951: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 1952: : No such file or directory
>> arghh ... OK, problems with capital letters, messed up ordering are OK
However, classes are NOT added to the pHtmlClass(*)L.txt
>> so the sed replacements are not working
>> use an intermediate chrCodeL to do this? or sed code for \t?
eg, sed 's|$|\x9|' "$pHtmlGenrL769" >"$pHtmlClassGenrL769"
>> OK, now there are trailing tabs
+-----+
back to auto-replace classes,
   pHtmlClassL_put_pSeddClass "$d_webWork"'pHtmlClassGenrL.txt' "$d_webWork"'pSedStatsL general.txt'
backup 'pSedStatsL general.txt'
$ bash "$d_bin"'fileops run.sh'
   sed: -e expression #1, char 79: invalid reference \3 on `s' command's RHS
   /home/bill/web/bin/fileops.sh: line 565: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 566: : No such file or directory
   ...
>> same problems as before : pHtmlClassGenrL.txt is empty, must regenerate, backup 'pSedStatsL general.txt'
might have to replace \t with code, then change back to \x9? first try \x9 >> nope
fix : sed: -e expression #1, char 79: invalid reference \3 on `s' command's RHS
   3 lines had that problem!
try again :
$ bash "$d_bin"'fileops run.sh'
   /home/bill/web/bin/fileops.sh: line 616: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 617: : No such file or directory
   /home/bill/web/bin/fileops.sh: line 618: : No such file or directory
>> much better!
pLog issue - not defined in 'standard header.sh', or in most fileops.sh functions
   These could spit out huge hidden lists
   'standard header.sh' uses p_log which must be defined by calling function.
   change for now to pLog
   use becho for both [screen, pLog] output
   define pLogs? but fileops.sh functions may depend on calling functions?
   figure out later, for now define for current project
      pLog="$d_bin"'standard log fileops.txt'
put date in fileops.sh uses of pLog :
   date_ymdhms=$(date +"%0y%0m%0d %0kh%0Mm%0S")
   echo >>"$pLog" "$date_ymdhms"
now retry :
   povr_pStrP_replace error: OR[p_ovr, pStrPL] doesnt exist :
      povr : /home/bill/web/webWork/pHtmlClassGenrL.txt pHtmlClassGenrL.txt
      pStrP : /home/bill/web/webWork/pSedStatsL general.txt pSedStatsL general.txt
>> but BOTH of these exist? check spelling...
>> nyet - spelling is good (see above) and both exist and have content!!??
>> error has p_ovr, not povr, is this the problem?
calling function supplies (correctly) :
   povr_607="$4" # file to overwrite after sed search-replace
   pStrP607="$5" # str[Old, New] search-replace pairs
eg: pHtmlClassL_put_pSeddClass()
   povr_pStrP_replace 0 0 0 "$pHtmlClassL289" "$pHtmlSedddL289"
pHtmlClassGenrL.txt is again empty - copy over from (copy)
$ bash "$d_bin"'fileops run.sh'
   sed: -e expression #1, char 124: Invalid back reference
>> what now?
   230908 13h38m40 povr_strP_replace error, could not get :
      /media/bill/ramdisk/povr_pStrP_replace povr temp.txt
   but there is
      povr_strP_replace temp.txt
      povr_pStrP_replace pStrP temp.txt
>> mixup in files!!!
povr_pStrP_replace()
   while IFS='' read -r -u 607 strP607; do
      povr_strP_replace 0 0 0 "$povr_tmp607" "$strP607"
   done 607< "$pStrPtmp607"
povr_strP_replace()
   sed "s|$strOld448|$strNew448|" "$povTmp448" >"$povTm_448"
Something wrong with my basic thinking :
>> maybe encode \t in "$pHtmlClassL289" "$pHtmlSedddL289"
>> maybe this is the completely wrong approach?
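The repeated "Invalid back reference" failures above are what happens when raw paths (with [, ], ., /) reach sed unescaped. A minimal sketch of pre-escaping the [strOld, strNew] pair before it hits `sed "s|...|...|"` - the helper names here are illustrative (not the real strHum_sed_strCde()/fileops.sh code), only the two escaping sed calls are the point :

```shell
# Escape a literal [strOld, strNew] pair so sed treats both as plain text.
# LHS needs BRE metachars + the delimiter escaped; RHS needs only \ & and
# the delimiter. Helper names are illustrative, not the real fileops.sh.
sed_escape_old() { printf '%s' "$1" | sed 's|[][\\.*^$/]|\\&|g'; }   # LHS: ] [ \ . * ^ $ /
sed_escape_new() { printf '%s' "$1" | sed 's|[\\&/]|\\&|g'; }        # RHS: \ & /
strOld=$( sed_escape_old '/home/bill/web/Bill Howells videos/x [1].html' )
strNew=$( sed_escape_new '/home/bill/web/videos/x [1].html' )
out=$( echo '<A HREF="/home/bill/web/Bill Howells videos/x [1].html">' \
   | sed "s/$strOld/$strNew/" )
echo "$out"
```

With both terms escaped, the only back references sed can see are ones put there deliberately.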
think it through
for each pHtmlSeddd of pHtmlSedddL :
   use "$pHtmlSedddL289" to determine what the new code is
   sed the [old, new] code at the end of "$pHtmlClassL289"
for now 'pSedStatsL general.txt' :
   remove first line :
      \(.*\)\/web\/\([A-Za-z0-9 -_,()\[\]]*\)\.html\x9.*   \1/web/\2.html\x9Personal
   many lines :
      replace ^\(.*\)\/ with \/home\/bill\/web\/
      start of 2nd term : replace \1/web/ with /home/bill/web/
      end of first term : replace \x9.* with .*
   for now keep & test \x9 just before class (2nd term of strP)
   save copy of 'pSedStatsL general.txt'
retry : now it runs, no error visible
>> but pHtmlClassGenrL.txt has no class
how about :
   take pHtmlGenrL.txt without Class
   use pHtmlSedddL to modify it
   \x9 might still be an issue, check & see, if problem use code for \x9
change :
   pHtmlClassL_put_pSeddClass "$d_webWork"'pHtmlClassGenrL.txt' "$d_webWork"'pSedStatsL general.txt'
to :
   pHtmlClassL_put_pSeddClass "$d_webWork"'pHtmlGenrL.txt' "$d_webWork"'pSedStatsL general.txt'
   /home/bill/web/webOther/Wickson website/Steven Wickson.html
   sed: -e expression #1, char 138: Invalid back reference
   ...~160 of these?
OK, now put special code in for \t : ztbz after - convert to tab
SHIT!! ALL of the html files were zero'd out!!!!
Huge work to reinstate BACKUPS!!!
Why wasn't I working with a single file only in pHtmlGenrL.txt?
Days of work to get back where I was this afternoon!!!!
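The "ztbz after - convert to tab" idea above can be sketched as a round-trip, so the raw tab in a strP line never reaches the main sed command. GNU sed's \t escape is assumed; 'ztbz' is the placeholder named in the notes, and as noted it is only safe if 'ztbz' can never occur in real webPage text :

```shell
# Encode the single tab of a "<strOld>\t<strNew>" line as the placeholder
# ztbz, do the risky sed work, then decode the placeholder back to a tab.
strP=$'/old path.html\t/new path.html'
enc=$( printf '%s' "$strP" | sed 's|\t|ztbz|' )   # tab -> ztbz
dec=$( printf '%s' "$enc" | sed 's|ztbz|\t|' )    # ztbz -> tab (round-trip)
echo "$enc"
```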
+-----+
olde code
webSite_get_pHtmlL() {
   pHtmlOnlyL769="$d_webWork"'pHtmlOnlyL.txt'
   pHtmlGenrL769="$d_webWork"'pHtmlGenrL.txt'
   pHtmlConfL769="$d_webWork"'pHtmlConfL.txt'
   pHtmlGrosL769="$d_webWork"'pHtmlGrosL.txt'
   pHtmlWebWL769="$d_webWork"'pHtmlWebWL.txt'
   pHtmlClassGenrL769="$d_webWork"'pHtmlClassGenrL.txt'
   pHtmlClassConfL769="$d_webWork"'pHtmlClassConfL.txt'
   pHtmlClassGrosL769="$d_webWork"'pHtmlClassGrosL.txt'
   pHtmlClassWebWL769="$d_webWork"'pHtmlClassWebWL.txt'
   find "$d_web" -type f -name "*.html" | grep --invert-match "z_Old\|z_Archive\|z_old\|z_archive\|z_history\|\/Forms\|System_maintenance\|Qnial\/Manuals\|Qnial\/code develop_test\|Qnial\/Qnial_bag\|Cool emails\|Yoonsuck Choe - conf program book\|ProjMini\/TrNNs_ART\/captions html\|Top 75 Immunotherapy\|OPM\/OPM\|bin\/0_test\|References\/Weart 2003\|Electric Universe\/References\/Randi Foundation 2008-2011\|References\/Niroma" | sort -u >"$pHtmlOnlyL769"
   grep "Neural nets\/Conference guides" "$pHtmlOnlyL769" | sort -u >"$pHtmlConfL769"
   grep "ProjMini\/TrNNs_ART" "$pHtmlOnlyL769" | grep --invert-match "webWork" | sort -u >"$pHtmlGrosL769"
   grep "webWork" "$pHtmlOnlyL769" | sort -u >"$pHtmlWebWL769"
   grep --invert-match "Neural nets\/Conference guides\|ProjMini\/TrNNs_ART\|webWork" "$pHtmlOnlyL769" | sort -u >"$pHtmlGenrL769"
   sed 's|$|\x9|' "$pHtmlGenrL769" >"$pHtmlClassGenrL769"
   sed 's|$|\x9|' "$pHtmlConfL769" >"$pHtmlClassConfL769"
   sed 's|$|\x9|' "$pHtmlGrosL769" >"$pHtmlClassGrosL769"
   sed 's|$|\x9|' "$pHtmlWebWL769" >"$pHtmlClassWebWL769"
}
# povr_put_pInserts_old() - insert pInsert into povr, also [skip, include] lines in povr
# too hard! maybe come back later
# developed to put html header into webPage, only good for a specific format!!
# 07Sep2023 haven't yet put in randomCode to ensure no "crossover" of symbols
# 07Sep2023 this is crazy for now - just too many variants!
# www.BillHowell.ca 07Sep2023 initial
povr_put_pInserts_old() {
   povr="$1"
   pin1="$2"
   pin2="$3"
   pLog="$d_webWork"'povr_put_pInserts log.txt'
   pTmp="$d_temp"'webPage_put_header'
   exec 9<"$povr"
   # get the indicators : [<TITLE>, docTp, menus]
   title=$( grep '<TITLE>' "$povr" )
   docTp=$( grep '<!DOCTYPE HTML' "$povr" )
   menus=$( grep 'Menu </a></td><TD><A HREF="./">' "$povr" )   # was missing the "$povr" argument
   if [[ -n "$title" || -z "$docTp" || -z "$menus" ]]; then
      echo >>"$pLog" "has no <TITLE>, or has [docType, menus] : "   # was a garbled "$pLogsdfsf"
      echo >>"$pLog" " $povr"
   else
      # 1st line is pth - replace to be sure
      read -u 9 pth
      line='<!-- '"$povr"' -->'
      echo >>"$pTmp" "$line"
      # process up to <TITLE> line - ignore empty lines
      while read -u 9 line; do
         line=$( echo "$line" | sed 's|[ \t]||' )
         if [ -n "$line" ]; then
            echo >>"$pTmp" "$line"
         else
            break
         fi
      done
      # for first non-[ \t] line,
      # run as long as fits format
      while read -u 9 line; do
         echo >>"$pTmp" "$line"
      done
      exec 9<&-   # close fd 9 ("close 9" is not a bash builtin)
      #mv "$pTmp" "$povr"
   fi
}
#08********08
#] 07Sep2023 manual-insert Menu, add to status,
#] webPages add <!-- Howell start head --> & end (easy replace)
# I started with Neural nets/Computational neuro-genetic modelling.html
missing : copyright home.html
some links are now webSite links, not d_web!!!
no webPage content!
   /home/bill/web/Bill Howells videos/160901 Big Data, Deep Learning, and Safety/0_Big Data, Deep Learning, and Safety.html
not a webPage of mine :
   /home/bill/web/economics, markets/Cool stuff/z_history/201221 Michael_Wang_Official, TradingView - 15 Types of Financial Market Participants Explained for TVC SPX.html
   /home/bill/web/Freeman 27Oct2000 The Quality Adjustment Method, How Statistical Fakery Wipes Out Inflation.html
   /home/bill/web/economics, markets/Nuclear for tar sands 23Sep05.html
utility html files :
   /home/bill/web/Forms/0_form webPage.html
   /home/bill/web/Forms/HTML example - Call for Sponsors.html
   /home/bill/web/Forms/HTML example - MPDI logo on skeleton web-page.html
converted to txt :
   /home/bill/web/My sports & clubs/natural- CNPS/Flandern 1998 Speed of light Meta Research Bulletin of 6_15_94.html
   /home/bill/web/My sports & clubs/natural- Thunderbolts/150907 LiveEventPass for EU2015 re-broadcast to end Nov2015 Receipt.txt
deleted (junk) :
   /home/bill/web/My sports & clubs/natural- SAFIRE/phase-three.html
   /home/bill/web/My sports & clubs/politic- Samara democracy/2109924 Samara - Online toxicity in the last week of the campaign.html
#08********08
#] 05Sep2023 problem with [#: links?
see "$d_bin"'0_test/fileops/dWeb_change_badPths/0_dWeb_change_badPths notes.txt'
#08********08
#] 04Sep2023 html [old, new] strP changes - HUGE bug in str_strP_replace()
see "$d_bin"'0_test/fileops/povrL_pStrP_replace/0_povrL_pStrP_replace_test notes.txt'
# str_strP_replace() - escape str[Old, New] then replace strOld with strNew in apo str
In : "$d_web"'z_Archive/230904 15h11m webWork/230904 pStrP.txt' I decoupled :
   https://www.BillHowell.ca/   /home/bill/web/
   webSites - other people/   webOther/
Try again :
>> SUCCESS!!!!!
What are remaining failures of links?
   Economics & Markets : S&P500 1872-2020, 83y trend
>> everything else looks OK from a quick click-through of all menu links.
I'm ready to do the entire webSite, now that Menus work?!!
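Decoupled [strOld, strNew] pairs like the ones above can also be applied with no sed escaping at all, since awk's index()/substr() are purely literal. A sketch with illustrative file names - this is NOT the real povr_pStrP_replace(), just the same idea :

```shell
# Apply each "<strOld>\t<strNew>" line of a strP file as a LITERAL
# replacement in an html file. Caution: loops forever if strNew contains
# strOld. File names and contents here are illustrative only.
pStrP=$(mktemp); pHtml=$(mktemp)
printf '%s\t%s\n' \
   'https://www.BillHowell.ca/' '/home/bill/web/' \
   'webSites - other people/'   'webOther/'       >"$pStrP"
printf '%s\n' '<A HREF="https://www.BillHowell.ca/webSites - other people/x.html">' >"$pHtml"
while IFS=$'\t' read -r strOld strNew; do
   awk -v old="$strOld" -v new="$strNew" '
      { while ((i = index($0, old)) > 0)                      # literal search
            $0 = substr($0, 1, i-1) new substr($0, i + length(old))
        print }' "$pHtml" >"$pHtml.tmp" && mv "$pHtml.tmp" "$pHtml"
done <"$pStrP"
result=$( cat "$pHtml" )
echo "$result"
rm "$pStrP" "$pHtml"
```

Because nothing is treated as a regex, paths full of [, ], (, ) and & cannot trigger "Invalid back reference" errors.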
#08********08
#] 04Sep2023 continue revamp [webPage, Menu, header, footer]s
Simplify - one menu for all, but not including [d_TrNNs_ART, confGuides]
"$webWork" : many Menus! where is bash script to amalgamate?
   "$d_bin"'webMenus build.sh' - build combination Menus for webSite, starting with home
   04Sep2023 discontinued : all of this was put into fileops.sh
colors provided for all Menus - a bit brazen, bold, but done
re-sized all Menus 6 cols*160px rather than 16% - more consistent output, easy to stack tables
problem of revamped dirs - effect on links :
   /home/bill/web/economics, markets/PineScript/   /home/bill/web/CompLangs/PineScript/
   /home/bill/web/Professional & Resume/   /home/bill/web/Professional/
   /home/bill/web/Projects - major/   /home/bill/web/ProjMajor/
   /home/bill/web/Projects - minor/   /home/bill/web/ProjMini/
   /home/bill/web/security/encryption-decryption instructions.html   /home/bill/web/System_maintenance/security/encryption-decryption instructions.html
   /home/bill/web/Software programming & code/LibreOffice macros/   /home/bill/web/CompLangs/LibreOffice/
   /home/bill/web/Software programming & code/   /home/bill/web/CompLangs/
   /home/bill/web/websites - other people/   /home/bill/web/webOther/
   /home/bill/web/webWork files/   /home/bill/web/webWork/
Already in form "http://www.BillHowell.ca/"
   http://www.BillHowell.ca/Professional & Resume/_Resumes, work experience.html
   http://www.BillHowell.ca/Professional & Resume/education.html
>> do this menu manually?
Howell-produced videos http://www.BillHowell.caBill Howells videos/Howell - videos.html http://www.BillHowell.caBill Howells videos/Birkeland rotation in galaxy - not dark matter/Birkeland rotation in galaxy - not dark matter.html 08********08 #] 01Sep2023 rm all z_[Archive, Old] from "$d_PROJECTS"'webHtmlOnly/' see "$d_SysMaint"'Linux/rm notes.txt' final result : $ find "$d_PROJECTS"'webHtmlOnly/' -type d | grep "z_Archive$\|z_archive$\|z_Old$\|z_old$" | tr \\n \\0 | xargs -0 -IFILE rm -r "FILE" check result : $ find "$d_PROJECTS"'webHtmlOnly/' -type d | grep "z_Archive$\|z_Old$" >"$d_temp"'rm 08********08 #] 31Aug2023 create dir "$d_PROJECTS"'webHtmlOnly/', rsync html files there, change see "$d_SysMaint"'Linux/lftp notes.txt' >> Howell : try beval 'rsync '"$options"' --include="*.html" --exclude="*.*" "'"$d_src"'" "'"$d_out"'" >>"'"$p_log"'" ' This works, but includes : z_[Archive, Old, archive, old] files with no extension #24************************24 #] +-----+ #] Handy geany regxpr search-replace commands while working #] create "$d_webWork"'pStrPAll_L change.txt' from "$d_webWork"'pLnkBad.txt' backup-dated current 'pStrPAll_L change.txt' create new 'pStrPAll_L change.txt' copy-paste 'pLnkBad.txt' search : (.*) replace : \1\t\1 manually edit (search files for link to be sure) #] check "$d_webWork"'pStrPAll_L change.txt' for non-tabbed lines $ sed s'|\x9|xyZyx|' "$d_webWork"'pStrPAll_L change.txt' | grep --invert-match 'xyZyx' #] [add, replace]*[class] for "home" : search : (.*)\/web\/([A-Za-z0-9_,() ]*\.html\t.* replace : \1/web/\2.html\thome #] bookmarks search-replace bookmarks search replace initial <A id\=\"#(.*)\"> <\/a> <A id="#\1"><H3>\1</h3></a> or later <A id\=\"#(.*)\">.*<\/a> <A id="#\1"><H3>\1</h3></a> or later <A id\=\"#(.*)\"><H3>.*</h3><\/a> <BR>\n<A id="#\1"><H3>\1</h3></a> #] fix TblOfContents : <A id\=\"#(.*)\"><H3>(.*)</h3><\/a> <A id="#\1"> </a>\n\t <H3>\2</h3> nuts - was it permissions? 
	(must be executable)
	<A id\=\"#(.*)\"><\/a>\n\t <H3>(.*)</h3>
	<A id\=\"\1"><H3>\2</h3><\/a>
#] MenuTop geany regexpr search-replace :
	search : <LI><A HREF\=\"#(.*)\">\n\t\t\t\t (.*)<\/a>
	replace : <A id="#\1"><H3>\2</h3></a>
#] add p_log to each fileops.sh function except those that iterate many times :
	search	\t(.*)\n^{
	replace	\t\1\n{\n\tdate_ymdhms=$(date +"%0y%0m%0d %0kh%0Mm%0Ss")\n\techo >>"$p_log" "$date_ymdhms \1"\n
#08********08
#] +-----+
#] Old ToDos
07Sep2023 some links are now webSite links, not d_web!!!?? (fixed confGuides)
09Sep2023 script: change classes for [pHtmlClassGenrL, Menu, status, copyright, help] files
10Sep2023 [dir, fil] changes in [copyright, help, Menu, status] files
11Sep2023 legitimate conversions needed for dirs :
	write script to insert 1st line '<!-- /home/bill/web/' (no need for rest of path)
	/home/bill/web/Neural nets/Conference guides/
	/home/bill/web/ProjMini/TrNNs_ART/
12Sep2023 permissions are still a problem online - can lftp do this?
	William Astle of Lexicom.ca fixed my screwup of /billhowell.ca - must be 7xx permission!!
12Sep2023 p[MenuTop, Html, StrP]L * [Menu, status & updates, copyright, help]
	via "$d_bin"'fileops run webSite general.sh'
	/Climate and sun/	/Sun climate/
	/Civilisations and sun/	/Sun civilisations/
	/Solar modeling and forecasting/	/Sun model, forecast/
	/Charvatova solar inertial motion & activity/	/Sun Charvatova/
	/Pandemics, health, and the Sun/	/Sun pandemics, health/
	/ProjMini/Puetz \& Borchardt/	/ProjMini/PuetzUWS/
13Sep2023 upload ConfGuide, callerID-SNNs then go on to peer review!
13Sep2023 pOvrClassL_get_pClassL() STILL adds "ghost" classes!???
	is this coming from all MenuTops? I haven't updated many...
14Sep2023 pOvrL_pStrPL_replace - should either [en, de]code [neither, both] [pOvr, pStrP]
14Sep2023 some are wrong!! "$d_bin"'0_test/fileops/fileops test.sh'
15Sep2023 TrNN webPages are still coded to online webSite!
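After the p_log geany search-replace above runs, each fileops.sh function should open with a timestamped log line. A sketch of the expanded result, assuming GNU date for the "%0y%0m%0d %0kh%0Mm%0Ss" format - 'myFunc' stands in for a real function name :

```shell
# Sketch only : what one function looks like AFTER the geany regexpr
# has inserted the p_log prologue.
p_log=$(mktemp)	# stand-in; fileops.sh sets the real p_log
myFunc()
{
	date_ymdhms=$(date +"%0y%0m%0d %0kh%0Mm%0Ss")
	echo >>"$p_log" "$date_ymdhms myFunc"
	# ... body of the function, unchanged ...
}
```

Because the \1 backreference names the function inside the echo, grepping p_log shows which functions ran, in order.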
15Sep2023 I am putting dir changes AFTER pth changes in pHtmlClassAll_L - safer
23Sep2023 webSite_getCheck_internalLinks doesn't capture all internal links?
#24************************24
#] +-----+
#] Setup, templates
#] template for "$d_webWork"'pStrPAll_L change.txt'
	*******************************
	"$d_webWork"'pStrPAll_L change.txt'
	www.BillHowell.ca ?Aug2023? initial, 17Sep2023 clean up comments
	WARNING: tabs must NOT be in [comment, whiteSpace] lines, only 1 tab in each strP
	+-----+
	paths with fileName
	+----+
	bookmarks - hopefully these won't backfire
	+-----+
	directory changes
	+-----+
	regular expression (rgx) changes
		17Sep2023 I will have to look for old notes, trials
#] template for test of [1, 5] files :
	/home/bill/web/ProjMini/TrNNs_ART/Introduction.html
	/home/bill/web/ProjMini/TrNNs_ART/webWork/pMenuTopMenu TrNNs_ART.html
	/home/bill/web/Bill Howells videos/Howell - videos.html
	/home/bill/web/ProjMajor/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html
	/home/bill/web/ProjMajor/Sun pandemics, health/_Pandemics, health, and the sun.html
+-----+
Introduction.html :
	+--+
	old errors :
		bad links, pStrP style :
			/home/bill/web/ProjMini/TrNNs_ART/John Taylors concepts.html
			/home/bill/web/ProjMini/TrNNs_ART/Taylors consciousness.html
			/home/bill/web/ProjMini/TrNNs_ART/Introduction.html#Credibility from non-[bio, psycho]logical applications of Grossberg's ART
		>> bad bookmark? no link? :
			That paralleled their use in very widespread applications in [science, engineering, etc].
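The 'pStrPAll_L change.txt' template above fixes the format : exactly one tab per strP line, no tabs in [comment, whiteSpace] lines. A sketch of a reader for that format, applying each strOld->strNew pair to one target file - `pStrPL_applyToFile` is a hypothetical name, not the fileops.sh implementation :

```shell
# Sketch only : apply tab-separated strP pairs from pChange to pTrgt.
pStrPL_applyToFile() {
	local pChange="$1" pTrgt="$2"
	local strOld strNew
	while IFS=$'\t' read -r strOld strNew; do
		# [comment, whiteSpace] lines carry no tab, so strNew is empty
		[ -z "$strNew" ] && continue
		# '|' as sed delimiter - strPs contain [/, space] but no '|'
		sed -i "s|$strOld|$strNew|g" "$pTrgt"
	done <"$pChange"
}
```

The no-tab-in-comments rule is what lets the empty-strNew test double as the comment filter.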
	+--+
+-----+
pMenuTopMenu TrNNs_ART.html :
	+--+
	old errors :
		missing content in webPage :
			/home/bill/web/ProjMini/TrNNs_ART/Pribram 1993 quantum fields and consciousness proceedings.html
	+--+
+-----+
Howell - videos.html :
	+--+
	old errors :
		>> all OK
	+--+
+-----+
_Kyoto Premise - the scientists arent wearing any clothes.html :
	+--+
	old errors :
		bad links :
			/home/bill/web/ProjMajor/Climate - Kyoto Premise fraud/home/bill/web/ProjMajor/Climate - Kyoto Premise fraud/Lindzen - Don't Believe the Hype - Al Gore is wrong.pdf
	+--+
+-----+
_Pandemics, health, and the sun.html :
	+--+
	old errors :
		>> none - all internal links are OK
	+--+
# enddoc