#] link d_Qndfs 'website notes.txt' # www.BillHowell.ca 06Oct2020 initial, based on earlier testing etc
# see link d_Qndfs 'website updates instructions & summary.txt'

curl sites : IF (OR ('200' '300' '301' '302' EACHLEFT in_string curlHdr))
I need to make an easy list of code explanations [OK, fail]
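A rough bash sketch of the same status test (url_status_ok is a hypothetical helper name, not the QNial code - just the idea of classifying [200, 300, 301, 302] as OK from the curl header) :
+.....+
# url_status_ok : fetch headers only, pass if the HTTP status is one of [200, 300, 301, 302]
url_status_ok() {
   local url="$1" curlHdr
   curlHdr=$(curl --silent --head --max-time 10 "$url" | head -1)
   case "$curlHdr" in
      *' 200'*|*' 300'*|*' 301'*|*' 302'*) echo "OK   $url" ;;
      *)                                   echo "fail $url" ;;
   esac
}
url_status_ok "http://www.BillHowell.ca"
+.....+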
+-----+
webOnline updates via FileZilla :
make SURE that FileZilla -> Menu -> View -> Directory listing filters :
set 'html_files' filter or not, depending on whether [html, all] files are to be uploaded
+-----+
Problem with html file permissions - must set execute for [owner, group, public]
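A one-liner sketch for the permissions fix (assumes d_webRawe is exported to the shell; a+x sets execute for [owner, group, public]) :
+.....+
# set execute for [owner, group, public] on all html files under the site root
find "$d_webRawe" -type f -name "*.html" -exec chmod a+x {} +
+.....+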
+-----+
ToDos in the future :
19Nov2020 pgPosn - write routine for checking html `# links
22Nov2020 Amazon links are un-reliable
13Dec2020 extra <BR> between last menu and 'normalStatus.html' for some webPages
13Dec2020 cannot have `& ?
13Dec2020 other unsolved problems
13Dec2020 check for duplicate fnames of webPages!!!
14Dec2020 apo (apostrophe) problems :
   !!linkError!!Allegre's second thoughts.pdf
   !!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv
   !!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html
   !!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
   !!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
   !!linkError!!Software programming & code/bin/bin - Howell's web-page.html

48************************************************48
08********08
17Dec2020 problems with menu changes with uploads - added crypto page
http://www.BillHowell.ca/economics, markets/currency-crypto/Cryptos versus [currencies, 10 year [rates, bonds]].html
http://www.BillHowell.ca/economics,%20markets/currency-crypto/Cryptos%20versus%20[currencies,%2010%20year%20[rates,%20bonds]].html
http://www.BillHowell.ca/Software%20programming%20&%20code/bin/encrypt%20-%20keys%20setup.sh
http://www.BillHowell.ca/Software%20programming%20&%20code/bin/encrypt-close.sh
http://www.BillHowell.ca/Software%20programming%20&%20code/bin/encrypt-open.sh

08********08
14Dec2020 fileZilla update webPages & check previous problems
ALL Directory filters lost!!! Bullshit arrangement!
NYET - it's OK, must highlight filter first
[Author, Pub]guide menus OK
Still a problem with :
   Neural Nets : MindCode earlier work, Paper reviews
   Software : Linux bash scripts - still shows Wickson
   rest - all OK
Projects - all OK
Resume - all OK
Publications - OK
Videos - all OK
Blogs - all OK
didn't check rest

08********08
14Dec2020 rerun to check link status
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webSite_doAll
A bit better, but not as much as I hoped. Current results :
Failures :
+--+------------+
|24|errors list |
+--+------------+
[fail, unknown, OK, total] counts :
+----+-------------+
|  63|failed links |
+----+-------------+
| 373|unknown links|
+----+-------------+
| 671|OK links     |
+----+-------------+
|1107|total        |
+----+-------------+
Yesterday's :
Failures :
+--+------------+
|43|errors list |
+--+------------+
[fail, unknown, OK, total] counts :
+----+-------------+
|  82|failed links |
+----+-------------+
| 373|unknown links|
+----+-------------+
| 657|OK links     |
+----+-------------+
|1112|total        |
+----+-------------+
Counts :
Today :
+----+-----------------------------+
|4593|count of all links in webSite|
+----+-----------------------------+
1107 = count of all [file, dir, url]s targeted by links on the webSite
Yesterday :
4598 = count of all links in webSite
1112 = count of all [file, dir, url]s targeted by links in the webSite

Many problems seem to have come back, and don't seem correct!!?!! :
Old problems :
!!linkError!!
!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv
!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
!!linkError!!Climate and sun/Glaciation model 005
!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language
!!linkError!!Neural nets/Conference guides/Author guide website/N-19557 wrong paper [size, margin]s.pdf
!!linkError!!Neural nets/Conference guides/Publicity website/INNS mass email instructions.odt
!!linkError!!Pandemics, health, and the Sun/Howell - corona virus 2020.html
!!linkError!!Personal/130728 Car collision with a deer.html
!!linkError!!Software programming & code/bin/SSH/
!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
!!linkError!!Software programming & code/System_maintenance/
!!linkError!!Table of Contents
!!linkError!!webAnalytics/
!!linkError!!webWork files/confMenu_authors.html
apo in fnames :
!!linkError!!Allegre's second thoughts.pdf
!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv
!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
These links should be OK!?? Are they coming from 'code develop_test' or something? :
!!linkError!!Charvatova solar inertial motion & activity/Verification/
!!linkError!!Cool emails/
!!linkError!!LibreOffice/
This is VERY frustrating!!!! I am missing something important

'Qnial.html' - Why did these revert? I just changed them, but my program corrupted them!!! Maybe too deep? (5 levels, -maxdepth 4)
<UL> <LI> <A HREF="https://github.com/danlm/QNial7">QNial git clone</a> - this is the download site for the QNial system, including [documentation, system stuff, environment variables, data, operators, transformers, examples, [descriptions, ongoing challenges] of the mathematical theory underlying the language].
<LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html">V7 QNial Dictionary.html</a>, <LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf">Design of QNial V7.pdf</a> <LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf">Array Theory and the Design of Nial.pdf</a> </ul> $ find "/media/bill/SWAPPER/Website - raw/" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:20: <LI><A HREF="!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/"> $ find "/media/bill/SWAPPER/Website - raw/" -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:46: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf">Array Theory and the Design of Nial.pdf</a> 08********08 14Dec2020 recurring problems that don't get fixed -why? 1. menus are wrong 2. links to non-existant files, lenks that don't stay fixed +-----+ First - list multiple occurences of fnames setup.ndf - add optr : aryList_extractMulplicate_subArys IS OP selectOp aryList { LOCAL indices matches subAryList ; indices := tell (gage shape aryList) ; iPairs := (front indices) EACHBOTH pair (rest indices) ; subaryList := selectOp EACHRIGHT apply aryList ; matches := (front subaryList) EACHBOTH = (rest subaryList) ; indxHits := cull link (matches sublist iPairs) ; indxHits EACHLEFT pick aryList } webSite_extract_pathsSubDirsFnames Change : +.....+ host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort -u >"' p_allFileList '" ' ; +.....+ To : +.....+ p_nonuniqueFileList := link d_temp 'webSite_extract_pathsSubDirsFnames nonuniqueFileList.txt' ; host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort >"' p_nonuniqueFileList '" ' ; allMulplicateFnames := aryList_extractMulplicate_subArys "path_extract_fname (strList_readFrom_path p_nonuniqueFileList) ; host link 'sort -u "' p_nonuniqueFileList '" >"' p_allFileList '" ' ; +.....+ qnial> webSite_extract_pathsSubDirsFnames qnial> EACH write allMulplicateFnames /media/bill/SWAPPER/Website - raw/Lucas/math Howell/cos - 1 noo, 
qnial> webSite_extract_pathsSubDirsFnames
qnial> EACH write allMulplicateFnames
/media/bill/SWAPPER/Website - raw/Lucas/math Howell/cos - 1 noo, iterative, non-feedback/d-dt Rpcs^-5_t__cos -1.txt
/media/bill/SWAPPER/Website - raw/Lucas/math Howell/cos - 1 yes, iterative, non-feedback/d-dt Rpcs^-5_t__cos -1.txt
/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0001iw/SCAN0000.rtf
/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0002iw/SCAN0000.rtf
/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0003iw/SCAN0000.rtf
/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0004iw/SCAN0000.rtf
/media/bill/SWAPPER/Website - raw/Projects - mini/History/Timechart of Military History - naval/0005iw/SCAN0000.rtf
>> Too good to be true - hardly ANY mulplicate fnames in the whole webSite? almost impossible
>> In any case, duplicate fnames, on this basis, aren't the problem.
>> none of the mulplicates above are important to the list.
>> I don't trust this
+-----+
1. menus are wrong, re-check & make list :
PROJECTS
   Climate - Kyoto Premise fraud
   Neural Nets - OK
   Software prog - OK
   Pf&Resume - OK
   Pub&Report - OK
   Videos - OK
   Blogs - OK
Howell blog - OK
Cool stuff - OK
Cool images - OK
SuspObs comments - OK
Crazy themes - OK
Hosted - OK
I only see a problem with "Climate - Kyoto Premise fraud"
file:///media/bill/HOWELL_BASE/Website/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html
>> extra space? 'Menu projects.html'
>> I just can't see a problem with the link!!??!!! leave it for now
+-----+
2. links to non-existent files, links that don't stay fixed
work again with 'urls errors list.txt'
p_allLinks := link d_webWork 'webURLs_extract allLinks.txt' ;

$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:20: <LI><A HREF="!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
Changed to :
<LI><A HREF="[#=; backtrack ;=#]Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/"> Background material for the scenes</A> This shows a directory of Scenes, each listing [scripts with additional references (albeit vastly incomplete), images used but NOT videos due to copyrights].
<LI><A HREF="[#=; backtrack ;=#]Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Programming code/">
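The [#=; backtrack ;=#] embed gets expanded by the QNial code to a relative './../..' style prefix. A bash sketch of the idea, assuming backtrack is just './' plus one '../' per subDir level below d_webRawe (backtrack_for is a hypothetical helper - the real expansion is done by the embeds machinery) :
+.....+
# backtrack_for : print the relative prefix for a webPage at the given path
backtrack_for() {
   local relPath="${1#/media/bill/SWAPPER/Website - raw/}"
   local depth
   depth=$(printf '%s' "$relPath" | tr -cd '/' | wc -c)   # number of subDir levels
   printf './'
   for ((i = 0; i < depth; i++)); do printf '../'; done
   echo
}
backtrack_for "/media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html"
# -> ./../
+.....+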
<LI><A HREF="[#=; backtrack ;=#]Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Programming code/"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Charvatova solar inertial motion & activity/Verification/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html:93:<A HREF="!!linkError!!Charvatova solar inertial motion & activity/Verification/">Supporting documents, spreadsheets etc</a><BR> Changed to : <A HREF="[#=; backtrack ;=#]Charvatova solar inertial motion & activity/Verification/">Supporting documents, spreadsheets etc</a> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html:419: <LI> <A HREF="!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf"> >> Changed to : <A HREF="[#=; backtrack ;=#]Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Cool emails/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/index.html:101: <LI> <A HREF="!!linkError!!Cool emails/">Cool emails</a> >> Changed to <A HREF="[#=; backtrack ;=#]Cool emails/"> also : <A HREF="[#=; backtrack ;=#]Projects - mini/Puetz & Borchardt/Howell - comments on Puetz UWS, the greatest of cycles, human implications.odt"> <A HREF="[#=; backtrack ;=#]Software programming & code/System_maintenance/"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!International Neural Network Society.JPG' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/webWork files/fin organisations.html:13: <IMG SRC="!!linkError!!International Neural Network Society.JPG" NAME="INNS" ALIGN=CENTER WIDTH=75 HEIGHT=75 BORDER=0> >> changed to : <IMG SRC="[#=; backtrack ;=#]logo Nial Systems Limited.jpg" NAME="QNial" ALIGN=CENTER WIDTH=170 HEIGHT=65 BORDER=0 VSPACE=0> <IMG SRC="[#=; backtrack ;=#]logo International Neural Network Society.bmp" NAME="INNS" ALIGN=CENTER WIDTH=75 HEIGHT=75 BORDER=0> <IMG SRC="[#=; backtrack ;=#]logo National Post.bmp" NAME="National Post" ALIGN=CENTER WIDTH=170 HEIGHT=90 BORDER=0 ISMAP> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!LibreCalc bank account macro system.txt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep 
--invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/page Howell - blog.html:77:<LI><FONT SIZE=4><B><A HREF="!!linkError!!LibreCalc bank account macro system.txt"> $ find "$d_PROJECTS" -maxdepth 4 -type f -name 'LibreCalc bank account macro system.txt' /media/bill/PROJECTS/Investments/LibreCalc bank account macro system.txt >> I moved this to : Software programming & code/LibreOffice macros/ >> changed to : <A HREF="[#=; backtrack ;=#]Software programming & code/LibreOffice macros/LibreCalc bank account macro system.txt"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Menu.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Software for the Guides.html:68:<A HREF="!!linkError!!Menu.html">Authors' Guide Menu</a><BR> >> changed to : <A HREF="[#=; backtrack ;=#]webWork files/confMenu_authors.html"> >> also : <A HREF="[#=; backtrack ;=#]Software programming & code/Qnial/"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Publications website/IEEE CrossCheck, chair.html:179:<LI>The Publications Chair responds to author inquiries about their CrossCheck rejection (see the <A HREF="!!linkError!!Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt">Publications Chair explanation of CrossCheck results and analysis</A>), including a generic comment, the CrossCheck print-out pdf, CrossCheck analysis comments, and offers to respond to any questions that they may have. >> changed to : <A HREF="[#=; backtrack ;=#]Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.html"> $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Publicity website/INNS mass email instructions.odt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/IEEE ListServe publicity subscriptions.html:140: <LI><A HREF="!!linkError!!Publicity website/INNS mass email instructions.odt">INNS mass HOLDERS instructions.odt</A> >> changed to : <A HREF="[#=; backtrack ;=#]Neural nets/Conference guides/Publicity website/INNS - IEEE-CIS mass email policies.odt"> >> also : (Oops! I can't find this document - may have been supered by IEEE LstServer approach) : INNS mass emailers - easy [setup, approach] for mass emails.odt $ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. 
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. USA <rlegoo@association-resources.com>' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Sponsors website/Instructions.html:34: </td><td><A HREF="!!linkError!!Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. USA <rlegoo@association-resources.com>"><A HREF="mailto:mailto:Bill Howell. Sponsors & Exhibits Chair. IJCNN2019 Budapest. Hussar. Alberta. Canada <Bill@BillHowell.ca>?subject=IJCNN2019%20Budapest%20-%20Confirmed%20Sponsor%20inquiry&body=Enter%20your%20[comment,%20question]%20below%20regarding%20Sponsor%20and%20Exhibit%20[storage,%20setup,%20teardown]%20at%20the%20conference,%20and%20for%20questions%20during%20the%20conference.">
>> changed to :
<A HREF="mailto:Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. USA <rlegoo@association-resources.com>?subject=IJCNN2019%20Budapest%20-%20Confirmed%20Sponsor%20inquiry&body=Enter%20your%20[comment,%20question]%20below%20regarding%20Sponsor%20and%20Exhibit%20[storage,%20setup,%20teardown]%20at%20the%20conference,%20and%20for%20questions%20during%20the%20conference.;&CC=Bill Howell. Sponsors & Exhibits Chair. IJCNN2019 Budapest. Hussar. Alberta. Canada <Bill@BillHowell.ca>">
<A HREF="mailto:Angela Aites. Accounting & cash receipts. Association Resources. Washington. DC. USA <>?subject=IJCNN2019%20Budapest%20-%20Confirmed%20Sponsor%20inquiry&body=Enter%20your%20[comment,%20question]%20below%20regarding%20your%20[Sponsor,%20Exhibitor]%20contract%20or%20payment.;&CC=Bill Howell. Sponsors & Exhibits Chair. IJCNN2019 Budapest. Hussar. Alberta. Canada <Bill@BillHowell.ca>">
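The mailto [subject, body] text has to be percent-encoded by hand, which is error-prone. A sketch that generates the encoding (assumes jq is installed - its standard @uri filter does the escaping; the subject string is just an example) :
+.....+
# emit a percent-encoded mailto HREF fragment
subject='IJCNN2019 Budapest - Confirmed Sponsor inquiry'
enc=$(jq -rn --arg s "$subject" '$s|@uri')
echo "<A HREF=\"mailto:Bill@BillHowell.ca?subject=$enc\">"
+.....+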
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Social media/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/page projects.html:61: <LI><A HREF="!!linkError!!Social media/">
>> change to : <A HREF="[#=; backtrack ;=#]Projects - mini/Social media/">
>> also :
<A HREF="[#=; backtrack ;=#]Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html">
<A HREF="[#=; backtrack ;=#]Lucas/">

$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Software programming & code/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> Wow! lots of problems, plus other errors :
!!linkError!!Software programming & code/bin/SSH/
!!linkError!!Software programming & code/Qnial/MY_NDFS/???
!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
!!linkError!!Software programming & code/System_maintenance/
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Attendee downloads - summary.html:239: <LI> <A HREF="!!linkError!!Software programming & code/bin/SSH/">Click to see a directory of sftp-related bash scripts</A> that were used for the analysis of the sftp site. <BR>
/media/bill/SWAPPER/Website - raw/page Software programming.html:59:<BR><A HREF="!!linkError!!Software programming & code/System_maintenance/">
>> change to : <A HREF="[#=; backtrack ;=#]Software programming & code/bin/SSH/">
For the following errors, I mostly replaced '!!linkError!!' with '[#=; backtrack ;=#]', with some comments where the file was not found :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:44: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html">V7 QNial Dictionary.html</a>,
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:45: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf">Design of QNial V7.pdf</a>
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:46: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf">Array Theory and the Design of Nial.pdf</a>
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:123:Besides augmenting [<A HREF="[#=; backtrack ;=#]Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Programming code/strings.ndf">strings.ndf</a>, <A HREF="!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf">fileops.ndf</a>, other] files (described in a section below) - I've added many operators that have come in handy for my [work, projects]. Here is a [random, scattered, incomplete] selection of my major QNial-based projects :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:142: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf">fileops.ndf</a> -
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:145: <LI> <A HREF="[#=; backtrack ;=#]Software programming & code/Qnial/MY_NDFS/windows system for startup.ndf">Linux computer startup</a> - [open, configure] cart [windows, workspaces], and start applications, which is now <A HREF="!!linkError!!Software programming & code/">bash-only based</a>. This doesn't sound like much, but is a huge time-saver for me, especially as I occasionally shut down my system, and regularly [clear, start] different workspaces.
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:146: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/MY_NDFS/???">Linux drive backups</a> - This has evolved through several approaches over the years, and is currently <A HREF="[#=; backtrack ;=#]Software programming & code/bin/backup.sh">Linux bash based</a>, rather than still using QNial.
Yeah, I know, many good backup programs are free (eg basic system with Linux), so why do I still waste my time on this? Good question now that huge drive capacity is cheap. But I still want the simple [flexibility, adaptability] of my system.
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:148: <LI> <A HREF="!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf">[Check, correct] website links</a> - (All-new as of Oct2020.) All [internal, external (other peoples' webSites)] relevant links are checked to allow easy identification of problems and their correction.
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html:167:A <A HREF="[#=; backtrack ;=#]Software programming & code/bin/0_list of Howells bash_scripts.txt">near-full list of my bash scripts</a> is also available (see also <A HREF="!!linkError!!Software programming & code/bin/bin - Howell's web-page.html">my Linux [command, script] web-page</a>). <BR>

$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Table of Contents' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html:15:<h3><A HREF="!!linkError!!Table of Contents">Table of Contents</a> </h3>
>> change to (ignore link-back - maybe later) : <h3>Table of Contents</h3>

$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!/webAnalytics' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:93: <LI> <B><A HREF="!!linkError!!/webAnalytics">Google Analytics</a></b> - This provides a directory listing of snapshots of the Guides' web-page usage (the file prefixes denote YYMMDD - [year, month, day]). Clearly, only a fraction (~200) of the IJCNN2019 submitting co-authors used the Authors' Guide. My guess is that they were mostly students. As the Authors' Guide was NOT updated for WCCI2020 and the mass emails didn't announce its availability (other than broken links in the last mass email), there has been no usage of the Authors' Guide for WCCI2020 as of 18Jan2020.
>> change to : whatever ...
"FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test" >> crap, lots of errors (12), many as I was too [lazy, rushed] to put them in to start with : +-----+ /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/IEEE CrossCheck.html:212:<B>What is <A HREF="!!linkError!!">CrossCheck</A>?</B> From an <A HREF="https://www.elsevier.com/editors/perk/plagiarism-complaints/plagiarism-detection">Elsevier webpage</A> : <BR> >> changed to : I simply removed the empty link /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Paper formatting blog.html:725:As an example, I have attached <A HREF="!!linkError!!N-19557 wrong paper [size, margin]s.pdf">paper 19557</A> with red-background margins and the standard [header, footer]s. Note that the paper [size, margin, font]s are all wrong. This is hard to see just by looking at it.<BR> >> changed to : <A HREF="[#=; backtrack ;=#]Neural nets/Conference guides/Author guide website/N-19557 wrong paper [size, margin]s.pdf"> /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Paper reviews.html:21: <LI> <I>Great paper, wrong conference</I> - Even great papers are be filtered out in the screening process if they do not fit the conference themes and <A HREF="https://www.ijcnn.org/callforpapers">topics</A>, albeit [WCCI,IEEE-CEC, IJCNN, IEEE-FUZZ] has been tolerant of innovative papers having some relevance, and that may differ from mainstream thinking. In such cases, the Chairs will often suggest alternate conference better suited to the paper. This can happen with papers from other areas of Computational Intelligence (such as <A HREF="!!linkError!!">Evolutionary Computation</A>, <A HREF="!!linkError!!">Fuzzy Systems</A>, and ???), but such papers can be accepted if they have a neural network component. >> changed to : I simply removed the empty links /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:93: <LI> <B><A HREF="!!linkError!!/webAnalytics">Google Analytics</a></b> - This provides a directory listing of snapshots of the Guides' web-page usage (the file prefixes denote YYMMDD - [year, month, day]). Clearly, only a fraction (~200) of the IJCNN2019 submitting co-authors used the Authors' Guide. My guess is that they were mostly students. As the Authors' Guide was NOT updated for WCCI2020 and the mass emails didn't announce it's availability (other than broken links in the last mass email), there has been no usage of the Authors' Guide for WCCI2020 as of 18Jan2020. >> changed to (I thought that I had already fixed this!): <A HREF="[#=; backtrack ;=#]webAnalytics/"> /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author [PDF, CrossCheck]-like tests.html:45: <LI> a <A HREF="!!linkError!!/bash scripts/pdf edits/pdf insert [ISBN, copyright] by author, single paper.sh">bash script</A> to carry out the operations. You will have to adapt [directories, files] to your own system. 
>> changed to : <A HREF="[#=; backtrack ;=#]Software programming & code/bin/pdf edits/pdf insert [pubTitle, paper [title, numb], ISBN, copyright, authTitle] by author.sh"> /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author [PDF, CrossCheck]-like tests.html:46: <LI> a <A HREF="!!linkError!!/bash scripts/pdf edits/pdf insert [ISBN, copyright] by author, single paper.sh">LaTeX template</A> for the overlay >> changed to : <UL> <LI> <A HREF="[#=; backtrack ;=#]Software programming & code/bin/pdf edits/template generic.tex">template generic.tex</a> <LI> <A HREF="[#=; backtrack ;=#]Software programming & code/bin/pdf edits/template generic all pages - cpyRgt, publnTitle, paperAuthor, [paper, page]Num, permissions.tex">template generic all pages - cpyRgt, publnTitle, paperAuthor, [paper, page]Num, permissions.tex</a> <LI> <A HREF="[#=; backtrack ;=#]Software programming & code/bin/pdf edits/template generic first page only.tex">template generic first page only.tex</a> </ul> /media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Publications website/PubChair guide.html:140:The IEEE-MCE webpage <A HREF="!!linkError!!">Quick links to Required Forms</A> is a good reference to keep in mind.<BR> >> changed to (I delethe link as lost - let the users search) : The IEEE-MCE webpage "Quick links to Required Forms" (oops - I've lost the link) /media/bill/SWAPPER/Website - raw/page Software programming.html:59:<BR><A HREF="!!linkError!!Software programming & code/System_maintenance/"> >> changed to : <A HREF="[#=; backtrack ;=#]Software programming & code/System_maintenance/"> /media/bill/SWAPPER/Website - raw/page blogs.html:17: <LI><A HREF="!!linkError!!Cool emails/"> >> changed to : <A HREF="[#=; backtrack ;=#]Cool emails/"> /media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/General Relativity is a turkey.html:96:As this issue is already covered in my webPage <A HREF="!!linkError!!">"General Relativity is a turkey?"</a>, please refer to it.<BR> >> changed to : I deleted the whole paragraph, as it was for the Quantum mechanics webPage. +-----+ These are already flagged for apos : /media/bill/SWAPPER/Website - raw/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html:91: <A HREF="!!linkError!!Allegre's second thoughts.pdf"> /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:28: <LI><A HREF="!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF"><FONT SIZE=4> nothing showing from find (null stdout) - some because of 'code develop_test' exclusion?, others already fixed? 
nothing showing from find (null stdout) - some because of 'code develop_test' exclusion?, others already fixed? :
!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv
!!linkError!!Climate and sun/Glaciation model 005
!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language
!!linkError!!LibreOffice/
!!linkError!!N-19557 wrong paper [size, margin]s.pdf
!!linkError!!Pandemics, health, and the Sun/Howell - corona virus 2020.html
!!linkError!!Personal/130728 Car collision with a deer.html
!!linkError!!Publicity website/INNS mass emailers - easy [setup, approach] for mass emails.odt
!!linkError!!Puetz greatest of cycles/
!!linkError!!Qnial
!!linkError!!/bash scripts/pdf edits/pdf insert [ISBN, copyright] by author, single paper.sh
apo (apostrophe) problems :
!!linkError!!Allegre's second thoughts.pdf
!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv
!!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html
!!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
!!linkError!!Software programming & code/bin/bin - Howell's web-page.html

08********08
13Dec2020 fix menu errors of normalSite (not confGuideSite)
Add projects menu to all 'hosted sites' >> manually done
Move [Prechter, Puetz] menu items in projects menu >> done
Add 'normalStatus.html' to all normalPages >> created optr normalGuide_header
qnial> webSite_extract_pathsSubDirsFnames
qnial> EACH normalGuide_header htmlNormalPages
>> seems OK
+-----+
corrupted <TITLE> - scrapped menus? NYET - should work?
S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
The presence of `& may have caused the problem - this will be a problem elsewhere for <TITLE>?
qnial> EACH write (('&' EACHRIGHT substr_in_str htmlFnamesSortedByFname) sublist htmlFnamesSortedByFname) ;
+   _Charvatova - solar inertial motion & activity.html
+   _Solar modeling & forecasting.html
n/a Long term market indexes & PPI 0582.html
+   page Publications & reports.html
+   Past & future worlds.html
OK  S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
+--+
+   had already been fixed
OK  fixed now
n/a not a normal page, ignore
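A sketch for flagging raw `& in webPage titles (in html a raw & should be the &amp; entity; the pattern is a crude heuristic of my own, not the QNial check) :
+.....+
# list <TITLE> lines with a raw & that does not look like an entity (&name; or &#num;)
find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" -print0 |
   xargs -0 grep --with-filename --ignore-case '<TITLE>.*&[^a-z#]'
+.....+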
</A></TD><TD><A HREF="[#=; backtrack ;=#]Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html"> _Kyoto Premise fraud? </A></TD><TD><A HREF="[#=; backtrack ;=#]Pandemics, health, and the Sun/corona virus/Howell - corona virus.html"> Corona virus screwups? </A></TD></tr> index.html - Howell's photo >> I changed it to the new photo qnial> webPageSite_update l (link d_webRawe 'index.html') >> OK +-----+ qnial> webSite_doAll extra <BR> between last menu and 'normalStatus.html' for some webPages >> leave it for next round of fixes 'page Howell - Hope-to-do projects, active and planned.html' >> missing projects menu qnial> webPageSite_update l (link d_webRawe 'page Howell - Hope-to-do projects, active and planned.html') >> OK eliminate redundant menu from Lies, Damned Lies series : [#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'Menu Lies, Damned Lies, and Scientists.html') phraseValueList ; qnial> webPageSite_update l (link d_webRawe 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html') qnial> webPageSite_update l (link d_webRawe 'Lies, Damned Lies, and Scientists/General Relativity is a turkey.html') qnial> webPageSite_update l (link d_webRawe 'Lies, Damned Lies, and Scientists/Quantum Mechanics is a fools paradise.html') economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html >> why isn't projects menu showing? I just put it in?!?!! qnial> webPageSite_update l (link d_webRawe 'economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') >> nyet - problem with link!?!? I change menu projects : <TD><A HREF="[#=; backtrack ;=#]economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html"> S&P500 P/E ratios vs Treasury rates </A></TD> qnial> webPageSite_update l (link d_webRawe 'economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') ?webPageSite_update file unknown error : /media/bill/SWAPPER/Website - raw/economics, markets/S&P 500 Shiller-forward PE versus 10y Treasury bond rate.html missing subDir : qnial> webPageSite_update l (link d_webRawe 'economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') Actual link : file:///media/bill/HOWELL_BASE/Website/economics,%20markets/S&P%20500%20Shiller-forward%20PE%20versus%2010y%20Treasury%20bond%20rates.html >> WRONG!! shouldn't be there!! >> What the sam hill is going on? I'm worried that webPage[Rawe,Site] updates aren't being done to all?!?? >> Oops - do this first? qnial> webPageRawe_update l (link d_webRawe 'economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') qnial> webPageSite_update l (link d_webRawe 'economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html') >> Still not updating properly - puts in the wrong link. Why!!?? Is it the `&? >> Nope - others work Others with looks-to-be-the-same problem : Climate - Kyoto Premise fraud file:///media/bill/HOWELL_BASE/Website/Climate%20-%20Kyoto%20Premise%20fraud/_Kyoto%20Premise%20-%20%20the%20scientists%20arent%20wearing%20any%20clothes.html General relativity is a turkey? file:///media/bill/HOWELL_BASE/Website/Pandemics,%20health,%20and%20the%20Sun/_Pandemics,%20health,%20and%20the%20sun.html#Robert%20Prechter%20-%20Socionomics,%20the%20first%20quantitative%20sociology? 
Quantum mechanics is a fools paradise?
   file:///media/bill/HOWELL_BASE/Website/Lies,%20Damned%20Lies,%20and%20Scientists/General%20Relativity%20is%20a%20turkey,%20Quantum%20Mechanics%20is%20a%20fools%20paradise.html
   >> Ah Hah! still the wrong fname!
Robert Prechter - Socionomics
   file:///media/bill/HOWELL_BASE/Website/Lies,%20Damned%20Lies,%20and%20Scientists/General%20Relativity%20is%20a%20turkey,%20Quantum%20Mechanics%20is%20a%20fools%20paradise.html
   >> Ah Hah! still the wrong fname! but this works (supposedly the same link!!) :
   file:///media/bill/HOWELL_BASE/Website/Pandemics,%20health,%20and%20the%20Sun/_Pandemics,%20health,%20and%20the%20sun.html#Robert%20Prechter%20-%20Socionomics,%20the%20first%20quantitative%20sociology?
General Relativity is a turkey, Quantum Mechanics is a fools paradise.html
>> not in 'Menu projects.html'
>> so why does it keep showing up?

I'm lost and confused - go to 'errors list'
many still have `# - these should have been put into 'pgPosn list'
internalLinks_return_relativePath Change :
+.....+
IF (OR (= `# (first lineList@midIndx)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
+.....+
To :
+.....+
IF (OR (`# chr_in_str lineList@midIndx) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
+.....+
webURLs_extract - don't change now - wait and see effect of the change above
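The point of that change, sketched in bash terms (my analogue, not the QNial) : the old test only caught links whose FIRST character is `#, the new one catches `# anywhere in the link :
+.....+
line='Pandemics, health, and the sun.html#Robert Prechter'   # hypothetical example link
[[ ${line:0:1} == '#' ]] && echo "old test : hit"   # first char only - misses this line
[[ $line == *'#'* ]]     && echo "new test : hit"   # anywhere in the line - catches it
+.....+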
Better look at remaining problems :
$ grep --invert-match '#' "$d_webRawe""webWork files/urls errors list.txt"
!!linkError!!
!!linkError!!Allegre's second thoughts.pdf
!!linkError!!/bash scripts/pdf edits/pdf insert [ISBN, copyright] by author, single paper.sh
!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv
!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv
!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
!!linkError!!bin/blog-format.sh
!!linkError!!bin/pdf edits
!!linkError!!bin/SSH/
!!linkError!!Charvatova solar inertial motion & activity/Verification/
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Climate and sun/Glaciation model 005
!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language
!!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html
!!linkError!!Cool emails/
!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt
!!linkError!!Google analytics
!!linkError!!Howell - Are we ready for global cooling.pdf
!!linkError!!International Neural Network Society.JPG
!!linkError!!LibreCalc bank account macro system.txt
!!linkError!!LibreOffice/
!!linkError!!LibreOffice/LibreCalc bank account macro system.txt
!!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
!!linkError!!Menu.html
!!linkError!!N-19557 wrong paper [size, margin]s.pdf
!!linkError!!National Post.jpg
!!linkError!!Nial Systems Limited.JPG
!!linkError!!Pandemics, health, and the Sun/Howell - corona virus 2020.html
!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
!!linkError!!Personal/130728 Car collision with a deer.html
!!linkError!!Publicity website/INNS mass emailers - easy [setup, approach] for mass emails.odt
!!linkError!!Publicity website/INNS mass email instructions.odt
!!linkError!!Puetz greatest of cycles/
!!linkError!!Qnial
!!linkError!!Randell Mills - hydrinos/
!!linkError!!Regina Legoo. INNS Meetings & Program Manager. Association Resources. Washington. DC. USA <rlegoo@association-resources.com>
!!linkError!!Social media/
!!linkError!!Software programming & code/
!!linkError!!Software programming & code/bin/bin - Howell's web-page.html
!!linkError!!Software programming & code/Qnial/MY_NDFS/???
!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
!!linkError!!Software programming & code/System_maintenance/
!!linkError!!Table of Contents
+--+
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Personal/130728 Car collision with a deer.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive"
>> no result?
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Personal/130728 Car collision with a deer.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive"
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html convertBodyLinks.html:516:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html convertBodyLinks.html:651:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html:516:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html:651:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html str_replaceIn_path.html:516:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html str_replaceIn_path.html:651:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html update.html:581:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- page Howell - blog.html update.html:716:<LI><FONT SIZE=4><A HREF="!!linkError!!Personal/130728 Car collision with a deer.html">
>> Shoose! these don't count!!
>> remove 'code develop_test'
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Personal/130728 Car collision with a deer.html' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK, none show
webSite_extract_pathsSubDirsFnames Change :
+.....+
host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort -u >"' p_allFileList '" ' ;
+.....+
To :
+.....+
host link 'find "' d_webRawe '" -type f -name "*" | grep --invert-match "z_Old\|z_Archive\|code develop_test\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s " | sort -u >"' p_allFileList '" ' ;
+.....+
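That grep --invert-match alternation keeps growing; the same exclusions can be done at the find stage instead. A sketch (equivalent effect, assuming GNU find; shortened to the three exclusions used in the searches above) :
+.....+
# prune excluded subDirs instead of filtering the output afterwards
find "/media/bill/SWAPPER/Website - raw/" \
   \( -path '*z_Old*' -o -path '*z_Archive*' -o -path '*code develop_test*' \) -prune \
   -o -type f -name "*.html" -print
+.....+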
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela d'une histoire d amour/Mythology.flv' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> I chopped off apo
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/120214 Venus et Mars, au dela ' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK, nothing

$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/blog-format.sh' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Attendee downloads - summary.html:240: <LI> Nifty bash script to help with <A HREF="!!linkError!!bin/blog-format.sh">re-formatting emails to html blog format</A>.<BR>
Change :
+.....+
<LI> <A HREF="!!linkError!!bin/SSH/">Click to see a directory of sftp-related bash scripts</A> that were used for the analysis of the sftp site. <BR>
<LI> Nifty bash script to help with <A HREF="!!linkError!!bin/blog-format.sh">re-formatting emails to html blog format</A>.<BR>
+.....+
To :
+.....+
<LI> <A HREF="[#=; backtrack ;=#]Software programming & code/bin/SSH/">Click to see a directory of sftp-related bash scripts</A> that were used for the analysis of the sftp site. <BR>
<LI> Nifty bash script to help with <A HREF="[#=; backtrack ;=#]Software programming & code/bin/conference guides - format html.sh">re-formatting emails to html blog format</A>.<BR>
+.....+

$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/140214 Nazis saved Norwegians video/Nazis saved Norwegian lives.flv' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> nothing? Are many '!!linkError!!' somehow lingering AFTER solved?
Just do a full update, and come back :
qnial> bye
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
qnial> webSite_doAll
+-----+
Check those already fixed :
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/blog-format.sh' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK (trivial)
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/blog-format.sh' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK
+-----+
Check new ones :
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!bin/pdf edits' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:96: <LI> <B><A HREF="!!linkError!!bin/pdf edits">
>> changed to "[#=; backtrack ;=#]Software programming & code/bin/pdf edits/"
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Charvatova solar inertial motion & activity/Verification/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html:93:<A HREF="!!linkError!!Charvatova solar inertial motion & activity/Verification/">Supporting documents, spreadsheets etc</a><BR>
>> changed to : <A HREF="[#=; backtrack ;=#]Charvatova solar inertial motion & activity/Charvatova related files/Howell - solar inertial motion - NASA-JPL versus Charvatova.pdf">
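Rechecking these one find at a time is slow; a sketch that loops over 'urls errors list.txt' and reports only the entries with live hits (uses the findLinkError helper sketched earlier - my own wrapper, not part of the QNial system) :
+.....+
# re-test every listed error; print only those that still show in the html files
while IFS= read -r err; do
   [ -z "$err" ] && continue
   if findLinkError "$err" >/dev/null; then echo "still broken : $err"; fi
done < "/media/bill/SWAPPER/Website - raw/webWork files/urls errors list.txt"
+.....+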
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Galactic rays and evolution/_Galactic rays and evolution - life, the mind, civilisation, economics, financial markets.html:70: <A HREF="!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf">
/media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:100: <A HREF="!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf">
>> both Changed to : <A HREF="[#=; backtrack ;=#]Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf">
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Climate and sun/Glaciation model 005' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK, no shows
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Climate and sun/Laskar etal model for solar insolation in QNial programming language' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
>> OK, no shows
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Cool emails/' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/page blogs.html:17: <LI><A HREF="!!linkError!!Cool emails/">
/media/bill/SWAPPER/Website - raw/index.html:101: <LI> <A HREF="!!linkError!!Cool emails/">Cool emails</a>
>> both changed to : <A HREF="[#=; backtrack ;=#]Cool emails/">
Problem - the linkErrors are being ignored, but I should remove '!!linkError!!' and process the link anyway, or this becomes all manual
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Publications website/IEEE CrossCheck, chair.html:179:<LI>The Publications Chair responds to author inquiries about their CrossCheck rejection (see the <A HREF="!!linkError!!CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt">Publications Chair explanation of CrossCheck results and analysis</A>), including a generic comment, the CrossCheck print-out pdf, CrossCheck analysis comments, and offers to respond to any questions that they may have.
>> Changed to : <A HREF="[#=; backtrack ;=#]Neural nets/Conference guides/Publications website/CrossCheck - Publications Chair explanation of CrossCheck results and analysis.txt">
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Google analytics' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/Neural nets/Conference guides/Author guide website/Author guide.html:93: <LI> <B><A HREF="!!linkError!!Google analytics">Google Analytics</a></b> - This provides a directory listing of snapshots of the Guides' web-page usage (the file prefixes denote YYMMDD - [year, month, day]). Clearly, only a fraction (~200) of the IJCNN2019 submitting co-authors used the Authors' Guide. My guess is that they were mostly students. As the Authors' Guide was NOT updated for WCCI2020 and the mass emails didn't announce its availability (other than broken links in the last mass email), there has been no usage of the Authors' Guide for WCCI2020 as of 18Jan2020.
>> changed to (after moving directories) : <A HREF="[#=; backtrack ;=#]/webAnalytics">Google Analytics</a>
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!Howell - Are we ready for global cooling.pdf' "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | grep --invert-match "code develop_test"
/media/bill/SWAPPER/Website - raw/page Publications & reports.html:63: <LI>Bill Howell <A HREF="!!linkError!!Howell - Are we ready for global cooling.pdf">"Are we ready for global cooling?" </A>- A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)<BR><BR>
>> Changed to : <A HREF="[#=; backtrack ;=#]Climate - Kyoto Premise fraud/Howell - Are we ready for global cooling 14Mar06 longer version.pdf">
This is [long, boring] work!
Again - Problem - the linkErrors are being ignored, but the process should strip '!!linkError!!' and re-process the link anyway, or this becomes all manual
>> NYET - the code already does that. Most problems are [mis-spelt, renamed] files? Just re-do and update webOnline
qnial> webSite_doAll
08********08
12Dec2020 check for failures
/media/bill/SWAPPER/Website - raw/z_Archive/201211 18h09m50s backups webPageRawe_update
corrupted <TITLE> - scrapped menus? NYET - should work?
S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html 403 You don't have permission to access this resource.
150525 Icebreaker unchained%20 - We should have lost World War II/ - screwed path
Neural networks - no menus or files for [MindCode, NN earlier work, Paper reviews]
Bad menus - several PROJECTS!! - should be OK? maybe weren't uploaded
_Lies, damned lies, and scientists.html
S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
page Howell - Hope-to-do projects, active and planned.html - no PROJECTS menu
Neural nets/MindCode/ - no html file?
Software programming & code/bin/ - gives Wickson html!!!??
Hosted websites - add projects menu
"Problems with Science" menu not in PROJECTS?
Lies, Damned Lies, and Scientists/General Relativity is a turkey, Quantum Mechanics is a fools paradise.html
GR turkey, QM fools paradise - no menu or page?
I need a message insert file for the Howell side of the website (like confGuides)!!
08********08
09Dec2020 STOP working on this! - simple patch, get onto MindCode etc!!!
qnial> str_executeEmbeds '<TITLE> Howell : [#=; fname ;=#] ' (("fname 'This is a test.html')("fout 5)("backtrack './../../'))
Howell : This is a test.html
>> This works fine! So why not within webSite_doAll? Forget it - some other year...
Simple ConfGuide fix :
qnial> str_replaceIn_pathList l d_webRawe ' Howell : [#=; fname ;=#] ' ' [Author, Committee]s guides to [IEEE-CIS, INNS] conferences ' htmlPathsSortedByPath
qnial> webSite_doAll
>> confGuides webPage titles OK
>> Nuts - NapierU logo for 2020 missing.
>> extra space in 'IMG SRC='
qnial> webSite_doAll
Upload - FileZilla is uploading ALL of Dad's paintings AGAIN! - bullshit!
I switched to lftp - but now it's uploading everything, probably due to the FileZilla fuckup. VERY slow!!
$ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
For now - just lftp the html files only!
>> Nope, can't get the log OR the excludes to work!!???
FileZilla - set up a different download manager for html-only
08********08
07Dec2020 Now I need to :
1. check my entire website as others can't seem to see content!!??
2. test ConfGuide updates selectively, webPage by webPage
3. modify [bash, QNial] to do confGuides?
4. [update, upload] all webPages
+--+
1. check my entire website as others can't seem to see content!!??
http://www.billhowell.ca/Neural%20nets/Neural%20Networks.html
go to village office - ask Kate Brandt
no problems there
wait : look at /media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
Website stats for : www.BillHowell.ca 201125 12h50m26s
Failures :
+--+------------+
|79|errors list |
+--+------------+
|22|extern fails|
+--+------------+
|38|howell list |
+--+------------+
|27|intern fails|
+--+------------+
At least fix "howell list" :
'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' :
>> I need to [update, test] this optr to remove 'http://www.BillHowell.ca'
internalLinks_return_relativePath Change :
+.....+
% modify lineList#midIndxs for legit [fname, subDir] -> expand to relative paths ;
FOR midIndx WITH midIndxs DO
% don't modify lines with midIndxsLines_bads ;
IF (OR (= `# (first lineList@midIndx)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
% check for a valid fname-only, assumes only one instance of fname ;
ELSEIF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRawe) ;
+.....+
To :
+.....+
% modify lineList#midIndxs for legit [fname, subDir] -> expand to relative paths ;
FOR midIndx WITH midIndxs DO
% remove any http://www.BillHowell.ca BEFORE checking midIndxsLines_bads (http etc) ;
IF ('http://www.billhowell.ca' subStr_in_str (str_toLowerCase lineList@midIndx)) THEN lineList@midIndx := 24 drop lineList@midIndx ; ENDIF ;
% don't modify midIndxs with midIndxsLines_bads ;
IF (OR (= `# (first lineList@midIndx)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@midIndx))) THEN null ;
ELSE
% remove %20 from links, now that mailtos are no longer considered ;
IF ('%20' subStr_in_str lineList@midIndx) THEN lineList@midIndx := str_replace_subStr '%20' ' ' lineList@midIndx ; ENDIF ;
% check for a valid fname-only, assumes only one instance of fname ;
IF (NOT isfault (i_fname := find_Howell lineList@midIndx allFnamesSortedByFname)) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRawe) ;
+.....+
To re-try each time to resolve linkErrors, Change :
+.....+
midIndxsLines_bads := 'http' 'mailto:' '!!linkError!!' './' ;
+.....+
To :
+.....+
midIndxsLines_bads := 'http' 'mailto:' './' ;
+.....+
plus add line :
liner := liner str_remove_subStr '!!linkError!!' ;
+-----+
2. test ConfGuide updates selectively, webPage by webPage
http://www.billhowell.ca/Neural%20nets/Conference%20guides/Author%20guide%20website/Conference%20registration%20blog.html
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
>> had to fix a few coding glitches
qnial> webPageRawe_update o (link d_webRawe 'Neural nets/Conference guides/Author guide website/Conference registration blog.html')
>> looks great! (p_log)
Now update webPage[Rawe, Site] for real
qnial> webPageRawe_update l (link d_webRawe 'Neural nets/Conference guides/Author guide website/Conference registration blog.html')
qnial> webPageSite_update l (link d_webRawe 'Neural nets/Conference guides/Author guide website/Conference registration blog.html')
>> IJCNN2019 sponsor logos don't show!!
>> Oops - I didn't change the "message" for each Menu.html of the confGuides!!
>> webWork files : OK, I changed all of the "messages"
+-----+
3. modify [bash, QNial] to do confGuides? Screw it - leap of faith, just update the full deal
4. [update, upload] all webPages
qnial> webSite_doAll
Failures :
+--+------------+
|74|errors list |
+--+------------+
|22|extern fails|
+--+------------+
|0 |howell list |
+--+------------+
|27|intern fails|
+--+------------+
>> all 'howell list' are now OK
>> same number of other failures as previous
MOST 'intern fails' are of the form :
./SP500 1872-2020 TradingView, 1928-2020 yahoo finance.ods
./SP500 1872-2020 TradingView download.dat
./SP500 1928-2020 yahoo finance.dat
./Table of Contents
./WCCI2020 mass email [SS,Comp,Tut,Wrkshp] 191025 Howell.html
./Website tie-ins.html
>> easy to fix (... someday...)
Check webPageSites - start with confGuides
>> everything looks OK EXCEPT I forgot the <TITLE> for confGuides!!
Edit
[#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack))
[#!: path_insertIn_fHand (link d_webWork 'confHead.html') fout ;
>> phraseValueList doesn't include fname. Is it available within webPageSite_update?
>> So [#!: path_insertIn_fHand is just a guide (throw-away line, as the coding isn't executed)
webPageSite_update Change :
+.....+
% so here executeEmbeds if present, with (phrase values) pairList ;
IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ;
+.....+
To :
+.....+
% so here executeEmbeds if present, with (phrase values) pairList ;
IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fname fname)("fout fout)("backtrack backtrack)) ; ENDIF ;
+.....+
confHead.html Change :
+.....+
<TITLE>???
+.....+
To :
+.....+
Howell : [#=; fname ;=#]
+.....+
Let's see if that works :
qnial> webSite_doAll
>> didn't work, because 'Howell : [#=; fname ;=#]' is in :
[#!: path_insertIn_fHand (link d_webWork 'confHead.html') fout ;
>> This isn't executed, seeing as it isn't in the webPageRawe file
Two options -
1. nested substitution
2. Change :
+.....+
[#!: path_insertIn_fHand (link d_webWork 'confHead.html') fout ;
+.....+
To :
+.....+
[#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ;
Howell : [#=; fname ;=#]
[#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ;
+.....+
Take the second - easier for now.
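Before running the replacement over htmlPathsSortedByPath, a quick shell pre-check of which webPageRawes still carry the old one-piece header is cheap insurance - a minimal bash sketch (the z_[Old, Archive] excludes mirror the find commands above) :
+.....+
# list html files that still include the old one-piece confHead.html header
# run before & after str_replaceIn_pathList - the list should drop to empty
grep --recursive --files-with-matches --include="*.html" "confHead.html" "/media/bill/SWAPPER/Website - raw/" \
   | grep --invert-match "z_Old" | grep --invert-match "z_Archive"
+.....+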
#] change strOld to strNew in pathList, for strPattern, automatic path backups to d_backup
# be careful - can screw up many files if str is not unique!!!
# chopped-up line :
str_replaceIn_pathList l d_webRawe (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'confHead.html' chr_apo ') fout ;') (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' index.html ' chr_newline '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ' ) htmlPathsSortedByPath
# all webPages :
str_replaceIn_pathList l d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand (link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') htmlPathsSortedByPath
# just test with one file :
str_replaceIn_pathList o d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand '(link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') (solitary (link d_webRawe 'Neural nets/Conference guides/Author guide website/Author guide.html'))
>> Result
+--+
[#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack))
[#!: path_insertIn_fHand(link d_webWork 'fin Head_one.html') fout ;
Howell : [#=; fname ;=#]
[#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ;
[#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'confMenu_authors.html') phraseValueList ;
+--+
>> OK - just add some tab
str_replaceIn_pathList o d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') (solitary (link d_webRawe 'Neural nets/Conference guides/Author guide website/Author guide.html'))
+--+
[#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack))
[#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ;
Howell : [#=; fname ;=#]
[#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ;
[#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'confMenu_authors.html') phraseValueList ;
+--+
>> beautiful
Take a break - Friends of Science AGM
Full-meal deal :
str_replaceIn_pathList l d_webRawe (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'confHead.html' chr_apo ')' chr_tab 'fout ;') (link '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_one.html' chr_apo ') fout ; ' chr_newline ' Howell : [#=; fname ;=#] ' chr_newline '[#!: path_insertIn_fHand' chr_tab chr_tab chr_tab chr_tab chr_tab '(link d_webWork ' chr_apo 'fin Head_two.html' chr_apo ') fout ; ') htmlPathsSortedByPath
qnial> webSite_doAll
>> Oops - didn't work, as the line below was "erased!?" :
Howell : [#=; fname ;=#]
>> Why - this shouldn't happen?
[#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack))
[#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ;
Howell : [#=; fname ;=#]
[#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ;
[#!: path_executeEmbedsInsertIn_fHand (link d_webWork 'confMenu_authors.html') phraseValueList ;
qnial> str_executeEmbeds ' Howell : [#=; fname ;=#] ' (("fname 'This is a test.html')("fout 5)("backtrack './../../'))
Howell : This is a test.html
>> This works fine! So why not within webSite_doAll?
08********08
07Dec2020
/media/bill/SWAPPER/Website - raw/webWork files/
fin confHead.html
fin confFoot_authors.html
fin confFoot.html
/media/bill/SWAPPER/Website - raw/webWork files/confMenu_overall.html
/media/bill/SWAPPER/Website - raw/webWork files/confMenu_authors.html
YIKES!!! - will moz-do-not-send="true" fail, as it must NOT occur before SRC=. However, this is for emails, not webPages!! :
Conference Guide
Next : Publications Guide, then Publicity, Reviewers, Sponsors
/media/bill/SWAPPER/Website - raw/webWork files/confMenu_publications.html
08Dec2020 resume work
/media/bill/SWAPPER/Website - raw/webWork files/confMenu_publicity.html
/media/bill/SWAPPER/Website - raw/webWork files/confMenu_sponsors.html
08********08
25Nov2020 'webSite [menuHeadFoot, link, TableOfContents, link] tools.html'
I botched it together.
qnial> webPageRawe_update l (link d_webRawe 'Software programming & code/Qnial/webSite [menuHeadFoot, link, TableOfContents, link] tools.html')
qnial> webPageSite_update l (link d_webRawe 'Software programming & code/Qnial/webSite [menuHeadFoot, link, TableOfContents, link] tools.html')
08********08
25Nov2020 index.html - put in a smaller sized image!
I made a 200*200 pixel image of the big chart :
http://www.billhowell.ca/Civilisations%20and%20sun/Howell%20-%20radioisotopes%20and%20history.jpg
qnial> webPageRawe_update l (link d_webRawe 'index.html')
qnial> webPageSite_update l (link d_webRawe 'index.html')
>> image doesn't show!?
>> OK now : file:///media/bill/HOWELL_BASE/Website/Civilisations and sun/Howell - radioisotopes and history 200 by 200 pixels.jpg
qnial> webPageRawe_update l (link d_webRawe 'Civilisations and sun/_Civilisations and the sun.html')
qnial> webPageSite_update l (link d_webRawe 'Civilisations and sun/_Civilisations and the sun.html')
08********08
25Nov2020 lftp instead of fileZilla upload
see "$d_SysMaint""internet & wifi/lftp notes.txt"
$ bash "$d_PROJECTS""bin-secure/lftp update www-BillHowell-ca.sh"
bash: /media/bill/PROJECTS/bin-secure/lftp update www-BillHowell-ca.sh: No such file or directory
>> kills me - I can't seem to get the file!!!???!!!
>> bin - secure : Permissions, file access was null, set to read & write
$ bash "$d_PROJECTS""bin-secure/lftp update www-BillHowell-ca.sh"
bash: /media/bill/PROJECTS/bin-secure/lftp update www-BillHowell-ca.sh: No such file or directory
qnial> a := EACH string 'lftp update www-BillHowell-ca.sh'
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
|l|f|t|p| |u|p|d|a|t|e| |w|w|w|-|B|i|l|l|H|o|w|e|l|l|-|c|a|.|s|h|
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
qnial> a EACHLEFT in chrs_fnames
llllllllllllllllllllllllllllllll
>> OK - so no weird chrs
Why can't this file be accessed??
>> shit, I mis-spelled it and couldn't see the obvious -> spaces around the hyphen
search "Linux lftp and how do I ensure that only new versions are uploaded?"
Hmm, no direct answer
backup "$d_webSite/Mythology/"
$ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
02--02
mirror: Access failed: /Mythology: No such file or directory
mkdir: Access failed: 550 Mythology/: File exists (/billhowell.ca/Mythology/)
lftp: MirrorJob.cc:242: void MirrorJob::JobFinished(Job*): Assertion `transfer_count>0' failed.
/media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 9: 13719 Aborted lftp $PROTOCOL://$URL <<-UPLOAD
user $USER "$PASS"
cd $REMOTEDIR
mirror --reverse --recursion=newer "$d_webSite/Mythology/" "/billhowell.ca/Mythology/"
close
UPLOAD
/media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 33: /home/user/script.log: No such file or directory
02--02
>> hmm, I have to put in the full path for lftp?
Change :
+.....+
mirror --reverse --recursion=newer "$d_webSite/Mythology/" "/billhowell.ca/Mythology/"
+.....+
To :
+.....+
mirror --reverse --recursion=newer "/media/bill/HOWELL_BASE/Website/Mythology/" "/billhowell.ca/Mythology/"
+.....+
$ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
02--02
mirror: Access failed: /media/bill/HOWELL_BASE/Website/Mythology/Mythology: No such file or directory
mkdir: Access failed: 550 Mythology/: File exists (/billhowell.ca/Mythology/)
lftp: MirrorJob.cc:242: void MirrorJob::JobFinished(Job*): Assertion `transfer_count>0' failed.
/media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 9: 13990 Aborted lftp $PROTOCOL://$URL <<-UPLOAD
user $USER "$PASS"
cd $REMOTEDIR
mirror --reverse --recursion=newer "/media/bill/HOWELL_BASE/Website/Mythology/Mythology/" "/billhowell.ca/Mythology/"
close
UPLOAD
/media/bill/PROJECTS/bin - secure/lftp update www-BillHowell-ca.sh: line 33: /home/user/script.log: No such file or directory
02--02
>> failed as the directory exists
>> I need to check a small directory with an updated webPage
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/
backup "PE Schiller forward vs 10yr Tbills/", wwwBillHowell_update()
Change :
+.....+
mirror --reverse --recursion=newer "/media/bill/HOWELL_BASE/Website/Mythology/" "/billhowell.ca/Mythology/"
+.....+
To :
+.....+
mirror --reverse --only-newer "/media/bill/HOWELL_BASE/Website/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/" "/billhowell.ca/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/"
+.....+
$ bash "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh"
>> OK, it looked like ONLY the html file was uploaded
backup & try :
mirror --reverse --only-newer "/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/" "/billhowell.ca/economics, markets/SP500/multi-fractal/"
>> OK, again it looked like ONLY the html file was uploaded
>> Check the online file : looks good, has images now
Try a dry-run of the whole webSite :
mirror --reverse --only-newer --dry-run "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/"
Screw the [get, permissions] - just run with >> Howell's command in bash file "$d_PROJECTS""bin - secure/lftp update www-BillHowell-ca.sh" :
mirror --reverse --only-newer --log=$LOG "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/"
problems with the LOG file, doesn't accept spaces?
Change to : LOG="lftp update www-BillHowell-ca log.txt"
I had to remove --log=$LOG :
mirror --reverse --only-newer --log=$LOG "/media/bill/HOWELL_BASE/Website/" "/billhowell.ca/"
lftp is VERY slow compared to fileZilla!!
>> go back to fileZilla... and fight with settings
08********08
25Nov2020 fixes :
Neil Howell (lost images - many webPages affected)
Kyoto fraud - incorrect backtrack for bodyLinks...
Conference guides not covered by webSite_extract_pathsSubDirsFnames
05----05
'_Kyoto Premise - the scientists arent wearing any clothes.html' - fname problems?
qnial> a := EACH string 'Landsea, The hurricane expert who stood up to UN junk science.pdf'
qnial> chrs_fnames:= link chrs_alphaNumeric (EACH string ` `. `, `_ `-)
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-
|0|1|2|3|4|5|6|7|8|9|a|A|b|B|c|C|d|D|e|E|f|F|g|G|h|H|i|I|j|J|k|K|l|L|m|M|n|N|o|O|p|P|q|Q|r|R|s|S|t|T|u|U|v|V|w
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-
+-+-+-+-+-+-+-+-+-+-+-+-+
|W|x|X|y|Y|z|Z| |.|,|_|-|
+-+-+-+-+-+-+-+-+-+-+-+-+
qnial> a EACHLEFT in chrs_fnames
lllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllll
>> OK - so no weird chrs
>> This one is OK in the webSite; otherwise, wrong backtrack like the others
qnial> a := EACH string 'Akasofu, Little Ice Age is still with us.pdf'
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
|A|k|a|s|o|f|u|,| |L|i|t|t|l|e| |I|c|e| |A|g|e| |i|s| |s|t|i|l|l| |w|i|t|h| |u|s|.|p|d|f|
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
qnial> a EACHLEFT in chrs_fnames
llllllllllllllllllllllllllllllllllllllllllll
>> OK - so no weird chrs
So why did these generate !!linkError!! ??
Also - all backtracks in bodyLinks are wrong - Why? eg :
"../Climate - Kyoto Premise fraud/Abdussamatov, look to Mars for the truth on global warming.pdf"
"../Howell - Are we ready for Global Cooling, comments 14Mar06.pdf"
>> should have 2 ../
second example missing subDir
Check other bodyLinks in d_webSite (I have already, but see again)
OK index.html
most Past & future worlds.html - only one failure :
file:///media/bill/HOWELL_BASE/Website/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
OK Peratt - Auroral phenomena and petroglyphs.html
Maybe the fname-only case needs to add 1? nyet? I don't have an answer.
05----05
Neil Howell images don't show
Of course - the current webPage doesn't have the <IMG> links!? What happened? Look for a recent version
OK - update webPage without text wrap
NO wrapping : https://www.angelfire.com/nm/thehtmlsource/jazzup/image/stoptextwrap.html


>> wow! easy, and after years of searching, only one guy makes it clear!
25Nov2020 Problem is, my coding probably destroyed many images during initial code development.
Look at the 'z_Old/201027 13h46m19s backups' versions
I have to [list, check, fix] all of them!! (probably 10 or so)
A index.html
x Climate - Kyoto Premise fraud
x Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html
A _Pandemics, health, and the sun.html
1 Howell - corona virus.html
x Howell - corona virus of countries, by region.html
A Howell - influenza virus.html
x _Climate and sun.html
D 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html
x _Lies, damned lies, and scientists.html
x S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
R Conference Guide webPages
x = doesn't need correction or doesn't have images (checked in html file)
c = corrected images
n = # of image links fixed (not all)
A = all image links had to be re-inserted
R = all Conference guide webPages need a revamp of the header!
D = destroyed by over-writing with another file!!!
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
>>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
>>>>>> loading start : webSite header.ndf
<<<<<< loading ended : webSite header.ndf
<<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
qnial> webSite_doAll
05----05
Kyoto fraud - incorrect backtrack for bodyLinks...
looks hard & a continuation of many past battles on the same issue
leave this for after taxes
05----05
Conference guides not covered by webSite_extract_pathsSubDirsFnames
Nuts, I had left 'Conference guides' in the --invert-match of webSite_extract_pathsSubDirsFnames
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
>>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
>>>>>> loading start : webSite header.ndf
<<<<<< loading ended : webSite header.ndf
<<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
qnial> webSite_extract_pathsSubDirsFnames
>> 'webSite webPageList.txt' now has 187 webPages
Redo webSite_doAll, un-(commenting out) :
writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo) ;
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
>>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
>>>>>> loading start : webSite header.ndf
<<<<<< loading ended : webSite header.ndf
<<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
qnial> webSite_doAll
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 201125 11h00m33s
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
+----+-----------------------------+
|7004|count of all links in webSite|
+----+-----------------------------+
1147 = count of all [file, dir]s targeted by links on the webSite
Counts below are the number of unique TARGETED [file, dir]s of links (eg 3+ links per target on average)
Failures :
+--+------------+
|79|errors list |
+--+------------+
|22|extern fails|
+--+------------+
|38|howell list |
+--+------------+
|27|intern fails|
+--+------------+
Unknowns - I havent written code to really show [OK, fail] :
+---+-----------+
|71 |mailto list|
+---+-----------+
|277|pgPosn list|
+---+-----------+
OKs - these links have been shown to work :
+---+---------+
|239|extern OK|
+---+---------+
|394|intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+----+-------------+
| 166|failed links |
+----+-------------+
| 348|unknown links|
+----+-------------+
| 633|OK links |
+----+-------------+
|1147|total |
+----+-------------+
>> looks nice!
08********08
24Nov2020 upload to webOnln via fileZilla - working well now?
I have to break for a month - income taxes, visits etc!
Problem with useless re-uploads (file dating problem!)
08********08
24Nov2020 webSite_doAll - add a check to see if [z_Archive, z_Old] dirs are in d_webSite
check for [z_Archive, z_Old] : find: ‘’: No such file or directory
>> OK, this works
However, now the webPageSites are not being updated! I can't update webOnln via fileZilla until it works
Sigh - what now?
201124 11h39m47s webAllRawOrSite_update l "webPageSite_update
201124 11h39m47s webURLs_extract
>> my guess - flag_break screws the optr?
webAllRawOrSite_update Change :
+.....+
ELSE webPageSite_update webPage ;
+.....+
To :
+.....+
ELSE webPageSite_update flag_backup webPage ;
+.....+
Wait! - I haven't updated webSite_extract_pathsSubDirsFnames
>> But this will add in 'Conference Guides'
>> What the hell, just do it? I can always revert back
webSite_doAll - I added :
writeDoStep 'webSite_extract_pathsSubDirsFnames' ;
>> NUTS! webSite webPageList.txt was updated, but has NONE of the 'Conference Guides' webPages!!??!!
qnial> gage shape htmlFnamesSortedByFname
106
qnial> webSite_extract_pathsSubDirsFnames
qnial> gage shape htmlFnamesSortedByFname
106
>> not working. Why?
qnial> webSite_doAll
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 201124 12h08m34s
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+---+------------+
|54 |errors list |
+---+------------+
|17 |extern fails|
+---+------------+
|27 |howell list |
+---+------------+
|651|intern fails|
+---+------------+
Unknowns - I havent written code to really show [OK, fail] :
+---+-----------+
|48 |mailto list|
+---+-----------+
|109|pgPosn list|
+---+-----------+
OKs - these links have been shown to work :
+---+---------+
|115|extern OK|
+---+---------+
|3  |intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+----+-------------+
| 749|failed links |
+----+-------------+
| 157|unknown links|
+----+-------------+
| 118|OK links |
+----+-------------+
|1024|total |
+----+-------------+
>> STILL no update of webPageSites!!??!!
>> webSite_doAll still comments out :
% writeDoStep (link 'urls_check ' chr_apo 'extern' chr_apo) ;
>> '0_webPage_update log.txt' - shows the changes that SHOULD have been made
OUCH! I've overwritten the webPageRawes with webPageSite style
>> I must restore, and lose a great deal of stuff?
qnial> dirBackup_restoreTo_paths o (link d_webRawe 'z_Old/201123 22h24m20s backups webPageRawe_update/') htmlPathsSortedByPath
>> o for non-dated fnames
webPageSite_update Change :
+.....+
host link 'mv "' p_temp '" "' webPageRawe '"' ;
+.....+
To :
+.....+
host link 'mv "' p_temp '" "' webPageSite '"' ;
+.....+
webPageSite := link d_webSite subDir fname ;
%write fname ;
IF (path_exists '-f' p_temp) THEN
   host (link 'diff --width=85 "' webPageSite '" "' p_temp '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ;
   host link 'echo "" >>"' p_log '"' ;
   IF flag_backup THEN host link 'mv "' p_temp '" "' webPageSite '"' ; ENDIF ;
ELSE host link 'echo ?webPageSite_update error, p_temp not created >>"' p_log '"' ;
ENDIF ;
Reverse yesterday's change? back to :
+.....+
WHILE (NOT isfault (line := readfile finn)) DO
% webPageRawe_update MUST have already processed links ;
% to '[#=; backtrack ;=#], followed by a [legit, full] subDir ;
% so here executeEmbeds if present, with (phrase values) pairList ;
IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ;
+.....+
To :
+.....+
WHILE (NOT isfault (line := readfile finn)) DO
% process links ;
IF ('
+.....+
>> Hmm, this is a problem, I think? Better to leave it for now.
But the old way worked, with str_executeEmbeds BEFORE the internalLinks_return_relativePath lines.
Try what I have (again, no urls_check 'extern')
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 201124 13h33m00s
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+--+------------+
|54|errors list |
+--+------------+
|17|extern fails|
+--+------------+
|27|howell list |
+--+------------+
|7 |intern fails|
+--+------------+
Unknowns - I havent written code to really show [OK, fail] :
+---+-----------+
|48 |mailto list|
+---+-----------+
|105|pgPosn list|
+---+-----------+
OKs - these links have been shown to work :
+---+---------+
|115|extern OK|
+---+---------+
|330|intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+---+-------------+
|105|failed links |
+---+-------------+
|153|unknown links|
+---+-------------+
|445|OK links |
+---+-------------+
|703|total |
+---+-------------+
>> Looks nice, but did it work?
index.html
   all main menu items work
   saw some !!linkError!! in the .html file
Climate - Kyoto Premise fraud
   all main menu items work
   Projects menus - all work EXCEPT : file:///media/bill/HOWELL_BASE/Website/Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists arent wearing any clothes.html
   new links to Financial Post Deniers work (3 tested)
page blogs.html
   all subMenus work
   bodyLink to Howell's blog doesn't work (strange, as the menu does)
   Directory of files works (as do other webPageSites)
+--+
Conference guides
   NO [main, subDir] menus work
   Directory of files doesn't work, no [GNU, Creative Commons] images
   How come this didn't give rise to a huge number of 'intern fails'?
   >> see Authors' guide below - only this web-page sucks?
Authors' guide
   ALL main menu links work
   ALL 'Non-author actions' menus work
   ALL bodyLinks that I looked at worked
   >> I am STUNNED that this works!!!
PubChair guide
   as with Authors' guide, ALL menu links work nicely
   I didn't check bodyLinks
>> Again, I am STUNNED that the conference guide links work, even if it was only a little bit!!!
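Rather than eyeballing every menu by hand, a spot-check of one webPageSite can be scripted - a minimal bash sketch (assumptions : HREFs are upper-case as in these webPages, only relative links are tested, and [http, mailto, #] links are skipped) :
+.....+
# pull the HREF targets out of one webPageSite & test that each target exists on disk
page="/media/bill/HOWELL_BASE/Website/Neural nets/Conference guides/Author guide website/Author guide.html"
d_page=$(dirname "$page")
grep --only-matching 'HREF="[^"]*"' "$page" \
   | sed 's/^HREF="//; s/"$//' \
   | grep --invert-match '^http\|^mailto:\|#' \
   | while IFS= read -r target ; do
        # relative links (including '../' backtracks) resolve against the page's own directory
        [ -e "$d_page/$target" ] || echo "missing : $target"
     done
+.....+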
For fun, check the 'extern' links :
qnial> urls_check 'extern'
05-----05
Olde Code
# olde code webPageSite_update :
I don't need to back up webPageSite, as it is entirely based on webPageRawe. Only webPageRawe needs backup. Keep it in for now, just in case.
IF flag_backup THEN
   IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup ;
   ELSE webPage path_backupDatedTo_dir d_htmlBackup ;
   ENDIF ;
ENDIF ;
# 23Nov2020
WHILE (NOT isfault (line := readfile finn)) DO
% process links ;
IF ('
>> Why "/home/bill/" ??
backer() {
   # $p_excl - must avoid transferring [z_Archive, z_Old, References]
   p_excl="$1"
   d_src="$2"
   d_out="$3"
   becho "rsync $2"
   becho "to $3"
   rsync "$options" --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log"
   bash "$d_bin""du_diff.sh" "$d_src" "$d_out"
}
Change :
+.....+
bash "$d_bin""du_diff.sh" "$d_src" "$d_out"
+.....+
To :
+.....+
bash "$d_bin""du_diff.sh" "$d_src" "$d_out" >>"p_log"
+.....+
rsync /media/bill/PROJECTS/bin/ to /media/bill/SWAPPER/Website - raw/Software programming & code/bin/
rsync: link_stat "/home/bill/ --stats --itemize-changes -rltgu " failed: No such file or directory (2)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1196) [sender=3.1.2]
NYET!!! I don't want du_diff.sh!! - dirSizes only! Just remove this!
Try saving the options DIRECTLY in backer()
Change :
+.....+
rsync "$options" --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log"
+.....+
To :
+.....+
rsync --stats --itemize-changes -rltgu --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log"
+.....+
$ bash "$d_bin""rsync website.sh"
rsync /media/bill/PROJECTS/bin/ to /media/bill/SWAPPER/Website - raw/Software programming & code/bin/
rsync /media/bill/PROJECTS/Lucas/ to /media/bill/SWAPPER/Website - raw/Lucas/
rsync /media/bill/PROJECTS/MindCode/ to /media/bill/SWAPPER/Website - raw/MindCode/
rsync /media/bill/PROJECTS/Qnial/ to /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/
rsync /media/bill/PROJECTS/System_maintenance/ to /media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/
rsync /media/bill/SWAPPER/Website - raw/ to /media/bill/HOWELL_BASE/Website/
>> Wow! seems to work
>> 'Climate - Kyoto Premise fraud' - has been rsync'd!!!
Change :
+.....+
rsync "$options" --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log"
...
# options=" --dry-run --itemize-changes -rltgu " # report what will be done, but no transfers
options=" --stats --itemize-changes -rltgu "
+.....+
To :
+.....+
if [ "$options" == 'test' ]; then
   rsync --dry-run --itemize-changes -rltgu --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log"
else
   rsync --stats --itemize-changes -rltgu --exclude-from="$p_excl" "$d_src" "$d_out" >>"$p_log"
fi
...
# options="test" # report what will be done, but no transfers
options="change"
+.....+
08********08
23Nov2020 Should webPageSite_update be doing internalLinks_return_relativePath?
That is the job of webPageRawe_update. BEFORE calling webPageSite_update, all links should have '[#=; backtrack ;=#], followed by a [legit, full] subDir
Can I Change :
+.....+
WHILE (NOT isfault (line := readfile finn)) DO
% process links ;
IF ('
+.....+
qnial> webSite_doAll
Holy shit! It works now and I don't know why!!??
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
Website stats for : www.BillHowell.ca 201123 22h31m47s
Summary of the number of links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+--+------------+
|54|errors list |
+--+------------+
|18|extern fails|
+--+------------+
|27|howell list |
+--+------------+
|34|intern fails|
+--+------------+
Unknowns - I havent written code to really show [OK, fail] :
+---+-----------+
|48 |mailto list|
+---+-----------+
|105|pgPosn list|
+---+-----------+
OKs - these links have been shown to work :
+---+---------+
|114|extern OK|
+---+---------+
|303|intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+---+-------------+
|133|failed links |
+---+-------------+
|153|unknown links|
+---+-------------+
|417|OK links |
+---+-------------+
|703|total |
+---+-------------+
errors list (!!linkError!!) - will need real work?
extern fails : amazon links are (7/18) - obviously I must not use their links in the future. None seem critical, but it is work to find alternate links & fix them.
howell list - special correction to 'Conference guides'
intern fails : Almost all (27/34) are due to 'Climate - Kyoto Premise fraud' pdfs, which have not been rsync'd yet. Four are due to './' mistakes - easy to correct.
08********08
23Nov2020 re-retry webSite_doAll - see if webPageRawes are updated
after this - I'm going to eat popcorn & break for the night, fail or succeed
qnial> webSite_doAll
>> same problems
>> I should have run internalLinks_return_relativePath_tests first
Ah hah! The simple fname fails now (again - it keeps doing that, and I keep fixing it)
# internalLinks_return_relativePath_test example #1 : FAILED - result does NOT match standard
t_input, t_standard, t_result = |../../../|| (rest of the boxed output was mangled in these notes)
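Before blaming find_Howell, it is worth cross-checking from the shell that a bare fname actually exists (and is unique) below d_webRawe - a bash sketch, with the fname below just an example :
+.....+
# a fname-only link can only be auto-resolved if the fname maps to exactly one path
# 0 hits -> [missing, mis-spelt] file ; 2+ hits -> ambiguous for the fname-only case
find "/media/bill/SWAPPER/Website - raw/" -type f -name "Howell 920000 Front - Introduction.pdf" \
   | grep --invert-match "z_Old" | grep --invert-match "z_Archive"
+.....+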
30Oct2020 Simple check of multiple "
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html:177:
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html update.html:177:
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html str_replaceIn_path.html:177:
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/test- Howell - corona virus.html convertBodyLinks.html:177:
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html:178:
>> ALL of these should work!
>> NO, NOT for webPageSite_update!!?? It only inserts backtrack?
02--02
Howell 920000 Front - Introduction.pdf
qnial> (= 'Howell 920000 Front - Introduction.pdf' allFnamesSortedByFname ) sublist allPathsSortedByFname
>> again - no hits? It IS in : '/media/bill/HOWELL_BASE/Website/Pandemics, health, and the Sun/corona virus'
try links via geany of 'webSite urlList.txt' :
Howell 920000 Front - Introduction.pdf
/media/bill/HOWELL_BASE/Website/Lies, Damned Lies, and Scientists/Howell 920000 Front - Introduction.pdf
>> This is in that directory, so why the failure?
02--02
NUTS! I haven't updated using webSite_extract_pathsSubDirsFnames?
>> that shouldn't be the problem!
I guess the good news is that many of the intern fails have 2-3 fails per path. So it may not be as bad as it looks.
Why isn't internalLinks_return_relativePath working now? My changes yesterday must have screwed it up. Hard to track though, even with notes.
From 20Nov2020 : internalLinks_return_relativePath :
Change :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell (fname := (1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
>> This looks suspicious?
To :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell (1 + (last findAll_Howell `/ lineList@midIndx)) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
>> Why did I drop lineList@midIndx? Might be a leftover?
Re-[loaddefs, internalLinks_return_relativePath_test] :
>> Now #2 also fails (!!linkError!!)
>> the problem remains - it seems that the step above is being ignored
To :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
Now Change (add "first") To :
+.....+
ELSEIF (NOT isfault (i_fname := first find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) ) THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
08********08
23Nov2020 Check [webPageSite, webPageOnln] etc for errors
First - fix some problems, but not ones that require webAllRawOrSite_update, which I will do after
>> OK for now - I only fixed one or two below
qnial> webSite_doAll
>> OUCH!! backtrack problem again - often the wrong number of '../'
>> Conference guides are included!??! - possibly because I changed --max-depth to 4
I have now REMOVED the max-depth!!
the invert-match should exclude [z_Old, z_Archive, System_maintenance] anyways, but I removed the Conference Guides exclusion. I might as well leave it in - changes for the next round...
--invert-match "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations\|i9018xtp.default/extensions/\|[0-9]\{6\} [0-9]\{2\}h[0-9]\{2\}m[0-9]\{2\}s "
Big issue is the failed backtracks...
Re-try :
qnial> webSite_doAll
Wait a minute - webPageSites were updated, but not webPageRawes. Is this normal?
>> NO - webPageRawe_update is supposed to process all webPageRawes!! Why didn't it?
webPageRawe_update IS OP flag_backup webPage
IF flag_backup THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ;
webSite_doAll IS
{
str_replaceIn_pathList l d_webRawe '!!linkError!!' '[#=; backtrack ;=#]' htmlPathsSortedByPath
webAllRawOrSite_update l "webPageRawe_update ;
webAllRawOrSite_update l "webPageSite_update ;
webURLs_extract ;
EACH urls_check 'intern' 'extern' ;
webSite_link_counts ;
}
>> so this should be changing every webPageRawe! last updates Sat 21Nov2020
Not this? :
subDir fname := path_retrieve_subDirFname webPage d_webRawe ;
fileops.ndf :
path_retrieve_subDirFname IS OP path dirBase
{ LOCAL fname fPath subDir ; NONLOCAL webSiteAllPathList ;
IF (NOT (dirBase subStr_in_str path)) THEN fault '?path_retrieve_subDirFname error, dirBase not in path'
ELSE
   fname := path_extract_fname path ;
   IF (isfault fname) THEN fault '?path_retrieve_subDirFname error, fname'
   ELSE subDir fname := ((path str_extractPast_strFront dirBase) str_remove_subStr fname) fname
   ENDIF
ENDIF
}
OUCH!! changed around arguments? :
str_replaceIn_path IS OP flag_backup d_backup strOld strNew path
str_replaceIn_pathList IS OP flag_backup d_backupRoot strOld strNew pathList
ELSE str_replaceIn_path o '' strOld strNew pinn ;
>> No, this is OK
webPageRawe_update Change :
+.....+
IF flag_backup THEN
   IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup 'Rawe' ;
   ELSE webPage path_backupDatedTo_dir d_htmlBackup ;
   ENDIF ;
ENDIF ;
+.....+
To :
+.....+
IF flag_backup THEN
   IF (`m in d_htmlBackup) THEN webPage path_backupTo_dir d_htmlBackup ;
   ELSE webPage path_backupDatedTo_dir d_htmlBackup ;
   ENDIF ;
ENDIF ;
+.....+
webPageRawe_update Change :
+.....+
webPageRawe_update IS OP flag_backup webPage
...
IF flag_backup THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ;
+.....+
To :
+.....+
webPageRawe_update IS OP flag_backup flag_update webPage
...
IF flag_update THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ;
+.....+
I will also have to change :
str_replaceIn_path IS OP flag_backup d_backup strOld strNew path
{ LOCAL finn fout flag_chg line p_temp ;
% ;
% backup unless a test, or called by [str_replaceIn_dir, str_replaceIn_pathList] which already backup ;
IF flag_backup THEN path path_backupDatedTo_dir d_backup ; ENDIF ;
p_temp := link d_temp 'str_replaceIn_path temp.txt' ;
flag_chg := o ;
% ;
finn := open path "r ;
fout := open p_temp "w ;
WHILE (NOT isfault (line := readfile finn)) DO
   IF (strOld subStr_in_str line) THEN
      IF (= o flag_chg) THEN flag_chg := l ; %write path ; ENDIF ;
      EACH write '' line ;
      line := str_replace_subStr strOld strNew line ;
      write line ;
   ENDIF ;
   writefile fout line ;
ENDWHILE ;
EACH close finn fout ;
% ;
IF flag_backup THEN host link 'mv "' p_temp '" "' path '"' ; ENDIF ;
}
str_replaceIn_pathList IS OP flag_backup d_backupRoot strOld strNew pathList
{ LOCAL d_backup pinn ;
% backups are automatically done, except for testing purposes!! ;
IF flag_backup THEN
   d_backup := link d_backupRoot 'z_Archive/' timestamp_YYMMDD_HMS ' backups str_replaceIn_pathList/' ;
   host link 'mkdir "' d_backup '" ' ;
ENDIF ;
% ;
IF (NOT path_exists '-d' d_backup) THEN EACH write '?str_replaceIn_pathList error : could not create d_backup : ' d_backup ; ENDIF ;
FOR pinn WITH pathList DO
   IF (NOT path_exists ("r pinn)) THEN EACH write '' '?str_replaceIn_pathList error, file unknown : ' pinn '' ;
   % keep flag_backup = o (false) because all targeted files are backed up above ;
   ELSE str_replaceIn_path flag_backup d_backup strOld strNew pinn ;
   ENDIF ;
ENDFOR ;
}
5-----5
webPageSite :
file:///media/bill/HOWELL_BASE/Website/index.html - no images...
file:///media/bill/HOWELL_BASE/Website/page%20Publications%20&%20reports.html - no menu - execute embed only
file:///media/bill/HOWELL_BASE/Website/Neil%20Howell/_Neil%20Howell.html - no images, menu, etc
Projects subMenus :
'S&P500 1872-2020, 83y trend' shows 'file:///media/bill/HOWELL_BASE/Website/economics,%20markets/SP500/multi-fractal/1872-2020%20SP500%20index,%20ratio%20of%20opening%20price%20to%20semi-log%20detrended%20price.html'
>> wrong link!
'Pandemics, health, Sun : '
'Fun, crazy stuff' -> change this
'Astro-correlates of health'
'Anthony Peratt - petroglyphs' -

as Title
5-----5
Online checks
Home page - no images?
Olde Code
% Delete past versions of p_temp, so update failures will result in a diff error message ;
IF (path_exists '-f' p_temp) THEN host link 'rm "' p_temp '"' ; ENDIF ;
08********08
22Nov2020 Continue to set up 'rsync website.sh' :
1. d_PROJECTS -> rsync script -> d_webRawe (see p_log="$d_bin""rsync log PROJECTS_to_webRawe.txt")
2. d_webRawe -> rsync script -> d_webSite (see p_log="$d_bin""rsync log webRawe_to_webSite.txt" )
3. check that there are no z_[Archive, Old] in d_webSite - must be very careful NOT to delete these from [PROJECTS, d_webRawe]!!!
4. d_webSite -> fileZilla ftp -> BillHowell.ca
+-----+
Step 1 : d_PROJECTS -> rsync script -> d_webRawe (see p_log="$d_bin""rsync log PROJECTS_to_webRawe.txt")
rsync NO webPages, as this will be done by : link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
>> most had already been transferred perhaps several months ago.
backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/bin/ to /media/bill/SWAPPER/Website - raw/Software programming & code/bin/
Number of files: 381 (reg: 360, dir: 21)
Number of created files: 1 (reg: 1)
sent 51,063 bytes received 115 bytes 102,356.00 bytes/sec
total size is 30,488,826 speedup is 595.74
backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/Lucas/ to /media/bill/SWAPPER/Website - raw/Lucas
Number of files: 21 (reg: 20, dir: 1)
Number of created files: 0
sent 959 bytes received 12 bytes 1,942.00 bytes/sec
total size is 1,104,218 speedup is 1,137.20
backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/Qnial/ to /media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/
Number of files: 1,214 (reg: 1,143, dir: 71)
Number of created files: 0
sent 32,806 bytes received 94 bytes 65,800.00 bytes/sec
total size is 31,690,557 speedup is 963.24
backer_rsync() - 201122 17h36m rsync of /media/bill/PROJECTS/System_maintenance/ to /media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/
Number of files: 1,089 (reg: 921, dir: 168)
Number of created files: 0
sent 33,654 bytes received 183 bytes 67,674.00 bytes/sec
total size is 401,253,276 speedup is 11,858.42
All looks OK? - won't really know until [testing, usage]
+-----+
Step 2. d_webRawe -> rsync script -> d_webSite (see p_log="$d_bin""rsync log webRawe_to_webSite.txt" )
backer_rsync() - 201122 18h20m rsync of /media/bill/SWAPPER/Website - raw/ to /media/bill/HOWELL_BASE/Website/
Number of files: 10,638 (reg: 9,424, dir: 1,213, link: 1)
Number of created files: 319 (reg: 304, dir: 15)
sent 210,681,294 bytes received 12,066 bytes 32,414,363.08 bytes/sec
total size is 10,026,762,188 speedup is 47.59
+-----+
Step 3.
check that there are no z_[Archive, Old] in d_webSite
d_webRawe NEEDs z_[Archive, Old], but I just want to see the status :
$ find "/media/bill/SWAPPER/Website - raw/" -maxdepth 4 -type d -name "*z_Archive"
/media/bill/SWAPPER/Website - raw/Lucas/math Howell/z_Archive
/media/bill/SWAPPER/Website - raw/Lucas/math Lucas/z_Archive
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/z_Archive
/media/bill/SWAPPER/Website - raw/z_Archive
/media/bill/SWAPPER/Website - raw/Hussar/SummerDaze/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/bin/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/bin/email scripts/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/bin/backupper/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/Linux/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/System_maintenance/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/code develop_test/z_Archive
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/Puetz & Borchardt/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/Lies, damned lies, and scientists/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/Voja - education/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/Colin James - KanbanNN, 4VL logic/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/JC-NPS/z_Archive
/media/bill/SWAPPER/Website - raw/Projects - mini/History/z_Archive
/media/bill/SWAPPER/Website - raw/economics, markets/z_Archive
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/z_Archive
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/Fibonacci mirror/z_Archive
$ find "/media/bill/HOWELL_BASE/Website/" -maxdepth 4 -type d -name "*z_Archive"
/media/bill/HOWELL_BASE/Website/Projects - mini/Puetz & Borchardt/z_Archive
$ find "/media/bill/HOWELL_BASE/Website/" -maxdepth 4 -type d -name "*z_Old"
/media/bill/HOWELL_BASE/Website/Projects - mini/History/Temple - Egyptian Dawn/z_Old
/media/bill/HOWELL_BASE/Website/Software programming & code/System_maintenance/FireFox/z_Old
/media/bill/HOWELL_BASE/Website/economics, markets/SP500/z_Old
>> I deleted these!
+-----+
Step 4. d_webSite -> fileZilla ftp -> BillHowell.ca
I left it running while I went to visit Adrian. Seems to have uploaded.
08********08
22Nov2020 Trivial change to symbols to align better : webPageRawe, d_webRawe
10:00 'urls extern fails.txt' - fixes :
3/4 are a Financial Post series on climate deniers, eg :
HTTP/1.1 404 Not Found !!! http://www.canada.com/nationalpost/financialpost/comment/story.html?id=2271ac23-6895-4789-9da0-6b28968b8d15
I should remove them all and simply link to Lawrence Solomon's book
5 are amazon.com links - leave for now
05-----05
webURLs_extract - p_internURL_fails : pgPosns [extract to p_pgPosnURLs, remove from p_internURL_fails]
Untested :
% move all "pgPosn" links in p_internURLs to p_pgPosnURLs ;
host link 'grep "#" "' p_internURLs '" >>"' p_pgPosnURLs '"' ;
host link 'grep --invert-match "#" "' p_internURLs '" >"' p_temp1 '"' ;
host link 'mv "' p_temp1 '" "' p_internURLs '"' ;
>> watch out for problems with this!!
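The same pgPosn split can be tried in plain bash before trusting the QNial host commands - a sketch only (the pgPosn & temp file names are my guesses; the intern list path is per the ramdisk notes below) :
+.....+
# split "#" (pgPosn) links out of the intern list, as the three host commands above do
p_internURLs="/media/bill/ramdisk/urls intern list.txt"
p_pgPosnURLs="/media/bill/ramdisk/urls pgPosn list.txt"
p_temp1="/media/bill/ramdisk/urls temp1.txt"
grep "#" "$p_internURLs" >>"$p_pgPosnURLs"
# grep exits non-zero when nothing survives the filter, so && skips the mv
# rather than truncating the intern list to an empty file
grep --invert-match "#" "$p_internURLs" >"$p_temp1" && mv "$p_temp1" "$p_internURLs"
+.....+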
05-----05
fix anomalous first line of webPageSites :
d_webRawe 'page Publications & reports.html' gives "' reports.html " at the top of the web-page
d_webRawe 'Solar modeling and forecasting/_Solar modeling & forecasting.html' gives "' forecasting.html " at the top of the web-page (like earlier cases)
>> looks OK now, I think I fixed it yesterday
05-----05
failed links - Maybe I need to rsync files?? :
Randell Mills :
/media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/
/media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/Howell - review of Holverstott 2016 Hydrino energy.pdf
Social media :
/media/bill/HOWELL_BASE/Website/Social media/Howell 110902 - Systems design issues for social media.pdf
/media/bill/HOWELL_BASE/Website/Social media/Howell 111006 - Semantics beyond search.pdf
/media/bill/HOWELL_BASE/Website/Social media/Howell 111230 - Social graphs, social sets, and social media.pdf
Paul Vaughan :
/media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 Solar-Terrestrial Resonance, Climate Shifts, & the Chandler Wobble Phase Reversal.pdf
/media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF
>> Yes, seems like problems after moving directories around. [Randell Mills, Social media] were moved into the mini-projects subDir for both webPage[Rawe, Site]
>> WAIT to set up rsync to update the directories' contents
>> Do it manually BEFORE setting up rsync!!
>> I must correct the webPageRawes with links
05-----05
More problems :
mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do
S&P500 1872-2020, 83y trend - goes to the covid-19 webPage!!
COVID-19 - goes to S&P500
>> for some later date - maybe monthly rsync-to-web
05-----05
Olde code
02--02
from link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf' :
# subStr_in_str 'http://www.billhowell.ca/' 'HTTP/1.1 400 Bad Request !!! http://www.billhowell.ca/Bill Howells videos/150525 Icebreaker unchained - We should have lost World War II/'
# cmd := link 'grep "Chandler Wobble" "' p_internURLs '"'
qnial> host cmd
/media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 Solar-Terrestrial Resonance, Climate Shifts, & the Chandler Wobble Phase Reversal.pdf
qnial> cmd := link 'grep "" "' p_internURLs '"'
>> OK, lists all files
qnial> cmd := link 'grep "#" "' p_internURLs '"'
qnial> host cmd
?Invalid argument
qnial> cmd := link 'grep "\#" "' p_internURLs '"'
grep "\#" "/media/bill/ramdisk/urls intern list.txt"
qnial> host cmd
?Invalid argument
08********08
21Nov2020 22:07
02--02
intern fails :
Howell_photo_Nov05_light.jpg
/media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 Solar-Terrestrial Resonance, Climate Shifts, & the Chandler Wobble Phase Reversal.pdf
/media/bill/HOWELL_BASE/Website/Paul L Vaughan/Vaughan 120324 The Solar Cycles Footprint on Terrestrial Climate.PDF
/media/bill/HOWELL_BASE/Website/Personal/121225 Howells recent changes - career and location.pdf
/media/bill/HOWELL_BASE/Website/Projects - mini/Howells projects.ods
/media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/
/media/bill/HOWELL_BASE/Website/Randell Mills - hydrinos/Howell - review of Holverstott 2016 Hydrino energy.pdf
/media/bill/HOWELL_BASE/Website/Social media/Howell 110902 - Systems design issues for social media.pdf
/media/bill/HOWELL_BASE/Website/Social media/Howell 111006 - Semantics beyond search.pdf
/media/bill/HOWELL_BASE/Website/Social media/Howell 111230 - Social graphs, social sets, and social media.pdf
02--02
errors :
Ignoring internal page links, eg : !!linkError!!corona virus/#Corona virus models
and Conference Guide links, eg : !!linkError!!Neural nets/Conference guides/Author guide website/Author guide.html
>> I removed these "Conference guides" paths from p_allFileList, but they will be regenerated if I don't change :
Note that many use small caps, not title case :
/media/bill/SWAPPER/Website - raw/Software programming & code/bin/conference guides - format html.sh
/media/bill/SWAPPER/Website - raw/Software programming & code/bin/conference guides - remove emails.sh
>> most important is to modify webSite_extract_pathsSubDirsFnames so they won't be included.
>> STOP!! wrong - these are shell scripts, numbskull. Look at this later.
These are left :
!!linkError!!
!!linkError!!Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Scenes/
!!linkError!!Calendar.odp
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html
!!linkError!!Cool emails/
!!linkError!!diversity_member/people.odt
!!linkError!!Howell - Are we ready for global cooling.pdf
!!linkError!!influenza/Howell - influenza virus.html#Astronomical correlates of pandemics
!!linkError!!influenza/Howell - influenza virus.html#Howell - USA influenza [cases, deaths] alongside [sunspots, Kp index, zero Kp bins]
!!linkError!!influenza/Howell - influenza virus.html#Influenza pandemics - Tapping, Mathias, and Surkan (TMS) theory
!!linkError!!influenza/Howell - influenza virus.html#Is the effectiveness of vaccines over-rated?
!!linkError!!influenza/Howell - influenza virus.html#Quite apart from the issue of the benefits of vaccines
!!linkError!!influenza/Howell - influenza virus.html#Rebuttals of the [solar, disease] correlation
!!linkError!!International Neural Network Society.JPG
!!linkError!!LibreCalc bank account macro system.txt
!!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
!!linkError!!National Post.jpg
!!linkError!!Nial Systems Limited.JPG
!!linkError!!Paul L Vaughan/Vaughan 120324 The Solar Cycle's Footprint on Terrestrial Climate.PDF
!!linkError!!Puetz greatest of cycles/
!!linkError!!Software programming & code/
!!linkError!!Software programming & code/bin/bin - Howell's web-page.html
!!linkError!!Software programming & code/Qnial/MY_NDFS/???
!!linkError!!Software programming & code/Qnial/MY_NDFS/fileops.ndf
!!linkError!!Software programming & code/Qnial/MY_NDFS/MindCode/
!!linkError!!Software programming & code/Qnial/MY_NDFS/video production/
!!linkError!!Software programming & code/Qnial/MY_NDFS/website urls.ndf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Array Theory and the Design of Nial.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/Design of QNial V7.pdf
!!linkError!!Software programming & code/Qnial/Qnial_bag/docs/V7 QNial Dictionary.html
!!linkError!!Software programming & code/System_maintenance/
!!linkError!!S&P 500 Shiller-forward PE versus 10y US Treasury bond rates.jpg
>> some had been corrected BEFORE I reverted to the older '201117 17h00m21s backups' webPages
>> I will have to repeat that work
02--02
extern fails :
3/4 are a Financial Post series on climate deniers, eg :
	HTTP/1.1 404 Not Found !!! http://www.canada.com/nationalpost/financialpost/comment/story.html?id=2271ac23-6895-4789-9da0-6b28968b8d15
I should remove them all and simply link to Lawrence Solomon's book
5 are amazon.com links - leave for now
08********08 21Nov2020
05-----05
21:37 qnial> webSite_doAll
>> no help for [extern, intern] links, and I should have known that
So why aren't they being [save, count]ed ? Just run : webURLs_extract
>> ramdisk has good files :
	urls intern list.txt
	urls extern list.txt
So how did I [lose, delete] them?
urls_check :
	EACH path_delete p_list p_bad p_OKK ;
>> brilliant - delete the p_lists before they can be used (idiot!)
urls_check Change :
+.....+
EACH path_delete p_list p_bad p_OKK ;
+.....+
To :
+.....+
EACH path_delete p_bad p_OKK ;
+.....+
qnial> webSite_link_counts
	/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 21Nov2020
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+--+------------+
|56|errors list |
+--+------------+
|44|extern fails|
+--+------------+
|0 |howell list |
+--+------------+
|10|intern fails|
+--+------------+
Unknowns - I haven't written code to really show [OK, fail] :
+--+-----------+
|2 |mailto list|
+--+-----------+
|80|pgPosn list|
+--+-----------+
OKs - these links have been shown to work :
+---+---------+
|88 |extern OK|
+---+---------+
|238|intern OK|
+---+---------+
[fail, unknown, OK, total] counts :
+---+-------------+
|110|failed links |
+---+-------------+
| 82|unknown links|
+---+-------------+
|326|OK links     |
+---+-------------+
|518|total        |
+---+-------------+
AWESOME!!! Stupid mistakes cost me 3 days - getting tired.
[errors, intern fails] are the key - now it is FAR EASIER to start work with confidence!!!
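For the record, what urls_check boils down to, per the 18Nov2020 notes further down (intern links checked against the file system, extern links via curl headers, accepting [200, 300, 301, 302]). A minimal bash sketch, NOT the .ndf code itself - the [list, fails, OK] fnames are the ones this log already uses, everything else is assumed :
+.....+
d_webWork='/media/bill/SWAPPER/Website - raw/webWork files'
# intern : does the target exist on disk?  -e rather than -f, so subDir links (trailing /) also pass
while IFS= read -r url ; do
	if [ -e "$url" ] ; then echo "$url" >>"$d_webWork/urls intern OK.txt"
	else echo "$url" >>"$d_webWork/urls intern fails.txt" ; fi
done <'/media/bill/ramdisk/urls intern list.txt'
# extern : does the curl header carry an acceptable status?
while IFS= read -r url ; do
	if curl -s -I --max-time 10 "$url" | grep -Eq 'HTTP[^ ]* (200|30[012])'
	then echo "$url" >>"$d_webWork/urls extern OK.txt"
	else echo "$url" >>"$d_webWork/urls extern fails.txt" ; fi
done <'/media/bill/ramdisk/urls extern list.txt'
wc -l "$d_webWork/urls intern fails.txt" "$d_webWork/urls extern fails.txt"    # raw counts for the summary table
+.....+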
05-----05
17:53 qnial> webSite_doAll
02--02
no counts for intern or extern
webPageSite : OK index.html
Maybe the problem is in webURLs_extract :
	EACH path_delete p_errorsURLs p_externURLs p_howellURLs p_internURLs p_mailtoURLs p_pgPosnURLs p_temp1 ;
>> comment out and re-run
>> this did NOT fix the problem!??
webURLs_extract Change :
+.....+
ELSEIF (in `# linker) THEN writefile fpos linker ;
+.....+
To :
+.....+
ELSEIF (= `# (first linker)) THEN writefile fpos linker ;
+.....+
>> all intern links will have '[#=; backtrack ;=#]'
>> The new approach won't capture [extern, intern] links to positions within the webPage
>> That can be done by later processing of !!linkError!!
02--02
no Past & future worlds.html - backtrack fails in some bodyLinks : ' future worlds.html @pageTop
The following were problems earlier :
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page
	subMenu Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases)
How many fnames have `& ?
qnial> ('&' EACHRIGHT subStr_in_str allFnamesSortedByFname) sublist allFnamesSortedByFname
>> HUGE number of files
qnial> EACH write (('&' EACHRIGHT subStr_in_str htmlFnamesSortedByFname) sublist htmlPathsSortedByFname)
/media/bill/SWAPPER/Website - raw/Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html
	>> OK
/media/bill/SWAPPER/Website - raw/Solar modeling and forecasting/_Solar modeling & forecasting.html
	>> OK
/media/bill/SWAPPER/Website - raw/economics, markets/Long term market indexes & PPI 0582.html
	>> ' PPI 0582.html
/media/bill/SWAPPER/Website - raw/page Publications & reports.html
	>> ' reports.html
/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html
	>> ' future worlds.html
/media/bill/SWAPPER/Website - raw/economics, markets/SP500/PE Schiller forward vs 10yr Tbills/S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html
	>> ' P 500 Shiller-forward PE versus 10y Treasury bond rates.html
This really looks like a problem with the ampersand!! But not all fail?
05-----05
Later look at more problems :
	subMenu Projects failed links - Randell Mills, S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500
	I didn't recheck the Projects subMenu
24************************24 20Nov2020
subDirs sometimes work, sometimes don't! This is new.
05-----05
All failures were at 3 subDirs down
test#1
qnial> a := '170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html'
qnial> (a EACHRIGHT subStr_in_str htmlPathsSortedByPath) sublist htmlPathsSortedByPath
+-------------------------------------------------------------------------------------------------------------
|/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past &
+-------------------------------------------------------------------------------------------------------------
------------------+
future worlds.html|
------------------+
>> OK, plus the final web-page works
>> So is it the internalLinks_return_relativePath_test t_standards that are wrong?
>> nyet, although I did make changes so results conform to the correct backtracks
subDirs are being truncated by dropping the front?
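Aside, before digging back into the QNial : GNU coreutils can compute what the relative link SHOULD be, as an independent cross-check of internalLinks_return_relativePath. A sketch (assumes coreutils realpath ; -m lets it work on paths that need not exist) :
+.....+
$ realpath -m --relative-to="/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids" \
	"/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/1872-2020 SP500 index semi-log detrended 1871-1926 & 1926-2020, TradingView.png"
../../economics, markets/SP500/multi-fractal/1872-2020 SP500 index semi-log detrended 1871-1926 & 1926-2020, TradingView.png
+.....+
>> a webPage two subDirs below d_webRaw gets a '../../' prefix - handy for settling arguments with the t_standards.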
I must first make it work in : internalLinks_return_relativePath_test
internalLinks_return_relativePath : Change :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell (fname := (1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) )
THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
>> This looks suspicious?
To :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell (1 + (last findAll_Howell `/ lineList@midIndx)) allFnamesSortedByFname ) )
THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
>> Why did I drop lineList@midIndx? Might be a leftover?
Re-[loaddefs, internalLinks_return_relativePath_test] :
>> Now #2 also fails (!!linkError!!)
>> the problem remains - it seems that the step above is being ignored
To :
+.....+
ELSEIF (NOT isfault (i_fname := find_Howell ((1 + (last findAll_Howell `/ lineList@midIndx)) drop lineList@midIndx) allFnamesSortedByFname ) )
THEN lineList@midIndx := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
+.....+
qnial> EACH (gage shape) allFnamesSortedByFname allPathsSortedByFname
2799 2799
qnial> find_Howell ((1 + (last findAll_Howell `/ '/Past & future worlds.html' )) drop '/Past & future worlds.html') allFnamesSortedByFname
1953
This is a big problem, probably with a simple solution.
setup.ndf :
#] array_findAll_subArray IS OP subArray array - addr of ALL subArray in array, error if not found
array_findAll_subArray IS OP subArray array_to_search
{ LOCAL i_ins i_adds ;
i_ins := subArray EACHRIGHT = array_to_search ;
i_adds := i_ins sublist (tell gage shape array_to_search) ;
IF (isfault i_adds)
THEN fault '?array_findAll_subArray : item not found'
ELSE i_adds
ENDIF }
findAll_Howell IS array_findAll_subArray
Change :
+.....+
i_adds := i_ins sublist (tell gage shape array_to_search) ;
+.....+
To :
+.....+
i_adds := i_ins sublist (tell (gage shape array_to_search)) ;
+.....+
[bye, qnial, loaddefs]
qnial> find_Howell '/Past & future worlds.html' allFnamesSortedByFname
?find_Howell : item not found
qnial> find_Howell 'Past & future worlds.html' allFnamesSortedByFname
1953
>> As always, expected
qnial> (1953 pick allPathsSortedByFname) str_remove_subStr d_webRaw
Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html
>> OK, as it should be. I still can't see a problem.
qnial> find_Howell (solitary 'Past & future worlds.html') allFnamesSortedByFname
?find_Howell : item not found
>> Ouch! I should have seen this
qnial> find_Howell (first solitary 'Past & future worlds.html') allFnamesSortedByFname
1953
>> OK
qnial> 1953 pick allPathsSortedByFname
/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html
>> OK
flag_break it
-->[nextv] link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw)
../../../../economics, markets/SP500/multi-fractal/1872-2020 SP500 index semi-log detrended 1871-1926 & 1926-2020, TradingView.png
>> OK, this is perfect, but this is NOT the first test!!!???
IF (= '...' line) THEN break ; ENDIF ;
internalLinks_return_relativePath Change :
+.....+
liner := line str_remove_subStr '[#=; backtrack ;=#]' ;
+.....+
To :
+.....+
liner := liner str_remove_subStr '[#=; backtrack ;=#]' ;
+.....+
>> Oh boy - I must also fix the t_standards, which depend on the backtrack provided, and not a calculated one
>> Nasty little bug!! hard to see...
# internalLinks_return_relativePath_test example #2 : FAILED - result does NOT match standard
t_input, t_standard, t_result =
+------------+--------+-+-------------------------------------------------------------------------------------------------------------------------+
|../../../../||
+------------+--------+-+-------------------------------------------------------------------------------------------------------------------------+
•
........
30Oct2020 Simple check of internal path to fname-only link.
This is the remaining test error, assuming that all t_standards are correct - which may not be the case, as has continually been so in the past.
>> Nope - the calculation is correct, t_standard was wrong. I added :
	i_test := i_test + 1 ;
	backtrack := '../../../../' ;
	t_name := link '# internalLinks_return_relativePath_test example #' (string i_test) ;
	t_input := backtrack ' WIDTH=90% NAME="1872-2020 SP500 index semi-log detrended"
	' ;
	t_standard := ' WIDTH=90% NAME="1872-2020 SP500 index semi-log detrended"
	' ;
	t_result := internalLinks_return_relativePath t_input ;
	test_comment t_name t_input t_standard t_result ;
	EACH write '........' '30Oct2020 Simple check of "Bill Howells videos
>> should be OK? check current index.html :
>> Yes, it works
webPage has [' reports.html, etc] - this problem is back
02--02
Re-do previous corrections :
#] from 'fileops.ndf' : pathList_backupTo_dir htmlPathsSortedByPath (link d_webRaw 'z_Archive/')
>> OK
#] from 'fileops.ndf' : str_replaceIn_pathList l d_webRaw '!!linkError!!' '' htmlPathsSortedByPath
>> OK?
02--02
from 17Nov2020 Redo 6. : problem with an extra junk line at top of webPage
>> I suspect a recurring problem with the ampersand `& in the fname?
'page Software programming.html'
>> OK already
'Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html' Change :
+.....+
_Charvatova - solar inertial motion [#!: writefile fout '<TITLE> path ' activity.html
+.....+
To :
+.....+
_Charvatova - solar inertial motion
+.....+
'Solar modeling and forecasting/_Solar modeling & forecasting.html' Change :
+.....+
_Solar modeling [#!: writefile fout '<TITLE> path ' forecasting.html
+.....+
To :
+.....+
Solar modeling and forecasting.html
+.....+
Later look at more problems :
	mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page
	subMenu Projects failed links - Randell Mills, S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500
	Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases)
05-----05
Red herring - depther :
qnial> d_webRaw
	/media/bill/SWAPPER/Website - raw/
qnial> d_webSite
	/media/bill/HOWELL_BASE/Website/
>> so for my current setup, depther = 0 for all "webRoot" files
But :
qnial> null reshape 5
5
qnial> null reshape (solitary '../')
+---+
|../|
+---+
qnial> 0 reshape (solitary '../')
So the trick may be to convert a null to 0 - but better yet to set backtrack := './' ?
webPageSite_update, Change :
+.....+
depther_global := 0 ;
IF (OR ('Menu' 'fin Head' 'fin Footer' 'fin footer' EACHLEFT subStr_in_str webPage))
THEN depther := depther_global ;
ELSE depther := (gage shape (`/ findAll_Howell webPage )) - (gage shape (`/ findAll_Howell d_webRaw)) ;
ENDIF ;
backtrack := link (depther reshape (solitary '../')) ;
+.....+
To :
+.....+
depther := (gage shape (`/ findAll_Howell webPage )) - (gage shape (`/ findAll_Howell d_webRaw)) ;
IF (= 0 depther) THEN backtrack := '' ;
ELSE backtrack := link (depther reshape (solitary '../')) ;
ENDIF
+.....+
Or :
+.....+
depther := (gage shape (`/ findAll_Howell webPage )) - (gage shape (`/ findAll_Howell d_webRaw)) ;
backtrack := link (depther reshape (solitary '../')) ;
+.....+
>> I don't think that the 'Or :' will work, but I can learn from the results (see the bash sketch below)
>> Nah - menuHeadFoots are special - they always need to start from root? For [GNU, Creative Commons]?
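The depther arithmetic is easier to see stripped of the QNial reshape edge-cases. A minimal bash sketch of the same rule (paths taken from this log) :
+.....+
d_webRaw='/media/bill/SWAPPER/Website - raw/'
webPage='/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html'
slashCount() { tr -dc '/' <<<"$1" | wc -c ; }     # count of `/ occurrences, like (gage shape (`/ findAll_Howell ...))
depther=$(( $(slashCount "$webPage") - $(slashCount "$d_webRaw") ))
backtrack='' ; for ((i=0 ; i<depther ; i++)) ; do backtrack+='../' ; done
echo "depther=$depther  backtrack=$backtrack"     # depther=2  backtrack=../../ ; depther=0 leaves backtrack empty, matching the null-reshape case
+.....+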
>> Keep the same for now
05-----05
Olde Code
02--02
from fileops.ndf :
# find "$d_Qndfs" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'str_replaceIn_path' "FILE"
>> jillions in fileops.ndf and its [backups, z_Archives]
# find "$d_Qndfs" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'replaceStringIn_path' "FILE"
>> many [z_Archive, backups], plus :
	/media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:629: replaceStringIn_path IS str_replaceIn_path
# find "$d_Qndfs" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'str_executeEmbeds' "FILE"
>> many hits
# sed - selection of separator character based on the search-replace text
# (`/ `# `! `| `@ EACHLEFT subStr_in_str (link strOld strNew)) sublist (`/ `# `! `| `@)
# tests
# $ strOld='pinn_writeExecute_pout'
# $ strNew='str_executeEmbeds'
# $ path='/media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html'
# $ cat "$path" | sed "s#$strOld#$strNew#"
# qnial> str_replaceSubStrIn_path o 'pinn_writeExecute_pout' 'str_executeEmbeds' (link d_webRaw 'Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html')
# EACHRIGHT str_replaceIn_path EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pathList)
# find "$d_webRaw" -maxdepth 3 -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number ':&file-insert &:' "FILE" | sed 's#:.*##' | sort -u
# str_replaceIn_pathList d_webRaw 'pinn_writeExecute_pout' 'str_executeEmbeds' htmlPathsSortedByFname
# newly created operator
# str_changeIn_webRaw '!!linkError!!' '[#=; backtrack ;=#]'
>> this ran very well - no error outputs, and very quickly
>> now to check a couple of files :
/media/bill/SWAPPER/Website - raw/page projects.html:62:
•
>> OK
/media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:99:
>> This was NOT changed!!?? Maybe I had already changed the file above?
>> !!linkError!!S&P 500 Shiller-forward PE versus 10y US Treasury bond rates.jpg not changed in 'S&P 500 Shiller-forward PE versus 10y Treasury bond rates.html'
Why aren't the changes "holding"?
# I already had an operator - adapt it
# str_replaceIn_pathList d_webRaw '!!linkError!!' '[#=; backtrack ;=#]' pathList
# tests
# check EACHALL :
# a b c d := 1 2 3 4
# EACH link (((solitary (1 2 3)) cart (tell 10)))
# qnial> str_replaceIn_dir l ???
# EACHRIGHT pass EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pathList)
# olde code
	result := EACHRIGHT pass EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pList))) ;
	% result := EACHRIGHT str_replaceIn_path EACH link (((solitary (flag_backup strOld strNew )) cart (solitary pList))) ;
24************************24 20Nov2020
from yesterday : I have to restore the last good version of d_webRaw
Double-shit - I stupidly dated the files. NOT easy to restore!!!
Change path_backupDatedTo_dir To path_backupTo_dir
Last set without dates : in '/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups'
just drop ~17 chars - do a few single file renames (see the bash sketch at the end of this entry)
backupDir_restore_dated optr - create dirBackupDated_restoreTo_paths, use link d_webRaw 'z_Archive/201117 17h00m21s backups/'
Problematic :
qnial> dirBackupDated_restoreTo_paths (link d_webRaw 'z_Archive/201119 20h02m56s backups/') htmlPathsSortedByPath
	cp -p "/media/bill/SWAPPER/Website - raw/z_Archive/201119 20h02m56s backups/201119 20h02m56s 0_Big Data, Deep Learning, and Safety.html" "/media/bill/SWAPPER/Website - raw/Bill Howells videos/160901 Big Data, Deep Learning, and Safety/0_Big Data, Deep Learning, and Safety.html"
>> no - this is OK!
Even though the cp commands look legitimate, the copies don't occur, and there are no error outputs. Why?
Long day - take a break.
>> Eureka! Of course the file dates don't change - they are preserved by cp -p!!
OK - re-try the solution to the problem :
qnial> loaddefs link d_Qndfs 'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
>>> loading start : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
>>>>>> loading start : webSite header.ndf
<<<<<< loading ended : webSite header.ndf
?expecting end of block: PATH_BACKUPDATEDTO_DIR D_HTMLBACKUP ; <***> P_LIST P_BAD P_OKK
?undefined identifier: ; EACH URLS_CHECK <***> 'intern' 'extern' ;
<<< loading ended : webSite maintain [menu, header, footer, body] links, TableOfContents.ndf
errors found: 2
>> I just can't see how this occurs.
?expecting end of block: PATH_BACKUPDATEDTO_DIR D_HTMLBACKUP ; <***> P_LIST P_BAD P_OKK
Comment out & re-try, as the problem is probably rooted elsewhere.
02--02
loading urls_check
?expecting end of block: ;
02--02
>> yup, a migrating glitch. Perhaps an undefined variable?
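Re the dated backup fnames above - a minimal bash sketch of the de-dating renames (assumes every backup fname in that dir starts with the 17-char 'YYMMDD HHhMMmSSs ' prefix) :
+.....+
cd '/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups' || exit 1
for f in [0-9][0-9][0-9][0-9][0-9][0-9]' '* ; do
	[ -e "$f" ] || continue          # skip if the glob matched nothing
	mv -n -- "$f" "${f:17}"          # ${f:17} drops the date prefix ; -n refuses to clobber an existing fname
done
+.....+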
chr_apo balance? no curly braces? a bad optr?
From link "$d_Qroot""help - [develop, debug, error list, etc]/0_QNial error list.txt" :
#] ?expecting end of block: ; - 14Jan2020 I checked for missing ';', unbalanced ' - no workee
	>> I had defined an operator with arguments, but it wasn't supposed to have them (IS, not IS OP)
I commented out several expressions :
02--02
%NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ;
% ;
% check each link, save [OK,fail]s ;
%p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ;
% backup files ;
%p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ;
%p_list p_bad p_OKK EACHLEFT path_delete ;
02--02
>> OK, now it loads - when it shouldn't, because the locals aren't defined!?!?!
Now try :
02--02
NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ;
% ;
% check each link, save [OK,fail]s ;
%p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ;
% backup files ;
%p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ;
%p_list p_bad p_OKK EACHLEFT path_delete ;
02--02
>> still holding ...
02--02
NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ;
% ;
% check each link, save [OK,fail]s ;
p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ;
% backup files ;
p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ;
%p_list p_bad p_OKK EACHLEFT path_delete ;
02--02
>> still holding ...
02--02
NONLOCAL d_htmlBackup d_temp d_webRaw d_webSite htmlPathsSortedByPath p_webPageList p_webSiteURLlist p_externURLs p_externURL_fails p_externURL_OK p_internURLs p_internURL_fails p_internURL_OK p_pgPosnURLs p_pgPosnURL_fails p_pgPosnURL_OK ;
% ;
% check each link, save [OK,fail]s ;
p_list p_bad p_OKK := EACH execute (EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_fails' '_OK')) )) ;
% backup files ;
p_list p_bad p_OKK EACHLEFT path_backupDatedTo_dir d_htmlBackup ;
p_list p_bad p_OKK EACHLEFT path_delete ;
02--02
Jackass! should be :
	EACH path_delete p_list p_bad p_OKK ;
>> Now everything loaddefs OK
05-----05
Now try webSite_doAll
Let's see if the new coding averts another disaster, eg webPageSite_update :
	IF flag_backup THEN
		IF (`m in d_htmlBackup)
		THEN webPage path_backupTo_dir d_htmlBackup ;
		ELSE webPage path_backupDatedTo_dir d_htmlBackup ;
		ENDIF ;
	ENDIF ;
>> Wait a minute - I don't need to back up webPageSite, as it is entirely based on webPageRaw. Only webPageRaw needs backup.
Keep it in for now, just in case.
webSite_doAll :
>> urls don't work, I'll have to check the updates later.
02--02
/media/bill/ramdisk/urls intern list.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt
?urls_check file unknown error : p_list
/media/bill/ramdisk/urls extern list.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt
?urls_check file unknown error : p_list
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt': No such file or directory
/media/bill/SWAPPER/Website - raw/webWork files/webSite summary of [fail, unknown, OK,total] links.txt
webSite stats for : www.BillHowell.ca : 20Nov2020
Summary of the number of targeted links by type [external, internal, menu, tableOfContent] and [OK, bad] :
Failures :
+-+------------+
|0|errors list |
+-+------------+
| |extern fails|
+-+------------+
|0|howell list |
+-+------------+
| |intern fails|
+-+------------+
Unknowns - I haven't written code to really show [OK, fail] :
+--+-----------+
|2 |mailto list|
+--+-----------+
|80|pgPosn list|
+--+-----------+
OKs - these links have been shown to work :
++---------+
||extern OK|
++---------+
||intern OK|
++---------+
[fail, unknown, OK, total] counts :
+--+-------------+
| 0|failed links |
+--+-------------+
|82|unknown links|
+--+-------------+
| 0|OK links     |
+--+-------------+
|82|total        |
+--+-------------+
02--02
Fix : ?urls_check file unknown error : p_list
NUTS!!! More important - backups of webAll[Raw, Site] are still dated!?!?
OOPS!! I haven't even written path_backupTo_dir. Delete this in fileops.ndf :
	path_backupTo_dir IS path_backupDatedTo_dir
>> I created path_backupTo_dir
Back to fixing : ?urls_check file unknown error : p_list
No internal [> ; check d_webRaw index.html - no [>
example :
?dirBackup_restoreTo_paths : missing backup fname : webSite [menuHeadFoot, link, TableOfContents, link] tools.
>> it doesn't have dates
>> First, I have to re-check the code - not as advanced as dirBackupDated_restoreTo_paths
qnial> dirBackup_restoreTo_paths o (link d_webRaw 'z_Archive/201117 17h00m21s backups/') htmlPathsSortedByPath
?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s home.html
?dirBackup_restoreTo_paths : missing backup fname : test- home.html
?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s Canadian Solar Workshop 2006 home page.html
?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s CSWProgram.html
?dirBackup_restoreTo_paths : missing backup fname : test- Canadian Solar Workshop 2006 home page.html
?dirBackup_restoreTo_paths : missing backup fname : test- CSWProgram.html
?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s Authors Guide BLOG home.html
?dirBackup_restoreTo_paths : missing backup fname : test- Authors Guide BLOG home.html
?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m24s email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?dirBackup_restoreTo_paths : missing backup fname : 201111 15h32m25s Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?dirBackup_restoreTo_paths : missing backup fname : test- email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?dirBackup_restoreTo_paths : missing backup fname : test- Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?dirBackup_restoreTo_paths : missing backup fname : webSite [menuHeadFoot, link, TableOfContents, link] tools.html
>> OK - these files shouldn't be in that dir
qnial> webSite_doAll
>> same errors :
02--02
/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt
?urls_check file unknown error : p_list
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201120 backups/
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201120 backups/
rm: cannot remove '/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt': No such file or directory
rm: cannot remove '/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt': No such file or directory
/media/bill/ramdisk/urls extern list.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt
/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt
?urls_check file unknown error : p_list
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt': No such file or directory
wc: '/media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt': No such file or directory
02--02
These look good : 'webSite urlList.txt' 'index.html'
Might it be an fname problem for p_list?
Ah hah! d_webSite does not have relative links
webPageSite_update :
	backtrack := link (depther reshape (solitary '../')) ;
>> why has this stopped working?
IF flag_backup THEN
	IF (`m in d_htmlBackup)
	THEN webPage path_backupTo_dir d_htmlBackup ;
	ELSE webPage path_backupDatedTo_dir d_htmlBackup ;
	ENDIF ;
ENDIF ;
>> This seems to be doing the right thing
THEN line := internalLinks_return_relativePath backtrack '
internalLinks_return_relativePath_test
>> [1,3,5,6] out of 12 failed
test# 1
	•
	>> no subDir
test# 3
	•
	>> no subDir
test# 5
	• gnuplot.sh is the tiny bash script used to select gnuplot scripts. My other bash scripts can be found here.
	• gnuplot.sh is the tiny bash script used to select gnuplot scripts. My other bash scripts can be found here.
	>> incomplete subDir
test# 6
	• QNial programming language - Queen's University Nested Interactive Array Language (QNial) is my top preferred programming language for modestly complex to insane programming challenges, along with at least 3 other people in the world. Bash scripts make a great companion to QNial. semi-log formula.ndf is the tiny "program" used to set up the semi-log line fits. More generally : here are many of my QNial programs. Subdirectories provide programs for various projects etc.
	• QNial programming language - Queen's University Nested Interactive Array Language (QNial) is my top preferred programming language for modestly complex to insane programming challenges, along with at least 3 other people in the world. Bash scripts make a great companion to QNial. semi-log formula.ndf is the tiny "program" used to set up the semi-log line fits. More generally : here are many of my QNial programs. Subdirectories provide programs for various projects etc.
	>> This has the right subDirs
subDirs sometimes work, sometimes don't! This is new.
All failures were at 3 subDirs down.
Quit for the night.
05-----05
Oddball stuff
z_Old has a file (conflicting changes) :
$ diff "$d_Qtest""Website updates- tests.ndf" "$d_Qtest""Website updates- tests modfied.ndf"
IF flag_debug THEN write 'loading dirBackupDated_restoreTo_paths' ; ENDIF ;
#] dirBackupDated_restoreTo_paths IS OP d_backup pathList - restore paths listed in a backup (FLAT) dir
# 20Nov2020 initial, based on dirBackup_restoreTo_paths
# pathList fnames - may be a partial list of fnames in d_backup
# error output if an fname in pathList is not in d_backup
# webSite work : Most often, pathList will be webPageList, not p_htmlFileList
IF flag_break THEN BREAK ; ENDIF ;
dirBackupDated_restoreTo_paths IS OP d_backup pathList
{ LOCAL cmd fnameDropList fnamePathList i i_fnameDrop pinnList ;
IF (NOT path_exists ("w d_backup))
THEN EACH write '?dirBackupDated_restoreTo_paths dir unknown error, d_backup : ' d_backup '' ;
ELSE
	% this returns the fname only ;
	cmd := link 'ls -1 "' d_backup '"' ;
	pinnList := host_result cmd ;
	fnameDropList := 17 EACHRIGHT drop pinnList ;
	fnamePathList := EACH path_extract_fname pathList ;
	FOR i WITH (tell (gage shape fnamePathList)) DO
		i_fnameDrop := find_Howell fnamePathList@i fnameDropList ;
		IF (isfault i_fnameDrop)
		THEN write link '?dirBackupDated_restoreTo_paths : missing backup fname : ' fnamePathList@i ;
		ELSE
			% write link 'cp -p "' (link d_backup pinnList@i_fnameDrop) '" "' pathList@i '" ' ;
			host link 'cp -p "' (link d_backup pinnList@i_fnameDrop) '" "' pathList@i '" ' ;
		ENDIF ;
	ENDFOR ;
ENDIF ;
}
# tests with write rather than host :
# dirBackupDated_restoreTo_paths (link d_webRaw 'z_Archive/201119 20h02m56s backups/') htmlPathsSortedByPath
>> works great!
# olde code
	cmd := link 'ls -1 "' d_backup '"' ;
	pinnList := host_result cmd ;
	FOR path WITH pathList DO
		write path ;
		fname := path_extract_fname path ;
		IF (path_exists (pinn := link d_backup fname))
		THEN host link 'cp -p "' pinn '" "' path '" ' ;
		ELSE write link '?dirBackup_restoreTo_paths : missing backup fname : ' fname ;
24************************24 19Nov2020 fix 'urls errors list.txt'
# track down specific !!linkError!!
# host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Civilisations and sun/Howell \- Mega-Life\, Mega-Death and the Sun II\, towards a quasi\-predictive model of the rise and fall of civilisations\.pdf" "FILE" | grep --invert-match "z_Old|z_Archive" '
02--02
/media/bill/SWAPPER/Website - raw/Galactic rays and evolution/_Galactic rays and evolution - life, the mind, civilisation, economics, financial markets.html:69:
/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups/_Climate and sun.html:99:
/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups/_Galactic rays and evolution - life, the mind, civilisation, economics, financial markets.html:69:
/media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html:99:
02--02
>> why the z_Archive?? try without -f
# host link 'find "' d_webRaw '" -maxdepth 4 -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Civilisations and sun/Howell \- Mega-Life\, Mega-Death and the Sun II\, towards a quasi\-predictive model of the rise and fall of civilisations\.pdf" "FILE" | grep --invert-match "z_Old|z_Archive" '
>> Yikes! didn't do the trick. Switch to separate greps for [z_Old, z_Archive] (see the grep note below)
>> change fname to :
	Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations 070128.pdf
	Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf
!!linkError!!Climate - Kyoto Premise fraud/_Kyoto Premise - the scientists aren't wearing any clothes.html
Oops - this is a link : _Kyoto Premise - the scientists aren't wearing any clothes.html~
>> I have no idea where this is!?
>> In any case, no error in the file itself, try :
# host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Climate \- Kyoto Premise fraud/_Kyoto Premise \- the scientists aren' chr_apo 't wearing any clothes\.html" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" '
/media/bill/SWAPPER/Website - raw/page projects.html:62:
•
Changed (got rid of chr_apo)
Also Change :
+.....+
!!linkError!!Lucas's Universal Force for electrodynamics, gravity, mass, etc
+.....+
To :
+.....+
[#=; backtrack ;=#]Lucas/
+.....+
!!linkError!!Howell - Are we ready for global cooling.pdf
# host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!Howell \- Are we ready for global cooling\.pdf" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" '
/media/bill/SWAPPER/Website - raw/page Publications & reports.html:62:
  • Bill Howell "Are we ready for global cooling?" - A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)

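A note on the greps above : plain grep has no '|' alternation, which is why --invert-match "z_Old|z_Archive" silently matched nothing. Either use -E, or -F for the literal !!linkError!! fname patterns (no more backslashing every `- and `,). A sketch of the combined form :
+.....+
# -E : extended regex, so "z_Old|z_Archive" is a real alternation ; filter the paths BEFORE the content grep
find "/media/bill/SWAPPER/Website - raw" -maxdepth 4 -type f -name "*.html" \
	| grep -E --invert-match "z_Old|z_Archive" | tr \\n \\0 \
	| xargs -0 -IFILE grep -F --with-filename --line-number "!!linkError!!" "FILE"
# -F : the pattern is a fixed string, so fnames like "Howell - Are we ready for global cooling.pdf" need no \- \, escapes
+.....+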
Change :
+.....+
!!linkError!!Howell - Are we ready for global cooling.pdf
+.....+
To :
+.....+
[#=; backtrack ;=#]Howell - Are we ready for global cooling.pdf
+.....+
(find 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allFnamesSortedByFname
qnial> find_Howell 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allPathsSortedByFname
?tokens left: FIND 'Howell - Are we ready for global cooling.pdf'
qnial> fnd `# 'my name # is Sue'
?undefined identifier: FND <***> `# 'my name # is Sue'
qnial> find `# 'my name # is Sue'
8
>> what's wrong with the expression?
qnial> (find_Howell 'Howell - Are we ready for global cooling.pdf' allFnamesSortedByFname) pick allPathsSortedByFname
?address
>> It isn't in allPathsSortedByFname
>> Did I even write this? problematic - leave for now & add to ToDos
!!linkError!!International Neural Network Society.JPG
qnial> (find_Howell 'International Neural Network Society.JPG' allFnamesSortedByFname) pick allPathsSortedByFname
?address
!!linkError!!Neural nets/Conference guides/Author guide website/IEEE electronic Copyright form.html
>> This and others should be OK
Do a mass remove of !!linkError!! :
host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | sed "s/!!linkError!!//g" '
>> NUTS!! screen output only
>> NUTS! what a mess! it will re-generate the same problem!! Ah - I have to change the original.
In the replacement text : '&\/\n' ;
- Should have been :
host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "!!linkError!!" "FILE" | grep --invert-match "z_Old" | grep --invert-match "z_Archive" | sed "s/!!linkError!!/[#=; backtrack ;=#]/g" '
I need to write a QNial optr to do this via p_temp
20:02 OK - now webSiteURLs_doAll
>> YIKES! huge delay to check urls, and it looks like a failure of :
	extern [fails, OK] - no fails (eg 404 code)? impossible, as I didn't fix them!??
	intern [fails = 1, OK = 0] - impossible, as the intern list had [19, 343] at the last successful run
	intern list only 86 bytes, 1 line
	[mailto, pgPosn] weren't touched!
>> What went wrong - check a few files!
Try :
qnial> urls_check 'intern'
nah - interns in webRaw don't have '[#=; backtrack ;=#]'!??? what a mess!!
>> SHIT - I goofed up with `;
Change :
+.....+
# str_changeIn_webRaw '!!linkError!!' '[#=; backtrack ;=#]'
+.....+
To :
+.....+
# str_changeIn_webRaw '!!linkError!!' '[\#=; backtrack ;=\#]'
+.....+
I have to restore the last good version of d_webRaw.
Double-shit - I stupidly dated the files - not easy to restore!!!
Change path_backupDatedTo_dir To path_backupTo_dir
Last set without dates : in '/media/bill/SWAPPER/Website - raw/z_Archive/201117 17h00m21s backups'
24************************24 19Nov2020 fix 'urls errors list.txt'
Examples :
!!linkError!!corona virus/#Corona virus models
>> put every link with `# into 'urls pgPosn list.txt'
>> It should already be doing that! :
ELSEIF (subStr_in_str '#' linker) THEN writefile fpos linker ;
>> changed to :
ELSEIF (in `# linker) THEN writefile fpos linker ;
!!linkError!!Neural nets/Conference guides/
>> subDir problem, should now be OK
!!linkError!!Software programming & code/Qnial/MY_NDFS/MindCode/
>> link is wrong : moved dir, or [changed, wrong] fname
!!linkError!!International Neural Network Society.JPG
!!linkError!!LibreCalc bank account macro system.txt
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun II, towards a quasi-predictive model of the rise and fall of civilisations.pdf
!!linkError!!Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, towards a quasi-predictive model of the rise and fall of civilisations.pdf
Some were [lost, not created] :
	!!linkError!!LibreCalc bank account macro system.txt
Some [files, dir]s will NOT have been rsync'd : Qnial, etc etc
>> finish an rsync program that I had started
I need [find-grep-sed]s to [locate, change] affected files, en masse and one by one as I work on them.
1. remove ALL !!linkError!! and re-run ; that should reduce the list by 20-40%
2. [find, fix] individual errors
24************************24 19Nov2020 fix urls summary table
# a := tell 5
# b := reverse tell 10
qnial> (a (EACHLEFT EACHRIGHT subStr_in_str b)) sublist b
?first arg of sublist not boolean
qnial> (a EACHLEFT EACHRIGHT subStr_in_str b) sublist b
?first arg of sublist not boolean
qnial> (a EACHALL subStr_in_str b) sublist b
?first arg of sublist not boolean
qnial> (a ITERATE subStr_in_str b)
?first arg of sublist not boolean
qnial> EACH = (cart a b)
oooooooool
oooooooolo
oooooooloo
oooooolooo
oooooloooo
qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b
+-+-+-+-+-+
|0|1|2|3|4|
+-+-+-+-+-+
# a := 'jim' 'nancy' 'john' 'betty' 'harry'
# b := 'fred' 'jim' 'harry' 'john' 'dan' 'nancy' 'betty' 'floyd'
qnial> (cols (EACH = (cart a b))) EACHLEFT sublist b
++------------+-----+-------------++-----------+------------++-----+
||+----+-----+|+---+|+-----+-----+||+---+-----+|+----+-----+||+---+|
|||fred|nancy|||dan|||harry|floyd||||jim|betty|||john|harry||||dan||
||+----+-----+|+---+|+-----+-----+||+---+-----+|+----+-----+||+---+|
++------------+-----+-------------++-----------+------------++-----+
qnial> (cols (EACH = (cart a b))) EACHLEFT sublist a
++-----+-------+------++-------+-------++-------+
||+---+|+-----+|+----+||+-----+|+-----+||+-----+|
|||jim|||harry|||john||||nancy|||betty||||harry||
||+---+|+-----+|+----+||+-----+|+-----+||+-----+|
++-----+-------+------++-------+-------++-------+
qnial> (rows (EACH = (cart a b))) EACHLEFT sublist a
+-------++-------++------+
|+-----+||+-----+||+----+|
||nancy||||betty||||john||
|+-----+||+-----+||+----+|
+-------++-------++------+
qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b
+-----+-------+------+-------+-------------+
|+---+|+-----+|+----+|+-----+|+-----+-----+|
||jim|||nancy|||john|||betty|||harry|harry||
|+---+|+-----+|+----+|+-----+|+-----+-----+|
+-----+-------+------+-------+-------------+
# b := 'fred' 'jim' 'harry' 'john' 'dan' 'nancy' 'betty' 'floyd' 'eloise'
qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b
+-----+-------+------+-------+-------+
|+---+|+-----+|+----+|+-----+|+-----+|
||jim|||nancy|||john|||betty|||harry||
|+---+|+-----+|+----+|+-----+|+-----+|
+-----+-------+------+-------+-------+
# b := 'fred' 'dan' 'betty' 'jim' 'harry' 'john' 'dan' 'nancy' 'betty' 'floyd' 'harry' 'eloise'
qnial> (rows (EACH = (cart a b))) EACHLEFT sublist b
+-----+-------+------+-------------+-------------+
|+---+|+-----+|+----+|+-----+-----+|+-----+-----+|
||jim|||nancy|||john|||betty|betty|||harry|harry||
|+---+|+-----+|+----+|+-----+-----+|+-----+-----+|
+-----+-------+------+-------------+-------------+
qnial> (rows (EACH subStr_in_str (cart a b))) EACHLEFT sublist b
+-----+-------+------+-------------+-------------+
|+---+|+-----+|+----+|+-----+-----+|+-----+-----+|
||jim|||nancy|||john|||betty|betty|||harry|harry||
|+---+|+-----+|+----+|+-----+-----+|+-----+-----+|
+-----+-------+------+-------------+-------------+
# b := 'fred' 'dan1' 'betty1' 'jim' 'harry1' 'john' 'dan2' 'nancy' 'betty2' 'floyd' 'harry2' 'eloise'
qnial> b := 'fred' 'dan1' 'betty1' 'jim' 'harry1' 'john' 'dan2' 'nancy' 'betty2' 'floyd' 'harry2' 'eloise'
+----+----+------+---+------+----+----+-----+------+-----+------+------+
|fred|dan1|betty1|jim|harry1|john|dan2|nancy|betty2|floyd|harry2|eloise|
+----+----+------+---+------+----+----+-----+------+-----+------+------+
qnial> (rows (EACH subStr_in_str (cart a b))) EACHLEFT sublist b
+-----+-------+------+---------------+---------------+
|+---+|+-----+|+----+|+------+------+|+------+------+|
||jim|||nancy|||john|||betty1|betty2|||harry1|harry2||
|+---+|+-----+|+----+|+------+------+|+------+------+|
+-----+-------+------+---------------+---------------+
05-----05
olde code
	n_fails := sum (fails_i sublist counts) ;
	n_OKKs := sum (unkns_i sublist counts) ;
	n_unkns := sum (OKKs_i sublist counts) ;
# find "$d_Qndfs" -maxdepth 3 -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "merge_2_1" "FILE"
24************************24 18Nov2020
I adapted these. They work.
#] webURLs_extract IS - extract all link urls from a website [external, internal, menu, pagePosn]
#] urls_check IS OP linkType - create sublists of [internal, external] links classed as [fail, OK]
#] check internal with path_exists '-f', externals with curl
Backups gave errors (that's OK)
02--02
qnial> urls_check 'intern'
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls intern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls intern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
qnial> urls_check 'extern'
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern fails.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/urls extern OK.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
02--02
HTTP/1.1 404 Not Found - was classified as OK for an external link. I changed to :
	IF (OR ('200' '300' '301' '302' EACHLEFT in_string curlHdr))
I need to make an easy list of code explanations
Adapt & run website_link_counts :
qnial> see "MERGE_2_1
merge_2_1 IS
OPERATION A { 0 catenate ( A @ 0 A @ 1 ) }
qnial> website_link_counts
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/webSite linkType fnames.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
?path_backupDatedTo_dir file unknown error, OR [path dirBackup] : /media/bill/SWAPPER/Website - raw/webWork files/webSite linkType counts table.txt /media/bill/SWAPPER/Website - raw/z_Archive/201118 backups/
?f_html_reformat file unknown error : p_linkTypeL
Output file :
?invalid left arg in split
?invalid left arg in split
OK - the code mostly works, now to look at the failures
subDirs are failing, perhaps because of the '-f' option for path_exists?
05-----05
remove olde code from urls_check :
	allLinks := strList_readFrom_path p_webSiteURLlist ;
	% construct path for each link ;
	p_list p_clean p_sort := EACH execute ( EACH link (cart ((solitary 'p_') (solitary linkType) (solitary 'URL') ('s' '_clean' '_sort') )) ) ;
	IF (NOT AND (EACHRIGHT file_exists ("r p_list) ("w p_clean)))
	THEN write fault link '?urls_check file unknown error : p_list or d_clean' ;
	ELSE
		flst := open p_list "r ;
		fcln := open p_clean "w ;
		write (2 2 reshape flst p_list fcln p_clean) ;
		WHILE (~= ??eof (line := readfile flst)) DO
			%write line ;  % ;
			IF (OR (= 'howell' linkType) (= 'extern' linkType)) THEN
				% 08Oct2020 I can't think of any cleaning required - really just copy ;
				% in the case of 'howell', the source files will have to be corrected to be internal links only ;
				writefile fcln line ;  % ;
			ELSE
				% extract text of line ;
				colons := findall `: line ;
				p_path := (first colons) take line ;
				IF (= 'pgPosn' linkType)
				THEN d_base := p_path ;
				ELSE d_base := ((last findall `/ p_path) + 1) take p_path ;
				ENDIF ;
				lineTxt := rest drop (second colons) line ;
				lineTxt := (find `" lineTxt) take lineTxt ;
				%write link d_base lineTxt ;
				% crawl up path_link directory reversals, at the same time truncating path_src ;
				WHILE (~= null lineTxt) DO
					IF (= './' (2 take lineTxt)) THEN
						lineTxt := 2 drop lineTxt ;
						%d_base := d_base ;
					ELSEIF (= '../' (3 take lineTxt)) THEN
						lineTxt := 3 drop lineTxt ;
						d_base := ((last front findall `/ d_base) + 1) take d_base ;
					ELSE EXIT 'urls_internal_check' ;
					ENDIF ;
				ENDWHILE ;
				writefile fcln (link d_base lineTxt) ;  % ;
			ENDIF ;
		ENDWHILE ;
		EACH close flst fcln ;
		host link 'sort -u "' p_clean '" >"' p_sort '" ' ;
		host 'sleep 1s' ;
	ENDIF ;
# olde code
	IF (= 'pgPosn' linkType) THEN
		p_link pagePosn := string_cut_by_char `# p_link ;
		IF (NOT file_exists '-f' p_link)
		THEN write fault link '?urls_check file unknown error : p_link' ;
		ELSE ????????????????????
			% check if pagePosition is setup ;
			IF (= null (host_result link 'grep ' chr_apo '' chr_apo ' "' p_link '" '))
			THEN writefile fbad p_link ;
			ELSE writefile fOKK p_link ;
			ENDIF ;
		ENDIF ;
# old code
	writefile fout (first host_result link 'wc -l "' (link d_webRaw fname) '" ') ;
	% tbl_rows := link table_rows (host_result (link 'wc -l "' (link d_webRaw fname) '" '))) ; write table_rows ;
	% picture merge_2_1 tbl_linkCnt tbl_tots ;
	# p_htmlPathsSiteList htmlPathsSiteSortedByPath ;
	cmd := link 'grep -E -i --with-filename --line-number ">"' p_webSiteURLlist '" ' ;
24************************24 17Nov2020 ALL webPages :
+----+
7. check all links with 'website urls.ndf' & grep !!linkError!!
	see link d_Qndfs 'website urls notes.txt'
24************************24 17Nov2020 ALL webPages :
1. backups
qnial> pathList_backupTo_dir htmlPathsSortedByPath d_htmlBackup
>> OK
2.
pathList_change_headFoot htmlPathsSortedByPath
>> This was already done.
>> Check - OK
3. Make sure that [data, optrs] are up-to-date
qnial> lq_fileops
qnial> loaddefs link d_Qtest 'Website updates- tests.ndf'
4. d_webRaw - update links, ensure [full, proper] subDir & backtrack
>> first time use!!!
qnial> webAllRawOrSite_update l "webPageRaw_update
>> seems OK, no faults
5. d_webSite - execute embeds [menu, Head, Foot, body], provide proper relative links
>> first time use!!!
qnial> webAllRawOrSite_update l "webPageSite_update
6. check 5 random webPageSites for results :
	check all main menu items
	check all subMenu items, but for common subMenus, just do once
	check footer - [GNU, Creative Commons] images
List of webPages selected :
	'page Software programming.html'
	'Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html'
	'Neil Howell/_Neil Howell.html'
	'Professional & Resume/_Resumes, work experience.html'
	'Solar modeling and forecasting/_Solar modeling & forecasting.html'
'Solar modeling and forecasting/_Solar modeling & forecasting.html' : [Puetz, Randell Mills] links STILL fail!! : exasperating!
>> these were NOT updated!!! files last changed 27Oct2020!?!?!?
d_webRaw update 14:41 ?? - this wasn't done either!!??
201117 16h25m37s backups - three backups since 16:09, this is the most recent
No use looking at the other webPageSites. For some reason, webAllRawOrSite_update doesn't work. What am I missing?
Redo - this time do pathList_change_headFoot :
OK pathList_change_headFoot htmlPathsSortedByPath
OK webAllRawOrSite_update l "webPageRaw_update
NO webAllRawOrSite_update l "webPageSite_update
>> so the last step above doesn't work. Why?
>> The processing for both was far too fast. Something's wrong.
check for file updates beyond 16:50 :
NO webAllRawOrSite_update l "webPageRaw_update
>> however, there is a new timed backup dir
>> So why doesn't webAllRawOrSite_update work?
webAllRawOrSite_update Change :
+.....+
apply optr_rawOrSite flag_backup webPage ;
+.....+
To :
+.....+
apply optr_rawOrSite (flag_backup webPage) ;
+.....+
Re-try :
qnial> loaddefs link d_Qtest 'Website updates- tests.ndf'
qnial> webAllRawOrSite_update l "webPageRaw_update
>> OK, at least one webPage updated (several more also)
>> runs much slower (only takes 5-10s?)
qnial> webAllRawOrSite_update l "webPageSite_update
>> OK, at least one webPage updated (several more also)
>> runs much slower (only takes 5-10s?)
So that was the problem.
05-----05
Redo 6. check 5 random webPageSites for results :
	check all main menu items
	check all subMenu items, but for common subMenus, just do once
	check footer - [GNU, Creative Commons] images
List of webPages selected :
'page Software programming.html'
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page, but goes to the right page
	perhaps this is due to an apostrophe in the fname, or whatever?
	mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do
	subMenu - QNial link doesn't go to my new web-page
	footer - dir list and [GNU, Creative Commons] images : OK
'Charvatova solar inertial motion & activity/_Charvatova - solar inertial motion & activity.html'
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page
	mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do - same as for the previous webPage
	subMenu Charvatova - gives "' activity.html " at top of web-page (like the earlier case)
	subMenu Projects failed links - Randell Mills, S&P500 1872-2020, 83y trend - goes to covid-19 webPage!! COVID-19 - goes to S&P500
	Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases)
	SAFIRE - electric sun experiment - add as well to the [economics, markets] subMenu
	footer - dir list and [GNU, Creative Commons] images : OK
'Neil Howell/_Neil Howell.html'
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page
	mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do
	subMenu Hosted sites - OK
	footer - dir list and [GNU, Creative Commons] images : OK
'Professional & Resume/_Resumes, work experience.html'
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page
	mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do
	subMenu Professional - OK
	footer - dir list and [GNU, Creative Commons] images : OK
'Solar modeling and forecasting/_Solar modeling & forecasting.html'
	Cdn Solar Forecast - gives "' forecasting.html " at top of web-page (like earlier cases)
	mainMenu 'page Publications & reports.html' gives "' reports.html " at top of web-page
	mainMenu '_Neil Howell.html' - images of paintings don't appear, but [GNU, Creative Commons] do
	I didn't recheck the Projects subMenu
	footer - dir list and [GNU, Creative Commons] images : OK
+----+
7. check all links with 'website urls.ndf' & grep !!linkError!!
24************************24 17Nov2020 New menu items :
05-----05
'Menu projects.html' - rearrange and add missing webPages :
	Lies, Damned Lies
	Pandemics (general link)
	influenza
	corona virus
	suicide
	economics & markets
	SP500-Schiller vs T-bill
	multifractal
	Fibonacci mirror
	Fed Reserve control zone
Problematic web-pages - are due to [move, dirChanges] :
	Randell Mills
	Puetz & Borchardt
	/media/bill/SWAPPER/Website - raw/Projects - mini/Puetz & Borchardt/Howell - comments on Puetz UWS, the greatest of cycles, human implications.odt
	/media/bill/SWAPPER/Website - raw/Projects - mini/Randell Mills/Howell - review of Holverstott 2016 Randell Mills hydrino energy.pdf
05-----05
'Menu neural nets.html'
	Neural nets main link OK - leave the rest of the [subDir, fname] corrections for later, after the url checks
24************************24 17Nov2020 Test webPages :
Post-backups, check all links!
13:43 Is str_splitLftRgtTo_midIndxs_StrList still a problem?
	midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight liner ;
put '[#=; backtrack ;=#]' into 5 files :
	'fin [Footer, footer [Neil Howell, Paul Vauhan, Steven Wickson, Steven Yaskell].html'
qnial> loaddefs link d_Qtest 'Website updates- tests.ndf'
qnial> webPageRaw_update_test l   (do all, as the files aren't being over-written)
qnial> webPageSite_update_test l
Oh shit!
'whole-line-embed footer.txt' Change :
+.....+
[#!: path_insertIn_fHand (link d_webWork 'fin Footer.html') fout ;
+.....+
To :
+.....+
[#!: path_executeEmbedsInsertIn_fHand (link d_webRaw 'webWork files/fin Footer.html') phraseValueList ;
+.....+
qnial> webPageRaw_update_test l   (do all, as the files aren't being over-written)
qnial> webPageSite_update_test l
Can't just do that. I have to repeat pathList_change_headFoot :
	pathList_change_headFoot htmlPathsSortedByPath
webPageRaw_update_test : I don't think that I need to do this, as the subDirs are already put in?
qnial> webPageSite_update_test l
>> OK, [GNU, Creative Commons] images appear
>> BUT! the stupid `; problem is back
pathList_change_headFoot - remove `; ???
Change :
+.....+
cmd := link 'cat "' p_temp3 '" | sed "s|\(.*\)\(.*\)|<TITLE> ' fname ' ; |" >"' p_temp4 '"' ;
+.....+
To :
+.....+
cmd := link 'cat "' p_temp3 '" | sed "s|\(.*\)\(.*\)|<TITLE> ' fname ' |" >"' p_temp4 '"' ;
+.....+
>> I thought that I had already done this - probably directly in the files?
qnial> pathList_change_headFoot htmlPathsSortedByPath
qnial> webPageSite_update_test l
>> I STILL get a '` 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html ;
This appears in 'pathList_change_headFoot temp4.txt' :
	Steven Wickson.html ;
>> OOPS! forgot qnial> lq_fileops
OK, AWESOME! - the basics seem to work well... (maybe older files will have problems?)
05-----05
12:57 With the 3 problem-fixes below, test webPage[Raw, Site]s :
qnial> lq_fileops
qnial> loaddefs link d_Qtest 'Website updates- tests.ndf'
qnial> webPageRaw_update_test l
qnial> webPageSite_update_test l
	1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html
	page Howell - blog.html
	Howell - corona virus.html
	_Lies, damned lies, and scientists.html
	Howell - influenza virus.html
>> NUTS!!! I over-wrote the webPageRaws!!! (fuck-up)
fix 'webPageSite_update' - Change back to previous :
+.....+
host link 'cp "' p_temp '" "' webPage '"' ;
+.....+
To :
+.....+
host link 'cp "' p_temp '" "' p_webSite '"' ;
+.....+
Copy-back from the backup dir
qnial> webPageSite_update_test l
>> no extraneous `;
>> test- files NOT stored on webSite (where are they stored?)
>> failed [GNU, Creative Commons] images
Damned failed [GNU, Creative Commons] images, '1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' :
Perhaps these have moved. Is p_allFileList up to date?
	GNU Public License
	Creative Commons License
	/media/bill/SWAPPER/Website - raw/Creative commons.png - correct subDir fname (both web[Raw, Site])
	/media/bill/SWAPPER/Website - raw/gnu-head-mini.png - correct subDir fname (both web[Raw, Site])
	/media/bill/SWAPPER/Website - raw/logo Bill OReilly.png
	/media/bill/SWAPPER/Website - raw/logo Creative Commons.png
webPageRaw_update - check to make sure that '"' is used as StrRght, not '">' :
	THEN line := internalLinks_return_relativePath backtrack '
>> these are OK
Problematic web-pages - are due to [move, dirChanges] :
	Randell Mills
	Puetz & Borchardt
08:51 (earlier)
05-----05
First run all webPageRaw_update_test with save flag_backup = l, check embeds :
	loaddefs link d_Qtest 'Website updates- tests.ndf'
	webPageRaw_update_test l
>> oops - missing headFoot
fileops.ndf inserted :
	d_webWork := link d_webRaw 'webWork files/' ;
>> now it looks good!
>> must have had this in global varSpace before...
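One more sed gotcha worth recording while fixing these <TITLE> rewrites : in a sed REPLACEMENT string, a bare & stands for the entire matched text. Any fname containing `& ('page Publications & reports.html', '_Solar modeling & forecasting.html', ...) therefore splices the matched line back into the output - which could produce exactly the "' reports.html"-style junk seen at the top of the `& webPages. A minimal demonstration (hypothetical sample line) :
+.....+
$ fname='page Publications & reports.html'
$ echo '<TITLE> junk' | sed "s|.*|<TITLE> $fname |"
<TITLE> page Publications <TITLE> junk reports.html
$ echo '<TITLE> junk' | sed "s|.*|<TITLE> ${fname//&/\\&} |"      # escape & as \& in the replacement
<TITLE> page Publications & reports.html
+.....+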
05-----05
Now, run all webPageSite_update_test with save flag_backup = l, check links
qnial> webPageSite_update_test l
subDirfname = economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html
   >> `; appears at top of webPage?
   >> [menuHeadFoot, body]-links look good (except known problem links)
   >> [GNU, Creative Commons] images don't appear
subDirfname = page Howell - blog.html
   >> same as 1st, but GNU image does appear
subDirfname = Pandemics, health, and the Sun/corona virus/Howell - corona virus.html
   >> same as 1st
subDirfname = Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html
   >> same as 1st, also 'Problems with Science' subMenu links don't work :
      Lies, Damned Lies, and Scientists
      GR turkey, QM fools paradise
subDirfname = Pandemics, health, and the Sun/influenza/Howell - influenza virus.html
   >> same as 1st
05-----05
Fix the `; problem
Can't see where this comes from ; check 'pathList_change_headFoot temp[1-5].txt' :
   temp1 - no
   temp2 - no
   temp3 - no
   temp4 - no
So it must be the temp5 step.
pathList_change_headFoot Change :
+.....+
cmd := link 'cat "' p_temp4 '" | sed "s|\(\[#!: str_executeEmbeds (link d_webWork ' chr_apo 'Menu\)\(.*.html' chr_apo ')\)\(.*\)|\[#!: path_executeEmbedsInsertIn_fHand (link d_webWork ' chr_apo 'Menu\2 phraseValueList ; |" >"' p_temp5 '"' ;
+.....+
To :
+.....+
cmd := link 'cat "' p_temp4 '" | sed "s|\(\[#!: str_executeEmbeds (link d_webWork ' chr_apo 'Menu\)\(.*.html' chr_apo ')\)\(.*\)|\[#!: path_executeEmbedsInsertIn_fHand (link d_webWork ' chr_apo 'Menu\2 phraseValueList |" >"' p_temp5 '"' ;
+.....+
>> removed the `;, but I can't see how this will help
>> the problem is the header, not the footer
'whole-line-embed header.txt' :
   [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack))
   [#!: path_insertIn_fHand (link d_webWork 'fin Head_one.html') fout ;
   [#!: writefile fout ' path ' ;
   [#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout ;
>> but this is similar to the footer, which doesn't have this problem... :
   [#!: full-line executeEmbeds, phraseValueList = (("fout fout)("backtrack backtrack))
   [#!: path_insertIn_fHand (link d_webWork 'fin Footer.html') fout ;
>> So maybe the problem is from : [#!: writefile fout ' path ' ;
>> except the `; comes BEFORE the menus
This is driving me nuts...
output webPageRaw : has ' QNial.html ; ' - that `; will be a problem
'whole-line-embed header.txt' Change :
+.....+
QNial.html ;
+.....+
To :
+.....+
QNial.html
+.....+
05-----05
Fix the [GNU, Creative Commons] images
'fin [Footer, footer [Neil Howell,Paul Vauhan,Steven Wickson,Steven Yaskell].html' Change :
+.....+
Creative Commons License
+.....+
To :
+.....+
GNU Public License
Creative Commons License
+.....+
>> Hopefully, that will do it.
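For later reference, the whole-line-embed machinery above boils down to : lines starting with the embed marker are replaced by generated content, everything else passes through. A toy Python sketch (marker string from this log ; the handler names are hypothetical, not the QNial optrs) :
02--02
def execute_embeds(lines, handlers):
    # A '[#!: ' line names a handler ; its output replaces the line.
    # Ordinary lines are copied through unchanged.
    out = []
    for line in lines:
        if line.startswith('[#!: '):
            key = line[len('[#!: '):].strip()
            out.extend(handlers[key]())   # handler returns replacement lines
        else:
            out.append(line)
    return out

handlers = {'insert footer': lambda: ['<p>footer line 1</p>', '<p>footer line 2</p>']}
print(execute_embeds(['<BODY>', '[#!: insert footer', '</BODY>'], handlers))
02--02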
05-----05
Redirect 'test- *' file creation to z_Archive
This was done even today! - it's not an olde thing
Not in : ['fileops.ndf', 'Website updates- tests.ndf', 'Website updates.ndf', 'Website header.ndf', 'strings.ndf']
Where the heck is it? Check all backup optrs :
   path_backupDatedToSameDir IS OP path - backup dated version of a file in same directory
   path_backupDatedTo_dir IS OP path dirBackup - backup dated version of a file to a specified FLAT dir
   dirBackup_restoreTo_paths IS OP d_backup p_pathList - restore paths listed in a backup (FLAT) dir
   path_backupDated_delete IS OP path - rename a file with a date precursor
Maybe it's in MY_NDFS somewhere other than the ndfs above?
Oops - the ORIGINAL filenames have 'test- ' ; the problem is that they are saved to d_webRaw, not d_Qtest.
Look in 'Website updates.ndf'
>> I can't see where the problem occurs... ???
webPageSite_update Change :
+.....+
p_webSite := link d_webSite subDir fname ;
write link 'subDirfname = ' subDir fname ;
IF (path_exists '-f' p_temp) THEN
   host (link 'diff --width=85 "' p_webSite '" "' p_temp '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ;
   host link 'echo "" >>"' p_log '"' ;
   IF flag_backup THEN host link 'mv "' p_temp '" "' p_webSite '"' ; ENDIF ;
ELSE host link 'echo ?webPageSite_update error, p_temp not created >>"' p_log '"' ;
ENDIF ;
+.....+
To :
+.....+
p_webSite := link d_webSite subDir fname ;
write link 'subDirfname = ' subDir fname ;
IF (path_exists '-f' p_temp) THEN
   host (link 'diff --width=85 "' p_webSite '" "' p_temp '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ;
   host link 'echo "" >>"' p_log '"' ;
   IF flag_backup THEN host link 'mv "' p_temp '" "' webPage '"' ; ENDIF ;
ELSE host link 'echo ?webPageSite_update error, p_temp not created >>"' p_log '"' ;
ENDIF ;
+.....+
>> Hopefully, that will do it.
24************************24
16Nov2020 webPageSite_update_test - check actual change in webPageSite
Note: there STILL seems to be a subDir problem with menus? Just run & check.
qnial> webPageSite_update_test l
main menu - all links are OK
Projects : all are OK EXCEPT (as before) :
   Randell Mills- hydrinos
   Puetz - The Greatest of cycles
bodyLinks : All seem OK (not many)
external : I noticed that this didn't work :
   Ben Davidson of Suspicious Observers
Try all 5 test webpages :
   'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html'
   'page Howell - blog.html'
   'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html'
   'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html'
   'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html'
First I have to update the whole-line-embeds.
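The two destination mixups above ([webPage, p_webSite]) are the same accident twice : a temp file moved into the wrong tree. A defensive sketch in Python (stand-in directories from this log ; not the QNial optr) that refuses any publish destination falling back inside the raw tree :
02--02
import shutil
from pathlib import Path

D_WEBRAW  = Path('/media/bill/SWAPPER/Website - raw')
D_WEBSITE = Path('/media/bill/HOWELL_BASE/Website')

def publish(p_temp: Path, subdir_fname: str) -> Path:
    # Destination is ALWAYS under the webSite tree ; refuse anything that
    # would land back in d_webRaw (the 'NUTS!!! I over-wrote' accident).
    dest = D_WEBSITE / subdir_fname
    assert D_WEBRAW not in dest.parents, f'refusing to overwrite raw file : {dest}'
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(p_temp), str(dest))
    return dest
02--02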
>> OK, all tests look good
Backup all targeted webPages :
qnial> pathList_backupTo_dir htmlPathsSortedByPath (link d_webRaw 'z_Archive/')
cp: cannot stat '/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html': No such file or directory
cp: cannot stat '/media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/test- _Lies, damned lies, and scientists.html': No such file or directory
cp: cannot stat '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/test- Howell - corona virus.html': No such file or directory
cp: cannot stat '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/test- Howell - influenza virus.html': No such file or directory
cp: cannot stat '/media/bill/SWAPPER/Website - raw/test- page Howell - blog.html': No such file or directory
>> I deleted the test- webPages
05-----05
qnial> pathList_change_headFoot htmlPathsSortedByPath
test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html
cat: '/media/bill/SWAPPER/Website - raw/economics, markets/SP500/multi-fractal/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html': No such file or directory
test- _Lies, damned lies, and scientists.html
cat: '/media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/test- _Lies, damned lies, and scientists.html': No such file or directory
test- Howell - corona virus.html
cat: '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/test- Howell - corona virus.html': No such file or directory
test- Howell - influenza virus.html
cat: '/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/test- Howell - influenza virus.html': No such file or directory
test- page Howell - blog.html
cat: '/media/bill/SWAPPER/Website - raw/test- page Howell - blog.html': No such file or directory
>> the only failures are noted above - basically only test files
>> so all normal webPages were processed (very quickly!)
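The cp 'cannot stat' noise above comes from backing up paths that no longer exist ; filtering the list first keeps the log clean. A minimal Python sketch (illustrative, not pathList_backupTo_dir itself) :
02--02
import shutil
from pathlib import Path

def backup_to_dir(paths, d_backup):
    # Copy only the paths that still exist into a FLAT backup dir,
    # and return the missing ones for the caller to log.
    d_backup = Path(d_backup)
    d_backup.mkdir(parents=True, exist_ok=True)
    missing = []
    for p in map(Path, paths):
        if p.is_file():
            shutil.copy2(p, d_backup / p.name)
        else:
            missing.append(p)
    return missing
02--02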
24************************24
16Nov2020 webPageSite_update, [str, path]_executeEmbeds
qnial> webPageSite_update_test o
headFoots OK
Menus : backtrack remains a string!?
Home small or sleeping :
bodyLinks : don't even appear
Break for brunch
qnial> webPageSite_update_test o
-------------------------------------------------------------
Break debug loop: enter debug commands, expressions or type:
   resume       to exit debug loop
executes the indicated debug command
current call stack :
   webpagesite_update_test webpagesite_update_output webpagesite_update str_executeembeds path_executeembedsinsertin_fhand path_executeembeds str_executeembeds
-------------------------------------------------------------
-->[stepv] nextv
?.. IF ( not or ( isfault Midindxs ) ( = Null Midindxs ) ) THEN Midlinks := EACH execute Midlinks ; (...)
-->[nextv] str phraseValueList
+------------------------------+----------------------------+
|                              |+------+-------------------+|
|                              ||fout 6|backtrack ?no_value||
|                              |+------+-------------------+|
+------------------------------+----------------------------+
>> Oops, no value for backtrack
Change :
+.....+
path_executeEmbeds IS OP path phraseValueList
{ LOCAL backtrack finn fout line p_temp ;
% ;
% Create p_temp (str), and delete past versions so update failures will result in a diff error message ;
p_temp := executeEmbedsGet_pathTemp ;
IF (path_exists '-f' p_temp) THEN host link 'rm "' p_temp '"' ; ENDIF ;
% ;
finn := open path "r ;
fout := open p_temp "w ;
WHILE (NOT isfault (line := readfile finn)) DO
   IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ;
   writefile fout line ;
ENDWHILE ;
EACH close finn fout ;
p_temp
}
+.....+
To :
+.....+
path_executeEmbeds IS OP path phraseValueList
{ LOCAL backtrack finn fout line p_temp ;
% ;
% Create p_temp (str), and delete past versions so update failures will result in a diff error message ;
p_temp := executeEmbedsGet_pathTemp ;
IF (path_exists '-f' p_temp) THEN host link 'rm "' p_temp '"' ; ENDIF ;
% ;
finn := open path "r ;
fout := open p_temp "w ;
WHILE (NOT isfault (line := readfile finn)) DO
   IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line phraseValueList ; ENDIF ;
   writefile fout line ;
ENDWHILE ;
EACH close finn fout ;
p_temp
}
+.....+
Retry :
qnial> webPageSite_update_test o
>> still : 'backtrack' rather than n*'../'
>> bodyLinks lack subDir!
str_executeEmbeds doesn't put the midLinks back into strList!! I must have dropped it during copy-paste?
Change :
+.....+
midLinks := EACH execute midLinks ;
+.....+
To :
+.....+
strList#midIndxs := EACH execute midLinks ;
+.....+
Retry :
qnial> webPageSite_update_test o
>> OK!! '../../' substituted for 'backtrack'
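The fix above is just 'pass the caller's (phrase value) pairs through instead of rebuilding them locally'. A toy Python rendering of the within-line embed substitution (marker strings from this log ; the dict stands in for phraseValueList) :
02--02
import re

def str_execute_embeds(line: str, phrase_values: dict) -> str:
    # Replace each '[#=; name ;=#]' embed with its value from the
    # caller-supplied environment - the equivalent of phraseValueList.
    def sub(m):
        return str(phrase_values[m.group(1).strip()])   # KeyError ~ '?no_value'
    return re.sub(r'\[#=;(.*?);=#\]', sub, line)

env = {'backtrack': '../../'}
print(str_execute_embeds('<a href="[#=; backtrack ;=#]index.html">', env))
# -> <a href="../../index.html">
02--02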
>> bodyLinks :
05-----05
   • K.F. Tapping, R.G. Mathias, D.L. Surkan, " Pandemics and solar activity" . This is an unpublished, expanded paper of the 2001 paper by the same authors, which is listed in the references below.
05-----05
becomes
05-----05
   • K.F. Tapping, R.G. Mathias, D.L. Surkan, Pandemics and solar activity" . This is an unpublished, expanded paper of the 2001 paper by the same authors, which is listed in the references below.
05-----05
>> '' subStr_in_str str) THEN BREAK ; ENDIF ;
>> this worked within str_executeEmbeds
-->[nextv] "
subDirfname = Pandemics, health, and the Sun/influenza/Howell - influenza virus.html
>> so why isn't it in the output file?
In webPageSite_update :
05-----05
WHILE (NOT isfault (line := readfile finn)) DO
   % first executeEmbeds if present, with (phrase values) pairList ;
   IF (OR ('[#!: ' '[#=; ' EACHLEFT subStr_in_str line)) THEN line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ; ENDIF ;
   % now process links ;
   IF ('>
05-----05
This is BACKWARDS!! (see the order sketch after this entry) Change to :
05-----05
WHILE (NOT isfault (line := readfile finn)) DO
   % process links ;
   IF ('' subStr_in_str str) THEN BREAK ; ENDIF ;
05-----05
Re-Try. Here is the problem :
05-----05
-->[nextv] line
"
-->[nextv] ?.. ( '
[nextv] "
?.. Line := str_executeembeds Line ( ( "fout Fout ) ( "backtrack Backtrack ) )
-->[nextv] "
05-----05
>> missing subDir!?, but at least the '../../' is written to the output now.
internalLinks_return_relativePath isn't putting in the subDir - why?
Move break to internalLinks_return_relativePath :
   IF ('' subStr_in_str line) THEN BREAK ; ENDIF ;
Nuts! - eliminating line because of `#
05-----05
?.. ( or ( Midindxslines_bads EACHLEFT substr_in_str Linelist @ I ) )
-->[nextv] ?.. or ( Midindxslines_bads EACHLEFT substr_in_str Linelist @ I )
-->[nextv] l l
?.. Null
05-----05
Remove '#' from midIndxsLines_bads Change :
+.....+
IF (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@i)) THEN
+.....+
To :
+.....+
IF (OR (= `# (first lineList@i)) (OR (midIndxsLines_bads EACHLEFT subStr_in_str lineList@i))) THEN
+.....+
?.. Linelist @ I := link Backtrack ( ( I_fname pick Allpathssortedbyfname ) str_remove_substr D_webraw )
-->[nextv]
+-+--------++-+--------------------+-+-+-++
| ||"||
+-+--------++-+--------------------+-+-+-++
>> '../../' should NOT be in there - how did this happen?
I added : liner := line str_remove_subStr '[#=; backtrack ;=#]' ;
After much floundering :
?.. Linelist @ I := link Backtrack ( ( I_fname pick Allpathssortedbyfname ) str_remove_substr D_webraw )
-->[nextv] i_fname
2477
-->[nextv]
+-+--------++-+-------------------------------------------+
| ||"||
+-+--------++-+-------------------------------------------+
OK, now subDirs are added
24************************24
16Nov2020 webPageSite_update, [str, path]_executeEmbeds
current issues :
   '[#=; backtrack ;=#]' rather than n*'../'
   bodyLinks lack subDir!
'strings.ndf' Change :
+.....+
strList#midIndxs := EACH execute midLinks ;
+.....+
To :
+.....+
midLinks := EACH execute midLinks ;
+.....+
qnial> flag_break := l
l
qnial> webPageSite_update_test o
subDirfname = Pandemics, health, and the Sun/influenza/Howell - influenza virus.html
>> the break in str_executeEmbeds never occurs, WHY?
   midIndxs strList := str_splitLftRgtTo_midIndxs_StrList '[#=; ' ' ;=#]' strNew ;
-->[nextv] midIndxs strList
+--+----------------------------------------------------------------+
|  |                                                                |
+--+----------------------------------------------------------------+
>> why is midIndxs null?
Run str_splitLftRgtTo_midIndxs_StrList tests, something must have changed?
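On the BACKWARDS fix above (the order sketch promised in that entry) : the within-line '[#=; backtrack ;=#]' embeds can only be resolved after the link-rewriting step has inserted them, so links must be processed first. A toy Python pipeline to make the ordering explicit (function names are illustrative) :
02--02
def rewrite_links(line: str) -> str:
    # Link processing inserts the backtrack embed placeholder...
    return line.replace('href="', 'href="[#=; backtrack ;=#]')

def execute_embeds(line: str, backtrack: str) -> str:
    # ...which the embed step then resolves.
    return line.replace('[#=; backtrack ;=#]', backtrack)

line = '<a href="index.html">'
# correct order : links first, then embeds
print(execute_embeds(rewrite_links(line), '../../'))
# wrong order leaves unresolved placeholders in the output webPage
print(rewrite_links(execute_embeds(line, '../../')))
02--02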
I have to create a test in link d_Qtest 'strings- tests.ndf'
Check str_splitWith_subStr_test : all 12 work EXCEPT the 2 with '[#=; '!!
qnial> str_splitLftRgtTo_midIndxs_StrList_test
#05-----05 str_splitLftRgtTo_midIndxs_StrList_test, Mon Nov 16 10:30:23 2020
# string_splitWith_string_test example 1 : FAILED - result does NOT match standard
   (test-output boxes garbled in this log : the '[#=; ', ' ;=#]' pieces are split out, but the midIndxs list is empty)
qnial> loaddefs link d_Qtest 'strings- tests.ndf'
>>> loading start : strings- tests.ndf
<<< loading ended : strings- tests.ndf
qnial> str_splitLftRgtTo_midIndxs_StrList_test
#05-----05 str_splitLftRgtTo_midIndxs_StrList_test, Mon Nov 16 10:31:04 2020
# string_splitWith_string_test example 1 : FAILED - result does NOT match standard
   (same failure - still no midIndxs)
>> Interesting, missing midIndxs. What happened?
str_splitWith_subStr_test : 8 tests are OK
break in str_splitLftRgtTo_midIndxs_StrList_test
str_splitLftRgtTo_midIndxs_StrList Change :
+.....+
IF (OR EACH OR (null EACHRIGHT = i_heads i_tails))
+.....+
To :
+.....+
IF (OR (null EACHRIGHT = i_heads i_tails))
+.....+
I think the change was to make weird headsTails work.
str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str
{ LOCAL i i_heads i_tails midIndxs splits valids ;
IF flag_break THEN BREAK ; ENDIF ;
splits := link (strRgt EACHRIGHT str_splitWith_subStr (str_splitWith_subStr strLft str) ) ;
midIndxs := tell (gage shape splits) ;
i_heads := (strLft EACHRIGHT = splits) sublist midIndxs ;
i_tails := (strRgt EACHRIGHT = splits) sublist midIndxs ;
IF (OR (null EACHRIGHT = i_heads i_tails))
THEN null (fault '?str_splitLftRgtTo_midIndxs_StrList error : OR[i_heads, i_tails] is null')
ELSE
   valids := 3 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ;
   valids splits
ENDIF
}
str_splitLftRgtTo_midIndxs_StrList Change :
+.....+
valids := 3 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ;
+.....+
To :
+.....+
valids := 1 + (((i_heads + 2) EACHLEFT in i_tails) sublist i_heads) ;
+.....+
>> OK, this works. But it must have "damaged" the other application.
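For my own reference, the contract of str_splitLftRgtTo_midIndxs_StrList : split a string on a (strLft, strRgt) delimiter pair, and return the piece list plus the indices of the 'middle' pieces. A behavioural sketch in Python (not the QNial code, and without its valids offset subtleties) :
02--02
import re

def split_lft_rgt_to_mid_indxs(str_lft: str, str_rgt: str, s: str):
    # Split s so the delimiters become their own list items ; midIndxs
    # are the positions of pieces enclosed by a lft/rgt pair.
    pieces = re.split(f'({re.escape(str_lft)}|{re.escape(str_rgt)})', s)
    mid_indxs = [i for i in range(1, len(pieces) - 1)
                 if pieces[i - 1] == str_lft and pieces[i + 1] == str_rgt]
    return mid_indxs, pieces

mids, pieces = split_lft_rgt_to_mid_indxs('[#=; ', ' ;=#]', 'x [#=; backtrack ;=#] y')
print(mids, [pieces[i] for i in mids])   # -> [2] ['backtrack']
02--02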
Add more str_splitLftRgtTo_midIndxs_StrList_test & re-test ['> - perhaps because midIndxs is a list?!
>> OK, that was that problem. Now to change str_splitLftRgtTo_midIndxs_StrList for ['>
>> OK, now all 5 str_splitLftRgtTo_midIndxs_StrList_test are OK
Double-check other link d_Qtest 'strings- tests.ndf' : I changed outputs of all tests to use :
   test_comment t_name t_input t_standard t_result ;
>> I need to fix the format!
qnial> strings_alltest
link d_Qtest '201116 12h25m44s alltest strings.txt' :
+----+
Summary of test results : /media/bill/PROJECTS/Qnial/code develop_test/201116 12h25m44s alltest strings.txt, date= 201116 12h25m
# str_to_unicodeList_test examples #[1-6] : OK - result matches standard
# string_sub_test examples #[1-13] : OK - result matches standard
# str_splitWith_subStr_test examples #[1-8] : OK - result matches standard
# str_splitLftRgtTo_midIndxs_StrList_test examples #[1-5] : OK - result matches standard
+----+
24************************24
15Nov2020 [str, path]_executeEmbeds
06:20 [develop, test] new code with 'Howell - influenza virus.html'
Key upgrades to :
#] str_executeEmbeds IS OP str phraseValueList - execute embeds in line, return a str
#] path_executeEmbeds IS OP path phraseValueList - execute embeds in a file
New optrs :
#] executeEmbedsGet_pathTemp IS - generate a unique p_temp fname
#] path_executeEmbedsInsertIn_fHand IS OP path phraseValueList - execute embeds, insert in fileHandle
14:34 webPageSite_update_test with 'Howell - influenza virus.html'
02--02
qnial> webPageSite_update_test o
?type error in fault
?type error in fault
?type error in fault
?type error in fault
?type error in fault
p_webSite = /media/bill/HOWELL_BASE/Website/Pandemics, health, and the Sun/influenza/Howell - influenza virus.html
subDir = Pandemics, health, and the Sun/influenza/
02--02
>> NO [embeds, links] survived
>> later : nyet - the backtracks survived as original, no processing ; the whole-line-embeds disappeared entirely
'Website updates.ndf' Change :
+.....+
line := str_executeEmbeds line fout backtrack ;
+.....+
To :
+.....+
line := str_executeEmbeds line (("fout fout)("backtrack backtrack)) ;
+.....+
>> No faults, but no embeds either... internal links no longer in webPage
Check within-line-embeds '[#=; ' - break in str_executeEmbeds
02--02
qnial> webPageSite_update_test o
02--02
>> no break occurred, same error?
24************************24
14Nov2020 fix up tests - a bit confusing
21:27 'Website updates.ndf', Change :
+.....+
line := str_executeEmbeds line fout backtrack ;
+.....+
To :
+.....+
line := str_executeEmbeds line (("fout fout) ("backtrack backtrack)) ;
+.....+
> I probably need a list of paired (("phrase value)...)
16:57 str_executeEmbeds is repeating the footer part multiple times?
Looking at a typical full-line embed, which is not set up properly :
02--02
[#!: str_executeEmbeds (link d_webWork 'fin Head_one.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ;
02--02
Note the recursive use of embedding.
[#!: strList_writeTo_path (str_executeEmbeds (link d_webWork 'fin Head_one.html')) (p_temp := link d_temp 'executeEmbed temp.txt') ; path_insertIn_fHand p_temp fout ;
????????????
16:52 I fixed path_retrieve_subDirFname
$ find "$d_Qroot" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'path_retrieve_subDirFname' "FILE"
   /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:30: path_retrieve_subDirFname IS OP path dirBase - returns (subDir fname) from a path
   /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:245:IF flag_debug THEN write 'loading path_retrieve_subDirFname' ; ENDIF ;
   /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:247:#] path_retrieve_subDirFname IS OP path dirBase - returns (subDir fname) from a path
   /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:252: path_retrieve_subDirFname IS OP path dirBase
   /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:256: THEN fault '?path_retrieve_subDirFname error, dirBase not in path'
   /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:260: THEN fault '?path_retrieve_subDirFname error, fname'
14Nov2020 11:49
$ find "$d_Qroot" -maxdepth 3 -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep -w -i --with-filename --line-number 'd_webDone' "FILE"
05-----05
   /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:181:# Note that menuHeadFoot are NOT [copied, converted] to d_webDone, as they are useless there.
   /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:192: NONLOCAL d_htmlBackup d_webRaw d_webDone ;
   /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:265:# Note that menuHeadFoot are NOT [copied, converted] to d_webDone, as they are useless there.
   /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:276: NONLOCAL d_htmlBackup d_webRaw d_webDone ;
   /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:358: d_htmlBackup := link d_webDone 'z_Archive/' timestamp_YYMMDD_HMS ' backups/' ;
   /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:20: - for testing only (depends on [pinn, d_webRaw, d_webDone])
   /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:22: depther inputs (eg n times '../') are arbitrary - for testing only (depends on [pinn, d_webRaw, d_webDone])
   /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:60:#] - for testing only (depends on [pinn, d_webRaw, d_webDone])
   /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:234:#] depther inputs (eg n times '../') are arbitrary - for testing only (depends on [pinn, d_webRaw, d_webDone])
   /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:400: NONLOCAL d_htmlBackup d_webRaw d_webDone ;
   /media/bill/PROJECTS/Qnial/code develop_test/Website updates- tests.ndf:419: NONLOCAL d_htmlBackup d_webRaw d_webDone ;
05-----05
>> replace d_webDone with d_webSite
***** 13Nov2020 thorough check of [menuHeadFoot, body]links
several menus are not working - even more subMenus fail (maybe most?)
bodylinks are NOT working - the subDirs are missing!
I revamped webPageRaw_update so that no [test, backup] files appear in d_webRaw - they are in d_htmlBackup.
The subDirs are still missing - may not be a problem, as long as this is done for the webSite?
BUT - the code indicates that the full path SHOULD appear!!???
02--02
lineList@i := link backtrack ((i_fname pick allPathsSortedByFname) str_remove_subStr d_webRaw) ;
...
lineList@i := link backtrack ((i_subDir pick allSubDirsSortedBySubdir) str_remove_subStr d_webRaw) ;
02--02
Run a test webPageDone_update to see what happens.
05-----05
Olde code
# 13Nov2020 no longer used
dw_base := d_webRaw ;
ds_base := d_webDone ;
****** 11Nov2020 update to website, check menuHeadFoots
webSite_update etc, etc
>> webSite_update seems to work. From a quick check - most menu items seem to work. A few exceptions were noted :
02--02
top menu - all looks good
subMenu Neural Networks :
   Neural Nets - no works
   MindCode - OK
subMenu Projects :
   MindCode neural network - OK
   Randell Mills- hydrinos - no works
   Puetz - The Greatest of cycles - no works
rest of subMenus look good
02--02
>> a thorough check is needed, including bodyLinks! Do a more thorough check Friday!
****** 11Nov2020 webPage_update_test continued
Idiot - I chopped the " and not the >, so now changed to :
   THEN line := internalLinks_return_relativePath backtrack '
this Directory's listing.
Check d_webSite version : No update? - oops, at present, this goes to 'test- ' versions. Leave it for debugging.
>> test versions don't have menuHeadFoots
Now, run ALL webPage_update_test : check log file, and output files
>> still a problem with ./ eg : You can see these files via this Directory's listing.
>> but only for [1,2]?
backup '1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' by dating,
then rename 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html' to '1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html'
>> This reduces the [known, good] changes from future diff outputs, making it easier to focus on real problems.
05-----05
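The lineList@i expressions above are the whole relative-link recipe : full path of the target, minus the d_webRaw prefix, glued onto n*'../' for the page's depth. The same arithmetic in Python (illustrative, not internalLinks_return_relativePath) :
02--02
def relative_link(target_path: str, page_subdir: str, d_webraw: str) -> str:
    # backtrack = one '../' per directory level the page sits below the root
    depth = page_subdir.rstrip('/').count('/') + 1 if page_subdir else 0
    backtrack = '../' * depth
    return backtrack + target_path.removeprefix(d_webraw)

print(relative_link(
    '/media/bill/SWAPPER/Website - raw/index.html',
    'Pandemics, health, and the Sun/influenza/',
    '/media/bill/SWAPPER/Website - raw/'))
# -> ../../index.html
02--02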
Why is ./ still a problem? : I added './' to midIndxsLines_bads
Removed from internalLinks_return_relativePath :
   % check if ./ ;
   ELSEIF (= './' lineList@i) THEN null ;
Re-run webPage_update_test for only :
   'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html'
>>
05-----05
menuHeadFoot disappearance : str_executeEmbeds
02--02
-->[nextv] str fout backtrack
+---------------------------------------------------------------------------------------------------------------------------+-+---------+
|[#!: pinn_writeExecute_pout (link d_webWork 'fin Head_one.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; |4|../../../|
+---------------------------------------------------------------------------------------------------------------------------+-+---------+
-->[nextv]
?undefined identifier: PINN_WRITEEXECUTE_POUT <***> ( LINK D_WEBWORK
02--02
>> Arrrggghhh! I forgot to change the embeds!!
Create a new fileops.ndf optr :
   str_replaceIn_pathList IS OP flag_backup d_backup strOld strNew f_Pattern pathList
I did a huge amount of fixing :
   created str_replaceIn_pathList
   converted old optr name : str_replaceIn_pathList d_webRaw 'pinn_writeExecute_pout' 'str_executeEmbeds' htmlPathsSortedByFname
24************************24
10Nov2020 webPage_update_test - continue to change, debug
Run (single example for development) : webPage_update_test
oops - blew up
02--02
qnial> webPage_update_test
increasing call stack to 200
increasing call stack to 300
...
Segmentation fault
02--02
>> probably a path error or infinite loop ; [webPage, pout] ARE checked to stop progress, so maybe an infinite loop?
24************************24
10Nov2020
05-----05
Problem with subDir faults - internalLinks_return_relativePath_test tests #[5 6]
More efficient to create a new list allEndSubDirsSortedByFullSubdir
Key changes to internalLinks_return_relativePath_test
>> now all tests work nicely!
05-----05
test webPage_update
# Set of tests :
# htmlFname := 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html'
# htmlFname := 'page Howell - blog.html'
# htmlFname := 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html'
# htmlFname := 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html'
# htmlFname := 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html'
# webPage_update (d_webRaw link htmlFname)
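str_replaceIn_pathList above is a bulk rename-with-backup over a file list. A minimal Python equivalent (f_Pattern omitted for brevity ; illustrative, not the fileops.ndf optr) :
02--02
import shutil
from pathlib import Path

def str_replace_in_pathlist(flag_backup, d_backup, str_old, str_new, path_list):
    # For each file containing strOld : optionally save a copy to the
    # FLAT backup dir, then rewrite the file with strOld -> strNew.
    for p in map(Path, path_list):
        text = p.read_text(errors='replace')
        if str_old not in text:
            continue
        if flag_backup:
            Path(d_backup).mkdir(parents=True, exist_ok=True)
            shutil.copy2(p, Path(d_backup) / p.name)
        p.write_text(text.replace(str_old, str_new))

# usage, mirroring the rename above :
# str_replace_in_pathlist(True, '/tmp/bkup', 'pinn_writeExecute_pout', 'str_executeEmbeds', html_paths)
02--02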
'page Howell - blog.html'
>> IMG SRC problem - it goes to ">, but should only go to the next "
Change :
+.....+
IF ('' line ;
+.....+
To :
+.....+
IF ('>
+.....+
>> one step, but NO backtracks & subDir not provided for
>> ALL internalLinks_return_relativePath_test are OK.
>> But now for ALL test-output files : no menuHeadFoot
' '
[nextv] line
?.. Midindxs Linelist := str_splitlftrgtto_midindxs_strlist Strleft Strright Liner
-->[nextv] lineList
?no_value
-->[nextv]
++-------------------------------------------------------------------------+
||?str_splitLftRgtTo_midIndxs_StrList error : OR[i_heads, i_tails] is null|
++-------------------------------------------------------------------------+
fix str_splitLftRgtTo_midIndxs_StrList to use " = strRgt
Change :
+.....+
valids := 1 + (((i_heads + 2) EACHLEFT in i_tails) sublist i_heads) ;
+.....+
To :
+.....+
valids := 1 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ;
+.....+
Also - remove " from strRght. Change str_splitLftRgtTo_midIndxs_StrList :
+.....+
valids := 2 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ;
+.....+
To :
+.....+
valids := 3 + (((i_heads + 4) EACHLEFT in i_tails) sublist i_heads) ;
+.....+
Remove " from strLeft strRght in 'Website updates- tests.ndf'
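The IMG SRC bug above is a delimiter choice : scanning to '">' overshoots whenever the tag has more attributes after the closing quote, while '"' alone stops at the end of the path. A small Python check (illustrative) :
02--02
def img_src(line: str) -> str:
    # Take the text between 'SRC="' and the NEXT '"' - not '">' -
    # so trailing attributes don't get swallowed into the link.
    start = line.index('SRC="') + len('SRC="')
    return line[start:line.index('"', start)]

print(img_src('<IMG SRC="pic.png" WIDTH=90%>'))   # -> pic.png
02--02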
Re-run : internalLinks_return_backupSubDirFnames_test :
02--02
# internalLinks_return_backupSubDirFnames_test example #4 : FAILED - result does NOT match standard
   • gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.
   • gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.
02--02
>> now the ./ doesn't work - oops, I had forgotten to add it to internalLinks_return_backupSubDirFnames
internalLinks_return_relativePath_test :
02--02
# internalLinks_return_relativePath_test example #9 : FAILED - result does NOT match standard
WIDTH=90% NAME="1872-2020 SP500 index semi-log detrended"

02--02
>> most files are OK
>> how did this get the wrong file???? Oops - I mixed up the standard with another example.
>> Now ALL tests are OK
If possible, I need to consolidate :
   internalLinks_return_backupSubDirFnames - This is a "shorthand" version for editing, '[#=; backtrack ;=#]' - used by webPage_convertBodyLinks
   internalLinks_return_relativePath - This is the "full" version for updating the webSite - used by webPage_update
For : internalLinks_return_relativePath_BodyLinks_test
   t_input := '[#=; backtrack ;=#]' blah blah
OK - more changes to consolidate [internalLinks_return_backupSubDirFnames, internalLinks_return_relativePath]
Re-run :
internalLinks_return_relativePath_BodyLinks_test :
02--02
[3,5,6,8,9] failed, but [3,5,6,9] are due to incorrect standards, given my current "full path" approach
- I fixed the standard response [3,5,6,9] - now the full '>
All are OK now
internalLinks_return_relativePath_test :
>> [1-12] all are OK
05-----05
Olde code
# OUTMODED! now outputs go into the same d_webRaw subDir as the webPage
IF flag_debug THEN write 'loading webPage_update_test' ; ENDIF ;
#] webPage_update_test IS - Check for proper processing of embedded executables
# These tests cannot be run from d_Qtests : I must create "test" output files as standards in d_webSite
# uncomment to run one test (insert line below, remove extra line for next test definition below)
webPage_update_test IS OP webPage
{ LOCAL comments flog i_test p_inn p_log p_std ;
flag_screenOut := o ;
p_log := link d_Qtest 'webPage_update_test log.txt' ;
path_backupDated_delete p_log ;
flog := open p_log "a ;
flog EACHRIGHT writefile '********' (link 'webPage_update_test, ' timestamp_DDMMMYYYY_HMS) '' ;
close flog ;
i_test := 0 ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_test_output flag_screenOut i_test p_inn p_std comments ;
}
# webPage_update_test_bag
webPage_update_test IS
{ LOCAL comments flog i_test p_inn p_log p_std ;
flag_screenOut := o ;
p_log := link d_Qtest 'webPage_update_test log.txt' ;
path_backupDated_delete p_log ;
flog := open p_log "a ;
flog EACHRIGHT writefile '********' (link 'webPage_update_test, ' timestamp_DDMMMYYYY_HMS) '' ;
close flog ;
i_test := 0 ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- page Howell - blog.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- page Howell - blog.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_test_output flag_screenOut i_test p_inn p_std comments ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_test_output flag_screenOut i_test p_inn p_std comments ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- Howell - corona virus.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- Howell - corona virus.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_test_output flag_screenOut i_test p_inn p_std comments ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- _Lies, damned lies, and scientists.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- _Lies, damned lies, and scientists.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_test_output flag_screenOut i_test p_inn p_std comments ;
webPage_update_output i_test p_inn p_std comments ;
}
# olde version
IF flag_debug THEN write 'loading internalLinks_return_backupSubDirFnames' ; ENDIF ;
#] internalLinks_return_backupSubDirFnames IS OP strLeft strRight line - returns backupFnameSubDir
#] for an internal link, where SubDir goes into d_webSite
# Fixing links is noble, but perhaps equally important is [de-list, label]ing those that are flawed?
# and labelling "bad links" for manual fixes later ;
# subDirs may be vulnerable to duplicates in different directories? Only the first match is used?
# EACH non-http link should have EITHER '[#=; backtrack ;=#]' or '!!linkError!!', but not both.
# I need to do more for paths with `#, as some can be "rescued". later ...
internalLinks_return_backupSubDirFnames IS OP strLeft strRight line
{ LOCAL i lineList fixIndxs_in fname midIndx midIndxs midLinks n_midLinks path subDir ;
NONLOCAL allPathsSortedByPath allFnamesSortedByFname ;
% ;
midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight line ;
midLinks := lineList#midIndxs ;
n_midLinks := gage shape midLinks ;
fixIndxs_in := n_midLinks reshape l ;
% most lines do not have [
   gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.'
# str_remove_subStr '
   • gnuplot Ive used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directorys listing.' 'hello'
# (solitary 'Past & future worlds.html') EACHLEFT find_howell Allfnamessortedbyfname
# loaddefs link d_Qndfs 'Website updates.ndf'
# olde code removed
(path_extract_dir pinn) (path_extract_fname pinn)
07Nov2020 :
pouter := pout ;
IF (~= stdTmp pouter) THEN pouter := link d_out subDir ; ENDIF ;
depther_global := depther ;
continue := l ;
IF continue THEN ENDIF ;
24************************24
09Nov2020
So, backtracks seem OK, but fname only, no paths!!!
re-run internalLinks_return_relativePath_test :
>> They ALL failed!!! (arghhh!)
quick test :
qnial> 1028 pick allPathsSortedByFname
/media/bill/SWAPPER/Website - raw/Software programming & code/bin/dir_size sum, net test.txt
>> OK
run test on one file
>> internalLinks_return_relativePath flag_break is not encountered for correct fname, no subDir
>> I had this problem before!! reverted to fail
qnial> internalLinks_return_relativePath_test
#05-----05 internalLinks_return_relativePath_test, Mon Nov 9 17:08:26 2020
# internalLinks_return_relativePath_test example #1 : FAILED - result does NOT match standard
t_input, t_standard, t_result =
   (garbled output boxes - the link content was lost in this log)
   • ?op_parameter
Oops!! - left-over for some reason? Change :
+.....+
t_result := internalLinks_return_relativePath l t_input ;
+.....+
To :
+.....+
t_result := internalLinks_return_relativePath t_input ;
+.....+
Are '../../../' being removed? No - I forgot to add that coding.
I applied remove to the entire line of internalLinks_return_relativePath :
   line := line str_remove_subStr '../' ;
   midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight line ;
changed line -> liner so it can be modified
Re-try internalLinks_return_relativePath_test :
>> Nuts - I got the midIndx wrong - it does refer to path
   pathsIndxs := midIndxs + 3 ;
   paths := lineList#pathsIndxs ;
   pathList := null ;
I significantly simplified internalLinks_return_relativePath
I am missing a test for bads, eg from internalLinks_return_backupSubDirFnames :
   IF (OR (midIndxsLines_bads EACHLEFT subStr_in_str path)) THEN fixIndxs_in@i := o ; ENDIF ;
I adapted this.
Re-try internalLinks_return_relativePath_test :
24************************24
09Nov2020 webPage_update_test
After MANY, many changes, run test :
>> OOPS :
   1. 'A HREF="' etc removed
   2. only '../' appears
   3. p_log doesn't work
1. internalLinks_return_relativePath - removed :
   % remove [#= =#] brackets ;
   shaper := gage shape midIndxs ;
   lineList#(midIndxs - 1) := shaper reshape (solitary null) ;
   lineList#(midIndxs + 1) := shaper reshape (solitary null) ;
2. in webPage_update :
02--02
depther_global := 0 ;
IF (OR ('Menu' 'fin Head' 'fin Footer' 'fin footer' EACHLEFT subStr_in_str pinn))
THEN depther := depther_global ;
ELSE depther := (gage shape (`/ findAll_Howell pinn )) - (gage shape (`/ findAll_Howell d_webRaw)) ;
ENDIF ;
02--02
>> but d_Qtests is used, NOT d_webRaw
You could put in something like :
02--02
ELSEIF (d_Qtest subStr_in_str pinn) THEN depther := (gage shape (`/ findAll_Howell pinn )) - (gage shape (`/ findAll_Howell d_webRaw)) ;
02--02
>> but this just complicates everything. Just leave as is, and rely on actual webPage updates to check backtracks.
3. flag_test is currently set in webPage_update_test_output, so it's OK :
   webPage_update l p_inn p_temp_webPageUpdate ;
Re-run webPage_update_test_output with log output and flag_test := o
This allows diff to compare.
Interesting - stray `> gave double-HREF!?!?
   In addition to the video of the presentation, "Howell 161220 Big Data, Deep Learning, Safety.ogv">, I have also
   In addition to the video of the presentation, "Howell 161220 Big Data, Deep Learning, Safety.ogv!!linkError!!">, I have also posted
>> worry about this later - good that '!!linkError!!' flags it.
05-----05
Try all tests
qnial> webPage_update_test
diff: /media/bill/PROJECTS/Qnial/code develop_test/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html: No such file or directory
was missing 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html'
>> copied from p_temp_webPageUpdate
Redo tests, but with flag_screenOut
# htmlFname := 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html'
# webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname)
>> seems OK
>> retried after failures below - still seems to work
# htmlFname := 'page Howell - blog.html'
# webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname)
>> total failure of backtracks
>> wait - root webPages won't have backtracks, but then we should see a full path!!
# htmlFname := 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html'
# webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname)
>> total failure of backtracks
>> [#=; backtrack ;=#] not in webPage? (did I run webPage_convertBodyLinks?)
>> wait - root webPages won't have backtracks, but then we should see a full path!!
# htmlFname := 'Lies, Damned Lies, and Scientists/_Lies, damned lies, and scientists.html'
# webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname)
>> backtrack OK, but fname only, no paths!!!
# htmlFname := 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html'
# webPage_update l (d_webRaw link htmlFname) (d_webSite link htmlFname)
>> backtrack OK, but fname only, no paths!!!
05-----05
So, backtracks seem OK, but fname only, no paths!!!
stop now - supper and preps for FireBoD meeting
24************************24
09Nov2020
05-----05
Problem with backtrack (was due to faulty code, see below) :
# tests - playing with backtrack
# backtrack := link chr_apo '../../../' chr_apo ' ' chr_apo 'banana' chr_apo
# backtrack := link 'pass ' chr_apo '../../../' chr_apo
# backtrack := link chr_apo '../../../' chr_apo
# execute backtrack
# olde code
%strList#midIndxs := (chr_apo EACHRIGHT link strList#midIndxs) EACHLEFT link chr_apo ;
# tests - t_input
t_input := '
   • ' 1 (link 'pass ' chr_apo '../../../' chr_apo) ;
   • ?.. Strlist # Midindxs := EACH execute Strlist # Midindxs
-->[nextv] backtrack
pass '../../../'
-->[nextv] execute backtrack
../../../
-->[nextv] each execute midLinks
+----------------+
|pass '../../../'|
+----------------+
-->[nextv] execute link 'pass ' chr_apo '../../../' chr_apo
../../../
>> why isn't this working?
-->[nextv] backtrack
pass '../../../'
-->[nextv] execute backtrack
../../../
#
02--02
t_input := '
   • ' 1 (link chr_apo '../../../' chr_apo) ;
   • >> close, but no cigar
02--02
t_input := '
   • ' 1 '5' ;
>> works fine ; it's just a string that should remain a string (without quotes) that is a problem
>> after Change :
+.....+
strList#midIndxs := EACH execute strList#midIndxs ;
+.....+
To :
+.....+
strList#midIndxs := EACH execute midLinks ;
+.....+
?.. Strlist # Midindxs := EACH execute Midlinks
-->[nextv]
   (garbled output box - the link content was lost in this log)
02--02
Now try, post-correction :
t_input := '
   • ' 1 '../../../' ;
# str_executeEmbeds_test example #1 : OK - result matches standard
t_input, t_standard, t_result =
+------------------------------------------------------------------------+-+---------+
|   (link content lost)                                                  |1|../../../|
+------------------------------------------------------------------------+-+---------+
05-----05
Remove :
# loaddefs link d_Qtest 'Website updates- tests.ndf'
IF flag_debug THEN write 'loading webPage_update_output' ; ENDIF ;
# webPage_update_output IS - write results to log file
# 07Nov2020 initial
webPage_update_output IS OP i_test p_inn p_std comments
{ LOCAL flog p_temp_BodyLinks ;
NONLOCAL d_webRaw d_webSite ;
p_temp_webPageUpdate := link d_temp 'webPageUpdate temp.txt' ;
p_log := link d_Qtest 'webPage_update_test log.txt' ;
flog := open p_log "a ;
flog EACHRIGHT writefile '........' (link '# webPage_update example #' (string i_test)) (link 'webPage_update_test for : "' (path_extract_fname p_inn) '"') ;
flog EACHRIGHT writefile comments ;
writefile flog 'diff results : ' ;
close flog ;
% pinn_executeEmbeddedTo_pout has the [dir, file] existence checks, so no need here ;
pinn_executeEmbeddedTo_pout p_inn p_temp_webPageUpdate d_webRaw d_webSite ;
host (link 'diff --width=85 "' p_std '" "' p_temp_webPageUpdate '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ;
host link 'echo "" >>"' p_log '"' ;
% 07Nov2020 manually move p_temp_webPageUpdate to p_std as required ;
}
# loaddefs link d_Qtest 'Website updates- tests.ndf'
IF flag_debug THEN write 'loading webPage_update_test' ; ENDIF ;
# webPage_update_test IS - change links in the body of html to the relative format
# This doesn't help at all, as I can't test the links unless a file is in d_webSite!!
# 07Nov2020 test created
webPage_update_test IS
{ LOCAL comments flog i_test p_inn p_log p_std ;
p_log := link d_Qtest 'webPage_update_test log.txt' ;
path_backupDated_delete p_log ;
flog := open p_log "a ;
flog EACHRIGHT writefile '********' (link 'webPage_update_test, ' timestamp_DDMMMYYYY_HMS) '' ;
close flog ;
i_test := 0 ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- page Howell - blog.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- page Howell - blog.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_output i_test p_inn p_std comments ;
}
# webPage_update_test_bag
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- page Howell - blog.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- page Howell - blog.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_output i_test p_inn p_std comments ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_output i_test p_inn p_std comments ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- Howell - corona virus.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- Howell - corona virus.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_output i_test p_inn p_std comments ;
% ;
i_test := i_test + 1 ;
p_inn := link d_Qtest 'test- _Lies, damned lies, and scientists.html convertBodyLinks.html' ;
p_std := link d_Qtest 'test- _Lies, damned lies, and scientists.html update.html' ;
comments := 'This file has many "", no "mailto:". Straightforward.' '29Oct2020 - I still have to do a thorough check of the output, all cases.' ;
webPage_update_output i_test p_inn p_std comments ;
24************************24
08Nov2020 test & fix pinn_writeExecute_pout
05-----05
# Header example :
[#!: pinn_writeExecute_pout (link d_webWork 'fin Head_one.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ;
Howell web - Past and Future Worlds
[#!: pinn_writeExecute_pout (link d_webWork 'fin Head_two.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ;
[#!: pinn_writeExecute_pout (link d_webWork 'Menu.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ;
[#!: pinn_writeExecute_pout (link d_webWork 'Menu Howell videos.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ;
# For all webPages, should change to :
[#!: menuHeadFoot_writeTo_fout (link d_webWork 'fin Head_one.html') fout backtrack ;
Howell web - Past and Future Worlds
[#!: menuHeadFoot_writeTo_fout (link d_webWork 'fin Head_two.html') fout backtrack ;
[#!: menuHeadFoot_writeTo_fout (link d_webWork 'Menu.html') fout backtrack ;
[#!: menuHeadFoot_writeTo_fout (link d_webWork 'Menu Howell videos.html') fout backtrack ;
24************************24
07Nov2020
Abandoned as useless :
created : webPage_update_output IS - write results to log file
pinn_writeExecute_pout - I commented out :
   % 07Nov2020 comment out : IF (~= stdTmp pouter) THEN pouter := link d_out subDir ; ENDIF ; pouter
>> pouter is last line of optr
webPage_update_test on 'test- page Howell - blog.html convertBodyLinks.html' :
>> WRONG!!!, needs FULL subDir!!! most links are the same... what is wrong?
nyet : I think it's because pinn_writeExecute_pout is baffled by having d_webRaw & d_webSite in the same dir?
nyet - useless : I created webPage_update_output to handle test webPage updates.
I removed the 'pouter' return of pinn_executeEmbeddedTo_pout!! (idiot)
02--02
Set of tests : see link d_Qndfs 'file_ops.ndf'
>> the full path of a file is NOT provided, nor are the '../'
Where is "backtrack"? It's in pinn_executeEmbedsTo_pout
I have to find the full path to [fnames, subDirs] in pinn_executeEmbedsTo_pout
Major adaptation from internalLinks_return_backupSubDirFnames
17:38 OK - loaddefs works, but probably not pinn_executeEmbedsTo_pout
Break for the day, do dishes.
24************************24
07Nov2020
05-----05
See how many linkErrors remain :
$ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" | grep --invert-match "z_Archive"
02--02
These ones are known problems (probably never did create the files?) :
/media/bill/SWAPPER/Website - raw/page Howell - blog.html:75:
   •
02--02
These ones shouldn't be a problem, but... : fname problems? :
/media/bill/SWAPPER/Website - raw/page Howell - blog.html:763:
  • >> just recopy path /media/bill/SWAPPER/Website - raw/page Howell - blog.html:888:
  • >> oops, apo /media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/Howell - pandemics and disease blog.html:45:

    >> oops "2020" /media/bill/SWAPPER/Website - raw/page Publications & reports.html:61:
  • Bill Howell "Are we ready for global cooling?" - A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)

/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:177:
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:179:
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:181:
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:194:
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:196:
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/corona virus/Howell - corona virus.html:204:
>> I either fixed the fname, or re-[copy, paste]ed the fname
?subDir not working? :
/media/bill/SWAPPER/Website - raw/page Software programming.html:56:
    /media/bill/SWAPPER/Website - raw/page Software programming.html:12:
  • /media/bill/SWAPPER/Website - raw/index.html:100:
  • Cool emails /media/bill/SWAPPER/Website - raw/index.html:122:
  • Puetz greatest of cycles /media/bill/SWAPPER/Website - raw/index.html:130:
  • System_maintenance /media/bill/SWAPPER/Website - raw/page blogs.html:15:
  • /media/bill/SWAPPER/Website - raw/Bill Howells videos/170930 Past and Future Worlds - a STEM for kids/Past & future worlds.html:18:
   • >> I corrected "Software programming & code/Qnial programming language/" -> "Software programming & code/Qnial/"
>> I corrected several other subDir mistakes, including the requirement for a FULL subDir
>> Is the "multiple subDir" issue manifesting? eg [Cool emails, Scenes,...]
/media/bill/SWAPPER/Website - raw/Lies, Damned Lies, and Scientists/General Relativity is a turkey, Quantum Mechanics is a fools paradise.html:13:I first posted this theme in my review "???".
>> OK - never put in a link. I think this was 'Howell - review of Holverstott 2016 Hydrino energy.pdf'
/media/bill/SWAPPER/Website - raw/index.html:111:
   • Linux bash scripts
>> OK - wrong subDir ; I changed to 'Software programming & code/bin/'
02--02
05-----05
05-----05
Rerun :
qnial> loaddefs link d_Qtest 'Website updates- tests.ndf'
05-----05
qnial> webSite_convertBodyLinks
02--02
?webPage_convertBodyLinks file unknown error, one of :
   /media/bill/SWAPPER/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html
   /media/bill/ramdisk/convertBodyLinks temp.txt
diff: /media/bill/SWAPPER/Website - raw/z_Archive/201107 09h58m28s backups/Nuclear for tar sands 23Sep05.html: No such file or directory
diff: /media/bill/SWAPPER/Website - raw/economics, markets/Nuclear for tar sands 23Sep05.html: No such file or directory
02--02
>> If p_htmlFileList is updated each loaddef, why does 'Nuclear for tar sands 23Sep05.html' even appear? Are z_Archive files included (shouldn't be)?
>> 'Nuclear for tar sands 23Sep05.html' WAS in p_htmlFileList. Why?
Check 'webWork files/201107 09h58m28s webSite_convertBodyLinks log.txt'
02--02
These changes didn't work? For now - manually fix in d_webRaw "active" files by adding [#=; backtrack ;=#] :
webPage : "Past & future worlds.html"
diff results :
  • webPage : "index.html" diff results :
  • Cool emails
  • Linux bash scripts
  • Puetz - greatest of cycles
  • System_maintenance webPage : "page blogs.html" diff results :
  • webPage : "Howell - corona virus.html" diff results : webPage : "page Software programming.html" diff results :

   • >> Check the new d_webRaw html files : Maybe the problem is with "reaching up" the directory path? - maybe look at this later : "Howell - review of Holverstott 2016 Hydrino energy".
webPage : "page Howell - blog.html"
diff results :
  • webPage : "Howell - pandemics and disease blog.html" diff results :

>> OK - they are all good.
02--02
Other problems to fix :
>> why wasn't !!linkError!! (or [#=; backtrack ;=#]) inserted?
>> I manually put in [#=; backtrack ;=#]
02--02
05-----05
>> leave it as a known problem, I added a comment that the link won't work
/media/bill/SWAPPER/Website - raw/page Publications & reports.html:61: Bill Howell "Are we ready for global cooling?" - A short presentation to Toastmasters – Dows Lake, Ottawa, 14Mar06. Needs corrections and comments! (some time later...)

>> I added [#=; backtrack ;=#]
24************************24
06Nov2020 webSite_convertBodyLinks
link d_webRaw 'webWork files/201106 17h48m29s webSite_convertBodyLinks log.txt'
>> most have no diff (no changes); those that do show NO [#=; backtrack ;=#]
>> I must have the diff files switched? - yes, this was fixed
Make the changes "permanent". At this point, webSite_convertBodyLinks does NOT change the original files.
check "_Climate and sun.html" :
But first, change the original file & re-run (I haven't set up a full webSite test). Change :
+.....+
webPage_convertBodyLinks l o webPage ;
+.....+
To :
+.....+
webPage_convertBodyLinks l l webPage ;
+.....+
>> Mostly looks good, and at least subDirs are shortened to the last subDir and have [#=; backtrack ;=#] inserted
>> OOPS, all files gave a diff error, eg :
diff: /media/bill/ramdisk/convertBodyLinks temp.txt: No such file or directory
This moves p_temp_BodyLinks :
IF flag_move THEN host link 'mv "' p_temp_BodyLinks '" "' webPage '"' ; ENDIF ;
so the appropriate diff is between [(link d_htmlBackup fname), webPage]
However, that won't help now that the original htmls have been changed. It's still possible by restoring
24************************24
06Nov2020
05-----05
>> OCH!! tons of '!!linkError!!' This result is wrong - I must have a coding error that generates this problem?
1. reinstate "$d_webRaw""z_Archive/201105 18h37m51s backups/"
2. manual fix of links on webSite
3. redo all 'Website updates- tests.ndf' - find problem
4. fix problem with webPage_convertBodyLinks
05-----05
1. reinstate "$d_webRaw""z_Archive/201105 18h37m51s backups/"
qnial> dirBackup_restoreTo_paths (link d_webRaw 'z_Archive/201105 18h37m51s backups/') p_webPageList
$ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" >"$d_webRaw""webWork files/5 linkerrors.txt"
>> 506 lines affected!
- many subDirs that don't end with `/?
- many moved web-[page, dir]s - maybe [TableOfContent, menuHeadFoot]??
Too many z_Archive :
$ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" | grep --invert-match "z_Archive" >"$d_webRaw""webWork files/5 linkerrors.txt"
>> Now down to "only" 124 problems in 17 html files.
05-----05
2. manual fix of links on webSite
Fix the 17 html files NOW! reduce extra work later.
see commentary in : "$d_webRaw""webWork files/201106 5 linkerrors.txt"
>> Maybe "???".
05-----05
delete redundant directories [d_webSite, www.BillHowell.ca] :
- video [production, active] - already gone
- another subDir, can't remember...
05-----05
3. redo all 'Website updates- tests.ndf' - find problem
internalLinks_return_backupSubDirFnames_test
>> ALL failed! this is an issue!
internalLinks_return_backupSubDirFnames IS OP strLeft strRight line
>> no longer uses d_webRaw
>> OK, now all tests are OK
05-----05
4. fix problem with webPage_convertBodyLinks
webPage_convertBodyLinks_test
qnial> webPage_convertBodyLinks_test
diff: /media/bill/PROJECTS/Qnial/code develop_test/test- HELP.html convertBodyLinks.html: No such file or directory
>> What? that file exists!??? OOPS - an extra space was removed
>> only ONE test was logged : webPage_convertBodyLinks_test for : "test- page Howell - blog.html"
>> OOPS! did I set webPage_convertBodyLinks to repeatedly delete p_log? No - only when a test is initiated, so that's OK.
So why was only one test logged? -> the first test. Yet obviously from the error message, 'test- HELP.html' was run.
p_log is opened in "a (append) mode, and no host commands have `>, just '>>'.
I fixed a problem - should open p_log in BOTH [webPage_convertBodyLinks_test, webPage_convertBodyLinks_output]
webPage_convertBodyLinks_test
>> OK, now all tests run, only one has diff results (i.e. no changes to the others)
........
# webPage_convertBodyLinks example #4
webPage_convertBodyLinks_test for : "test- _Lies, damned lies, and scientists.html"
diff results :
Title, table of condents, copyright
Introduction
Intro - The impossible conclusion, Scientists can't think
logical, and scientific thinking by essentially all scientists
profile
challenged, and when they crumble
and Scientific Thinking?
B2 - Cheating theory and Game theory
B3 - Pre-and-post-Science Philosophies for Thinking
C1 - A Brave new world
C2 - The rise & fall of Enlightenment
C3 - Suggestions for science, policy, and society
D0 - Conclusions
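Aside : the pattern behind these test logs - regenerate the output, diff it against a stored standard, keep only the added lines - is just the bash below. Paths are illustrative (taken loosely from these notes), and $p_log is assumed to already exist :
p_std="/media/bill/PROJECTS/Qnial/code develop_test/test- page Howell - blog webPage_convertBodyLinks.html"
p_out="/media/bill/ramdisk/convertBodyLinks temp.txt"
diff --width=85 "$p_std" "$p_out" --suppress-common-lines | grep '^>' | sed 's/^> //' >>"$p_log"   # log only lines added in p_out
A null result means the generated output matches the standard.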
24************************24
05Nov2020 webSite_convertBodyLinks
Code modified to [capture process p_log, backup files, log diffs]
02--02
% create a new backup directory for every use of webSite_convert, as damage can be VERY time-costly ;
d_htmlBackup := link d_webRaw 'z_Archive/' timestamp_YYMMDD_HMS ' backups/' ;
host link 'mkdir "' d_htmlBackup '" ' ;
% ;
flog := open p_log "w ;
flog EACHRIGHT writefile '********' (link 'webSite_convertBodyLinks, ' timestamp_DDMMMYYYY_HMS) '' ;
close flog ;
FOR webPage WITH (strList_readFrom_path p_webPageList) DO
  fname := path_extract_fname webPage ;
  p_backup := link d_htmlBackup fname ;
  flog := open p_log "a ;
  flog EACHRIGHT writefile '........' (link 'webPage : "' fname '"') 'diff results : ' ;
  close flog ;
  % webPage_convertBodyLinks o webPage d_webRaw ;
  % host (link 'diff --width=85 "' p_backup '" "' webPage '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ;
  host link 'echo "" >>"' p_log '"' ;
ENDFOR ;
02--02
05-----05
qnial> webSite_convertBodyLinks
2_website htmlFileLs.txt :
/media/bill/SWAPPER/Website - raw/Projects - mini/Diversity - ssh site/diversity_public/home.html
/media/bill/SWAPPER/Website - raw/Projects - mini/Solar system/Cdn Solar Forecasting/Canadian Solar Workshop 2006 home page.html
/media/bill/SWAPPER/Website - raw/Projects - mini/Solar system/Cdn Solar Forecasting/CSWProgram.html
/media/bill/SWAPPER/Website - raw/Projects - mini/wordpress site/Authors Guide BLOG home.html
/media/bill/SWAPPER/Website - raw/security/encryption-decryption instructions.html
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial.html
/media/bill/SWAPPER/Website - raw/Solar modeling and forecasting/_Solar modeling & forecasting.html
/media/bill/SWAPPER/Website - raw/Steven H Yaskell/0_Steven H Yaskell.html
/media/bill/SWAPPER/Website - raw/webWork files/4_test Kyoto Premise - the scientists arent wearing any clothes (copy).html
/media/bill/SWAPPER/Website - raw/webWork files/fin organisations.html
201105 17h55m50s webSite_convertBodyLinks log.txt :
webPage : "home.html"
webPage : "Canadian Solar Workshop 2006 home page.html"
webPage : "CSWProgram.html"
webPage : "Authors Guide BLOG home.html"
webPage : "encryption-decryption instructions.html"
webPage : "email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html"
webPage : "Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html"
webPage : "QNial.html"
webPage : "_Solar modeling & forecasting.html"
webPage : "0_Steven H Yaskell.html"
webPage : "4_test Kyoto Premise - the scientists arent wearing any clothes (copy).html"
Only one NOT in 'webSite_convertBodyLinks log.txt' :
/media/bill/SWAPPER/Website - raw/webWork files/fin organisations.html
>> worry about later
05-----05
Now to uncomment :
% webPage_convertBodyLinks o webPage d_webRaw ;
% host (link 'diff --width=85 "' p_backup '" "' webPage '" --suppress-common-lines | grep ' chr_apo '^>' chr_apo ' | sed ' chr_apo 's/^>\ //' chr_apo ' >>"' p_log '"') ;
qnial> webSite_convertBodyLinks
>> OOPS! backups are NOT being made!!
>> The actual backups are done by webPage_convertBodyLinks, which I hadn't activated.
Try with flag_backup, but not yet flag_move
qnial> webSite_convertBodyLinks
>> NO diffs? Maybe the last time I over-wrote the webPages, it was done properly?
05-----05
Big mistake with my find in this section : ignore it all
$ find "$d_webRaw" -maxdepth 3 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" >"$d_webRaw""webWork files/5 linkerrors.txt"
>> using "!!linkError!!" gave jillions of : grep: invalid max count
>> OCH!! tons of '!!linkError!!' This result is wrong - I must have a coding error that generates this problem?
Many are the result of a lack of `/ at the end of a subDir
Maybe due to eliminating d_webRaw as "base dir"!!!???!!!
467 cases in total - I have to sed these out and re-translate!!
find ONLY in the recent backup dir :
$ find "$d_webRaw""z_Archive/201105 18h37m51s backups/" -maxdepth 0 -type f -name "*.html" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '!!linkError!!' "FILE" >"$d_webRaw""webWork files/5 linkerrors.txt"
Nyet - reinstate "$d_webRaw""z_Archive/201105 18h37m51s backups/"
fix problem with webPage_convertBodyLinks
break for the day!
24************************24
02Nov2020 test "real" webPages
to make sure they are saved, and that other problems don't arise.
Adjust link d_Qtest 'Website updates.ndf' and cold loaddefs :
Uncomment in webPage_convertBodyLinks :
% host link 'mv "' p_temp_webPage_convertEncoding '" "' webPage '"' ;
Set flag_backup := l
qnial> bye
$ qnial
qnial> loaddefs link d_Qtest 'Website updates.ndf'
define pinn - use one of the already-tested webPages, example :
qnial> pinn := link d_webRaw 'page Howell - blog.html'
qnial> webPage_convertBodyLinks l pinn d_webRaw
analyse the results - check diff output - open both the [raw, bodyLinked] versions, visually compare by searching
'A >"' p_log '"')) ) EACHLEFT link chr_apo ; write writefileList ; EACH host writefileList ;
05-----05
qnial> pinn := link d_webRaw 'page Howell - blog.html'
qnial> webPage_convertBodyLinks l pinn d_webRaw
# $ diff --width=85 "$d_webRaw""page Howell - blog.html" "$d_Q_tests""test- page Howell - blog.html convertBodyLinks.html" --suppress-common-lines
Issues :
+-+
+-+
05-----05
qnial> pinn := link d_webRaw 'economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html'
qnial> webPage_convertBodyLinks l pinn d_webRaw
# $ diff --width=85 "$d_webRaw""economics, markets/SP500/multi-fractal/1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html" "$d_Q_tests""test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html" --suppress-common-lines
Issues :
+-+
+-+
05-----05
qnial> pinn := link d_webRaw 'Pandemics, health, and the Sun/corona virus/Howell - corona virus.html'
qnial> webPage_convertBodyLinks l pinn d_webRaw
# $ diff --width=85 "$d_webRaw""Pandemics, health, and the Sun/corona virus/Howell - corona virus.html" "$d_Q_tests""webPage_convertEncoding temp.txt" --suppress-common-lines
Issues :
+-+
+-+
24************************24
02Nov2020 link_fixErrs - remove ../ and fix many links
loaddefs link d_Qtest 'Website updates- tests.ndf'
internalLinks_return_backupSubDirFnames_test
>> OK, seems to work fine now on all tests
webPage_convertBodyLinks_test
>> I no longer have diff results? lost the code
I threw quick code up - doesn't seem to be working?...
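Aside on the find sweeps used throughout this section : the `tr \\n \\0` trick can be dropped in favour of find's -print0, and pinning the pattern as a fixed string avoids any special-character surprises with '!!linkError!!'. A sketch under the same directory conventions :
find "$d_webRaw" -maxdepth 3 -type f -name "*.html" -print0 \
   | xargs -0 grep --with-filename --line-number --fixed-strings '!!linkError!!' \
   | grep --invert-match "z_Archive" >"$d_webRaw""webWork files/5 linkerrors.txt"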
05-----05
First test : p_inn := link d_Qtest 'test- Howell - corona virus.html' ;
>> looks good, with notable issues :
02--02
>> Most (10) bodyLinks work well, eg :
89,90c87,88
< Anglophone
< Scandanavia
---
> Anglophone
> Scandanavia
02--02
>> Some IMG work, eg :
256c254
<
---
>
272c270
<
---
>
02--02
>> Why didn't these work? IMG may not be working? Maybe the files were renamed?
>> Yikes! It appears that I overwrote the test file rather than the processed file.
>> Many already have !!linkError!!, although the program should handle these.
179c177
<
---
>
181c179
<
---
>
02--02
>> Two cases where subDirs are included when that is not necessary?

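For reading the hunks above : in diff's default format, '89,90c87,88' means lines 89-90 of the first (old) file were changed into lines 87-88 of the second (new) file; '<' prefixes old lines, '>' prefixes new lines, and 'd'/'a' mark deletes/adds. A tiny illustration with hypothetical files :
$ diff old.html new.html
3c3
< <a href="../page.html">
---
> <a href="[#=; backtrack ;=#]page.html">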
02--02
>> Missing lines : "webPage_convertEncoding temp.txt" --suppress-common-lines
66,67d65
<
<
02--02
>> Is there a problem with the initial removal of [#=; backtrack ;=#]? But that didn't seem to be a problem with most?
>> There is no diff output? But it works well from bash!!
Still, the output file is better than what I had, so I will rename it and copy over the existing standard.
Fix diff problem!
>> Nuts, I shortened the filename :
p_std := link d_Qtest 'test- Howell - corona virus.html webPage_convertBodyLinks.html' ;
>> I removed these. Try the same test again.
>> OK (no change in output), but diff STILL doesn't work.
05-----05
Second test :
$ diff --width=85 "$d_Qroot""code develop_test/test- 1872-2020 SP500 index, ratio of opening price to semi-log detrended price.html convertBodyLinks.html" "$d_temp""webPage_convertEncoding temp.txt" --suppress-common-lines
NO initial conversion?? No faults, either.
I was missing flag_backup (o) : webPage_convertBodyLinks o p_inn p_std ;
Rerun :
>> Works beautifully, including partial subDirs. Still no QNial diff, though.
05-----05
Third test : 'test- page Howell - blog.html'
As with previous tests : run once, over-write p_std, then manually check. (diff will give no feedback at this stage).
Issues :
+-+
02--02
>> Awesome - most link conversions worked!
05-----05
Fourth test : 'test- HELP.html' This is a 'Conference Guide' webPage!
As with previous tests : run once, over-write p_std, then manually check. (diff will give no feedback at this stage).
Issues : Not many bodyLinks, but at least the mailto:s look OK.
The Conference Guide menu is part of the raw files. A great deal of work to fix all that up to the current approach. Forget it.
05-----05
Fifth & final test : 'test- Menu.html' This is a 'Conference Guide' menu!
As with previous tests : run once, over-write p_std, then manually check. (diff will give no feedback at this stage).
Issues : No bodyLinks. All this shows is that it's a waste of time to process menuHeadFoot files...
05-----05
olde code
% Doesn't work!? ;
; midIndxs midLinks := EACH link fixIndxs (EACH solitary fixLinks) ;
midIndxsLines_ignoreBads_test ;
link_fixErrs_test ;
24************************24
02Nov2020 revive old code
IF flag_debug THEN write 'loading midIndxsLines_ignoreBads_test' ; ENDIF ;
#] midIndxsLines_ignoreBads_test IS -
midIndxsLines_ignoreBads_test IS
{ LOCAL are_good ; % ;
EACH write_testStr '#05-----05' (link 'midIndxsLines_ignoreBads_test, ' timestamp) ;
i_test := 0 ; % ;
i_test := i_test + 1 ;
t_name := link '# midIndxsLines_ignoreBads_test example #' (string i_test) ;
t_input := (22 4 3 12) ('Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' 'https://www.mackinac.org/SP1998-01' '#Key [results, comments]') ;
t_standard := (22 12) (solitary 'Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf') ;
t_result := midIndxsLines_ignoreBads t_input ;
test_comment t_name t_input t_standard t_result ; % ;
i_test := i_test + 1 ;
t_name := link '# midIndxsLines_ignoreBads_test example #' (string i_test) ;
t_input := 22 (solitary 'https://www.mackinac.org/SP1998-01') ;
t_standard := null null ;
t_result := midIndxsLines_ignoreBads t_input ;
test_comment t_name t_input t_standard t_result ;
}
IF flag_debug THEN write 'loading link_fixErrs_test' ; ENDIF ;
#] link_fixErrs_test IS -
link_fixErrs_test IS
{ LOCAL are_good ; % ;
EACH write_testStr '#05-----05' (link 'link_fixErrs_test, ' timestamp) ;
i_test := 0 ; % ;
i_test := i_test + 1 ;
t_name := link '# link_fixErrs_test example #' (string i_test) ;
t_input := (solitary 22) (solitary 'Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf') ;
t_standard := (solitary 22) (solitary 'Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf') ;
t_result := link_fixErrs t_input ;
test_comment t_name t_input t_standard t_result ;
EACH write '........' '30Oct2020 Simple case of fname-only' '' ; % ;
i_test := i_test + 1 ;
t_name := link '# link_fixErrs_test example #' (string i_test) ;
t_input := (22 4 3) ('#Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' 'Civilisations and sun/' '[#=; backtrack ;=#]Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.') ;
t_standard := (solitary 4) (solitary 'Civilisations and sun/') ;
t_result := link_fixErrs t_input ;
test_comment t_name t_input t_standard t_result ;
EACH write '........' '30Oct2020 Initially returned null. ../ problem?' '' ; % ;
i_test := i_test + 1 ;
t_name := link '# link_fixErrs_test example #' (string i_test) ;
t_input := (22 4 3) ('Civilisations and sun/Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' 'https://www.mackinac.org/SP1998-01' '#Key [results, comments]') ;
t_standard := null null ;
t_result := link_fixErrs t_input ;
test_comment t_name t_input t_standard t_result ;
EACH write '........' '30Oct2020 Initially returned null. ../ problem?' '' ;
}
# find_Howell '#Howell - Mega-Life, Mega-Death and the Sun, the rise and fall of civilisations.pdf' allFnamesGradeupList
24************************24
01Nov2020 link_fixErrs - remove ../ and fix many links
good progress, not done initial revamp...
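Aside : link_fixErrs' job - accept fname-only links, strip '../' from legitimate subDir links, and label everything else '!!linkError!!' - can be approximated for spot checks from bash. A rough sketch only, NOT the QNial implementation (link extraction and the existence test are simplifications) :
# classify the hrefs of one webPage, roughly the way link_fixErrs does
d_webRaw="/media/bill/SWAPPER/Website - raw/"
page="$1"
grep -o 'href="[^"]*"' "$page" | sed 's/^href="//; s/"$//' |
while IFS= read -r lnk ; do
   case "$lnk" in http*|\#*|mailto:*) continue ;; esac    # ignore [http, #, mailto:] links
   clean="${lnk//..\//}"                                  # remove every '../'
   if [ -e "$d_webRaw$clean" ]
      then echo "OK : $clean"
      else echo "!!linkError!!$lnk"                       # failed all tests - label as an error
   fi
done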
05-----05
link d_Qndfs 'Website updates.ndf'
# old code
IF (NOT err_fixed) THEN % Later, tackle common problems [./, ???, etc] ; null ; ENDIF ;
IF (NOT isfault (find_Howell errLinks@i allFnamesSortedByFname)) THEN
  fixIndxs@i := solitary errIndxs@i ;
  fixLinks@i := solitary errLinks@i ;
  err_fixed := l ;
ENDIF ;
% check if a legitimate subDir without fname ;
IF (NOT err_fixed) THEN
  subLink := str_remove_subStr errLinks@i '../' ;
  IF (NOT isfault (find_Howell subLink allSubDirsSortedBySubdir)) THEN
    fixIndxs@i := solitary errIndxs@i ;
    fixLinks@i := solitary subLink ;
    err_fixed := l ;
  ENDIF ;
ENDIF ;
IF (NOT OR (isfault midIndxs) (= null midIndxs))
IF flag_debug THEN write 'loading link_fixErrs' ; ENDIF ;
#] link_fixErrs IS OP errIndxs errLinks - returns fixed links, or the original "bad link"
# Fixing links is noble, but perhaps more important is labelling those that are flawed?
# EACH non-http link should have EITHER '[#=; backtrack ;=#]' or '!!linkError!!', but not both.
link_fixErrs IS OP errIndxs errLinks
{ LOCAL i err_fixed fixIndxs fixLinks fixLink subLink ;
NONLOCAL allFnamesSortedByFname allSubDirsSortedBySubdir ;
fixIndxs fixLinks := errIndxs errLinks ;
% 31Oct2020 IMPORTANT - use [strList, NOT indxs] for shape-related optrs, as atomic indxs screw up ;
err_fixed := o ;
FOR i WITH (tell (gage shape errLinks)) DO
  % check if legitimate fname-only. If so - don't label as error. ;
  IF (NOT isfault (find_Howell errLinks@i allFnamesSortedByFname)) THEN err_fixed := l ; ENDIF ;
  % check if a legitimate subDir without fname ;
  IF (NOT err_fixed) THEN
    IF (`/ = (last fixLinks@i)) THEN
      subLink := str_remove_subStr errLinks@i '../' ;
      IF (NOT isfault (find_Howell subLink allSubDirsSortedBySubdir)) THEN
        fixLinks@i := subLink ;
        err_fixed := l ;
      ENDIF ;
    ENDIF ;
  ENDIF ;
  IF (NOT err_fixed) THEN
    % label this link as an error as it has failed all tests ;
    fixLinks@i := link '!!linkError!!' fixLinks@i ;
  ENDIF ;
ENDFOR ;
fixIndxs fixLinks
}
#] midIndxsLines_ignoreBads IS OP webPage webSite - remove "bad links" that should not be processed
# 29Oct2020 initial
IF flag_break THEN BREAK ; ENDIF ; 0
midIndxsLines_bads := 'http' '#' 'mailto:' '!!linkError!!' ;
midIndxsLines_badShapes := EACH (gage shape) midIndxsLines_bads ;
midIndxsLines_ignoreBads IS OP midIndices midList
{ LOCAL are_good result ;
NONLOCAL midIndxsLines_bads midIndxsLines_badShapes ;
IF (= null midIndices) THEN result := null ;
ELSE
  takeArgs := midIndxsLines_badShapes cart midList ;
  are_good := NOT EACH OR (cols EACHALL OR (midIndxsLines_bads EACHLEFT EACHRIGHT = (EACH take takeArgs))) ;
  result := are_good EACHRIGHT sublist midIndices midList ;
ENDIF ;
result
}
# 'mailto:IEEE%20WCCI%202020%20HELP%20daemon%20?subject=IEEE%20WCCI%202020%20HELP%20:%20testing&body=Approximately 10 minutes after sending this email, you should receive two emails :%0D%0A 1. a confirmation that you have sent the email%0D%0A 2. the email that was forwarded by the daemon : addressed to me, and cc-d to you. %0D%0A Normally you do NOT receive the forwarded email, just the confirmation.%0D%0A%0D%0AYou can type in the email body below, and add cc: recipients, but extra To: recipients are ignored.
DO NOT change the Subject:, or your email goes straight to trash!')
are_good := NOT ( = (EACH take takeArgs))) ;
zzTHEN zz fixIndxs fixLinks := errIndxs errLinks ;
zz % 31Oct2020 IMPORTANT - use [strList, NOT indxs] for shape-related optrs, as atomic indxs screw up ;
zz err_fixed := o ;
IF (NOT err_fixed) THEN
  % label this link as an error as it has failed all tests ;
  fixLinks@i := link '!!linkError!!' fixLinks@i ;
ENDIF ;
fixIndxs fixLinks
% remove [#=; backtrack ;=#] if present' ;
% find [[dir, fname]-only, '!!linkError!!'] ;
fnames := (`/ EACHRIGHT (1 + last findall) midLinks) EACHBOTH drop midLinks ;
link_errs := EACH isfault fnames ;
link_goos := EACH NOT link_errs ; % goos means good [indx, link]s, aligns coding ;
errIndxs errLinks := link_errs EACHRIGHT sublist midIndxs midLinks ;
gooIndxs gooLinks := link_goos EACHRIGHT sublist midIndxs fnames ;
% link_fixErrs returns (fixIndxs fixLinks), including the original "bad links" ;
% This avoids loss of text and hints at how to manually fix the links ;
% ordering of the midLinks is unimportant, as long as [Indx, Link]s are properly paired ;
fixIndxs fixLinks := link_fixErrs errIndxs errLinks ;
midIndxs midLinks := gooIndxs gooLinks EACHBOTH link fixIndxs fixLinks ;
midLinks := '[#=; backtrack ;=#]' EACHRIGHT link midLinks ;
# old code
fnameIndxs := fnames EACHLEFT find_Howell allFnamesSortedByFname ;
fnameSubDirs := fnamesIndxs EACHLEFT pick allSubDirs ;
midLinks := midLinks EACHLEFT str_remove_subStr d_webRaw ;
link_errs := EACH isfault fnames ;
IF (OR link_errs) THEN
  link_goos := EACH NOT link_errs ; % goos means good [indx, link]s, aligns coding ;
  errIndxs errLinks := link_errs EACHRIGHT sublist midIndxs midLinks ;
  gooIndxs gooLinks := link_goos EACHRIGHT sublist midIndxs midLinks ;
  % link_fixErrs returns fixed errLinks, or the original "bad links" ;
  % This avoids loss of text and hints at how to manually fix the links ;
  fixIndxs fixLinks := link_fixErrs errIndxs errLinks ;
  % ordering of the midLinks is unimportant, as long as [Indx, Link]s are properly paired ;
  midIndxs midLinks := gooIndxs gooLinks EACHBOTH link fixIndxs fixLinks ;
ENDIF ;
midIndxs midLinks := gooIndxs gooLinks EACHBOTH link fixIndxs fixLinks ;
midIndxs midLinks := link_fixErrs midIndxs midLinks ;
24************************24
31Oct2020 link_fixErrs
loaddefs link d_Qtest 'Website updates- tests.ndf'
qnial> link_fixErrs_test
Tricky solitary issues...
allSubDirsList - WRONG!, it includes filenames.
[webSite_extractAll_pathsSubDirsFnames, webSite_extractHTML_pathsSubDirsFnames] : rewritten, corrected, update 'Website updates- tests.ndf'
Now to test :
qnial> internalLinks_return_backupSubDirFnames_test
Many, many, many fixes (round in circles). Only remaining problem :
05-----05
# internalLinks_return_backupSubDirFnames_test example #2 : FAILED - result does NOT match standard
t_input, t_standard, t_result =
+----------------------------------+
|/media/bill/SWAPPER/Website - raw/|
+----------------------------------+
........ 30Oct2020 Simple check of multiple "Table of Contents
>> no result
02--02
While most results are provided in sections above, links to data [spreadsheets, text files] and software [???, source code] are listed below along with brief comments. A full listing of files (including other SP500 web-pages) can be seen via this Directory's listing. Hopefully this will help those who want to do something different, as the programs etc may help with [learning, debugging].
gnuplot I've used the unofficial extension .plt to designate gnuplot scripts for each of the graphs. You can see these files via this Directory's listing.
gnuplot.sh is the tiny bash script used to select gnuplot scripts. My other bash scripts can be found here.
QNial programming language - Queen's University Nested Interactive Array Language (Q'Nial) is my top preferred programming language for modestly complex to insane programming challenges, along with at least 3 other people in the world. Bash scripts make a great companion to QNial. semi-log formula.ndf is the tiny "program" used to set up the semi-log line fits. More generally : here are many of my QNial programs. Subdirectories provide programs for various projects etc.
>> Oops - entire paragraphs removed! no file (need to add)
>> I need to fix this
02--02
All ' loaddefs link d_Qtest 'Website updates- tests.ndf' currently tests only 'test- page Howell - blog webPage_convertBodyLinks.html'
qnial> webPage_convertBodyLinks_test
05-----05
-->[stepv] nextv
?.. Fnames := ( `/ EACHRIGHT ( 1 + last findall ) Linelist # Indices ) EACHBOTH drop Linelist # Indices
-->[nextv]
+---------+
|SP1998-01|
+---------+
?.. Fnamesubdirs := ( Fnames EACHLEFT find Fhtmlgradeuplist ) EACHLEFT pick Htmlfilesgradeuplist
-->[nextv]
?address
?.. Fnamesubdirsweb := Fnamesubdirs cart ( solitary Website )
-->[nextv]
+---------------------------------------------+
|+--------+----------------------------------+|
||?address|/media/bill/SWAPPER/Website - raw/||
|+--------+----------------------------------+|
+---------------------------------------------+
05-----05
>> big screwup, I need to exclude ^[http, #, mailto:]
I added :
indiceslineList_bads := 'http' '#' 'mailto:' ;
indiceslineList_badShapes := EACH (gage shape) indiceslineList_bads ;
indiceslineList_removeHttpHashMailto IS OP indices lineList
{ LOCAL are_good ;
NONLOCAL indiceslineList_bads indiceslineList_badShapes ;
IF flag_break THEN BREAK ; ENDIF ;
IF (= null indices) THEN null
ELSE
  takeArgs := indiceslineList_badShapes cart lineList ;
  are_good := NOT EACH OR (cols EACHALL OR (indiceslineList_bads EACHLEFT EACHRIGHT = (EACH take takeArgs))) ;
  are_good EACHRIGHT sublist indices lineList
ENDIF }
Retry
qnial> fonn
qnial> webPage_convertBodyLinks_test
All of a sudden, path_retrieve_subDirFname_test fails!? Glad that I set up the test!!!
Actually, I changed path_retrieve_subDirFname to be more generic, so I must change the test.
>> OK, works
revamped internalLinks_return_backupSubDirFnames :
internalLinks_return_backupSubDirFnames IS OP strLeft strRight line webSite
{ LOCAL fnames fnamesIndices fnameSubDirs fnameSubDirsWeb midlList midlIndices ;
NONLOCAL htmlFnamesGradeupList htmlSubDirsList ;
IF flag_break THEN BREAK ; ENDIF ;
midlIndices lineList := str_splitLftRgtTo_midlIndices_StrList '' line ;
IF (~= null midlIndices) THEN
  midlList := midlIndices EACHLEFT choose lineList ;
  midlIndices midlList := midlIndicesLines_removeBads midlIndices (midlIndices choose lineList) ;
  IF (~= null midlList) THEN
    fnames := (`/ EACHRIGHT (1 + last findAll_Howell) midlList) EACHBOTH drop midlList ;
    IF (isfault fnames) THEN line
    ELSE
      fnamesIndices := fnames EACHLEFT find htmlFnamesGradeupList ;
      fnameSubDirs := fnamesIndices EACHLEFT pick htmlSubDirsList ;
      midlList := '[#=; backtrack ;=#]' EACHRIGHT link fnameSubDirs ;
      lineList#midlIndices := internalLinks_return_backupSubDirFnames midlIndices lineList webSite ;
      link lineList
    ENDIF
  ELSE line ENDIF ;
ELSE line ENDIF }
# loaddefs link d_Qndfs 'Website updates.ndf'
# loaddefs link d_Qtest 'Website updates- tests.ndf'
# webPage_convertBodyLinks_test
>> Oops, all lines with '
>> Oh, OK. This is an old filename still in the webPage. So the line should be returned!
>> I put in findAll_Howell, and got an error
?..
Linelist # Midlindices := internallinks_return_backupsubdirfnames Midlindices Linelist Website
-->[nextv]
+----------------------+---------+--------------------+
|                      |         |                    |
+----------------------+---------+--------------------+
>> OOPS again
Change :
+.....+
lineList#midlIndices := internalLinks_return_backupSubDirFnames midlIndices lineList webSite ;
+.....+
To :
+.....+
lineList#midlIndices := midlList ;
+.....+
Dark matter video 1 - initial, simple.mpeg
>> NUTS!! I have to use allFnamesGradeupList etc, because links aren't restricted to html files.
>> Have to bye & restart - somehow the new coding didn't take effect
I got internallinks_return_backupsubdirfnames to work
Still no '17) errors reported like :
05-----05
/media/bill/PROJECTS/Qnial/code develop_test/test- page Howell - blog.html
WRONG! non-null diff result :
42d41
<
48c47
<
---
>
55c54
< If we take an "Electric Universe" perspective, then perhaps shifts in the galactic currents could be expected to "light up" or "extinguish" stars to various degrees as the currents shift and move. In other words, the "lit-up regions" motions may relate more to drifts of galactic currents than to the motions of the stars themselves? My own [cheap, crappy] animation for the spiral currents moving through stationary stars is shown in my video (mpeg format) : Bill Howells videos/Birkeland rotation in galaxy - not dark matter/Dark matter video 1 - initial, simple.mpeg
---
05-----05
>> drop the entire line :
>> but this line appears (modified) :
>> but this line was properly translated :
So - just mv : link d_temp 'webPage_convertEncoding temp.txt'
back to the test standard file : link d_Qtest 'test- page Howell - blog webPage_convertBodyLinks.html'
Rerun :
qnial> webPage_convertBodyLinks_test
#05-----05 str_to_unicodeList_test, Thu Oct 29 19:56:31 2020
# webPage_convertBodyLinks example #1
/media/bill/PROJECTS/Qnial/code develop_test/test- page Howell - blog.html
OK - diff is null, so the standard result was generated. Looks great! But there may be errors that I am not picking up.
There are three other test files - do tomorrow as I'm too blah to do a good job now.
05-----05
Remove old code, fileops.ndf :
IF flag_debug THEN write 'loading path_retrieve_subDirFname' ; ENDIF ;
#] path_retrieve_subDirFname IS OP path dirBase - returns fnameSubDir for an fname in dirBase
# 28Oct2020 fix links in the body of the dirBase
# 29Oct2020 make more generic - remove conditions for [http, #, [#=; backtrack ;=#]]
# This is vulnerable to duplicate filenames in different directories!!! Only the first match is used.
# I need to do more for paths with `#, as some do require processing. later ...
path_retrieve_subDirFname IS OP path dirBase
{ LOCAL fname fPath subDirFname ;
NONLOCAL webSiteAllPathList ;
fname := path_extract_fname path ;
IF (isfault fPath) THEN path
ELSE
  subDirFname := str_extractPast_strFront path dirBase ;
  link '[#=; backtrack ;=#]' subDirFname
ENDIF }
# for tests, see link d_Qtest 'file_ops- test.ndf'
# old code
IF (chr_in_str `# path) THEN path
ELSEIF (= 'http' ( 4 take path)) THEN path
ELSE
  fname := path_extract_fname path ;
  fPath := first ((fname EACHRIGHT str_in_path webSiteAllPathList) sublist webSiteAllPathList) ;
  % write 'fPath = ' fPath ;
  IF (isfault fPath) THEN path
  ELSE
    subDirFname := str_extractPast_strFront fPath dirBase ;
    link '[#=; backtrack ;=#]' subDirFname
  ENDIF
ENDIF
05-----05
str_splitLftRgtTo_midIndxs_StrList MUST return a list of solitary indices, not a list of numbers. Otherwise, a list of one integer causes faults!!
31Oct2020 NYET - I reverted this, and simply used :
FOR i WITH (tell (gage shape errLinks)) DO
02--02
str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str L
midIndxs := EACH solitary (tell (gage shape splits)) ;
>> This will affect many operators!!!
$ find "$d_Qndfs" -maxdepth 3 -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "str_splitLftRgtTo_midIndxs_StrList" "FILE" /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1585:IF flag_debug THEN write 'loading str_splitLftRgtTo_midIndxs_StrList' ; ENDIF ; /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1587:#] str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str - split str by paired [left, right]-end-marks /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1590:# 19Oct2020 initial, based on str_splitLftRgtTo_midIndxs_StrList /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1597: str_splitLftRgtTo_midIndxs_StrList IS OP strLft strRgt str /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1604: THEN fault '?str_splitLftRgtTo_midIndxs_StrList error : OR[i_heads, i_tails] is null' /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1611: str_splitLftRgtTo_Indxs_StrList IS str_splitLftRgtTo_midIndxs_StrList /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:151: midIndxs lineList := str_splitLftRgtTo_midIndxs_StrList strLeft strRight line ; /media/bill/PROJECTS/Qnial/MY_NDFS/Website updates.ndf:332: THEN indices lineList := str_splitLftRgtTo_midIndxs_StrList 'mailto:' '">' line ; /media/bill/PROJECTS/Qnial/MY_NDFS/file_ops.ndf:862: indicesMidls strList := str_splitLftRgtTo_midIndxs_StrList '[#=; ' ' ;=#]' line ; $ find "$d_Qndfs" -maxdepth 1 -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number "str_splitLftRgtTo_Indxs_StrList" "FILE" /media/bill/PROJECTS/Qnial/MY_NDFS/strings.ndf:1611: str_splitLftRgtTo_Indxs_StrList IS str_splitLftRgtTo_midIndxs_StrList OK - easy to change,then test 24************************24 2Oct2020 create filename-only sorted lists for p_[all, html]FileList I already did this for my symbols system see link d_QNial_mine 'Website header.ndf' webSite_sortCullGradeupOn1st_allPathsAndFnames IS { LOCAL fnameList ; NONLOCAL d_webRaw allFilesList allFilesGradeupList fnameGradeupList p_allFileList ; host link 'find "' d_webRaw '" -maxdepth 4 -type f -name "*" | grep --invert-match "Conference guides\|z_Old\|z_Archive\|System_maintenance\|Qnial_bag\|Cool emails/\|Electric Universe/References/\|Electric Universe/References/\|Yoonsuck Choe - conf program book/\|fin Head\|Menu\|fin [F,f]ooter\|fin organisations|i9018xtp.default/extensions/" | sort -u >"' p_allFileList '" ' ; % ; allFilesList := strList_readFrom_path p_allFileList ; fnameList := (`/ EACHRIGHT (1 + last findall) allFilesList) EACHBOTH drop allFilesList ; fnameGradeupList allFilesGradeupList := lists_sortupCullOn1st (fnameList allFilesList) ; } Seems to work fine 24************************24 28Oct2020 webPage_convertBodyLinks see link d_Qtest 'file_ops- test.ndf' I did my first full-file test. 05-----05 for link d_Qtest 'test- Howell - corona virus webPage_convertBodyLinks.html' : line 325-328, with multiple Howell - Pandemics and the sun Howell - Selected pandemics & epidemics.pdf Hoyte & Schatten year - solar influence on climate & natural systems, graphs.pdf Tapping, Mathias, Surkan - Pandemics & solar activity Only a few of '> same problems of [missing, incomplete] subDir OK, Change : +.....+ lineList#indices := EACH path_retrieve_subDirFname fnameSubDir ; +.....+ To : +.....+ lineList#indices := EACH path_retrieve_subDirFname fnameSubDir webSite ; +.....+ The test were not processed to replace %20 with space - check current files in dwebRaw Not working for directory links.. none of the links has [#=; backtrack ;=#] ??? 
fileops.ndf :
05-----05
path_retrieve_subDirFname IS OP path dirBase
{ LOCAL fname fPath subDirFname ;
NONLOCAL webSiteAllPathList ;
IF (chr_in_str `# path) THEN path
ELSEIF (= 'http' ( 4 take path)) THEN path
ELSE
  fname := path_extract_fname path ;
  fPath := first ((fname EACHRIGHT subStr_in_str webSiteAllPathList) sublist webSiteAllPathList) ;
  % write 'fPath = ' fPath ;
  IF (isfault fPath) THEN path
  ELSE
    subDirFname := str_extractPast_strFront fPath dirBase ;
    link '[#=; backtrack ;=#]' subDirFname
  ENDIF
ENDIF }
05-----05
>> [#=; backtrack ;=#] should be there!
>> so the > not because of `[ as that isn't in the relevant code
>> I didn't see any cases due to `#
webPage_convertBodyLinks_test :
webPage_convertBodyLinks p_inn d_Qtest ; % output goes to p_temp_webPage_convertEncoding ;
>> is d_Qtest the problem?
>> try d_webRaw
>> still doesn't work
It seems like this isn't working - fails and simply writes the line :
IF (subStr_in_str '
View -> Directory listing filters ->
check ONLY "Temporary & backup files" for local filters -> Edit filter rules -> "Temporary & backup files" :
Filename ends with : [~, .bak.]
Filename contains : [References, z_Archive, z_Old, z_References]
check ALL : Conditions are case sensitive, Filter applies to : Files, Directories
Click OK to retain changes
Remove remaining transfer queue :
Menu -> Edit -> Clear private data -> check Clear transfer queue box
click in transfer [queued files, failed transfers, successful transfers] windows and [clear, delete] lists after all done
Re-instate transfer only newer files :
Menu -> Edit -> settings -> Transfers -> File exists action :
Downloads -> Overwrite file if source file newer
Uploads -> Overwrite file if source file newer
+---+
>> OK, now to test
Holidays - neural networks and genomics.html
massive link screwup (missing )
http://www.billhowell.ca/Projects%20-%20mini/Puetz%20&%20%20Borchardt/Howell%20-%20comments%20on%20Puetz%20UWS,%20the%20greatest%20of%20cycles,%20human%20implications.odt
http://www.billhowell.ca/Software%20programming%20&%20code/Qnial/
>> should link to the web-page!
Mostly the site looks really good! Leave the remaining corrections for later...
24************************24
27Oct2020 after [restructure, rename]ing of d_web[Raw, Site] : webSite_list_htmlFiles
qnial> bye
qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update
>> all seemed to work
submenus :
Home n/a
Neural Nets none work (blue font in menu)
Projects most work still, including COVID-19, but NOT [MindCode, Lucas, Puetz, Randall]
Software programming & code none work
Professional & Resume Resume works, still not Education
Publications & reports n/a
Howell-produced videos none work (blue font in menu)
Blogs all work
Cool stuff n/a - just lists directory content
Crazy themes and stories all work except still not deer
Hosted sub-sites all work except Wickson
Neil Howell's Art all work except Wickson
Neural Nets none work (blue font in menu) - I can't see why this won't work ???
Projects most work still, including COVID-19, but NOT [MindCode, Lucas, Puetz, Randall]
fix filenames of [MindCode, Puetz, Randell Mills]
don't know why Lucas doesn't work
removed COVID-19 - need a submenu for pandemics (later)
Software programming & code none work - might have been a weird special character?
Howell-produced videos none work (blue font in menu) - I can't see why this won't work ???
Hosted sub-sites all work except Wickson - I can't see why this won't work ???
qnial> webSite_convert ; webSite_update
submenus :
Home n/a
Neural Nets same - none work (blue font in menu)
Projects many work still, but NOT [MindCode, Lucas, Puetz, Randall, Icebreaker]
Software programming & code none work still
Professional & Resume Resume works, still not Education
Publications & reports n/a
Howell-produced videos none work (blue font in menu)
Blogs all work
Cool stuff n/a - just lists directory content
Crazy themes and stories all work except still not deer
Hosted sub-sites all work except Wickson
Neil Howell's Art all work except Wickson
>> Seems to be a problem with directories that are NOT directly under the menu host?
Hosted sub-sites all work except Wickson
Neil Howell's Art all work except Wickson
added space after 'Steven' (doesn't make sense, try anyways)
Neural Nets same - none work (blue font in menu)
Howell-produced videos none work (blue font in menu)
no idea of what the problem is - maybe add a space before \n>?
Projects many work still, but NOT [MindCode, Lucas, Puetz, Randall, Icebreaker]
I give up for now. Just refresh, and take a big break to do income taxes.
qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update
Software programming & code -> removed weird character in d_webSite
>> NUTS!! It did work. The menu selections were blue because I hadn't tried them yet!
>> Now all work.
Neural Nets
>> All work except [Neural Nets, MindCode]
Status submenus :
Home n/a
Neural Nets All work except [Neural Nets, MindCode]
Projects most work still, but NOT [MindCode, Puetz, Randall]
Software programming & code all work
Professional & Resume all work
Publications & reports n/a
Howell-produced videos all work
Blogs all work
Cool stuff n/a - just lists directory content
Crazy themes and stories all work
Hosted sub-sites all work
Neil Howell's Art all work
MindCode - change filename :
10 Howell - MindCode Manifesto.odt
Howell - comments on Puetz UWS, the greatest of cycles, human implications.odt
Howell - review of Holverstott 2016 Randell Mills hydrino energy.pdf
qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update
>> Several errors :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?webPage_convert file unknown error, OR [d_htmlBackup, webPage] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?webPage_convert file unknown error, OR [d_htmlBackup, webPage] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html
?webPage_convert file unknown error, OR [d_htmlBackup, webPage] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
>> /media/bill/SWAPPER/Website - raw/webWork files/4_test Kyoto Premise - the scientists arent wearing any clothes (copy).html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
>> This was deleted, removed from '2_website p_webPageList.txt'
05-----05
OK - I should now remove the write of each file for [webSite_convert, webSite_update], so it's much easier to see the errors!!!
Both are working very well now.
qnial> lq_fileops ; loaddefs link d_Qndfs 'Website updates.ndf' ; webSite_convert ; webSite_update
>>> loading start : file_ops.ndf
<<< loading ended : file_ops.ndf
>>> loading start : Website updates.ndf
>>>>>> loading start : Website header.ndf
<<<<<< loading ended : Website header.ndf
<<< loading ended : Website updates.ndf
?webPage_convert file unknown error, OR [d_htmlBackup, webPage] :
/media/bill/SWAPPER/Website - raw/z_Archive/201027 13h47m09s backups/
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?webPage_convert file unknown error, OR [d_htmlBackup, webPage] :
/media/bill/SWAPPER/Website - raw/z_Archive/201027 13h47m09s backups/
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?webPage_convert file unknown error, OR [d_htmlBackup, webPage] :
/media/bill/SWAPPER/Website - raw/z_Archive/201027 13h47m09s backups/
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
/media/bill/SWAPPER/Website - raw/webWork files/fin footer.html
/media/bill/ramdisk/stdTmp.txt
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
/media/bill/HOWELL_BASE/Website/Software programming & code/Qnial/MY_NDFS/email Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
/media/bill/HOWELL_BASE/Website/Software programming & code/Qnial/MY_NDFS/Thunderbird - Base64 Encode and Decode Base64 Files, instructions.html
?pinn_writeExecute_pouter file unknown error, OR [pinn, pouter] :
/media/bill/SWAPPER/Website - raw/Software programming & code/Qnial/QNial - Howells web-page.html
/media/bill/HOWELL_BASE/Website/Software programming & code/Qnial/QNial - Howells web-page.html
>> This is much more useful.
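Aside : the change above - stop echoing every file, let only failures surface - is easy to keep as a habit. A bash sketch of the same idea (convert_one is a hypothetical stand-in for the per-page conversion) :
while IFS= read -r f ; do
   convert_one "$f" >>"$p_log" 2>&1 || echo "FAILED : $f"   # full chatter to the log ; only failures to the screen
done <"$p_webPageList"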
Status submenus :
Home n/a
Neural Nets All work except [Neural Nets]
Projects most work still, but NOT [Puetz, Randall]
Software programming & code all work
Professional & Resume all work
Publications & reports n/a
Howell-produced videos all work
Blogs all work
Cool stuff n/a - just lists directory content
Crazy themes and stories all work
Hosted sub-sites all work
Neil Howell's Art all work
24************************24
27Oct2020 Fix backups in fileops.ndf, pinn_writeExecute_pout
Doesn't work for the first level of the webSite directory : backtrack -> should change insertion for depther = [0,-1]
>>>
continue := l ;
depther := (gage shape (array_findAll_subArray `/ subDir)) - 1 ;
IF (0 < depther) THEN backtrack := link (depther reshape (solitary '../')) ;
ELSEIF (0 = depther) THEN backtrack := '' ;
ELSEIF (-1 = depther) THEN backtrack := '' ;
ELSE write '?pinn_writeExecute_pout error : depther out of range : ' depther ; continue := o ;
ENDIF ;
IF continue THEN
<<<
Now to do :
qnial> webSite_convert
>> Seems OK
qnial> webSite_update
>> Seems OK
05-----05
Problematic menus :
Menu.html works for all except [Home, Hosted Web-pages, Neural nets]
Menu blogs.html works for all
Menu crazy themes and stories.html doesn't work for [Deer collision, ]
Menu earth, sun, astro, history.html ?? not implemented in main menu (Home)
Menu hosted subsites.html only works for Neil Howell, none of top menu work
Menu Howell videos.html none of [top, video] menus work
Menu Lies, Damned Lies, and Scientists.html
Menu neural nets.html
Menu professional and resume.html
Menu projects.html
Menu software programming.html
05-----05
Menu.html in sub-pages :
Neural Nets none work (blue font in menu)
Projects all work
Software programming & code all work
Professional & Resume none work (blue font in menu)
Publications & reports all work
Howell-produced videos none work (blue font in menu)
Blogs all work
Cool stuff n/a - just lists directory content
Crazy themes and stories all work
Hosted sub-sites all work
Neil Howell's Art none work (blue font in menu)
From now on, just assume blue font means a broken link... as a first guess.
Also - footer images [GNU, Creative Commons] don't work if the top menu doesn't.
05-----05
pinn_writeExecute_pout - I probably had it right the first time...
depther := (gage shape (array_findAll_subArray `/ subDir)) ;
24************************24
26Oct2020 path_insertIn_fHand
[#!: path_insertIn_fHand (link d_webWork 'fin Head_two.html') fout
I seem to remember removing an "inner path_insertIn_fHand" at some time to help with debugging? In any case, I need that now!
[#!: pinn_writeExecute_pout path d_inn d_out ; path_insertIn_fHand d_out fout ;
05-----05
Added to 'webPage_convertEncoding IS OP webPage' :
sed_insertFix2 := link ';s|\[#!: path_insertIn_fHand (link d_webWork \(.*\)) fout ' '|[#!: pinn_writeExecute_pout (link d_webWork \1) stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ; |' ;
05-----05
qnial> webPage_convertEncoding (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
>> At end of file :
[#!: pinn_writeExecute_pout (link d_webWork 'fin Footer.html') stdTmp d_webRaw d_webSite ; path_insertIn_fHand stdTmp fout ;
qnial> webPage_convert o (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
>> looks good...
qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
05-----05
OK - convert the example to change the original file :
qnial> webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
STUPID!!! I changed links to :
webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
>> OK, everything now looks fine EXCEPT backtracks in the menus, which were NOT converted.
Menus - I have the WRONG symbols in all but [Menu, Menu blogs] :
[#!; -> change to [#!;
[#!: -> writeExecute
[#=; -> menuHeadFoots
05-----05
Do the whole website (should test more, but I'm getting sick of this).
qnial> webSite_convert
>> Seemed to go well
qnial> webSite_update
>> Oops - menus etc. I removed :
# old code
% first update (execute embedded) menuHeadFoots, as they are used by the webPages ;
% MHFs are saved in d_webWork or other directory of d_webRaw ;
menuHeadFootList := list_readFrm_path p_menuHeadFootList ;
FOR MHF WITH menuHeadFootList DO
  write MHF ;
  webPage_update MHF ;
ENDFOR ;
05-----05
Problematic menus :
Menu.html works for all except [Home, Hosted Web-pages, Neural nets]
Menu blogs.html works for all
Menu crazy themes and stories.html doesn't work for [Deer collision, ]
Menu earth, sun, astro, history.html ?? not implemented in main menu (Home)
Menu hosted subsites.html only works for Neil Howell, none of top menu work
Menu Howell videos.html none of [top, video] menus work
Menu Lies, Damned Lies, and Scientists.html
Menu neural nets.html
Menu professional and resume.html
Menu projects.html
Menu software programming.html
>> OK, this doesn't work. Now to find the [problem, solution]
I have one backtrack too many, so take out the "+ 1" that I added earlier today! Change :
+.....+
depther := (gage shape (array_findAll_subArray `/ subDir)) + 1 ;
+.....+
To :
+.....+
depther := gage shape (array_findAll_subArray `/ subDir) ;
+.....+
05-----05
Re-do the whole webSite
qnial> webSite_convert
>> Seemed to go well
qnial> webSite_update
>> YIKES, none of the menus work!!
It may not be useful to keep running webSite_convert, unless problems pop up. By now all targeted files have been converted!
Problematic menus :
It looks like links come up too far (missing a ../ !?? which I just took out!?)
Change back to :
depther := (gage shape (array_findAll_subArray `/ subDir)) + 1 ;
lq_fileops
qnial> webSite_update
file:///media/bill/HOWELL_BASE/Website/page%20projects.html
Neural Nets menu item : file:///media/bill/Neural%20nets/Neural%20Networks.html
Now I'm up two levels, so I need to fix. Check d_webRaw first
>> Idiot. No menus there! Of course.
Try this, though it should crash in d_webRoot?
depther := (gage shape (array_findAll_subArray `/ subDir)) - 1 ;
>> Seems to work well?!!!
24************************24
25Oct2020 backtracks were not executed.
Why, all of a sudden, don't they work?
fileops.ndf flag_break : pinn_writeExecute_pout IS OP pinn d_inn d_out
>> nuts :
1. subDir wasn't added! It is needed to go down from [d_webRaw, d_webSite]
2. menuHeadFoots must be executed as well!
(I stupidly removed that code - but for conversions)
24************************24
25Oct2020 Corrections cycle in d_webRaw
# Fix previous conversions of web-[pages, site]
# 25Oct2020 - It's much easier just to :
1. dirBackup_restoreTo_paths webPages from an earlier date
2. add corrections to webPage_convertEncoding
3. webSite_convert
05-----05
25Oct2020 19:45
1. qnial> dirBackup_restoreTo_paths (link d_webRaw 'z_Archive/201025 19h11m26s backups/') p_webPageList
cp: cannot stat '/media/bill/SWAPPER/Website - raw/z_Archive/201025 19h11m26s backups/201022 18h08m34s Howell - influenza virus.html': No such file or directory
>> I deleted that file (again? - 3rd or 4th time)
2. add corrections to webPage_convertEncoding - sed_footLevels added [F,f] & capitalized Footer :
sed_footLevels := ';s|fin [F,f]ooter[1-9]\.html|fin Footer\.html|' ;
3. webSite_convert
>> seems OK?
05-----05
WebSite update - one-way flow of html files, so backups are not an issue
1. webSite_update
>> None of the menus work!!? - shouldn't have added 1?
>> Ah Hah! - backtracks were not executed. Why, all of a sudden, don't they work?
24************************24
25Oct2020
05-----05
webPage_update IS OP webPage
qnial> webPage_update (link d_webRaw 'Pandemics, health, and the Sun/influenza/Howell - influenza virus.html')
>> It didn't work!!
05-----05
Now for the "Big Test" :
qnial> webSite_update
I need to update [menu, header, footer]s FIRST, then the others!
eg split p_htmlFileList into [[menu, header, footer]s, regular webPages]
CRAP! ALL html webPages have been destroyed! huge work to put back
BUT - they look correct!!!?
05-----05
All the way back to :
# dirBackup_restoreTo_paths (link d_webRaw 'z_Archive/201025 18h31m43s backups/') p_webPageList
>> seemed OK?
qnial> webSite_convert
?path_backupTo_dir file unknown error, OR [path dirBackup] :
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/201022 18h08m34s Howell - influenza virus.html
/media/bill/SWAPPER/Website - raw/z_Archive/201025 19h11m26s backups/
?webPage_convertEncoding file unknown error, webPage :
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/influenza/201022 18h08m34s Howell - influenza virus.html
[#!: path_insertIn_fHand (link d_webWork 'fin footer.html') fout
Binary file (standard input) matches
>> NUTS! - back to the same old shit! I'm going around in circles
24************************24
24Oct2020
05-----05
webSite_update d_webRaw
Crap - I didn't fix up the [webPage, webSite] update operators! Time to go to bed!
05-----05
So now try : webSite_convert d_webRaw
>> didn't work at all?? forgot flag_overwrite webPage dirBackup
webSite_convert d_webRaw
>> all webPages returned :
?webPage_convertEncoding file unknown error, webPage : /media/bill/SWAPPER/Website - raw/
A few corrections of cockups and it ran well. Now to check a random selection of updated webPages in d_webRaw
05-----05
webPage_convertEncoding - outputs a message at the end of the output file :
Binary file (standard input) matches
Why?? - maybe missing apos at [start, end] of sed? I added them in
>> didn't help, sed didn't seem to run?
>> I was already using quote at [start, end] - which is good, so that wasn't the problem
By now, the message is in the original file, so remove it and see what happens.
>> OK, it's no longer there! So I'm ready to do the entire website? Scary...
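Before a site-wide overwrite, a timestamped snapshot is cheap insurance. A minimal rsync sketch (the backup destination is an assumption, not a path from these notes) :
d_webRaw="/media/bill/SWAPPER/Website - raw/"
d_bkup="/media/bill/z_backups/$(date '+%y%m%d %Hh%Mm%Ss') Website - raw/"
mkdir -p "$d_bkup"
rsync -av --exclude 'z_Archive/' "$d_webRaw" "$d_bkup"   # trailing / on source : copy contents, not the dir itself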
I need a backup. Did rsync backup - looks good
05-----05
Test again webPage_convert :
# webPage_convert o (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_Qndfs 'z_Archive/201024 backups/')
>> OK
# /media/bill/SWAPPER/Website - raw/Climate and sun/_Climate and sun.html \.\./ 3; :&file-insert &: 5; \[#=; backtrack ;=#\] 0;
# webPage_convert o (link d_webRaw 'Climate and sun/_Climate and sun.html') (link d_Qndfs 'z_Archive/201024 backups/')
>> mailtos not fixed, file not generated
OK, now it's working well - BUT :
1. get a strange "Binary file input" message at the end of the new file - ?? I don't know what the issue is??
2. have [Menu1, fin footer1, etc] - I added [sed_menuLevels sed_footLevels]
Try again :
# webPage_convert o (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_Qndfs 'z_Archive/201024 backups/')
>> OK for [Menus, foot] but still get an error at the end of the output :
Binary file (standard input) matches
Try a save, and test all [links, mailtos] :
# webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_webRaw 'z_Archive/201024 backups/')
>> Oops, sometimes "fin [F,f]ooter..." -> BUT, I only want to change fin Footer!! as the others aren't indexed
A few links to my website are broken - fix at a later stage
>> For some reason, the "junk message" doesn't show? (I don't get it)
>> file overwritten - Yes, as it should be
>> path_backupTo_dir webPage dirBackup ; did NOT work!?!?
flag_break := l
?.. host link 'mv "' P_temp_webpage_fixmailtos '" ' " Webpage '" '
-->[nextv] webPage
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html
-->[nextv] ?invalid host command
man mv >> specify directory ONLY! nyet - I had switched apo & quote
NUTS!! - I specified the WRONG d_backup!!!
# webPage_convert l (link d_webRaw 'Pandemics, health, and the Sun/_Pandemics, health, and the sun.html') (link d_webRaw 'z_Archive/201024 backups/')
>> OK - now it goes to the right directory
>> still have the issue with : Binary file (standard input) matches
05-----05
Testing 'webPage_convert'
# webPage_convert o (link d_Qtest 'test- HELP.html') (link d_webRaw 'z_Archive/201024 backups/')
05-----05
Test pathListFile_findCountsBY_strList
# list_readFrm_path p_htmlFileList
# pathListFile_findCountsBY_strList p_htmlFileList ('"mailto' '../' ':&file-insert &:' '[#=; backtrack ;=#]')
>> OK, works, but I doubt the "backtrack" counts per file
# pListFiles_findCountsBY_strList p_htmlFileList ('"mailto' '\.\./' ':&file-insert &:' '\[#=; backtrack ;=#\]') p_findCountsBY_strList
# good example of a file with embeddeds :
/media/bill/SWAPPER/Website - raw/Pandemics, health, and the Sun/_Pandemics, health, and the sun.html \.\./ 2; :&file-insert &: 5; \[#=; backtrack ;=#\] 0;
05-----05
# p_temp := link d_temp 'update_encoding temp.txt'
# example with ' '
# generate_menus_levels dw_base
# cmd := link find "' strOld '" -maxdepth 3 -name "' pname '" | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '&file-insert &:' "FILE" | sed 's/:&file-insert &:*/:&menu-insert &:/' | sort -u
# code thoughts
ELSEIF (in_string ':&title-insert &:' line) THEN
  % insert the web-page title construct ;
  IF (~= null (line := strings_between '"' '"' line)) THEN
    line := execute line ;
    write line ;
    insertInPath_fHand line fout ;
  ENDIF ;
# loaddefs link d_Qndfs 'Website updates.ndf'
# enddoc