#]
#] *********************
#] "$d_bin"'webSite notes.txt' - [local, online] d_web update, notes
# www.BillHowell.ca 01Sep2023 initial, previous notes in QNial, etc ("see also" below Setup)
# used to be :
"$d_bin"'webSite update notes.txt' - bash-related webSite maintenance & tracking
www.BillHowell.ca 15May2020 initial
"$d_SysMaint"'webSite/0_website bash notes.txt'
# view in text editor, using constant-width font (eg courier), tabWidth = 3
see also list of related files :
"$d_bin"'webSite maintenance specific files.link.txt'
see "Old ToDos" at bottom of this file
#48************************************************48
#24************************24
# Table of Contents, generate with :
# $ grep "^#]" "$d_bin"'webSite notes.txt' | sed "s/^#\]/ /" >"$d_bin"'webSite TblOfContents notes.txt'
#
#24************************24
#] +-----+
#] ToDos active :
14Sep2023 pOvrClassL_get_pClassL pOvrClassL pHtmlClassAll_L - generate a pList of all classes
16Sep2023 must also trap [no-tab, empty] strP in pHum_sed_pCde? ...later...
don't think affects the current situation (no empty pCde),
but may make code more bullet-proof in general
23Sep2023 probably deleted, must get from backup :
/home/bill/web/eir3.gif
/home/bill/web/../eir_subscribe_button.gif
/home/bill/web/eirtoc/2000/eirtoc_2742.html
22Oct2023 need script to clean up pLnkBmkL.txt
23Feb2024 10:53$ bash "$d_bin"'webSite update local.sh'
extract webPages from htmlL: they have strTst
/home/bill/web/bin/fileops.sh: line 1363: warning: command substitution: ignored null byte in input
/home/bill/web/bin/fileops.sh: line 1363: warning: command substitution: ignored null byte in input
>> must fix two above errors (later)
>> Weird, why are [foot, head]er files updated???
27Feb2024 [SAFIRE, Aureon] webPages need work...
27Feb2024 Grossberg - pMenuTop[status, copyright] - add section
27Feb2024 'pGoodL_pFailL_bandAid TmpFail.txt' - empty, so it's not working
#] +-----+
#] ToDos old :
18Feb2023 process for webSiteLinks update : see
10Nov2023 setup separate update for ConfGuides to pHtmlConfGuideL.txt
20Sep2023 format 'status & updates' MenuTops [All_, TrNN]
I need to split [projmajor, projmini, pandemics, etc] - by color?
20Sep2023 MenuTop [copyright, help] are incomplete, fix classes
27Sep2023 steps of pWebPageL_pStrP_replaceGetBad() - checks should be logged by func!!
22Oct2023 I need to add code to clean up pLnkBad.txt
#] +-----+
#48************************************************48
#08********08
#] ??Feb2024
#08********08
#] ??Feb2024
#08********08
#] ??Feb2024 update pMenuTop[Grossberg, status, copyright]
#08********08
#] ??Feb2024 add to Wickson subDir
#08********08
#] ??Feb2024
#08********08
#] ??Feb2024
#08********08
#] ??Feb2024
#08********08
#] 28Feb2024 upload webSite prematurely so I can email Wickson for discussion
see "$d_PROJECTS"'bin - secure/webSite update online lftp notes.txt'
+-----+
29Feb2024
hmm - I shouldn't upload Isabelle/HOL program stuff!!
yikes!! need to add dExcl to pExc lftp!!
see "$d_PROJECTS"'bin - secure/webSite update online lftp notes.txt'
after lftp upload, must FileZilla check to delete [z_Archive, Isabelle, etc]
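A sketch of the dExcl idea, assuming lftp's mirror command; directory names are made up and the command string is only built and echoed here, not run against a server:

```shell
# Sketch only (names are assumptions): lftp "mirror -R" uploads local->remote,
# and its --exclude takes a regex matched against the relative path, so
# z_Archive and Isabelle never leave the local disk.
d_excl='z_Archive/|Isabelle/|z_Old/'
lftp_cmd="mirror -R --only-newer --exclude '${d_excl}' /home/bill/web /public_html"
echo "$lftp_cmd"
```

with that exclude in place, the post-upload FileZilla delete pass should become unnecessary.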
#08********08
#] 27Feb2024 Wickson [cover, image, figures, comments] add something
cover image - take picture
#08********08
#] 27Feb2024 update webSite with 'pStrPAll_L change.txt' (from item below)
Wickson moved files : add to pStrP
>> hope this works!
change povrL_pStrP_replace because of bookmarks: '^#] ' comment, not `#
>> didn't work, try again with menu fixes
"$d_bin"'webSite run.sh': activate webLocal_step[2, 3]
>> still problems
I fixed several mistakes in 'pStrPAll_L change.txt'
"$d_bin"'webSite run.sh': activate webLocal_step[2, 3]
>> quick checks Wickson 900y ONLY
>> seems OK?
27Feb2024 'pGoodL_pFailL_bandAid TmpFail.txt' - empty, so it's not working
#08********08
#] 27Feb2024 rename webOther subDirs to be consistent
changed following dirNames
added dirChanges to : "$d_webWork"'pStrPAll_L change.txt'
/webWork/Neil Howell/ /webWork/Howell, Neil/
/webWork/Paul L Vaughan/ /webWork/Vaughan, Paul/
/webWork/Wickson website/ /webWork/Wickson, Steven/
/webWork/Steven H Yaskell/ /webWork/Yaskell, Steven/
nemo :
cp "$d_web"'web/webOther/Paul L Vaughan/' to "$d_web"'web/webOther/Vaughan, Paul/'
skip files of same length
exception: make renamed file, then cp :
Vaughan 101223 - Confirmation of Solar forcing of the semi-annual variation of length-of-day.odt
#08********08
#] 27Feb2024 spot check webPage links: [Grossberg, Kaal, Fischer]
+-----+
Grossberg :
menus : work
TblOfContents : doesn't work - regenerate
all other links: work
>> OK, all good
10:49$ ls -1 '/home/bill/web/Neural nets/Grossberg'
menus TblOfCont other links other
ART assess theories of consciousnes bad n/a n/a no content!
ART augmentation of other research. bad n/a n/a no content!
Grossbergs ART- Adaptive Resonance OK n/a n/a only has themeL!
Grossbergs [core, fun, strange] con OK consc item OK
Grossbergs list of [chapter, sectio OK OK OK
Grossbergs list of [figure, table]s OK OK OK
Grossbergs list of index.html OK none none
Grossbergs overview.html OK OK Comparison of rivalry (later)
Grossbergs paleontology.html OK fixed none
Grossbergs Principles, Principia.ht OK n/a n/a no content!
Grossbergs quoted text.html OK fixed none
Grossbergs what is consciousness.ht OK fixed OK almost no content
reader Howell notes.html OK OK mostOK
references- Grossberg.html OK n/a n/a no content!
references- non-Grossberg.html OK none OK
[use, modfication]s of c-ART.html OK n/a n/a no content!
why is cART unknown.html OK n/a none yet very initial
general issues :
n/a haven't done yet
ART [assess, augment] pMenuTop(.).html : some have pMenuTopMenu.html#Grossberg#TrNN_ART
Status - nothing for Grossberg!
problem links :
overview?
(Figure 2.03)
Membrane equations of neurophysiology. Shunting equation
"... Precursor of Shunting network model (Rail 1962) ...")
reader Howell :
"multiple conflicting hypothesis"-
>> overall, very high success rate for links - catch bad ones later with pHtmBmkL_get_pGoodL_pFailL
27Feb2024 Grossberg - pMenuTop[status, copyright]
27Feb2024 Grossberg - ask for text list of references
+-----+
Kaal :
menus : work
TblOfContents : doesn't work - regenerate
all other links: work
>> OK, all good
27Feb2024 SAFIRE webPage needs work...
+-----+
Fischer :
menus : work
TblOfContents : doesn't work - regenerate
all other links: work
>> OK, now works extremely well with proper TblOfContents
#08********08
#] 27Feb2024 [create, test] pGoodL_pFailL_bandAid(no args)
09:58$ bash "$d_bin"'webSite run.sh'
grep: /home/bill/web/webWork/pLnkFailExcl.txt:98: Invalid range end
>> many of these
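My guess at the cause (an assumption): exclude-file lines such as "[core, fun, strange]" get parsed by grep as regex bracket ranges, hence "Invalid range end". grep -F reads each pattern line as a fixed string, so literal paths with [ ] ? match safely:

```shell
# grep -F: pattern-file lines are fixed strings, not regex, so paths
# containing [ ] ? cannot trigger "Invalid range end"
tmp=$(mktemp -d)
printf '%s\n' '/web/Grossbergs [core, fun, strange] concepts.html' >"$tmp/excl.txt"
printf '%s\n' '/web/Grossbergs [core, fun, strange] concepts.html' '/web/other.html' >"$tmp/fail.txt"
grep --invert-match -F --file="$tmp/excl.txt" "$tmp/fail.txt"
```

prints only /web/other.html, with no range errors.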
pTmpFail - NO contents (failed exclude)
pTmpGood - 4411 lines, looks like that worked?
oops - had added back pTmpFail so I took that out of cat
change :
#grep --invert-match --file="$pLnkFailExc769" "$pLnkFail769" >"$pTmpFail"
to :
diff "$pLnkFail769" "$pLnkFailExc769" --suppress-common-lines | grep "<" | sed 's/< //' >"$pLnkFailDif769"
>> Awesome, looks good
>> probably many errors in pLnkFailExc769, but at least I can now work with pLnkFail769
will have to do spot manual checks with webPages...
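Same "in fail-list but not in exclude-list" result can come from comm -23, assuming both files are sorted; it is more direct than the diff | grep | sed chain (toy data, made-up names):

```shell
# comm -23 : lines only in file1 (requires both inputs sorted)
tmp=$(mktemp -d)
printf '%s\n' a b c d | sort >"$tmp/pLnkFail.txt"
printf '%s\n' b d | sort >"$tmp/pLnkFailExc.txt"
comm -23 "$tmp/pLnkFail.txt" "$tmp/pLnkFailExc.txt"
```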
#08********08
#] 26Feb2024 test pHtmBmkL_get_pGoodL_pBadL
18:58 leave it for another day... sort-of works, but not handy for making fixes
far too much webPage work needed anyways!
+-----+
08:47$ bash "$d_bin"'webSite run.sh'
/home/bill/web/bin/webSite run.sh: line 154: /home/bill/web/webWork/pHtmlPathAll_L.txt: No such file or directory
/home/bill/web/bin/webSite run.sh: line 156: syntax error near unexpected token `}'
/home/bill/web/bin/webSite run.sh: line 156: `}'
~
09:08$ bash "$d_bin"'webSite run.sh'
dWebWork_archiveLocal...
...
+-----+
many errors like :
/home/bill/web/bin/webSite.sh: line 118: : No such file or directory
grep: Invalid range end
+-----+
>> all 12,180 pLnkInt769 failed?
must sort -u pLnk[Good, Fail]L.txt
# tests
10:32$ pHtmBmk072='/home/bill/web/ProjMajor/Sun pandemics, health/_Pandemics, health, and the sun.html#Robert Prechter - Socionomics, the first quantitative sociology?'
~
10:36$ echo "$pHtmBmk072" | sed 's|\#.*||'
/home/bill/web/ProjMajor/Sun pandemics, health/_Pandemics, health, and the sun.html
~
>> OK, works
10:36$ pHtmBmk072='/home/bill/web/Neural nets/Grossberg/Grossbergs [core, fun, strange] concepts.html#ARTPHONE [gain control, working] memory'
~
10:38$ echo "$pHtmBmk072" | sed 's|\#.*||'
/home/bill/web/Neural nets/Grossberg/Grossbergs [core, fun, strange] concepts.html
~
>> OK, works
10:39$ pHtmBmk072='/home/bill/web/Neural nets/Grossberg/Grossbergs [core, fun, strange] concepts.html'
~
10:39$ echo "$pHtmBmk072" | sed 's|\#.*||'
/home/bill/web/Neural nets/Grossberg/Grossbergs [core, fun, strange] concepts.html
~
>> OK, works
10:39$ pHtmBmk072='/home/bill/web/ProjMajor/Sun pandemics, health/_Pandemics, health, and the sun.html#Robert Prechter - Socionomics, the first quantitative sociology?'
~
10:43$ echo "$pHtmBmk072" | sed 's|.*\#||'
Robert Prechter - Socionomics, the first quantitative sociology?
~
>> OK, works
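Same splits can be done without sed, using bash parameter expansion (sketch with a made-up path): %%#* strips from the FIRST '#' (leaving the path), ##*# keeps what follows the LAST '#' (the bookmark):

```shell
# bash parameter expansion instead of sed for [path, bookmark] split
pHtmBmk='/home/bill/web/x/page.html#Some bookmark'
pth="${pHtmBmk%%#*}"    # strip from first '#' to end -> the path
bmk="${pHtmBmk##*#}"    # strip up to last '#' -> the bookmark
echo "$pth"
echo "$bmk"
```

when there is no '#', both expansions return the whole string, which is exactly the no-bookmark test used below.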
pTmpBol__072="$d_temp"'pHtmBmkL_get_pGoodL_pBadL temp bolBmk.txt'
pth072='/home/bill/web/Neural nets/Grossberg/Grossbergs [core, fun, strange] concepts.html'
bmk072='Robert Prechter - Socionomics, the first quantitative sociology?'
grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
>> fails: forgot to use pHtmBmk072#?? or what
>> oops - no bmk 'Robert Prechter - Socionomics, the first quantitative sociology?'
pTmpBol__072="$d_temp"'pHtmBmkL_get_pGoodL_pBadL temp bolBmk.txt'
pth072='/home/bill/web/Neural nets/Grossberg/Grossbergs [core, fun, strange] concepts.html'
10:47$ bmk072='Principles, Principia'
~
10:51$ grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
~
pTmpBol__072 :
>> works
10:54$ if [ -s "$pTmpBol__072" ]; then echo 'works'; else echo 'no good'; fi
works
~
>> OK, works
10:54$ bmk072='Robert Prechter - Socionomics, the first quantitative sociology?'
~
10:56$ grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
~
10:56$ if [ -s "$pTmpBol__072" ]; then echo 'works'; else echo 'no good'; fi
no good
~
>> OK, good, as the bmk doesn't exist in that file
+-----+
NUTS!!!!
change : "$pGoodL__769" to : "$pLnkGood769"
>> runs OK, results suck :
>> pLnkGood.txt 602 lines
>> pLnkFail.txt 3,798 lines
>> grep: Invalid range end ~45 terminal lines
spot check pLnkFail.txt :
/home/bill/web/webWork/pMenuTopHelp.html#Theme webPage generation by bash script
/home/bill/web/webWork/pMenuTopStatus.html#Wickson
Grossberg - mostly png files
Kaal (3290 - 3243 + 1) = 48
/home/bill/web/ProjMajor/Electric Universe/Kaal SAM nucleus/images/Kaal EU2017 Carl Johnson - Statistical analysis of isotope masses.png
Schmidhuber - accounts for (3092 - 965 + 1) = 2,128 of 3,798
examples :
/home/bill/web/Neural nets/References/Schmidhuber 29Dec2022 Annotated history of modern AI and deep neural networks.html#TUR1
/home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html#NASC8
/home/bill/web/References/Neural Nets/Howell 110824 - Confabulation Theory - Plausible next sentence survey.pdf
I must only check html files for bookmarks!!
>> did modifications to the code
12:01$ bash "$d_bin"'webSite run.sh'
...
grep: Invalid range end
grep: Invalid range end
grep: Invalid range end
~
>> wow, only 3 terminal errors this time
>> pLnkGood.txt 1,711 lines
>> pLnkFail.txt 3,798 lines same as last time??
pLnkFail.txt seems to have many good pHtmBmk
check some :
/home/bill/web/20120 [before, after] running head-on into a semi-tractor trailor hauling propane.jpg
>> should be OK?
/home/bill/web/Bill Howells book [note, review]s/Wilson 1977 Cosmic trigger, Howells review.html#Comparison of [TradingView, Yahoo finance] data
>> bad Table of Contents
>> error generation ruins TblOfContents: why
>> formatting not done - what a mess
/home/bill/web/Bill Howells videos/120214 Venus et Mars, au dela d une histoire d amour/Mythology.ogv
/home/bill/web/Bill Howells videos/220331 Hydrogen future Alberta/scripts - hydrogen future Alberta.txt
>> why are these a problem? - pLnkFail.txt is FULL of good links!!
+-----+
Hand-test them to see
pTmpBol__072="$d_temp"'pHtmBmkL_get_pGoodL_pBadL temp bolBmk.txt'
pth072='/home/bill/web/20120 [before, after] running head-on into a semi-tractor trailor hauling propane.jpg'
bmk072='Robert Prechter - Socionomics, the first quantitative sociology?'
grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
I fixed pHtmBmkL_get_pGoodL_pBadL() :
# test if there is a bookmark, or not
bmk072=$( echo "$pHtmBmk072" | sed 's|.*\#||' )
if [[ "$pHtmBmk072" == "$bmk072" ]]; then
#there is NO bookmark
echo "$pHtmBmk072" >>"$pTmpGoodL072"
else
grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
if [ -s "$pTmpBol__072" ]; then
echo "$pHtmBmk072" >>"$pTmpGoodL072"
else echo "$pHtmBmk072" >>"$pTmpFailL072"
fi
fi
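A compact variant of the same test (file and names are made up here), using grep's exit status instead of a temp file, plus -F so bookmark text containing [ ] ? is matched literally rather than as regex:

```shell
# bookmark test via grep exit status; -F avoids regex surprises in bookmarks
tmp=$(mktemp -d)
printf '<a name="Principles, Principia">\n' >"$tmp/page.html"
pHtmBmk="$tmp/page.html#Principles, Principia"
pth="${pHtmBmk%%#*}"
bmk="${pHtmBmk##*#}"
if [ "$pHtmBmk" = "$bmk" ]; then
	res='good'                          # no bookmark present, path test is enough
elif grep -qiF -- "$bmk" "$pth"; then
	res='good'                          # bookmark text found inside the html
else
	res='fail'
fi
echo "$res"
```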
13:46$ bash "$d_bin"'webSite run.sh'
...
grep: Invalid range end
grep: Invalid range end
grep: Invalid range end
~
>> again, only 3 terminal errors this time
>> pLnkGood.txt 1816 lines
>> pLnkFail.txt 3799 lines same as last time??
>> ergo, problem NOT solved!??? most pLnkFail.txt are GOOD paths!
change :
grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
if [ -s "$pTmpBol__072" ]; then
echo "$pHtmBmk072" >>"$pTmpGoodL072"
else echo "$pHtmBmk072" >>"$pTmpFailL072"
fi
to :
grep -i "$bmk072" "$pth072" >"$pTmpBol__072"
if [ -s "$pTmpBol__072" ]; then
echo "$pHtmBmk072" >>"$pTmpGoodL072"
else echo "$pHtmBmk072" >>"$pTmpFailL072"
fi
>> still doesn't work
14:15$ echo '/home/bill/web/webWork/pMenuTopStatus.html#projMini' | grep '#'
/home/bill/web/webWork/pMenuTopStatus.html#projMini
~
14:16$ echo '/home/bill/web/webWork/pMenuTopStatus.html' | grep '#'
~
>> OK
14:19$ if [ -n '/home/bill/web/webWork/pMenuTopStatus.html' ]; then echo 'good'; else echo 'bad'; fi
good
~
14:19$ if [ -n '' ]; then echo 'good'; else echo 'bad'; fi
bad
~
>> OK
pth_get_ext '/home/bill/web/webWork/pMenuTopStatus.html'
sh
>> OK
+--+
15:47$ pHtmBmk072='/home/bill/web/Bill Howells videos/120214 Venus et Mars, au dela d une histoire d amour/Mythology.ogv'
~
15:47$ pth072=$( echo "$pHtmBmk072" | sed 's|\#.*||' )
~
15:48$ echo "$pth072"
/home/bill/web/Bill Howells videos/120214 Venus et Mars, au dela d une histoire d amour/Mythology.ogv
~
15:48$ if [[ -f "$pth072" || -d "$pth072" ]]; then echo 'good'; else echo 'bad'; fi
good
~
15:49$ ext072=$( pth_get_ext "$pth072" )
pth_get_ext: command not found
~
15:50$ ext072='ogv'
~
15:51$ if [[ 'html' == "$ext072" ]]; then echo 'good'; else echo 'bad'; fi
bad
~
>> OK, makes sense, so pHtmBmk072 should be in pLnkGoodL.txt, but it isn't!!
>> YIKES!! yes it is
>> Why?
There are 3 copies of pHtmBmk072 in pLnkInt.txt
webSite_get_links() must sort -u
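Dedup sketch (file name assumed): sort -u with -o rewrites the list in place; sort reads all input before opening the output, so same-file in/out is safe:

```shell
# remove duplicate links in place
tmp=$(mktemp -d)
printf '%s\n' /web/a.html /web/b.html /web/a.html /web/a.html >"$tmp/pLnkInt.txt"
sort -u "$tmp/pLnkInt.txt" -o "$tmp/pLnkInt.txt"
cat "$tmp/pLnkInt.txt"
```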
17:17$ bash "$d_bin"'webSite run.sh'
webSite_get_links...
grep: /home/bill/web/Neural nets/References/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html: binary file matches
... many
sort-grep start : 240226 17h25m05s
sort-grep end : 240226 17h25m05s
...
grep: Invalid range end
grep: Invalid range end
grep: Invalid range end
>> file lines
pLnkInt.txt 4,414
pLnkGood.txt 1,832
pLnkFail.txt 4,352
must be overlap big time, why?
+-----+
pLnk[Good, Fail].txt share many common paths
How does this happen? shouldn't
17:56$ comm -12 "$d_webWork"'pLnkGood.txt' "$d_webWork"'pLnkFail.txt' >"$d_temp"'pLnk[Good, Fail] common lines.txt'
~
>> 1,765 lines are common (pretty well all of pLnkGood.txt)
But how is the same file sent to both lists?
logic is either-or??? obviously not
temp files aren't rm !
pTmpGoodL072="$d_temp"'pHtmBmkL_get_pGoodL_pBadL temp Good.txt'
pTmpFailL072="$d_temp"'pHtmBmkL_get_pGoodL_pBadL temp Fail.txt'
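Since '>>' appends, leftovers from earlier runs would explain the inflated counts; a sketch of a clean-start guard at the top of the function (names assumed from the notes):

```shell
# '>>' appends, so stale lines from a previous run survive unless the temp
# files are removed first; rm -f at function start gives a clean slate
tmp=$(mktemp -d)
pTmpGoodL="$tmp/pHtmBmkL temp Good.txt"
echo 'stale line from last run' >"$pTmpGoodL"   # simulate leftovers
rm -f "$pTmpGoodL"                              # start each run clean
echo '/web/ok.html' >>"$pTmpGoodL"
cat "$pTmpGoodL"
```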
18:06$ bash "$d_bin"'webSite run.sh'
...
grep: Invalid range end
grep: Invalid range end
grep: Invalid range end
~
18:08$
>> file lines
pLnkInt.txt 4,414
pLnkGood.txt 1,832
pLnkFail.txt 4,352
>> same as before ARRRGGGHHH!!
18:11$ comm -12 "$d_webWork"'pLnkGood.txt' "$d_webWork"'pLnkFail.txt' >"$d_temp"'pLnk[Good, Fail] common lines.txt'
>> 1,765 lines are common, no change from before
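One caution on those comm runs: comm assumes both inputs are sorted and silently gives wrong overlaps otherwise; sorting first makes the common-line count trustworthy (toy data):

```shell
# comm needs sorted inputs; sort in place first, then count common lines
tmp=$(mktemp -d)
printf '%s\n' b a c >"$tmp/good.txt"
printf '%s\n' c a >"$tmp/fail.txt"
sort "$tmp/good.txt" -o "$tmp/good.txt"
sort "$tmp/fail.txt" -o "$tmp/fail.txt"
comm -12 "$tmp/good.txt" "$tmp/fail.txt" | wc -l   # a and c are common -> 2
```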
+-----+
For now, fudge a list of problematic links
18:41$ diff "$d_webWork"'pLnkFail.txt' "$d_webWork"'pLnkGood.txt' --suppress-common-lines | grep "<" | sed 's/^< //' >"$d_webWork"'pFailDiff.txt'
~
>> gives 2588 lines (too many?)
BUT pLnkGood.txt + pFailDiff.txt ~= pLnkInt.txt
1,832 + 2,588 = 4,420 ~= 4,414
leave it for another day... sort-of works, but not handy for making fixes
+-----+
olde code
# # 25Feb2024 REVAMP!!! file-exists no good if there is a bookmark!!!!
# while IFS='' read -u 366 pLnkIn_769; do
# bolBmk366=$( echo "$pLnkIn_769" | grep '#' )
# if [[ -n "$bolBmk366" ]]; then
# bolLnk=$( 'link check in place of -f????' )
#
# if ?? [[ == "$bolLnk" ]]; then
# echo "$pLnkIn_769" >>"$pLnkFail769"
# fi
# else
# if ! [[ -f "$pLnkIn_769" || -d "$pLnkIn_769" ]]; then
# echo "$pLnkIn_769" >>"$pLnkFail769"
# fi
# fi
# done 366<"$pLnkInt769"
#08********08
#] 25Feb2024 Howell : write bash function that does test for html files only
see "$d_bin"'webSite.sh'
+-----+
don't create - don't need for now :
# pHtmBmkL_is(pHtmBmkL) - local backups of key d_webWork files
# ?date? initial, 23Feb2024 convert to global env symbols in 'fileops header.sh'
# 23Feb2024 should eventually move into functions that change files (some already done)
pHtmBmk_is()
{
}
+-----+
olde code
# webSite_check_internalLinks(no args) - check internal links, 2nd set of bads to pLnkFailL
# to within webSite
# also : do BEFORE uploading online to webSite & fix problems
# QNial version
# urls_check IS OP linkType d_backup - ] create sublists of [internal,xternal] links
# >> I'm just checking if a path exists
# 14Sep2023 initial from QNial
webSite_check_internalLinks()
{
date_ymdhms=$(date +"%0y%0m%0d %0kh%0Mm%0Ss")
echo >>"$p_log" "$date_ymdhms webSite_check_internalLinks"
echo 'webSite_check_internalLinks...'
# 'webSite header.sh' : pLnk[Fail, Int]769
# pLog366="$d_webWork"'webSite_check_internalLinks log.txt'
if [ -f "$pLnkGood769" ]; then pinn_archiveLocalRm "$pLnkGood769"; fi
if [ -f "$pLnkFail769" ]; then pinn_archiveLocalRm "$pLnkFail769"; fi
# do all, even if a pHtmBmk has no bookmark, it is still "good" if the pth is good
pHtmBmkL_get_pGoodL_pBadL "$pWebYes769" "$pLnkGood769" "$pLnkFail769"
}
#08********08
#] 25Feb2024 [webSite_get_links, pWebL_get_pLnkL_run] fix for bookmarks
around & around in circles, rename [func, sym]s, pinn_archiveLocal [efficiency, complete]
look again at [pWebPag_sedStrt_get_pLnkL, webSite_get_links] :
OK, finally too tired, just try webLocal_step3 webSite_[get_link, check_internalLink]s() :
14:24$ bash "$d_bin"'webSite run.sh'
/home/bill/web/bin/webSite.sh: line 476: syntax error near unexpected token `done'
/home/bill/web/bin/webSite.sh: line 476: ` done'
/home/bill/web/bin/webSite run.sh: line 286: webSite_get_links: command not found
/home/bill/web/bin/webSite run.sh: line 323: webSite_check_internalLinks: command not found
>> oops...
+-----+
#] curl : can't get to work consistently : see "$d_SysMaint"'Linux/curl notes.txt'
+-----+
#] use file-exists test? nyet...
18:41$ putter='/home/bill/web/Bill Howells book [note, review]s/Wilson 1977 Cosmic trigger, Howells review.html#Summary comments'
~
18:41$ if [ -f "$putter" ]; then echo 'yes'; else echo 'no'; fi
no
~
>> not good enough
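The -f test can still work if the '#bookmark' fragment is stripped first (sketch; only the string handling is shown, the actual file need not exist here):

```shell
# strip the '#bookmark' fragment, then the remainder is a plain path
putter='/home/bill/web/Bill Howells book [note, review]s/Wilson 1977 Cosmic trigger, Howells review.html#Summary comments'
pth="${putter%%#*}"
echo "$pth"
```

then [ -f "$pth" ] answers "does the page exist"; whether the bookmark itself exists still needs a grep inside the file.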
+-----+
#] search "Linux html test path with #"
still can't find anything good
+-----+
Howell : write bash function that does test for html files only
+-----+
olde code
pWebPag_sedStrt_get_pLnkL()
# grep "^#" "$pLnkTmp577" | sort -u >"$pLnkBmkInn769"
# grep "^[#]\(.*\)#" "$pLnkTmp577" | sort -u >"$pLnkBmkOut769"
# nItrMax=0
# if [[ "$nItr" -gt "$nItrMax" ]]; then nItrMax="$nItr"; fi
webSite_get_links()
grep -i "$sedLnk" "$pLnkTmp901" | sed 's|\(.*\)#.*|\1|' | sort -u >"$pLnkAll769"
'webSite header.sh' :
pLnkRaw769="$d_webWork"'pLnkRaw.txt'
#08********08
#] 24Feb2024 search for pLnkFail - doesn't work, is it the bookmarks?
this means a simple check for [ -f pth ] isn't enough!!?
#08********08
#] 24Feb2024 split webSite.sh from fileops.sh (as was before)
see "$d_bin"'fileops run.sh'
create webSite.sh from fileops.sh, adjust header files etc
yet another check of 'pStrPAll_L change.txt'
Now - 'pStrPAll_L change.txt' - are ^# lines ignored??
see "$d_bin"'fileops notes.txt'
>> should work now (I hope)
16:44$ bash "$d_bin"'webSite run.sh'
23Feb2024 pStrPAll_L change.txt - fix up what I can, also 24Feb2024
povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) sed replace encoded (pStrP, povrL)
...
~
16:59$ bash "$d_bin"'webSite run.sh'
webSite_get_links...
/home/bill/web/bin/webSite.sh: line 513: : No such file or directory
cat: '': No such file or directory
/home/bill/web/bin/webSite.sh: line 516: : No such file or directory
"=FERH 9724
'=FERH 32
"=ferh 17243
'=ferh 6
"=CRS 405
'=CRS 0
"=crs 787
'=crs 35
webSite_check_internalLinks...
#08********08
#] 23Feb2024 corrections for "$d_webWork"'pStrPAll_L change.txt'
fix dWebWork_archiveLocal() : archive pL when working on webSite
pLnkL_diffNum_pLogs() - removed as is obsolete...
First [archive, update] * p[Htm, Web]L :
10:53$ bash "$d_bin"'webSite update local.sh'
dWebWork_archiveLocal...
find all html files
extract webPages from htmlL: they have strTst
/home/bill/web/bin/fileops.sh: line 1363: warning: command substitution: ignored null byte in input
/home/bill/web/bin/fileops.sh: line 1363: warning: command substitution: ignored null byte in input
>> must fix two above errors (later)
>> Weird, why are [foot, head]er files updated???
>> Overall, pWeb[YesL, Diff].txt look OK
continue : corrections for "$d_webWork"'pStrPAll_L change.txt'
+-----+
olde code
get rid of redirection : simplify, can always use new 'fileops header.sh'
# webSite_get_links(no args) - extract links, note pLnkBadL.txt captures initial bads
# normal use for my webSite, globals to define args (not worth passing variables)
# see also webSite_get_links_test() in "$d_bin"'0_test/fileops/fileops test.sh'
# usually for website management, I start only with base path, worry about bookmarks later
# 03Sep2023 initial, needs a tiny bit of manual work, also
#08********08
#] 22Feb2024 [curl, lftp, wget] for visitor to download a directory listing of my webSite
ouch! not working
did I set permissions too low?
if I allow ftp-download, is my website vulnerable to uploads?
can I even allow that? or is it more money for a different type of site?
#08********08
#] 22Feb2024 pLnkFail.txt 94 lines -> [fix pMenuTop(.*), pStrP for povrL_pStrP_replace]
should be able to get very low number of "visible" errors
+-----+
pStrP for povrL_pStrP_replace
povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) sed replace encoded (pStrP, povrL)
does it NOT process a file (change timestamp) if search term doesn't appear?
08:56$ cp -p "$d_webWork"'pLnkFail.txt' "$d_webWork"'240222 pLnkFail to pStrP.txt'
create initial pStrP : geany regexpr search : (.*) replace : \1\t\1\n
now edits using nemo filemgr
+-----+
webSite_get_links: speed-up
+-----+
fix pMenuTop(.*) if necessary
Nah - don't piss around ; do pStrP first
#08********08
#] 21Feb2024 webSite_check_internalLinks - find failed internal links
17:51$ bash "$d_bin"'webSite update local.sh'
webSite_check_internalLinks...
>> pLnkFail.txt : 735 lines!!
>> huge number of /TrNNs_ART/ still
>> pLnkInt.txt must be way out of date????
[/TrNNs_ART/, corrected] dirs both listed
I can't find where NEW pLnkInt.txt is generated!!!???
check webSite_get_links()
grep -i --invert-match "$sedLnk" "$pLnkTmp901" | grep --invert-match '???\|\.\/\|\");' | sort -u >"$pLnkBad901"
grep -i "$sedLnk" "$pLnkTmp901" | sed 's|\(.*\)#.*|\1|' | sort -u >"$pLnkAll901"
grep "^\/home\/bill\/web\/" "$pLnkAll901" >"$pLnkInt901"
grep "^http:\/\/\|^https:\/\/" "$pLnkAll901" >"$pLnkExt901"
So pLnkAll is the source of the problem... but how did /TrNNs_ART/ sneak back in?
it's like there is a phantom old copy of 'Grossbergs list of [figure, table]s.html'
NOT in pHtmAllL.txt
maybe z_Archives are being included?
dWeb_get_pHtmWebYesNonL()
find "$d_web" -type f -name "*.html" | grep --invert-match -i "z_Old\|z_Archive\|z_history" | sort -u >"$pHtmAll769"
nyet
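An alternative sketch (approximation: -iname matches the dir basename, where the grep matched anywhere in the path): prune the archive dirs inside find itself so they are never descended into, instead of filtering afterward:

```shell
# -prune skips the excluded dirs entirely; only html files elsewhere print
tmp=$(mktemp -d)
mkdir -p "$tmp/ok" "$tmp/z_Archive"
touch "$tmp/ok/a.html" "$tmp/z_Archive/old.html"
find "$tmp" -type d \( -iname 'z_Old' -o -iname 'z_Archive' -o -iname 'z_history' \) -prune \
	-o -type f -name '*.html' -print
```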
$ cat "$d_webWork"'pLnkAll.txt' | tr \\n \\0 | xargs -0 -IFILE grep --with-filename --line-number '\/Neural nets\/TrNNs_ART\/' "FILE" | sed "s|\/home\/bill\/web\/||;s|:.*||" | sort -u >"$d_temp"'pLnkAll grep TrNNs_ART.txt'
>> huge terminal output of errors
+--+
...
grep: http://www.oiq.qc.ca/bulletins/25/images/zaki_ghavitian.jpg: No such file or directory
grep: http://www.oism.org/pproject/: No such file or directory
grep: http://www.oism.org/pproject/s33p36.htm: No such file or directory
grep: http://www.olsc.ca/: No such file or directory
grep: http://www.ptep-online.com/2015/PP-41-13.PDF: No such file or directory
grep: http://www.ptep-online.com/2018/PP-53-01.PDF: No such file or directory
...
+--+
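The "No such file or directory" noise fits external URLs being handed to grep as filenames; filtering the list to local paths before the xargs pass avoids it (sketch, made-up file names):

```shell
# keep only local webSite paths; http/https lines are not files on disk
tmp=$(mktemp -d)
printf '%s\n' '/home/bill/web/a.html' 'http://www.oism.org/pproject/' >"$tmp/pLnkAll.txt"
grep '^/home/bill/web/' "$tmp/pLnkAll.txt" >"$tmp/pLnkLocal.txt"
cat "$tmp/pLnkLocal.txt"
```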
"$d_temp"'pLnkAll grep TrNNs_ART.txt' :
bin/image
bin/webSite update notes.txt
Neural nets/callerID-SNNs/callerID-SNNs.html
Neural nets/MindCode/MindCode webPage.html
webOther/Wickson website/webWork/pMenuTopCopyright Wickson.html
webOther/Wickson website/webWork/pMenuTopHelp Wickson.html
webOther/Wickson website/webWork/pMenuTopMenu Wickson.html
webOther/Wickson website/webWork/pMenuTopStatus Wickson.html
webWork/pMenuTopHelp.html
+--+
>> fixed up mostly Wickson webSite - a lot of Grossberg's stuff
webLocal_step3 # webSite LOCAL, [get, check] *links
21:06$ bash "$d_bin"'webSite update local.sh'
webSite_get_links...
webSite_check_internalLinks...
>> pLnkFail.txt 94 lines - make pStrP from them, should be able to get very low "visible" errors
#08********08
#] 21Feb2024 webSite_get_links() - clean up bad links
ran - see stats in "$d_bin"'webSite update local.sh'
pLnkBad.txt, need to grep-out :
line =
whitespace
???
./ these should be fixed to [track, fix] links, but will work supposedly
");
%20, examples:
Howell%20-%20Are%20we%20ready%20for%20global%20cooling%2014Mar06%20longer%20version.pdf
Top%2075%20Immunotherapy%20startups_files/
Schmidhuber%2024Sep2021%20Scientific%20Integrity,%20the%202021%20Turing%20Lecture,%20and%20the%202018%20Turing%20Award%20for%20Deep%20Learning_files/critique-turing-award-lecture754x288.png
Campbell,%20Grossman,%20Turner%2004Sep2019%20monthly%20British%20stock%20market,%201829-1929_files/AdobeStock_87854486.jpeg
Schmidhuber%2024Sep2021%20Scientific%20Integrity,%20the%202021%20Turing%20Lecture,%20and%20the%202018%20Turing%20Award%20for%20Deep%20Learning_files/critique-turing-award-lecture754x288.png
Gregory Aug07 - Climate Change Science_files/header.png
maybe half of the bad links are related to juergen. Examples :
ftp://ftp.idsia.ch/pub/juergen/IJCAI07sequence.pdf
[#=; backtrack ;=#]Neural nets/Conference guides/2020 WCCI Glasgow/logo_IEEE.png
[#=; backtrack ;=#]Neural nets/Conference guides/2020 WCCI Glasgow/logo_IEEE-CIS.png
[#=; backtrack ;=#]Neural nets/Conference guides/2019 IJCNN Budapest/logo_INNS.png
[#=; backtrack ;=#]Neural nets/Conference guides/2020 WCCI Glasgow/logo_IET.jpg
[#=; backtrack ;=#]Neural nets/Conference guides/2020 WCCI Glasgow/logo_EPS.png
[#=; backtrack ;=#]Neural nets/Conference guides/2020 WCCI Glasgow/logo_NapierU.jpg
[#=; backtrack ;=#]Neural nets/Conference guides/2019 IJCNN Budapest/logo Nokia Bell Labs.png
[#=; backtrack ;=#]Neural nets/Conference guides/2019 IJCNN Budapest/logo-bscs.png
[#=; backtrack ;=#]Neural nets/Conference guides/2019 IJCNN Budapest/logo MDPI Information and Algorithms.jpg
[#=; backtrack ;=#]Neural nets/Conference guides/2019 IJCNN Budapest/logo Genisama.png
+-----+
Clean up original links in webPages, then save remaining to filter out.
It's far easier (for me) to simply create (pStrP, povrL) files for :
povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) sed replace encoded (pStrP, povrL)
webSite_get_links_run() changeto :
grep -i --invert-match "$sedLnk" "$pLnkTmp901" | grep --invert-match '???\|\.\/\|\");' >"$pLnkBad901"
15:33$ bash "$d_bin"'webSite update local.sh'
21Feb2024 pLnkBad.txt - fix up what I can
povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) sed replace encoded (pStrP, povrL)
...
grep: (standard input): binary file matches
grep: (standard input): binary file matches
16:48$ bash "$d_bin"'webSite update local.sh'
webSite_get_links...
>> this did NOT help at all!
all the old LnkBads still appear.
Huge waste of time that distracts me from fixing errors.
+-----+
try manually (now in webSite_get_links_run()) :
# track bad links - many to much work to fix, not critical
$ sedLnk="^\/home\/bill\/web/\|^http:\/\/\|^https:\/\/\|^#\|^[#]\(.*\)#"
$ grep -i --invert-match "$sedLnk" "$pLnkTmp901" | grep --invert-match '???\|\.\/\|\");' | sort -u >"$pLnkBad901"
start with
$ pLnkBad901="$d_webWork"'pLnkBad.txt'
$ pLnkBadStd901="$d_webWork"'pLnkBadStd.txt'
$ pLnkBadDif901="$d_webWork"'pLnkBadDif.txt'
$ sort -u "$pLnkBad901" >"$pLnkBadStd901"
$ diff "$pLnkBadStd901" "$pLnkBadStd901" --suppress-common-lines >"$pLnkBadDif901"
>> "$d_webWork"'pLnkBadDif.txt' has zero content - but both diff args are the same file, so empty output is guaranteed; diff against a saved baseline copy to see real changes
>> only ~1239 lines in "$d_webWork"'pLnkBadStd.txt'
At least I will be able to easily see CHANGES without huge cleanup effort
#08********08
#] 20Feb2024 fix [link, pTopMenu]s in 'Neural nets/[consciousness, Grossberg]/' :
+-----+
mv [fil, dir]s from consciousness to Grossberg :
consciousness - Grossberg-related themes, which must mv to Grossberg? :
y "$d_neural"'consciousness/ART assess theories of consciousness.html
y "$d_neural"'consciousness/ART augmentation of other research.html
"$d_neural"'consciousness/[definitions, models] of consciousness.html
"$d_neural"'consciousness/For whom the bell tolls.html
y "$d_neural"'consciousness/Grossbergs what is consciousness.html
"$d_neural"'consciousness/Introduction.html
"$d_neural"'consciousness/machine consciousness, the need.html
"$d_neural"'consciousness/opinions- Blake Lemoine, others.html
"$d_neural"'consciousness/Pribram 1993 quantum fields and consciousness proceedings.html
"$d_neural"'consciousness/Quantum consciousness.html
"$d_neural"'consciousness/Taylors consciousness.html
"$d_neural"'consciousness/TrNN controls need consciousness.html
"$d_neural"'consciousness/TrNNs augment by cART.html
"$d_neural"'consciousness/TrNNs have incipient consciousness.html
y "$d_neural"'consciousness/videoProdn/Grossbergs ART- Adaptive Resonance Theory workCore.html
y "$d_neural"'consciousness/videoProdn/Grossbergs ART- Adaptive Resonance Theory workCull.html
y "$d_neural"'consciousness/videoProdn/Grossbergs Consciousness: video script.html
>> mv ALL "$d_neural"'consciousness/videoProdn/'
"$d_neural"'consciousness/Walter Freemans chaos.html
"$d_neural"'consciousness/webWork/bash script: put [caption, reference]s on [figure, table]s.html
y "$d_neural"'consciousness/webWork/Grossbergs [core, fun, strange] concepts header file.html
"$d_neural"'consciousness/webWork/pMenuTopCopyright TrNNs_ART.html
"$d_neural"'consciousness/webWork/pMenuTopHelp TrNNs_ART.html
"$d_neural"'consciousness/webWork/pMenuTopMenu TrNNs_ART.html
"$d_neural"'consciousness/webWork/pMenuTopStatus TrNNs_ART.html
"$d_neural"'consciousness/What is consciousness: from historical to Grossberg.html
y "$d_neural"'consciousness/why is cART unknown.html
+-----+
NYET!!! : TRASH separate Menus to [Grossberg, consciousness]
maximize simplicity for webMaintainer (me)
but not for user
>> done, works
21Feb2024 [copy, transform]
from : "$d_[consciousness, Grossberg]"'webWork/pHtmlClass[consciousness, Grossberg].txt
to : "$d_webWork"'pHtmlClassAll_L.txt'
>> actually, just morphed "$d_Grossberg"'webWork/pHtmlClassGrossberg.txt'
+-----+
fix povrL for d_[Grossberg, consciousness] :
$ grep '/Neural nets/consciousness/' "$d_webWork"'pWebYesL.txt' >"$d_webWork"'240221 webPageL consciousness.txt'
$ grep '/Neural nets/Grossberg/' "$d_webWork"'pWebYesL.txt' >"$d_webWork"'240221 webPageL Grossberg.txt'
>> errors in list, must regenerate pWebYesL
>> done - OK, povrL look OK
+-----+
make consciousness changes with :
povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) sed replace encoded (pStrP, povrL)
see webLocal_step2() in "$d_bin"'webSite update local.sh'
did consciousness first to see if any issues
oops - forgot class fix :
consciousness.html#TrNN_ART"> consciousness.html#consciousness">
try again :
>> seems OK
>> click through menu
hah - still forgot
+--+
change :
/Neural nets/TrNNs_ART/ /Neural nets/consciousness/
pMenuTopMenu TrNNs_ART.html pMenuTopMenu consciousness.html
pMenuTopStatus TrNNs_ART.html pMenuTopStatus consciousness.html
pMenuTopCopyright TrNNs_ART.html pMenuTopCopyright consciousness.html
pMenuTopHelp TrNNs_ART.html pMenuTopHelp consciousness.html
to:
/Neural nets/TrNNs_ART/ /Neural nets/consciousness/
/web/Neural nets/consciousness/webWork/pMenuTopMenu consciousness.html#consciousness /web/webWork/pMenuTopMenu.html#consciousness
/web/Neural nets/consciousness/webWork/pMenuTopStatus consciousness.html#consciousness /web/webWork/pMenuTopStatus.html#consciousness
/web/Neural nets/consciousness/webWork/pMenuTopCopyright consciousness.html#consciousness pMenuTopCopyright.html#consciousness
/web/Neural nets/consciousness/webWork/pMenuTopHelp consciousness.html#consciousness /web/webWork/pMenuTopHelp.html#consciousness
+--+
>> OK, now many more errors turn up
change-to :
/home/billpMenuTopCopyright.html#consciousness /home/bill/web/webWork/pMenuTopCopyright.html#consciousness
>> OK, other problems leave for another day of detail
+-----+
make Grossberg changes with :
povrL_pStrP_replace(bolArXiv bolChrCd povrL pStrP) sed replace encoded (pStrP, povrL)
see webLocal_step2() in "$d_bin"'webSite update local.sh'
"$d_webWork"'240220 Grossberg changes.txt'
/Neural nets/TrNNs_ART/ /Neural nets/Grossberg/
/web/Neural nets/pMenuTopMenu TrNNs_ART.html /web/webWork/pMenuTopMenu.html#Grossberg
/web/Neural nets/pMenuTopStatus TrNNs_ART.html /web/webWork/pMenuTopStatus.html#Grossberg
/web/Neural nets/pMenuTopCopyright TrNNs_ART.html /web/webWork/pMenuTopCopyright.html#Grossberg
/web/Neural nets/pMenuTopHelp TrNNs_ART.html /web/webWork/pMenuTopHelp.html#Grossberg
2 more iterations as recorded in "$d_webWork"'240220 Grossberg changes.txt'
>> now looks good
ready to re-extract links
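For the record, a minimal sketch of what a povrL_pStrP_replace-style pass does; the function name suffix, argument handling, and the tab-separated (old, new) pStrP format here are assumptions for illustration, NOT the real fileops.sh implementation:

```shell
# Sketch only (assumed format): pStrP lists tab-separated (old, new)
# string pairs, povrL lists the webPage files to edit in place.
# Assumes the pairs contain no sed metacharacters other than '/',
# which the '|' delimiter tolerates; the real function encodes strings.
povrL_pStrP_replace_sketch() {
	local pStrP="$1" povrL="$2" old new pth
	while IFS=$'\t' read -r old new; do
		[ -n "$old" ] || continue
		while IFS= read -r pth; do
			# apply one (old -> new) substitution to every listed file
			[ -f "$pth" ] && sed -i "s|$old|$new|g" "$pth"
		done <"$povrL"
	done <"$pStrP"
}
```

This matches how the [consciousness, Grossberg] change lists above are laid out: one old/new pair per line, applied across the povrL file list.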
#08********08
#] 20Feb2024 mv /TrNN/ to USB backup
done - fix crap later...
#08********08
#] 20Feb2024 'pLnkBad.txt' - pre-extract "clean inPage" bookmarks (line starts with `#`)
webSite_get_links_run()
change :
grep "#" "$pLnkTmp901" | sort -u >"$pLnkBmk901"
sedLnk="^\/home\/bill\/web/\|^http:\/\/\|^https:\/\/"
to :
grep "^#" "$pLnkTmp901" | sort -u >"$pLnkBmkInn901"
sedLnk="^\/home\/bill\/web/\|^http:\/\/\|^https:\/\/\|^#"
>> original gets bkmks, but not necessarily "clean inPage" ones
OK, that's a start.
However, all the pLnkBads that I checked are full lines with legitimate links!!!
>> why is this happening?
pInnL_get_pLnkL_run()
change :
nsedRvrs577=$( echo "$linMix577" grep -o "$sedRvrs577" | wc -l )
to :
nsedRvrs577=$( echo "$linMix577" | grep -o "$sedRvrs577" | wc -l )
see what happens now???
HUGE improvement in nLnkBad!!!!
from 31933 down to 2325
>> much easier to [work with, fix]!!!
many links /Neural nets/TrNNs_ART/
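The missing-pipe bug above is easy to reproduce in isolation: without the `|`, the words `grep`, `-o`, and the pattern become extra arguments to echo, so grep never runs and wc -l always sees exactly one line. A toy demo with made-up data (not the real pInnL_get_pLnkL_run input):

```shell
# Toy reproduction of the missing-pipe miscount (illustrative data only)
linMix='HREF=a HREF=b HREF=c'
nBad=$( echo "$linMix" grep -o "HREF=" | wc -l )    # grep never runs: always 1
nGood=$( echo "$linMix" | grep -o "HREF=" | wc -l ) # one line per match: 3
echo "nBad=$nBad nGood=$nGood"
```

This explains the wrong nLnkBad tallies: the count was constant no matter how many matches the line held.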
#08********08
#] 16Feb2024 [fix, update] pLnk stuff
run webSite_get_links() in "$d_bin"'webSite update local.sh'
changed pHtmAllL (too many links!!) to pHtmIncL
19:55$ bash "$d_bin"'webSite update local.sh'
webSite_get_links...
grep: /home/bill/web/Neural nets/References/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html: binary file matches
grep: /home/bill/web/Neural nets/References/Schmidhuber 26Mar2022 Neural nets learn to program neural nets with with fast weights (1991).html: binary file matches
grep: /home/bill/web/Neural nets/References/Schmidhuber 29Dec2022 Annotated history of modern AI and deep neural networks.html: binary file matches
grep: /home/bill/web/Neural nets/References/Scientific Integrity and the History of Deep Learning The 2021 Turing Lecture, and the 2018 Turing Award.html: binary file matches
grep: /home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning_files/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html: binary file matches
grep: /home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning_files/Schmidhuber 26Mar2022 Neural nets learn to program neural nets with with fast weights (1991).html: binary file matches
grep: /home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning_files/Schmidhuber 29Dec2022 Annotated history of modern AI and deep neural networks.html: binary file matches
grep: /home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning_files/Scientific Integrity and the History of Deep Learning The 2021 Turing Lecture, and the 2018 Turing Award.html: binary file matches
grep: /home/bill/web/Neural nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html: binary file matches
grep: /home/bill/web/References/Climate/HTML Quick List - HTML Code Tutorial.html: binary file matches
grep: /home/bill/web/References/Neural Nets/Schmidhuber 24Sep2021 Scientific Integrity, the 2021 Turing Lecture, and the 2018 Turing Award for Deep Learning.html: binary file matches
sedRvrs = "=FERH ; nsedRvrsAll = 11053
frequent messages : binary file matches
nsedRvrs counts in pHtmInc769 :
sedRvrs    22Oct2023   16Feb2024
"=FERH         12067       11053
'=FERH            32          32
"=ferh         14457       17275
'=ferh             1           6
"=CRS            177         414
'=CRS              0           0
"=crs            518         810
'=crs             27          35
run time      06m06s      08m15s
nLnkAll 5422
nLnkInt 1309
nLnkExt 4113
nLnkBad 31933
pLnkFail.txt :
/home/bill/web/bin/starter/start_app.sh
/home/bill/web/eir3.gif
/home/bill/web/../eir_subscribe_button.gif
/home/bill/web/eirtoc/2000/eirtoc_2742.html
/home/bill/web/My sports & clubs/natural- CNS/240102 Howell emto Blair: transmutation of fission wastes.html
/home/bill/web/Neural nets/Mind2023/
/home/bill/web/Neural nets/Mind2023/voice musings/
/home/bill/web/Neural nets/Mind2023/voice musings/231106_2212 Glenn Borchardts concept of infinity, avoid myopic thinking.mp3
/home/bill/web/Neural nets/MindCode/???
/home/bill/web/Neural nets/MindCode/10 Howell - MindCode Manifesto.odt
/home/bill/web/Neural nets/TrNNs_ART/Social media/Howell 111230 - Social graphs, social sets, and social media.doc
/home/bill/web/Neural nets/TrNNs_ART/videoProdn/Grossbergs Consciousness: video script.html
/home/bill/web/Personal/130726 Deer collison/Car collision with a deer.html
/home/bill/web/Personal/181211 Van versus Semi collision/181211 Van versus Semi collision.html
/home/bill/web/Personal/Thoughts/Howell - A bag of [random, scattered] quasi-principles for commentaries.html
/home/bill/web/pubinfo.html
/home/bill/web/Qnial/MY_NDFS/MindCode/2_MindCode [data, optr]s.txt
/home/bill/web/webOther/Wickson website/Grossbergs [core, fun, strange] concepts.html
/home/bill/web/webOther/Wickson website/Grossbergs list of [chapter, section]s.html
/home/bill/web/webOther/Wickson website/images- captioned/
/home/bill/web/webOther/Wickson website/images- captioned/cover image.png
/home/bill/web/webOther/Wickson website/webWork/pMenuTopCopyright TrNNs_ART.html
/home/bill/web/webOther/Wickson website/webWork/pMenuTopHelp TrNNs_ART.html
/home/bill/web/webOther/Wickson website/webWork/pMenuTopMenu TrNNs_ART.html
/home/bill/web/webOther/Wickson website/webWork/pMenuTopStatus TrNNs_ART.html
/home/bill/web/webOther/Wickson website/What is consciousness: from historical to Grossberg.html
/home/bill/web/webWork/fileops run commentary.html
/home/bill/web/webWork/fileops run commentary, webSite.html
/home/bill/web/webWork/fileops run webSite general.sh
#08********08
#] 16Feb2024 add back [href, img] pthLsts, maybe [, non-] Howell html file [list, num], [extern]
problem was that the exclude file MUST have at least one entry, or nothing passes --invert-match
>> now seems to work well
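A hedged sketch of the guard implied by the note above (the placeholder string is made up, and this is not the exact fileops.sh code): seed the exclude file with a pattern that can never match a real path, so grep --invert-match --file always has at least one entry to work with.

```shell
# Guard sketch: an empty --file pattern list can make --invert-match
# pass nothing, so seed a can't-match placeholder before filtering.
pExcl=$(mktemp)
pInn=$(mktemp)
printf '/home/bill/web/a.html\n/home/bill/web/b.html\n' >"$pInn"
# exclude list is empty at this point; seed it if so
[ -s "$pExcl" ] || printf '%s\n' '__noMatchPlaceholder__' >"$pExcl"
grep --invert-match --file="$pExcl" "$pInn"
```

With the placeholder in place, both input paths pass through as expected.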
#08********08
#] 14Feb2024 re-implement 'pHtmlPathExclL *.txt' capabilities
should be [part of, precede] fileops.sh, dWeb_get_pWebPageL()
these act on fNamL only :
dWeb_get_pWebPageL()
find "$d_web" -type f -name "*.html" | grep --invert-match -i "z_Old\|z_Archive\|z_history" | sort -u >"$pAllL__769"
pinnL_idx_strTst_split_pYes_pNo()
pinnL_idx_strTst_split_pYes_pNo "$pAllL__769" "$index__769" "$strTst_769" "$pWebYes769" "$pWebNo_769" "$pDifYes769" "$pDifNo_769"
nyet, excludes ARE fNamLs! : has to kick in when listing files,
BEFORE pinnL_lineIdx_strTst_split_pYes_pNo
do it in dWeb_get_pWebPageL, exclude BOTH [intern, extern]?
see "$d_SysMaint"'Linux/grep notes.txt'
in "$d_SysMaint""Linux/grep summary.txt" :
# exclude pExcludeL from a pthLst
# grep --invert-match --file="$d_webWork"'pHtmlPathExclL webPage.txt' -i "$d_temp"'pHtml test.txt' >"$d_temp"'pHtml no excludes.txt'
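Putting the two pieces together, a sketch of what dWeb_get_pWebPageL() could look like with both the hard-coded archive subDir excludes and a pHtmlPathExclL file applied; the argument names and signature are assumptions, not the actual fileops.sh body:

```shell
# Sketch only (assumed signature): list *.html under dWeb, drop the
# archive subDirs, then drop exact paths listed in pExclL, sorted unique.
# pExclL must be non-empty (see the 16Feb2024 note on --invert-match).
dWeb_get_pWebPageL_sketch() {
	local dWeb="$1" pExclL="$2" pOutL="$3"
	find "$dWeb" -type f -name "*.html" \
		| grep --invert-match -i "z_Old\|z_Archive\|z_history" \
		| grep --invert-match --fixed-strings --file="$pExclL" \
		| sort -u >"$pOutL"
}
```

--fixed-strings keeps the per-path excludes literal, so paths with regex metacharacters (brackets, dots) in fNames don't need escaping.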
#08********08
#] 13Feb2024 [fix, improve] bash code to extract webSite links
other paths from d_webWork :
nu pHtmlPathExclL non-webPage.txt
nu pHtmlPathExclL [non-webPage, yes].txt
nu pHtmlPathExclL webPage.txt
nu pHtmlPathExcl [non-webPage, yes].txt
webUpdateCmdL_local()
nu = not used in [fileops.sh, webSite update local.sh]
mv into header!!
keep pHtmlPathWebPage no.txt
keep pHtmlPathWebPage yes.txt
+-----+
14Feb2024
oops - current code does NOT exclude any files!!!
see "$d_bin"'webSite update local.sh' - process to update [, sub]webSites
webUpdateTmp_local(no args) - avoid link stuff [list current webPages, check changes, etc]
+-----+
# $ bash "$d_bin"'webSite update local.sh' - process to update [, sub]webSites
# ToDos - see "$d_bin"'webSite update notes.txt'
# bad_to_worse() - split pLnkBad.txt into pLnkBad[Bkmk, Goog, Jrgn, , ].txt
# put this into webLocal_step1() when it is working properly
# webUpdateTmp_local(no args) - avoid link stuff [list current webPages, check changes, etc]
# un-comment parts that apply
# 22Jan2024 change webPage [code, title]s
# 22Jan2024 find <:class:> codes so I can fix them (usually in title)
# 22Jan2024 find webPage titles with 'Howell' so I can fix them
# 29Jan2024 check for webPages with missing space in first line
# webUpdateCmdL_local(no args) - don't RUN this! just copy-paste, this is an easy record
# 22Jan2024 search for known problems [www.BillHowell.ca, theme code]s
# +-----+
# webLocal_step1() - webSite LOCAL, [archive, [get, change, check]*[webPage, link]s],
# diff 'HtmlPathAll_L[ archive, ].txt'
# +-----+
# OnLine work, including [dWebHtmlOnly, lftp upload]
# dHtmlPathAllL_notIn_dHtmlWebOnlyL(no args) - use FileZilla for fast mv of dirs as needed
# webPage_rm_dWebHtmlOnly(no args) - rm html files in dWebOnly to make sure it's "clean"
# "$d_bin"'rsync web to d_webOnly (NOT webOnline).sh' - current pHtmlPathAll_L.txt
# pHtmlPathAll_L upload onLine - online webPages, this MUST be run ONLY from :
+-----+
#08********08
#] 13Feb2024 fix dWeb_get_pWebPageL() in fileops.sh: correct [set, use] * [dir, pth] names
see above "Current exclude file subDirL"
see "$d_bin"'fileops notes.txt'
work being done in fileops.sh
#08********08
#] 13Feb2024 Grossberg and rest of webSite- fix links etc
It looks like recent updates have changed key files, but many aren't used in [fileops.sh, webSite update local.sh].
Did I simply use command-line for the early Feb updates?
11Jan2024 fix webSite_get_links_run()
>> not in "$d_bin"'fileops run.sh'
>> webSite_get_links is in "$d_bin"'webSite update local.sh'
>> "$d_bin"'webSite loPriority.sh', not helpful here :
# +-----+
# OnLine work, including [dWebHtmlOnly, lftp upload]
# dHtmlPathAllL_notIn_dHtmlWebOnlyL(no args) - use FileZilla for fast mv of dirs as needed
# webPage_rm_dWebHtmlOnly(no args) - rm html files in dWebOnly to make sure it's "clean"
# "$d_bin"'rsync web to d_webOnly (NOT webOnline).sh' - current pHtmlPathAll_L.txt
# pHtmlPathAll_L upload onLine - online webPages, this MUST be run ONLY from :
07Jan2024 update local webSite
using "Live template" "$d_bin"'fileops run webSite 240107.sh'
this should provide a stable record of what was done & when
>> can't find "$d_bin"'fileops run webSite 240107.sh'!!!!
>> "$d_bin"'z_Archive/' -> fileops run webSite 240107 20h41m22s.sh
htmlHeadings_to_TblOfContents -> [moved or renamed] to
"$d_bin"'webSite loPriority.sh'
>> "$d_bin"'webSite loPriority.sh' - was being re-written, not much done yet
19Jan2024 dWeb_get_pWebPageL - fubarred! in "$d_bin"'fileops notes.txt'
What a mess! I must have done things ad hoc
Probably need to fix dWeb_get_pWebPageL() in fileops.sh
correct [set, use] * [dir, pth] names
#08********08
#] 23Jan2024 Kaal webPage - draft done, update whole webSite
"$d_PROJECTS"'bin - secure/webSite update online lftp.sh'
I uncommented both "onLine update" functions :
dWeb_uploadNonWebPage_online
pWebPageL_upload_online
$ bash "$d_PROJECTS"'bin - secure/webSite update online lftp.sh'
#08********08
#] 22Jan2024 ToDos
y add [class, comment]s : to pMenuTop[Copyright, Help, Menu, Status].html for each webPage
y remove Howell :