"$d_bin"'webSite maintenance specific files.link.txt' www.BillHowell.ca ?date? initial # 12Sep2023 see : # "$d_webWork"'fileops run.sh' - especially webSite section # "$d_webWork"'fileops run webSite.sh' - process to update [, sub]webSites # "$d_PROJECTS"'bin - secure/lftp update specified dir.sh' - actual uploading of files # "$d_bin"'webSite update notes.txt' - LOCAL d_web update, notes # "$d_bin"'webSite update.sh' - LOCAL d_web update # "$d_bin""webSite check for [z_Archive, z_Old].sh" - make sure these are not in lists # "$d_bin"'webSite maintenance specific files.link.txt' # "$d_SysMaint"'Linux/permissions notes.link.txt' - permissions issue (problematic!) 09********08 ?date? incomplete listing of webSite-related functions : see "$d_bin"'fileops.sh' : # +-----+ # webSite maintenance specific # htmlHeadings_to_TblOfContents() - create a TblOfContents from headings in html file # dWeb_addHdrFtr_dTmp() - prepare files in dWeb for upload # html files MUST be pre-processed, others just copied # dWeb_update_all - update one-way only! separate [, non-]html files # see "$d_PROJECTS""bin - secure/lftp update specified dir.sh" # bash auto-generation of html TableOfContents # grep ' ||;s|["]||g;s|\(.*\)|
#
# str_inWebPages_pout() - list webPages according to the str arg
# webSite_fixes_noChrBads() - webSite fixes of [changed, moved] subDirs
#    with noChrBads like [apo, quote, &]
# webSite_errLinkLines() - extract linkErrorLines, and
#    sub-select linkNotErrorOnly '!!linkError!!'
#    this is also done by webSite_link_counts, a Qnial operator, see :
#    'webSite maintain [menu, header, footer, body] links, TableOfContents.ndf'
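# a minimal sketch of the webSite_errLinkLines idea, assuming the link-check output is a text file
#    whose bad lines are flagged '!!linkError!!' (the "$d_temp" paths and filenames are hypothetical) :

   pLinkCheck="$d_temp"'webSite link check.txt'
   # extract the flagged lines (linkErrorLines)
   grep    '!!linkError!!' "$pLinkCheck" > "$d_temp"'webSite linkError lines.txt'
   # complementary sub-select : lines without the flag (linkNotErrorOnly)
   grep -v '!!linkError!!' "$pLinkCheck" > "$d_temp"'webSite linkNotError lines.txt'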