# $ echo "$d_txted" "'""$d_Qndfs""MindCode/0_MindCode programming notes.txt""'" | xargs -0 sh -c &
# www.BillHowell.ca  26Feb2020 initial - did I lose the original file?

[architecture, data, functions, processes, operating systems, consciousness, behaviours, personalities, creativity]

This is for programming-related work.
See "$d_PROJECTS""MindCode/0_MindCode reports.txt" for [writeups, documentation, web] thinking.

# To search for variables :
# $ mvar="fireStdL"
# $ find "$d_PROJECTS""Qnial/MY_NDFS/MindCode" -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -ILINE grep --with-filename "$mvar" LINE
#
# operators :
# $ find "$d_PROJECTS""Qnial/MY_NDFS/MindCode" -type f -name "*.ndf" | tr \\n \\0 | xargs -0 -ILINE grep --with-filename "^#]" LINE | grep --invert-match 'z_Archive' | grep --invert-match 'z_Old' | sed 's/.*#]//' > "$d_temp""5_MindCode globals temp.txt"
#
# transpose mix neurIdentL neurClassL dendIdentL dendDelayL axonIdentL axonDelayL
# transpose mix clusIdentL clusClassL clusNeurnL

Quick copy NONLOCALS
    NONLOCAL clusIdentL clusClassL clusNeurnL ;
    NONLOCAL neurIdentL neurClassL neurStateL dendIdentL dendDelayL dendInhibL axonIdentL axonDelayL axonInhibL ;

***************************************************
08********08
15Apr2021 split ndfs for [series, integer] clusters

- toy with the idea of a "universal integer" of variable length, a circular stack

+-----+
# olde code
    % not done at creation - only later ;
    % clusIntStartL := link clusIntStartL 0 ;
    % clusIntStoppL := link clusIntStoppL 0 ;

# logic-only diagram (one-bit full add) :
    -carryIn-|-NOR---|
    -input2--|
    -input1--|-XOR---|
                     |-NAND2--> carryover
             |-XOR-|
             |-NAND1-|-NOR---|--> intBit
    -NAND0-----------|-XOR-|
                     |-NAND2---> carryOut overflow bit

# count-neuron diagram (one-bit full add) :
    -overIn-|-ZERO--> 0 to intBit
    -input1-|-ONE---> 1 to intBit
    -input2-|-TWO---| 0 to intBit
            |       | 1 to overOut
            |-THREE-| 1 to intBit
                    | 1 to overOut

# set clusInt := 0, then add bits directly to clusInt one by one, at the level of each bit
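+-----+
The count-neuron diagram is just a one-bit full add expressed as a population count : exactly one of the [ZERO, ONE, TWO, THREE] neurons fires, selected by how many of [overIn, input1, input2] fired. A minimal sketch of that scheme, rippling the carry through a variable-length integer, lowest bit first (per the 01Apr2021 note below). This is an illustrative Python stand-in, not the MindCode QNial code - countNeuron_addBit and the list-of-bits representation are hypothetical :

def countNeuron_addBit(overIn, input1, input2):
    """Return (intBit, overOut) for one bit position."""
    count = overIn + input1 + input2    # which of [ZERO, ONE, TWO, THREE] fires
    intBit = count & 1                  # ZERO, TWO -> 0 ; ONE, THREE -> 1
    overOut = count >> 1                # TWO, THREE -> carry a 1 to the next bit
    return intBit, overOut

def clusInt_add(bitsA, bitsB):
    """Add two variable-length integers, bits stored lowest-first."""
    n = max(len(bitsA), len(bitsB))
    a = bitsA + [0] * (n - len(bitsA))
    b = bitsB + [0] * (n - len(bitsB))
    out, carry = [], 0
    for i in range(n):
        intBit, carry = countNeuron_addBit(carry, a[i], b[i])
        out.append(intBit)
    if carry:
        out.append(carry)               # overflow bit extends the cluster
    return out

# 5 ([1,0,1] lowest-first) + 3 ([1,1]) = 8 ([0,0,0,1])
assert clusInt_add([1, 0, 1], [1, 1]) == [0, 0, 0, 1]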
08********08
10Apr2021 integer cluster

- toy with the idea of a "universal integer" of variable length, a circular stack

+-----+
olde code

clusInt_create IS
    { LOCAL neurIdent clusIdent clusClass clusNeurn neur0_to5 i n_neurons ;
    NONLOCAL neurIdentL clusIdentL clusClassL clusNeurnL ;
    % ;
    % Create cluster & add to [global, local, etc] lists ;
    clusIdent := clus_create "int ;
    neur0_to5 := clusNeurs_create clusIdent 6 ;
    % ;
    clusIdent
    }

08********08
06Apr2021 https://www.informit.com/articles/article.aspx?p=31670&seqNum=2
Programming Languages: The Early Years, May 2, 2003
    At their lowest level, computers cannot subtract, multiply, or divide. Neither can calculators. The world's largest and fastest supercomputer can only add - that's it. It performs the addition at the bit level. Binary arithmetic is the only means by which any electronic digital computing machine can perform arithmetic.

http://people.ece.umn.edu/users/parhi/SLIDES/chap13.pdf
Chapter 13: Bit Level Arithmetic Architectures, Keshab K. Parhi

+-----+
olde code
    % "fire" back-connection ;
    % dendInhib := dendInhibL@clusIdent2 ;
    % axonInhib := axonInhibL@clusIdent2 ;

08********08
03Apr2021
$ diff "$d_MindCode""clusters/logic gates.ndf" "$d_MindCode""neurons/logic gates.ndf" --suppress-common-lines

https://www.computerhope.com/jargon/l/logioper.htm
>> good list & description of logic operators
AND
OR
NOT
* NAND = not and
* NOR = not or
* XOR -> returns true if its inputs differ, and false if they are all the same
* XNOR -> returns true if its inputs are the same, and false if they differ

[NAND, NOR] have the distinction of being the two "universal" logic gates, because any other logic operation can be created using only one of these gates (see the Python sketch at the end of this entry).

+-----+
olde code

#] +-----+
#] Connections - dendIdents axonIdents

# IF flag_debug THEN write 'loading checkConnects' ; ENDIF ;
#] checkConnects IS - add this cluster's ID to connected [dend, axon] lists
checkConnects IS
    { LOCAL ;
    NONLOCAL ;
    % ;
    % Add this cluster's ID to connected [dend, axon] lists ;
    % This requires that cables have suitable [number, ordering] of synapses to connect to this cluster ;
    % must check for "type" consistency ;
    IF (~= null dendIdents) THEN
        FOR ID WITH dendIdents DO
            IF (= neurClass neurClassIDL@ID)
                THEN link dendIdentL@ID clusID ;
                ELSE write link 'addInteger clusAdd = ' neurClass ', dendIdent = ' ID ', classDend = ' dendIdentL@ID ;
            ENDIF ;
        ENDFOR ;
    ENDIF ;
    % ;
    IF (~= null axonIdents) THEN
        FOR ID WITH axonIdents DO
            IF (= class classIDL@ID)
                THEN link axonIdentL@ID clusID ;
                ELSE write link 'addInteger clusAdd = ' class ', axonIdent = ' ID ', classAxon = ' axonIdentL@ID ;
            ENDIF ;
        ENDFOR ;
    ENDIF ;
    }

% add neurons, using neurClass = "intBit, null for [axonIdents, axonDelays, dendIdents, dendDelays] ;
% [dendIdents upstream, axonIdents, all delays] are to be defined later, but are null in the initial code here ;
n_neurons := 6 ;
clusNeurn := null ;
FOR i WITH (tell n_neurons) DO
    neurIdent := neur_create ;
    clusNeurn := clusNeurn link neurIdent ;
ENDFOR ;
clusNeurnL@clusIdent := clusNeurn ;
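+-----+
A quick check of the [NAND, NOR] universality note above : every other gate in the list, built from NAND alone. Illustrative Python only, separate from the 'logic gates.ndf' files diffed above - a sketch, not the ndf code :

def NAND(a, b): return 1 - (a & b)

def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def NOR(a, b):  return NOT(OR(a, b))
def XOR(a, b):                      # true iff the inputs differ ; 4 NANDs
    nab = NAND(a, b)
    return NAND(NAND(a, nab), NAND(b, nab))
def XNOR(a, b): return NOT(XOR(a, b))

# truth-table check against Python's own bit operators
for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b) and OR(a, b) == (a | b)
        assert NOR(a, b) == 1 - (a | b)
        assert XOR(a, b) == (a ^ b) and XNOR(a, b) == 1 - (a ^ b)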
08********08
01Apr2021 clusInt_add IS OP clusIdent

# should I add a "fire" connection to [neuron, cluster]s to invoke output? [attention, vigilance]
#   some may simply change state without firing?
# 02Apr2021 Note that 2 synapses are defined per neuron-to-neuron connection
#   - (maybe robustness?)
#   - with initial random time delays
#   - only one connection for intra-cluster neurons?
# 02Apr2021 I need distinct firing values for [0,1] - how about double firing? :
#   - if true  : fire 1, then no fire at the second delay, OR double fire
#   - if false : no fire, then fire 1 at the second delay, OR single fire
#   - use the "fire" back-connection for this? (sensory - automatic?)
# lowest bit is first

02Apr2021 I MUST have McCulloch & Pitts logic-type neurons!!! Neural networks can NEVER work without them!!!

+-----+
olde code
# 01Apr2021 not needed?
    % At time of creation, do NOT add external [dend, axon]IDs to this cluster's lists ;
    % neur_create adds null values ;
    % Do I need these clus variables? or are simple neuron assigns OK? ;
    clusNeurL clusDenIDL clusDelayL clusAxoIDL clusAelayL := clusNeurL clusDenIDL clusDelayL clusAxoIDL clusAelayL EACHBOTH link null null null null null ;

08********08
01Apr2021 'integer add.ndf' & 'symbols.ndf'

Maybe feedback to confirm fire? Could help with the [0,1] states of integers, to set downstream neurons. The first spike is a reset to 0. If there is a second, it sets the state to 1 (see the Python sketch at the end of this entry).

cluster of [cluster, neuron]s
    -> may need [super-cluster, system]s etc? probably redundant?
    treats connections as bundles of neurons, with connections to other clusters, ?and individual neurons?
    some neurons are "captive", with connections ONLY within the cluster
    other neurons connect to external [cluster, neuron]s

assumptions :
    neurons can store a state - but how? each needs a "fire line" to compel it to fire without other inputs!
    non-myelinated axons allow indirect stimulation of neurons - need inputs for that
    neuron state : right now, only 0 or 1 ; later - bursting, pattern, etc (multi-state)

+-----+
olde code

IF flag_debug THEN write 'loading clus_intAdd_create' ; ENDIF ;
#] clus_intAdd_create IS - initially, just for two input integers, but could generalize
# 01Apr2021 initial
clus_intAdd_create IS
    { LOCAL clusIdent0 clusClass0 clusNeurn0 clusIdent1 clusClass1 clusNeurn1 clusIdent2 clusClass2 clusNeurn2 ;
    NONLOCAL clusIdentL clusClassL clusNeurnL ;
    % ;
    clusIdent0 := clus_integer_create ;
    clusIdent1 := clus_integer_create ;
    clusIdent2 := clus_integer_create ;
    % ;
    % bit_add neurons ;
    }
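+-----+
A minimal sketch of the double-firing idea mused about in the two entries above (02Apr2021 & 01Apr2021), under the "first spike resets to 0, second spike within the delay window sets to 1" variant. All names are hypothetical Python stand-ins, not the QNial [neurStateL, axonDelayL] globals :

DELAY_WINDOW = 2.0   # assumed second-spike delay, arbitrary time units

def decode_state(spikeTimes):
    """Return the [0,1] state signalled by spikes on one fire line."""
    state = None
    lastReset = None
    for t in sorted(spikeTimes):
        if lastReset is None or t - lastReset > DELAY_WINDOW:
            state, lastReset = 0, t   # first spike : reset state to 0
        else:
            state = 1                 # second spike within window : set to 1
    return state

assert decode_state([5.0]) == 0        # single fire -> 0
assert decode_state([5.0, 6.5]) == 1   # double fire -> 1
assert decode_state([5.0, 9.0]) == 0   # too late : counts as a new reset

This also matches the 31Mar2021 note below : a zero state still produces a spike pattern (a single fire), so "zero" stays distinguishable from "no output".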
08********08
31Mar2021 link d_MindCode 'symbols.ndf'

create a first example of a net :
/media/bill/Dell2/Website - raw/Qnial/MY_NDFS/MindCode/nets/integer add.ndf

a zero state of a neuron still produces a spike pattern for zero!!! non-firing states don't output

08********08
23Mar2021 link d_MindCode 'symbols.ndf'

# These "get" optrs aren't terribly useful - just use the short form!
#] get_neuronClass IS OP neuronID - res ipsa loquitur
get_neuronClass IS OP neuronID { classIDL@neuronID }
#] get_neuronAxonIDs IS OP neuronID - res ipsa loquitur
get_neuronAxonIDs IS OP neuronID { axonL@neuronID }
#] get_neuronDendIDs IS OP neuronID - res ipsa loquitur
get_neuronDendIDs IS OP neuronID { dendL@neuronID }

********
10Sep2020 continue with the symbols problem, then go back to [finders, taskers, formulators, commanders]

10Sep2020 A recent peer review that I did has great pertinence! :
    NEUNET-D-20-00790 p Garcia etal - Small Universal Spiking Neural P Systems with dendritic-axonal delays and dendritic trunk-feedback.pdf

scattered thoughts from WCCI2020 :
ART? : !6 24 I-SS55 Xiaobo Liu, Graph Convolutional Extreme Learning Machine [#21160]

18Aug2020 WCCI2020 "newish highlights" for me :
    Plenary : Kay Chen Tan, Evolutionary transfer optimisation
    Keynote : Johan Suykens, Deep Learning NNs and Kernel machines, new synergies (awesome path forward!!!)
    Keynote : Michael Bronstein, Geometric deep learning, going beyond Euclidean data
    [#24057] : Santiago Gonzalez, Risto Miikkulainen, Improved Training Speed, Accuracy, and Data Utilization via Loss Function Optimization
    Keynote : Alexander Gorban, Augmented AI - correctors of errors and social networks of AI (blessing of dimensionality)
    Tutorial 30 : Claudio Gallicchio, Simone Scardapane, Deep randomized neural networks
    Workshop 9 : Artificial Intelligence for Mental Disorders (fascinating - after years of questionable claims, the quality of research seems to be improving, but this is hard!)
    Workshop 6 : Design, Implementation, and Applications of Spiking Neural Networks and Neuromorphic Systems

Extensions of solid historical career work - good context :
    Keynote : Stephen Grossberg, From designs for autonomous adaptive agents to clinical disorders - linking cortically-mediated learning to Alzheimer's disease, autism, amnesia, and sleep
    Keynote : Kunihiko Fukushima, Deep CNN Neocognitron for Artificial Vision

Not so much, even though it was interesting :
    Yoshua Bengio, Deep Learning 2 - kind of nice, and I really like Bengio's work over the years, but :
    - I think he's splashing around a bit (like everyone else), and seems to be "cornered" by demands to show the way forward at perhaps too early a stage?
    - he doesn't refer to a lot of thinking going back 70+ years (consciousness etc)
    - to me, he has completely missed the base (as has almost everyone else so far)
    I do agree with his emphasis on consciousness, but not right now for Deep Learning (especially CNNs), as they need to be conceptually upgraded big time. Consciousness doesn't need to wait for Deep Learning - it's past time to go forward after John Taylor's work & others. I'm lukewarm on Bernard Baars' "Global Workspace Theory" (GWT), but it plus other authors provide good context.

Of course - hundreds of other great papers.

**********
25May2020 symbols - data vs operators

see "MindCode/code develop/symbols - data vs operators notes.txt"

**********
15May2020 from ToDos :

Current work : create a system to handle "formulae"
see "MindCode/code develop/symbols - data vs operators notes.txt"
big issue - Robert Hecht-Nielsen, Asim Roy

*************
28Feb2020 [Split, tiny clean-up] of "The MindCode Manifesto"

It is much easier to work with a number of sub-documents, rather than endlessly wander around a big one.

Table of Contents
    Summary    1
    Introduction - Conceptual pseudo-basis for MindCode    8
    MindCode components    11
    [Neurological, biological] basis for epiDNA coding    13
    Historical [DNA, Protein, Evolutionary Computing, ANN] hybrid basis for epiDNA-NNs    16
    MindCode - arbitrary selections from "Multiple Conflicting Hypothesis"    18
    Assumed "rules of the game"    21
    Static epiDNA-neuron coding of [architecture, data, functions, processes, operating systems, consciousness, behaviours, personalities]    23
    Static epiDNA-NN coding of [architecture, data, functions, processes, operating systems, consciousness, behaviours, personalities]    30
    Static MindCode coding of [architecture, data, functions, processes, operating systems, consciousness, behaviours, personalities]    32
    Dynamic epiDNA-NN coding of [architecture, data, functions, processes, operating systems, consciousness, behaviours, personalities]    34
    Ontogeny    36
    Specialized epiDNA-NNs for MindCode    37
    Hybrids of [algorithms, conventional computing, ANNs, MindCode]    39
    Questions, not answers    40

# enddoc