#] *********************
#] "$d_web"'MindCode/01_MindRelated notes.txt' - other [author, literature] concepts
# www.BillHowell.ca 08Jun2015 initial version

see also link d_QNial_mine 'MindCode'
    [architecture, data, functions, processes, operating systems, consciousness, behaviours, personalities, creativity]

13Mar2022 Combine? :
    Liquid-State Networks (LSNs) - Wolfgang Maass
    Super-Turing Machines (STMs) - Hava Siegelmann : she built a language!!

24************************24
24************************24
# Table of Contents, generated by :
# $ grep "^#]" "$d_web"'MindCode/01_MindRelated notes.txt' | sed 's/^#\]/ /'
    ********************* "$d_web"'MindCode/01_MindRelated notes.txt'
    10Mar2022 Liquid State Machines (LSMs) use Spiking Neural Networks! (Wolfgang Maass)
    06Mar2021 Rademacher and Gaussian Complexities
    24Feb2020 QNial basis
    21Feb2020 Initial QNial programming (no more [yap, arm-waving])
    23Sep2019 HOX genes (architecture)
    12Aug2019 Spikeless information transfer?
    11Aug2019 Architectures & Function
    10Aug2019 mRNA circulation - soma to synapse
    25Jul2019 Alice Parker. Spiking Neural Networks & DNA. USC. Los Angeles. USA
    07Mar2018 DNA storage for real computers

#24************************24
# Principles

02Nov2023 [intra, inter]-cellular processes : intra - [DNA, RNA, etc], inter - multi-neuron
02Nov2023 keep data as bitL for easy manipulation
02Nov2023 DNA -> RNA polymerase -> transcribe to RNA, mRNA for protein coding. I must use all RNA.
    RNA "floats off" (NYET!! - see microtubules in "transport" below!!!), it doesn't follow a [DNA, microtubule]?
    DNA : ATGC - adenine, thymine, guanine, cytosine; AT and CG pairing - this is a boolean (binary) code
    RNA : the transcript carries the same information as the non-template (coding) strand of DNA, but it contains the base uracil (U) instead of thymine (T)
    start codon (ATG); ends : 5' cap and poly-A tail
02Nov2023 RNA transport
    https://theconversation.com/how-does-rna-know-where-to-go-in-the-city-of-the-cell-using-cellular-zip-codes-and-postal-carrier-routes-191155
    Matthew Taliaferro
    ZIP codes that send RNAs to :
        neurites - precursors to the axons and dendrites on neurons that transmit and receive electrical signals
        epithelial cells
    proteins :
        Unkempt protein regulates neurite production
        LARP1 - responsible for the transport of RNAs containing a particular ZIP code to both neurites and the bottom end of epithelial cells
    microtubules - cellular streets, train tracks (bi-directionality)
        https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6650148/
        "... motor-driven transport along MT tracks, with instantaneous velocities ranging from 0.5-5 μm/s [46]. ..."
        Howell : like a wave propagation, an inventory of mRNA would allow very high speeds of getting a protein to an intra-neuron site?

#24************************24
# Setup, ToDos,
24************************24

#08********08
#] 02Nov2023 can I identify "MindCode useful" DNA sequences? (code first, then look?)
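As a "code first" starting point : a minimal sketch that scans one strand of a DNA string for open reading frames (start codon ATG through an in-frame stop codon) as crude candidates to "look" at. Python here is only a stand-in for the QNial .ndf code, and the toy sequence and min_codons cutoff are invented for illustration.

# hypothetical sketch - scan one DNA strand for open reading frames (ORFs)
START = "ATG"
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(dna, min_codons=10):
    """Return (start, end) index pairs of ORFs on the given strand."""
    orfs = []
    for i in range(len(dna) - 2):
        if dna[i:i+3] != START:
            continue
        for j in range(i + 3, len(dna) - 2, 3):    # walk codon-by-codon, in frame
            if dna[j:j+3] in STOPS:
                if (j - i) // 3 >= min_codons:     # keep only ORFs long enough to matter
                    orfs.append((i, j + 3))
                break
    return orfs

# toy usage - a real run would read genomic FASTA data, then examine what each ORF codes for
dna = "CCATGAAATTTGGGCCCAAATTTGGGCCCAAATTTGGGTAACC"
print(find_orfs(dna, min_codons=5))    # -> [(2, 41)]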
08********08
#] ??Nov2023

08********08
#] 20Nov2023 cut voice file
see "$d_bin"'audio cut.sh'
"$d_web"'Neural nets/MindCode/voice musings/231111_1622 mind without neurons need to cut.mp3'

08********08
#] 31Oct2023 create ndf for [step-wise, procedural] program sequence
BOTH [inter, intra]-cellular approaches
    p443sec             LIST PARSE: A laminar model of variable-length sequences
    image p444fig12.42  LIST PARSE: Laminar cortical model of working memory and list chunking
    image p445fig12.43  LIST PARSE laminar cortical Cognitive Working Memory circuit, homologous to the LAMINART circuit that models aspects of how visual cortex sees
    image p436fig12.30  The conscious ARTWORD, or cARTWORD, laminar cortical speech model simulates how future context can disambiguate noisy past speech sounds
LIST PARSE is what I would start with for programs?
    >> maybe not, just steal the von Neumann architecture?

08********08
#] 06Aug2023 What is a Spike?
ROLES of neuron architecture :
    axon - power cable; tap into [glial, other]? * [flow battery, capacitor]'s
    dendrite
    synapse
    neuro-transmitter
    [axon, dendrite]s can do both jobs?
MODES of signal interpretation [spike, continuous, other] :
    spiking mode - less sensitive to the strength of the signal that does get through
    continuous mode - neuron A's spike is ...
    other - electromagnetic field theories
    combined modes - consistently, or depending on the situation?
    connections - synaptic, nerve [fibre, branch]s
SYNAPTIC_PATTERN of signals from ONE specific spike of neuron A, tells connected neuron B the :
    IDENTITY of neuron A :
        specifically, that the pattern of neuron B's incoming synaptic signals is from neuron A's spike, as distinguished from the thousands of other neurons that neuron B is connected to. In other words, the [ID, address, identifier] of neuron A is well-recognized by neuron B.
    TIMING of neuron A's spike, in neuron B's absolute time clock :
        in spite of the lack of an "event timing signal" independent of neuron B's system. Over time, as neuron B learns about neuron A, the timing of neuron A's spike firing can be estimated from :
            the time of neuron B's first synaptic signal (implies "shortest synaptic distance" A-to-B)
            A-to-B-to-A "circular spike timing tuning training events", as part of a vast set of neuron [tune, learn, evolve] with respect to the network itself, as opposed to the exogenous information carried in the flow
            context of [shortest synaptic distance (SSD), circular spike timing tuning training events (CST)] results for other neurons that neuron B is connected to
    AMPLITUDE of neuron A's spike :
        Thresholding of weaker synaptic signals input to neuron B changes neuron A's input synaptic pattern to neuron B, by keeping stronger parts of the SAME pattern but losing weaker parts.
        perhaps this thresholding could be [modulated, adaptive]?
IMPLICATIONS of synaptic_pattern (a toy sketch follows this entry) :
    RETENTION of [identity, timing, amplitude] properties of synaptic_patterns from spiking :
        retained to some degree even amongst a cacophony of spikes from many neurons with synaptic connections (or not) to neuron B
SPIKE-PATTERN of neuron A's spiking sequence over time, tells neuron B :
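A toy sketch of the IDENTITY and AMPLITUDE points above (all numbers invented; "fingerprint" is my hypothetical name for neuron A's fixed pattern of synaptic amplitudes onto neuron B) : after thresholding drops the weaker parts of the pattern, the surviving stronger parts still match neuron A's stored fingerprint.

# hypothetical sketch - identify the spiking neuron from its thresholded synaptic pattern
import numpy as np

rng = np.random.default_rng(0)
n_synapses = 20
# each sending neuron leaves a fixed amplitude pattern across its synapses onto neuron B
fingerprints = {name: rng.uniform(0.1, 1.0, n_synapses) for name in "ABC"}

def identify(arrivals, threshold=0.3):
    """Threshold weak synaptic signals, then match the surviving pattern to a known neuron."""
    kept = np.where(arrivals > threshold, arrivals, 0.0)    # weaker parts of the pattern are lost
    scores = {name: float(np.dot(kept, fp) / (np.linalg.norm(kept) * np.linalg.norm(fp)))
              for name, fp in fingerprints.items()}
    return max(scores, key=scores.get)

# one noisy spike from neuron A : the SAME pattern, jittered, with weak components dropped
arrivals = fingerprints["A"] + rng.normal(0.0, 0.05, n_synapses)
print(identify(arrivals))    # expect "A" to win, even after thresholding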
08********08
#] 15Mar2022 Reproducing Kernel Hilbert Space (RKHS) [description, reference]s
see : "$d_web"'Mathematics/5_Math concepts, functions.odt'

17Mar2022 Aurel A. Lazar, Yevgeniy Slutskiy Nov2013 "Functional Identification of Spike-Processing Neural Circuits" Neural Computation 26(2), Columbia University
    https://www.researchgate.net/publication/258425593_Functional_Identification_of_Spike-Processing_Neural_Circuits
    "... Employing the reproducing kernel Hilbert space (RKHS) of trigonometric polynomials to describe input stimuli, we quantitatively describe the relationship between underlying circuit parameters and their projections. ..."
    local copy : Lazar, Slutskiy Nov2013 Functional Identification of Spike-Processing Neural Circuits (RKHS).pdf

Antonio R.C. Paiva, Il Park, Jose C. Principe 2008 "Reproducing kernel Hilbert Spaces for Spike Train Analysis"
    https://www.sci.utah.edu/~arpaiva/pubs/2008a_presentation.pdf
    {arpaiva, memming, principe}@cnel.ufl.edu - Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL 32611
    local copy : Paiva, Park, Principe 2008 Reproducing kernel Hilbert Spaces for Spike Train Analysis

08********08
#] 13Mar2022 Neto, Siegelmann, Costa 2003 "Symbolic processing in neural networks"
Joao Neto, Hava Siegelmann, Felix Costa 2003 "Symbolic processing in neural networks" Journal of the Brazilian Computer Society, 20pp
    "... With dynamical systems in general we have computation without programmability, i.e., the extra power these systems exhibit has to do with the decoupling between programming and computation. Up to the power of Turing machines, computations are describable by programs that correspond to the prescription by finite means of some rational parameters of the system. Beyond Turing power we have computations that are not describable by finite means: computation without a program. In this paper we want to shed some light on the programmability of neural nets. ..."
Combine? :
    Liquid-State Networks (LSNs) - Wolfgang Maass
    Super-Turing Machines (STMs) - Hava Siegelmann : she built a language!!

08********08
#] 10Mar2022 Liquid State Machines (LSMs) use Spiking Neural Networks! (Wolfgang Maass)
Wolfgang Maass, Thomas Natschläger, Henry Markram 2002 "Real-time computing without stable states: A new framework for neural computation based on perturbations" Neural Computation 14:2531–2560
from IJCNN2022 peer review of :
    "$d_PROJECTS"'2022 WCCI Padua, Italy/2099 r Ground Reaction Force Estimation in a Quadruped Robot via Liquid State Networks.txt'
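A toy sketch of the [liquid, readout] split that defines LSMs : a fixed random recurrent "liquid" is perturbed by the input, and only a linear readout is trained on the liquid's states. Hedged heavily : this uses a rate-based tanh reservoir (closer to an echo state network) as a stand-in for the spiking liquid of Maass et al., and every parameter below is invented.

# hypothetical sketch - rate-based stand-in for a liquid state machine
import numpy as np

rng = np.random.default_rng(1)
N = 100                                                       # liquid size
W = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.1)     # sparse random recurrence
W *= 0.9 / max(abs(np.linalg.eigvals(W)))                     # scale down so perturbations fade
w_in = rng.normal(0, 1, N)

def liquid_states(u, leak=0.3):
    """Run the input sequence u through the liquid; return the state at each time step."""
    x, X = np.zeros(N), []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W @ x + w_in * u_t)
        X.append(x.copy())
    return np.array(X)

# task : real-time readout of a function of PAST input (here, the input delayed by 3 steps),
# recovered from the liquid's current state alone - no stable states needed
u = rng.random(500)
X = liquid_states(u)
y = np.roll(u, 3)
w_out = np.linalg.lstsq(X[10:], y[10:], rcond=None)[0]        # train the linear readout only
print("readout mse :", float(np.mean((X[10:] @ w_out - y[10:])**2)))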
08********08
#] 06Mar2021 Rademacher and Gaussian Complexities
>> reminds me of Bronstein WCCI2020 plenary on Deep geometry
question arising from IJCNN2021 peer review :
    "1364 p Zhang, Zhu, Wu, Zheng - BDU-net, Towards Accurate Segmentation of Dental Image using Border Guidance and Feature Map Distortion.pdf"

Peter L. Bartlett, Shahar Mendelson 2002 "Rademacher and Gaussian Complexities: Risk Bounds and Structural Results"
    Journal of Machine Learning Research 3 (2002) 463-482; submitted 11/01, published 11/02; editor : Philip M. Long
    Peter.Bartlett@anu.edu.au, shahar@csl.anu.edu.au - Research School of Information Sciences and Engineering, Australian National University, Canberra 0200, Australia
    http://www.ai.mit.edu/projects/jmlr/papers/volume3/bartlett02a/bartlett02a.pdf
    Abstract - "We investigate the use of certain data-dependent estimates of the complexity of a function class, called Rademacher and Gaussian complexities. In a decision theoretic setting, we prove general risk bounds in terms of these complexities. We consider function classes that can be expressed as combinations of functions from basis classes and show how the Rademacher and Gaussian complexities of such a function class can be bounded in terms of the complexity of the basis classes. We give examples of the application of these techniques in finding data-dependent risk bounds for decision trees, neural networks and support vector machines."
    Keywords : error bounds, data-dependent complexity, Rademacher averages, maximum discrepancy

    Definition 2 : Let \mu be a probability distribution on a set X, and suppose that X_1, ..., X_n are independent samples selected according to \mu. Let F be a class of functions mapping from X to R. Define the maximum discrepancy of F as the random variable
        \hat{D}_n(F) = \sup_{f \in F} \left( \frac{2}{n} \sum_{i=1}^{n/2} f(X_i) - \frac{2}{n} \sum_{i=n/2+1}^{n} f(X_i) \right)
    Denote the expected maximum discrepancy of F by D_n(F) = E \hat{D}_n(F). Define the random variable
        \hat{R}_n(F) = E \left[ \sup_{f \in F} \left| \frac{2}{n} \sum_{i=1}^{n} \sigma_i f(X_i) \right| \;\middle|\; X_1, ..., X_n \right]
    where \sigma_1, ..., \sigma_n are independent uniform {\pm 1}-valued random variables. Then the Rademacher complexity of F is R_n(F) = E \hat{R}_n(F). Similarly, define the random variable
        \hat{G}_n(F) = E \left[ \sup_{f \in F} \left| \frac{2}{n} \sum_{i=1}^{n} g_i f(X_i) \right| \;\middle|\; X_1, ..., X_n \right]
    where g_1, ..., g_n are independent Gaussian N(0,1) random variables. The Gaussian complexity of F is G_n(F) = E \hat{G}_n(F).

08********08
#] 24Feb2020 QNial basis
see link d_QNial_mine 'MindCode/1_MindCode summary ref.txt'
to test spiking, I must build some microNNs, but first build a few basic arithmetic operators

08********08
#] 21Feb2020 Initial QNial programming (no more [yap, arm-waving])
Izhikevich-like neuron - with [DNA-mRNA-epi, addressable synapses]
loaddefs link d_QNial_mine 'MindCode/Izhikevich-like neuron - with [DNA-mRNA-epi, addressable synapses].ndf'
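For reference alongside that .ndf : a minimal sketch of the standard Izhikevich (2003) neuron dynamics, in Python rather than QNial. The "regular spiking" parameters are the usual published ones, but the MindCode extensions [DNA-mRNA-epi, addressable synapses] are NOT modeled here.

# minimal Izhikevich neuron : v' = 0.04v^2 + 5v + 140 - u + I ; u' = a(bv - u)
def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, t_max=200.0):
    """Simulate one neuron with constant input current I; return spike times (ms)."""
    v, u = c, b * c
    spikes = []
    for step in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)    # forward-Euler membrane update
        u += dt * a * (b * v - u)                             # recovery variable
        if v >= 30.0:              # spike : reset membrane potential, bump recovery variable
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

print(izhikevich(I=10.0))          # tonic spiking for a suprathreshold constant input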
08********08
#] 25Jul2019 Alice Parker. Spiking Neural Networks & DNA. USC. Los Angeles. USA
- MindCode lunch at a restaurant after IJCNN2019 in Budapest
I sent an email about DNA-SNNs, with these references :
    /media/bill/SWAPPER/Projects - big/MindCode/Howell 050824 Junk DNA & NeuralNetworks, conjecture on directions and implications, IJCNN05 workshop panel presentation.ppt
    /media/bill/SWAPPER/Projects - big/MindCode/Howell 060215 Genetic specification of neural networks, draft concepts and implications.odt
    /media/bill/SWAPPER/Projects - big/MindCode/Howell 060215 Genetic specification of neural networks, draft concepts and implications.pdf
    /media/bill/SWAPPER/Projects - big/MindCode/Howell 060716 Genetic specification of recurrent neural networks, Initial thoughts, WCCI 2006 paper 1341.ppt
    /media/bill/SWAPPER/Projects - big/MindCode/Howell 060721 Genetic Specification of Recurrent Neural Networks Initial Thoughts, WCCI 2006 presentation.ppt
    /media/bill/SWAPPER/Projects - big/MindCode/Howell 150225 - MindCode Manifesto.odt
    /media/bill/SWAPPER/Neural Nets/Confabulation/Howell 110903 - Confabulation Theory, Plausible next sentence survey.pdf
    /media/bill/SWAPPER/Website/Social media/Howell 110902 – Systems design issues for social media.pdf
    /media/bill/SWAPPER/Website/Social media/Howell 111006 – Semantics beyond search.pdf
    /media/bill/SWAPPER/Website/Social media/Howell 111117 - How to set up & use data mining with Social media.pdf
    /media/bill/SWAPPER/Website/Social media/Howell 111230 – Social graphs, social sets, and social media.pdf
    http://www.billhowell.ca/Neural%20nets/MindCode/Howell%20050824%20Junk%20DNA%20&%20NeuralNetworks,%20conjecture%20on%20directions%20and%20implications,%20IJCNN05%20workshop%20panel%20presentation.ppt
    http://www.billhowell.ca/Neural%20nets/MindCode/Howell%20060215%20Genetic%20specification%20of%20neural%20networks,%20draft%20concepts%20and%20implications.odt
    http://www.billhowell.ca/Neural%20nets/MindCode/Howell%20060215%20Genetic%20specification%20of%20neural%20networks,%20draft%20concepts%20and%20implications.pdf
    http://www.billhowell.ca/Neural%20nets/MindCode/Howell%20060716%20Genetic%20specification%20of%20recurrent%20neural%20networks,%20Initial%20thoughts,%20WCCI%202006%20paper%201341.pdf
    http://www.billhowell.ca/Neural%20nets/MindCode/Howell%20060721%20Genetic%20Specification%20of%20Recurrent%20Neural%20Networks%20Initial%20Thoughts,%20WCCI%202006%20presentation.ppt
    http://www.billhowell.ca/Neural%20nets/MindCode/Howell%20150225%20-%20MindCode%20Manifesto.odt
    http://www.billhowell.ca/Social%20media/Howell%20111230%20–%20Social%20graphs,%20social%20sets,%20and%20social%20media.pdf
    http://www.billhowell.ca/Social%20media/Howell%20110902%20–%20Systems%20design%20issues%20for%20social%20media.pdf
    http://www.billhowell.ca/Social%20media/Howell%20111006%20-%20SPINE,%20Semantics%20beyond%20search.pdf
    http://www.billhowell.ca/Social%20media/Howell%20111117%20-%20How%20to%20set%20up%20&%20use%20data%20mining%20with%20Social%20media.pdf

08********08
#] 07Mar2018 DNA storage for real computers
This DIRECTLY relates to MindCode
    https://spectrum.ieee.org/the-human-os/biomedical/devices/dna-data-storage-gets-random-access?utm_source=thehumanosalert&utm_campaign=thehumanosalert-03-07-18&utm_medium=email
    Prachi Patel 20Feb2018 15:00 GMT "DNA Data Storage Gets Random Access" : researchers have devised a system to recover targeted files from 200 megabytes of data encoded in DNA
enddoc