## Tags
- Part of: [[Neuroscience]] [[Cognitive science]]
- Related:
- Includes:
- Additional:

## Main resources
- <iframe src="https://en.wikipedia.org/wiki/Computational_neuroscience" allow="fullscreen" allowfullscreen="" style="height:100%;width:100%; aspect-ratio: 16 / 5; "></iframe>

Computational neuroscience lectures:
[Computational Cognitive Neuroscience, 2020 - YouTube](https://youtube.com/playlist?list=PLu02O8xRZn7xtNx03Rlq6xMRdYcQgEpar&si=igAUTf8AvDCfZOXX)
[BT6270 Computational Neuroscience Aug-Nov 2021 - YouTube](https://youtube.com/playlist?list=PLqFGS_otF-c47OamOxXEu8ZmlY-NXZnMy&si=DGu637ZmMJyDBLTC)
[BT6270 Introduction to Computational Neuroscience - YouTube](https://youtube.com/playlist?list=PLqFGS_otF-c5SZ4XPaR2sUNq5EKCKnyGJ&si=ucCK8AOmz_iVylrV)
[MIT 9.40 Introduction to Neural Computation, Spring 2018 - YouTube](https://youtube.com/playlist?list=PLUl4u3cNGP61I4aI5T6OaFfRK2gihjiMm&si=4SkRL88K54Nxe8L8)

## Brainstorming
[[Thoughts (computational) neuroscience brain]]

## Resources
[[Resources computational neuroscience]]
[[Links (computational) neuroscience brain]]

## Deep dives
- [Frontiers | Integrated world modeling theory expanded: Implications for the future of consciousness](https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2022.642397/full)
- [Frontiers | An Integrated World Modeling Theory (IWMT) of Consciousness: Combining Integrated Information and Global Neuronal Workspace Theories With the Free Energy Principle and Active Inference Framework; Toward Solving the Hard Problem and Characterizing Agentic Causation](https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2020.00030/full)
- “Figure 2. Depiction of the human brain in terms of phenomenological correspondences, as well as computational (or functional), algorithmic, and implementational levels of analysis (Reprinted from Safron, 2021b). Depiction of the human brain in terms of entailed aspects of experience (i.e., phenomenology), as well as computational (or functional), algorithmic, and implementational levels of analysis (Marr, 1983; Safron, 2020b). A phenomenological level is specified to provide mappings between consciousness and these complementary/supervenient levels of analysis. Modal depictions connotate the radically embodied nature of mind, but not all images are meant to indicate conscious experiences. Phenomenal consciousness may solely be generated by hierarchies centered on posterior medial cortex, supramarginal gyrus, and angular gyrus as respective visuospatial (cf. consciousness as projective geometric modeling) (Rudrauf et al., 2017; Williford et al., 2018), somatic (cf. grounded cognition and intermediate level theory) (Varela et al., 1992; Barsalou, 2010; Prinz, 2017), and intentional/attentional phenomenology (cf. Attention Schema Theory) (Graziano, 2019). Computationally, various brain functions are identified according to particular modal aspects, either with respect to generating perception (both unconscious and conscious) or action (both unconscious and potentially conscious, via posterior generative models).
(Note: Action selection can also occur via affordance competition in posterior cortices (Cisek, 2007), and frontal generative models could be interpreted as a kind of forward-looking (unconscious) perception, made conscious as imaginings via parameterizing the inversion of posterior generative models). On the algorithmic level, these functions are mapped onto variants of machine learning architectures—e.g., autoencoders and generative adversarial networks, graph neural networks (GNNs), recurrent reservoirs and liquid state machines—organized according to potential realization by neural systems. GNN-structured latent spaces are suggested as a potentially important architectural principle (Zhou et al., 2019), largely due to efficiency for emulating physical processes (Battaglia et al., 2018; Bapst et al., 2020; Cranmer et al., 2020). Hexagonally organized grid graph GNNs are depicted in posterior medial cortices as contributing to quasi-Cartesian spatial modeling (and potentially experience) (Haun and Tononi, 2019; Haun, 2020), as well as in dorsomedial, and ventromedial PFCs for agentic control. With respect to AI systems, such representations could be used to implement not just modeling of external spaces, but of consciousness as internal space (or blackboard), which could potentially be leveraged for reasoning processes with correspondences to category theory, analogy making via structured representations, and possibly causal inference. Neuroimaging evidence suggests these grids may be dynamically coupled in various ways (Faul et al., 2020), contributing to higher-order cognition as a kind of navigation/search process through generalized space (Hills et al., 2010; Kaplan and Friston, 2018; Çatal et al., 2021). A further GNN is speculatively adduced to reside in supramarginal gyrus as a mesh grid placed on top of a transformed representation of the primary sensorimotor homunculus (cf. body image/schema for the sake of efficient motor control/inference). This quasi-homuncular GNN may have some scaled correspondence to embodiment as felt from within, potentially morphed/re-represented to better correspond with externally viewed embodiments (potentially both resulting from and enabling “mirroring” with other agents for coordination and inference) (Rochat, 2010). Speculatively, this partial translation into a quasi-Cartesian reference frame may provide more effective couplings (or information-sharing) with semi-topographically organized representations in posterior medial cortices. Angular gyrus is depicted as containing a ring-shaped GNN to reflect a further level of abstraction and hierarchical control over action-oriented body schemas—which may potentially mediate coherent functional couplings between the “lived body” and the “mind’s eye”—functionally entailing vectors/tensors over attentional (and potentially intentional) processes (Graziano, 2018). Frontal homologs to posterior GNNs are also depicted, which may provide a variety of higher-order modeling abilities, including epistemic access for extended/distributed self-processes and intentional control mechanisms. These higher-order functionalities may be achieved via frontal cortices being more capable of temporally extended generative modeling (Parr et al., 2019c), and potentially also by virtue of being located further from primary sensory cortices, so affording (“counterfactually rich”) dynamics that are more decoupled from immediate sensorimotor contingencies. 
Further, these frontal control hierarchies afford multi-scale goal-oriented behavior via bidirectional effective connectivity with the basal ganglia (i.e., winner-take-all dynamics and facilitation of sequential operations) and canalization via diffuse neuro-modulator nuclei of the brainstem (i.e., implicit policies and value signals) (Houk et al., 2007; Humphries and Prescott, 2010; Stephenson-Jones et al., 2011; Dabney et al., 2020; Morrens et al., 2020). Finally, the frontal pole is described as a highly non-linear recurrent system capable of shaping overall activity via bifurcating capacities (Tani, 2016; Wang et al., 2018)—with potentially astronomical combinatorics—providing sources of novelty and rapid adaptation via situation-specific attractor dynamics. While the modal character of prefrontal computation is depicted at the phenomenological level of analysis, IWMT proposes frontal cortices might only indirectly contribute to consciousness via influencing dynamics in posterior cortices. Speculatively, functional analogs for ring-shaped GNN salience/relevance maps may potentially be found in the central complexes of insects and the tectums of all vertebrates (Honkanen et al., 2019), although it is unclear whether those structures would be associated with any kind of subjective experience. Even more speculatively, if these functional mappings were realized in a human-mimetic, neuromorphic AI, then it may have both flexible general intelligence and consciousness. In this way, this figure is a sort of pseudocode for (partially human-interpretable) AGI with “System 2” capacities (Bengio, 2017; Thomas et al., 2018), and possibly also phenomenal consciousness. (Note: The language of predictive processing provides bridges between implementational and computational (and also phenomenological) levels, but descriptions such as vector fields and attracting manifolds could have alternatively been used to remain agnostic as to which implicit algorithms might be entailed by physical dynamics). On the implementational level, biological realizations of algorithmic processes are depicted as corresponding to flows of activity and interactions between neuronal populations, canalized by the formation of metastable synchronous complexes (i.e., “self-organizing harmonic modes”; Safron, 2020a). (Note: The other models discussed in this manuscript do not depend on the accuracy of these putative mappings, nor the hypothesized mechanisms of centralized homunculi and “Cartesian theaters” with semi-topographic correspondences with phenomenology).”

[[Images/e6daefd3b6a96fb3b8a0704f8db287a6_MD5.jpeg|Open: Pasted image 20240920043523.png]]
![[Images/e6daefd3b6a96fb3b8a0704f8db287a6_MD5.jpeg]]

- An integrative, multiscale view on neural theories of consciousness: a combination of global neuronal workspace theory + integrated information theory + recurrent processing theory + predictive processing theory + neurorepresentationalism + dendritic integration theory
  [An integrative, multiscale view on neural theories of consciousness: Neuron](https://www.cell.com/neuron/fulltext/S0896-6273(24)00088-6)
  ![[Pasted image 20240920043334.png]]

## Written by AI (may include factually incorrect information)

### Map 1

**Comprehensive Map of Computational Neuroscience**

---

**1. Introduction to Computational Neuroscience**

- **Definition and Scope**
  - Study of brain function in terms of information processing and computational principles.
  - Intersection of neuroscience, psychology, computer science, physics, and mathematics.
- **Historical Background**
  - Early models: McCulloch-Pitts neuron (1943).
  - Hodgkin-Huxley model of the action potential (1952).
  - Development of artificial neural networks.
- **Interdisciplinary Nature**
  - **Neuroscience**: Understanding biological neural systems.
  - **Computer Science**: Algorithms and computational models.
  - **Mathematics**: Mathematical modeling and analysis.
  - **Physics**: Biophysical properties of neurons.
  - **Psychology**: Cognitive functions and behaviors.

---

**2. Foundational Concepts**

- **Neuron Structure and Function**
  - **Dendrites**: Receive synaptic inputs.
  - **Soma**: Cell body integrating signals.
  - **Axon**: Transmits action potentials.
  - **Synapses**: Junctions for neurotransmission.
- **Neural Signaling**
  - **Action Potentials**: Electrical impulses.
  - **Synaptic Transmission**: Chemical/electrical signaling.
- **Neural Coding**
  - **Rate Coding**: Information in firing rates.
  - **Temporal Coding**: Information in spike timing.
  - **Population Coding**: Collective neuron activity.
- **Neuroanatomy**
  - **Central Nervous System (CNS)**: Brain and spinal cord.
  - **Peripheral Nervous System (PNS)**: Sensory and motor neurons.
  - **Brain Regions**: Cortex, subcortex, cerebellum, etc.

---

**3. Mathematical and Computational Tools**

- **Differential Equations**
  - **ODEs**: Modeling neuron dynamics.
  - **PDEs**: Modeling spatial aspects (e.g., dendrites).
- **Dynamical Systems Theory**
  - Stability, bifurcations, chaos.
- **Probability and Statistics**
  - **Stochastic Processes**: Random neuronal activity.
  - **Bayesian Inference**: Probabilistic reasoning.
- **Information Theory**
  - **Entropy**, **Mutual Information**: Quantifying information.
- **Machine Learning Techniques**
  - **Supervised/Unsupervised Learning**: Pattern recognition.
  - **Reinforcement Learning**: Learning from feedback.
- **Signal Processing**
  - **Fourier Analysis**, **Wavelets**: Analyzing neural signals.

---

**4. Single Neuron Modeling**

- **Hodgkin-Huxley Model**
  - Detailed ion channel kinetics.
- **Simplified Models**
  - **Integrate-and-Fire**: Threshold-based spiking.
  - **FitzHugh-Nagumo**, **Morris-Lecar**: Simplified dynamics.
- **Dendritic Computations**
  - **Cable Theory**: Signal propagation in dendrites.
  - **Compartmental Models**: Spatial neuron modeling.
- **Ion Channel Modeling**
  - **Markov Models**: State transitions of channels.

---

**5. Network Modeling**

- **Artificial Neural Networks**
  - **Perceptrons**, **Multi-layer Networks**.
  - **Deep Learning**: CNNs, RNNs.
- **Recurrent Neural Networks**
  - **Hopfield Networks**: Associative memory.
  - **Boltzmann Machines**: Probabilistic models.
- **Spiking Neural Networks**
  - **Leaky Integrate-and-Fire Networks**.
  - **Synchronization and Oscillations**.
- **Synaptic Plasticity**
  - **Hebbian Learning**: "Fire together, wire together".
  - **STDP**, **LTP/LTD**: Timing-dependent changes (a pairing sketch follows section 7 below).
- **Network Dynamics**
  - **Oscillations**, **Synchronization**, **Chaos**.

---

**6. Neural Coding and Decoding**

- **Encoding Information**
  - **Population Codes**, **Sparse Coding**.
- **Decoding Neural Activity**
  - Algorithms for interpreting neural signals.
- **Information Processing**
  - **Sensory Processing**, **Decision Making**, **Motor Control**.

---

**7. Learning and Plasticity**

- **Learning Rules**
  - **Delta Rule**, **Backpropagation**.
  - **Reinforcement Learning Models**.
- **Plasticity Mechanisms**
  - **Synaptic**, **Structural Plasticity**.
- **Models of Learning and Memory**
  - **Associative Memory**, **Sequence Learning**.
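
The timing-dependent plasticity entries above (STDP under Synaptic Plasticity, and the learning rules in section 7) can be made concrete with a minimal pair-based STDP sketch in Python/NumPy. The exponential windows are the standard textbook form, but the amplitudes, time constants, and toy spike trains below are illustrative assumptions rather than values from any particular study.

```python
import numpy as np

# Pair-based STDP: a pre-before-post spike pair potentiates the synapse,
# post-before-pre depresses it, with exponentially decaying windows.
# Parameter values here are illustrative, not fitted to data.
A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants (ms)

def stdp_dw(delta_t):
    """Weight change for one spike pair; delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:
        return A_PLUS * np.exp(-delta_t / TAU_PLUS)    # LTP
    return -A_MINUS * np.exp(delta_t / TAU_MINUS)      # LTD

# Accumulate weight changes over all pre/post spike pairs of two toy trains.
pre_spikes = np.array([10.0, 50.0, 90.0])   # ms
post_spikes = np.array([12.0, 45.0, 95.0])  # ms

w = 0.5  # initial synaptic weight (arbitrary units)
for t_pre in pre_spikes:
    for t_post in post_spikes:
        w += stdp_dw(t_post - t_pre)
w = np.clip(w, 0.0, 1.0)  # keep the weight in a bounded range
print(f"weight after pairing: {w:.3f}")
```

In a full network simulation the same update is usually implemented with pre- and post-synaptic trace variables rather than explicit pair loops, but the window shape is the same.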

---

**8. Neural Development and Evolution**

- **Developmental Models**
  - **Axon Guidance**, **Synaptic Pruning**.
- **Evolutionary Computation**
  - **Genetic Algorithms**.
  - Evolution of neural circuits.

---

**9. Sensory Systems Modeling**

- **Visual System**
  - **Retina Models**, **Visual Cortex Processing**.
- **Auditory System**
  - **Cochlea Models**, **Auditory Cortex**.
- **Somatosensory System**
  - Tactile and pain processing.
- **Olfactory and Gustatory Systems**
  - Smell and taste encoding.

---

**10. Motor Control Models**

- **Motor Cortex Modeling**
  - Neural control of movement.
- **Basal Ganglia Function**
  - Action selection mechanisms.
- **Cerebellar Models**
  - Coordination and timing.
- **Sensorimotor Integration**
  - Combining sensory inputs with motor actions.

---

**11. Cognitive Neuroscience Models**

- **Attention**
  - Mechanisms of focus and selection.
- **Memory Systems**
  - **Working Memory**, **Long-term Memory**.
- **Decision Making**
  - **Drift-Diffusion Models**, **Reinforcement Learning** (a drift-diffusion sketch follows this outline).
- **Executive Function**
  - Planning, problem-solving models.
- **Language Processing**
  - Neural basis of language comprehension and production.

---

**12. Neural Disorders and Computational Models**

- **Epilepsy**
  - **Seizure Dynamics**.
- **Parkinson's Disease**
  - **Basal Ganglia Dysfunction Models**.
- **Alzheimer's Disease**
  - Models of neurodegeneration.
- **Schizophrenia**
  - **Dysconnectivity Hypotheses**.
- **Computational Psychiatry**
  - Modeling mental disorders.

---

**13. Emergent Properties and Complex Systems**

- **Self-Organization**
  - Neural map formation.
- **Criticality in Neural Systems**
  - Balance between order and chaos.
- **Complex Network Theory**
  - **Small-world**, **Scale-free Networks**.

---

**14. Methods and Techniques**

- **Numerical Simulation**
  - **Software Tools**: NEURON, NEST, Brian.
- **Data Analysis**
  - **PCA**, **ICA**, **Clustering**.
- **Experimental Techniques Integration**
  - **Electrophysiology**, **Imaging Data**.
- **High-Performance Computing**
  - **Parallel**, **GPU Computing**.

---

**15. Brain-Computer Interfaces (BCIs)**

- **Neural Prosthetics**
  - Restoring lost functions.
- **Decoding Motor Intentions**
  - Translating neural activity to actions.
- **Closed-loop Systems**
  - Feedback-based BCIs.

---

**16. Neuromorphic Computing**

- **Analog vs. Digital Systems**
  - Hardware mimicking neural processes.
- **Spiking Neural Network Hardware**
  - Neuromorphic chips.
- **Applications**
  - Low-power computing, robotics.

---

**17. Ethical Considerations**

- **Computational Models and Ethics**
  - Responsible AI development.
- **Data Privacy**
  - Handling sensitive neural data.
- **AI and Consciousness**
  - Philosophical implications.

---

**18. Future Directions**

- **Large-scale Brain Simulations**
  - Whole-brain modeling.
- **Connectomics**
  - Mapping neural connections.
- **Integrative Approaches**
  - Combining computational and experimental methods.
- **Interdisciplinary Collaborations**
  - Bridging multiple scientific fields.

---

**19. Key Researchers and Institutions**

- **Notable Scientists**
  - Hodgkin & Huxley, Hubel & Wiesel, Sejnowski.
- **Leading Research Institutes**
  - **Human Brain Project**, **Allen Institute**, **Blue Brain Project**.
- **Universities**
  - MIT, Stanford, University College London.

---

**20. Resources and Further Reading**

- **Textbooks**
  - *Theoretical Neuroscience* by Dayan & Abbott.
  - *Neuronal Dynamics* by Gerstner et al.
- **Journals**
  - *Journal of Computational Neuroscience*, *Neural Computation*.
- **Conferences**
  - COSYNE, SfN, NeurIPS.
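
As flagged in section 11, here is a minimal drift-diffusion simulation of two-alternative decision making: noisy evidence accumulates until it reaches one of two bounds, yielding both a choice and a decision time. The drift, bound, noise level, and trial count are illustrative choices, not fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Drift-diffusion model of two-alternative choice: evidence x accumulates
# with drift v and Gaussian noise until it hits +a (choice A) or -a (choice B).
def simulate_ddm(v=0.15, a=1.0, noise=1.0, dt=1e-3, max_t=5.0):
    x, t = 0.0, 0.0
    while abs(x) < a and t < max_t:
        x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = "A" if x >= a else ("B" if x <= -a else "none")
    return choice, t

trials = [simulate_ddm() for _ in range(500)]
p_a = np.mean([c == "A" for c, _ in trials])
mean_rt = np.mean([t for _, t in trials])
print(f"P(choice A) ~ {p_a:.2f}, mean decision time ~ {mean_rt:.2f} s")
```

Fitting such a model to behavioral data would additionally require a non-decision time and trial-to-trial variability terms, which are omitted in this sketch.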

---

**Detailed Expansion**

---

**Foundational Concepts:**

- **Neural Signaling**
  - **Action Potentials**
    - **Initiation**: Threshold crossing at the axon hillock.
    - **Propagation**: Saltatory conduction in myelinated axons.
  - **Synaptic Transmission**
    - **Chemical Synapses**
      - Neurotransmitter release and receptor binding.
    - **Electrical Synapses**
      - Direct ionic current flow through gap junctions.
- **Neural Coding**
  - **Rate Coding**
    - **Mean Firing Rate**: Average spikes per unit time.
  - **Temporal Coding**
    - **Spike Timing**: Precise timing carries information.
    - **Phase Coding**: Information encoded in relation to oscillation phases.
  - **Population Coding**
    - **Distributed Representation**: Information spread over many neurons.
    - **Vector Coding**: Directional information in motor cortex.

---

**Mathematical and Computational Tools:**

- **Dynamical Systems Theory**
  - **Fixed Points**: Stable and unstable equilibria.
  - **Limit Cycles**: Periodic oscillations in neuron models.
  - **Chaos Theory**: Sensitivity to initial conditions.
- **Probability and Statistics**
  - **Poisson Processes**: Modeling spike trains.
  - **Hidden Markov Models**: Sequential neural data.
- **Machine Learning Techniques**
  - **Deep Learning Architectures**
    - **CNNs**: For image and visual data.
    - **LSTM Networks**: Modeling temporal sequences.
  - **Reinforcement Learning**
    - **Policy Gradients**: Learning optimal actions.

---

**Single Neuron Modeling:**

- **Ion Channel Dynamics**
  - **Voltage-Gated Channels**: Na+, K+, Ca2+ channels.
  - **Ligand-Gated Channels**: Neurotransmitter activation.
- **Dendritic Processing**
  - **Backpropagation of Action Potentials**: From soma to dendrites.
  - **Dendritic Spikes**: Local processing units.

---

**Network Modeling:**

- **Artificial Neural Networks**
  - **Autoencoders**: Dimensionality reduction.
  - **Generative Adversarial Networks (GANs)**: Learning data distributions.
- **Spiking Neural Networks**
  - **Event-Driven Simulation**: Time-stepped vs. event-based methods (a time-stepped integrate-and-fire sketch follows this expansion).
  - **Plasticity in Spiking Networks**: Learning rules specific to spike timing.

---

**Learning and Plasticity:**

- **Biophysical Models of Plasticity**
  - **Calcium Dynamics**: Role in synaptic changes.
  - **Metaplasticity**: Plasticity of synaptic plasticity.
- **Memory Models**
  - **Hebbian Assembly Formation**: Cell assemblies encoding memories.
  - **Attractor Networks**: Stable patterns representing memory states.

---

**Cognitive Neuroscience Models:**

- **Attention**
  - **Biased Competition Models**: Neural basis of selective attention.
  - **Top-down vs. Bottom-up Processing**: Influence of goals vs. stimuli.
- **Decision Making**
  - **Neural Circuit Models**: Cortical-basal ganglia-thalamic loops.
  - **Value-based Decision Making**: Reward prediction errors.

---

**Methods and Techniques:**

- **Data Analysis**
  - **Spike Sorting**: Identifying neurons from electrode recordings.
  - **Time-Frequency Analysis**: EEG/MEG signal processing.
- **Imaging Techniques**
  - **Two-Photon Microscopy**: Imaging neuronal activity.
  - **Optogenetics**: Controlling neurons with light.

---

**Neuromorphic Computing:**

- **Hardware Implementations**
  - **Memristors**: Devices mimicking synapses.
  - **Spintronics**: Utilizing electron spin for computing.
- **Applications**
  - **Robotics**: Neuromorphic control systems.
  - **Edge Computing**: Low-power, real-time processing.
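
Several entries above (Integrate-and-Fire in section 4, spiking networks, and the spiking hardware under Neuromorphic Computing) share the same basic unit. Below is a minimal time-stepped leaky integrate-and-fire simulation in NumPy; the membrane parameters and input current are illustrative, and forward Euler is used for brevity rather than numerical accuracy.

```python
import numpy as np

# Leaky integrate-and-fire neuron driven by a constant input current:
# tau_m * dV/dt = -(V - E_L) + R_m * I; a spike is recorded and V reset
# whenever V crosses threshold. Parameter values are illustrative.
TAU_M = 20.0      # membrane time constant (ms)
E_L = -70.0       # resting potential (mV)
V_TH = -50.0      # spike threshold (mV)
V_RESET = -65.0   # reset potential (mV)
R_M = 10.0        # membrane resistance (MOhm)
DT = 0.1          # integration step (ms)

def simulate_lif(i_ext=2.5, t_max=500.0):
    """Forward-Euler simulation; returns spike times in ms."""
    n_steps = int(t_max / DT)
    v = E_L
    spikes = []
    for step in range(n_steps):
        dv = (-(v - E_L) + R_M * i_ext) / TAU_M
        v += DT * dv
        if v >= V_TH:
            spikes.append(step * DT)
            v = V_RESET
    return np.array(spikes)

spike_times = simulate_lif()
rate = len(spike_times) / 0.5  # spikes per second over the 500 ms window
print(f"{len(spike_times)} spikes, about {rate:.0f} Hz")
```

Simulators such as Brian, NEST, or NEURON (listed under Software Tools above) wrap this kind of update loop with synapses, connectivity, and more careful integration schemes.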

---

**Future Directions:**

- **Brain Simulation Projects**
  - **Human Brain Project**: European initiative for brain modeling.
  - **OpenWorm**: Simulating the nervous system of C. elegans.
- **Advances in Technology**
  - **Quantum Computing**: Potential applications in neural modeling.
  - **Artificial General Intelligence (AGI)**: Bridging AI and neuroscience.

---

**Ethical Considerations:**

- **Neuroethics**
  - **Cognitive Enhancement**: Ethical implications of enhancing brain function.
  - **Neuroprivacy**: Protecting thoughts and neural data.

---

**Conclusion**

This map provides a comprehensive overview of computational neuroscience, encompassing the topics, concepts, models, and methodologies that define the field. It illustrates the field's interdisciplinary nature, bridging biology, physics, computer science, and psychology to unravel the complexities of neural systems. As research progresses, computational neuroscience continues to evolve, integrating new technologies and theories to enhance our understanding of the brain.

### Map 2

- Computational [[Neuroscience]] is a multidisciplinary field that employs [[Mathematics|mathematical]] models, theoretical analysis, and computer simulations to understand the structure, dynamics, and functions of the nervous system and the brain. Here's a comprehensive list of various branches and topics within computational neuroscience:

### 1. Neural Network Models
- Artificial Neural Networks
- Spiking Neural Networks
- Recurrent Neural Networks
- Convolutional Neural Networks
- Self-Organizing Maps
- Hopfield Networks

### 2. Neuronal Dynamics Models
- Hodgkin-Huxley Model
- Integrate-and-Fire Models
- FitzHugh-Nagumo Model
- Leaky Integrate-and-Fire Model
- Izhikevich Neuron Model
- Wilson-Cowan Model

### 3. Synaptic Plasticity and Learning
- Hebbian Learning
- Spike-Timing-Dependent Plasticity (STDP)
- Synaptic Scaling and Homeostasis
- Long-Term Potentiation and Depression (LTP/LTD)

### 4. Neural Coding and Information Theory
- Rate Coding and Temporal Coding
- Population Coding
- Information Theory in Neural Systems
- Neural Decoding

### 5. Computational Models of Sensory Systems
- Visual System Modeling
- Auditory System Modeling
- Somatosensory System Modeling
- Olfactory System Modeling
- Gustatory System Modeling

### 6. Systems and Integrative Neuroscience Models
- Thalamocortical Circuitry
- Basal Ganglia Models
- Cerebellar Models
- Hippocampal Models
- Cortical Column and Network Models

### 7. Cognitive and Behavioral Neuroscience Models
- Decision-Making Models
- Memory and Learning Models
- Attention and Executive Function Models
- Emotion and Motivation Models
- Language and Semantic Processing Models

### 8. Computational Neuroanatomy
- Neural Network Topology
- Connectomics
- Brain Parcellation and Segmentation
- Diffusion Tensor Imaging (DTI) Analysis

### 9. Neuroinformatics
- Databases and Data Mining in Neuroscience
- Tools for Neural Data Analysis
- Brain Atlases and Mapping
- Neurogenomics and Transcriptomics

### 10. Computational Psychopathology
- Models of Neural Disorders
- Computational Psychiatry
- Neural Basis of Mental Disorders

### 11. Neuropharmacology Modeling
- Computational Models of Drug Effects
- Neurotransmitter Systems Modeling
- Receptor and Ion Channel Modeling

### 12. Brain-Computer Interfaces and Neuroprosthetics
- Decoding Neural Signals
- Brain Stimulation and Modulation Models
- Neural Control Interfaces
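
To make "Decoding Neural Signals" (section 12) and "Neural Decoding" (section 4) concrete, here is a toy decoding sketch: Poisson spike counts are simulated for a small population under two stimulus conditions, and the condition is recovered from held-out trials with a nearest-class-mean rule. The tuning curves, trial counts, and decoder choice are illustrative assumptions; real BCI pipelines typically use regularized linear decoders or Kalman filters.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two stimulus conditions evoke different mean firing rates across 50 neurons;
# a nearest-class-mean decoder recovers the condition from spike counts.
n_neurons, n_trials = 50, 200
tuning_a = rng.uniform(2.0, 10.0, n_neurons)            # mean rates, stimulus A (Hz)
tuning_b = tuning_a + rng.normal(0.0, 2.0, n_neurons)   # shifted rates, stimulus B

def poisson_trials(rates, n):
    # Spike counts in a 1 s window, Poisson-distributed around the tuning curve.
    return rng.poisson(np.clip(rates, 0.1, None), size=(n, len(rates)))

X = np.vstack([poisson_trials(tuning_a, n_trials), poisson_trials(tuning_b, n_trials)])
y = np.array([0] * n_trials + [1] * n_trials)

# Split into train/test, estimate class means, decode by nearest mean.
idx = rng.permutation(len(y))
train, test = idx[:300], idx[300:]
means = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X[test][:, None, :] - means[None, :, :], axis=2)
accuracy = np.mean(dists.argmin(axis=1) == y[test])
print(f"decoding accuracy: {accuracy:.2f}")
```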

### 13. Computational Developmental Neuroscience
- Neural Development and Growth Models
- Axonal Pathfinding and Synaptogenesis
- Cortical Development and Plasticity

### 14. Machine Learning and AI in Neuroscience
- Deep Learning Applications in Neuroscience
- Reinforcement Learning and Neural Control (a prediction-error sketch closes this note)
- Machine Learning for Neuroimaging Analysis

### 15. Multi-Scale Modeling
- Bridging Molecular, Cellular, and Systems-Level Models
- Integrative Modeling of Brain Function

Computational neuroscience is a highly interdisciplinary field, leveraging techniques and concepts from neuroscience, physics, mathematics, computer science, and engineering. It plays a crucial role in translating experimental findings into quantitative theories and in driving forward our understanding of the nervous system and brain.
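
As a closing illustration of "Reinforcement Learning and Neural Control" (section 14) and the reward-prediction-error idea noted under value-based decision making in Map 1, here is a minimal Rescorla-Wagner style value update driven by prediction errors; the learning rate and reward probability are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Delta-rule value learning: a scalar value estimate V for a cue is nudged
# by the reward prediction error (RPE) on each trial, in the spirit of the
# dopaminergic prediction-error signals mentioned above.
ALPHA = 0.1       # learning rate (illustrative)
P_REWARD = 0.8    # probability the cue is followed by reward (illustrative)
V = 0.0           # initial value estimate

for trial in range(200):
    reward = float(rng.random() < P_REWARD)
    rpe = reward - V      # prediction error: actual minus expected reward
    V += ALPHA * rpe      # delta-rule update

print(f"learned value ~ {V:.2f} (true expected reward = {P_REWARD})")
```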