Gerstner, Wulfram

Neuronal dynamics : from single neurons to networks and models of cognition. Cambridge, United Kingdom: Cambridge University Press, 2014. 577 p.


Machine generated contents note:
pt. ONE FOUNDATIONS OF NEURONAL DYNAMICS
1. Introduction: neurons and mathematics
1.1. Elements of neuronal systems
1.2. Elements of neuronal dynamics
1.3. Integrate-and-fire models
1.4. Limitations of the leaky integrate-and-fire model
1.5. What can we expect from integrate-and-fire models?
1.6. Summary
2. Ion channels and the Hodgkin-Huxley model
2.1. Equilibrium potential
2.2. Hodgkin-Huxley model
2.3. The zoo of ion channels
2.4. Summary
3. Dendrites and synapses
3.1. Synapses
3.2. Spatial structure: the dendritic tree
3.3. Spatial structure: axons
3.4. Compartmental models
3.5. Summary
4. Dimensionality reduction and phase plane analysis
4.1. Threshold effects
4.2. Reduction to two dimensions
4.3. Phase plane analysis
4.4. Type I and type II neuron models
4.5. Threshold and excitability
4.6. Separation of time scales and reduction to one dimension
4.7. Summary
pt. TWO GENERALIZED INTEGRATE-AND-FIRE NEURONS
5. Nonlinear integrate-and-fire models
5.1. Thresholds in a nonlinear integrate-and-fire model
5.2. Exponential integrate-and-fire model
5.3. Quadratic integrate and fire
5.4. Summary
6. Adaptation and firing patterns
6.1. Adaptive exponential integrate-and-fire
6.2. Firing patterns
6.3. Biophysical origin of adaptation
6.4. Spike Response Model (SRM)
6.5. Summary
7. Variability of spike trains and neural codes
7.1. Spike-train variability
7.2. Mean firing rate
7.3. Interval distribution and coefficient of variation
7.4. Autocorrelation function and noise spectrum
7.5. Renewal statistics
7.6. The problem of neural coding
7.7. Summary
8. Noisy input models: barrage of spike arrivals
8.1. Noise input
8.2. Stochastic spike arrival
8.3. Subthreshold vs. superthreshold regime
8.4. Diffusion limit and Fokker-Planck equation (*)
8.5. Summary
9. Noisy output: escape rate and soft threshold
9.1. Escape noise
9.2. Likelihood of a spike train
9.3. Renewal approximation of the Spike Response Model
9.4. From noisy inputs to escape noise
9.5. Summary
10. Estimating parameters of probabilistic neuron models
10.1. Parameter optimization in linear and nonlinear models
10.2. Statistical formulation of encoding models
10.3. Evaluating goodness-of-fit
10.4. Closed-loop stimulus design
10.5. Summary
11. Encoding and decoding with stochastic neuron models
11.1. Encoding models for intracellular recordings
11.2. Encoding models in systems neuroscience
11.3. Decoding
11.4. Summary
pt. THREE NETWORKS OF NEURONS AND POPULATION ACTIVITY
12. Neuronal populations
12.1. Columnar organization
12.2. Identical neurons: a mathematical abstraction
12.3. Connectivity schemes
12.4. From microscopic to macroscopic
12.5. Summary
13. Continuity equation and the Fokker-Planck approach
13.1. Continuity equation
13.2. Stochastic spike arrival
13.3. Fokker-Planck equation
13.4. Networks of leaky integrate-and-fire neurons
13.5. Networks of nonlinear integrate-and-fire neurons
13.6. Neuronal adaptation and synaptic conductance
13.7. Summary
14. Quasi-renewal theory and the integral-equation approach
14.1. Population activity equations
14.2. Recurrent networks and interacting populations
14.3. Linear response to time-dependent input
14.4. Density equations vs. integral equations
14.5. Adaptation in population equations
14.6. Heterogeneity and finite size
14.7. Summary
15. Fast transients and rate models
15.1. How fast are population responses?
15.2. Fast transients vs. slow transients in models
15.3. Rate models
15.4. Summary
pt. FOUR DYNAMICS OF COGNITION
16. Competing populations and decision making
16.1. Perceptual decision making
16.2. Competition through common inhibition
16.3. Dynamics of decision making
16.4. Alternative decision models
16.5. Human decisions, determinism, and free will
16.6. Summary
17. Memory and attractor dynamics
17.1. Associations and memory
17.2. Hopfield model
17.3. Memory networks with spiking neurons
17.4. Summary
18. Cortical field models for perception
18.1. Spatial continuum model
18.2. Input-driven regime and sensory cortex models
18.3. Bump attractors and spontaneous pattern formation
18.4. Summary
19. Synaptic plasticity and learning
19.1. Hebb rule and experiments
19.2. Models of Hebbian learning
19.3. Unsupervised learning
19.4. Reward-based learning
19.5. Summary
20. Outlook: dynamics in plastic networks
20.1. Reservoir computing
20.2. Oscillations: good or bad?
20.3. Helping patients
20.4. Summary

This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.

ISBN 9781107635197

612.8 / GER