The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon’s mathematical theory of information is used to explore the computational efficiency of neurons, with special reference to visual perception and the efficient coding hypothesis. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural processing; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, and a list of annotated Further Readings, this book is an ideal introduction to the principles of neural information theory.


Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency (Tutorial Introductions)
$21.83 (original price $34.95)
This book offers an advanced introduction to neural information theory, suitable for students studying computer science, biology, and computational neuroscience.
Additional information

| Attribute | Value |
|---|---|
| Weight | 0.338 lbs |
| Dimensions | 15.2 × 1.2 × 22.9 cm |