The human brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared with those of a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the computational efficiency of neurons, with special reference to visual perception. A diverse range of examples shows how information theory defines fundamental and unbreachable limits on neural efficiency, limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary and tutorial appendices, this book is ideal for novices who wish to understand the essential principles of neural information theory.