

A primer on information theory, with applications to neuroscience
Author: Felix Effenberger
Description
Given the constant rise in the quantity and quality of data obtained from neural systems on all scales, information-theoretic analyses have become increasingly popular in the neurosciences over the last decades. Such analyses can provide deep insights into the functioning of these systems and can also help in the characterization and analysis of neural dysfunction, a topic that has recently come into the focus of research in the computational neurosciences.
This chapter aims to give a short introduction to the fundamentals of information theory, suited especially, but not only, for readers with a less firm background in mathematics and probability theory. Regarding applications, the focus is on neuroscientific topics.
We start by reviewing fundamentals of probability theory such as the notion of probability, probability distributions, and random variables. We then discuss the concepts of information and entropy (in the sense of Shannon), mutual information, and transfer entropy (sometimes also referred to as conditional mutual information). As these quantities cannot be computed exactly from measured data in practice, we discuss estimation techniques for information-theoretic quantities.
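As a toy illustration of the quantities named above, the following sketch implements the simplest estimation technique, the plug-in (maximum-likelihood) estimator, which replaces the true probabilities with empirical frequencies. The function names are illustrative, not taken from the chapter, and this naive estimator is known to be biased for small samples, which is precisely why more refined estimation techniques are needed in practice.

```python
import math
from collections import Counter

def entropy(xs):
    """Plug-in estimate of the Shannon entropy H(X) in bits,
    using empirical frequencies as probability estimates."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y) in bits."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy data: Y is a deterministic copy of X, so I(X;Y) = H(X).
x = [0, 1, 0, 1, 0, 1, 0, 1]
y = x[:]
print(entropy(x))                # 1.0 bit for a fair binary source
print(mutual_information(x, y))  # equals H(X) = 1.0
```

For independent sequences the estimate of I(X;Y) drops to zero, matching the interpretation of mutual information as the reduction of uncertainty about one variable given the other.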
We conclude with a discussion of applications of information theory in the field of neuroscience, including possible medical applications, and a short review of software packages that can be used for information-theoretic analyses of neural data.

Copyright © 2002–2013 FreeScience.info.
