I develop statistical methods for analysing data from neuroimaging and electrophysiological experiments. I have a particular interest in practical applications of information theoretic methods.
Research Strands
Developing information theoretic analysis tools
Information theory provides an elegant, unified statistical framework, but estimating information theoretic quantities in practice from limited data is not straightforward. During my PhD I developed pyEntropy, an open-source Python library which implements a range of bias-corrected estimators for discrete (i.e. binned) data.
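As an illustration of the kind of bias correction involved (a minimal sketch, not pyEntropy's actual API), the classic Miller-Madow correction adds a simple term to the maximum-likelihood ("plug-in") entropy estimate to offset its downward finite-sample bias:

```python
import numpy as np

def plugin_entropy(counts):
    """Maximum-likelihood ('plug-in') entropy estimate, in bits."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]                       # 0 log 0 is taken as 0
    return -np.sum(p * np.log2(p))

def miller_madow_entropy(counts):
    """Plug-in estimate with the Miller-Madow bias correction:
    H_MM = H_plugin + (m - 1) / (2 N ln 2) bits,
    where m is the number of occupied bins and N the sample count."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    m = np.count_nonzero(counts)
    return plugin_entropy(counts) + (m - 1) / (2 * n * np.log(2))
```

The plug-in estimate systematically underestimates entropy when bins are undersampled; the correction term grows with the number of occupied bins, which is why binned estimates become problematic in high-dimensional response spaces.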
I have recently developed a new bin-less method for estimating information theoretic quantities (GCMI; Ince et al. 2017) which is much less sensitive to bias effects. This estimator is robust and computationally efficient, and is ideally suited to continuous signals such as those recorded with EEG and MEG. In particular, it makes it possible to estimate information theoretic quantities on multivariate spaces that would be intractable with binned methods, enabling practical estimation of quantities such as interaction information (below) and conditional mutual information.
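The core idea can be sketched in a few lines for the simplest 1D case (a simplified illustration only; the published GCMI toolbox adds bias correction and multivariate and conditional variants): rank-transform each variable through its empirical copula, map the ranks to a standard normal, and compute MI under a Gaussian assumption from the resulting correlation.

```python
import numpy as np
from scipy.stats import rankdata, norm

def copnorm(x):
    """Map samples to standard normal via the empirical copula."""
    u = rankdata(x) / (len(x) + 1)     # ranks -> uniform in (0, 1)
    return norm.ppf(u)                  # uniform -> standard normal

def gcmi_1d(x, y):
    """Gaussian-copula MI (bits) between two 1-D continuous variables."""
    cx, cy = copnorm(x), copnorm(y)
    r = np.corrcoef(cx, cy)[0, 1]
    # MI of a bivariate Gaussian with correlation r
    return -0.5 * np.log2(1 - r**2)
```

Because only the rank structure of the data enters the estimate, it is insensitive to the marginal distributions of the signals, which is what makes it robust for typically non-Gaussian neuroimaging data.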
Quantifying representational interactions between neuroimaging responses
If two different neuroimaging responses (different spatial/temporal/spectral regions, or different recording modalities) are found to be modulated by a stimulus, a natural question is whether they represent the stimulus in the same way. I believe such questions can be addressed with the information theoretic notions of redundancy (representational overlap, or shared information) and synergy (representation in the interaction), calculated through variants of interaction information (Ince et al. 2017). Redundancy indicates that both responses represent the same information about the stimulus. Synergy indicates that the two responses convey more information together than they do alone; the relationship between them is itself informative.
However, interaction information conflates synergy and redundancy, quantifying only their net effect. The Partial Information Decomposition (PID) has been proposed to properly separate synergy and redundancy, but finding a practical implementation of the theoretical concepts has proved difficult (Ince 2017). I propose a switch of perspective: decomposing entropy first, which reveals the cause of some of the difficulties with the PID and provides a principled and practical alternative (Ince 2017). Currently, the only analysis methods which address these types of questions are Representational Similarity Analysis and the temporal generalization decoding method. I hope that information theoretic approaches can complement these techniques by widening the range of situations in which such questions can be addressed.
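For discrete responses, interaction information can be computed directly from plug-in MI estimates (a minimal sketch, ignoring the bias issues discussed above; sign convention here: positive values indicate net synergy, negative values net redundancy):

```python
import numpy as np
from collections import Counter

def mi_discrete(x, y):
    """Plug-in mutual information (bits) between two discrete sequences."""
    n = len(x)
    pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
    return sum(c / n * np.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def interaction_information(r1, r2, s):
    """II = I(R1,R2; S) - I(R1; S) - I(R2; S).
    Positive -> net synergy; negative -> net redundancy."""
    joint = list(zip(r1, r2))          # treat the pair as one variable
    return mi_discrete(joint, s) - mi_discrete(r1, s) - mi_discrete(r2, s)
```

The XOR relationship is the textbook synergistic case: neither response alone carries any information about the stimulus, but together they determine it completely; two identical copies of the stimulus give the textbook redundant case.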
Information transmission in MEG data
Network-level analyses of neuroimaging data are now well established. However, the connectivity measures used to obtain functional networks are usually agnostic to specific information content: they detect the presence of communication between regions but do not account for the content of that communication (e.g. whether it is stimulus driven or task relevant). We have developed a measure, based on directed information (often called transfer entropy), which quantifies causal communication about a specific stimulus feature (Ince et al. 2015). We hope this content-based functional connectivity measure will allow network analyses of neuroimaging data to focus more directly on information processing functions.
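The underlying quantity can be sketched for discrete signals with a history length of one (a toy illustration only, not our published implementation, which operates on continuous MEG signals and conditions on stimulus content): transfer entropy from a source to a target is the information the source's past carries about the target's present, over and above the target's own past.

```python
import numpy as np
from collections import Counter

def cmi_discrete(x, y, z):
    """Plug-in conditional mutual information I(X; Y | Z), in bits."""
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz, pyz, pz = Counter(zip(x, z)), Counter(zip(y, z)), Counter(z)
    return sum(c / n * np.log2((c * pz[cz]) / (pxz[(cx, cz)] * pyz[(cy, cz)]))
               for (cx, cy, cz), c in pxyz.items())

def transfer_entropy(src, tgt):
    """TE(src -> tgt) with history length 1:
    I(tgt_t ; src_{t-1} | tgt_{t-1})."""
    return cmi_discrete(tgt[1:], src[:-1], tgt[:-1])
```

Conditioning on the target's own past is what distinguishes genuine directed communication from shared history or common input.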
Other methods
I am interested in combining information theoretic approaches with supervised learning or other dimensionality reduction techniques to allow application to high-dimensional response spaces. One approach I believe is particularly promising is the combination of mutual information (MI) and non-negative matrix factorization (NMF). NMF and MI are ideal partners: MI images are non-negative and have a high signal-to-noise ratio, while NMF provides a meaningful parts-based decomposition but does not normally incorporate any task- or response-specific knowledge. I believe combining them provides a simple but flexible approach to task-driven dimensionality reduction.
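As a sketch of the intended pipeline (with synthetic stand-in data, and scikit-learn's NMF in place of any particular implementation): per-observation MI images are stacked as rows of a non-negative matrix, which NMF factorises into a small set of spatial parts and per-observation loadings.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical data: 20 non-negative MI "images" of 100 pixels each,
# generated from 3 underlying spatial parts so the structure is known.
rng = np.random.default_rng(0)
parts = rng.random((3, 100))           # ground-truth spatial parts
weights = rng.random((20, 3))          # per-observation loadings
mi_images = weights @ parts            # non-negative by construction

# NMF recovers a parts-based decomposition of the stacked MI images
model = NMF(n_components=3, init="nndsvda", random_state=0, max_iter=1000)
W = model.fit_transform(mi_images)     # loadings, shape (20, 3)
H = model.components_                  # parts, shape (3, 100)
```

Because the factorisation is applied to MI images rather than raw responses, the recovered parts are already tied to the stimulus feature of interest, which is the sense in which the reduction is task-driven.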
Collaborators
University of Glasgow
External
- Joachim Gross, University of Muenster
- Hyojin Park, University of Birmingham
- Christoph Kayser, Bielefeld University
- Bruno Giordano, CNRS / Aix-Marseille University
- Stefano Panzeri, Italian Institute of Technology, Rovereto
- Rasmus Petersen, University of Manchester
- Michael Bale, University of Sussex
Publications
2018
- T Xu, SH Scholte, RAA Ince, PG Schyns
Using psychophysical methods to understand mechanisms of face identification in a deep neural network
CVPR (2018)
[LINK]
- JW Kay, RAA Ince
Exact partial information decompositions for Gaussian systems based on dependency constraints
Entropy (2018) 20 (4), p. 240
[LINK]
- K Jaworska, F Yi, RAA Ince, NJ van Rijsbergen, PG Schyns, GA Rousselet
Neural processing of the same behaviourally relevant face features is delayed by 40ms in healthy aging
bioRxiv
- J Zhan, RAA Ince, NJ van Rijsbergen, PG Schyns
Dynamic construction of reduced representations in the brain for perceptual decision behavior
bioRxiv
2017
- JW Kay, RAA Ince, B Dering, WA Phillips
Partial and entropic information decompositions of a neuronal modulatory interaction
Entropy (2017) 19 (11) p. 560
[LINK]
- RAA Ince
Measuring multivariate redundant information with pointwise common change in surprisal
Entropy (2017) 19 (7) p. 318
[LINK] [code]
- RAA Ince, BL Giordano, C Kayser, GA Rousselet, J Gross, PG Schyns
A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula
Human Brain Mapping (2017) 38 (3) p. 1541-1573
[LINK] [toolbox] [code]
- RAA Ince
The Partial Entropy Decomposition: Decomposing multivariate entropy and mutual information via pointwise common surprisal
arXiv:1702.01591 [cs.IT] (2017)
[LINK] [code]
- BL Giordano, RAA Ince, J Gross, S Panzeri, PG Schyns, C Kayser
Contributions of local speech encoding and functional connectivity to audio-visual speech integration
eLife (2017) 6
[LINK]
- A Keitel, RAA Ince, J Gross, C Kayser
Auditory cortical delta-entrainment interacts with oscillatory power in multiple fronto-parietal networks
NeuroImage (2017) 147 p. 32-42
[LINK]
- H Park, RAA Ince, PG Schyns, G Thut, J Gross
Entrained audiovisual speech integration implemented by two independent computational mechanisms: Redundancy in left posterior superior temporal gyrus and Synergy in left motor cortex
bioRxiv
2016
- RAA Ince, K Jaworska, J Gross, S Panzeri, NJ van Rijsbergen, GA Rousselet, PG Schyns
The deceptively simple N170 reflects network information processing mechanisms involving visual feature coding and transfer across hemispheres
Cerebral Cortex (2016) 26(11) p. 4123-4135
[LINK]
2015
- RAA Ince, NJ van Rijsbergen, G Thut, GA Rousselet, J Gross, S Panzeri and PG Schyns
Tracing the flow of perceptual features in an algorithmic brain network
Scientific Reports (2015) 5 p. 17681
[ LINK (Open Access) ]
- SJ Kayser, RAA Ince, J Gross and C Kayser
Irregular speech rate dissociates auditory cortical entrainment, evoked responses, and frontal alpha
Journal of Neuroscience (2015) 35(44) p. 14691-14701
[ LINK (Open Access) ]
- MR Bale*, RAA Ince*, G Santagata and RS Petersen
Efficient population coding of naturalistic whisker motion in the ventro-posterior medial thalamus based on precise spike timing
Frontiers in Neural Circuits (2015) 9(50)
[ LINK (Open Access) ]
- H Park, RAA Ince, PG Schyns, G Thut and J Gross
Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners
Current Biology (2015) 25 p. 1649-1653
[ LINK (Open Access) ]
2014
- GA Rousselet, RAA Ince, NJ van Rijsbergen and PG Schyns
Eye coding mechanisms in early human face event-related potentials
Journal of Vision (2014) 14(13);7 p. 1-24
[ LINK (Open Access) ]
- RAA Ince, S Panzeri and SR Schultz
Summary of information theoretic quantities
Encyclopedia of Computational Neuroscience (2014)
[ LINK (arXiv) ]
- RAA Ince, SR Schultz and S Panzeri
Estimating information theoretic quantities
Encyclopedia of Computational Neuroscience (2014)
[ LINK (arXiv) ]
- SR Schultz, RAA Ince and S Panzeri
Applications of information theory to analysis of neural data
Encyclopedia of Computational Neuroscience (2014)
[ LINK (arXiv) ]
- S Panzeri, RAA Ince, ME Diamond and C Kayser
Reading spike timing without a clock: intrinsic decoding of spike trains
Phil. Trans. B (2014) 369 20120467
[ LINK (Open Access) ]
2013
- RAA Ince, S Panzeri and C Kayser
Neural codes formed by small and temporally precise populations in auditory cortex
Journal of Neuroscience (2013) 33(46) p. 18277-18287
[ LINK (Subscription Required) | PDF ]
- MR Bale*, K Davies*, OJ Freeman, RAA Ince and RS Petersen
Low-dimensional sensory feature representation by trigeminal primary afferents
Journal of Neuroscience (2013) 33(29) p. 12003-12012
[ LINK (Open Access) ]
2012
- RAA Ince
Open-source software for studying neural codes
in S Panzeri and R Quian Quiroga (Eds) Principles of Neural Coding, CRC Press (in press)
[ LINK (Amazon) ]
- C Kayser, RAA Ince and S Panzeri
Analysis of slow (theta) oscillations as a potential temporal reference frame for information coding in sensory cortex
PLoS Computational Biology (2012) 8(10) e1002717
[ LINK (Open Access) ]
- RAA Ince*, A Mazzoni*, A Bartels, NK Logothetis and S Panzeri
A novel test to determine the significance of neural selectivity to single and multiple potentially correlated features
Journal of Neuroscience Methods (2012) 210:1 p. 49-65
[ LINK (Subscription Required) | PDF ]
2011
- S Panzeri and RAA Ince (2011)
Information theoretic approaches to the analysis of neural population recordings
in N Kriegeskorte and G Kreiman (Eds) Visual population codes: toward a common multivariate framework for cell recording and functional imaging, MIT Press
[ LINK (amazon) ]
2010
- RAA Ince, R Senatore, E Arabzadeh, F Montani, ME Diamond and S Panzeri
Information theoretic methods for studying population codes
Neural Networks (2010) 23:6 p. 713-727
[ LINK (Subscription Required) | PDF ]
- RAA Ince, A Mazzoni, R Petersen and S Panzeri
Open source tools for the information theoretic analysis of neural data
Frontiers in Neuroscience (2010) 4:1 p. 62-70
[ LINK (Open Access) ]
2009
- RAA Ince, F Montani, E Arabzadeh, ME Diamond and S Panzeri
On the presence of high-order interactions in somatosensory cortex and their effect on information transmission
Journal of Physics: Conference Series (2009) 197 012013 (1pp)
[ LINK (Open Access) ]
- RAA Ince, C Bartolozzi and S Panzeri
An information theoretic library for analysis of neural codes
The Neuromorphic Engineer (2009) doi: 10.2417/1200906.1663
[ LINK (Open Access) ]
- F Montani*, RAA Ince*, R Senatore, E Arabzadeh, ME Diamond and S Panzeri
The impact of high-order interactions on the rate of synchronous discharge and information transmission in somatosensory cortex
Philosophical Transactions of the Royal Society A (2009) 367:1901 p. 3297-3310
[ LINK (Subscription Required) | PDF ]
- RAA Ince, RS Petersen, DC Swan and S Panzeri
Python for information theoretic analysis of neural data
Frontiers in Neuroinformatics (2009) 3:4
[ LINK (Open Access) ]
* - These authors contributed equally to this work.