Connectivity in the human brain dissociates entropy and complexity of auditory inputs

Abstract

Complex systems are described along two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity: entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and the complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner, consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators.

Introduction

Theoretical and experimental work in the fields of psychology and complexity science has arrived at two separate approaches for describing how stimuli may be encoded and what constitutes a complex stimulus. On the first view, the complexity of a stimulus reduces to its uncertainty, or entropy. On the other hand, the second, more recent view (e.g., Crutchfield, 2012) holds that simplicity/complexity depends on how demanding it is to model the underlying system that generated a particular stimulus or signal via the interactions of its states. From this perspective, there is a convex, inverted U-shaped relation between disorder and complexity: highly ordered and highly disordered signals are typically generated by succinct, easily describable systems, whereas more sophisticated, or complex, systems generally produce outputs with intermediate levels of entropy.¹ Note that in this latter approach, complexity does not capture how difficult it is to veridically encode or reproduce any specific stimulus or signal, but rather how computationally demanding it is to model the system or source generating that signal. As can be appreciated, the two views described above are independent, and graphs depicting …

¹ For instance, ABCDABCD can be thought of as generated by a system (e.g., a transition matrix) that transitions between four states deterministically (a simple explanation), while a random stimulus can be characterized by a system where all state transitions are equally likely (a similarly simple explanation).
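To make footnote 1 concrete, the following sketch (not from the original study; the entropy measure, sequence lengths, and function names are illustrative assumptions) estimates the first-order conditional Shannon entropy of a periodic ABCDABCD… sequence and of a uniformly random sequence over the same alphabet:

```python
from collections import Counter
from math import log2
import random

def conditional_entropy(seq):
    """Entropy (bits/symbol) of the next symbol given the current one,
    estimated from first-order transition counts in `seq`."""
    pair_counts = Counter(zip(seq, seq[1:]))   # counts of (current, next) pairs
    state_counts = Counter(seq[:-1])           # counts of "current" symbols
    n_pairs = len(seq) - 1
    h = 0.0
    for (a, b), n in pair_counts.items():
        p_ab = n / n_pairs                     # joint probability p(a, b)
        p_b_given_a = n / state_counts[a]      # conditional probability p(b | a)
        h -= p_ab * log2(p_b_given_a)
    return h

periodic = "ABCDABCD" * 25                                      # deterministic 4-state cycle
scrambled = "".join(random.choice("ABCD") for _ in range(200))  # uniform random transitions

print(f"periodic  : {conditional_entropy(periodic):.2f} bits/symbol")   # ~0.00
print(f"scrambled : {conditional_entropy(scrambled):.2f} bits/symbol")  # ~2.00
```

Both sequences can be generated by equally succinct 4 × 4 transition matrices (one deterministic, one uniform), yet their output entropies sit at opposite extremes; on the second view, neither generator is complex, which is why entropy alone cannot stand in for complexity.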
