Entropy and Energy in Characterizing the Organization of Concept Maps in Learning Science
The coherence and connectivity of such knowledge representations are known to be closely related to knowledge production, acquisition, and processing. In this study we use network theory to make the clustering and cohesion of concept maps measurable, and show how the distribution of these properties can be interpreted through the Maximum Entropy (MaxEnt) method. This approach allows us to introduce the new concepts of the "energy of cognitive load" and the "entropy of knowledge organization" to describe the organization of knowledge in the concept maps.
Peer reviewed
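As an illustrative sketch of the kind of network measures the abstract refers to (the paper's actual metrics are not specified here), the clustering and a degree-distribution entropy of a toy concept-map graph can be computed with the standard library alone; all node names and edges below are invented for illustration:

```python
import math
from itertools import combinations

# Hypothetical toy "concept map": nodes are concepts, edges are links
# drawn between them (names and links invented for illustration only).
edges = {
    frozenset(("energy", "entropy")), frozenset(("entropy", "information")),
    frozenset(("energy", "work")), frozenset(("work", "heat")),
    frozenset(("heat", "entropy")), frozenset(("energy", "heat")),
}
nodes = {v for e in edges for v in e}
neigh = {v: {u for e in edges if v in e for u in e if u != v} for v in nodes}

def local_clustering(v):
    """Fraction of a node's neighbor pairs that are themselves linked."""
    nb = neigh[v]
    if len(nb) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nb, 2) if frozenset((a, b)) in edges)
    return links / math.comb(len(nb), 2)

# Average clustering: one standard measure of local cohesion.
avg_clustering = sum(local_clustering(v) for v in nodes) / len(nodes)

# Shannon entropy of the degree distribution: one crude proxy for how
# evenly the map's connectivity is organized across concepts.
degrees = [len(neigh[v]) for v in nodes]
total = sum(degrees)
deg_entropy = -sum((d / total) * math.log2(d / total) for d in degrees if d)

print(f"avg clustering = {avg_clustering:.3f}, degree entropy = {deg_entropy:.3f}")
```

A denser, more interlinked map raises the clustering term, while a map whose links concentrate on a few hub concepts lowers the degree entropy; how the paper combines such quantities into its "energy" and "entropy" notions is not reproduced here.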
Infotropism as the underlying principle of perceptual organization
Whether perceptual organization favors the simplest or the most likely interpretation of a distal stimulus has long been debated. An unbridgeable gulf has seemed to separate these two viewpoints, the Gestalt and the Helmholtzian. But in recent decades, the proposal that likelihood and simplicity are two sides of the same coin has been gaining ground, to the extent that their equivalence is now widely assumed. What then arises is a desire to know whether the two principles can be reduced to one. Applying Occam's razor in this way is particularly desirable given that, as things stand, an account referencing one principle alone cannot be completely satisfactory. The present paper argues that unification of the two principles is possible, and that it can be achieved in terms of an incremental notion of "information seeking" (infotropism). Perceptual processing that is infotropic can be shown to target both simplicity and likelihood. The ability to see perceptual organization as governed by either objective can then be explained in terms of its being an infotropic process. Infotropism can be identified as the principle which underlies, and thus generalizes, the principles of likelihood and simplicity.
Toward a multilevel representation of protein molecules: comparative approaches to the aggregation/folding propensity problem
This paper builds upon the fundamental work of Niwa et al. [34], which
provides the unique possibility to analyze the relative aggregation/folding
propensity of the elements of the entire Escherichia coli (E. coli) proteome in
a cell-free standardized microenvironment. The difficulty of the problem comes
from the superposition of the driving forces of intra- and inter-molecular
interactions, and it is mirrored by the evidence that single-point mutations
can shift a protein from a folding to an aggregation phenotype [10]. Here we
apply several state-of-the-art classification methods from the field of
structural pattern recognition, with the aim of comparing different
representations of the same proteins gathered from the Niwa et al. database;
such representations include sequences and labeled (contact) graphs enriched
with chemico-physical attributes. Through this comparison, we are also able to
identify some interesting general properties of proteins. Notably, (i) we
suggest a threshold of around 250 residues discriminating "easily foldable"
from "hardly foldable" molecules, consistent with other independent
experiments, and (ii) we highlight the relevance of contact graph spectra for
discriminating folding behavior and characterizing the E. coli solubility
data. The soundness of the experimental results presented in this paper is
supported by the statistically significant relationships discovered between
the chemico-physical description of proteins and the cost matrix of
substitution developed for the various discrimination systems.
Comment: 17 pages, 3 figures, 46 references
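To make the idea of contact-graph spectra concrete, here is a minimal sketch. The adjacency matrix below is an invented 6-residue contact graph (not taken from the Niwa et al. data), and the eigenvalues of its graph Laplacian serve as a fixed-length feature vector that any standard classifier could consume:

```python
import numpy as np

# Hypothetical 6-residue contact graph: A[i][j] = 1 if residues i and j
# are within some distance cutoff. Contacts are invented for illustration.
A = np.array([
    [0, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0],
    [0, 1, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 0],
    [1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 0],
], dtype=float)

# Graph Laplacian L = D - A; its spectrum summarizes global topology
# (connectivity, bottlenecks) independently of node ordering.
L = np.diag(A.sum(axis=1)) - A
eigvals = np.linalg.eigvalsh(L)  # ascending order; 0 for a connected graph

# A fixed-length spectral descriptor, e.g. the k smallest nontrivial
# eigenvalues, can be fed to an off-the-shelf discriminator.
k = 3
features = eigvals[1:1 + k]
print(np.round(eigvals, 3))
```

Because the spectrum is invariant under relabeling of residues, graphs of different proteins can be compared through such descriptors without any alignment; the specific classifiers and enriched attributes used in the paper are not reproduced here.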
Data-Driven Shape Analysis and Processing
Data-driven methods play an increasingly important role in discovering
geometric, structural, and semantic relationships between 3D shapes in
collections, and applying this analysis to support intelligent modeling,
editing, and visualization of geometric data. In contrast to traditional
approaches, a key feature of data-driven approaches is that they aggregate
information from a collection of shapes to improve the analysis and processing
of individual shapes. In addition, they are able to learn models that reason
about properties and relationships of shapes without relying on hard-coded
rules or explicitly programmed instructions. We provide an overview of the main
concepts and components of these techniques, and discuss their application to
shape classification, segmentation, matching, reconstruction, modeling and
exploration, as well as scene analysis and synthesis, through reviewing the
literature and relating the existing works with both qualitative and numerical
comparisons. We conclude our report with ideas that can inspire future research
in data-driven shape analysis and processing.
Comment: 10 pages, 19 figures
The Origins of Computational Mechanics: A Brief Intellectual History and Several Clarifications
The principal goal of computational mechanics is to define pattern and
structure so that the organization of complex systems can be detected and
quantified. Computational mechanics developed from efforts in the 1970s and
early 1980s to identify strange attractors as the mechanism driving weak fluid
turbulence, via the method of reconstructing attractor geometry from
measurement time series, and from efforts in the mid-1980s to estimate
equations of motion directly from complex time series. In providing a
mathematical and operational definition of structure, it addressed weaknesses
of these early approaches to discovering patterns in natural systems.
Since then, computational mechanics has led to a range of results, from
theoretical physics and nonlinear mathematics to diverse applications: from
closed-form analysis of Markov and non-Markov stochastic processes, ergodic
or nonergodic, and their measures of information and intrinsic computation;
to complex materials, deterministic chaos, and intelligence in Maxwellian
demons; to quantum compression of classical processes and the evolution of
computation and language.
This brief review clarifies several misunderstandings and addresses concerns
recently raised regarding early works in the field (1980s). We show that
misguided evaluations of the contributions of computational mechanics are
groundless and stem from a lack of familiarity with its basic goals and from a
failure to consider its historical context. For all practical purposes, its
modern methods and results largely supersede the early works. This not only
renders recent criticism moot and shows the solid ground on which computational
mechanics stands but, most importantly, shows the significant progress achieved
over three decades and points to the many intriguing and outstanding challenges
in understanding the computational nature of complex dynamic systems.
Comment: 11 pages, 123 citations;
http://csc.ucdavis.edu/~cmg/compmech/pubs/cmr.ht