6,895 research outputs found
Emerging Consciousness as a Result of Complex-Dynamical Interaction Process
A quite general interaction process within a multi-component system is analysed by the extended effective potential method, liberated from the usual limitations of perturbation theory or integrable models. The obtained causally complete solution of the many-body problem reveals the phenomenon of dynamic multivaluedness, or redundance, of emerging, incompatible system realisations, and the dynamic entanglement of system components within each realisation. The ensuing concept of dynamic complexity (and related intrinsic chaoticity) is absolutely universal and can be applied to the problem of consciousness, which emerges as a sufficiently high, properly specified level of unreduced complexity of a suitable interaction process. This complexity level can be identified with the appearance of bound, permanently localised states in the multivalued brain dynamics, emerging from strongly chaotic states of unconscious intelligence, by analogy with the emergence of classical behaviour from quantum states at much lower levels of world dynamics. We show that the main properties of this dynamically emerging consciousness (and of intelligence, at the preceding complexity level) correspond to the empirically derived properties of their natural versions, and we obtain causally substantiated conclusions about their artificial realisation, including a fundamentally justified paradigm of genuine machine consciousness. This rigorously defined machine consciousness differs both from natural consciousness and from any mechanistic, dynamically single-valued imitation of the latter. We then use the same, truly universal concept of complexity to derive equally rigorous conclusions about the mental and social implications of the machine consciousness paradigm, demonstrating its indispensable role in the next stage of civilisation development.
What Makes Complex Systems Complex?
This paper explores some of the factors that make complex systems complex. We first examine the history of complex systems. It was Aristotle’s insight that how elements are joined together helps determine the properties of the resulting whole. We find (a) that scientific reductionism does not provide a sufficient explanation; (b) that to understand complex systems, one must identify and trace energy flows; and (c) that disproportionate causality, including global tipping points, is all around us. Disproportionate causality results from the wide availability of energy stores. We discuss three categories of emergent phenomena—static, dynamic, and adaptive—and recommend retiring the term emergent, except perhaps as a synonym for creative. Finally, we find that virtually all communication is stigmergic.
Causality, Information and Biological Computation: An algorithmic software approach to life, disease and the immune system
Biology has taken strong steps towards becoming a computer science, aiming at reprogramming nature after the realisation that nature herself has reprogrammed organisms by harnessing the power of natural selection and the digital prescriptive nature of replicating DNA. Here we further unpack ideas related to computability, algorithmic information theory and software engineering, in the context of the extent to which biology can be (re)programmed, and of how we may go about doing so in a more systematic way with all the tools and concepts offered by theoretical computer science, in a translation exercise from computing to molecular biology and back. These concepts provide a means to a hierarchical organisation, thereby blurring previously clear-cut lines between concepts like matter and life, or between tumour types that are otherwise taken as different yet may not have a different cause. This does not diminish the properties of life or make its components and functions less interesting. On the contrary, this approach makes for a more encompassing and integrated view of nature, one that subsumes observer and observed within the same system, and can generate new perspectives and tools with which to view complex diseases like cancer, approaching them afresh from a software-engineering viewpoint that casts evolution in the role of programmer, cells as computing machines, DNA and genes as instructions and computer programs, viruses as hacking devices, the immune system as a software debugging tool, and diseases as an information-theoretic battlefield where all these forces deploy. We show how information theory and algorithmic programming may explain fundamental mechanisms of life and death.
Comment: 30 pages, 8 figures. Invited chapter contribution to Information and Causality: From Matter to Life, Sara I. Walker, Paul C.W. Davies and George Ellis (eds.), Cambridge University Press.
The Minimal Modal Interpretation of Quantum Theory
We introduce a realist, unextravagant interpretation of quantum theory that builds on the existing physical structure of the theory and allows experiments to have definite outcomes, but leaves the theory's basic dynamical content essentially intact. Much as classical systems have specific states that evolve along definite trajectories through configuration spaces, the traditional formulation of quantum theory asserts that closed quantum systems have specific states that evolve unitarily along definite trajectories through Hilbert spaces, and our interpretation extends this intuitive picture of states and Hilbert-space trajectories to the case of open quantum systems as well. We provide independent justification for the partial-trace operation for density matrices, reformulate wave-function collapse in terms of an underlying interpolating dynamics, derive the Born rule from deeper principles, resolve several open questions regarding ontological stability and dynamics, address a number of familiar no-go theorems, and argue that our interpretation is ultimately compatible with Lorentz invariance. Along the way, we also investigate a number of unexplored features of quantum theory, including an interesting geometrical structure---which we call subsystem space---that we believe merits further study. We include an appendix that briefly reviews the traditional Copenhagen interpretation and the measurement problem of quantum theory, as well as the instrumentalist approach and a collection of foundational theorems not otherwise discussed in the main text.
Comment: 73 pages + references, 9 figures; cosmetic changes, added figure, updated references, generalized conditional probabilities with attendant changes to the sections on the EPR-Bohm thought experiment and Lorentz invariance; for a concise summary, see the companion letter at arXiv:1405.675
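For context on the partial-trace operation and the Born rule that this abstract refers to, the standard textbook statements (reproduced here as background, not as this paper's derivation of them) are:

\[
\rho_A = \operatorname{Tr}_B\,\rho_{AB}, \qquad P(i) = \operatorname{Tr}\!\left(\rho\,\Pi_i\right),
\]

where \(\rho_{AB}\) is the joint density matrix of system \(A\) and environment \(B\), and \(\Pi_i\) is the projector onto measurement outcome \(i\). The paper's stated contribution is an independent justification of the first operation and a derivation of the second from deeper interpretational principles.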
Sums over geometries and improvements on the mean field approximation
The saddle points of a Lagrangian due to Efetov are analyzed. This Lagrangian was originally proposed as a tool for calculating systematic corrections to the Bethe approximation, a mean-field approximation which is important in statistical mechanics, glasses, coding theory, and combinatorial optimization. Detailed analysis shows that the trivial saddle point generates a sum over geometries reminiscent of dynamically triangulated quantum gravity, which suggests new possibilities to design sums over geometries for the specific purpose of obtaining improved mean field approximations to -dimensional theories. In the case of the Efetov theory, the dominant geometries are locally tree-like, and the sum over geometries diverges in a way that is similar to quantum gravity's divergence when all topologies are included. Expertise from the field of dynamically triangulated quantum gravity about sums over geometries may be able to remedy these defects and fulfill the Efetov theory's original promise. The other saddle points of the Efetov Lagrangian are also analyzed; the Hessian at these points is nonnormal and pseudo-Hermitian, which is unusual for bosonic theories. The standard formula for Gaussian integrals is generalized to nonnormal kernels.
Comment: Accepted for publication in Physical Review D, probably in November 2007. At the reviewer's request, material was added which made the article more assertive, confident, and clear. No changes in substance.
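As background for the final result mentioned in this abstract, the standard Gaussian-integral formula being generalized is (in its textbook form for a symmetric positive-definite matrix \(A\); the paper's extension to nonnormal kernels relaxes this condition):

\[
\int_{\mathbb{R}^n} e^{-\frac{1}{2}\, x^{\mathsf T} A\, x}\; d^n x \;=\; \frac{(2\pi)^{n/2}}{\sqrt{\det A}} .
\]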
Complexity, BioComplexity, the Connectionist Conjecture and Ontology of Complexity
This paper develops and integrates major ideas and concepts on complexity and biocomplexity: the Connectionist Conjecture, a universal ontology of complexity, the irreducible complexity of totality and inherent randomness, the perpetual evolution of information, the emergence of criticality, and the equivalence of symmetry and complexity. This paper introduces the Connectionist Conjecture, which states that the one and only representation of Totality is the connectionist one, i.e. in terms of nodes and edges. This paper also introduces the idea of a Universal Ontology of Complexity and develops concepts in that direction. The paper further develops ideas and concepts on the perpetual evolution of information and on the irreducibility and computability of totality, all in the context of the Connectionist Conjecture. The paper indicates that control and communication are the prime functionals responsible for the symmetry and complexity of complex phenomena. The paper takes the stand that the phenomenon of life (including its evolution) is probably the nearest to what we can describe with the term “complexity”. The paper also assumes that signaling and communication within the living world, and of the living world with its environment, create the connectionist structure of biocomplexity. With life and its evolution as the substrate, the paper develops ideas towards the ontology of complexity, introduces new complexity-theoretic interpretations of fundamental biomolecular parameters, and develops ideas on a methodology to determine the complexity of “true” complex phenomena.