
    A Third Transition in Science?

    Since Newton, classical and quantum physics depend upon the "Newtonian Paradigm". The relevant variables of the system are identified. For example, we identify the position and momentum of classical particles. Laws of motion in differential form connecting the variables are formulated. An example is Newton's three Laws of Motion. The boundary conditions creating the phase space of all possible values of the variables are defined. Then, given any initial condition, the differential equations of motion are integrated to yield an entailed trajectory in the pre-stated phase space. It is fundamental to the Newtonian Paradigm that the set of possibilities that constitute the phase space is always definable and fixed ahead of time. This fails for the diachronic evolution of ever-new adaptations in any biosphere. Living cells achieve Constraint Closure and construct themselves. Thus, living cells, evolving via heritable variation and natural selection, adaptively construct new-in-the-universe possibilities. We can neither define nor deduce the evolving phase space: we can use no mathematics based on Set Theory to do so. We cannot write or solve differential equations for the diachronic evolution of ever-new adaptations in a biosphere. Evolving biospheres are outside the Newtonian Paradigm. There can be no Theory of Everything that entails all that comes to exist. We face a third major transition in science beyond the Pythagorean dream that "All is Number", echoed by Newtonian physics. However, we begin to understand the emergent creativity of an evolving biosphere: emergence is not engineering.
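
    The procedure described above can be made concrete with a toy example. The sketch below, which is ours rather than the authors', integrates the equations of motion of a simple harmonic oscillator with SciPy: the phase space (position q, momentum p), the law of motion, and the boundary conditions are all fixed in advance, and any initial condition entails a unique trajectory. The mass and spring constant are arbitrary illustrative values.

        # Minimal sketch of the "Newtonian Paradigm": a pre-stated phase space (q, p),
        # a law of motion in differential form, and an initial condition that entails
        # a unique trajectory.  Values are arbitrary and purely illustrative.
        from scipy.integrate import solve_ivp

        m, k = 1.0, 1.0  # hypothetical mass and spring constant

        def hamiltonian_flow(t, state):
            q, p = state                 # the pre-stated phase-space variables
            return [p / m, -k * q]       # dq/dt = p/m, dp/dt = -k*q

        # integrate from the initial condition (q, p) = (1, 0) over t in [0, 10]
        sol = solve_ivp(hamiltonian_flow, (0.0, 10.0), [1.0, 0.0])
        print(sol.y[:, -1])              # the entailed end point of the trajectory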

    What is consciousness? Artificial intelligence, real intelligence, quantum mind and qualia

    We approach the question ‘What is consciousness?’ in a new way, not as Descartes’ ‘systematic doubt’, but as how organisms find their way in their world. Finding one’s way involves finding possible uses of features of the world that might be beneficial or avoiding those that might be harmful. ‘Possible uses of X to accomplish Y’ are ‘affordances’. The number of uses of X is indefinite (or unknown); the different uses are unordered, not listable, and not deducible from one another. All biological adaptations are either affordances seized by heritable variation and selection or, far faster, affordances seized by the organism acting in its world and finding uses of X to accomplish Y. Based on this, we reach rather astonishing conclusions: 1. Artificial general intelligence based on universal Turing machines (UTMs) is not possible, since UTMs cannot ‘find’ novel affordances. 2. Brain-mind is not purely classical physics, for no classical-physics system can be an analogue computer whose dynamical behaviour is isomorphic to ‘possible uses’. 3. Brain-mind must be partly quantum, supported by increasing evidence at 6.0 to 7.3 sigma. 4. Based on Heisenberg’s interpretation of the quantum state as ‘potentia’ converted to ‘actuals’ by measurement, where this interpretation is not a substance dualism, a natural hypothesis is that mind actualizes potentia. This is supported at 5.2 sigma. Then mind’s actualizations of entangled brain-mind-world states are experienced as qualia and allow ‘seeing’ or ‘perceiving’ of uses of X to accomplish Y. We can and do jury-rig. Computers cannot. 5. Beyond familiar quantum computers, we discuss the potentialities of trans-Turing systems.

    Beyond the Newtonian Paradigm: A Statistical Mechanics of Emergence

    Since Newton, all classical and quantum physics depends upon the "Newtonian Paradigm". Here the relevant variables of the system are identified. The boundary conditions creating the phase space of all possible values of the variables are defined. Then, given any initial condition, the differential equations of motion are integrated to yield an entailed trajectory in the phase space. It is fundamental to the Newtonian Paradigm that the set of possibilities that constitute the phase space is always definable and fixed ahead of time. All of this fails for the diachronic evolution of ever-new adaptations in any biosphere. The central reason is that living cells achieve Constraint Closure and construct themselves. Living cells, evolving via heritable variation and natural selection, adaptively construct new-in-the-universe possibilities. The new possibilities are opportunities for new adaptations thereafter seized by heritable variation and natural selection. Surprisingly, we can neither define nor deduce the evolving phase spaces ahead of time. We can use no mathematics based on Set Theory to do so. These ever-new adaptations with ever-new relevant variables constitute the ever-changing phase space of evolving biospheres. Because of this, evolving biospheres are entirely outside the Newtonian Paradigm. One consequence is that for any universe such as ours there can be no Final Theory that entails all that comes to exist. The implications are large. We face a third major transition in science beyond the Pythagorean dream that "All is Number". We must give up deducing the diachronic evolution of the biosphere. All of physics, classical and quantum, however, applies to the analysis of existing life, a synchronic analysis. We begin to better understand the emergent creativity of an evolving biosphere. Thus, we are on the edge of inventing a physics-like new statistical mechanics of emergence.

    The world is not a theorem

    The evolution of the biosphere unfolds as a luxuriant generative process of new living forms and functions. Organisms adapt to their environment and exploit the novel opportunities created by this continuously blooming dynamics. Affordances play a fundamental role in the evolution of the biosphere, for organisms can exploit them for new morphological and behavioral adaptations achieved by heritable variation and selection. In this way, the opportunities offered by affordances are actualized as ever-novel adaptations. In this paper we maintain that affordances elude a formalization that relies on set theory: we argue that it is not possible to apply set theory to affordances, and therefore we cannot devise a set-based mathematical theory of the diachronic evolution of the biosphere.

    Emergence of Organisms.

    Since the early cybernetics studies by Wiener, Pask, and Ashby, the properties of living systems have been the subject of deep investigation. The goals of this endeavour are both understanding and building: abstract models and general principles are sought for describing organisms, their dynamics, and their ability to produce adaptive behavior. This research has achieved prominent results in fields such as artificial intelligence and artificial life. For example, today we have robots capable of exploring hostile environments with a high level of self-sufficiency, with planning capabilities, and with the ability to learn. Nevertheless, the discrepancy between the emergence and evolution of life and that of artificial systems is still huge. In this paper, we identify the fundamental elements that characterize the evolution of the biosphere and open-ended evolution, and we illustrate their implications for the evolution of artificial systems. Subsequently, we discuss the most relevant issues and questions that this viewpoint poses for both biological and artificial systems.

    Evolving always‐critical networks

    Living beings share several common features at the molecular level, but there are very few large‐scale “operating principles” which hold for all (or almost all) organisms. However, biology is subject to a deluge of data, and general concepts of this kind would therefore be extremely valuable. One interesting candidate is the “criticality” principle, which claims that biological evolution favors those dynamical regimes that are intermediate between ordered and disordered states (i.e., “at the edge of chaos”). The reasons why this should be the case, and the experimental evidence, are briefly discussed, observing that gene regulatory networks are indeed often found on, or close to, the critical boundary. Therefore, assuming that criticality provides an edge, it is important to ascertain whether systems that are critical can further evolve while remaining critical. In order to explore the possibility of achieving such “always‐critical” evolution, we resort to simulated evolution, suitably modifying a genetic algorithm in such a way that the newly‐generated individuals are constrained to be critical. It is then shown that these modified genetic algorithms can indeed develop critical gene regulatory networks with two interesting (and quite different) features of biological significance, involving, in one case, the average gene activation values and, in the other, the response to perturbations. These two cases suggest that it is often possible to evolve networks with interesting properties without losing the advantages of criticality. The evolved networks also show some interesting features, which are discussed.
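
    To make the “always‐critical” constraint concrete, the following sketch (ours, not the authors' code) evolves random Boolean networks with a genetic algorithm that rejects any newly generated individual that is not critical. Criticality is estimated with the standard sensitivity formula s = 2p(1-p)K for networks with in-degree K and truth-table bias p, taking the network as critical when s is close to 1; the fitness function, population sizes, and mutation operator are illustrative stand-ins.

        # Sketch of an "always-critical" genetic algorithm over random Boolean networks.
        # Offspring are rejected unless their estimated sensitivity s = 2*p*(1-p)*K is
        # close to 1.  Fitness is a stand-in: distance of the average gene activation
        # from a hypothetical target value.  All parameters are illustrative.
        import random

        N, K = 20, 2                 # genes and in-degree (K = 2, p = 0.5 is the critical line)
        TARGET, EPS = 0.35, 0.05

        def new_net():
            inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]
            tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
            return inputs, tables

        def bias(net):
            bits = [b for table in net[1] for b in table]
            return sum(bits) / len(bits)

        def is_critical(net):
            p = bias(net)
            return abs(2 * p * (1 - p) * K - 1.0) < EPS

        def avg_activation(net, steps=50):
            inputs, tables = net
            state = [random.randint(0, 1) for _ in range(N)]
            total = 0
            for _ in range(steps):       # synchronous Boolean network update
                state = [tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
                         for i in range(N)]
                total += sum(state)
            return total / (steps * N)

        def fitness(net):                # placeholder objective
            return -abs(avg_activation(net) - TARGET)

        def mutate(net):                 # flip one output bit of one truth table
            inputs = [row[:] for row in net[0]]
            tables = [t[:] for t in net[1]]
            tables[random.randrange(N)][random.randrange(2 ** K)] ^= 1
            return inputs, tables

        pop = [net for net in (new_net() for _ in range(200)) if is_critical(net)][:20]
        for _ in range(200):
            child = mutate(max(random.sample(pop, 3), key=fitness))   # tournament + mutation
            if is_critical(child):                                    # always-critical constraint
                pop[pop.index(min(pop, key=fitness))] = child
        print("best fitness:", max(fitness(net) for net in pop))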

    On the Criticality of Adaptive Boolean Network Robots

    Systems poised at a critical dynamical regime, between order and disorder, have been shown to be capable of exhibiting complex dynamics that balance robustness to external perturbations with rich repertoires of responses to inputs. This property has been exploited in artificial network classifiers, and preliminary results have also been attained in the context of robots controlled by Boolean networks. In this work, we investigate the role of dynamical criticality in robots undergoing online adaptation, i.e., robots that adapt some of their internal parameters during their activity to improve a performance metric over time. We study the behavior of robots controlled by random Boolean networks, which are adapted either in their coupling with robot sensors and actuators, in their structure, or in both. We observe that robots controlled by critical random Boolean networks attain higher average and maximum performance than robots controlled by ordered or disordered nets. Notably, adaptation by changing the couplings generally produces robots with slightly higher performance than adaptation by changing the structure. Moreover, we observe that, when adapted in their structure, ordered networks tend to move towards the critical dynamical regime. These results provide further support to the conjecture that critical regimes favor adaptation and indicate the advantage of calibrating robot control systems at dynamically critical states.
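
    A minimal sketch of the coupling-adaptation mechanism is given below. It is not the authors' code: the robot, its sensors, and the performance measure are reduced to a stub, and the adaptation loop simply proposes a random rewiring of the sensor or actuator couplings and keeps it only if the measured performance does not decrease.

        # Sketch of online adaptation of a Boolean-network robot controller by rewiring
        # its couplings: which network nodes receive sensor readings and which nodes
        # drive the actuators.  The evaluation function is a hypothetical stub.
        import random

        N_NODES, N_SENSORS, N_ACTUATORS = 20, 4, 2

        def evaluate(input_nodes, output_nodes, trial_steps=100):
            """Stub: run the robot for one trial and return a performance score.
            In the real setting this would drive the robot or a simulator."""
            return random.random()   # placeholder score

        # start from a random coupling between the network and the robot's I/O
        input_nodes = random.sample(range(N_NODES), N_SENSORS)     # nodes overwritten by sensors
        output_nodes = random.sample(range(N_NODES), N_ACTUATORS)  # nodes read by actuators
        best_score = evaluate(input_nodes, output_nodes)

        for trial in range(200):                       # online adaptation loop
            cand_in, cand_out = input_nodes[:], output_nodes[:]
            if random.random() < 0.5:                  # rewire one sensor coupling...
                cand_in[random.randrange(N_SENSORS)] = random.randrange(N_NODES)
            else:                                      # ...or one actuator coupling
                cand_out[random.randrange(N_ACTUATORS)] = random.randrange(N_NODES)
            score = evaluate(cand_in, cand_out)
            if score >= best_score:                    # keep the rewiring if it does not hurt
                input_nodes, output_nodes, best_score = cand_in, cand_out, score

        print("adapted couplings:", input_nodes, output_nodes, "score:", best_score)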

    The Model of Perceived Organizational Support and Employee Involvement with Organizational Identification (OI) as a Mediating Variable (A Study at PT. Jasa Raharja (Persero), Branch of Aceh Province, Indonesia)

    This study examined the impact of perceived organizational support (POS) on employee job involvement (JI), with organizational identification (OI) as a mediating variable. The sample comprised 98 employees of PT. Jasa Raharja (Persero), Branch of Aceh, selected by saturated sampling (census). Primary data were obtained by distributing questionnaires to the entire sample. Data were analyzed using Structural Equation Modeling (SEM) with the Analysis of Moment Structures (AMOS) software. The results showed no direct influence of perceived organizational support on employee job involvement; instead, organizational identification acts as a mediating variable between perceived organizational support and job involvement. The analysis also found that organizational identification plays a full (perfect) mediating role. Keywords: Organizational Support, Job Involvement, Organizational Identification, Mediating Role
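
    The mediation logic of the study can be illustrated with a small example. The study itself fits a structural equation model in AMOS, which is not reproduced here; the sketch below runs a simple regression-based mediation check (in the style of Baron and Kenny) on synthetic data, with POS, OI, and JI as hypothetical scale scores and effect sizes chosen arbitrarily to mimic full mediation.

        # Illustrative regression-based mediation check on synthetic data (not the
        # study's SEM/AMOS analysis).  Full mediation corresponds to a significant
        # indirect path a*b with a near-zero direct path c'.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 98                                    # sample size reported in the study
        pos = rng.normal(size=n)                  # synthetic predictor (POS)
        oi = 0.6 * pos + rng.normal(size=n)       # synthetic mediator (OI) driven by POS
        ji = 0.5 * oi + rng.normal(size=n)        # synthetic outcome (JI) driven only by OI

        a = sm.OLS(oi, sm.add_constant(pos)).fit()                               # path a: POS -> OI
        b = sm.OLS(ji, sm.add_constant(np.column_stack([oi, pos]))).fit()        # paths b and c'
        c = sm.OLS(ji, sm.add_constant(pos)).fit()                               # path c: total effect

        print("a (POS->OI):", round(a.params[1], 3))
        print("b (OI->JI):", round(b.params[1], 3), " c' (direct POS->JI):", round(b.params[2], 3))
        print("c (total POS->JI):", round(c.params[1], 3))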

    ImageNet-Patch: A dataset for benchmarking machine learning robustness against adversarial patches

    Adversarial patches are optimized contiguous pixel blocks in an input image that cause a machine-learning model to misclassify it. However, their optimization is computationally demanding and requires careful hyperparameter tuning, potentially leading to suboptimal robustness evaluations. To overcome these issues, we propose ImageNet-Patch, a dataset to benchmark machine-learning models against adversarial patches. The dataset is built by first optimizing a set of adversarial patches against an ensemble of models, using a state-of-the-art attack that creates transferable patches. The corresponding patches are then randomly rotated and translated, and finally applied to the ImageNet data. We use ImageNet-Patch to benchmark the robustness of 127 models against patch attacks, and also validate the effectiveness of the given patches in the physical domain (i.e., by printing and applying them to real-world objects). We conclude by discussing how our dataset could be used as a benchmark for robustness, and how our methodology can be generalized to other domains. We open-source our dataset and evaluation code at https://github.com/pralab/ImageNet-Patch.
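
    The patch-application step can be sketched as follows. This is not the code from the linked repository: the patch and image are hypothetical tensors, the rotation range is an arbitrary choice, and the rotated patch is pasted without masking its corners, which is a simplification of what a real patch pipeline would do.

        # Rough sketch of applying a pre-optimized adversarial patch with a random
        # rotation and translation to an image.  `patch` and `image` are hypothetical
        # CHW tensors in [0, 1]; the real pipeline would also mask the rotated corners.
        import torch
        import torchvision.transforms.functional as TF

        def apply_patch(image, patch, max_rotation=15.0):
            """Paste a randomly rotated patch at a random location of a 3xHxW image."""
            angle = (torch.rand(1).item() * 2 - 1) * max_rotation
            rotated = TF.rotate(patch, angle)                  # random rotation of the patch
            _, ph, pw = rotated.shape
            _, h, w = image.shape
            top = torch.randint(0, h - ph + 1, (1,)).item()    # random translation
            left = torch.randint(0, w - pw + 1, (1,)).item()
            out = image.clone()
            out[:, top:top + ph, left:left + pw] = rotated     # overwrite pixels with the patch
            return out

        # usage with dummy data
        image = torch.rand(3, 224, 224)
        patch = torch.rand(3, 50, 50)
        adv_image = apply_patch(image, patch)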