
    Categorical Ontology of Complex Systems, Meta-Systems and Theory of Levels: The Emergence of Life, Human Consciousness and Society

    Single cell interactomics in simpler organisms, as well as somatic cell interactomics in multicellular organisms, involve biomolecular interactions in complex signalling pathways that were recently represented in modular terms by quantum automata with ‘reversible behavior’ representing normal cell cycling and division. Further implications of such quantum automata and of the modular modeling of signaling pathways and cell differentiation during development lie in the fields of neural plasticity and brain development, leading to quantum-weave dynamic patterns and specific molecular processes underlying extensive memory, learning and anticipation mechanisms, and the emergence of human consciousness during early brain development in children. Cell interactomics is here represented, for the first time, as a mixture of ‘classical’ states that determine molecular dynamics subject to Boltzmann statistics and ‘steady-state’, metabolic (multi-stable) manifolds, together with ‘configuration’ spaces of metastable quantum states emerging from the complex quantum dynamics of interacting networks of biomolecules, such as proteins and nucleic acids, which are here collectively defined as quantum interactomics. On the other hand, the time-dependent evolution over several generations of cancer cells (which are generally known to undergo frequent and extensive genetic mutations and, indeed, to suffer genomic transformations at the chromosome level, such as the extensive chromosomal aberrations found in many colon cancers) cannot be correctly represented in the ‘standard’ terms of quantum automaton modules, as normal somatic cells can. This significant difference at the cancer cell genomic level is reflected in major changes in cancer cell interactomics, often from one cancer cell ‘cycle’ to the next; it therefore requires substantial changes in the modeling strategies, mathematical tools and experimental designs aimed at understanding cancer mechanisms.
Novel solutions to this important problem in carcinogenesis are proposed and experimental validation procedures are suggested. From a medical research and clinical standpoint, this approach has important consequences for addressing and preventing the development of cancer resistance to medical therapy in ongoing clinical trials involving stage III cancer patients, as well as for improving the designs of future clinical trials for cancer treatments.

    KEYWORDS: Emergence of Life and Human Consciousness; Proteomics; Artificial Intelligence; Complex Systems Dynamics; Quantum Automata models and Quantum Interactomics; quantum-weave dynamic patterns underlying human consciousness; specific molecular processes underlying extensive memory, learning, anticipation mechanisms and human consciousness; emergence of human consciousness during early brain development in children; cancer cell ‘cycling’; interacting networks of proteins and nucleic acids; genetic mutations and chromosomal aberrations in cancers, such as colon cancer; development of cancer resistance to therapy; ongoing clinical trials involving stage III cancer patients; possible improvements of the designs for future clinical trials and cancer treatments.

    Spatial cognitive processes involved in electronic circuit interpretation and translation: their use as powerful pedagogical tools within an education scenario

    While there is much research concerning the interpretation of diagrams such as geographical maps and networks for information systems, there is very little on the diagrams used in electrical and electronic engineering. Such research is important not only because it supports arguments made for other types of diagrams but also because it sheds light on the cognitive processes involved in learning electrical and electronic engineering, domains that are generally considered difficult to teach and learn. Such insight is useful as a pedagogical tool for teachers. It might also benefit would-be self-learners, entrepreneurs, and hobbyists in the field because it can guide self-learning practices. As the cognitive practices specific to this knowledge domain become better understood, they might give rise to automated intelligent tutoring systems that could augment teaching and learning practices in the education of electrical and electronic engineering. This research analyses the spatial cognitive processes involved in the translation of an electronic circuit schematic diagram into an iconic representation of the same circuit. The work shows that the cognitive affordances of proximity and paths perceived in a circuit schematic diagram strongly influence the design of an iconic diagram, or assembly diagram, representing a topologically equivalent electronic circuit. Such cognitive affordances reflect and affect thought and can be used as powerful pedagogical tools within an educational scenario.

    A theory of structural model validity in simulation.

    During the last decades, the practice of simulation has become increasingly popular among system analysts, model builders and scientists in general for studying complex systems that lie beyond the reach of analytical solution techniques. As a consequence of the pragmatic orientation of simulation, a vital stage for a successful application is validating the constructed simulation model. Employing the model as an effective instrument for assessing the benefit of structural changes or for predicting future observations makes validation an essential part of any productive simulation study. The diversity of simulation's fields of application, however, means that there is an irreducible level of ambiguity concerning the principal subject of this validation process. Further, the literature offers a plethora of ad hoc validation techniques, mostly inherited from standard statistical analysis. The aim of this paper is to reflect on the issue of validation in simulation and to present the reader with a topological parallel to the classical philosophical polarity of objectivism versus relativism. First, we position validation in relation to verification and accreditation and elaborate on the prime actors in validation, i.e. a conceptual model, a formal model and behaviour. Next, we formally derive a topological interpretation of structural validation for both objectivists and relativists. As will be seen, recent advances in the domain of fuzzy topology allow for a valuable metaphor for a relativistic attitude towards modelling and structural validation. Finally, we discuss several general types of modelling errors that may occur and examine their repercussions on the natural topological spaces of objectivists and relativists. We end with a formal, topologically oriented definition of structural model validity for both objectivists and relativists. The paper concludes by summarising the most important findings and indicating directions for future research.

    Keywords: Model; Simulation; Theory; Scientists; Processes; Statistical analysis

    Clifford Algebra: A Case for Geometric and Ontological Unification

    Robert Batterman’s ontological insights (2002, 2004, 2005) are apt: Nature abhors singularities. “So should we,” responds the physicist. However, Batterman’s epistemic assessments of the matter prove less clear, for in the same vein he writes that singularities play an essential role in certain classes of physical theories referring to certain types of critical phenomena. I devise a procedure (“methodological fundamentalism”) which exhibits how singularities, at least in principle, may be avoided within the same classes of formalisms discussed by Batterman. I show that we need not accept some divergence between explanation and reduction (Batterman 2002), or between epistemological and ontological fundamentalism (Batterman 2004, 2005). Though I remain sympathetic to the ‘principle of charity’ (Frisch 2005), which appears to favor a pluralist outlook, I nevertheless call into question some of the forms such pluralist implications take in Robert Batterman’s conclusions. It is difficult to reconcile some of the pluralist assessments that he and some of his contemporaries advocate with what appears to be a countervailing trend in a burgeoning research tradition known as Clifford (or geometric) algebra. In my critical chapters (2 and 3) I use some of the demonstrated formal unity of Clifford algebra to argue that Batterman (2002) conflates a physical theory’s ontology with its purely mathematical content. Carefully distinguishing the two, and employing Clifford algebraic methods, reveals a symmetry between reduction and explanation that Batterman overlooks. I refine this point by noting that geometric algebraic methods are an active area of research in computational fluid dynamics, and that their application in modeling droplet formation appears to instantiate a “methodologically fundamental” approach.
    I argue in my introductory and concluding chapters that the model of inter-theoretic reduction and explanation offered by Fritz Rohrlich (1988, 1994) provides the best framework for reconciling the burgeoning pluralism in philosophical studies of physics with the presumed claims of formal unification demonstrated by physicists’ choices of mathematical formalisms such as Clifford algebra. I show how Batterman’s insights can be reconstructed in Rohrlich’s framework, preserving Batterman’s important philosophical work minus what I consider to be his incorrect conclusions.

    A treatment of stereochemistry in computer aided organic synthesis

    This thesis describes the author’s contributions to a new stereochemical processing module constructed for the ARChem retrosynthesis program. The purpose of the module is to add the ability to perform enantioselective and diastereoselective retrosynthetic disconnections and generate appropriate precursor molecules. The module uses evidence-based rules generated from a large database of literature reactions. Chapter 1 provides an introduction and critical review of the published body of work for computer aided synthesis design. The role of computer perception of key structural features (rings, functional groups, etc.) and the construction and use of reaction transforms for generating precursors is discussed. Emphasis is also given to the application of strategies in retrosynthetic analysis. The availability of large reaction databases has enabled a new generation of retrosynthesis design programs to be developed that use automatically generated transforms assembled from published reactions. A brief description of the transform generation method employed by ARChem is given. Chapter 2 describes the algorithms devised by the author for handling the computer recognition and representation of the stereochemical features found in molecule and reaction scheme diagrams. The approach is generalised and uses flexible recognition patterns to transform information found in chemical diagrams into concise stereo descriptors for computer processing. An algorithm for efficiently comparing and classifying pairs of stereo descriptors is described. This algorithm is central to solving the stereochemical constraints in a variety of substructure matching problems addressed in chapter 3. The concise representation of reactions and transform rules as hyperstructure graphs is described. Chapter 3 is concerned with the efficient and reliable detection of stereochemical symmetry in molecules, reactions and rules.
A novel symmetry perception algorithm, based on a constraint satisfaction problem (CSP) solver, is described. The use of a CSP solver to implement an isomorph‐free matching algorithm for stereochemical substructure matching is detailed. The prime function of this algorithm is to seek out unique retron locations in target molecules and then to generate precursor molecules without duplications due to symmetry. Novel algorithms for classifying asymmetric, pseudo‐asymmetric and symmetric stereocentres; meso, centro, and C2 symmetric molecules; and the stereotopicity of trigonal (sp2) centres are described. Chapter 4 introduces and formalises the annotated structural language used to create both retrosynthetic rules and the patterns used for functional group recognition. A novel functional group recognition package is described along with its use to detect important electronic features such as electron‐withdrawing or donating groups and leaving groups. The functional groups and electronic features are used as constraints in retron rules to improve transform relevance. Chapter 5 details the approach taken to design detailed stereoselective and substrate-controlled transforms from organised hierarchies of rules. The rules employ a rich set of constraint annotations that concisely describe the keying retrons. The application of the transforms for collating evidence-based scoring parameters from published reaction examples is described. A survey of available reaction databases and of techniques for mining stereoselective reactions is presented. A data mining tool was developed for finding the most reputable stereoselective reaction types for coding as transforms. For various reasons it was not possible during the research period to fully integrate this work with the ARChem program. Instead, Chapter 6 introduces a novel one‐step retrosynthesis module to test the developed transforms.
The retrosynthesis algorithms use the organisation of the transform rule hierarchy to efficiently locate the best retron matches using all applicable stereoselective transforms. This module was tested using a small set of selected target molecules, and the generated routes were ranked using a series of measured parameters including: stereocentre clearance and bond cleavage; example reputation; estimated stereoselectivity with reliability; and evidence of tolerated functional groups. In addition, a method for detecting regioselectivity issues is presented. This work presents a number of algorithms using common set and graph theory operations and notations. Appendix A lists the set theory symbols and meanings. Appendix B summarises and defines the common graph theory terminology used throughout this thesis.
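As a rough illustration of the kind of symmetry perception discussed in Chapter 3 of the abstract above (not the thesis's actual CSP-based algorithm, which is far more sophisticated), one can enumerate the automorphisms of a tiny labelled molecular graph by brute force; the toy molecule and labels below are hypothetical examples.

```python
# Hedged sketch: brute-force automorphism search on a toy labelled
# "molecular" graph. Finding such symmetries is what lets a matcher
# avoid generating duplicate precursors. Illustrative only.
from itertools import permutations

# A propane-like chain: three carbons bonded 0-1-2.
labels = ['C', 'C', 'C']                         # atom labels by index
bonds = {frozenset({0, 1}), frozenset({1, 2})}   # undirected bond set

def is_automorphism(perm):
    """True if perm preserves both atom labels and the bond set."""
    if any(labels[i] != labels[perm[i]] for i in range(len(labels))):
        return False
    return {frozenset({perm[a], perm[b]}) for a, b in bonds} == bonds

automorphisms = [p for p in permutations(range(len(labels)))
                 if is_automorphism(p)]
# The chain has a mirror symmetry swapping the terminal carbons, so the
# automorphisms are the identity (0, 1, 2) and the flip (2, 1, 0): the
# two terminal atoms are equivalent retron locations.
```

A real CSP-based matcher prunes this factorial search with constraint propagation rather than enumerating all permutations, which is why it scales to molecule-sized graphs.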

    Doubly-Special Relativity: Facts, Myths and Some Key Open Issues

    I report, emphasizing some key open issues and some aspects that are particularly relevant for phenomenology, on the status of the development of "doubly-special" relativistic ("DSR") theories with both an observer-independent high-velocity scale and an observer-independent small-length/large-momentum scale, possibly relevant for the Planck-scale/quantum-gravity realm. I also give a true/false characterization of the structure of these theories. In particular, I discuss a DSR scenario without modification of the energy-momentum dispersion relation and without the κ-Poincaré Hopf algebra, a scenario with deformed Poincaré symmetries which is not a DSR scenario, some scenarios with both an invariant length scale and an invariant velocity scale which are not DSR scenarios, and a DSR scenario in which it is easy to verify that some observable relativistic (but non-special-relativistic) features are insensitive to possible nonlinear redefinitions of symmetry generators.
    Comment: This is the preprint version of a paper prepared for a special issue "Feature Papers: Symmetry Concepts and Applications" of the journal Symmetry.
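For context, a frequently studied example in the DSR literature (offered here as background, not as this paper's specific proposal) is a leading-order Planck-scale deformation of the energy-momentum dispersion relation:

```latex
% One commonly considered leading-order deformation, with \ell_P the
% Planck length and \eta a dimensionless coefficient of order one:
m^2 \simeq E^2 - \vec{p}^{\,2} + \eta \, \ell_P \, E \, \vec{p}^{\,2} + O(\ell_P^2)
```

The undeformed special-relativistic case corresponds to $\eta = 0$; the first scenario mentioned in the abstract is precisely one in which this dispersion relation is left unmodified while other structures of the theory are deformed.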

    Aerospace Medicine and Biology: A continuing bibliography with indexes, supplement 199

    This bibliography lists 82 reports, articles, and other documents introduced into the NASA scientific and technical information system in October 1979

    Scale invariance in natural and artificial collective systems: a review

    Self-organized collective coordinated behaviour is an impressive phenomenon, observed in a variety of natural and artificial systems, in which coherent global structures or dynamics emerge from local interactions between individual parts. If the degree of collective integration of a system does not depend on its size, its level of robustness and adaptivity is typically increased, and we refer to it as scale-invariant. In this review, we first identify three main types of self-organized scale-invariant systems: scale-invariant spatial structures, scale-invariant topologies and scale-invariant dynamics. We then provide examples of scale invariance from different domains in science, describe their origins and main features, and discuss potential challenges and approaches for designing and engineering artificial systems with scale-invariant properties.
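A minimal numerical illustration of the scale-invariance property the review above is concerned with (a toy example under stated assumptions, not drawn from the review itself): a power law f(x) = x^(-alpha) satisfies f(kx) = k^(-alpha) f(x) for every rescaling factor k, so it has no characteristic scale, whereas an exponential does not.

```python
# Hedged sketch: verify numerically that a power law is scale-invariant
# while an exponential is not. The exponent and sample points are
# arbitrary illustrative choices.
alpha = 2.5

def f(x):
    """Power law: rescaling x only rescales f by a constant factor."""
    return x ** (-alpha)

def g(x):
    """Exponential decay: has a characteristic scale, so it fails."""
    return 2.0 ** (-x)

def scale_invariant(func, k, xs, tol=1e-9):
    """Check func(k*x) == k**(-alpha) * func(x) at each sample point."""
    return all(abs(func(k * x) - k ** (-alpha) * func(x)) < tol
               for x in xs)

xs = [0.5, 1.0, 2.0, 10.0]
power_law_ok = scale_invariant(f, 3.0, xs)   # holds for the power law
exponential_ok = scale_invariant(g, 3.0, xs) # fails for the exponential
```

In empirical work the same idea appears in reverse: observing that a measured quantity collapses onto itself under rescaling is taken as evidence of an underlying scale-invariant mechanism.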