A generalized mechanistic codon model.
Models of codon evolution have attracted particular interest because of their unique capability to detect selective forces and their high fit when applied to sequence evolution. We describe here a novel approach for modeling codon evolution, based on the Kronecker product of matrices. The 61 × 61 codon substitution rate matrix is created using the Kronecker product of three 4 × 4 nucleotide substitution matrices, the equilibrium frequencies of codons, and the selection rate parameter. The entries of the nucleotide substitution matrices and the selection rate are treated as parameters of the model, which are optimized by maximum likelihood. Our fully mechanistic model allows the instantaneous substitution matrix between codons to be fully estimated with only 19 parameters instead of 3,721, by exploiting the biological interdependence existing between positions within codons. We illustrate the properties of our model using computer simulations and assess its relevance by comparing the AICc measures of our model and other models of codon evolution on simulations and a large range of empirical data sets. We show that our model fits most biological data better than current codon models. Furthermore, the parameters of our model can be interpreted in a similar way to the exchangeability rates found in empirical codon models.
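As a rough sketch of the construction described above, the following assumes the standard genetic code with 61 sense codons (64 triplets minus the stop codons TAA, TAG, TGA) and uses arbitrary positive values in place of the model's maximum-likelihood parameters; the equilibrium-frequency and selection terms of the actual model are omitted:

```python
import numpy as np

NUC = "ACGT"
CODONS = [a + b + c for a in NUC for b in NUC for c in NUC]  # all 64 triplets
STOP = {"TAA", "TAG", "TGA"}
SENSE = [i for i, c in enumerate(CODONS) if c not in STOP]   # 61 sense codons

def illustrative_nuc_matrix(rng):
    """A 4x4 per-position nucleotide matrix with positive off-diagonal
    entries; these values stand in for the model's estimated parameters."""
    M = rng.random((4, 4))
    np.fill_diagonal(M, 0.0)
    return M

rng = np.random.default_rng(0)
Q1, Q2, Q3 = (illustrative_nuc_matrix(rng) for _ in range(3))

# The Kronecker product couples the three codon positions into a single
# 64x64 matrix; rows/columns of the three stop codons are then removed.
Q64 = np.kron(np.kron(Q1, Q2), Q3)
Q61 = Q64[np.ix_(SENSE, SENSE)]
print(Q61.shape)  # (61, 61)
```

The Kronecker product couples the three positions multiplicatively, which is why the full 61 × 61 matrix is determined by a handful of per-position parameters rather than by 3,721 free entries.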
NOSQL design for analytical workloads: Variability matters
Big Data has recently gained popularity and has strongly questioned relational databases as universal storage systems, especially in the presence of analytical workloads. As a result, co-relational alternatives, commonly known as NOSQL (Not Only SQL) databases, are extensively used for Big Data. As the primary focus of NOSQL is on performance, NOSQL databases are designed directly at the physical level, and consequently the resulting schema is tailored to the dataset and access patterns of the problem at hand. However, we believe that NOSQL design can also benefit from traditional design approaches. In this paper we present a method to design databases for analytical workloads. Starting from the conceptual model and adopting the classical 3-phase design used for relational databases, we propose a novel design method considering the new features brought by NOSQL and encompassing relational and co-relational design altogether.
What is Quantum? Unifying Its Micro-Physical and Structural Appearance
We can recognize two modes in which 'quantum appears' in macro domains: (i) a
'micro-physical appearance', where quantum laws are assumed to be universal and
they are transferred from the micro to the macro level if suitable 'quantum
coherence' conditions (e.g., very low temperatures) are realized, (ii) a
'structural appearance', where no hypothesis is made on the validity of quantum
laws at a micro level, while genuine quantum aspects are detected at a
structural-modeling level. In this paper, we inquire into the connections
between the two appearances. We put forward the explanatory hypothesis that,
'the appearance of quantum in both cases' is due to 'the existence of a
specific form of organisation, which has the capacity to cope with random
perturbations that would destroy this organisation when not coped with'. We
analyse how 'organisation of matter', 'organisation of life', and 'organisation
of culture', play this role each in their specific domain of application, point
out the importance of evolution in this respect, and put forward how our
analysis sheds new light on 'what quantum is'.
Quantum Entanglement in Concept Combinations
Research in the application of quantum structures to cognitive science
confirms that these structures quite systematically appear in the dynamics of
concepts and their combinations, and that quantum-based models faithfully represent
experimental data of situations where classical approaches are problematical.
In this paper, we analyze the data we collected in an experiment on a specific
conceptual combination, showing that Bell's inequalities are violated in the
experiment. We present a new refined entanglement scheme to model these data
within standard quantum theory rules, where 'entangled measurements and
entangled evolutions' occur, in addition to the expected 'entangled states',
and present a full quantum representation in complex Hilbert space of the data.
This stronger form of entanglement in measurements and evolutions might have
relevant applications in the foundations of quantum theory, as well as in the
interpretation of nonlocality tests. It could indeed explain some
non-negligible 'anomalies' identified in EPR-Bell experiments.
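The paper's experimental data are not reproduced here, but the CHSH form of Bell's inequality that such tests rely on is easy to sketch: for idealized quantum correlations E(x, y) = cos(x - y) at the standard measurement angles, the CHSH quantity exceeds the classical bound of 2.

```python
import math

# Generic CHSH illustration (not the paper's concept-combination data):
# two measurement choices per side, with idealized quantum correlations
# E(x, y) = cos(x - y) at the angles that maximize the violation.
def E(x, y):
    return math.cos(x - y)

a, a_p = 0.0, math.pi / 2            # first party's settings
b, b_p = math.pi / 4, -math.pi / 4   # second party's settings

S = E(a, b) + E(a, b_p) + E(a_p, b) - E(a_p, b_p)
print(S)  # 2*sqrt(2) ~ 2.828, above the classical bound of 2
```

Any local-classical model of the four correlations is bounded by |S| <= 2, so a measured S above 2, as reported for the conceptual combination in the paper, signals entanglement-like structure in the data.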
The Generalised Liar Paradox: A Quantum Model and Interpretation
The formalism of abstracted quantum mechanics is applied in a model of the
generalized Liar Paradox. Here, the Liar Paradox, a consistently testable
configuration of logical truth properties, is considered a dynamic conceptual
entity in the cognitive sphere. Basically, the intrinsic contextuality of the
truth-value of the Liar Paradox is appropriately covered by the abstracted
quantum mechanical approach. The formal details of the model are made explicit
here for the generalized case. We prove the possibility of constructing a
quantum model of the m-sentence generalizations of the Liar Paradox. This
includes (i) the truth-falsehood state of the m-Liar Paradox can be represented
by an embedded 2m-dimensional quantum vector in a (2m)^m dimensional complex
Hilbert space, with cognitive interactions corresponding to projections, (ii)
the construction of a continuous 'time' dynamics is possible: typical truth and
falsehood value oscillations are described by Schrodinger evolution, (iii)
the Birkhoff and von Neumann axioms are satisfied by introducing 'truth-value
by inference' projectors, and (iv) the unmeasured state is time invariant.
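A minimal numerical sketch of point (ii) for the single-sentence case (m = 1), as our own illustration rather than the paper's exact construction: a Hamiltonian proportional to the Pauli matrix sigma_x drives oscillations between the 'true' and 'false' basis states under Schrodinger evolution.

```python
import numpy as np

# |true> = (1, 0), |false> = (0, 1); H proportional to sigma_x swaps them.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
omega = 1.0  # illustrative oscillation frequency

def evolve(t):
    # exp(-i*omega*t*sigma_x) = cos(omega t) I - i sin(omega t) sigma_x
    return np.cos(omega * t) * np.eye(2) - 1j * np.sin(omega * t) * sx

state0 = np.array([1.0, 0.0], dtype=complex)  # start in the 'true' state
for t in [0.0, np.pi / 4, np.pi / 2]:
    psi = evolve(t) @ state0
    print(f"t={t:.3f}  P(true)={abs(psi[0])**2:.3f}")
```

The truth probability follows cos^2(omega t), i.e. exactly the kind of continuous truth-falsehood oscillation the abstract attributes to the Schrodinger dynamics of the Liar entity.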
Eco‐Holonic 4.0 Circular Business Model to Conceptualize Sustainable Value Chain Towards Digital Transition
The purpose of this paper is to conceptualize a circular business model based on an Eco-Holonic Architecture, through the integration of circular economy and holonic principles. A conceptual model is developed to manage the complexity of integrating circular economy principles, digital transformation, and tools and frameworks for sustainability into business models. The proposed architecture is multilevel and multiscale in order to achieve the instantiation of the sustainable value chain in any territory. The architecture promotes the incorporation of circular economy and holonic principles into new circular business models. This integrated perspective on business models can support the design and upgrading of manufacturing companies in their respective industrial sectors. The proposed conceptual model is based on activity theory, which considers the interactions between technical and social systems and allows the mitigation of the metabolic rift that exists between natural and social metabolism. This study contributes to the existing literature on circular economy, circular business models, and activity theory by considering holonic paradigm concerns, which have not been explored yet. This research also offers a unique holonic architecture for circular business models by considering different levels, relationships, dynamism, and contextualization (territory) aspects.
Quantum Theory and Human Perception of the Macro-World
We investigate the question of 'why customary macroscopic entities appear to
us humans as they do, i.e. as bounded entities occupying space and persisting
through time', starting from our knowledge of quantum theory, how it affects
the behavior of such customary macroscopic entities, and how it influences our
perception of them. For this purpose, we approach the question from three
perspectives. Firstly, we look at the situation from the standard quantum
angle, more specifically the de Broglie wavelength analysis of the behavior of
macroscopic entities, indicate how a problem with spin and identity arises, and
illustrate how both play a fundamental role in well-established experimental
quantum-macroscopical phenomena, such as Bose-Einstein condensates. Secondly,
we analyze how the question is influenced by our result in axiomatic quantum
theory, which proves that standard quantum theory is structurally incapable of
describing separated entities. Thirdly, we put forward our new 'conceptual
quantum interpretation', including a highly detailed reformulation of the
question to confront the new insights and views that arise with the foregoing
analysis. At the end of the final section, a nuanced answer is given that can
be summarized as follows. The specific and very classical perception of human
seeing -- light as a geometric theory -- and human touching -- only ruled by
Pauli's exclusion principle -- plays a role in our perception of macroscopic
entities as ontologically stable entities in space. To ascertain quantum
behavior in such macroscopic entities, we will need measuring apparatuses
capable of its detection. Future experimental research will have to show if
sharp quantum effects -- as they occur in smaller entities -- appear to be
ontological aspects of customary macroscopic entities.
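The de Broglie wavelength analysis mentioned above can be illustrated numerically (the speeds below are arbitrary illustrative choices): lambda = h / (m v) is atomic-sized for an electron but vanishingly small for an everyday object, which is one reason customary macroscopic entities show no directly perceivable wave behavior.

```python
# de Broglie wavelength: lambda = h / (m * v)
h = 6.62607015e-34  # Planck constant, J*s (exact in SI since 2019)

def de_broglie(mass_kg, speed_m_s):
    return h / (mass_kg * speed_m_s)

lam_e = de_broglie(9.1093837015e-31, 1e6)  # electron at 10^6 m/s
lam_g = de_broglie(1e-3, 1.0)              # 1-gram object at 1 m/s
print(lam_e)  # ~7.3e-10 m: atomic scale, interference is observable
print(lam_g)  # ~6.6e-31 m: far below any measurable length scale
```

This contrast is what makes Bose-Einstein condensates remarkable: only under special coherence conditions do macroscopic collections of particles exhibit a common, detectable matter-wave behavior.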
Investigating the use of background knowledge for assessing the relevance of statements to an ontology in ontology evolution
The tasks of learning and enriching ontologies with new concepts and relations have attracted a lot of attention in the research community, leading to a number of tools facilitating the process of building and updating ontologies. These tools often discover new elements of information to be included in the considered ontology from external data sources such as text documents or databases, transforming these elements into ontology-compatible statements or axioms. While some techniques are used to make sure that statements to be added are compatible with the ontology (e.g. through conflict detection), such tools generally pay little attention to the relevance of the statement in question. It is either assumed that any statement extracted from a data source is relevant, or that the user will assess whether a statement adds value to the ontology. In this paper, we investigate the use of background knowledge about the context where statements appear to assess their relevance. We devise a methodology to extract such a context from ontologies available online, to map it to the considered ontology, and to visualize this mapping in a way that allows us to study the intersection and complementarity of the two sources of knowledge. By applying this methodology to several examples, we identified an initial set of patterns giving strong indications concerning the relevance of a statement, as well as interesting issues to be considered when applying such techniques.