A Quantum Rosetta Stone for Interferometry
Heisenberg-limited measurement protocols can be used to gain an increase in
measurement precision over classical protocols. Such measurements can be
implemented using, e.g., optical Mach-Zehnder interferometers and Ramsey
spectroscopes. We address the formal equivalence between the Mach-Zehnder
interferometer, the Ramsey spectroscope, and the discrete Fourier transform.
Based on this equivalence we introduce the "quantum Rosetta stone", and we
describe a projective-measurement scheme for generating the desired
correlations between the interferometric input states in order to achieve
Heisenberg-limited sensitivity. The Rosetta stone then tells us the same method
should work in atom spectroscopy.
Comment: 8 pages, 4 figures
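The formal equivalence the abstract refers to can be sketched numerically: a 50:50 beam splitter acts on the two mode amplitudes as the 2-point discrete Fourier transform (a Hadamard matrix, up to a phase convention), so a Mach-Zehnder interferometer is "DFT, phase shift, DFT". The code below is an illustrative sketch of that standard transfer-matrix picture, not code from the paper.

```python
import numpy as np

# The 2-point DFT (Hadamard) matrix models a 50:50 beam splitter acting
# on the two mode amplitudes -- the formal link behind the equivalence
# of Mach-Zehnder interferometry, Ramsey spectroscopy, and the DFT.
F2 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def mach_zehnder(phi):
    """Transfer matrix: beam splitter, phase shift phi, beam splitter."""
    phase = np.diag([1.0, np.exp(1j * phi)])
    return F2 @ phase @ F2

# Probability of detecting a photon entering port 0 at output port 0:
phi = 0.3
p0 = abs(mach_zehnder(phi)[0, 0]) ** 2  # equals cos^2(phi / 2)
```

With zero phase difference the two beam splitters cancel (`F2 @ F2` is the identity), which is the familiar statement that a balanced Mach-Zehnder interferometer transmits everything to one port.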
Categorical Ontology of Complex Systems, Meta-Systems and Theory of Levels: The Emergence of Life, Human Consciousness and Society
Single-cell interactomics in simpler organisms, as well as somatic-cell interactomics in multicellular organisms, involves biomolecular interactions in complex signalling pathways that were recently represented in modular terms by quantum automata with ‘reversible behaviour’ representing normal cell cycling and division. Such quantum automata, together with modular models of signalling pathways and of cell differentiation during development, also have implications for neural plasticity and brain development, pointing to quantum-weave dynamic patterns and specific molecular processes that underlie extensive memory, learning and anticipation mechanisms, and the emergence of human consciousness during early brain development in children.

Cell interactomics is here represented for the first time as a mixture of ‘classical’ states that determine molecular dynamics subject to Boltzmann statistics and ‘steady-state’, metabolic (multi-stable) manifolds, together with ‘configuration’ spaces of metastable quantum states emerging from the complex quantum dynamics of interacting networks of biomolecules, such as proteins and nucleic acids, now collectively defined as quantum interactomics.

On the other hand, the time-dependent evolution over several generations of cancer cells, which are generally known to undergo frequent and extensive genetic mutations and, indeed, to suffer genomic transformations at the chromosome level (such as the extensive chromosomal aberrations found in many colon cancers), cannot be correctly represented in the ‘standard’ terms of quantum automaton modules, as normal somatic cells can. This significant difference at the cancer-cell genomic level is therefore reflected in major changes in cancer cell interactomics, often from one cancer cell ‘cycle’ to the next, and thus requires substantial changes in the modelling strategies, mathematical tools and experimental designs aimed at understanding cancer mechanisms.
Novel solutions to this important problem in carcinogenesis are proposed, and experimental validation procedures are suggested. From a medical research and clinical standpoint, this approach has important consequences for addressing and preventing the development of cancer resistance to medical therapy in ongoing clinical trials involving stage III cancer patients, as well as for improving the designs of future clinical trials for cancer treatments.

KEYWORDS: Emergence of Life and Human Consciousness; Proteomics; Artificial Intelligence; Complex Systems Dynamics; Quantum Automata models and Quantum Interactomics; quantum-weave dynamic patterns underlying human consciousness; specific molecular processes underlying extensive memory, learning, anticipation mechanisms and human consciousness; emergence of human consciousness during early brain development in children; cancer cell ‘cycling’; interacting networks of proteins and nucleic acids; genetic mutations and chromosomal aberrations in cancers, such as colon cancer; development of cancer resistance to therapy; ongoing clinical trials involving stage III cancer patients; possible improvements of the designs for future clinical trials and cancer treatments.
Do you see what I mean?
Visualizers, like logicians, have long been concerned with meaning. Generalizing from MacEachren's overview of cartography, visualizers have to think about how people extract meaning from pictures (psychophysics), what people understand from a picture (cognition), how pictures are imbued with meaning (semiotics), and how in some cases that meaning arises within a social and/or cultural context. If we think of the communication acts carried out in the visualization process, further levels of meaning are suggested. Visualization begins when someone has data that they wish to explore and interpret; the data are encoded as input to a visualization system, which may in turn interact with other systems to produce a representation. This is communicated back to the user(s), who have to assess it against their goals and knowledge, possibly leading to further cycles of activity. Each phase of this process involves communication between two parties. For this to succeed, those parties must share a common language with an agreed meaning. We offer the following three steps, in increasing order of formality: terminology (jargon), taxonomy (vocabulary), and ontology. Our argument in this article is that it is time to begin synthesizing the fragments and views into a level-3 model, an ontology of visualization. We also address why this should happen, what is already in place, how such an ontology might be constructed, and why now.
The Incomplete Rosetta Stone Problem: Identifiability Results for Multi-View Nonlinear ICA
We consider the problem of recovering a common latent source with independent
components from multiple views. This applies to settings in which a variable is
measured with multiple experimental modalities, and where the goal is to
synthesize the disparate measurements into a single unified representation. We
consider the case that the observed views are a nonlinear mixing of
component-wise corruptions of the sources. When the views are considered
separately, this reduces to nonlinear Independent Component Analysis (ICA) for
which it is provably impossible to undo the mixing. We present novel
identifiability proofs that this is possible when the multiple views are
considered jointly, showing that the mixing can theoretically be undone using
function approximators such as deep neural networks. In contrast to known
identifiability results for nonlinear ICA, we prove that independent latent
sources with arbitrary mixing can be recovered as long as multiple,
sufficiently different noisy views are available.
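The generative model the abstract describes can be made concrete with a small simulation: a shared source vector with independent components is corrupted independently in each view and then passed through a view-specific nonlinear mixing. The function names, noise scale, and mixing choice below are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical sketch of the multi-view setting: shared independent
# sources s, component-wise noise per view, then a nonlinear mixing f_v.
rng = np.random.default_rng(0)
n_samples, n_sources = 1000, 3

s = rng.laplace(size=(n_samples, n_sources))  # independent latent sources

def nonlinear_mix(z, W):
    """An illustrative smooth, nonlinear mixing of the (noisy) sources."""
    return np.tanh(z @ W)

W1 = rng.normal(size=(n_sources, n_sources))
W2 = rng.normal(size=(n_sources, n_sources))

x1 = nonlinear_mix(s + 0.1 * rng.normal(size=s.shape), W1)  # view 1
x2 = nonlinear_mix(s + 0.1 * rng.normal(size=s.shape), W2)  # view 2

# Each view alone is ordinary nonlinear ICA, which is non-identifiable;
# the paper's identifiability results apply when x1 and x2 are modeled
# jointly, e.g. with deep-network function approximators.
```

The point of the sketch is only the data-generating structure: recovery of `s` up to the stated indeterminacies is what the paper proves is possible from `(x1, x2)` jointly.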
Context-Oriented Algorithmic Design
Currently, algorithmic approaches are being introduced in several areas of expertise, notably architecture. Algorithmic Design (AD) is an approach to architecture that takes advantage of algorithms to produce complex designs, to simplify the exploration of variations, or to mechanize tasks, including those related to the analysis and optimization of designs. However, architects might need different models of the same design for different kinds of analysis, which tempts them to extend the same code base for different purposes, typically making the code brittle and hard to understand. In this paper, we propose to extend AD with Context-Oriented Programming (COP), a programming paradigm based on context that dynamically changes the behavior of the code. To this end, we propose a COP library and explore its combination with an AD tool. Finally, we implement two case studies with our context-oriented approach and discuss their advantages and disadvantages when compared to the traditional AD approach.
Elfin UI: a graphical interface for protein design with modular building blocks
Molecular models have enabled understanding of biological structures and functions and allowed the design of novel macromolecules. Graphical user interfaces (GUIs) in molecular modeling are generally focused on atomic representations but, especially for proteins, do not usually address the design of complex and large architectures, from nanometers to microns. Therefore, we have developed Elfin UI as a Blender add-on for the interactive design of large protein architectures with custom shapes. Elfin UI relies on compatible building blocks to design single- and multiple-chain protein structures. The software can be used: (1) as an interactive environment to explore building-block combinations; and (2) as a computer-aided design (CAD) tool to define target shapes that guide automated design. Elfin UI allows users to rapidly build new protein shapes without the need to focus on amino acid sequence, and aims to make the design of proteins and protein-based materials intuitive and accessible to researchers and members of the general public with limited expertise in protein engineering.
Issues in the Design of a Pilot Concept-Based Query Interface for the Neuroinformatics Information Framework
This paper describes a pilot query interface that has been constructed to help us explore a "concept-based" approach for searching the
Neuroscience Information Framework (NIF). The query interface is
concept-based in the sense that the search terms submitted through the
interface are selected from a standardized vocabulary of terms
(concepts) that are structured in the form of an ontology. The NIF
contains three primary resources: the NIF Resource Registry, the NIF
Document Archive, and the NIF Database Mediator. These NIF resources
are very different in their nature and therefore pose challenges when
designing a single interface from which searches can be automatically
launched against all three resources simultaneously. The paper first
discusses briefly several background issues involving the use of
standardized biomedical vocabularies in biomedical information
retrieval, and then presents a detailed example that illustrates how
the pilot concept-based query interface operates. The paper concludes
by discussing certain lessons learned in the development of the current
version of the interface.
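The concept-based approach described above, in which search terms come from an ontology and one query fans out to heterogeneous resources, can be sketched as follows. The toy ontology and resource names are illustrative assumptions, not the actual NIF vocabulary or architecture.

```python
# Hypothetical sketch of concept-based search: a query is restricted to
# ontology concepts, each concept expands to its synonyms and descendant
# concepts, and the expanded term list is dispatched to every resource.
ontology = {
    "neuron": {"synonyms": ["nerve cell"], "children": ["pyramidal cell"]},
    "pyramidal cell": {"synonyms": [], "children": []},
}

def expand_concept(concept):
    """Return the concept plus its synonyms and descendant concepts."""
    entry = ontology[concept]
    terms = [concept] + entry["synonyms"]
    for child in entry["children"]:
        terms += expand_concept(child)
    return terms

def search_all(concept, resources):
    """Launch one expanded query against every registered resource."""
    terms = expand_concept(concept)
    return {name: fn(terms) for name, fn in resources.items()}

def registry(terms):
    """Stand-in for one resource: here, a trivial substring matcher."""
    return [t for t in terms if "cell" in t]
```

The design mirrors the challenge the paper discusses: each resource only needs to accept a flat list of terms, while the ontology-driven expansion happens once, in the interface layer.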