
    Residual vectors for Alzheimer disease diagnosis and prognostication

    Alzheimer disease (AD) is an increasingly prevalent neurodegenerative condition and a looming socioeconomic threat. A biomarker for the disease could make the process of diagnosis easier and more accurate, and accelerate drug discovery. The current work describes a method for scoring brain images that is inspired by fundamental principles from information retrieval (IR), a branch of computer science that includes the development of Internet search engines. For this research, a dataset of 254 baseline 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans was obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI). For a given contrast, a subset of scans (nine of every 10) was used to compute a residual vector that typified the difference, at each voxel, between the two groups being contrasted. Scans that were not used for computing the residual vector (the remaining one of 10 scans) were then compared to the residual vector using a cosine similarity metric. This process was repeated sequentially, each time generating cosine similarity scores on 10% of the FDG-PET scans for each contrast. Statistical analysis revealed that the scores were significant predictors of functional decline as measured by the Functional Activities Questionnaire (FAQ). When logistic regression models that incorporated these scores were evaluated with leave-one-out cross-validation, cognitively normal controls were discerned from AD with sensitivity and specificity of 94.4% and 84.8%, respectively. Patients who converted from mild cognitive impairment (MCI) to AD were discerned from MCI nonconverters with sensitivity and specificity of 89.7% and 62.9%, respectively, when FAQ scores were brought into the model. Residual vectors are easy to compute and provide a simple method for scoring the similarity between an FDG-PET scan and sets of examples from a given diagnostic group. The method is readily generalizable to any imaging modality. Further interdisciplinary work between IR and clinical neuroscience is warranted.
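The scoring procedure above lends itself to a short sketch. The abstract does not give the exact residual formula, so the version below (residual as the per-voxel difference of group mean images, scored against held-out scans by cosine similarity) is an illustrative assumption, with synthetic arrays in place of FDG-PET scans.

```python
import numpy as np

def residual_vector(group_a, group_b):
    """Residual vector: per-voxel difference between the mean images of
    two diagnostic groups (one plausible reading of the abstract)."""
    return group_a.mean(axis=0) - group_b.mean(axis=0)

def cosine_score(scan, residual):
    """Cosine similarity between a held-out scan and the residual vector."""
    return float(np.dot(scan, residual) /
                 (np.linalg.norm(scan) * np.linalg.norm(residual)))

# Toy example: 10 synthetic "scans" of 1000 voxels per group.
rng = np.random.default_rng(0)
ad = rng.normal(1.0, 0.1, size=(10, 1000))   # hypothetical AD-like group
cn = rng.normal(1.2, 0.1, size=(10, 1000))   # hypothetical control group
r = residual_vector(ad, cn)                  # fit on the training folds
score = cosine_score(ad[0], r)               # score one held-out scan
```

In the paper's scheme this fit-and-score step would be repeated over 10 folds, so that every scan is scored exactly once by a residual vector it did not help compute.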

    The ethics of biobanking: key issues and controversies

    The ethics of biobanking is one of the most controversial issues in current bioethics and public health debates. For some, biobanks offer the possibility of unprecedented advances which will revolutionise research and improve the health of future generations. For others, they are worrying repositories of personal information and tissue which will be used without sufficient respect for those from whom they came. Wherever one stands on this spectrum, from an ethics perspective biobanks are revolutionary. Traditional ethical safeguards of informed consent and confidentiality, for example, simply don’t work for the governance of biobanks, and as a result new ethical structures are required. Thus it is not too great a claim to say that biobanks require a rethinking of the ethical assumptions and frameworks we have applied generally to other issues in ethics. This paper maps the key challenges and controversies of biobanking ethics. It considers informed consent (its problems in biobanking and the possibilities of participants’ withdrawal), broad consent, the problems of confidentiality, ownership, property and commercialisation issues, feedback to participants, and the ethics of re-contact.

    Concepts and Their Dynamics: A Quantum-Theoretic Modeling of Human Thought

    We analyze different aspects of our quantum modeling approach to human concepts, focusing on the quantum effects of contextuality, interference, entanglement and emergence, and illustrating how each makes its appearance in specific situations of the dynamics of human concepts and their combinations. We point out the relation of our approach, which is based on an ontology of a concept as an entity in a state that changes under the influence of a context, to the main traditional concept theories, i.e. prototype theory, exemplar theory and theory theory. We ponder the question of why quantum theory performs so well in modeling human concepts, and shed light on it by analyzing the role of complex amplitudes, showing how they allow one to describe interference in the statistics of measurement outcomes, while in the traditional theories the statistics of outcomes originates in classical probability weights, without the possibility of interference. The relevance of complex numbers, the appearance of entanglement, and the role of Fock space in explaining contextual emergence, all as unique features of the quantum modeling, are explicitly revealed in this paper by analyzing human concepts and their dynamics.
    Comment: 31 pages, 5 figures

    Effect of field exposure to 38-year-old residual petroleum hydrocarbons on growth, condition index, and filtration rate of the ribbed mussel, Geukensia demissa

    Author Posting. © The Author(s), 2007. This is the author's version of the work. It is posted here by permission of Elsevier B.V. for personal use, not for redistribution. The definitive version was published in Environmental Pollution 154 (2008): 312-319, doi:10.1016/j.envpol.2007.10.008.
    In September 1969, the Florida barge spilled 700,000 L of No. 2 fuel oil into the salt marsh sediments of Wild Harbor, MA. Today a substantial amount, approximately 100 kg, of moderately degraded petroleum remains within the sediment and along eroding creek banks. The ribbed mussels, Geukensia demissa, which inhabit the salt marsh creek bank, are exposed to the spilled oil. Short-term exposure was examined by transplanting G. demissa from a control site, Great Sippewissett marsh, into Wild Harbor; the effects of long-term exposure were examined by transplanting mussels from Wild Harbor into Great Sippewissett. Both the short- and long-term exposure transplants exhibited slower growth rates, shorter mean shell lengths, lower condition indices, and decreased filtration rates. Our results add new knowledge about the long-term consequences of spilled oil, a dimension that should be included when assessing oil-impacted areas and developing management plans designed to restore, rehabilitate, or replace impacted areas.
    This work is the result of research sponsored by the NOAA National Sea Grant College Program Office, Department of Commerce, under Grant No. NA16RG2273, Woods Hole Oceanographic Institution Sea Grant Project No. R/P-73. Additional support was provided by funding from the NSF-funded Research Experience for Undergraduates program, award 0453292, and an Office of Naval Research Young Investigator Award (N00014-04-01-0029) to C. Reddy.

    Ideologies and their points of view

    © Springer International Publishing Switzerland 2016. It is well known that different arguments appeal to different people. We all process information in ways that are adapted to be consistent with our underlying ideologies. These ideologies can sometimes be framed in terms of particular axes or dimensions, which makes it possible to represent some aspects of an ideology as a region in the kind of vector space that is typical of many generalised quantum models. Such models can then be used to explain and predict, in broad strokes, whether a particular argument or proposal is likely to appeal to an individual with a particular ideology. The choice of suitable arguments to bring about desired actions is traditionally part of the art or science of rhetoric, and today’s highly polarised society means that this skill is becoming more important than ever. This paper presents a basic model for understanding how different goals will appeal to people with different ideologies, and thus how different rhetorical positions can be adopted to promote the same desired outcome. As an example, we consider different narratives, and hence actions, with respect to the environment and climate change, an important but currently highly controversial topic.
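The idea of an ideology as a region of a vector space can be given a minimal quantum-style sketch: represent the ideology as a subspace and score an argument's appeal as the squared norm of its projection onto that subspace. The axes, vectors, and scoring rule below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def appeal(argument, ideology_basis):
    """Quantum-style appeal score: squared norm of the projection of an
    argument vector onto the subspace spanned by an ideology's basis
    vectors, normalised by the argument's squared length (so the score
    lies in [0, 1])."""
    q, _ = np.linalg.qr(ideology_basis.T)   # orthonormalise the subspace
    proj = q @ (q.T @ argument)             # project onto the subspace
    return float(np.dot(proj, proj) / np.dot(argument, argument))

# Hypothetical 3-axis space: economic, social, environmental.
ideology = np.array([[1.0, 0.0, 0.0],      # an ideology spanning the
                     [0.0, 1.0, 0.0]])     # economic and social axes
econ = np.array([1.0, 0.0, 0.0])           # purely economic argument
mixed = np.array([1.0, 0.0, 1.0])          # half economic, half environmental
a1 = appeal(econ, ideology)                # fully inside the subspace
a2 = appeal(mixed, ideology)               # only partly inside
```

An argument lying entirely in the ideology's subspace scores 1.0; one orthogonal to it scores 0.0, so the same proposal can be reframed along different axes to raise its score for a given audience.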

    Experimental Evidence for Quantum Structure in Cognition

    We prove a theorem showing that a collection of experimental data of membership weights of items with respect to a pair of concepts and their conjunction cannot be modeled within a classical measure-theoretic weight structure when the experimental data contain the effect called overextension. Since overextension, analogous to the well-known guppy effect for concept combinations, is abundant in all experiments testing weights of items with respect to pairs of concepts and their conjunctions, our theorem constitutes a no-go theorem for a classical measure structure for common data of membership weights of items with respect to concepts and their combinations. We put forward a simple geometric criterion that reveals the non-classicality of the membership weight structure, and use membership weights estimated by subjects in experiments to illustrate our geometrical criterion. The violation of the classical weight structure is similar to the violation of the well-known Bell inequalities studied in quantum mechanics, and hence suggests that the quantum formalism, and thus modeling by quantum membership weights, can accomplish what classical membership weights cannot.
    Comment: 12 pages, 3 figures
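The geometric criterion itself is not reproduced in the abstract; the check below encodes the standard classicality conditions for membership weights discussed in this literature: a conjunction weight can come from a single classical probability measure only if it does not exceed either conjunct and satisfies the inclusion-exclusion bound. The numeric weights are illustrative, not the paper's experimental data.

```python
def classical_measure_compatible(mu_a, mu_b, mu_ab):
    """True iff membership weights mu_a, mu_b for two concepts and mu_ab
    for their conjunction admit a classical (Kolmogorovian) measure
    representation: no overextension, and the inclusion-exclusion bound
    mu(A) + mu(B) - mu(A and B) <= 1 holds."""
    no_overextension = mu_ab <= min(mu_a, mu_b)
    inclusion_exclusion = mu_a + mu_b - mu_ab <= 1.0
    return no_overextension and inclusion_exclusion

# Guppy-style overextension: the conjunction outscores a conjunct.
overextended = classical_measure_compatible(0.50, 0.70, 0.85)  # False
classical = classical_measure_compatible(0.60, 0.50, 0.40)     # True
```

Any triple failing either inequality is non-classical in the theorem's sense, which is why abundant overextension in the experimental data rules out a classical weight structure outright.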

    Meaning-focused and Quantum-inspired Information Retrieval

    In recent years, quantum-based methods have been promisingly integrated with traditional procedures in information retrieval (IR) and natural language processing (NLP). Inspired by our research on the identification and application of quantum structures in cognition, more specifically our work on the representation of concepts and their combinations, we put forward a 'quantum meaning based' framework for structured query retrieval in text corpora and standardized testing corpora. This scheme for IR rests on two basic notions: (i) 'entities of meaning', e.g., concepts and their combinations, and (ii) traces of such entities of meaning, which is how documents are considered in this approach. The meaning content of these 'entities of meaning' is reconstructed by solving an 'inverse problem' in the quantum formalism, consisting of reconstructing the full states of the entities of meaning from their collapsed states identified as traces in relevant documents. The advantages with respect to traditional approaches, such as Latent Semantic Analysis (LSA), are discussed by means of concrete examples.
    Comment: 11 pages

    Quantum Aspects of Semantic Analysis and Symbolic Artificial Intelligence

    Modern approaches to semantic analysis, if reformulated as Hilbert-space problems, reveal formal structures known from quantum mechanics. A similar situation is found in the distributed representations of cognitive structures developed for the purposes of neural networks. We take a closer look at the similarities and differences between these two fields and quantum information theory.
    Comment: version accepted in J. Phys. A (Letter to the Editor)