
    Inconsistent boundaries

    Research on this paper was supported by a grant from the Marsden Fund, Royal Society of New Zealand.
    Mereotopology is a theory of connected parts. The existence of boundaries, as parts of everyday objects, is basic to any such theory; but in classical mereotopology there is a problem: if boundaries exist, then either distinct entities cannot be in contact, or else space is not topologically connected (Varzi in Noûs 31:26–58, 1997). In this paper we urge that this problem can be met with a paraconsistent mereotopology, and sketch the details of one such approach. The resulting theory focuses attention on the role of empty parts in delivering a balanced and bounded metaphysics of naive space.

    Using Evolutionary Algorithms for Fitting High-Dimensional Models to Neuronal Data

    In the study of neuroscience, and of complex biological systems in general, there is frequently a need to fit mathematical models with large numbers of parameters to highly complex datasets. Here we consider algorithms of two different classes, gradient following (GF) methods and evolutionary algorithms (EA), and examine their performance in fitting a 9-parameter model of a filter-based visual neuron to real data recorded from a sample of 107 neurons in macaque primary visual cortex (V1). Although the GF method converged very rapidly on a solution, it was highly susceptible to the effects of local minima in the error surface and produced relatively poor fits unless the initial estimates of the parameters were already very good. Conversely, although the EA required many more iterations of evaluating the model neuron’s response to a series of stimuli, it ultimately found better solutions in nearly all cases, and its performance was independent of the starting parameters of the model. Thus, although the fitting process was lengthy in terms of processing time, the relative lack of human intervention in the evolutionary algorithm, and its ability ultimately to generate model fits that could be trusted as being close to optimal, made it far superior in this particular application to the gradient following methods. This is likely to be the case in many further complex systems, as are often found in neuroscience.
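The contrast between the two optimiser classes can be illustrated with a toy sketch. This is not the authors' 9-parameter V1 model: the two-parameter exponential, the (mu + lambda)-style truncation selection, and all settings below are illustrative assumptions. The key property echoed here is that the EA is started from random parameters rather than hand-picked initial estimates.

```python
import math
import random

random.seed(1)

# Hypothetical target parameters and synthetic, noise-free data
# (illustration only; not the filter-based V1 model from the abstract).
true_a, true_b = 2.0, 0.5
xs = [i * 0.5 for i in range(20)]
ys = [true_a * math.exp(-true_b * x) for x in xs]

def sse(params):
    """Sum of squared errors of y = a * exp(-b * x) against the data."""
    a, b = params
    return sum((a * math.exp(-b * x) - y) ** 2 for x, y in zip(xs, ys))

def evolve(pop_size=30, generations=200, sigma=0.1):
    # Random initialisation: unlike gradient following, the EA's success
    # should not hinge on good initial parameter estimates.
    pop = [(random.uniform(0, 5), random.uniform(0, 2)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sse)
        parents = pop[: pop_size // 5]  # truncation selection (elitist)
        children = [
            (p[0] + random.gauss(0, sigma), p[1] + random.gauss(0, sigma))
            for p in random.choices(parents, k=pop_size - len(parents))
        ]
        pop = parents + children
    return min(pop, key=sse)

best_a, best_b = evolve()
```

Because the best candidates are carried over unchanged each generation, the error of the best solution can only decrease; the cost, as in the abstract, is many more model evaluations than a gradient step would need.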

    The effects of clinical task interruptions on subsequent performance of a medication pre-administration task

    There is a surge of research exploring the role of task interruptions in the manifestation of primary task errors, both in controlled experimental settings and in safety-critical workplaces such as healthcare. Although such research provides valuable insights into the disruptive properties of task interruption, and into the importance of considering the likely disruptive consequences of clinical task interruptions in healthcare environments, there is an urgent need for an approach that best mimics complex working environments such as healthcare whilst allowing better control over experimental variables with minimal constraints. We propose that this can be achieved with ecologically sensitive experimental tasks designed to have high levels of experimental control, so that theoretical as well as practical parameters and factors can be tested. We developed a theoretically and ecologically informed procedural memory-based task - the CAMROSE Medication Pre-Administration Task. Results revealed significantly more sequence errors in the low-, moderate- and high-complexity interruption conditions compared to the no-interruption condition. There was no significant difference in non-sequence errors. Findings reveal the importance of developing ecologically valid tasks to explore non-observable characteristics of clinical task interruptions. Both theoretical and possible practical implications are discussed.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open and 30-day mortality (AUROC = 0.903, 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. 
It provides a tool for reporting operative findings, disease severity and technical difficulty and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
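One of the statistics used above, Kendall's tau between an ordinal grade and a dichotomous outcome, can be sketched from concordant and discordant pairs. This is the simple tau-a variant with no tie correction (standard packages usually report the tie-corrected tau-b), and the grades and outcomes below are made up for illustration.

```python
from itertools import combinations

def kendall_tau(xs, ys):
    """Kendall's tau-a: (concordant - discordant) pairs over all pairs."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1   # both variables move in the same direction
        elif s < 0:
            discordant += 1   # they move in opposite directions
    n = len(xs)
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical data: difficulty grade (1-4) vs. 30-day complication (0/1).
grades = [1, 1, 2, 3, 4, 4]
complications = [0, 0, 0, 1, 1, 1]
tau = kendall_tau(grades, complications)  # positive: worse outcomes at higher grades
```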

    How Humans Differ from Other Animals in Their Levels of Morphological Variation

    Animal species come in many shapes and sizes, as do the individuals and populations that make up each species. To us, humans might seem to show particularly high levels of morphological variation, but perhaps this perception is simply based on enhanced recognition of individual conspecifics relative to individual heterospecifics. We here more objectively ask how humans compare to other animals in terms of body size variation. We quantitatively compare levels of variation in body length (height) and mass within and among 99 human populations and 848 animal populations (210 species). We find that humans show low levels of within-population body height variation in comparison to body length variation in other animals. Humans do not, however, show distinctive levels of within-population body mass variation, nor of among-population body height or mass variation. These results are consistent with the idea that natural and sexual selection have reduced human height variation within populations, while maintaining it among populations. We therefore hypothesize that humans have evolved on a rugged adaptive landscape with strong selection for body height optima that differ among locations
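Within-population variation in traits with different means and units is typically compared with the coefficient of variation (SD divided by the mean), which underlies height-versus-mass contrasts of the kind described above. A minimal sketch with made-up samples (not the study's data):

```python
import statistics

def cv(values):
    # Coefficient of variation: sample SD as a fraction of the mean,
    # making variation comparable across traits with different scales.
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical adult samples (height in cm, mass in kg), illustration only.
heights = [158, 165, 170, 172, 175, 180, 168, 174]
masses = [55, 70, 82, 64, 90, 76, 60, 85]

# In humans, mass usually varies far more than height within a population.
height_cv = cv(heights)
mass_cv = cv(masses)
```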

    Anomalous visual experience is linked to perceptual uncertainty and visual imagery vividness

    An imbalance between top-down and bottom-up processing in perception (specifically, over-reliance on top-down processing) can lead to anomalous perception, such as illusions. One factor that may be involved in anomalous perception is visual mental imagery, which is the experience of “seeing” with the mind’s eye. There are vast individual differences in self-reported imagery vividness, and more vivid imagery is linked to a more sensory-like experience. We, therefore, hypothesized that susceptibility to anomalous perception is linked to individual imagery vividness. To investigate this, we adopted a paradigm that is known to elicit the perception of faces in pure visual noise (pareidolia). In four experiments, we explored how imagery vividness contributes to this experience under different response instructions and environments. We found strong evidence that people with more vivid imagery were more likely to see faces in the noise, although removing suggestive instructions weakened this relationship. Analyses from the first two experiments led us to explore confidence as another factor in pareidolia proneness. We, therefore, modulated environment noise and added a confidence rating in a novel design. We found strong evidence that pareidolia proneness is correlated with uncertainty about real percepts. Decreasing perceptual ambiguity abolished the relationship between pareidolia proneness and both imagery vividness and confidence. The results cannot be explained by incidental face-like patterns in the noise, individual variations in response bias, perceptual sensitivity, subjective perceptual thresholds, viewing distance, testing environments, motivation, gender, or prosopagnosia. This indicates a critical role of mental imagery vividness and perceptual uncertainty in anomalous perceptual experience. 
    ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (10.1007/s00426-020-01364-7) contains supplementary material, which is available to authorized users.

    Spatio-structural granularity of biological material entities

    Background: With the continuously increasing demands on knowledge and data management that databases have to meet, ontologies and the theories of granularity they use become more and more important. Unfortunately, currently used theories and schemes of granularity unnecessarily limit the performance of ontologies due to two shortcomings: (i) they do not allow the integration of multiple granularity perspectives into one granularity framework; and (ii) they are not applicable to cumulative-constitutively organized material entities, which cover most biomedical material entities.
    Results: The above-mentioned shortcomings are responsible for the major inconsistencies in currently used spatio-structural granularity schemes. Using the Basic Formal Ontology (BFO) as a top-level ontology and Keet's general theory of granularity, a granularity framework is presented that is applicable to cumulative-constitutively organized material entities. It provides a scheme for granulating complex material entities into their constitutive and regional parts by integrating various compositional and spatial granularity perspectives. Within a scale-dependent resolution perspective, it even allows different types of representations of the same material entity to be distinguished. Within other scale-dependent perspectives, which are based on specific types of measurements (e.g. weight, volume), instances of material entities can also be organized independently of their parthood relations, according to increasing measures alone. All granularity perspectives are connected to one another through overcrossing granularity levels, together forming an integrated whole that uses the compositional object perspective as an integrating backbone. This granularity framework allows structural granularity values to be consistently assigned to all different types of material entities.
    Conclusions: The framework presented here provides a spatio-structural granularity framework for all domain reference ontologies that model cumulative-constitutively organized material entities. Its multi-perspective approach allows an ontology stored in a database to be queried at one's desired level of detail: the contents of a database can be organized according to diverse granularity perspectives, which in turn provide different views on its content (i.e. data, knowledge), each organized into different levels of detail.

    The Past and Future of Evolutionary Economics : Some Reflections Based on New Bibliometric Evidence

    This document is the Accepted Manuscript version of the following article: Geoffrey M. Hodgson and Juha-Antti Lamberg, ‘The past and future of evolutionary economics: some reflections based on new bibliometric evidence’, Evolutionary and Institutional Economics Review, first online 20 June 2016. The final publication is available at Springer via doi: http://dx.doi.org/10.1007/s40844-016-0044-3 © Japan Association for Evolutionary Economics 2016.
    The modern wave of ‘evolutionary economics’ was launched with the classic study by Richard Nelson and Sidney Winter (1982). This paper reports a broad bibliometric analysis of ‘evolutionary’ research in the disciplines of management, business, economics, and sociology over 25 years from 1986 to 2010. It confirms that Nelson and Winter (1982) is an enduring nodal reference point for this broad field. The bibliometric evidence suggests that ‘evolutionary economics’ has benefitted from the rise of business schools and other interdisciplinary institutions, which have provided a home for evolutionary terminology, but it has failed to nurture a strong unifying core narrative or theory, which in turn could provide superior answers to important questions. This bibliometric evidence also shows that no strong cluster of general theoretical research immediately around Nelson and Winter (1982) has subsequently emerged. It identifies developmental problems in a partly successful but fragmented field. Future research in ‘evolutionary economics’ needs a more integrated research community with shared conceptual narratives and common research questions, to promote conversation and synergy between diverse clusters of research.

    Recombinational Landscape and Population Genomics of Caenorhabditis elegans

    Recombination rate and linkage disequilibrium, the latter a function of population genomic processes, are the critical parameters for mapping by linkage and association, and their patterns in Caenorhabditis elegans are poorly understood. We performed high-density SNP genotyping on a large panel of recombinant inbred advanced intercross lines (RIAILs) of C. elegans to characterize the landscape of recombination and, on a panel of wild strains, to characterize population genomic patterns. We confirmed that C. elegans autosomes exhibit discrete domains of nearly constant recombination rate, and we show, for the first time, that the pattern holds for the X chromosome as well. The terminal domains of each chromosome, spanning about 7% of the genome, exhibit effectively no recombination. The RIAILs exhibit a 5.3-fold expansion of the genetic map. With median marker spacing of 61 kb, they are a powerful resource for mapping quantitative trait loci in C. elegans. Among 125 wild isolates, we identified only 41 distinct haplotypes. The patterns of genotypic similarity suggest that some presumed wild strains are laboratory contaminants. The Hawaiian strain, CB4856, exhibits genetic isolation from the remainder of the global population, whose members exhibit ample evidence of intercrossing and recombining. The population effective recombination rate, estimated from the pattern of linkage disequilibrium, is correlated with the estimated meiotic recombination rate, but its magnitude implies that the effective rate of outcrossing is extremely low, corroborating reports of selection against recombinant genotypes. Despite the low population effective recombination rate and extensive linkage disequilibrium, methods that account for background levels of genomic similarity permit association mapping in wild C. elegans strains.
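Linkage disequilibrium between two biallelic loci, as used above to estimate the population effective recombination rate, is commonly summarised as r² = D² / (pA(1 − pA) pB(1 − pB)), where D = pAB − pA·pB. A minimal sketch from phased 0/1 haplotypes (the haplotype vectors are made up for illustration):

```python
def ld_r_squared(hap_a, hap_b):
    """r^2 between two biallelic loci, given phased 0/1 haplotypes."""
    n = len(hap_a)
    p_a = sum(hap_a) / n                 # allele-1 frequency at locus A
    p_b = sum(hap_b) / n                 # allele-1 frequency at locus B
    p_ab = sum(a and b for a, b in zip(hap_a, hap_b)) / n  # haplotype 1-1 freq
    d = p_ab - p_a * p_b                 # disequilibrium coefficient D
    return d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))

# Perfectly linked loci: every haplotype carries the same alleles at both.
full_ld = ld_r_squared([0, 0, 1, 1], [0, 0, 1, 1])

# One recombinant haplotype breaks the association and lowers r^2.
partial_ld = ld_r_squared([0, 0, 1, 1], [0, 1, 1, 1])
```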