
    Supporting resource-based analysis of task information needs

    We investigate an approach to modelling the dynamic information requirements of a user performing a number of tasks, addressing both the provision and representation of information and viewing the information as distributed across a set of resources. From knowledge of the resources available at the user interface and of the task information needs, we can identify whether the system provides the user with adequate support for task execution. We look at how tools can help reason about these issues, and illustrate their use through an example. We also consider the full range of analyses suggested by this approach that could potentially be supported by automated reasoning systems.

    Winnowing Wheat from Chaff: The Chunking GA

    In this work, we investigate the ability of a Chunking GA (ChGA) to reduce the size of variable-length chromosomes and control bloat. The ChGA consists of a standard genetic algorithm augmented by a communal building-block memory system and associated memory chromosomes and operators. A new m×n MaxSum fitness function used for this work is also described. Results show that a ChGA equipped with memory capacity equal to or greater than the minimal size of an optimal solution naturally eliminates unexpressed genes. © Springer-Verlag Berlin Heidelberg 2004
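    The ChGA and the MaxSum function are not specified in this abstract; as a minimal hypothetical sketch of the bloat problem it targets, consider a fitness function that scores only a fixed prefix of a variable-length chromosome, so every gene beyond that prefix is "unexpressed" and contributes nothing:

```python
import random

# Hypothetical illustration (not the paper's ChGA or MaxSum function):
# if fitness evaluates only the first K genes, genes past index K-1 are
# "unexpressed" bloat -- they lengthen the chromosome but never change
# its fitness.
K = 5  # assumed size of an optimal solution

def fitness(chromosome):
    return sum(chromosome[:K])  # trailing genes contribute nothing

short = [1, 1, 1, 1, 1]
bloated = short + [random.randint(0, 1) for _ in range(20)]
assert fitness(short) == fitness(bloated) == 5
```

    A memory-equipped GA such as the ChGA is reported to prune exactly these unexpressed genes once its memory capacity reaches the size of an optimal solution.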

    A prospective evaluation of the predictive value of faecal calprotectin in quiescent Crohn’s disease

    Background: The faecal calprotectin (FC) test is a non-invasive marker for gastrointestinal inflammation. Aim: To determine whether higher FC levels in individuals with quiescent Crohn's disease are associated with clinical relapse over the ensuing 12 months. Methods: A single-centre prospective study was undertaken in Crohn's disease patients in clinical remission attending for routine review. The receiver operating characteristic (ROC) curve for the primary endpoint of clinical relapse by 12 months, based on FC at baseline, was calculated. Kaplan-Meier curves of time to relapse were based on the resulting optimal FC cutoff for predicting relapse. Results: Of 97 patients recruited, 92 were either followed up for 12 months without relapsing or reached the primary endpoint within that period. Of these, 10 (11%) had relapsed by 12 months. The median FC was lower for non-relapsers, 96 µg/g (IQR 39-237), than for relapsers, 414 µg/g (IQR 259-590) (p=0.005). The area under the ROC curve to predict relapse using FC was 77.4%. An optimal cutoff FC value of 240 µg/g to predict relapse of quiescent Crohn's disease had a sensitivity of 80.0% and a specificity of 74.4%. The negative predictive value was 96.8% and the positive predictive value was 27.6%. FC ≥ 240 µg/g was associated with a likelihood of relapse 5.7 (95% CI 1.9-17.3) times higher within 2.3 years than lower values (p=0.002). Conclusions: In this prospective dataset, FC appears to be a useful, non-invasive tool to help identify quiescent Crohn's disease patients at a low risk of relapse over the ensuing 12 months. FC of 240 µg/g was the optimal cutoff in this cohort.
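    The reported predictive values can be cross-checked from the abstract's counts. The sketch below assumes the 2×2 table implied by 92 analysed patients, 10 relapsers, and the stated sensitivity and specificity (8 true positives, 2 false negatives, 61 true negatives, 21 false positives); these counts are reconstructed, not quoted from the paper:

```python
# Reconstructed 2x2 table at the 240 ug/g cutoff (assumed from the abstract):
tp, fn = 8, 2    # relapsers with FC >= 240 / relapsers below 240
tn, fp = 61, 21  # non-relapsers below 240 / non-relapsers with FC >= 240

sensitivity = tp / (tp + fn)  # 0.800
specificity = tn / (tn + fp)  # ~0.744
ppv = tp / (tp + fp)          # ~0.276
npv = tn / (tn + fn)          # ~0.968
print(f"{sensitivity:.1%} {specificity:.1%} {ppv:.1%} {npv:.1%}")
# prints: 80.0% 74.4% 27.6% 96.8%
```

    These recomputed values agree with the sensitivity, specificity, PPV, and NPV quoted in the abstract.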

    A systematic review examining the relationship between cytokines and cachexia in incurable cancer

    Cancer cachexia is an unmet clinical need that affects more than 50% of patients with cancer. The systemic inflammatory response, which is mediated by a network of cytokines, has an established role in the genesis and maintenance of cancer as well as in cachexia; yet, the specific role of the cytokine milieu in cachexia requires elucidation. This systematic review aims to examine the relationship between cytokines and the cachexia syndrome in patients with incurable cancer. The databases MEDLINE, EMBASE, CINAHL, CENTRAL, PsycINFO, and Web of Science were searched for studies published between 01/01/2004 and 06/01/2020. Included studies measured cytokines and their relationship with cachexia and related symptoms/signs in adults with incurable cancer. After title screening (n = 5202), the abstracts (n = 1264) and the full-text studies (n = 322) were reviewed independently by two authors. The quality assessment of the selected papers was conducted using the modified Downs and Black checklist. Overall, 1277 patients with incurable cancer and 155 healthy controls were analysed in the 17 eligible studies. The mean age of the patients was 64 ± 15 (mean ± standard deviation). Only 34% of included participants were female. The included studies were assessed as moderate-quality to high-quality evidence (mean quality score: 7.8; range: 5–10). A total of 31 cytokines were examined in this review, of which interleukin-6 (IL-6, 14 studies) and tumour necrosis factor-α (TNF-α, 12 studies) were the most common. The definitions of cachexia and the weight-loss thresholds were highly variable across studies. Although the data could not be meta-analysed due to the high degree of methodological heterogeneity, the findings were discussed in a systematic manner. IL-6, TNF-α, and IL-8 were greater in cachectic patients compared with healthy individuals. Also, IL-6 levels were higher in cachectic participants as opposed to non-cachectic patients. 
Leptin, interferon-γ, IL-1β, IL-10, adiponectin, and ghrelin did not demonstrate any significant difference between groups when individuals with cancer cachexia were compared against non-cachectic patients or healthy participants. These findings suggest that a network of cytokines, most commonly IL-6, TNF-α, and IL-8, is associated with the development of cachexia. Yet this relationship has not been proven causative, and future studies should opt for longitudinal designs with consistent methodological approaches, as well as adequate techniques for analysing and reporting the results.

    Monte Carlo Methods for Estimating Interfacial Free Energies and Line Tensions

    Excess contributions to the free energy due to interfaces occur for many problems encountered in the statistical physics of condensed matter when coexistence between different phases is possible (e.g. wetting phenomena, nucleation, crystal growth, etc.). This article reviews two methods to estimate both interfacial free energies and line tensions by Monte Carlo simulations of simple models (e.g. the Ising model, a symmetrical binary Lennard-Jones fluid exhibiting a miscibility gap, and a simple Lennard-Jones fluid). One method is based on thermodynamic integration. This method is useful for studying flat and inclined interfaces in Ising lattices, also allowing the estimation of line tensions of three-phase contact lines where the interfaces meet walls (at which "surface fields" may act). A generalization to off-lattice systems is described as well. The second method is based on sampling the order parameter distribution of the system throughout the two-phase coexistence region of the model. Both the interfacial free energies of flat interfaces and those of (spherical or cylindrical) droplets (or bubbles) can be estimated, including systems with walls, where sphere-cap-shaped wall-attached droplets occur. The curvature dependence of the interfacial free energy is discussed, and estimates for the line tensions are compared to results from the thermodynamic integration method. Basic limitations of all these methods are critically discussed, and an outlook on other approaches is given.
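    As a generic sketch of the thermodynamic-integration idea (not the paper's specific models), the free-energy difference is obtained by integrating an ensemble average ⟨dU/dλ⟩ over a coupling parameter λ; here an assumed analytic integrand stands in for the Monte Carlo estimate:

```python
# Generic thermodynamic integration: Delta F = integral_0^1 <dU/dlambda> dlambda.
# The Monte Carlo ensemble average is replaced by a toy analytic stand-in.
def mean_dU_dlambda(lam):
    return 2.0 * lam  # assumed integrand, for illustration only

n = 1000
lams = [i / n for i in range(n + 1)]
# trapezoidal quadrature along the integration path
delta_F = sum((mean_dU_dlambda(a) + mean_dU_dlambda(b)) / 2 * (b - a)
              for a, b in zip(lams, lams[1:]))
assert abs(delta_F - 1.0) < 1e-9  # exact for a linear integrand
```

    In an actual simulation, each ⟨dU/dλ⟩ value would come from a separate equilibrated Monte Carlo run at fixed λ, which is what makes the method costly but systematically controllable.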

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to "think" as humans do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification, and Information-Based Complexity.