
    Ageing as a price of cooperation and complexity: Self-organization of complex systems causes the ageing of constituent networks

    The analysis of network topology and dynamics is increasingly used to describe the structure, function and evolution of complex systems. Here we summarize key aspects of the evolvability and robustness of the hierarchical network-set of macromolecules, cells, organisms, and ecosystems. Listing the costs and benefits of cooperation as a necessary behaviour for building this network hierarchy, we outline the major hypothesis of the paper: the emergence of hierarchical complexity requires cooperation, which leads to the ageing of the constituent networks. Local cooperation in a stable environment may lead to over-optimization, developing an ‘always-old’ network that ages slowly and dies in an apoptosis-like process. Global cooperation, by exploring a rapidly changing environment, may cause an occasional over-perturbation that exhausts system resources, causing rapid degradation, ageing and death of an otherwise ‘forever-young’ network in a necrosis-like process. Giving a number of examples, we explain how local and global cooperation can both evoke and help successful ageing. Finally, we show how various forms of cooperation, and the consequent ageing, emerge as key elements in all major steps of evolution, from the formation of protocells to the establishment of the globalized, modern human society. Thus, ageing emerges as a price of complexity, which goes hand in hand with cooperation, the two enhancing each other in a successful community.

    Design efficiency for non-market valuation with choice modelling: how to measure it, what to report and why

    We review the basic principles for the evaluation of design efficiency in discrete choice modelling, with a focus on the efficiency of WTP estimates from the multinomial logit model. The discussion is developed under the realistic assumption that researchers can plausibly define a prior on the utility coefficients. Some new measures of design performance in applied studies are proposed and their rationale discussed. An empirical example based on the generation and comparison of fifteen separate designs from a common set of assumptions illustrates the relevant considerations in the context of non-market valuation, with particular emphasis placed on C-efficiency. Conclusions are drawn for the practice of reporting in non-market valuation and for future work on design research.
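    A minimal sketch of how such design-performance measures can be computed under a coefficient prior (an illustration under our own assumptions, not the paper's code): the multinomial-logit information matrix of a candidate design gives the asymptotic variance-covariance matrix, from which a D-error and a delta-method variance of a WTP ratio (a simple C-type criterion) follow. The toy design, attribute labels and prior values below are made up.
```python
# Illustrative sketch (assumptions, not the paper's code): D-error and a
# delta-method C-type error for a candidate MNL design under a prior on beta.
import numpy as np

def mnl_information(design, beta):
    """design: array of shape (choice_sets, alternatives, attributes)."""
    K = design.shape[2]
    info = np.zeros((K, K))
    for X in design:                         # X: alternatives x attributes
        v = X @ beta
        p = np.exp(v - v.max())
        p /= p.sum()                         # choice probabilities under the prior
        Z = X - p @ X                        # deviations from the weighted mean
        info += Z.T @ (p[:, None] * Z)
    return info

def design_errors(design, beta, attr=0, cost=1):
    avc = np.linalg.inv(mnl_information(design, beta))   # asymptotic var-cov matrix
    K = avc.shape[0]
    d_error = np.linalg.det(avc) ** (1.0 / K)            # D-error
    g = np.zeros(K)                                      # gradient of WTP = -b_attr / b_cost
    g[attr] = -1.0 / beta[cost]
    g[cost] = beta[attr] / beta[cost] ** 2
    c_error = g @ avc @ g                                # delta-method Var(WTP)
    return d_error, c_error

# Toy design: 3 choice sets, 2 alternatives, attributes = (quality, cost)
design = np.array([[[1.0, 2.0], [0.0, 1.0]],
                   [[1.0, 4.0], [0.0, 2.0]],
                   [[0.0, 3.0], [1.0, 5.0]]])
beta_prior = np.array([0.8, -0.4])                       # assumed coefficient prior
print(design_errors(design, beta_prior))
```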

    Optimal transport for automatic alignment of untargeted metabolomic data

    Untargeted metabolomic profiling through liquid chromatography-mass spectrometry (LC-MS) measures a vast array of metabolites within biospecimens, advancing drug development, disease diagnosis, and risk prediction. However, the low throughput of LC-MS poses a major challenge for biomarker discovery, annotation, and experimental comparison, necessitating the merging of multiple datasets. Current data pooling methods encounter practical limitations due to their vulnerability to data variations and hyperparameter dependence. Here we introduce GromovMatcher, a flexible and user-friendly algorithm that automatically combines LC-MS datasets using optimal transport. By capitalizing on feature intensity correlation structures, GromovMatcher delivers superior alignment accuracy and robustness compared to existing approaches. This algorithm scales to thousands of features requiring minimal hyperparameter tuning. Applying our method to experimental patient studies of liver and pancreatic cancer, we discover shared metabolic features related to patient alcohol intake, demonstrating how GromovMatcher facilitates the search for biomarkers associated with lifestyle risk factors linked to several cancer types.
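    The underlying idea can be illustrated in a few lines (a sketch under our own assumptions using the POT library, not the GromovMatcher code itself): the feature-by-feature intensity-correlation matrices of two studies act as the structure matrices of a Gromov-Wasserstein problem, and the resulting coupling suggests which features correspond. The simulated data and the match threshold below are made up.
```python
# Illustrative sketch (not the GromovMatcher implementation): matching LC-MS
# features across two studies by aligning feature-correlation structures with
# Gromov-Wasserstein optimal transport (POT library).  Data are simulated.
import numpy as np
import ot

rng = np.random.default_rng(1)
loadings = rng.normal(size=(5, 30))                      # shared latent structure
study_a = rng.normal(size=(100, 5)) @ loadings + 0.3 * rng.normal(size=(100, 30))
study_b = rng.normal(size=(120, 5)) @ loadings[:, :25] + 0.3 * rng.normal(size=(120, 25))

# Structure matrices: absolute feature-feature Pearson correlations
C1 = np.abs(np.corrcoef(study_a, rowvar=False))          # 30 x 30
C2 = np.abs(np.corrcoef(study_b, rowvar=False))          # 25 x 25

# Uniform mass over the features of each study
p = np.full(C1.shape[0], 1.0 / C1.shape[0])
q = np.full(C2.shape[0], 1.0 / C2.shape[0])

# Gromov-Wasserstein coupling between the two correlation structures
T = ot.gromov.gromov_wasserstein(C1, C2, p, q, 'square_loss')

# Crude read-out: feature i in study A matches argmax_j T[i, j] when the mass
# is well above the uniform background (the factor of 5 is an arbitrary choice)
background = 1.0 / (T.shape[0] * T.shape[1])
matches = [(i, int(T[i].argmax())) for i in range(T.shape[0])
           if T[i].max() > 5 * background]
print(f"{len(matches)} candidate feature matches, e.g. {matches[:3]}")
```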

    Geodesic Sinkhorn: optimal transport for high-dimensional datasets

    Understanding the dynamics and reactions of cells from population snapshots is a major challenge in single-cell transcriptomics. Here, we present Geodesic Sinkhorn, a method for interpolating populations along a data manifold that leverages existing kernels developed for single-cell dimensionality reduction and visualization methods. Our Geodesic Sinkhorn method uses a heat-geodesic ground distance that, as compared to Euclidean ground distances, is more accurate for interpolating single-cell dynamics on a wide variety of datasets and significantly speeds up the computation for sparse kernels. We first apply Geodesic Sinkhorn to 10 single-cell transcriptomics time series interpolation datasets as a drop-in replacement for existing interpolation methods where it outperforms on all datasets, showing its effectiveness in modeling cell dynamics. Second, we show how to efficiently approximate the operator with polynomial kernels allowing us to improve scaling to large datasets. Finally, we define the conditional Wasserstein-average treatment effect and show how it can elucidate the treatment effect on single-cell populations on a drug screen.
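    A minimal sketch of the core idea (our own illustration, not the authors' code): in Sinkhorn's scaling iterations, the Gibbs kernel exp(-C/ε) is replaced by a graph heat kernel exp(-tL) built from a kNN graph of the data, so the ground distance follows the data manifold. Parameter values and the toy data below are assumptions.
```python
# Illustrative sketch (not the paper's implementation): Sinkhorn scalings with
# a graph heat kernel exp(-t L) in place of the Gibbs kernel exp(-C / eps),
# i.e. a heat-geodesic ground cost.  All parameter values are illustrative.
import numpy as np
from scipy.sparse import csgraph
from scipy.sparse.linalg import expm_multiply
from sklearn.neighbors import kneighbors_graph

def heat_sinkhorn(X, a, b, t=1.0, knn=10, n_iter=200):
    """Couple distributions a and b supported on the rows of X."""
    W = kneighbors_graph(X, n_neighbors=knn, mode='connectivity')
    W = 0.5 * (W + W.T)                          # symmetrise the kNN graph
    L = csgraph.laplacian(W, normed=True)
    apply_K = lambda v: np.maximum(expm_multiply(-t * L, v), 1e-300)

    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):                      # standard Sinkhorn scalings
        u = a / apply_K(v)
        v = b / apply_K(u)                       # heat kernel is symmetric
    return u, v, apply_K                         # coupling = diag(u) K diag(v)

# Toy "two snapshots": mass on the first vs the second half of the points
X = np.random.default_rng(0).normal(size=(200, 20))
a = np.r_[np.ones(100), np.zeros(100)]; a /= a.sum()
b = np.r_[np.zeros(100), np.ones(100)]; b /= b.sum()
u, v, apply_K = heat_sinkhorn(X, a, b)
print("max marginal error:", float(np.abs(u * apply_K(v) - a).max()))
```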

    Tourism & Ageing Conference: abstract book


    Spatiotemporal variation of taxi demand

    The growth of urban areas has made taxi services increasingly popular due to their ubiquity and flexibility compared with more rigid public transportation modes. However, in big cities the taxi service is still unbalanced, resulting in inefficiencies such as long waiting times and excessive vacant trips. This paper presents an exploratory taxi fleet service analysis and compares two forecast models aimed at predicting the spatiotemporal variation of short-term taxi demand. We used a large sample of more than 1 million trips between 2014 and 2017, representing roughly 10% of Lisbon’s fleet. We analysed the spatiotemporal variation between pick-up and drop-off locations and how it is affected by weather conditions and points of interest. Moreover, based on historical data, we built two models to predict demand, an ARIMA model and an Artificial Neural Network (ANN), and evaluated and compared their performance. This study not only allows the direct comparison of a linear statistical model with a machine learning one, but also leads to a better understanding of the complex interactions surrounding different urban data sources, using the taxi service as a probe to better understand urban mobility-on-demand and its needs.
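    To make the model comparison concrete, here is a hedged sketch (simulated series, illustrative orders and lag features; not the paper's data or specification) of fitting an ARIMA model and a small feed-forward neural network to an hourly demand series and comparing one-day-ahead forecasts.
```python
# Illustrative sketch (not the paper's models): ARIMA vs a small neural network
# on a simulated hourly taxi-demand series; order and lag choices are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
hours = np.arange(24 * 90)                               # ~3 months, hourly
demand = (50 + 30 * np.sin(2 * np.pi * hours / 24)       # daily cycle
          + 10 * np.sin(2 * np.pi * hours / (24 * 7))    # weekly cycle
          + rng.normal(0, 5, hours.size))
train, test = demand[:-24], demand[-24:]                  # hold out the last day

# Linear benchmark: seasonal ARIMA
arima = ARIMA(train, order=(2, 0, 1), seasonal_order=(1, 0, 0, 24)).fit()
arima_pred = arima.forecast(steps=24)

# ANN: feed-forward net on lagged demand (previous 24 hours as inputs)
lags = 24
X = np.stack([train[i - lags:i] for i in range(lags, len(train))])
y = train[lags:]
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, y)

history = list(train[-lags:])
ann_pred = []
for _ in range(24):                                       # iterated one-step forecasts
    nxt = ann.predict(np.array(history[-lags:])[None])[0]
    ann_pred.append(nxt)
    history.append(nxt)

rmse = lambda p: float(np.sqrt(np.mean((np.asarray(p) - test) ** 2)))
print(f"ARIMA RMSE: {rmse(arima_pred):.2f}   ANN RMSE: {rmse(ann_pred):.2f}")
```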

    Electoral Competition as a Determinant of Fiscal Decentralization

    Fiscal decentralization is high on the agenda in policy fora. This paper empirically investigates the underlying causes of fiscal decentralization, based on the predictions of a simple political economy model. We argue that the likelihood that a central government engages in devolution of powers depends in important ways on the political forces that it faces, the theory's main insight being that the central government's electoral strength should, all else being equal, decrease that government's share of spending. Consistent with the model's predictions, empirical results from a panel of democracies support the relevance of political factors as determinants of fiscal decentralization. The relationship between central government electoral strength and both expenditure and revenue centralization emerges as negative and non-linear. Keywords: fiscal decentralization; fiscal federalism; vertical interactions; partial decentralization; elections
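    The kind of specification described can be sketched as follows (simulated data and assumed variable names, not the paper's dataset or exact model): the central government's spending share is regressed on electoral strength and its square, with country and year fixed effects, allowing a negative, non-linear relationship.
```python
# Illustrative sketch (not the paper's estimation): fixed-effects panel
# regression of the central spending share on electoral strength and its
# square.  Data are simulated; variable names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
countries, years = 30, 20
df = pd.DataFrame({
    "country": np.repeat(np.arange(countries), years),
    "year": np.tile(np.arange(years), countries),
    "strength": rng.uniform(0.3, 0.9, countries * years),   # e.g. seat share
})
# Simulated negative, non-linear effect plus noise
df["central_share"] = (0.8 - 0.5 * df.strength + 0.2 * df.strength ** 2
                       + rng.normal(0, 0.03, len(df)))

fe = smf.ols("central_share ~ strength + I(strength**2) + C(country) + C(year)",
             data=df).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
print(fe.params.filter(like="strength"))
```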

    Effectiveness and cost-effectiveness of a novel, group self-management course for adults with chronic musculoskeletal pain: study protocol for a multicentre, randomised controlled trial (COPERS)

    Introduction: Chronic musculoskeletal pain is a common condition that often responds poorly to treatment. Self-management courses have been advocated as a non-drug pain management technique, although evidence for their effectiveness is equivocal. We designed and piloted a self-management course based on evidence for the effectiveness of specific course components and characteristics. Methods/analysis: COPERS (coping with persistent pain, effectiveness research into self-management) is a pragmatic randomised controlled trial testing the effectiveness and cost-effectiveness of an intensive, group, cognitive behavioural-based, theoretically informed and manualised self-management course for chronic pain patients against a control of best usual care: a pain education booklet and a relaxation CD. The course lasts for 15 h, spread over 3 days, with a 2 h follow-up session 2 weeks later. We aim to recruit 685 participants with chronic musculoskeletal pain from primary, intermediate and secondary care services in two UK regions. The study is powered to show a standardised mean difference of 0.3 in the primary outcome, pain-related disability. Secondary outcomes include generic health-related quality of life, healthcare utilisation, pain self-efficacy, coping, depression, anxiety and social engagement. Outcomes are measured at 6 and 12 months postrandomisation. Pain self-efficacy is measured at 3 months to assess whether change mediates clinical effect. Ethics/dissemination: Ethics approval was given by Cambridgeshire Ethics 11/EE/046. This trial will provide robust data on the effectiveness and cost-effectiveness of an evidence-based, group self-management programme for chronic musculoskeletal pain. The published outcomes will help to inform future policy and practice around such self-management courses, both nationally and internationally. Trial registration: ISRCTN24426731
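    As a rough check of the stated power calculation (assuming 80% power and a 5% two-sided alpha, neither of which is given in this abstract), a two-sample comparison at a standardised mean difference of 0.3 needs roughly 175 participants per arm before adjusting for clustering in group delivery and attrition, consistent with a recruitment target of 685:
```python
# Rough check of the power claim; 80% power and 5% two-sided alpha are our
# assumptions, not stated in the abstract.  Clustering/attrition not included.
from statsmodels.stats.power import TTestIndPower

n_per_arm = TTestIndPower().solve_power(effect_size=0.3, alpha=0.05, power=0.8)
print(f"unadjusted sample size: ~{n_per_arm:.0f} per arm, ~{2 * n_per_arm:.0f} total")
```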