
    The CCD and readout electronics for the OMC instrument on Integral

    The Optical Monitoring Camera (OMC) on ESA's Integral gamma-ray astronomy satellite is devoted to optical wavelength observations simultaneously covering the same field-of-view as the gamma-ray and X-ray instruments. The OMC consists of a refracting telescope with a CCD as the imaging device in the focal plane. Here we describe the CCD and its associated readout electronics, in particular pointing out features of interest to users of the OMC instrument and its data.

    Computing Utopia: The Horizons of Computational Economies in History and Science Fiction

    This article connects the recent flourishing of economic science fiction with the increasing technicity of contemporary financial markets, to pose questions about computational economies, both historical and fictional, and their ambiguous utopian currents. It explores examples of computational economies and societies in which economic resources are largely defined and allocated by computational systems to challenge—if not entirely dispel—assumptions about the inextricability of computation and the dystopian specters of capitalism, authoritarianism, and totalitarianism. The article puts insights from the histories of cybernetics, computer science, and economics into dialogue with sf novels that experiment with different sociopolitical configurations of computational economies. The novels that are the primary focus of the discussion are The Dispossessed: An Ambiguous Utopia (1974) by Ursula K. Le Guin and If Then (2015) by Matthew De Abaitua. The article concludes with some thoughts about the use of history and fiction for expanding the imaginative horizons of the computable in economics.

    Communicating climate risk: a toolkit

    The Communicating Climate Risk toolkit draws together best practice on the effective communication of climate information from across STEM, social sciences, and arts and humanities. It provides users with insights, recommendations, and tools for all forms of climate-related communication and decision-making, and identifies open problems.

    A probabilistic analysis of argument cogency

    This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency. Treating a natural language argument as a reason-claim-complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, and may indeed serve to correct, the informal understanding and applications of the RSA criteria concerning their conceptual dependence, their function as update-thresholds, and their status as obligatory rather than permissive norms, but also show how these formal and informal normative approaches can in fact align.
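
    A minimal sketch, not taken from the paper, of the kind of Bayesian update this RSA analysis suggests: sensitivity is read here as P(reason | claim), selectivity as 1 - P(reason | not-claim), and all thresholds and numbers are illustrative assumptions.

```python
# Hypothetical illustration of a probabilistic reading of the RSA conditions.
# "Sensitivity" is taken as P(R | C) and "selectivity" as 1 - P(R | not C);
# these readings, thresholds, and numbers are assumptions, not the paper's.

def posterior_commitment(prior_c, sensitivity, selectivity):
    """Bayesian update of commitment to claim C after accepting reason R."""
    p_r_given_c = sensitivity
    p_r_given_not_c = 1.0 - selectivity        # false-positive rate of the reason
    p_r = p_r_given_c * prior_c + p_r_given_not_c * (1.0 - prior_c)
    return p_r_given_c * prior_c / p_r

def cogent(prior_c, sensitivity, selectivity, reason_commitment, tau_r=0.5, tau_c=0.9):
    """Toy RSA check: acceptability, relevance, and sufficiency as threshold tests."""
    post = posterior_commitment(prior_c, sensitivity, selectivity)
    acceptable = reason_commitment >= tau_r    # commitment to the reason clears its threshold
    relevant = post > prior_c                  # the reason raises commitment to the claim
    sufficient = post >= tau_c                 # updated commitment clears the claim threshold
    return acceptable and relevant and sufficient, post

ok, post = cogent(prior_c=0.4, sensitivity=0.9, selectivity=0.8, reason_commitment=0.7)
print(f"cogent: {ok}, posterior commitment to the claim: {post:.2f}")
```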

    Argument mining: A machine learning perspective

    Argument mining has recently become a hot topic, attracting the interest of several diverse research communities, ranging from artificial intelligence and computational linguistics to natural language processing and the social and philosophical sciences. In this paper, we attempt to describe the problems and challenges of argument mining from a machine learning angle. In particular, we advocate that machine learning techniques have so far been under-exploited, and that a more rigorous standardization of the problem, also with regard to the underlying argument model, could provide a crucial element for developing better systems.
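
    As one concrete illustration of this machine learning framing (not a system from the paper), a core argument mining subtask, classifying sentences as claims, premises, or non-argumentative text, can be cast as supervised text classification; the tiny dataset, labels, and model choice below are purely illustrative assumptions.

```python
# Illustrative-only sketch: argument component classification as supervised
# text classification. The sentences and labels are made up for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

sentences = [
    "We should ban single-use plastics.",                       # claim
    "Plastic waste is a major source of ocean litter.",         # premise
    "The meeting starts at nine.",                              # non-argumentative
    "University fees ought to be abolished.",                   # claim
    "Graduates from poorer families carry the highest debt.",   # premise
    "The report was published last year.",                      # non-argumentative
]
labels = ["claim", "premise", "none", "claim", "premise", "none"]

# A bag-of-words baseline; richer argument models would add structure on top.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(sentences, labels)
print(model.predict(["Fossil fuel subsidies should be phased out."]))
```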

    Traffic-related pollution and asthma prevalence in children. Quantification of associations with nitrogen dioxide.

    Ambient nitrogen dioxide is a widely available measure of traffic-related air pollution and is inconsistently associated with the prevalence of asthma symptoms in children. The use of this relationship to evaluate the health impact of policies affecting traffic management and traffic emissions is limited by the lack of a concentration-response function based on systematic review and meta-analysis of relevant studies. Using systematic methods, we identified papers containing quantitative estimates for nitrogen dioxide and the 12-month period prevalence of asthma symptoms in children in which the exposure contrast was within-community and dominated by traffic pollution. One estimate was selected from each study according to an a priori algorithm. Odds ratios were standardised to a 10 μg/m³ increment and summary estimates were obtained using random- and fixed-effects models. Eighteen studies were identified. Concentrations of nitrogen dioxide were estimated for the home address (12) and/or school (8) using a range of methods: land use regression (6), study monitors (6), dispersion modelling (4) and interpolation (2). Fourteen studies showed positive associations but only two associations were statistically significant at the 5% level. There was moderate heterogeneity (I² = 32.8%) and the random-effects estimate for the odds ratio was 1.06 (95% CI 1.00 to 1.11). There was no evidence of small study bias. Individual studies tended to show only weak positive associations between nitrogen dioxide and asthma prevalence, but the summary estimate bordered on statistical significance at the 5% level. Although small, the potential impact on asthma prevalence could be considerable because of the high baseline prevalence in many cities. Whether the association is causal or reflects the effects of a correlated pollutant or other confounders, the estimate obtained by the meta-analysis would be appropriate for estimating the impact of traffic pollution on asthma prevalence.
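
    A brief sketch of the two calculations described above: rescaling a reported odds ratio to a common 10 μg/m³ increment, and pooling log odds ratios with fixed-effect and DerSimonian-Laird random-effects weights together with I² heterogeneity. The example odds ratios and standard errors are invented for illustration and are not the eighteen reviewed studies.

```python
# Sketch of the meta-analytic arithmetic; all input numbers are invented.
import math

def rescale_or(or_reported, increment, target=10.0):
    """Rescale an odds ratio reported per `increment` ug/m3 to per `target` ug/m3."""
    return math.exp(math.log(or_reported) * target / increment)

def pool(ors, ses):
    """Pool log odds ratios given their standard errors (fixed and random effects)."""
    logs = [math.log(o) for o in ors]
    w = [1.0 / s ** 2 for s in ses]                         # inverse-variance (fixed) weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))
    df = len(ors) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0     # heterogeneity, %
    tau2 = max(0.0, (q - df) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_re = [1.0 / (s ** 2 + tau2) for s in ses]             # random-effects weights
    random_ = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    ci = (math.exp(random_ - 1.96 * se_re), math.exp(random_ + 1.96 * se_re))
    return math.exp(fixed), math.exp(random_), ci, i2

# Example: a study reporting OR 1.15 per 20 ug/m3, rescaled to per 10 ug/m3.
print(rescale_or(1.15, increment=20))
# Example pooling of five invented per-10 ug/m3 ORs and standard errors of log OR.
print(pool([1.02, 1.08, 1.11, 0.98, 1.05], [0.04, 0.05, 0.06, 0.05, 0.04]))
```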

    Hybrid of swarm intelligent algorithms in medical applications

    In this paper, we designed hybrids of swarm intelligence algorithms to diagnose hepatitis, breast tissue, and dermatology conditions in patients. The effectiveness of hybrid swarm intelligence algorithms was studied because no single algorithm is effective in solving all types of problems. In this study, feed-forward and Elman recurrent neural networks (ERN) trained with swarm intelligence algorithms are used for the classification of the mentioned diseases. The capabilities of six global optimization learning algorithms were studied and their performances in training and testing were compared: a hybrid of the Cuckoo Search algorithm and Levenberg-Marquardt (LM) (CSLM), Cuckoo Search (CS) with backpropagation (BP) (CSBP), CS with ERN (CSERN), Artificial Bee Colony (ABC) with LM (ABCLM), ABC with BP (ABCBP), and Genetic Algorithm with BP (GANN). Comparative simulation results indicate that CSLM outperforms CSERN, GANN, ABCBP, ABCLM, and CSBP in classification accuracy and run time on the breast tissue dataset, whereas CSERN performs better than CSLM, GANN, ABCBP, ABCLM, and CSBP on the two remaining datasets.
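
    A toy sketch of the hybridisation idea described above: a population-based global search over network weights followed by local refinement. The Lévy-flight moves are heavily simplified, plain gradient descent stands in for Levenberg-Marquardt, and the data, network size, and hyperparameters are illustrative assumptions rather than the paper's settings.

```python
# Toy sketch of the hybrid scheme: a cuckoo-style global search over the
# weights of a small feed-forward network, followed by local refinement.
# Numerical gradient descent stands in for Levenberg-Marquardt here.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)        # toy XOR-like labels

D_IN, D_HID = 2, 5
N_W = D_IN * D_HID + D_HID + D_HID + 1           # total number of weights

def unpack(w):
    i = 0
    W1 = w[i:i + D_IN * D_HID].reshape(D_IN, D_HID); i += D_IN * D_HID
    b1 = w[i:i + D_HID]; i += D_HID
    W2 = w[i:i + D_HID]; i += D_HID
    return W1, b1, W2, w[i]

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(w):
    p = np.clip(forward(w, X), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Global phase: keep a population of candidate weight vectors ("nests") and
# propose heavy-tailed (Levy-like) moves around the current best solution.
nests = rng.normal(size=(15, N_W))
best = min(nests, key=loss)
for _ in range(200):
    candidate = best + 0.1 * rng.standard_cauchy(size=N_W)
    worst = max(range(len(nests)), key=lambda i: loss(nests[i]))
    if loss(candidate) < loss(nests[worst]):
        nests[worst] = candidate
    best = min(nests, key=loss)

# Local phase: numerical-gradient descent as a stand-in for LM refinement.
w = best.copy()
for _ in range(100):
    grad = np.array([(loss(w + 1e-4 * e) - loss(w - 1e-4 * e)) / 2e-4
                     for e in np.eye(N_W)])
    w -= 0.5 * grad

accuracy = np.mean((forward(w, X) > 0.5) == y)
print(f"final loss: {loss(w):.3f}, training accuracy: {accuracy:.2f}")
```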

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
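
    To make the point about distribution choice concrete, here is a minimal Monte Carlo sketch that is not taken from the paper: two lognormal assumptions for the toxicokinetic and toxicodynamic sub-factors share the same geometric mean but differ in spread, and the fraction of simulated combined factors covered by the default value of 100 changes accordingly. All parameter values are arbitrary illustrations.

```python
# Monte Carlo sketch: how much of the combined toxicokinetic x toxicodynamic
# variability the default factor of 100 covers depends on the distributions
# assumed for the sub-factors. Both options below share the same geometric
# mean and differ only in spread; all parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def coverage(sigma, geo_mean=4.0, default=100.0):
    """Fraction of simulated sub-factor products that the default factor covers."""
    tk = rng.lognormal(mean=np.log(geo_mean), sigma=sigma, size=n)  # toxicokinetic
    td = rng.lognormal(mean=np.log(geo_mean), sigma=sigma, size=n)  # toxicodynamic
    return np.mean(tk * td <= default)

for sigma in (0.4, 0.8):
    print(f"sigma={sigma}: the default factor of 100 covers {coverage(sigma):.1%} of products")
```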