
    Confidence in uncertainty: Error cost and commitment in early speech hypotheses

    © 2018 Loth et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Interactions with artificial agents often lack immediacy because agents respond more slowly than their users expect. Automatic speech recognisers introduce this delay by analysing a user's utterance only after it has been completed. The early, uncertain hypotheses of incremental speech recognisers can enable artificial agents to respond in a more timely manner. However, these hypotheses may change significantly with each update, so an already initiated action may turn out to be an error and incur its error cost. We investigated whether humans would use uncertain hypotheses for planning ahead and/or initiating their response. We designed a Ghost-in-the-Machine study in a bar scenario: a human participant controlled a bartending robot and perceived the scene only through its recognisers. The results showed that participants used uncertain hypotheses to select the best-matching action, comparable to computing the utility of dialogue moves. Participants weighed the available evidence against the error cost of their actions before initiating them. If the error cost was low, participants initiated their response on merely suggestive evidence; otherwise, they waited for additional, more confident hypotheses if they still had time to do so. If there was time pressure but little evidence, participants grounded their understanding with echo questions. These findings contribute to a psychologically plausible policy for human-robot interaction that enables artificial agents to respond under uncertainty in a more timely and socially appropriate manner.
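The decision pattern the study describes can be sketched as a small policy. This is an illustrative sketch, not the authors' implementation; the function name, thresholds, and cost values are all hypothetical.

```python
# Illustrative sketch (hypothetical thresholds, not the study's model):
# for each incremental ASR hypothesis, decide whether to act now,
# wait for a more confident update, or ground with an echo question.

def choose_move(confidence, error_cost, time_left,
                act_threshold=0.8, low_cost=0.2, deadline=1.0):
    """Return 'act', 'wait', or 'echo' for the current hypothesis."""
    if error_cost <= low_cost:
        return "act"    # cheap errors: commit on merely suggestive evidence
    if confidence >= act_threshold:
        return "act"    # strong evidence: commit despite the cost
    if time_left > deadline:
        return "wait"   # costly errors, time available: wait for updates
    return "echo"       # time pressure, weak evidence: ground verbally

print(choose_move(confidence=0.4, error_cost=0.1, time_left=2.0))  # act
print(choose_move(confidence=0.4, error_cost=0.9, time_left=2.0))  # wait
print(choose_move(confidence=0.4, error_cost=0.9, time_left=0.5))  # echo
```

The ordering of the branches mirrors the reported behaviour: error cost is evaluated first, evidence second, and grounding is the fallback under time pressure.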

    Association of Methylenetetrahydrofolate Reductase (MTHFR) 677 C > T gene polymorphism and homocysteine levels in psoriasis vulgaris patients from Malaysia: a case-control study

    Background: The methylenetetrahydrofolate reductase (MTHFR) enzyme catalyzes the reduction of 5,10-methylenetetrahydrofolate to 5-methyltetrahydrofolate, a methyl donor required for the conversion of homocysteine to methionine. The MTHFR 677 C > T mutation disrupts the enzyme's thermostability, leading to defective enzyme activity and dysregulated homocysteine levels. Methods: This case-control study (n = 367) investigated the association between the MTHFR gene polymorphism [NM_005957] and psoriasis vulgaris in the Malaysian population. Overnight fasting blood samples were collected from a subgroup of consenting psoriasis vulgaris patients and matched controls (n = 84) for the quantification of homocysteine, vitamin B12, and folic acid levels. Results: There was no significant increase of the MTHFR 677 C > T mutation in patients with psoriasis vulgaris compared with controls (χ² = 0.733, p = 0.392). No significant association between homocysteine levels and the MTHFR gene polymorphism was observed in cases or controls (F = 0.91, df = 3, 80, p = 0.44). However, homocysteine levels in cases were negatively correlated with vitamin B12 (r = -0.173) and folic acid (r = -0.345) levels; vitamin B12 and folic acid levels in cases were also negatively correlated (r = -0.164). Conclusions: Our results indicate no significant association between the MTHFR gene polymorphism and psoriasis vulgaris in the Malaysian population, and no significant increase of plasma homocysteine levels in the psoriasis patients compared to the controls.
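The genotype-association test reported above is a Pearson chi-square on a case/control contingency table. The sketch below implements that statistic from scratch; the counts are hypothetical, not the study's data (the study reported χ² = 0.733).

```python
# Hedged sketch: Pearson chi-square statistic for testing association
# between genotype carriage and case/control status. The table below
# uses made-up counts, not the study's data.

def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# rows: cases, controls; columns: CC, CT+TT carriers (hypothetical counts)
print(round(chi_square([[120, 60], [130, 57]]), 3))
```

The statistic is then compared against a chi-square distribution with (rows − 1) × (columns − 1) degrees of freedom to obtain the p-value.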

    Temporal Information Processing in Short- and Long-Term Memory of Patients with Schizophrenia

    Cognitive deficits of patients with schizophrenia have been largely recognized as core symptoms of the disorder. One neglected factor that contributes to these deficits is the comprehension of time. In the present study, we assessed temporal information processing and manipulation from short- and long-term memory in 34 patients with chronic schizophrenia and 34 matched healthy controls. On the short-term memory temporal-order reconstruction task, an incidental or intentional learning strategy was deployed. Patients showed worse overall performance than healthy controls. The intentional learning strategy led to dissociable performance improvement in both groups. Whereas healthy controls improved on a performance measure (serial organization), patients improved on an error measure (inappropriate semantic clustering) when using the intentional instead of the incidental learning strategy. On the long-term memory script-generation task, routine and non-routine events of everyday activities (e.g., buying groceries) had to be generated in either chronological or inverted temporal order. Patients were slower than controls at generating events in the chronological routine condition only. They also committed more sequencing and boundary errors in the inverted conditions. The number of irrelevant events was higher in patients in the chronological, non-routine condition. These results suggest that patients with schizophrenia imprecisely access temporal information from short- and long-term memory. In short-term memory, processing of temporal information led to a reduction in errors rather than, as was the case in healthy controls, to an improvement in temporal-order recall. When accessing temporal information from long-term memory, patients were slower and committed more sequencing, boundary, and intrusion errors. 
Together, these results suggest that patients can access and process temporal information only imprecisely, providing evidence for impaired time comprehension. This could contribute to symptomatic cognitive deficits and strategic inefficiency in schizophrenia.
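Temporal-order reconstruction performance of the kind described above is often scored by comparing the recalled sequence against the true one. A minimal sketch of one such score, the fraction of item pairs recalled in the correct relative order (the function name and toy events are hypothetical, not the study's materials):

```python
# Illustrative scoring sketch (hypothetical, not the study's measure):
# accuracy of a temporal-order reconstruction as the fraction of item
# pairs whose relative order in the recall matches the true order.

def pairwise_order_accuracy(true_order, recalled_order):
    """Fraction of item pairs recalled in the correct relative order."""
    position = {item: i for i, item in enumerate(recalled_order)}
    n = len(true_order)
    pairs = n * (n - 1) // 2
    concordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            if position[true_order[i]] < position[true_order[j]]:
                concordant += 1
    return concordant / pairs

true = ["enter shop", "take cart", "pick items", "pay", "leave"]
recall = ["enter shop", "pick items", "take cart", "pay", "leave"]
print(pairwise_order_accuracy(true, recall))  # 0.9
```

A single adjacent swap in a five-item sequence costs one of the ten pairs, giving 0.9.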

    People Efficiently Explore the Solution Space of the Computationally Intractable Traveling Salesman Problem to Find Near-Optimal Tours

    Humans need to solve computationally intractable problems such as visual search, categorization, and simultaneous learning and acting, yet an increasing body of evidence suggests that their solutions to instantiations of these problems are near optimal. Computational complexity offers an explanation for this apparent paradox: (1) only a small portion of instances of such problems are actually hard, and (2) successful heuristics exploit structural properties of the typical instance to selectively improve parts that are likely to be sub-optimal. We hypothesize that these two ideas largely account for the good performance of humans on computationally hard problems. We tested part of this hypothesis by studying the solutions of 28 participants to 28 instances of the Euclidean Traveling Salesman Problem (TSP). Participants were provided feedback on the cost of their solutions and were allowed unlimited solution attempts (trials). We found a significant improvement between the first and last trials, and that solutions, unlike random tours, follow the convex hull and do not have self-crossings. More importantly, we found that participants modified their current best solutions in such a way that edges belonging to the optimal solution ("good" edges) were significantly more likely to stay than other edges ("bad" edges), a hallmark of structural exploitation. We found, however, that more trials harmed the participants' ability to tell good from bad edges, suggesting that after too many trials the participants "ran out of ideas." In sum, we provide the first demonstration of significant performance improvement on the TSP under repetition and feedback, and evidence that human problem-solving may exploit the structure of hard problems, paralleling the behavior of state-of-the-art heuristics.
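The "good edge" retention analysis described above compares a participant's tour against the optimal tour edge by edge. A minimal sketch on a toy instance (the data and function names are hypothetical, not the study's):

```python
# Sketch (toy data, not the study's): fraction of a tour's edges that
# also belong to the optimal tour, plus total Euclidean tour length.

import math

def edges(tour):
    """Undirected edge set of a cyclic tour given as a list of city ids."""
    n = len(tour)
    return {frozenset((tour[i], tour[(i + 1) % n])) for i in range(n)}

def tour_length(tour, xy):
    """Total Euclidean length of a cyclic tour over coordinates xy."""
    n = len(tour)
    return sum(math.dist(xy[tour[i]], xy[tour[(i + 1) % n]]) for i in range(n))

def good_edge_fraction(tour, optimal):
    """Share of the tour's edges that appear in the optimal tour."""
    return len(edges(tour) & edges(optimal)) / len(tour)

# toy instance: 4 cities on a unit square
xy = [(0, 0), (1, 0), (1, 1), (0, 1)]
optimal = [0, 1, 2, 3]    # perimeter tour, length 4
crossing = [0, 2, 1, 3]   # tour with a self-crossing
print(tour_length(optimal, xy))             # 4.0
print(good_edge_fraction(crossing, optimal))  # 0.5
```

Tracking this fraction across trials is one way to quantify whether modifications preferentially preserve optimal edges.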

    Azimuthal anisotropy and correlations at large transverse momenta in p+p and Au+Au collisions at √s_NN = 200 GeV

    Results on high transverse momentum charged particle emission with respect to the reaction plane are presented for Au+Au collisions at √s_NN = 200 GeV. Two- and four-particle correlation results are presented, as well as a comparison of azimuthal correlations in Au+Au collisions to those in p+p at the same energy. The elliptic anisotropy, v_2, is found to reach its maximum at p_t ~ 3 GeV/c, then decrease slowly and remain significant up to p_t ≈ 7-10 GeV/c. Stronger suppression is found in the back-to-back high-p_t particle correlations for particles emitted out-of-plane compared to those emitted in-plane. The centrality dependence of v_2 at intermediate p_t is compared to simple models based on jet quenching. Comment: 4 figures. Published version as PRL 93, 252301 (2004).
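The elliptic anisotropy v_2 is the second Fourier coefficient of the azimuthal particle distribution relative to the reaction plane, v_2 = ⟨cos 2(φ − Ψ)⟩. The sketch below estimates it from synthetic angles; it assumes the plane angle Ψ is known exactly, whereas real analyses must correct for the finite event-plane resolution. The data are simulated, not STAR data.

```python
# Hedged sketch: estimate the elliptic flow coefficient
# v2 = <cos 2(phi - psi)> from azimuthal angles phi relative to a
# reaction-plane angle psi. Angles are synthetic, not STAR data.

import math, random

def v2(phis, psi):
    """Mean of cos(2(phi - psi)) over particle azimuthal angles."""
    return sum(math.cos(2 * (phi - psi)) for phi in phis) / len(phis)

# Sample angles from dN/dphi ~ 1 + 2*v2_true*cos(2(phi - psi))
# by rejection sampling, then recover v2_true from the sample.
random.seed(0)
v2_true, psi = 0.1, 0.0
phis = []
while len(phis) < 50000:
    phi = random.uniform(0, 2 * math.pi)
    weight = 1 + 2 * v2_true * math.cos(2 * (phi - psi))
    if random.uniform(0, 1 + 2 * v2_true) < weight:
        phis.append(phi)

print(round(v2(phis, psi), 2))  # ≈ 0.1
```

The same recipe with cos(n(φ − Ψ)) yields the other harmonics, e.g. v_1 (directed flow) and v_4.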

    Azimuthal anisotropy in Au+Au collisions at √s_NN = 200 GeV

    The results from the STAR Collaboration on directed flow (v_1), elliptic flow (v_2), and the fourth harmonic (v_4) in the anisotropic azimuthal distribution of particles from Au+Au collisions at √s_NN = 200 GeV are summarized and compared with results from other experiments and theoretical models. Results for identified particles are presented and fit with a Blast Wave model. Different anisotropic flow analysis methods are compared and nonflow effects are extracted from the data. For v_2, scaling with the number of constituent quarks and parton coalescence is discussed. For v_4, scaling with v_2^2 and quark coalescence is discussed. Comment: 26 pages. As accepted by Phys. Rev. C. Text rearranged, figures modified, but data the same. However, in Fig. 35 the hydro calculations are corrected in this version. The data tables are available at http://www.star.bnl.gov/central/publications/ by searching for "flow" and then this paper.

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. 
All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. 
Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding Bill & Melinda Gates Foundation.
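The core arithmetic behind "attributable DALYs" under a counterfactual of minimum risk is the population attributable fraction. For a single dichotomous exposure it takes the classic Levin form, PAF = p(RR − 1) / (p(RR − 1) + 1); a sketch with hypothetical numbers (not GBD 2015 estimates) follows.

```python
# Sketch of the comparative-risk-assessment arithmetic: Levin's
# population attributable fraction (PAF) for a single dichotomous
# exposure, and the DALYs attributed to that exposure. The prevalence,
# relative risk, and DALY total below are hypothetical, not GBD data.

def paf(prevalence, relative_risk):
    """PAF = p(RR - 1) / (p(RR - 1) + 1) for one exposure level."""
    excess = prevalence * (relative_risk - 1)
    return excess / (excess + 1)

def attributable_dalys(total_dalys, prevalence, relative_risk):
    """DALYs from the outcome that the exposure accounts for."""
    return total_dalys * paf(prevalence, relative_risk)

# e.g. exposure prevalence 25%, RR = 2, 100 million DALYs from the outcome
print(round(paf(0.25, 2.0), 3))                     # 0.2
print(round(attributable_dalys(100e6, 0.25, 2.0)))  # 20000000
```

GBD generalises this to continuous exposure distributions and multiple exposure levels, integrating relative risk over exposure rather than using a single prevalence.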

    Quantitative conversations: the importance of developing rapport in standardised interviewing

    © 2014, The Author(s). When developing household surveys, much emphasis is understandably placed on developing survey instruments that can elicit accurate and comparable responses. In order to ensure that carefully crafted questions are not undermined by ‘interviewer effects’, standardised interviewing tends to be utilised in preference to conversational techniques. However, by drawing on a behaviour coding analysis of survey paradata arising from the 2012 UK Poverty and Social Exclusion Survey, we show that in practice standardised survey interviewing often involves extensive unscripted conversation between the interviewer and the respondent. Whilst these interactions can enhance response accuracy, cooperation and ethicality, unscripted conversations can also be problematic in terms of survey reliability and the ethical conduct of survey interviews, as well as raising more basic epistemological questions concerning the degree of standardisation typically assumed within survey research. We conclude that better training in conversational techniques is necessary, even when applying standardised interviewing methodologies. We also draw out some theoretical implications regarding the usefulness of the qualitative–quantitative dichotomy.