
    STREAM WATER QUALITY MANAGEMENT: A STOCHASTIC MIXED-INTEGER PROGRAMMING MODEL

    Water quality management under the watershed approach of Total Maximum Daily Load (TMDL) programs requires that water quality standards be maintained throughout the year. The main purpose of this research was to develop a methodology that incorporates inter-temporal variations in stream conditions through statistical distributions of the pollution loading variables. This was demonstrated with a cost-minimization mixed-integer linear programming (MIP) model that maintains the spatial integrity of the watershed problem. Traditional approaches to variability in stream conditions rest on assumptions that stream data are unlikely to satisfy, or they address the problem inadequately when the distributions are not normal. The MIP model solves for the location and maximum capacity of treatment plants to be built throughout the watershed so as to provide the optimal level of treatment throughout the year. The proposed methodology estimates the parameters of the distributions of the pollution loading variables from simulated data and uses those parameters to regenerate, within the optimization process, a suitable number of random observations that preserve the same distribution parameters. The objective of the empirical model was to minimize the cost of implementing pH TMDLs for a watershed by determining the level of treatment required to attain water quality standards under stochastic stream conditions. The model outputs the total minimum treatment cost and the spatial pattern of least-cost treatment technologies. To minimize costs, the model exploits the spatial network of streams in the watershed, which creates opportunities for cost reduction through trading of pollution among sources and/or least-cost treatment. The results were used to estimate the costs attributable to inter-temporal variations and the costs of different settings for the margin of safety. The methodology was tested with water quality data for the Paint Creek watershed in West Virginia. The stochastic model included nine streams in the optimal solution. The cost of inter-temporal variations in stream conditions was estimated by comparing total costs under the stochastic model with those of a deterministic version estimated with mean values of the loading variables: the deterministic model underestimates total treatment cost by about 45 percent relative to the 97th percentile stochastic model. Estimates for different margins of safety were calculated by comparing total costs for the 99.9th percentile treatment (instead of an idealistic absolute treatment) with those for the 95th to 99th percentile treatments. The differential costs represent the savings from knowing the statistical distribution of pollution and setting an explicit margin of safety. Results indicate that treatment costs are about 7 percent lower when the level of assurance is reduced from 99.9 to 99 percent and 21 percent lower when 95 percent assurance is selected. The application of the methodology, however, is not limited to the estimation of TMDL implementation costs; for example, it could be used to estimate the costs of anti-degradation policies for water quality management and other watershed management issues.
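
    The optimization step described in this abstract can be illustrated with a small, purely hypothetical sketch. The code below is not the thesis model: it sets up a toy facility-location style MIP in PuLP in which candidate treatment sites are opened and sized so that pollutant loads, taken at a chosen percentile of random draws regenerated from fitted lognormal parameters, are covered at least cost. All site and source names, costs, eligibility sets, and distribution parameters are invented, and the stream-network routing and pollution trading of the actual model are omitted.

```python
# Hedged sketch, not the paper's model: a toy scenario-based MIP in the spirit of
# the abstract -- pick treatment sites and size their capacity so that pollutant
# loads, evaluated at a chosen percentile of regenerated random draws, are
# covered at least cost. All names, costs, eligibility sets, and lognormal
# parameters are invented for illustration.
import numpy as np
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

rng = np.random.default_rng(0)
sources = ["mine1", "mine2", "mine3", "mine4"]        # hypothetical loading sources
sites = ["S1", "S2", "S3"]                            # hypothetical candidate plants
eligible = {"mine1": ["S1", "S2"], "mine2": ["S1", "S3"],
            "mine3": ["S2", "S3"], "mine4": ["S3"]}   # which plant can treat which source
fixed_cost = {"S1": 500.0, "S2": 650.0, "S3": 420.0}  # $k to build
unit_cost = {"S1": 3.0, "S2": 2.5, "S3": 4.0}         # $k per unit of capacity
log_mu = {"mine1": 2.0, "mine2": 2.3, "mine3": 1.8, "mine4": 2.1}
log_sigma = {"mine1": 0.4, "mine2": 0.5, "mine3": 0.3, "mine4": 0.45}
percentile = 97                                       # explicit margin-of-safety setting

# Regenerate loads from the fitted lognormal parameters and take the design percentile.
design_load = {m: float(np.percentile(
    rng.lognormal(log_mu[m], log_sigma[m], size=5000), percentile)) for m in sources}

prob = LpProblem("tmdl_treatment_siting", LpMinimize)
build = LpVariable.dicts("build", sites, cat=LpBinary)
cap = LpVariable.dicts("capacity", sites, lowBound=0)
assign = LpVariable.dicts("assign",
                          [(m, s) for m in sources for s in eligible[m]], cat=LpBinary)

# Objective: fixed construction costs plus capacity-dependent treatment costs.
prob += lpSum(fixed_cost[s] * build[s] + unit_cost[s] * cap[s] for s in sites)
for m in sources:                                     # every source is treated somewhere
    prob += lpSum(assign[(m, s)] for s in eligible[m]) == 1
for s in sites:
    prob += cap[s] >= lpSum(design_load[m] * assign[(m, s)]
                            for m in sources if s in eligible[m])   # capacity covers assigned loads
    prob += cap[s] <= sum(design_load.values()) * build[s]          # no capacity unless built

prob.solve()
print("total cost:", round(value(prob.objective), 1))
for s in sites:
    print(s, "built" if value(build[s]) > 0.5 else "not built",
          "capacity", round(value(cap[s]), 2))
```

    In this sketch the percentile parameter plays the role of a stricter or looser margin of safety: raising it inflates the design loads and hence the total cost, which mirrors the cost comparisons across assurance levels reported in the abstract.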

    Modeling Information Exchange Opportunities for Effective Human-Computer Teamwork

    This paper studies information exchange in collaborative group activities involving mixed networks of people and computer agents. It introduces the concept of "nearly decomposable" decision-making problems to address the complexity of information exchange decisions in such multi-agent settings. This class of decision-making problems arises in settings whose action structure requires agents to reason about only a subset of their partners' actions but otherwise allows them to act independently. The paper presents a formal model of nearly decomposable decision-making problems, NED-MDPs, and defines an approximation algorithm, NED-DECOP, that computes efficient information exchange strategies. The paper shows that NED-DECOP is more efficient than prior collaborative planning algorithms for this class of problems. It presents an empirical study of the information exchange decisions made by the algorithm, investigating the extent to which people accept interruption requests from a computer agent. The context for the study is a game in which the agent can ask people for information that may benefit its individual performance and thus the group's collaboration. The study revealed the key factors affecting people's perception of the benefit of interruptions in this setting. The paper also describes the use of machine learning to predict the situations in which people deviate from the strategies generated by the algorithm, using a combination of domain features and features informed by the algorithm. The methodology followed in this work could form the basis for designing agents that effectively exchange information in collaborations with people.
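
    As a rough intuition for the interruption decision studied here (and emphatically not the paper's NED-DECOP algorithm), the sketch below compares a generic expected value of perfect information against an assumed interruption cost; all probabilities, payoffs, and costs are invented for illustration.

```python
# Hedged sketch: a generic expected-value-of-information check of the kind an
# agent might use when deciding whether to interrupt a human partner. This is a
# textbook EVPI calculation, not the NED-DECOP algorithm from the paper.
import numpy as np

p_state = np.array([0.6, 0.4])            # agent's belief over two hidden states
payoff = np.array([[10.0, 2.0],           # payoff[action, state]: utility of each
                   [4.0, 9.0]])           # agent action in each hidden state
interruption_cost = 1.5                   # assumed cost of asking the person

# Expected utility of acting now on the current belief (best single action).
eu_without = np.max(payoff @ p_state)

# Expected utility if the person reveals the true state (perfect information):
# choose the best action per state, then average over the belief.
eu_with = np.sum(p_state * payoff.max(axis=0))

evpi = eu_with - eu_without               # expected value of perfect information
print(f"EVPI = {evpi:.2f}; ask the person: {evpi > interruption_cost}")
```

    In the actual study, the benefit of asking and people's willingness to respond are far richer than a single scalar cost, which is precisely what the empirical and machine-learning analyses summarized above examine.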

    Improving fairness in machine learning systems: What do industry practitioners need?

    The potential for machine learning (ML) systems to amplify social inequities and unfairness is receiving increasing popular and academic attention. A surge of recent work has focused on the development of algorithmic tools to assess and mitigate such unfairness. If these tools are to have a positive impact on industry practice, however, it is crucial that their design be informed by an understanding of real-world needs. Through 35 semi-structured interviews and an anonymous survey of 267 ML practitioners, we conduct the first systematic investigation of commercial product teams' challenges and needs for support in developing fairer ML systems. We identify areas of alignment and disconnect between the challenges faced by industry practitioners and the solutions proposed in the fair ML research literature. Based on these findings, we highlight directions for future ML and HCI research that will better address industry practitioners' needs. Comment: To appear in the 2019 ACM CHI Conference on Human Factors in Computing Systems (CHI 2019).

    Cytomegalovirus Management in Solid Organ Transplant Recipients: A Pre-COVID-19 Survey From the Working Group of the European Society for Organ Transplantation

    Infections are leading causes of morbidity and mortality following solid organ transplantation (SOT), and cytomegalovirus (CMV) is among the most frequent pathogens, posing a considerable threat to SOT recipients. A survey was conducted from 19 July to 31 October 2019 to capture clinical practices regarding CMV in SOT recipients (e.g., how practices aligned with guidelines, how adequately treatments met patients’ needs, and respondents’ expectations for future developments). Transplant professionals completed a ∼30-minute online questionnaire: 224 responses were included, representing 160 hospitals and 197 SOT programs (41 countries; 167 [83%] European programs). Findings revealed a heterogeneous approach to CMV diagnosis and management and, at times, significant divergence from international guidelines. Valganciclovir prophylaxis (of variable duration) was administered by 201/224 (90%) respondents in D+/R− SOT and by 40% in R+ cases, with pre-emptive strategies generally reserved for R+ cases; DNA thresholds used to initiate treatment ranged from 10 to 10,000 copies/ml. Ganciclovir-resistant CMV strains were still perceived as a major challenge, and tailored treatment was one of the most important unmet needs in CMV management. These findings may help to design studies evaluating the safety and efficacy of new strategies to prevent CMV disease in SOT recipients, and to target specific educational activities to harmonize CMV management in this challenging population.

    Perirhinal cortex and the recognition of relative familiarity

    Spontaneous object recognition (SOR) is a widely used test of recognition memory in rodents that relies on their propensity to explore novel (or relatively novel) objects. Network models typically define the perirhinal cortex as a region required for recognition of previously seen objects, largely based on findings that lesions or inactivations of this area produce SOR deficits. However, relatively little is understood about the relationship between the activity of perirhinal cortex cells that signal novelty and familiarity and the behavioural responses of animals in the SOR task. Previous studies have used objects that are either highly familiar or absolutely novel, but everyday memory concerns objects that sit on a spectrum of familiarity, including objects that have been seen only a few times or objects that are similar to previously experienced ones. We present two studies that explore cellular activity (through c-fos imaging) within the perirhinal cortex of rats performing SOR in which the familiarity of the objects was manipulated. Despite robust recognition memory performance, we found no significant changes in perirhinal activity related to the level of familiarity of the objects. Reasons for this lack of familiarity-related modulation of perirhinal cortex activity are discussed. The current findings support emerging evidence that perirhinal responses to novelty are complex and that task demands are critical to the involvement of the perirhinal cortex in the control of object recognition memory.

    Temporal evolution of anxiety and depression in chronic heart failure and its association with clinical outcome

    Background: Although anxiety and depression have been associated with adverse outcomes in chronic heart failure (HF), data on the temporal evolution of these symptoms are scarce. We aimed to investigate the association between repeatedly measured depression and anxiety symptoms and clinical outcome in chronic HF patients. Methods: In this prospective observational study, outpatients with chronic HF were included and followed up for a maximum of 2.5 years. The Hospital Anxiety and Depression Scale (HADS) questionnaire was administered every six months. The primary endpoint was a composite of HF hospitalization, cardiovascular death, heart transplantation and left ventricular assist device (LVAD) implantation. Cox and joint models were used to investigate the association between the HADS scores and the endpoint. Results: A total of 362 patients filled out a median (25th–75th percentile) of 3 [2–4] questionnaires each. Mean ± SD age was 63 ± 13 years, and 72% were men. Anxiety scores remained relatively stable leading up to the endpoint, while depression scores increased. Higher baseline depression scores were significantly associated with the endpoint (hazard ratio [HR] 1.68, 95% confidence interval [CI] 1.19–2.36 per log(score + 1), p = 0.003), while higher baseline anxiety scores did not reach statistical significance (HR 1.34, 95% CI 0.99–1.83, p = 0.061). When repeatedly measured, both higher anxiety (HR 1.57, 95% CI 1.07–2.30, p = 0.022) and depression (HR 2.04, 95% CI 1.39–3.06, p < 0.001) scores were significantly associated with the endpoint. Conclusion: Serial measurements of depression and anxiety symptoms identify chronic HF patients at increased risk of adverse clinical outcomes. Screening for both disorders should be considered in clinical practice.
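
    A minimal sketch of the baseline Cox analysis reported above, assuming a one-row-per-patient table with hypothetical column names (follow-up time, composite-endpoint indicator, and baseline HADS scores); the repeated-measures joint models used in the study are not reproduced here.

```python
# Hedged sketch of the baseline Cox proportional hazards analysis, assuming a
# hypothetical CSV with one row per patient. Column names and the file are
# invented for illustration; the joint models for the repeated HADS measurements
# are not shown.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("hf_cohort.csv")                           # hypothetical patient-level file
df["log_depression"] = np.log(df["hads_depression"] + 1)    # log(score + 1), as in the abstract
df["log_anxiety"] = np.log(df["hads_anxiety"] + 1)

cph = CoxPHFitter()
cph.fit(df[["time_years", "event", "log_depression", "log_anxiety"]],
        duration_col="time_years", event_col="event")
cph.print_summary()                                         # hazard ratios with 95% CIs per log(score + 1)
```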

    The different risk of new-onset, chronic, worsening, and advanced heart failure:A systematic review and meta-regression analysis

    Aims: Heart failure (HF) is a chronic and progressive syndrome associated with a poor prognosis. While it may seem intuitive that the risk of adverse outcomes varies across the different stages of HF, an overview of these risks is lacking. This study aims to determine the risk of all-cause mortality and HF hospitalization associated with new-onset HF, chronic HF (CHF), worsening HF (WHF), and advanced HF. Methods and results: We performed a systematic review of observational studies published from 2012 to 2022 using five databases. The primary outcomes were 30-day and 1-year all-cause mortality, as well as 1-year HF hospitalization. Studies were pooled using random-effects meta-analysis, and mixed-effects meta-regression was used to compare the different HF groups. Among the 15 759 studies screened, 66 were included, representing 862 046 HF patients. Pooled 30-day mortality did not differ significantly between hospital-admitted patients with new-onset HF (10.13%) and those with WHF (8.11%) (p = 0.10). However, 1-year mortality risk differed and increased stepwise from CHF to advanced HF: 8.47% (95% confidence interval [CI] 7.24–9.89) for CHF, 21.15% (95% CI 17.78–24.95) for new-onset HF, 26.84% (95% CI 23.74–30.19) for WHF, and 29.74% (95% CI 24.15–36.10) for advanced HF. Readmission rates for HF at 1 year followed a similar trend. Conclusions: Our meta-analysis of observational studies confirms that the risk of adverse outcomes differs across the distinct HF stages. Moreover, it emphasizes the negative prognostic value of WHF as the first progressive stage from CHF towards advanced HF.
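
    The pooling step can be sketched with a standard DerSimonian-Laird random-effects meta-analysis of proportions on the logit scale; the study counts below are invented and are not the review's data, and the mixed-effects meta-regression used to compare the HF groups is not shown.

```python
# Hedged sketch of random-effects pooling of 1-year mortality proportions using
# the DerSimonian-Laird estimator on the logit scale, written with plain numpy.
# Event counts and sample sizes are invented for illustration.
import numpy as np
from scipy.special import logit, expit

deaths = np.array([120, 85, 40, 210])          # hypothetical 1-year deaths per study
n = np.array([1400, 950, 500, 2300])           # hypothetical study sizes
p = deaths / n
y = logit(p)                                   # logit-transformed proportions
var = 1.0 / (n * p * (1.0 - p))                # approximate within-study variances

w = 1.0 / var                                  # fixed-effect (inverse-variance) weights
y_fe = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fe) ** 2)                # Cochran's Q heterogeneity statistic
k = len(y)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL tau^2

w_re = 1.0 / (var + tau2)                      # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
lo, hi = y_re - 1.96 * se_re, y_re + 1.96 * se_re
print(f"pooled 1-year mortality: {expit(y_re):.3f} "
      f"(95% CI {expit(lo):.3f}-{expit(hi):.3f}), tau^2 = {tau2:.3f}")
```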
