Community rotorcraft air transportation benefits and opportunities
Information about rotorcraft that will assist community planners in assessing and planning for the use of rotorcraft transportation in their communities is provided. Information useful to helicopter researchers, manufacturers, and operators concerning helicopter opportunities and benefits is also given. Three primary topics are discussed: the current status and future projections of rotorcraft technology, and the comparison of that technology with other transportation vehicles; the community benefits of promising rotorcraft transportation opportunities; and the integration and interfacing considerations between rotorcraft and other transportation vehicles. Helicopter applications in a number of business and public service fields are examined in various geographical settings
Impact of Mental Health Comorbidities on Health Care Utilization and Expenditure in a Large US Managed Care Adult Population with ADHD
Objective
To estimate the health resource use (HRU) and expenditure of adult patients with attention deficit/hyperactivity disorder (ADHD) subsequently diagnosed with one or more mental health (MH) comorbidities.
Methods
Using Kaiser Permanente Southern California electronic medical records (January 1, 2006, to December 31, 2009), we identified adults with at least one ADHD diagnosis and at least two subsequent prescription fills for ADHD medication. The date of the first MH comorbidity diagnosis after the index ADHD diagnosis was defined as the index transition date. Continuous eligibility 12 months before and after the index transition date was required. For patients with multiple transitions (≥2), the post-transition period reflected the 12 months after the second transition. HRU for all-cause inpatient, outpatient, emergency department, and behavioral therapy visits, overall prescription fill counts, ADHD-specific prescription fill counts, and mean patient expenditure (2010 US $) were compared before and after transition.
Results
Mean patient expenditure increased by $1822 ± … and $4432 ± $301 (both P < 0.0001) in the single and multiple transition cohorts, respectively.
Conclusions
Twenty-six percent of patients with ADHD transitioned to MH comorbid diagnoses. Increased HRU and expenditure were associated with MH transitions. Identifying patients with ADHD at risk for MH comorbidities may help to improve their outcomes
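The cohort construction in the Methods above (index ADHD diagnosis, first subsequent MH diagnosis as the index transition date, 12-month pre/post windows) can be sketched in pandas. This is a minimal illustration under assumed names — a long `dx` table with hypothetical `patient_id`, `diagnosis_type`, and `diagnosis_date` columns — not the study's actual code.

```python
import pandas as pd

def index_transition_dates(dx: pd.DataFrame) -> pd.DataFrame:
    """Derive each patient's index ADHD date, index transition date,
    and 12-month pre/post windows from a long table of diagnoses."""
    # Index ADHD date: earliest ADHD diagnosis per patient
    adhd = (dx[dx["diagnosis_type"] == "ADHD"]
            .groupby("patient_id")["diagnosis_date"].min()
            .rename("index_adhd_date").reset_index())
    # Keep only MH diagnoses that occur after the index ADHD date
    mh = dx[dx["diagnosis_type"] == "MH"].merge(adhd, on="patient_id")
    mh = mh[mh["diagnosis_date"] > mh["index_adhd_date"]]
    # Index transition date: first qualifying MH diagnosis
    out = (mh.groupby("patient_id")
             .agg(index_adhd_date=("index_adhd_date", "first"),
                  index_transition_date=("diagnosis_date", "min"))
             .reset_index())
    # Continuous-eligibility windows: 12 months either side of transition
    out["pre_start"] = out["index_transition_date"] - pd.DateOffset(months=12)
    out["post_end"] = out["index_transition_date"] + pd.DateOffset(months=12)
    return out
```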
The Endogenous Th17 Response in NO2-Promoted Allergic Airway Disease Is Dispensable for Airway Hyperresponsiveness and Distinct from Th17 Adoptive Transfer
Severe, glucocorticoid-resistant asthma comprises 5-7% of patients with asthma. IL-17 is a biomarker of severe asthma, and the adoptive transfer of Th17 cells in mice is sufficient to induce glucocorticoid-resistant allergic airway disease. Nitrogen dioxide (NO2) is an environmental toxin that correlates with asthma severity, exacerbation, and risk of adverse outcomes. Mice that are allergically sensitized to the antigen ovalbumin by exposure to NO2 exhibit a mixed Th2/Th17 adaptive immune response and eosinophil and neutrophil recruitment to the airway following antigen challenge, a phenotype reminiscent of severe clinical asthma. Because IL-1 receptor (IL-1R) signaling is critical in the generation of the Th17 response in vivo, we hypothesized that the IL-1R/Th17 axis contributes to pulmonary inflammation and airway hyperresponsiveness (AHR) in NO2-promoted allergic airway disease and manifests in glucocorticoid-resistant cytokine production. IL-17A neutralization at the time of antigen challenge or genetic deficiency in IL-1R resulted in decreased neutrophil recruitment to the airway following antigen challenge but did not protect against the development of AHR. Instead, IL-1R-/- mice developed exacerbated AHR compared to WT mice. Lung cells from NO2-allergically inflamed mice that were treated in vitro with dexamethasone (Dex) during antigen restimulation exhibited reduced Th17 cytokine production, whereas Th17 cytokine production by lung cells from mice that had received in vitro Th17-polarized OTII T-cells was resistant to Dex. These results demonstrate that the IL-1R/Th17 axis does not contribute to AHR development in NO2-promoted allergic airway disease, that Th17 adoptive transfer does not necessarily reflect an endogenously generated Th17 response, and that functions of Th17 responses are contingent on the experimental conditions in which they are generated
The 'dirty dozen' of freshwater science: detecting then reconciling hydrological data biases and errors
Sound water policy and management rests on sound hydrometeorological and ecological data. Conversely, unrepresentative, poorly collected, or erroneously archived data introduce uncertainty regarding the magnitude, rate, and direction of environmental change, in addition to undermining confidence in decision-making processes. Unfortunately, data biases and errors can enter the information flow at various stages, from site selection and instrumentation, through sampling/measurement procedures and postprocessing, to archiving systems. Techniques such as visual inspection of raw data, graphical representation, comparison between sites, outlier and trend detection, and referral to metadata can all help uncover spurious data. Tell-tale signs of ambiguous and/or anomalous data are highlighted using 12 carefully chosen cases drawn mainly from hydrology (‘the dirty dozen’). These include evidence of changes in site or local conditions (due to land management, river regulation, or urbanization); modifications to instrumentation or inconsistent observer behavior; mismatched or misrepresentative sampling in space and time; and treatment of missing values, postprocessing, and data storage errors. As well as raising awareness of these pitfalls, recommendations are provided for uncovering lapses in data quality after the information has been gathered. It is noted that error detection and attribution are more problematic for very large data sets, where observation networks are automated, or when various information sources have been combined. In these cases, more holistic indicators of data integrity are needed that reflect the overall information life-cycle and application(s) of the hydrological data
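Two of the screening techniques mentioned above, outlier flagging against a robust rolling baseline and comparison between neighbouring sites (here as a classic double-mass curve), can be sketched as follows. Window lengths, thresholds, and the double-mass formulation are illustrative assumptions, not values or methods taken from the paper.

```python
import pandas as pd

def flag_outliers(flow: pd.Series, window: int = 365, k: float = 5.0) -> pd.Series:
    """Flag values more than k robust standard deviations from a rolling median."""
    med = flow.rolling(window, center=True, min_periods=30).median()
    mad = (flow - med).abs().rolling(window, center=True, min_periods=30).median()
    robust_sd = 1.4826 * mad  # scale MAD to a standard deviation under normality
    return (flow - med).abs() > k * robust_sd

def double_mass(site: pd.Series, reference: pd.Series) -> pd.Series:
    """Ratio of cumulative site flow to cumulative reference flow; a break in
    slope suggests a change in site conditions or instrumentation."""
    joined = pd.concat([site, reference], axis=1).dropna()
    return joined.iloc[:, 0].cumsum() / joined.iloc[:, 1].cumsum()
```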
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and the single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
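A rough illustration of two of the statistics named above, using scipy's kendalltau and scikit-learn's roc_auc_score. The DataFrame and its columns (nassar_grade, converted, length_of_stay) are hypothetical stand-ins, and the Jonckheere–Terpstra test is omitted for brevity; this is a sketch of the analysis pattern, not the study's code.

```python
import pandas as pd
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

def grade_outcome_stats(df: pd.DataFrame) -> dict:
    # Kendall's tau-b: ordinal association between grade and a binary outcome
    tau, p_tau = kendalltau(df["nassar_grade"], df["converted"])
    # AUROC: how well the grade discriminates converted vs non-converted cases
    auroc_conv = roc_auc_score(df["converted"], df["nassar_grade"])
    # Dichotomise a continuous outcome before ROC analysis, as described above
    prolonged = (df["length_of_stay"] > df["length_of_stay"].median()).astype(int)
    auroc_los = roc_auc_score(prolonged, df["nassar_grade"])
    return {"tau": tau, "p": p_tau,
            "auroc_conversion": auroc_conv,
            "auroc_prolonged_stay": auroc_los}
```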
Results
A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty
Health-related quality of life of children with attention-deficit/hyperactivity disorder versus children with diabetes and healthy controls
The impact of attention-deficit/hyperactivity disorder (ADHD) on health-related quality of life (HRQoL) is reported to be similar to that of other mental health and physical disorders. In this cross-sectional study, we hypothesized that children with ADHD and children with type 1 diabetes mellitus (T1DM) would have significantly worse HRQoL compared with healthy children, and that better clinical status in ADHD and T1DM would be associated with better HRQoL. Children were recruited from three outpatient services in Scotland. Responses to two frequently used validated HRQoL instruments, the Paediatric Quality of Life Inventory (PedsQL) and Child Health and Illness Profile-child edition (CHIP-CE), were obtained from parents/carers and children (6–16 years) with/without ADHD or T1DM. Child and parent/carer-completed HRQoL measurements were evaluated for 213 children with ADHD, 58 children with T1DM and 117 healthy children (control group). Significantly lower self and parent/carer ratings were observed across most PedsQL (P < 0.001) and CHIP-CE (P < 0.05) domains (indicating reduced HRQoL) for the ADHD group compared with the T1DM and control groups. Parent/carer and child ratings were significantly correlated for both measures of HRQoL (PedsQL total score: P < 0.001; CHIP-CE all domains: P < 0.001), but only with low-to-moderate strength. Correlation between ADHD severity and HRQoL was significant with both PedsQL and CHIP-CE for all parent/carer (P < 0.01) and most child (P < 0.05) ratings; more ADHD symptoms were associated with poorer HRQoL. These data demonstrate that ADHD has a significant impact on HRQoL (as observed in both parent/carer and child ratings), which seems to be greater than that for children with T1DM
Climate-driven variability in the occurrence of major floods across North America and Europe
Concern over the potential impact of anthropogenic climate change on flooding has led to a proliferation of studies examining past flood trends. Many studies have analysed annual-maximum flow trends, but few have quantified changes in major (25–100 year return period) floods, i.e. those that have the greatest societal impacts. Existing major-flood studies used a limited number of very large catchments affected to varying degrees by alterations such as reservoirs and urbanisation. In the current study, trends in major-flood occurrence from 1961 to 2010 and from 1931 to 2010 were assessed using a very large dataset (>1200 gauges) of diverse catchments from North America and Europe; only minimally altered catchments were used, to focus on climate-driven changes rather than changes due to catchment alterations. Trend testing of major floods was based on counting the number of exceedances of a given flood threshold within a group of gauges. Evidence for significant trends varied between groups of gauges that were defined by catchment size, location, climate, flood threshold and period of record, indicating that generalizations about flood trends across large domains or a diversity of catchment types are ungrounded. Overall, the number of significant trends in major-flood occurrence across North America and Europe was approximately the number expected due to chance alone. Changes over time in the occurrence of major floods were dominated by multidecadal variability rather than by long-term trends. There were more than three times as many significant relationships between major-flood occurrence and the Atlantic Multidecadal Oscillation as significant long-term trends
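The pooled exceedance-count approach described above lends itself to a compact sketch: estimate each gauge's at-site flood threshold, count exceedances per year across the group, and test the annual counts for a monotonic trend. The empirical-quantile threshold and the Kendall-tau-against-time test (equivalent to a Mann-Kendall trend test) below are plausible stand-ins for, not reproductions of, the study's actual procedure.

```python
import pandas as pd
from scipy.stats import kendalltau

def exceedance_trend(annual_max: pd.DataFrame, return_period: int = 25):
    """annual_max: rows = years, columns = gauges (annual maximum flow)."""
    # At-site threshold: empirical quantile matching the return period
    q = 1.0 - 1.0 / return_period
    thresholds = annual_max.quantile(q)          # one threshold per gauge
    counts = (annual_max > thresholds).sum(axis=1)  # exceedances per year
    # Monotonic trend in annual counts: Kendall's tau against time
    tau, p = kendalltau(counts.index, counts.values)
    return counts, tau, p
```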
Climate driven trends in historical extreme low streamflows on four continents
Understanding temporal trends in low streamflows is important for water management and ecosystems. This work focuses on trends in the occurrence rate of extreme low-flow events (5- to 100-year return periods) for pooled groups of stations. We use data from 1,184 minimally altered catchments in Europe, North and South America, and Australia to discern historical climate-driven trends in extreme low flows (1976–2015 and 1946–2015). The understanding of low streamflows is complicated by different hydrological regimes in cold, transitional, and warm regions. We use a novel classification to define low-flow regimes using air temperature and monthly low-flow frequency. Trends in the annual occurrence rate of extreme low-flow events (proportion of pooled stations each year) were assessed for each regime. Most regimes on multiple continents did not have significant (p < 0.05) trends in the occurrence rate of extreme low streamflows from 1976 to 2015; however, occurrence rates for the cold-season low-flow regime in North America were found to be significantly decreasing for low return-period events. In contrast, there were statistically significant increases for this period in warm regions of North America, which were associated with variation in the Pacific Decadal Oscillation. Significant decreases in extreme low-flow occurrence rates were dominant from 1946 to 2015 in Europe and North America for both cold- and warm-season low-flow regimes; there were also some non-significant trends. The difference in the results between the shorter (40-year) and longer (70-year) records and between low-flow regimes highlights the complexities of low-flow response to changing climatic conditions
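The low-flow analogue of the flood analysis above replaces exceedance counts with an annual occurrence rate, i.e. the proportion of pooled stations whose annual minimum falls below their at-site extreme low-flow threshold in a given year. Again the threshold choice, the table layout, and the trend test are assumptions for illustration only.

```python
import pandas as pd
from scipy.stats import kendalltau

def low_flow_occurrence_rate(annual_min: pd.DataFrame, return_period: int = 20):
    """annual_min: rows = years, columns = stations (annual minimum flow)."""
    # At-site threshold in the low tail of the distribution
    thresholds = annual_min.quantile(1.0 / return_period)
    # Occurrence rate: proportion of reporting stations below threshold
    available = annual_min.notna().sum(axis=1)
    rate = (annual_min < thresholds).sum(axis=1) / available
    tau, p = kendalltau(rate.index, rate.values)
    return rate, tau, p
```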