77 research outputs found

    Security should be there by default: Investigating how journalists perceive and respond to risks from the Internet of Things

    Get PDF
    Journalists have long been the targets of both physical and cyber-attacks from well-resourced adversaries. Internet of Things (IoT) devices are arguably a new avenue of threat towards journalists through both targeted and generalised cyber-physical exploitation. This study comprises three parts. First, we interviewed 11 journalists and surveyed a further 5, to determine the extent to which journalists perceive threats through the IoT, particularly via consumer IoT devices. Second, we surveyed 34 cyber security experts to establish if and how lay people can combat IoT threats. Third, we compared these findings to assess journalists' knowledge of threats, and whether their protective mechanisms would be effective against experts' depictions and predictions of IoT threats. Our results indicate that journalists are generally unaware of IoT-related risks and are not adequately protecting themselves, whether they own IoT devices or merely enter IoT-enabled environments (e.g., at work or home). Expert recommendations spanned both immediate and long-term mitigation methods, including practical actions that are technical and socio-political in nature. However, all proposed individual mitigation methods are likely to be short-term solutions: 26 of 34 (76.5%) cyber security experts responded that within the next five years it will not be possible for the public to opt out of interaction with the IoT.

    The Program for Climate Model Diagnosis and Intercomparison: 20th Anniversary Symposium

    Full text link
    Twenty years ago, W. Lawrence (Larry) Gates approached the U.S. Department of Energy (DOE) Office of Energy Research (now the Office of Science) with a plan to coordinate the comparison and documentation of climate model differences. This effort would help improve our understanding of climate change through a systematic approach to model intercomparison. Early attempts at comparing results showed a surprisingly large range in control climate from such parameters as cloud cover, precipitation, and even atmospheric temperature. The DOE agreed to fund the effort at the Lawrence Livermore National Laboratory (LLNL), in part because of the existing computing environment and because of a preexisting atmospheric science group that contained a wide variety of expertise. The project was named the Program for Climate Model Diagnosis and Intercomparison (PCMDI), and it has changed the international landscape of climate modeling over the past 20 years. In spring 2009 the DOE hosted a 1-day symposium to celebrate the twentieth anniversary of PCMDI and to honor its founder, Larry Gates. Through their personal experiences, the morning presenters painted a picture of climate science in the 1970s and 1980s that generated early support from the international community for model intercomparison, thereby bringing PCMDI into existence. Four talks covered Gates's early contributions to climate research at the University of California, Los Angeles (UCLA), the RAND Corporation, and Oregon State University through the founding of PCMDI to coordinate the Atmospheric Model Intercomparison Project (AMIP). The speakers were, in order of presentation, Warren Washington [National Center for Atmospheric Research (NCAR)], Kelly Redmond (Western Regional Climate Center), George Boer (Canadian Centre for Climate Modelling and Analysis), and Lennart Bengtsson [University of Reading, former director of the European Centre for Medium-Range Weather Forecasts (ECMWF)].
The afternoon session emphasized the scientific ideas that are the basis of PCMDI's success, summarizing their evolution and impact. Four speakers traced the various PCMDI-supported climate model intercomparison projects, beginning with early work on cloud representations in models, presented by Robert D. Cess (Distinguished Professor Emeritus, Stony Brook University), and then the latest Cloud Feedback Model Intercomparison Projects (CFMIPs) led by Sandrine Bony (Laboratoire de Météorologie Dynamique). Benjamin Santer (LLNL) presented a review of the climate change detection and attribution (D & A) work pioneered at PCMDI, and Gerald A. Meehl (NCAR) ended the day with a look toward the future of climate change research.

    Safety and feasibility of oral immunotherapy to multiple allergens for food allergy

    Get PDF
    BACKGROUND: Thirty percent of children with food allergy are allergic to more than one food. Previous studies on oral immunotherapy (OIT) for food allergy have focused on administration of a single allergen at a time. This study aimed to evaluate the safety of a modified OIT protocol using multiple foods at one time. METHODS: Participants underwent double-blind placebo-controlled food challenges (DBPCFC) up to a cumulative dose of 182 mg of food protein to peanut, followed by other nuts, sesame, dairy or egg. Those meeting inclusion criteria for peanut only were started on single-allergen OIT, while those with additional allergies had up to 5 foods included in their OIT mix. Reactions during dose escalations and home dosing were recorded in a symptom diary. RESULTS: Forty participants met inclusion criteria on peanut DBPCFC. Of these, 15 were mono-allergic to peanut and 25 had additional food allergies. Rates of reaction per dose did not differ significantly between the two groups (median of 3.3% and 3.7% in the multi- and single-OIT groups, respectively; p = .31). In both groups most reactions were mild, but two severe reactions requiring epinephrine occurred in each group. Dose escalations progressed similarly in both groups although, per protocol design, those on multiple foods took longer to reach equivalent doses per food (median +4 mo.; p < .0001). CONCLUSIONS: Preliminary data show oral immunotherapy using multiple food allergens simultaneously to be feasible and relatively safe when performed in a hospital setting with trained personnel. Additional, larger, randomized studies are required to continue to test the safety and efficacy of multi-OIT. TRIAL REGISTRATION: Clinicaltrials.gov NCT0149017

    Antenatal tobacco use and iron deficiency anemia: integrating tobacco control into antenatal care in urban India

    Full text link
    Background: In India, tobacco use during pregnancy is not routinely addressed during antenatal care. We measured the association between tobacco use and anemia in low-income pregnant women, and identified ways to integrate tobacco cessation into existing antenatal care at primary health centers. Methods: We conducted an observational study using structured interviews with antenatal care clinic patients (n = 100) about tobacco use, anemia, and risk factors such as consumption of iron-rich foods and food insecurity. We performed blood tests for serum cotinine, hemoglobin and ferritin. We conducted in-depth interviews with physicians (n = 5) and auxiliary nurse midwives (n = 5), and focus groups with community health workers (n = 65) to better understand tobacco and anemia control services offered during antenatal care. Results: We found that 16% of patients used tobacco, 72% were anemic, 41% had iron deficiency anemia (IDA) and 29% were food insecure. Regression analysis showed that tobacco use (OR = 14.3; 95%CI = 2.6, 77.9) and consumption of green leafy vegetables (OR = 0.6; 95%CI = 0.4, 0.9) were independently associated with IDA, and tobacco use was not associated with consumption of iron-rich foods or household food insecurity. Clinics had a system for screening, treatment and follow-up care for anemic and iron-deficient antenatal patients, but not for tobacco use. Clinicians and community health workers were interested in integrating tobacco screening and cessation services with current maternal care services such as anemia control. Tobacco users wanted help to quit. Conclusion: It would be worthwhile to assess the feasibility of integrating antenatal tobacco screening and cessation services with antenatal care services for anemia control, such as screening and guidance during clinic visits and cessation support during home visits.
    https://deepblue.lib.umich.edu/bitstream/2027.42/143514/1/12978_2018_Article_516.pd
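The adjusted odds ratios reported above (e.g., OR = 14.3, 95% CI 2.6-77.9 for tobacco use) come from a logistic regression, where an odds ratio and its Wald confidence interval are recovered from a fitted coefficient as exp(beta) and exp(beta ± 1.96·SE). A minimal sketch; the coefficient and standard error below are illustrative values chosen to land near the reported interval, not the study's fitted estimates:

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% Wald confidence interval."""
    or_ = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return or_, lower, upper

# Illustrative inputs only (not the study's fitted model): a coefficient
# of ln(14.3) with SE ~0.86 yields an OR of 14.3 with a wide interval,
# similar in shape to the one reported for tobacco use and IDA.
beta, se = math.log(14.3), 0.86
or_, lo, hi = odds_ratio_ci(beta, se)
print(f"OR = {or_:.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```

The wide interval reflects the small number of tobacco users (16 of 100): on the log-odds scale the standard error is large, and exponentiating stretches the upper tail far more than the lower.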

    Effective Control of Schistosoma haematobium Infection in a Ghanaian Community following Installation of a Water Recreation Area

    Get PDF
    Background: Urogenital schistosomiasis caused by Schistosoma haematobium was endemic in Adasawase, Ghana in 2007. Transmission was reported to be primarily through recreational water contact. Methods: We designed a water recreation area (WRA) to prevent transmission to school-aged children. The WRA features a concrete pool supplied by a borehole well and a gravity-driven rainwater collection system; it is 30 m2 and is split into shallow and deep sections to accommodate a variety of age groups. The WRA opened in 2009 and children were encouraged to use it for recreation as opposed to the local river. We screened children annually for S. haematobium eggs in their urine in 2008, 2009, and 2010 and established differences in infection rates before (2008-09) and after (2009-10) installation of the WRA. After each annual screening, children were treated with praziquantel and rescreened to confirm parasite clearance. Principal Findings: Initial baseline testing in 2008 established that 105 of 247 (42.5%) children were egg-positive. In 2009, with drug treatment alone, the pre-WRA annual cumulative incidence of infection was 29 of 216 (13.4%). In 2010, this incidence rate fell significantly (p<0.001, chi-squared) to 9 of 245 (3.7%) children after installation of the WRA. Logistic regression analysis was used to determine correlates of infection among the variables age, sex, distance between home and river, minutes observed at the river, low height-for-age, low weight-for-age, low Body Mass Index (BMI)-for-age, and previous infection status. Conclusion/Significance: The installation and use of a WRA is a feasible and highly effective means to reduce the incidence of schistosomiasis in school-aged children in a rural Ghanaian community. In conjunction with drug treatment and education, such an intervention can represent a significant step towards the control of schistosomiasis.
The WRA should be tested in other water-rich endemic areas to determine whether infection prevalence can be substantially reduced. Author Summary: Urogenital schistosomiasis is a disease caused by the parasite Schistosoma haematobium; it is often characterized by bloody urine and tends to disproportionately affect school-aged children in rural tropical regions. The parasite is transmitted via skin contact with surface water that is contaminated by human waste. The disease was endemic in Adasawase, a rural Ghanaian community, in 2007. Transmission occurred mainly through recreational water contact. We collaborated with community members to design a water recreation area (WRA) featuring a concrete pool supplied by a borehole well and a rainwater collection system. We opened the pool in 2009 and local officials encouraged children to use the WRA for recreation. We screened local children annually (2008, 2009, 2010) for S. haematobium infection. After each screening, children were treated with praziquantel and rescreened. Baseline testing in 2008 established that at least 105 of 247 (42.5%) children were infected. In 2009, 29 of 216 (13.4%) children were infected, reflecting annual cumulative incidence. In 2010, a significantly smaller percentage of children (9 of 245, 3.7%) were infected. We conclude that the WRA effectively reduced infection in Adasawase, and that it should be tested in other water-rich endemic areas.
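The reported incidence drop (29 of 216 pre-WRA vs. 9 of 245 post-WRA, p<0.001 by chi-squared) can be checked with a standard Pearson chi-squared test on the 2x2 table. A self-contained sketch using only the standard library (the helper name is ours, and no continuity correction is applied):

```python
import math

def chi2_2x2(a: int, b: int, c: int, d: int):
    """Pearson chi-squared test (df = 1, no continuity correction)
    for a 2x2 table [[a, b], [c, d]]; returns (statistic, p-value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df = 1, the chi-squared survival function reduces to
    # erfc(sqrt(stat / 2)), so no special-function library is needed.
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Rows: infected vs. not infected; columns: 2009 (pre-WRA) vs. 2010 (post-WRA).
stat, p = chi2_2x2(29, 216 - 29, 9, 245 - 9)
print(f"chi2 = {stat:.1f}, p = {p:.1e}")
```

On these counts the statistic is around 14 with p on the order of 1e-4, consistent with the p<0.001 quoted in the abstract.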

    Tissue Glucocorticoid Metabolism in Adrenal Insufficiency: A Prospective Study of Dual-release Hydrocortisone Therapy

    Get PDF
    Background: Patients with adrenal insufficiency (AI) require life-long glucocorticoid (GC) replacement therapy. Within tissues, cortisol (F) availability is under the control of the isozymes of 11β-hydroxysteroid dehydrogenase (11β-HSD). We hypothesize that corticosteroid metabolism is altered in patients with AI because of the nonphysiological pattern of current immediate-release hydrocortisone (IR-HC) replacement therapy. A once-daily dual-release hydrocortisone (DR-HC) preparation (Plenadren®) offers a more physiological cortisol profile and may alter corticosteroid metabolism in vivo. Study Design and Methods: Prospective crossover study assessing the impact of 12 weeks of DR-HC on systemic GC metabolism (urinary steroid metabolome profiling), cortisol activation in the liver (cortisone acetate challenge test), and subcutaneous adipose tissue (microdialysis, biopsy for gene expression analysis) in 51 patients with AI (primary and secondary), in comparison to IR-HC treatment and age- and BMI-matched controls. Results: Patients with AI receiving IR-HC had a higher median 24-hour urinary excretion of cortisol compared with healthy controls (72.1 µg/24 hours [IQR 43.6-124.2] vs 51.9 µg/24 hours [35.5-72.3], P = .02), with lower global activity of 11β-HSD2 and higher 5-alpha reductase activity. Following the switch from IR-HC to DR-HC therapy, there was a significant reduction in urinary cortisol and total GC metabolite excretion, which was most pronounced in the evening, and an increase in 11β-HSD2 activity. Hepatic 11β-HSD1 activity was not significantly altered after switching to DR-HC, but there was a significant reduction in the expression and activity of 11β-HSD1 in subcutaneous adipose tissue. Conclusion: Using comprehensive in vivo techniques, we have demonstrated abnormalities in corticosteroid metabolism in patients with primary and secondary AI receiving IR-HC.
This dysregulation of pre-receptor glucocorticoid metabolism results in enhanced glucocorticoid activation in adipose tissue, an effect that was ameliorated by treatment with DR-HC.

    Brief Report: Diagnostic Accuracy of Oral Mucosal Transudate Tests Compared with Blood-Based Rapid Tests for HIV Among Children Aged 18 Months to 18 Years in Kenya and Zimbabwe.

    Get PDF
    BACKGROUND: Gaps persist in HIV testing for children who were not tested in prevention of mother-to-child HIV transmission programs. Oral mucosal transudate (OMT) rapid HIV tests have been shown to be highly sensitive in adults, but their performance has not been established in children. METHODS: Antiretroviral therapy-naive children aged 18 months to 18 years in Kenya and Zimbabwe were tested for HIV using the OraQuick ADVANCE Rapid HIV-1/2 Antibody test on oral fluids (OMT) and blood-based rapid diagnostic testing (BBT). BBT followed the Kenyan and Zimbabwean national algorithms. Sensitivity and specificity were calculated using the national algorithms as the reference standard. RESULTS: A total of 1776 children were enrolled; median age was 7.3 years (interquartile range: 4.7-11.6). Among 71 children positive by BBT, all 71 were positive by OMT (sensitivity: 100% [97.5% confidence interval (CI): 94.9% to 100%]). Among the 1705 children negative by BBT, 1703 were negative by OMT (specificity: 99.9% [95% CI: 99.6% to 100.0%]). Because of the discrepant results, the 2 children who initially tested BBT-negative and OMT-positive were retested and confirmed HIV-positive within 1 week. Excluding these 2 children, the sensitivity and specificity of OMT compared with those of BBT were each 100% (97.5% CI: 94.9% to 100% and 99.8% to 100%, respectively). CONCLUSIONS: Compared with the national algorithms, OMT did not miss any HIV-positive children. These data suggest that OMTs are valid in this age range. Future research should explore the acceptability and uptake of OMT by caregivers and health workers to increase pediatric HIV testing coverage.
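The "exact" intervals quoted above (e.g., a 94.9% lower bound for a sensitivity of 71/71) are Clopper-Pearson bounds, obtained by inverting the binomial distribution. A minimal pure-Python sketch of the lower bound via bisection (the function names are ours); it reproduces the 94.9% figure:

```python
import math

def binom_sf_ge(x: int, n: int, p: float) -> float:
    """P(X >= x) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(x, n + 1))

def clopper_pearson_lower(x: int, n: int, alpha: float = 0.05) -> float:
    """Exact (Clopper-Pearson) lower confidence bound: the p solving
    P(X >= x | p) = alpha/2, found by bisection. Returns 0.0 if x == 0."""
    if x == 0:
        return 0.0
    lo, hi = 0.0, 1.0
    for _ in range(60):  # binom_sf_ge is increasing in p, so bisect
        mid = (lo + hi) / 2
        if binom_sf_ge(x, n, mid) < alpha / 2:
            lo = mid
        else:
            hi = mid
    return lo

# Sensitivity: all 71 of 71 BBT-positive children were OMT-positive.
print(f"lower bound = {clopper_pearson_lower(71, 71):.3f}")  # ~0.949
```

Running the same function on the specificity result (1703 of 1705) gives a lower bound of roughly 0.996, matching the 99.6% reported. With every positive detected, the upper bound is trivially 100%, which is why the abstract quotes a one-sided 97.5% interval for sensitivity.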

    Large-Scale Selective Sweep among Segregation Distorter Chromosomes in African Populations of Drosophila melanogaster

    Get PDF
    Segregation Distorter (SD) is a selfish, coadapted gene complex on chromosome 2 of Drosophila melanogaster that strongly distorts Mendelian transmission; heterozygous SD/SD+ males sire almost exclusively SD-bearing progeny. Fifty years of genetic, molecular, and theoretical work have made SD one of the best-characterized meiotic drive systems, but, surprisingly, the details of its evolutionary origins and population dynamics remain unclear. Earlier analyses suggested that the SD system arose recently in the Mediterranean basin and then spread to a low, stable equilibrium frequency (1-5%) in most natural populations worldwide. In this report, we show, first, that SD chromosomes occur in populations in sub-Saharan Africa, the ancestral range of D. melanogaster, at a similarly low frequency (∼2%), providing evidence for the robustness of its equilibrium frequency but raising doubts about the Mediterranean-origins hypothesis. Second, our genetic analyses reveal two kinds of SD chromosomes in Africa: inversion-free SD chromosomes with little or no transmission advantage; and an African-endemic inversion-bearing SD chromosome, SD-Mal, with a perfect transmission advantage. Third, our population genetic analyses show that SD-Mal chromosomes swept across the African continent very recently, causing linkage disequilibrium and an absence of variability over 39% of the length of the second chromosome. Thus, despite a seemingly stable equilibrium frequency, SD chromosomes continue to evolve, competing with one another and evading suppressors in the genome.

    Implementation determinants and strategies in integration of PrEP into maternal and child health and family planning services: experiences of frontline healthcare workers in Kenya

    Get PDF
    Background: Delivery of PrEP to adolescent girls and young women (AGYW) and to pregnant women through maternal and child health (MCH) and family planning (FP) clinics is scaling up in Kenya. Evaluation of implementation challenges and strategies is critical to optimize delivery. Methods: We conducted focus group discussions (FGDs) with healthcare workers (HCWs) in MCH and FP clinics offering PrEP in a large implementation project in Kisumu, Kenya. Discussion guides were based on the Consolidated Framework for Implementation Research (CFIR). FGDs were audio recorded and transcribed. Directed content analysis was used to identify implementation challenges and strategies to overcome them. Results: Fifty HCWs from 26 facilities participated in 8 FGDs. HCWs believed PrEP integration was appropriate because it met the needs of AGYW and pregnant women by providing a female-controlled prevention strategy and aligned with policy priorities of elimination of vertical HIV transmission. They were universally accepting of PrEP provision, especially through MCH clinics, noting the relative advantage of this approach because it: (1) enabled high coverage, (2) harmonized PrEP and MCH visits, and (3) minimized stigma compared to PrEP offered through HIV care clinics. However, HCWs noted implementation challenges affecting feasibility and adoption, including: (1) increased workload and documentation burden amid workforce shortages, (2) insufficient healthcare worker knowledge, (3) multiple implementing partners with competing priorities, and (4) stockouts of drugs and documentation forms. HCWs employed various implementation strategies to overcome challenges, including task shifting from nurses to HIV testing providers, patient flow modifications (e.g., fast-tracking PrEP clients to reduce wait times), PrEP demand generation and myth clarification during health talks, provider education, dedicated PrEP delivery rooms, and coordination with adolescent-friendly services.
Additional suggested strategies to improve PrEP integration included community education to increase broader PrEP awareness and enable shorter counseling sessions, and task-shifting of data entry and client risk assessments. Conclusions: HCWs were enthusiastic about the appropriateness and acceptability of integrating PrEP services into MCH and FP clinics, but noted challenges to adoption and feasibility. Strategies to address challenges focused on easing provider time and space constraints and increasing provider and client knowledge.