82 research outputs found

    Ethnic Identity and Local Government Responsiveness in Taiwan

    Countless studies have shown that local officials are less responsive to ethnic minority citizens. Surprisingly, we find no similar pattern of discrimination by Taiwanese local officials. In an online contacting experiment, we send citizen service requests to the websites of 358 township and district chiefs, randomly varying the name of the putative citizen to reflect an indigenous or an ethnically Chinese identity and collecting data on officials' responses. We find that officials are equally responsive to both identities. Drawing on in-depth interviews and nonparticipant observation in government service centers, we attribute this surprising finding to institutional elements of Taiwan's local bureaucracy that limit the impact of individual-level bias. However, our research provides preliminary evidence that local governments are generally less responsive in indigenous areas. While clearly defined procedures may prevent discrimination against indigenous individuals, interregional differences in local state capacity can nonetheless produce unequal experiences with local governance.
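The balanced random assignment at the core of such a contacting experiment can be sketched in a few lines. This is a minimal illustration, not the study's actual protocol; the condition labels and the seeded complete-randomization scheme are assumptions for the sketch.

```python
import random

def assign_identities(n_officials, seed=0):
    """Randomly assign each official one putative-citizen identity,
    balanced across the two conditions (complete randomization)."""
    rng = random.Random(seed)
    conditions = ["indigenous", "han_chinese"] * (n_officials // 2)
    conditions += ["indigenous"] * (n_officials % 2)  # odd n: one extra
    rng.shuffle(conditions)
    return conditions

# 358 township and district chiefs, as in the study
assignments = assign_identities(358)
```

Balancing before shuffling guarantees each condition appears (nearly) equally often, so differences in response rates cannot stem from unequal group sizes.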

    Staphylococcus aureus bacteremia in pediatric patients: Uncovering a rural health challenge

    BACKGROUND AND METHODS: To investigate factors influencing outcomes of Staphylococcus aureus bacteremia in pediatric patients, 251 cases were reviewed. RESULTS: Of 251 patients, 69 (27%) were from rural areas; 28 (11%) were initially admitted to an outside hospital (OSH). Treatment failure occurred in 39 (16%) patients. Patients from rural areas were more likely to be infected with methicillin-resistant Staphylococcus aureus. CONCLUSIONS: Children from rural areas face barriers to specialized health care. These challenges may contribute to severe illness and worse outcomes among children with Staphylococcus aureus bacteremia.

    Association of inappropriate outpatient pediatric antibiotic prescriptions with adverse drug events and health care expenditures

    Importance: Nonguideline antibiotic prescribing for the treatment of pediatric infections is common, but the consequences of inappropriate antibiotics are not well described. Objective: To evaluate the comparative safety and health care expenditures of inappropriate vs appropriate oral antibiotic prescriptions for common outpatient pediatric infections. Design, Setting, and Participants: This cohort study included children aged 6 months to 17 years diagnosed with a bacterial infection (suppurative otitis media [OM], pharyngitis, sinusitis) or viral infection (influenza, viral upper respiratory infection [URI], bronchiolitis, bronchitis, nonsuppurative OM) as an outpatient from April 1, 2016, to September 30, 2018, in the IBM MarketScan Commercial Database. Data were analyzed from August to November 2021. Exposures: Inappropriate (ie, non-guideline-recommended) vs appropriate (ie, guideline-recommended) oral antibiotic agents dispensed from an outpatient pharmacy on the date of infection. Main Outcomes and Measures: Propensity score-weighted Cox proportional hazards models were used to estimate hazards ratios (HRs) and 95% CIs for the association between inappropriate antibiotic prescriptions and adverse drug events. Two-part models were used to calculate 30-day all-cause attributable health care expenditures by infection type. National-level annual attributable expenditures were calculated by scaling attributable expenditures in the study cohort to the national employer-sponsored insurance population. Results: The cohort included 2 804 245 eligible children (52% male; median [IQR] age, 8 [4-12] years). Overall, 31% to 36% received inappropriate antibiotics for bacterial infections and 4% to 70% for viral infections. 
Inappropriate antibiotics were associated with increased risk of several adverse drug events, including Clostridioides difficile infection and severe allergic reaction among children treated with a nonrecommended antibiotic agent for a bacterial infection (among patients with suppurative OM, C. difficile infection: HR, 6.23; 95% CI, 2.24-17.32; allergic reaction: HR, 4.14; 95% CI, 2.48-6.92). Thirty-day attributable health care expenditures were generally higher among children who received inappropriate antibiotics, ranging from $21 to $56 for bacterial infections and from −$96 to $97 for viral infections. National annual attributable expenditure estimates were highest for suppurative OM ($25.3 million), pharyngitis ($21.3 million), and viral URI ($19.1 million). Conclusions and Relevance: In this cohort study of children with common infections treated in an outpatient setting, inappropriate antibiotic prescriptions were common and associated with increased risks of adverse drug events and higher attributable health care expenditures. These findings highlight the individual- and national-level consequences of inappropriate antibiotic prescribing and further support implementation of outpatient antibiotic stewardship programs.
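The national-level scaling described in the Measures section is simple arithmetic once a per-case attributable cost is estimated. A sketch with hypothetical inputs (the study's actual weights and case counts are not reported in the abstract):

```python
def national_attributable_expenditure(per_case_cost, national_cases):
    """Scale a per-case 30-day attributable cost up to a national
    annual case count, as in the abstract's scaling step."""
    return per_case_cost * national_cases

# Hypothetical: $25 attributable cost per inappropriately treated case,
# scaled to 1,000,000 such cases per year nationally.
estimate = national_attributable_expenditure(25.0, 1_000_000)  # $25 million
```

With these made-up inputs the estimate lands in the same order of magnitude as the abstract's $19-25 million figures, which is why small per-case costs still matter at population scale.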

    Defining the ABC of gene essentiality in streptococci

    Background: Utilising next generation sequencing to interrogate saturated bacterial mutant libraries provides unprecedented information for the assignment of genome-wide gene essentiality. Exposure of saturated mutant libraries to specific conditions and subsequent sequencing can be exploited to uncover gene essentiality relevant to the condition. Here we present a barcoded transposon directed insertion-site sequencing (TraDIS) system to define, for the first time, an essential gene list for Streptococcus equi subsp. equi, the causative agent of strangles in horses. The gene essentiality data for this group C Streptococcus were compared to those of group A and B streptococci. Results: Six barcoded variants of pGh9:ISS1 were designed and used to generate mutant libraries containing between 33,000 and 66,000 unique mutants. TraDIS was performed on DNA extracted from each library, and data were analysed separately and as a combined master pool. Gene essentiality analysis determined that 19.5% of the S. equi genome was essential. Gene essentialities were compared to those of group A and group B streptococci, identifying concordances of 90.2% and 89.4%, respectively, and an overall concordance of 83.7% between the three species. Conclusions: The use of barcoded pGh9:ISS1 to generate mutant libraries provides a highly useful tool for the assignment of gene function in S. equi and other streptococci. The shared essential gene set of group A, B and C streptococci provides further evidence of the close genetic relationships between these important pathogenic bacteria. Therefore, the ABC of gene essentiality reported here provides a solid foundation towards reporting the functional genome of streptococci.
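Pairwise concordance between species reduces to comparing binary essential/non-essential calls over shared genes. A minimal sketch with toy gene sets (the gene names and calls below are invented, not the study's data):

```python
def concordance(calls_a, calls_b):
    """Fraction of shared genes given the same essential/non-essential
    call in both species (calls are dicts: gene -> bool)."""
    shared = calls_a.keys() & calls_b.keys()
    agree = sum(calls_a[g] == calls_b[g] for g in shared)
    return agree / len(shared)

# Toy calls for two species: 3 of the 4 shared genes agree.
gas = {"gyrA": True, "recA": False, "ftsZ": True, "spy1": False}
sez = {"gyrA": True, "recA": True, "ftsZ": True, "spy1": False}
toy = concordance(gas, sez)  # 0.75
```

The study's 90.2% and 89.4% figures are this quantity computed over the full genomes of the compared streptococci.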

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). 
In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
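The unit and volume figures above imply roughly 470 mL per whole-blood unit (1·69 units ≈ 795 mL), so the reported volumes can be checked with simple arithmetic. The 470 mL constant is an inference from the abstract's own numbers, not a stated trial parameter:

```python
ML_PER_UNIT = 470  # approximate, implied by 1.69 units ≈ 795 mL

def extra_volume_ml(extra_units):
    """Convert an extra-units-per-donor estimate to millilitres."""
    return extra_units * ML_PER_UNIT

# Men, 8-week vs 12-week group: 1.69 extra units over 2 years
men_8wk = extra_volume_ml(1.69)   # ≈ 794 mL, matching the ~795 mL reported
women_12wk = extra_volume_ml(0.84)  # ≈ 395 mL reported
```

The same conversion reproduces the ~370 mL and ~215 mL figures for the intermediate groups.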

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. 
During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), and lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [−0·59 to −0·31] in women) and ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [−6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals) other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
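Because the extension analyses report linear per-week effects, the predicted consequence of any interval reduction is obtained by scaling the point estimates. A sketch for men, using the abstract's point estimates only (a real analysis would propagate the confidence intervals):

```python
# Point estimates per week shorter inter-donation interval (men)
UNITS_PER_WEEK = 0.23      # extra units collected per year
HB_DIFF_PER_WEEK = -0.84   # g/L difference in mean haemoglobin

def predicted_effects(weeks_shorter):
    """Linear extrapolation of yield gain and haemoglobin cost."""
    return (UNITS_PER_WEEK * weeks_shorter,
            HB_DIFF_PER_WEEK * weeks_shorter)

# Moving men from the standard 12-week to the 8-week interval:
units_gain, hb_drop = predicted_effects(4)  # +0.92 units/yr, -3.36 g/L
```

This linearity is exactly why the trade-off is framed per week: yield and iron cost both scale with the interval reduction.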

    Whole-cell and single-channel α1β1γ2S GABAA receptor currents elicited by a "multipuffer" drug application device

    Pharmacological characterization of ion channels and receptors in cultured neurons or transfected cell lines requires microapplication of multiple drug solutions during electrophysiological recording. An ideal device could apply a large number of solutions to a limited area with rapid arrival and removal of drug solutions. We describe a novel "multipuffer" rapid application device, based on a modified T-tube with a nozzle made from a glass micropipette tip. Drug solutions are drawn via suction from open reservoirs mounted above the recording chamber through the device into a waste trap. Closure of a solenoid valve between the device and the waste trap causes flow of drug solution through the T-tube nozzle. Any number of drug solutions can be applied with rapid onset (50–100 ms) after a brief fixed delay (100–200 ms). Recombinant α1β1γ2S GABAA receptors (GABARs) transfected into L929 fibroblasts were recorded using whole-cell and single-channel configurations. Application of GABA resulted in chloride currents with an EC50 of 12.2 μM and a Hill slope of 1.27, suggesting more than one binding site for GABA. GABAR currents were enhanced by diazepam and pentobarbital and inhibited by bicuculline and picrotoxin. Single-channel recordings revealed a main conductance state of 26–28 pS. This device is particularly suitable for rapid, spatially controlled drug applications onto neurons or other cells recorded in the whole-cell configuration, but is also appropriate for isolated single-channel or multichannel membrane patch recordings. Peer reviewed
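The concentration-response relationship summarized by an EC50 and Hill slope is the Hill equation. A minimal sketch using the reported parameters (EC50 = 12.2 μM, Hill slope 1.27):

```python
def hill(conc_um, ec50=12.2, n=1.27):
    """Fractional GABA response at a given concentration (μM),
    per the Hill equation: c^n / (c^n + EC50^n)."""
    return conc_um ** n / (conc_um ** n + ec50 ** n)

half_max = hill(12.2)  # 0.5: at the EC50 the response is half-maximal
```

A Hill slope greater than 1, as here, makes the curve steeper than single-site binding allows, which is the basis for the abstract's inference of more than one GABA binding site.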

    Large Prospective Study of Ovarian Cancer Screening in High-Risk Women: CA125 Cut-Point Defined by Menopausal Status

    Previous screening trials for early detection of ovarian cancer in postmenopausal women have used the standard CA125 cut-point of 35 U/mL, the 98th percentile in this population, yielding a 2% false-positive rate, whereas the same cut-point in trials of premenopausal women results in substantially higher false-positive rates. We investigated demographic and clinical factors predicting CA125 distributions, including 98th percentiles, in a large population of high-risk women participating in two ovarian cancer screening studies with common eligibility criteria and screening protocols.
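A cut-point defined as the empirical 98th percentile can be computed directly from observed CA125 values. A sketch using the nearest-rank method on synthetic values (not study data):

```python
import math

def percentile_cutpoint(values, pct=98):
    """Empirical percentile cut-point by the nearest-rank method:
    the value below which at least pct% of observations fall."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Synthetic CA125 values (U/mL): the 98th percentile of 1..100 is 98.
cutpoint = percentile_cutpoint(list(range(1, 101)))
```

By construction, screening with this cut-point flags about 2% of the population it was derived from, which is why a cut-point fitted to postmenopausal women misbehaves in premenopausal women with different CA125 distributions.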