    Fundamentalists' Efforts to Intervene in Curricular Decisions

    A face for all seasons: searching for context-specific leadership traits and discovering a general preference for perceived health

    Previous research indicates that followers tend to contingently match particular leader qualities to evolutionarily consistent situations requiring collective action (i.e., context-specific cognitive leadership prototypes), and that information processing undergoes categorization which ranks certain qualities as first-order context-general and others as second-order context-specific. To further investigate this contingent categorization phenomenon, we examined the “attractiveness halo”—a first-order facial cue which significantly biases leadership preferences. While controlling for facial attractiveness, we independently manipulated the underlying facial cues of health and intelligence and then primed participants with four distinct organizational dynamics requiring leadership (i.e., competition vs. cooperation between groups and exploratory change vs. stable exploitation). We expected that the differing requirements of the four dynamics would contingently select for relatively healthier- or more intelligent-looking leaders. We found perceived facial intelligence to be a second-order context-specific trait—preferred, for instance, in contexts requiring a leader to address between-group cooperation—whereas perceived health was significantly preferred across all contexts (i.e., a first-order trait). The results also indicate that facial health positively affects perceived masculinity while facial intelligence negatively affects perceived masculinity, which may partially explain leader choice in some of the environmental contexts. The limitations and a number of implications regarding leadership biases are discussed.

    Intravital imaging of the kidney in a rat model of salt-sensitive hypertension

    Hypertension is one of the most prevalent diseases worldwide and a major risk factor for renal failure and cardiovascular disease. The role of albuminuria, a common feature of hypertension and a robust predictor of cardiorenal disorders, remains incompletely understood. The goal of this study was to investigate the mechanisms leading to albuminuria in the kidney of a rat model of hypertension, the Dahl salt-sensitive (SS) rat. To determine the relative contributions of the glomerulus and proximal tubule (PT) to albuminuria, we applied intravital two-photon-based imaging to investigate the complex renal physiological changes that occur during salt-induced hypertension. Following a high-salt diet, SS rats exhibited elevated blood pressure, increased glomerular sieving of albumin (GSCalb = 0.0686), increased relative permeability to albumin (+Δ16%), and impaired volume hemodynamics (-Δ14%). Serum albumin, but not serum globulins or creatinine concentration, was decreased (-0.54 g/dl), concomitant with increased filtration of albumin (3.7 vs. 0.8 g/day on a normal diet). Pathologically, hypertensive animals had significant tubular damage, as indicated by increased prevalence of granular casts, expansion and necrosis of PT epithelial cells (+Δ2.20 score/image), progressive augmentation of red blood cell velocity (+Δ269 µm/s) and microvessel diameter (+Δ4.3 µm), and increased vascular injury (+Δ0.61 leakage/image). Therefore, development of salt-induced hypertension can be triggered by fast and progressive pathogenic remodeling of PT epithelia, which can be associated with changes in albumin handling. Collectively, these results indicate that both the glomerulus and the PT contribute to albuminuria, and dual treatment of glomerular filtration and albumin reabsorption may represent an effective treatment of salt-sensitive hypertension.
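
    The reported filtered-albumin load follows from the standard renal relation filtered load = GFR × plasma albumin concentration × GSCalb. The Python sketch below is a back-of-envelope check of that arithmetic, not the authors' calculation: the GFR and serum albumin values are illustrative assumptions, since the abstract reports only GSCalb and the resulting load.

    # Back-of-envelope check: filtered load = GFR * [albumin] * GSC.
    # GFR and serum albumin below are assumed values for illustration;
    # the abstract reports only GSCalb (0.0686) and the 3.7 g/day load.
    gsc_alb = 0.0686                 # glomerular sieving coefficient (reported)
    serum_albumin_g_per_l = 30.0     # ~3.0 g/dl (assumed)
    gfr_l_per_day = 1.8              # ~1.25 ml/min whole-kidney GFR (assumed)

    filtered_g_per_day = gfr_l_per_day * serum_albumin_g_per_l * gsc_alb
    print(f"filtered albumin ~ {filtered_g_per_day:.1f} g/day")  # ~3.7 g/day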

    Angiogenesis inhibitor therapies for advanced renal cell carcinoma: Toxicity and treatment patterns in clinical practice from a global medical chart review

    The aim of this study was to assess the treatment patterns and safety of sunitinib, sorafenib and bevacizumab in real-world clinical settings in the US, Europe and Asia. Medical records were abstracted at 18 community oncology clinics in the US and at 21 tertiary oncology centers in the US, Europe and Asia for 883 patients ≥18 years of age who had a histologically/cytologically confirmed diagnosis of advanced renal cell carcinoma (RCC) and received sunitinib (n=631), sorafenib (n=207) or bevacizumab (n=45) as first-line treatment. No prior treatment was permitted. Data were collected on all adverse events (AEs) and treatment modifications, including discontinuation, interruption and dose reduction. Treatment duration was estimated using Kaplan-Meier analysis. Demographics were similar across treatment groups and regions. Median treatment duration ranged from 6.1 to 10.7 months, 5.1 to 8.5 months and 7.5 to 9.8 months for sunitinib, sorafenib and bevacizumab patients, respectively. Grade 3/4 AEs were experienced by 26.0, 28.0 and 15.6% of sunitinib, sorafenib and bevacizumab patients, respectively. Treatment discontinuations occurred in 62.4 (Asia) to 63.1% (US) of sunitinib, 68.8 (Asia) to 90.0% (Europe) of sorafenib, and 66.7 (Asia) to 81.8% (US) of bevacizumab patients. Globally, treatment modifications due to AEs occurred in 55.1, 54.2 and 50.0% of sunitinib, sorafenib and bevacizumab patients, respectively. This study in a large, global cohort of advanced RCC patients found that angiogenesis inhibitors are associated with high rates of AEs and treatment modifications. Findings suggest an unmet need for more tolerable agents for RCC treatment.
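
    Treatment duration in a chart review like this is estimated with the Kaplan-Meier method because patients still on therapy at data cutoff are censored rather than counted as discontinued. The sketch below is a minimal, self-contained version of that estimator with hypothetical durations; it is an illustration of the method named in the abstract, not the study's code.

    # Minimal Kaplan-Meier estimator of "probability still on treatment".
    # durations: months on treatment; events: True = discontinued,
    # False = still on treatment at last follow-up (censored).
    def kaplan_meier(durations, events):
        pairs = sorted(zip(durations, events))
        at_risk = len(pairs)
        surv, curve, i = 1.0, [], 0
        while i < len(pairs):
            t = pairs[i][0]
            n_t = sum(1 for d, _ in pairs if d == t)        # leaving risk set at t
            d_t = sum(1 for d, e in pairs if d == t and e)  # discontinuations at t
            if d_t:
                surv *= 1.0 - d_t / at_risk
                curve.append((t, surv))
            at_risk -= n_t
            i += n_t
        return curve

    months  = [2.0, 5.1, 6.1, 8.5, 9.8, 10.7, 12.0]          # hypothetical data
    stopped = [True, True, False, True, True, False, True]
    for t, s in kaplan_meier(months, stopped):
        print(f"{t:5.1f} mo: P(on treatment) = {s:.2f}")

    The reported median treatment durations correspond to the time at which such a curve first drops to 0.5 or below.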

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with the shorter inter-donation intervals used in other countries.

    Methods: In this parallel-group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.

    Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard-frequency groups.

    Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.

    Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
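
    The unit-to-volume conversions quoted in the Findings are consistent with a nominal UK whole blood donation of roughly 470 mL per unit; that per-unit figure is inferred from the abstract's own numbers rather than stated in it. A quick cross-check:

    # Cross-check the abstract's unit-to-volume conversions, assuming a
    # nominal UK whole blood donation of ~470 ml per unit (inferred).
    ML_PER_UNIT = 470

    reported = {  # group: (extra units over 2 years, abstract's approx. ml)
        "men, 8-week":    (1.69, 795),
        "men, 10-week":   (0.79, 370),
        "women, 12-week": (0.84, 395),
        "women, 14-week": (0.46, 215),
    }
    for group, (units, approx_ml) in reported.items():
        print(f"{group}: {units} units = ~{units * ML_PER_UNIT:.0f} ml "
              f"(abstract: ~{approx_ml} ml)")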

    Infestation of shore crab gills by a free-living mussel species

    Parasitic and commensal species can impact the structure and function of ecological communities and are typically highly specialized to overcome host defences. Here, we report multiple instances of a normally free-living species, the blue mussel Mytilus edulis Linnaeus, 1758, inhabiting the branchial chamber of the shore crab Carcinus maenas (Linnaeus, 1758) collected from widely separated geographical locations. A total of 127 C. maenas were examined from four locations in the English Channel, one location in the Irish Sea and two locations at the entrance of the Baltic Sea. The branchial chambers of three crabs (one from the English Channel and two from Gullmar Fjord, Sweden) were infested with mussels resembling the genus Mytilus. Sequencing at the Me15/16 locus of the polyphenolic adhesive protein gene confirmed their identity as M. edulis. Bivalve infestation always occurred in larger red male individuals. Up to 16 mussels, ranging from 2 to 11 mm in shell length, were found in each individual, either wedged between gill lamellae or attached to the inner wall of the branchial chamber. This is one of the first reports of a bivalve inhabiting crustacean gills and is an intriguing case of a normally free-living prey species infesting its predator.