
    Deletion of the GABAA α2-subunit does not alter self-administration of cocaine or reinstatement of cocaine seeking

    Rationale GABAA receptors containing α2-subunits are highly represented in brain areas that are involved in motivation and reward, and have been associated with addiction to several drugs, including cocaine. We have shown previously that deletion of the α2-subunit results in an absence of sensitisation to cocaine. Objective We investigated the reinforcing properties of cocaine in GABAA α2-subunit knockout (KO) mice using an intravenous self-administration procedure. Methods α2-subunit wildtype (WT), heterozygous (HT) and KO mice were trained to lever press for a 30% condensed milk solution. After implantation of a jugular catheter, mice were trained to lever press for cocaine (0.5 mg/kg/infusion) during ten daily sessions. Responding was then extinguished, and the mice were tested for cue- and cocaine-primed reinstatement. Separate groups of mice were trained to respond for decreasing doses of cocaine (0.25, 0.125, 0.06 and 0.03 mg/kg). Results No differences were found in acquisition of lever pressing for milk. All genotypes acquired self-administration of cocaine and did not differ in rates of self-administration, dose dependency or reinstatement. However, whilst WT and HT mice showed a dose-dependent increase in lever pressing during the cue presentation, KO mice did not. Conclusions Despite a reported absence of sensitisation, motivation to obtain cocaine remains unchanged in KO and HT mice. Reinstatement of cocaine seeking by cocaine and cocaine-paired cues is also unaffected. We postulate that, whilst not directly involved in reward perception, the α2-subunit may be involved in modulating the “energising” aspect of cocaine’s effects on reward-seeking.

    Deletion of the gabra2 gene results in hypersensitivity to the acute effects of ethanol but does not alter ethanol self-administration

    Human genetic studies have suggested that polymorphisms of the GABRA2 gene encoding the GABA(A) α2-subunit are associated with ethanol dependence. Variations in this gene also convey sensitivity to the subjective effects of ethanol, indicating a role in mediating ethanol-related behaviours. We therefore investigated the consequences of deleting the α2-subunit on the ataxic and rewarding properties of ethanol in mice. Ataxic and sedative effects of ethanol were explored in GABA(A) α2-subunit wildtype (WT) and knockout (KO) mice using a Rotarod apparatus, the wire hang test and the duration of loss of righting reflex. Following training, KO mice showed shorter latencies to fall than WT littermates under ethanol (2 g/kg i.p.) in both the Rotarod and wire hang tests. After administration of ethanol (3.5 g/kg i.p.), KO mice took longer to regain the righting reflex than WT mice. To ensure the acute effects were not due to the gabra2 deletion affecting pharmacokinetics, blood ethanol concentrations were measured at 20-minute intervals after acute administration (2 g/kg i.p.) and did not differ between genotypes. To investigate ethanol's rewarding properties, WT and KO mice were trained to lever press to receive increasing concentrations of ethanol on an FR4 schedule of reinforcement. Both WT and KO mice self-administered ethanol at similar rates, with no differences in the numbers of reinforcers earned. These data indicate a protective role for α2-subunits against the acute sedative and ataxic effects of ethanol. However, no change was observed in ethanol self-administration, suggesting that the rewarding effects of ethanol remain unchanged.

    Clinical characteristics of different histologic types of breast cancer

    Breast cancer is a heterogeneous disease, though little is known about some of its rarer forms, including certain histologic types. Using Surveillance, Epidemiology, and End Results Program data on 135 157 invasive breast cancer cases diagnosed from 1992 to 2001, relationships between nine histologic types of breast cancer and various tumour characteristics were assessed. Among women aged 50–89 years at diagnosis, lobular and ductal/lobular carcinoma cases were more likely to be diagnosed with stage III/IV, ⩾5.0 cm, and node-positive tumours compared to ductal carcinoma cases. Mucinous, comedo, tubular, and medullary carcinomas were less likely to present at an advanced stage. Lobular, ductal/lobular, mucinous, tubular, and papillary carcinomas were less likely, and comedo, medullary, and inflammatory carcinomas were more likely, to be oestrogen receptor (ER) negative/progesterone receptor (PR) negative and high grade (notably, 68.2% of medullary carcinomas were ER−/PR− vs 19.3% of ductal carcinomas). In general, similar differences were observed among women diagnosed at age 30–49 years. Inflammatory carcinomas are associated with more aggressive tumour phenotypes, and mucinous, tubular, and papillary tumours are associated with less aggressive phenotypes. The histologic types of breast cancer studied here differ greatly in their clinical presentations, and the differences in their hormone receptor profiles and grades point to their likely different aetiologies.

    Early-life adversity selectively impairs α2-GABAA receptor expression in the mouse nucleus accumbens and influences the behavioral effects of cocaine

    Haplotypes of the Gabra2 gene encoding the α2-subunit of the GABAA receptor (GABAAR) are associated with drug abuse, suggesting that α2-GABAARs may play an important role in the circuitry underlying drug misuse. The genetic association of Gabra2 haplotypes with cocaine addiction appears to be evident primarily in individuals who had experienced childhood trauma. Given this association of childhood trauma, cocaine abuse and the Gabra2 haplotypes, we have explored in a mouse model of early life adversity (ELA) whether such events influence the behavioral effects of cocaine and if, as suggested by the human studies, α2-GABAARs in the nucleus accumbens (NAc) are involved in these perturbed behaviors. In adult mice, prior ELA caused a selective decrease in accumbal α2-subunit mRNA, resulting in a selective decrease in the number and size of the α2-subunit (but not the α1-subunit) immunoreactive clusters in NAc core medium spiny neurons (MSNs). Functionally, in adult MSNs, ELA decreased the amplitude and frequency of GABAAR-mediated miniature inhibitory postsynaptic currents (mIPSCs), a profile similar to that of α2 “knock-out” (α2−/−) mice. Behaviourally, adult male ELA and α2−/− mice exhibited an enhanced locomotor response to acute cocaine and blunted sensitisation upon repeated cocaine administration, when compared to their appropriate controls. Collectively, these findings reveal a neurobiological mechanism which may relate to the clinical observation that early trauma increases the risk for substance abuse disorder (SAD) in individuals harbouring haplotypic variations in the Gabra2 gene.

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded approach to mixture risk assessment.
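    The sensitivity to distributional assumptions mentioned above can be illustrated with a minimal Monte Carlo sketch (added here for illustration; it is not taken from the article, and the lognormal spreads are arbitrary assumptions): if the two 10-fold sub-factors of the default 100-fold factor are treated as lognormal variables with median 10, the upper percentiles of their product, and hence the implied level of protection, shift markedly with the assumed spread.

        # Illustrative sketch only (not from the article): the combined
        # uncertainty factor when the two 10-fold sub-factors are treated
        # probabilistically. The lognormal spreads (sigma) are arbitrary
        # assumptions chosen for illustration.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        deterministic = 10 * 10  # default: 10 (interspecies) x 10 (intraspecies)

        for sigma in (0.4, 0.8):
            # Each sub-factor has median 10; the assumed spread drives the
            # upper percentiles of the combined factor.
            inter = rng.lognormal(mean=np.log(10), sigma=sigma, size=n)
            intra = rng.lognormal(mean=np.log(10), sigma=sigma, size=n)
            combined = inter * intra
            print(f"sigma={sigma}: 95th percentile = "
                  f"{np.percentile(combined, 95):.0f} "
                  f"(deterministic product = {deterministic})")

    Under this assumed parameterisation the 95th percentile of the product grows well beyond 100 as the spread widens, which is one way of seeing why the choice of probability distributions matters for conclusions drawn from multiplying sub-factors.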

    Long-term (trophic) purinergic signalling: purinoceptors control cell proliferation, differentiation and death

    The purinergic signalling system, which uses purines and pyrimidines as chemical transmitters, and purinoceptors as effectors, is deeply rooted in evolution and development and is a pivotal factor in cell communication. ATP and its derivatives function as a 'danger signal' in the most primitive forms of life. Purinoceptors are extraordinarily widely distributed in all cell types and tissues and they are involved in the regulation of an even more extraordinary number of biological processes. In addition to fast purinergic signalling in neurotransmission, neuromodulation and secretion, there is long-term (trophic) purinergic signalling involving cell proliferation, differentiation, motility and death in the development and regeneration of most systems of the body. In this article, we focus on the latter in the immune/defence system, in stratified epithelia in visceral organs and skin, embryological development, bone formation and resorption, as well as in cancer. Cell Death and Disease (2010) 1, e9; doi:10.1038/cddis.2009.11; published online 14 January 2010.

    Prehospital transdermal glyceryl trinitrate in patients with ultra-acute presumed stroke (RIGHT-2): an ambulance-based, randomised, sham-controlled, blinded, phase 3 trial

    Background High blood pressure is common in acute stroke and is a predictor of poor outcome; however, large trials of lowering blood pressure have given variable results, and the management of high blood pressure in ultra-acute stroke remains unclear. We investigated whether transdermal glyceryl trinitrate (GTN; also known as nitroglycerin), a nitric oxide donor, might improve outcome when administered very early after stroke onset. Methods We did a multicentre, paramedic-delivered, ambulance-based, prospective, randomised, sham-controlled, blinded-endpoint, phase 3 trial in adults with presumed stroke within 4 h of onset, face-arm-speech-time score of 2 or 3, and systolic blood pressure 120 mm Hg or higher. Participants were randomly assigned (1:1) to receive transdermal GTN (5 mg once daily for 4 days; the GTN group) or a similar sham dressing (the sham group) in UK-based ambulances by paramedics, with treatment continued in hospital. Paramedics were unmasked to treatment, whereas participants were masked. The primary outcome was the 7-level modified Rankin Scale (mRS; a measure of functional outcome) at 90 days, assessed by central telephone follow-up with masking to treatment. Analysis was hierarchical, first in participants with a confirmed stroke or transient ischaemic attack (cohort 1), and then in all participants who were randomly assigned (intention to treat, cohort 2) according to the statistical analysis plan. This trial is registered with ISRCTN, number ISRCTN26986053. Findings Between Oct 22, 2015, and May 23, 2018, 516 paramedics from eight UK ambulance services recruited 1149 participants (n=568 in the GTN group, n=581 in the sham group). The median time to randomisation was 71 min (IQR 45–116). 597 (52%) patients had ischaemic stroke, 145 (13%) had intracerebral haemorrhage, 109 (9%) had transient ischaemic attack, and 297 (26%) had a non-stroke mimic at the final diagnosis of the index event. In the GTN group, participants’ systolic blood pressure was lowered by 5·8 mm Hg compared with the sham group (p<0·0001), and diastolic blood pressure was lowered by 2·6 mm Hg (p=0·0026) at hospital admission. We found no difference in mRS between the groups in participants with a final diagnosis of stroke or transient ischaemic attack (cohort 1): 3 (IQR 2–5; n=420) in the GTN group versus 3 (2–5; n=408) in the sham group, adjusted common odds ratio for poor outcome 1·25 (95% CI 0·97–1·60; p=0·083); we also found no difference in mRS between all patients (cohort 2: 3 [2–5]; n=544, in the GTN group vs 3 [2–5]; n=558, in the sham group; 1·04 [0·84–1·29]; p=0·69). We found no difference between treatment groups in secondary outcomes, death (treatment-related deaths: 36 in the GTN group vs 23 in the sham group [p=0·091]), or serious adverse events (188 in the GTN group vs 170 in the sham group [p=0·16]). Interpretation Prehospital treatment with transdermal GTN does not seem to improve functional outcome in patients with presumed stroke. It is feasible for UK paramedics to obtain consent and treat patients with stroke in the ultra-acute prehospital setting.

    Reconstructing grassland fire history using sedimentary charcoal: Considering count, size and shape

    Citation: Leys, B. A., Commerford, J. L., & McLauchlan, K. K. (2017). Reconstructing grassland fire history using sedimentary charcoal: Considering count, size and shape. Plos One, 12(4), 15. doi:10.1371/journal.pone.0176445. Fire is a key Earth system process, with 80% of annual fire activity taking place in grassland areas. However, past fire regimes in grassland systems have been difficult to quantify due to challenges in interpreting the charcoal signal in depositional environments. To improve reconstructions of grassland fire regimes, it is essential to assess two key traits: (1) charcoal count, and (2) charcoal shape. In this study, we quantified the number of charcoal pieces in 51 sediment samples from ponds in the Great Plains and tested its relevance as a proxy for the fire regime by examining 13 potential factors influencing charcoal count, including various fire regime components (e.g. the fire frequency, the area burned, and the fire season), vegetation cover and pollen assemblages, and climate variables. We also quantified the width to length (W:L) ratio of charcoal particles, to assess its utility as a proxy of fuel types in grassland environments by direct comparison with vegetation cover and pollen assemblages. Our first conclusion is that charcoal particles produced by grassland fires are smaller than those produced by forest fires. Thus, a mesh size of 120 μm as used in forested environments is too large for grassland ecosystems. We recommend counting all charcoal particles over 60 μm in grasslands and mixed grass-forest environments to increase the number of samples with useful data. Second, a W:L ratio of 0.5 or smaller appears to be an indicator for fuel types, when vegetation surrounding the site is composed of at least 40% grassland vegetation. Third, the area burned within 1060 m of the depositional environments explained both the count and the area of charcoal particles. Therefore, changes in charcoal count or charcoal area through time indicate a change in area burned. The fire regimes of grassland systems, including both human and climatic influences on fire behavior, can be characterized by long-term charcoal records.
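    For illustration only (the dimensions and field names below are hypothetical, not study data), the two thresholds discussed above, counting particles over 60 μm and treating a W:L ratio of 0.5 or smaller as a grass-fuel indicator, could be applied to measured particles along the following lines.

        # Hypothetical sketch applying the thresholds discussed above;
        # particle dimensions (in microns) are made-up example values.
        MIN_SIZE_UM = 60        # count particles over 60 microns in grasslands
        GRASS_WL_RATIO = 0.5    # W:L at or below 0.5 suggests grass-derived fuel

        particles = [
            {"width_um": 20, "length_um": 90},    # elongated: grass-like
            {"width_um": 70, "length_um": 100},   # blocky: wood-like
            {"width_um": 15, "length_um": 40},    # below the counting threshold
        ]

        for p in particles:
            if p["length_um"] <= MIN_SIZE_UM:
                continue  # too small to count under the 60-micron recommendation
            ratio = p["width_um"] / p["length_um"]
            fuel = "grass fuel" if ratio <= GRASS_WL_RATIO else "woody fuel"
            print(f"W:L = {ratio:.2f} -> {fuel}")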