
    Verifying Different-modality Properties for Concepts Produces Switching Costs

    According to perceptual symbol systems (Barsalou, 1999), sensory-motor simulations underlie the representation of concepts. It follows that sensory-motor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.

    Highly unsaturated fatty acid synthesis in marine fish: Cloning, functional characterization, and nutritional regulation of fatty acyl delta6 desaturase of Atlantic cod (Gadus morhua L.)

    Fish contain high levels of the n-3 highly unsaturated fatty acids (HUFA), eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), which are crucial to the health of higher vertebrates. Biosynthesis of HUFA requires enzyme-mediated desaturation of fatty acids. Here we report the cloning and functional characterisation of a Δ6 fatty acyl desaturase of Atlantic cod (Gadus morhua), and describe its tissue expression and nutritional regulation. PCR primers were designed based on the sequences of conserved motifs in available fish desaturases and used to isolate a cDNA fragment from cod liver. The full-length cDNA was obtained by Rapid Amplification of cDNA Ends (RACE). The cDNA for the putative fatty acyl desaturase comprised 1980 bp, including a 5’-UTR of 261 bp and a 3’-UTR of 375 bp. Sequencing revealed that the cDNA included an ORF of 1344 bp that specified a protein of 447 amino acids. The protein sequence included three histidine boxes, two transmembrane regions, and an N-terminal cytochrome b5 domain containing the haem-binding motif HPGG, all of which are characteristic of microsomal fatty acid desaturases. The cDNA displayed Δ6 desaturase activity in a heterologous yeast expression system. Quantitative real-time PCR assay of gene expression in cod showed that the Δ6 desaturase gene was highly expressed in brain; relatively highly expressed in liver, kidney, intestine, red muscle and gill; and expressed at much lower levels in white muscle, spleen and heart. In contrast, the abundance of a cod fatty acyl elongase transcript was high in brain and gill, with intermediate levels in kidney, spleen, intestine and heart, and relatively low expression in liver. The expression of the Δ6 desaturase gene and the PUFA elongase gene may be under a degree of nutritional regulation, with levels marginally increased in the livers and intestines of fish fed a vegetable oil blend by comparison with levels in fish fed fish oil. However, this was not reflected in increased Δ6 desaturase activity in hepatocytes or enterocytes, which showed very little HUFA biosynthesis activity irrespective of diet. This study demonstrates that Atlantic cod express a fatty acid desaturase gene with functional Δ6 activity in a yeast expression system. This is consistent with the established hypothesis that the poor ability of marine fish to synthesise HUFA is due not to the lack of a Δ6 desaturase, but to deficiencies in other parts of the biosynthetic pathway. However, further studies are required to determine why the Δ6 desaturase appears to be barely functional in cod under the conditions tested.
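    The reported segment lengths are internally consistent, which is worth a quick arithmetic check: the UTRs and ORF sum to the full-length cDNA, and the ORF encodes the stated protein length once the stop codon is discounted. A minimal sketch, using only the figures quoted in the abstract:

```python
# Consistency check for the reported cod Delta-6 desaturase cDNA segments
# (all values taken directly from the abstract above).
utr5, orf, utr3 = 261, 1344, 375  # segment lengths in base pairs

total = utr5 + orf + utr3  # 261 + 1344 + 375 = 1980 bp full-length cDNA
codons = orf // 3          # 1344 / 3 = 448 codons, including the stop codon
protein_len = codons - 1   # the stop codon is not translated

print(total)        # 1980 bp, matching the reported full-length cDNA
print(protein_len)  # 447 amino acids, matching the reported protein
```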

    Leaf litter decomposition -- Estimates of global variability based on Yasso07 model

    Litter decomposition is an important process in the global carbon cycle. It accounts for most of the heterotrophic soil respiration and results in the formation of more stable soil organic carbon (SOC), which is the largest terrestrial carbon stock. Because it is a climate-dependent process, litter decomposition may induce considerable feedbacks to climate change. To investigate the global patterns of litter decomposition, we developed a description of this process and tested its validity against a large set of foliar litter mass-loss measurements (nearly 10 000 data points derived from approximately 70 000 litter bags). We applied the Markov chain Monte Carlo method to estimate uncertainty in the parameter values and results of our model, called Yasso07. The model appeared globally applicable. It estimated the effects of litter type (plant species) and climate on mass loss with little systematic error over the first 10 decomposition years, using only initial litter chemistry, air temperature and precipitation as input variables. Illustrating the global variability in litter mass-loss rates, our example calculations showed that a typical conifer litter still had 68% of its initial mass remaining after two decomposition years in tundra, whereas a deciduous litter had only 15% remaining in the tropics. Uncertainty in these estimates, a direct result of the uncertainty in the model's parameter values, varied with the distribution of the litter-bag data among climate conditions and ranged from 2% in tundra to 4% in the tropics. This reliability was adequate to use the model to distinguish the effects of even small differences in litter quality or climate conditions on litter decomposition as statistically significant. Comment: 19 pages, to appear in Ecological Modelling.
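    The headline figures above imply very different decay rates across climates. Yasso07 itself is a multi-pool model, but as an illustrative back-of-envelope calculation one can fit the quoted fractions to a single-pool exponential model m(t) = m0 · exp(−kt); the decay constants below are derived only from the 68% and 15% figures in the abstract, not from the model itself:

```python
import math

# Back-of-envelope decay constants implied by the abstract's example figures,
# assuming single-pool first-order decay m(t) = m0 * exp(-k * t).
# Yasso07 uses several interacting pools; this sketch is only illustrative.

def decay_constant(fraction_remaining: float, years: float) -> float:
    """First-order decay constant k (per year) from the mass fraction remaining after t years."""
    return -math.log(fraction_remaining) / years

k_tundra = decay_constant(0.68, 2.0)   # conifer litter in tundra: 68% left after 2 years
k_tropics = decay_constant(0.15, 2.0)  # deciduous litter in the tropics: 15% left after 2 years

print(f"tundra:  k = {k_tundra:.2f} per year")   # about 0.19 / yr
print(f"tropics: k = {k_tropics:.2f} per year")  # about 0.95 / yr
```

    The roughly fivefold difference in k gives a concrete sense of the climate gradient the model has to capture.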

    Continuous cerebroventricular administration of dopamine: A new treatment for severe dyskinesia in Parkinson’s disease?

    In Parkinson’s disease (PD), depletion of dopamine in the nigro-striatal pathway is a main pathological hallmark that requires continuous and focal restoration. Intermittent oral administration of its precursor, levodopa (l-dopa), remains the gold-standard treatment, but its pharmacological drawbacks trigger motor fluctuations and dyskinesia. Continuous intracerebroventricular (i.c.v.) administration of dopamine previously failed as a therapy because of an inability to resolve accelerated dopamine oxidation and tachyphylaxis. We aim to overcome these prior challenges by demonstrating the feasibility and efficacy of continuous i.c.v. administration of dopamine close to the striatum. Dopamine prepared either anaerobically (A-dopamine) or aerobically (O-dopamine), in the presence or absence of a preservative (sodium metabisulfite, SMBS), was assessed after acute MPTP and chronic 6-OHDA lesioning and compared to peripheral l-dopa treatment. A-dopamine restored motor function and induced a dose-dependent increase in nigro-striatal tyrosine hydroxylase-positive neurons in mice after 7 days of MPTP insult that was not evident with either O-dopamine or l-dopa. In the 6-OHDA rat model, continuous circadian i.c.v. injection of A-dopamine over 30 days also improved motor activity without occurrence of tachyphylaxis. The safety profile was highly favorable, as A-dopamine did not induce the dyskinesia or behavioral sensitization observed with peripheral l-dopa treatment. Indicating a new therapeutic strategy for patients suffering from l-dopa-related complications with dyskinesia, continuous i.c.v. administration of A-dopamine showed greater efficacy in alleviating motor impairment over a large therapeutic index, without inducing dyskinesia or tachyphylaxis.

    Can forest management based on natural disturbances maintain ecological resilience?

    Given the increasingly global stresses on forests, many ecologists argue that managers must maintain ecological resilience: the capacity of ecosystems to absorb disturbances without undergoing fundamental change. In this review, we ask: can the emerging paradigm of natural-disturbance-based management (NDBM) maintain ecological resilience in managed forests? Applying resilience theory requires careful articulation of the ecosystem state under consideration, the disturbances and stresses that affect the persistence of possible alternative states, and the spatial and temporal scales of management relevance. Implementing NDBM while maintaining resilience means recognizing that (i) biodiversity is important for long-term ecosystem persistence, (ii) natural disturbances play a critical role as a generator of structural and compositional heterogeneity at multiple scales, and (iii) traditional management tends to produce forests more homogeneous than those disturbed naturally and increases the likelihood of unexpected catastrophic change by constraining variation of key environmental processes. NDBM may maintain resilience if silvicultural strategies retain the structures and processes that perpetuate desired states while reducing those that enhance the resilience of undesirable states. Such strategies require an understanding of harvesting impacts on slow ecosystem processes, such as seed-bank or nutrient dynamics, which in the long term can lead to ecological surprises by altering the forest's capacity to reorganize after disturbance.

    Effects of cleaning methods upon preservation of stable isotopes and trace elements in shells of Cyprideis torosa (Crustacea, Ostracoda): implications for palaeoenvironmental reconstruction

    The trace element (Sr/Ca and Mg/Ca) and stable isotope (δ¹⁸O and δ¹³C) geochemistry of fossil ostracod valves provides valuable information, particularly in lacustrine settings, on palaeo-water composition and palaeotemperature. The removal of sedimentary and organic contamination prior to geochemical analysis is essential to avoid bias of the results. Previous stable isotope and trace element work on ostracod shells has, however, employed different treatments for the removal of contamination beyond simple ‘manual’ cleaning using a paint brush and methanol under a low-power binocular microscope. For isotopic work, pre-treatments include chemical oxidation, vacuum roasting and plasma ashing; for trace element work, sonication, chemical oxidation and reductive cleaning. The impact of different treatments on the geochemical composition of the valve calcite has not been evaluated in full, and a universal protocol has not been established. Here, a systematic investigation of the cleaning methods is undertaken using specimens of the ubiquitous euryhaline species Cyprideis torosa. Cleaning methods are evaluated by undertaking paired analyses on a single carapace (comprising two valves); in modern ostracods, whose valves are assumed to be unaltered, the two valves should have identical geochemical and isotopic compositions. Hence, when one valve is subjected to the chosen treatment and the other to simple manual cleaning, any difference in composition can confidently be assigned to the treatment method. We show that certain cleaning methods have the potential to alter the geochemical signal, particularly Mg/Ca and δ¹⁸O, and hence have implications for palaeoenvironmental reconstructions. For trace element determinations we recommend cleaning by sonication, and for stable isotope analysis, oxidation by hydrogen peroxide. These methods remove contamination, yet do not significantly alter the geochemical signal.
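    The paired-valve design lends itself to a paired comparison: for each carapace, the treated-minus-control difference isolates the treatment effect, and the mean difference can be tested against zero. A minimal sketch of that logic, using hypothetical Mg/Ca values that are illustrative only and not measurements from the study:

```python
import statistics

# Sketch of the paired-valve design: each carapace yields one treated valve
# and one manually cleaned control valve, so any systematic offset between
# the two can be attributed to the treatment. The values below are
# hypothetical, not data from the study.
treated_mg_ca = [4.10, 3.85, 4.42, 3.97, 4.21]  # Mg/Ca (mmol/mol), treated valves
control_mg_ca = [4.02, 3.90, 4.30, 3.88, 4.15]  # paired manually cleaned valves

diffs = [t - c for t, c in zip(treated_mg_ca, control_mg_ca)]
mean_diff = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)
n = len(diffs)
t_stat = mean_diff / (sd_diff / n ** 0.5)  # paired t statistic, df = n - 1

print(f"mean difference = {mean_diff:.3f} mmol/mol, t = {t_stat:.2f} (df = {n - 1})")
# Comparing |t| against the critical value for df = n - 1 indicates whether
# the treatment systematically alters the signal.
```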

    Cosmological parameters from SDSS and WMAP

    We measure cosmological parameters using the three-dimensional power spectrum P(k) from over 200,000 galaxies in the Sloan Digital Sky Survey (SDSS) in combination with WMAP and other data. Our results are consistent with a “vanilla” flat adiabatic Lambda-CDM model without tilt (n=1), running tilt, tensor modes or massive neutrinos. Adding SDSS information more than halves the WMAP-only error bars on some parameters, tightening the 1-sigma constraints on the Hubble parameter from h ~ 0.74 +0.18/-0.07 to h ~ 0.70 +0.04/-0.03, on the matter density from Omega_m ~ 0.25 +/- 0.10 to Omega_m ~ 0.30 +/- 0.04 (1 sigma), and on neutrino masses from < 11 eV to < 0.6 eV (95%). SDSS helps even more when prior assumptions about curvature, neutrinos, tensor modes and the equation of state are dropped. Our results are in substantial agreement with the joint analysis of WMAP and the 2dF Galaxy Redshift Survey, which is an impressive consistency check with independent redshift-survey data and analysis techniques. In this paper, we place particular emphasis on clarifying the physical origin of the constraints, i.e., what we do and do not know when using different data sets and prior assumptions. For instance, dropping the assumption that space is perfectly flat, the WMAP-only constraint on the measured age of the Universe tightens from t0 ~ 16.3 +2.3/-1.8 Gyr to t0 ~ 14.1 +1.0/-0.9 Gyr when SDSS and SN Ia data are added. Including tensors, running tilt, neutrino mass and the equation of state in the list of free parameters, many constraints are still quite weak, but future cosmological measurements from SDSS and other sources should allow these to be substantially tightened. Comment: Minor revisions to match the accepted PRD version. SDSS data and ppt figures available at http://www.hep.upenn.edu/~max/sdsspars.htm

    Health inequalities, fundamental causes and power: Towards the practice of good theory

    Reducing health inequalities remains a challenge for policy makers across the world. Taking as its starting point Lewin’s famous dictum that “there is nothing as practical as a good theory”, this paper begins with an appreciative discussion of ‘fundamental cause theory’, emphasizing the elegance of its theoretical encapsulation of the challenge, the relevance of its critical focus for action, and its potential to support the practical mobilisation of knowledge in generating change. It is argued, moreover, that recent developments in the theory provide an opportunity for further theoretical development focused more clearly on the concept of power (Dickie et al. 2015). A critical focus on power as the essential element in maintaining, increasing or reducing social and economic inequalities – including health inequalities – can both enhance the coherence of the theory and strengthen the capacity to challenge the roots of health inequalities at different levels and scales. This paper provides an initial contribution by proposing a framework to help identify the most important sources, forms and positions of power, as well as the social spaces in which they operate. Subsequent work could usefully test, elaborate and adapt this framework, or indeed ultimately replace it with something better, to help focus actions to reduce inequalities.