
    Semantic diversity: A measure of contextual variation in word meaning based on latent semantic analysis

    Semantic ambiguity is typically measured by summing the number of senses or dictionary definitions that a word has. Such measures are somewhat subjective and may not adequately capture the full extent of variation in word meaning, particularly for polysemous words that can be used in many different ways, with subtle shifts in meaning. Here, we describe an alternative, computationally derived measure of ambiguity based on the proposal that the meanings of words vary continuously as a function of their contexts. On this view, words that appear in a wide range of contexts on diverse topics are more variable in meaning than those that appear in a restricted set of similar contexts. To quantify this variation, we performed latent semantic analysis on a large text corpus to estimate the semantic similarities of different linguistic contexts. From these estimates, we calculated the degree to which the different contexts associated with a given word vary in their meanings. We term this quantity a word's semantic diversity (SemD). We suggest that this approach provides an objective way of quantifying the subtle, context-dependent variations in word meaning that are often present in language. We demonstrate that SemD is correlated with other measures of ambiguity and contextual variability, as well as with frequency and imageability. We also show that SemD is a strong predictor of performance in semantic judgments in healthy individuals and in patients with semantic deficits, accounting for unique variance beyond that of other predictors. SemD values for over 30,000 English words are provided as supplementary materials. © 2012 Psychonomic Society, Inc
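The SemD computation described above can be illustrated with a minimal sketch. It assumes each linguistic context has already been reduced to an LSA vector, and takes SemD as the negative log of the mean pairwise cosine similarity among the contexts containing a word; this formulation and the function names are illustrative assumptions, not the authors' code:

```python
import math
from itertools import combinations

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def semantic_diversity(context_vectors):
    """SemD of a word: negative log of the mean pairwise cosine
    similarity among the LSA vectors of the contexts in which the
    word occurs (an assumed formulation, for illustration only)."""
    sims = [cosine(u, v) for u, v in combinations(context_vectors, 2)]
    return -math.log(sum(sims) / len(sims))
```

Contexts that are nearly identical give a mean similarity near 1 and a SemD near 0; the more the word's contexts diverge in meaning, the lower the mean similarity and the higher the SemD.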

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.

    Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.

    Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.

    Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made, relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of the multiple perspectives and collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

    Logopenic and nonfluent variants of primary progressive aphasia are differentiated by acoustic measures of speech production

    Differentiation of the logopenic (lvPPA) and nonfluent/agrammatic (nfvPPA) variants of primary progressive aphasia is important yet remains challenging, since it hinges on expert-based evaluation of speech and language production. In this study, acoustic measures of speech were used in conjunction with voxel-based morphometry to determine the success of the measures as an adjunct to diagnosis and to explore the neural basis of apraxia of speech in nfvPPA. Forty-one patients (21 lvPPA, 20 nfvPPA) were recruited from a consecutive sample with suspected frontotemporal dementia. Patients were diagnosed using the current gold standard of expert perceptual judgment, based on the presence or absence of particular speech features during speaking tasks. Seventeen healthy age-matched adults served as controls. MRI scans were available for 11 control and 37 PPA cases; 23 of the PPA cases underwent amyloid ligand PET imaging. The measures, corresponding to perceptual features of apraxia of speech, were periods of silence during reading and relative vowel duration and intensity in polysyllable word repetition. Discriminant function analyses revealed that a measure of relative vowel duration differentiated nfvPPA cases from both control and lvPPA cases (r2 = 0.47), with 88% agreement with expert judgment of the presence of apraxia of speech in nfvPPA cases. VBM analysis showed that relative vowel duration covaried with grey matter intensity in areas critical for speech motor planning and programming: the precentral gyrus, supplementary motor area and inferior frontal gyrus bilaterally, affected only in the nfvPPA group. This bilateral involvement of frontal speech networks in nfvPPA potentially affects access to compensatory mechanisms involving right hemisphere homologues. Measures of silences during reading also discriminated the PPA and control groups, but did not increase predictive accuracy. Findings suggest that a measure of relative vowel duration from a polysyllable word repetition task may be sufficient for detecting most cases of apraxia of speech and distinguishing between nfvPPA and lvPPA.

    Investigating A Dose Response Relationship between High Fat Diet Consumption and the Contractile Performance of Isolated Mouse Soleus, EDL and Diaphragm Muscles

    Purpose: Recent evidence has demonstrated an obesity-induced, skeletal muscle-specific reduction in contractile performance. The extent and magnitude of these changes in relation to the total dose of high-fat diet consumption remain unclear. This study aimed to examine the dose–response relationship between a high-fat diet and isolated skeletal muscle contractility.

    Methods: 120 female CD1 mice were randomly assigned to either a control group or groups receiving 2, 4, 8 or 12 weeks of a high-calorie diet (N = 24). At 20 weeks, soleus, EDL or diaphragm muscle was isolated (n = 8 in each case) and isometric force, work loop power output and fatigue resistance were measured.

    Results: When analysed with respect to feeding duration, there was no effect of diet on the measured parameters prior to 8 weeks of feeding. Compared to controls, 8 weeks of feeding caused a reduction in the normalised power of the soleus, and 8 and 12 weeks of feeding caused reduced normalised isometric force, power and fatigue resistance of the EDL. Diaphragm from the 12-week group produced lower normalised power, whereas the 8- and 12-week groups produced significantly lower normalised isometric force. Correlation statistics indicated that body fat accumulation and the decline in contractility are specific to the individual and independent of feeding duration.

    Conclusion: The data indicate that a high-fat diet causes a decline in muscle quality, with specific contractile parameters being affected in each muscle. We also uniquely demonstrate that the amount of fat gain, irrespective of feeding duration, may be the main factor in reducing contractile performance.

    Can programme theory be used as a 'translational tool' to optimise health service delivery in a national early years' initiative in Scotland: a case study

    Background: Theory-based evaluation (TBE) approaches are heralded as supporting formative evaluation by facilitating increased use of evaluative findings to guide programme improvement. If interventions are to be as effective as they have the potential to be, it is essential that learning from programme implementation is better used to improve delivery and to inform other initiatives. Nonetheless, few studies describe formative feedback methods, or report direct instrumental use of findings resulting from TBE. This paper uses the case of Childsmile, Scotland's National Health Service early years' oral health improvement initiative, to describe the use of TBE as a framework for providing feedback on delivery to programme staff and to assess its impact on programmatic action.

    Methods: In-depth, semi-structured interviews and focus groups with key stakeholders explored perceived deviations between the Childsmile programme 'as delivered' and its Programme Theory (PT). The data were thematically analysed using constant comparative methods. Findings were shared with key programme stakeholders, and discussions around likely impact and necessary actions were facilitated by the authors. Documentary review and ongoing observation of programme meetings were undertaken to assess the extent to which learning was acted upon.

    Results: On the whole, the activities documented in Childsmile's PT were implemented as intended. This paper purposefully focuses on those activities where variation in delivery was evident. Differences resulted from the stage of roll-out reached and from the flexibility given to individual NHS boards to tailor local implementation. Some adaptations were thought to have diverged from the central features of Childsmile's PT, to the extent that achieving outcomes was at risk. The methods employed prompted national service improvement action, and proposals for local action by individual NHS boards to address this.

    Conclusions: The TBE approach provided a platform to direct attention to areas of risk within a national health initiative, and to agree which intervention components were 'core' to its hypothesised success. The study demonstrates that PT can be used as a 'translational tool' to facilitate instrumental use of evaluative findings to optimise implementation within a complex health improvement programme.

    Long term time variability of cosmic rays and possible relevance to the development of life on Earth

    An analysis is made of the manner in which the cosmic ray intensity at Earth has varied over its existence, and of its possible relevance to both the origin and the evolution of life. Much of the analysis relates to 'high energy' cosmic rays (E > 10^14 eV = 0.1 PeV) and their variability due to the changing proximity of the solar system to supernova remnants, which are generally believed to be responsible for most cosmic rays up to PeV energies. It is pointed out that, on a statistical basis, there will have been considerable variations in the likely 100 My between the Earth's biosphere reaching reasonable stability and the onset of very elementary life. Interestingly, there is an increasingly strong possibility that PeV cosmic rays are responsible for the initiation of terrestrial lightning strokes, raising the possibility of considerable increases in the frequency of lightning and thereby in the formation of some of the complex molecules which are the 'building blocks of life'. Attention is also given to the well-known generation of oxides of nitrogen by lightning strokes, which are poisonous to animal life but helpful to plant growth; here, too, the violent swings of cosmic ray intensities may have had relevance to evolutionary changes. A particular variant of the cosmic ray acceleration model, put forward by us, predicts an increase in the lightning rate in the past, and this has been sought in Korean historical records. Finally, the time dependence of the overall cosmic ray intensity, which manifests itself mainly at sub-10 GeV energies, has been examined. The relevance of cosmic rays to the 'global electrical circuit' points to the importance of this concept. Comment: 18 pages, 5 figures, accepted by 'Surveys in Geophysics'.

    Ferritins: furnishing proteins with iron

    Ferritins are a superfamily of iron oxidation, storage and mineralization proteins found throughout the animal, plant, and microbial kingdoms. The majority of ferritins consist of 24 subunits that individually fold into 4-α-helix bundles and assemble in a highly symmetric manner to form an approximately spherical protein coat around a central cavity in which an iron-containing mineral can form. Channels through the coat at inter-subunit contact points facilitate the passage of iron ions to and from the central cavity, and intra-subunit catalytic sites, called ferroxidase centers, drive Fe2+ oxidation and O2 reduction. Though the different members of the superfamily share a common structure, there is often little amino acid sequence identity between them. Even where there is a high degree of sequence identity between two ferritins, there can be major differences in how the proteins handle iron. In this review we describe some of the important structural features of ferritins and their mineralized iron cores, and examine in detail how three selected ferritins oxidise Fe2+, in order to explore the mechanistic variations that exist amongst ferritins. We suggest that the mechanistic differences reflect differing evolutionary pressures on amino acid sequences, and that these differing pressures are a consequence of different primary functions for different ferritins.

    S-COL: A Copernican turn for the development of flexibly reusable collaboration scripts

    Collaboration scripts are usually implemented as parts of a particular collaborative-learning platform. As a result, scripts of demonstrated effectiveness are rarely used with learning platforms at other sites, and replication studies are rare. The approach of a platform-independent description language for scripts, which would allow easy implementation of the same script on different platforms, has not yet succeeded in making the transfer of scripts feasible. We present an alternative solution that treats the problem as a special case of providing support on top of diverse Web pages: here, the challenge is to trigger support based on the recognition of a Web page as belonging to a specific type of functionally equivalent pages, such as the search query form or the results page of a search engine. The suggested solution has been implemented in a tool called S-COL (Scripting for Collaborative Online Learning) and allows for the sustainable development of scripts and scaffolds that can be used with a broad variety of content and platforms. The tool's functions are described. In order to demonstrate the feasibility and ease of script reuse with S-COL, we describe the flexible re-implementation of a collaboration script for argumentation in S-COL and its adaptation to different learning platforms. To demonstrate that a collaboration script implemented in S-COL can actually foster learning, we present an empirical study of the effects of a specific script for collaborative online search on learning activities. The further potentials and the limitations of the S-COL approach are discussed.
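The core idea, triggering support once a page is recognised as belonging to a type of functionally equivalent pages, can be sketched roughly as follows. The recognition rules, prompts, and function names below are illustrative assumptions, not S-COL's actual rule format or API:

```python
import re

# Hypothetical recognition rules: each functional page type is
# identified by URL patterns, so functionally equivalent pages on
# different platforms map to the same type.
PAGE_TYPES = {
    "search_query_form": [r"google\.[a-z]+/$", r"bing\.com/$"],
    "search_results": [r"[?&]q="],
}

# Scaffolding prompts attached per page type, not per platform,
# so one collaboration script covers every matching site.
PROMPTS = {
    "search_query_form": "Agree on search terms with your partner first.",
    "search_results": "Compare the top results before opening any.",
}

def classify_page(url):
    """Return the functional page type of a URL, or None."""
    for page_type, patterns in PAGE_TYPES.items():
        if any(re.search(p, url) for p in patterns):
            return page_type
    return None

def scaffold_for(url):
    """Return the collaboration prompt to display on this page."""
    return PROMPTS.get(classify_page(url))
```

A scaffold written against the abstract page type ("results page of a search engine") then works unchanged across platforms; supporting a new platform only requires adding a recognition pattern.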

    A rocky planet transiting a nearby low-mass star

    M-dwarf stars -- hydrogen-burning stars smaller than 60 per cent of the size of the Sun -- are the most common class of star in our Galaxy and outnumber Sun-like stars by a ratio of 12:1. Recent results have shown that M dwarfs host Earth-sized planets in great numbers: the average number of M-dwarf planets between 0.5 and 1.5 times the size of Earth is at least 1.4 per star. The nearest such planets known to transit their star are 39 parsecs away, too distant for detailed follow-up observations to measure the planetary masses or to study their atmospheres. Here we report observations of GJ 1132b, a planet with a size of 1.2 Earth radii that is transiting a small star 12 parsecs away. Our Doppler mass measurement of GJ 1132b yields a density consistent with an Earth-like bulk composition, similar to the compositions of the six known exoplanets with masses less than six times that of the Earth and precisely measured densities. Receiving 19 times more stellar radiation than the Earth, the planet is too hot to be habitable but is cool enough to support a substantial atmosphere, one that has probably been considerably depleted of hydrogen. Because the host star is nearby and only 21 per cent the radius of the Sun, existing and upcoming telescopes will be able to observe the composition and dynamics of the planetary atmosphere. Comment: Published in Nature on 12 November 2015, available at http://dx.doi.org/10.1038/nature15762. This is the authors' version of the manuscript.

    Marginalization of end-use technologies in energy innovation for climate protection

    Mitigating climate change requires directed innovation efforts to develop and deploy energy technologies. Innovation activities are directed towards the outcome of climate protection by public institutions, policies and resources that in turn shape market behaviour. We analyse diverse indicators of activity throughout the innovation system to assess these efforts. We find that efficient end-use technologies contribute large potential emission reductions and provide higher social returns on investment than energy-supply technologies. Yet public institutions, policies and financial resources pervasively privilege energy-supply technologies. Directed innovation efforts are strikingly misaligned with the needs of an emissions-constrained world. Significantly greater effort is needed to develop the full potential of efficient end-use technologies.