
    Spoken Discourse Assessment and Analysis in Aphasia: An International Survey of Current Practices.

    Purpose: Spoken discourse analysis is commonly employed in the assessment and treatment of people living with aphasia, yet there is no standardization of assessment, analysis, or reporting procedures, which precludes comparison and meta-analysis of data and hinders replication of findings. An important first step is to identify current practices for collecting and analyzing spoken discourse in aphasia. This study therefore surveyed current practices, with the goal of standardizing spoken discourse assessment first in research settings and subsequently in clinical settings.

    Method: A mixed-methods (quantitative and qualitative) survey was publicized to researchers and clinicians around the globe who have collected and/or analyzed spoken discourse data in aphasia. Survey data were collected between September and November 2019.

    Results: Of the 201 individuals who consented to participate, 189 completed all mandatory questions (fewer completed the nonmandatory questions). The majority of respondents reported barriers to utilizing discourse, including transcription, coding, and analysis; the most commonly cited barrier was lack of time. Respondents also indicated a lack of, and a need for, psychometric properties and normative data for spoken discourse use in the assessment and treatment of persons with aphasia. Quantitative and qualitative results are described in detail.

    Conclusions: This survey evaluated spoken discourse methods in aphasia across research and clinical settings. Its findings will be used to guide process standardization in spoken discourse and the creation of a database of psychometric properties and normative data.

    Supplemental Material: https://doi.org/10.23641/asha.166395100

    Calibrating fluvial erosion laws and quantifying river response to faulting in Sardinia, Italy

    It is now widely accepted that rivers modify their erosion rates in response to variable rock uplift rates, resulting in changes in channel slope that propagate upstream through time. Present-day river morphology may therefore contain a record of tectonic history. The simple stream power incision model can, in principle, be used to quantify past uplift rates over a variety of spatial and temporal scales. Nonetheless, the model's exponents of area and slope (m and n respectively) and the 'bedrock erodibility' (k) remain poorly constrained. In this paper, we use a geologically and geomorphically well-constrained Plio-Pleistocene volcanic landscape in central Sardinia, Italy, to calibrate the stream power erosion equation and to investigate the slip rate of faults that have been seismically quiescent in the historic past. By analysing digital elevation models, geological maps and Landsat imagery, we have identified the geomorphic expression of several volcanic features (eruption centres and basaltic lava flows) and three normal faults with 6 to 8 km fault traces within the outcrop. Downstream, river longitudinal profiles show a similar transient response to relative base-level fall, probably as a result of relief inversion at the edge of the volcanic outcrop. From measurements of incision, local slope and upstream catchment area across eight different rivers, we calculate n ≈ 1, m = 0.50 ± 0.02 and, using a landscape age from the literature of 2.7 Ma, bedrock erodibility k = 0.10 ± 0.04 m^(1−2m) Myr^−1. There are also knickpoints on rivers upstream of two normal faults, and we used numerical inverse modelling of the longitudinal profiles to predict the slip rate of these faults since 2.7 Ma. The inverse model shows that the erosional parameter values derived in this study can produce theoretical longitudinal profiles that closely resemble the observed river profiles upstream of the faults. The lowest-misfit theoretical longitudinal profiles were generated by a model of temporally discontinuous footwall uplift with consistently low throw rates (< 0.1 mm yr^−1). The predicted footwall uplift history is similar for the two faults, both showing periods of fault slip and of no fault movement since 2.7 Ma.

    Keywords: Stream power; Normal fault; Basalt; Sardinia
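    As a rough illustration of the calibrated stream power incision law E = k A^m S^n, a minimal sketch using the central parameter estimates reported in the abstract (the function name is illustrative; units are assumed to be drainage area A in m² and erosion rate E in m per Myr, consistent with k in m^(1−2m) Myr^−1):

    ```python
    def stream_power_erosion(area_m2, slope, k=0.10, m=0.5, n=1.0):
        """Stream power incision law E = k * A**m * S**n.

        Defaults are the central estimates from the abstract
        (k = 0.10 m^(1-2m) Myr^-1, m = 0.50, n ~ 1); with m = 0.5
        the returned erosion rate E is in metres per Myr.
        """
        return k * area_m2 ** m * slope ** n

    # e.g. a 1 km^2 catchment (1e6 m^2) with local channel slope 0.05:
    # E = 0.10 * (1e6)**0.5 * 0.05 = 5 m/Myr
    ```

    Because m ≈ 0.5 here, doubling the drainage area increases the predicted erosion rate by only a factor of √2, whereas erosion scales linearly with slope (n ≈ 1).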

    An investigation of minimisation criteria

    Minimisation can be used within treatment trials to ensure that prognostic factors are evenly distributed between treatment groups. The technique is relatively straightforward to apply, but it does require running tallies of patient recruitment to be kept and some simple calculations to be performed before each allocation. As computing facilities have become more widely available, minimisation has become a feasible option for many trials. Although the technique has increased in popularity, the mode of application is often poorly reported and the choice of input parameters is not justified in any logical way.
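    The running tallies and simple per-allocation calculations described above can be sketched as marginal-totals minimisation (a common Pocock–Simon variant; the function name, data layout and parameters are illustrative, not from the article):

    ```python
    import random

    def minimisation_allocate(new_patient, allocations, factors,
                              groups=("A", "B"), p=1.0):
        """Allocate a patient by marginal-totals minimisation.

        new_patient -- dict of prognostic factor levels, e.g. {"sex": "F"}
        allocations -- list of (patient_dict, group) already allocated
        factors     -- factor names used as minimisation criteria
        p           -- probability of choosing the minimising group
                       (p = 1 is deterministic; p < 1 is a biased coin)
        """
        # Running tallies: for each candidate group, count patients already
        # in that group sharing the new patient's level on each factor.
        scores = {
            g: sum(
                sum(1 for pat, grp in allocations
                    if grp == g and pat[f] == new_patient[f])
                for f in factors
            )
            for g in groups
        }
        best_score = min(scores.values())
        candidates = [g for g in groups if scores[g] == best_score]
        if len(candidates) > 1:
            return random.choice(candidates)   # tie: allocate at random
        if random.random() < p:
            return candidates[0]               # favour the minimising group
        return random.choice([g for g in groups if g != candidates[0]])
    ```

    With p = 1 the rule is deterministic, which is one of the input-parameter choices the article notes is rarely justified; taking p < 1 retains an element of randomness at the cost of slightly worse balance.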

    Limit on the mass of a long-lived or stable gluino

    We reinterpret the generic CDF charged massive particle limit to obtain a limit on the mass of a stable or long-lived gluino. Various sources of uncertainty are examined. The R-hadron spectrum and scattering cross sections are modeled based on known low-energy hadron physics, and the resultant uncertainties are quantified and found to be small compared to the uncertainties from the scale dependence of the NLO pQCD production cross sections. The largest uncertainty in the limit comes from the unknown squark mass: when the squark–gluino mass splitting is small, we obtain a gluino mass limit of 407 GeV, while in the limit of heavy squarks the gluino mass limit is 397 GeV. For arbitrary (degenerate) squark masses, we obtain a lower limit of 322 GeV on the gluino mass. These limits apply for any gluino lifetime longer than ∼30 ns and are the most stringent limits for such a long-lived or stable gluino. Comment: 15 pages, 5 figures, accepted for publication in JHEP

    There is no market for new antibiotics: This allows an open approach to research and development

    There is an increasingly urgent need for new antibiotics, yet there is a significant and persistent economic problem when it comes to developing such medicines. The problem stems from the perceived need for a 'market' to drive commercial antibiotic development. In this article, we explore abandoning the market as a prerequisite for successful antibiotic research and development. Once one stops trying to fix a market model that has stopped functioning, one is free to carry out research and development (R&D) in ways that are more openly collaborative, a mechanism that has been demonstrably effective for the R&D underpinning the response to the COVID-19 pandemic. New 'open source' research models have great potential for the development of medicines in areas of public health where the traditional profit-driven model struggles to deliver. New financial initiatives, including major push/pull incentives, aimed at fixing the broken antibiotics market provide one possible means of funding an openly collaborative approach to drug development. We argue that now is therefore the time to evaluate, at scale, whether such methods can deliver new medicines to patients in a timely manner.