
    What do patients prefer their functional seizures to be called, and what are their experiences of diagnosis? - A mixed methods investigation

    This study explored the preferred terms for functional seizures, and the experience of being diagnosed, from the patient’s perspective. 39 patients in a neuropsychiatry service diagnosed with functional seizures completed an online survey to investigate preferences for, and offensiveness of, 11 common diagnostic terms used to describe functional seizures. Of these 39 patients, 13 consented to take part in a semistructured interview exploring the experience of receiving a diagnosis. Nonepileptic attack disorder (NEAD), functional seizures, functional nonepileptic attacks (FNEA), and dissociative seizures were ranked as the most preferred terms and did not significantly differ from one another. NEAD was rated the least offensive term, with functional seizures and FNEA following closely. Significant overlap in confidence intervals was found between the offensiveness ratings of all terms. Terms that indicated a psychological origin were the least preferred and viewed as the most offensive. Thematic analysis identified three main themes on the experience of being diagnosed: ‘being heard and having a shared understanding’, ‘feeling alone’, and ‘sense of hope’. Patients favored diagnostic terms that facilitated and alleviated these themes on a personal basis; however, preferences differed across individuals. Our findings suggest that a range of terms have a similar level of preference and offense rating, with NEAD, functional seizures, and FNEA being the most favorable. Qualitative analysis indicates that a term and its accompanying explanation should facilitate shared acceptance and understanding, and several terms provide this. In combination with our previous study on healthy participants, we propose that one of the two terms researched is adopted by patients, health professionals, and the public: functional nonepileptic attacks (FNEA) or functional seizures.

    Implementing telephone triage in general practice: a process evaluation of a cluster randomised controlled trial

    Background: Telephone triage represents one strategy to manage demand for face-to-face GP appointments in primary care. However, limited evidence exists of the challenges GP practices face in implementing telephone triage. We conducted a qualitative process evaluation alongside a UK-based cluster randomised trial (ESTEEM) which compared the impact of GP-led and nurse-led telephone triage with usual care on primary care workload, cost, patient experience, and safety for patients requesting a same-day GP consultation. The aim of the process study was to provide insights into the observed effects of the ESTEEM trial from the perspectives of staff and patients, and to specify the circumstances under which triage is likely to be successfully implemented. Here we report the perspectives of staff. Methods: The intervention comprised implementation of either GP-led or nurse-led telephone triage for a period of 2-3 months. A qualitative evaluation was conducted using interviews with staff recruited from eight general practices (4 GP triage, 4 nurse triage) in the UK that implemented triage as part of the ESTEEM trial. Interviews were undertaken with 44 staff members across the GP triage and nurse triage practices (16 GPs, 8 nurses, 7 practice managers, 13 administrative staff). Results: Staff reported diverse experiences and perceptions regarding the implementation of telephone triage, its effects on workload, and the benefits of triage. This diversity was explained by the different ways triage was organised, the staffing models used to support triage, how the introduction of triage was communicated across practice staff, and how staff roles were reconfigured as a result of implementing triage. Conclusion: The findings from the process evaluation offer insight into the range of ways GP practices participating in ESTEEM implemented telephone triage, and the circumstances under which telephone triage can be successfully implemented beyond the context of a clinical trial. Staff experiences and perceptions of telephone triage are shaped by the way practices communicate with staff, prepare for and sustain the changes required to implement triage effectively, as well as by existing practice culture, and staff and patient behaviour arising in response to the changes made. Trial registration: Current Controlled Trials ISRCTN20687662. Registered 28 May 2009.

    Fracturing ranked surfaces

    Discretized landscapes can be mapped onto ranked surfaces, where every element (site or bond) has a unique rank associated with its corresponding relative height. By sequentially allocating these elements according to their ranks and systematically preventing the occupation of bridges, namely elements that, if occupied, would provide global connectivity, we disclose that bridges hide a new tricritical point at an occupation fraction $p = p_{c}$, where $p_{c}$ is the percolation threshold of random percolation. For any value of $p$ in the interval $p_{c} < p \leq 1$, our results show that the set of bridges has a fractal dimension $d_{BB} \approx 1.22$ in two dimensions. In the limit $p \rightarrow 1$, a self-similar fracture is revealed as a singly connected line that divides the system into two domains. We then unveil how several seemingly unrelated physical models tumble into the same universality class and also present results for higher dimensions.

    Chedoke Arm and Hand Activity Inventory-9 (CAHAI-9): Perceived clinical utility within 14 days of stroke

    Purpose: The Chedoke Arm and Hand Activity Inventory-9 (CAHAI-9) is an activity-based assessment developed to include relevant functional tasks and to be sensitive to clinically important changes in upper limb function. The aim of this study was to explore both therapists' and clients' views on the clinical utility of CAHAI-9 within 14 days of stroke. Method: Twenty-one occupational therapists actively working in stroke settings were recruited by convenience sampling from 8 hospitals and participated in semistructured focus groups. Five clients within 14 days of stroke were recruited by consecutive sampling from 1 metropolitan hospital and participated in structured individual interviews. The transcripts were analyzed thematically. Results: Six themes emerged from the focus groups and interviews: collecting information, decisions regarding client suitability, administration and scoring, organizational demands, raising awareness, and clients' perceptions of CAHAI-9 utility. All therapists agreed CAHAI-9 was suited for the stroke population and assisted identification of client abilities or difficulties within functional contexts. Opinions varied as to whether CAHAI-9 should be routinely administered with clients who had mild and severe upper limb deficits, but therapists agreed it was appropriate for clients with moderate deficits. Therapists made suggestions regarding refinement of the scoring and training to increase utility. All clients with stroke felt that the assessment provided reassurance regarding their recovery. Conclusion: The findings indicate that CAHAI-9 shows promise as an upper limb ability assessment for clients within 14 days of stroke

    Application of the speed-duration relationship to normalize the intensity of high-intensity interval training

    The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of high-intensity interval training (HIIT) has yet to be considered, which was the aim of the present study. Subjects completed a ramp-incremental test and a series of 4 constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4 min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with bouts of 4 min recovery, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4 min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the appropriate protocol for maximizing the amount of high-intensity work that can be completed during 4×4 min HIIT. For fixed WR HIIT, the tLIM of HIIT sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of each of the 4 bouts (Bout 1: 229±27 s; Bout 2: 262±37 s; Bout 3: 235±49 s; Bout 4: 235±53 s; P>0.050). However, there was significantly less high-intensity work completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m), and 4 (136.7±39.3 m) compared with bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol which allows the relative contribution of the work rate profile to physiological adaptations to be considered during alternative intensity-matched HIIT protocols.
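The hyperbolic S-tLIM relationship used to derive WR4, WR6 and WR8 can be written as tLIM = D′/(S − CS), which is linear in 1/tLIM (S = CS + D′/tLIM, with critical speed CS and distance capacity D′). A minimal sketch of how the constant-speed tests could be fitted and the target work rates predicted (illustrative only; the example speeds and durations are assumptions, not the study's data):

```python
def fit_speed_duration(speeds, t_lims):
    """Least-squares fit of S = CS + D' * (1 / tLIM):
    intercept = critical speed CS, slope = distance capacity D'."""
    xs = [1.0 / t for t in t_lims]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(speeds) / n
    d_prime = (sum((x - mx) * (y - my) for x, y in zip(xs, speeds))
               / sum((x - mx) ** 2 for x in xs))
    cs = my - d_prime * mx
    return cs, d_prime

def predict_speed(cs, d_prime, t_target):
    """Speed predicted to induce intolerance at t_target seconds."""
    return cs + d_prime / t_target

# Hypothetical constant-speed test results: speeds (m/s) and tLIM (s)
speeds = [4.0, 3.5, 3.25, 3.2]
t_lims = [150, 300, 600, 750]
cs, d_prime = fit_speed_duration(speeds, t_lims)
wr6 = predict_speed(cs, d_prime, 360)   # speed tolerable for ~6 min
```

With these example data the fit recovers CS = 3.0 m/s and D′ = 150 m, giving WR6 ≈ 3.42 m/s; WR4 and WR8 follow by substituting 240 s and 480 s for the target duration.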

    Comparison between the HCV IRES domain IV RNA structure and the Iron Responsive Element

    Background: Serum ferritin and hepatic iron concentrations are frequently elevated in patients who are chronically infected with the hepatitis C virus (HCV), and hepatic iron concentration has been used to predict response to interferon therapy, but these correlations are not well understood. The HCV genome contains an RNA structure resembling an iron responsive element (IRE) in its internal ribosome entry site (IRES) structural domain IV (dIV). An IRE is a stem loop structure used to control the expression of eukaryotic proteins involved in iron homeostasis by either inhibiting ribosomal binding or protecting the mRNA from nuclease degradation. The HCV structure, located within the binding site of the 40S ribosomal subunit, might function as an authentic IRE or by an IRE-like mechanism. Results: Electrophoretic mobility shift assays showed that the HCV IRES domain IV structure does not interact with the iron regulatory protein 1 (IRP1) in vitro. Systematic HCV IRES RNA mutagenesis suggested that IRP1 cannot accommodate the shape of the wild type HCV IRES dIV RNA structure. Conclusion: The HCV IRES dIV RNA structure is not an authentic IRE. The possibility that this RNA structure is responsible for the observed correlations between intracellular iron concentration and HCV infection parameters through an IRE-like mechanism in response to some other cellular signal remains to be tested.

    Community based intervention to optimize osteoporosis management: randomized controlled trial

    Background: Osteoporosis-related fractures are a significant public health concern. Interventions that increase detection and treatment of osteoporosis are underutilized. This pragmatic randomized study was done to evaluate the impact of a multifaceted community-based care program aimed at optimizing evidence-based management in patients at risk for osteoporosis and fractures. Methods: This was a 12-month randomized trial performed in Ontario, Canada. Eligible patients were community-dwelling, aged ≥55 years, and identified to be at risk for osteoporosis-related fractures. Two hundred and one patients were allocated to the intervention group or to usual care. Components of the intervention were directed towards primary care physicians and patients and included facilitated bone mineral density testing, patient education and patient-specific recommendations for osteoporosis treatment. The primary outcome was the implementation of appropriate osteoporosis management. Results: 101 patients were allocated to intervention and 100 to control. Mean age of participants was 71.9 ± 7.2 years and 94% were women. Pharmacological treatment (alendronate, risedronate, or raloxifene) for osteoporosis was increased by 29% compared to usual care (56% [29/52] vs. 27% [16/60]; relative risk [RR] 2.09, 95% confidence interval [CI] 1.29 to 3.40). More individuals in the intervention group were taking calcium (54% [54/101] vs. 20% [20/100]; RR 2.67, 95% CI 1.74 to 4.12) and vitamin D (33% [33/101] vs. 20% [20/100]; RR 1.63, 95% CI 1.01 to 2.65). Conclusions: A multifaceted community-based intervention improved management of osteoporosis in high risk patients compared with usual care. Trial Registration: This trial has been registered with clinicaltrials.gov (ID: NCT00465387).
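The relative risks quoted above can be reproduced from the raw 2×2 counts with the standard log-scale Wald interval (a sketch of the usual textbook formula, not necessarily the exact method the trial used):

```python
import math

def relative_risk(a, n1, c, n2, z=1.96):
    """Relative risk of a/n1 (intervention) vs c/n2 (control),
    with an approximate 95% CI computed on the log scale."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Pharmacological treatment: 29/52 intervention vs 16/60 usual care
rr, lo, hi = relative_risk(29, 52, 16, 60)
```

This reproduces the reported RR of 2.09 with an interval close to the published 1.29 to 3.40 (small differences can arise from rounding or the exact interval method used).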

    How Phytophthora cinnamomi became associated with the death of Eucalyptus marginata – the early investigations into jarrah dieback

    The name jarrah dieback was used in the 1940s to describe a serious economic problem in the jarrah forest in the south west of Western Australia. This was the sudden death of groups of jarrah (Eucalyptus marginata) trees that occurred on previously logged sites that had a tendency to become waterlogged in winter. Although the cause was not determined at the time, from symptoms recorded in early investigations the most likely explanation is that the trees died as the result of waterlogging damage. In the 1960s it was shown that many of these sites were infested by the introduced oomycete Phytophthora cinnamomi and tree deaths, together with the deaths of many mid- and under-storey plants, were attributed to this pathogen. A chronology of the research, based on contemporary unpublished documents, shows that in 1968 the conclusion that P. cinnamomi caused jarrah deaths was not supported by the available evidence, because the work did not satisfy the first and fourth of Koch’s postulates. The evidence that P. cinnamomi killed many mid- and under-storey plants was much stronger. There are two problems that have been confused: the death of groups of jarrah trees (jarrah dieback) that is caused by waterlogging and the death of many mid- and under-storey plants (Phytophthora dieback) caused by P. cinnamomi infection

    The primary cilium as a dual sensor of mechanochemical signals in chondrocytes

    The primary cilium is an immotile, solitary, and microtubule-based structure that projects from the cell surface into the extracellular environment. The primary cilium functions as a dual sensor, acting as both a mechanosensor and a chemosensor. Primary cilia coordinate several essential cell signaling pathways that are mainly involved in cell division and differentiation, and a primary cilium malfunction can result in several human diseases. Mechanical loading is sensed by mechanosensitive cells in nearly all tissues and organs, and the mechanical signal is then transduced into biochemical signals involving pathways such as Akt, PKA, FAK, ERK, and MAPK. In this review, we focus on the fundamental functional and structural features of primary cilia in chondrocytes and chondrogenic cells.