
    Radiographic evaluation of calcaneal fractures: To measure or not to measure

    Objective: The aim of this study was to correlate the functional outcome after treatment for displaced intra-articular calcaneal fracture with plain radiography. Design: The design was a prognostic study of a retrospective cohort with concurrent follow-up. Patients: A total of 33 patients with a unilateral calcaneal fracture and a minimum follow-up of 13 months participated. Patients completed three disease-specific questionnaires and graded their satisfaction, and any indication for an arthrodesis was noted. Standardised radiographs were made of the previously injured side and the normal (control) side. Different angles and distances were measured on these radiographs and compared with values described in the literature. The differences in angles and distances between the injured and uninjured (control) foot were correlated with the outcome of the questionnaires and the indication for an arthrodesis. Results: None of the angles correlated with the disease-specific outcome scores. Only the tibiotalar angle correlated with the VAS (r=0.35, p=0.045), and only the absolute foot height correlated with the indication for an arthrodesis (odds ratio=0.70, CI=0.50-0.99). Conclusion: In this study the radiographic evaluation correlated poorly with the final outcome. Measurements on plain radiographs do not appear to be useful in determining outcome after intra-articular calcaneal fractures.
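The r value reported above is a Pearson correlation between the injured-minus-control radiographic differences and the outcome score. As a minimal sketch of that computation (the paired data below are hypothetical, not the study's measurements):

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation: covariance of the pairs divided by the
    # product of the standard deviations of each series
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical pairs: tibiotalar angle difference (injured - control)
# versus a VAS outcome score
angle_diff = [2.0, 5.5, 1.0, 8.0, 3.5, 6.0]
vas = [70, 55, 80, 40, 65, 50]
r = pearson_r(angle_diff, vas)
```

In practice a library routine such as `scipy.stats.pearsonr` would also return the p value quoted in the abstract.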

    Priority setting in developing countries health care institutions: the case of a Ugandan hospital

    BACKGROUND: Because the demand for health services outstrips the available resources, priority setting is one of the most difficult issues faced by health policy makers, particularly those in developing countries. However, there is a lack of literature that describes and evaluates priority setting in these contexts. The objective of this paper is to describe priority setting in a teaching hospital in Uganda and evaluate the description against an ethical framework for fair priority setting processes – Accountability for Reasonableness. METHODS: A case study in a 1,500-bed national referral hospital receiving 1,320 outpatients per day, with an average budget of US$ 13.5 million per year. We reviewed documents and carried out 70 in-depth interviews (14 health planners, 40 doctors, and 16 nurses working at the hospital). Interviews were recorded and transcribed. Data analysis employed the modified thematic approach to describe priority setting, and the description was evaluated using the four conditions of Accountability for Reasonableness: relevance, publicity, revisions and enforcement. RESULTS: Senior managers, guided by the hospital strategic plan, make the hospital budget allocation decisions. Frontline practitioners expressed a lack of knowledge of the process. Relevance: Priority is given according to a cluster of factors including need, emergencies and patient volume. However, surgical departments and departments whose leaders "make a lot of noise" are also prioritized. Publicity: Decisions, but not reasons, are publicized through general meetings and circulars, but this information does not always reach the frontline practitioners. Publicity to the general public was through ad hoc radio programs and to patients who directly ask. Revisions: There were no formal mechanisms for challenging the reasoning. Enforcement: There were no mechanisms to ensure adherence to the four conditions of a fair process. 
CONCLUSION: Priority setting decisions at this hospital do not satisfy the conditions of fairness. To improve, the hospital should: (i) engage frontline practitioners, (ii) publicize the reasons for decisions both within the hospital and to the general public, and (iii) develop formal mechanisms for challenging the reasoning. In addition, capacity strengthening is required for senior managers, who must accept responsibility for ensuring that the above three conditions are met.

    Methods to study splicing from high-throughput RNA Sequencing data

    The development of novel high-throughput sequencing (HTS) methods for RNA (RNA-Seq) has provided a very powerful means to study splicing under multiple conditions at unprecedented depth. However, the complexity of the information to be analyzed has turned this into a challenging task. In the last few years, a plethora of tools have been developed, allowing researchers to process RNA-Seq data to study the expression of isoforms and splicing events, and their relative changes under different conditions. We provide an overview of the methods available to study splicing from short RNA-Seq data. We group the methods according to the different questions they address: 1) Assignment of the sequencing reads to their likely gene of origin. This is addressed by methods that map reads to the genome and/or to the available gene annotations. 2) Recovering the sequence of splicing events and isoforms. This is addressed by transcript reconstruction and de novo assembly methods. 3) Quantification of events and isoforms. Either after reconstructing transcripts or using an annotation, many methods estimate the expression level or the relative usage of isoforms and/or events. 4) Providing an isoform or event view of differential splicing or expression. These include methods that compare relative event/isoform abundance or isoform expression across two or more conditions. 5) Visualizing splicing regulation. Various tools facilitate the visualization of the RNA-Seq data in the context of alternative splicing. In this review, we do not describe the specific mathematical models behind each method. Our aim is rather to provide an overview that could serve as an entry point for users who need to decide on a suitable tool for a specific analysis. We also attempt to propose a classification of the tools according to the operations they do, to facilitate the comparison and choice of methods.

    Does Habitual Physical Activity Increase the Sensitivity of the Appetite Control System? A Systematic Review.

    BACKGROUND: It has been proposed that habitual physical activity improves appetite control; however, the evidence has never been systematically reviewed. OBJECTIVE: To examine whether appetite control (e.g. subjective appetite, appetite-related peptides, food intake) differs according to levels of physical activity. DATA SOURCES: Medline, Embase and SPORTDiscus were searched for articles published between 1996 and 2015, using keywords pertaining to physical activity, appetite, food intake and appetite-related peptides. STUDY SELECTION: Articles were included if they involved healthy non-smoking adults (aged 18-64 years) participating in cross-sectional studies examining appetite control in active and inactive individuals; or before and after exercise training in previously inactive individuals. STUDY APPRAISAL AND SYNTHESIS: Of 77 full-text articles assessed, 28 studies (14 cross-sectional; 14 exercise training) met the inclusion criteria. RESULTS: Appetite sensations and absolute energy intake did not differ consistently across studies. Active individuals had a greater ability to compensate for high-energy preloads through reductions in energy intake, in comparison with inactive controls. When physical activity level was graded across cross-sectional studies (low, medium, high, very high), a significant curvilinear effect on energy intake (z-scores) was observed. LIMITATIONS: Methodological issues existed concerning the small number of studies, the lack of objective quantification of food intake, and the varying definitions of active and inactive individuals. CONCLUSION: Habitually active individuals showed improved compensation for the energy density of foods, but no consistent differences in appetite or absolute energy intake, in comparison with inactive individuals. This review supports a J-shaped relationship between physical activity level and energy intake. Further studies are required to confirm these findings. 
PROSPERO REGISTRATION NUMBER: CRD42015019696
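The curvilinear effect above was assessed on z-scores, which standardise energy intake so that studies reporting in different units or populations can be pooled on a common scale. A minimal sketch of that standardisation (the intake figures below are hypothetical, chosen only to illustrate a J-shaped pattern):

```python
import math

def z_scores(values):
    # standardise each value: subtract the mean, divide by the
    # sample standard deviation (n - 1 denominator)
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return [(v - mean) / sd for v in values]

# hypothetical mean energy intake (kcal/day) for the four graded
# activity levels: low, medium, high, very high
intake_by_group = [2100, 1950, 2000, 2400]
z = z_scores(intake_by_group)
```

With a J-shaped relationship, the standardised intake dips at moderate activity and rises again at the highest activity level.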

    E. coli NfsA: an alternative nitroreductase for prodrug activation gene therapy in combination with CB1954

    Prodrug activation gene therapy is a developing approach to cancer treatment, whereby prodrug-activating enzymes are expressed in tumour cells. After administration of a non-toxic prodrug, its conversion to cytotoxic metabolites directly kills tumour cells expressing the activating enzyme, whereas the local spread of activated metabolites can kill nearby cells lacking the enzyme (bystander cell killing). One promising combination that has entered clinical trials uses the nitroreductase NfsB from Escherichia coli to activate the prodrug, CB1954, to a potent bifunctional alkylating agent. NfsA, the major E. coli nitroreductase, has greater activity with nitrofuran antibiotics, but it has not previously been compared with NfsB for the activation of CB1954. We show superior in vitro kinetics of CB1954 activation by NfsA using the NADPH cofactor, and show that the expression of NfsA in bacterial or human cells results in a 3.5- to 8-fold greater sensitivity to CB1954, relative to NfsB. Although NfsB reduces either the 2-NO2 or 4-NO2 positions of CB1954 in an equimolar ratio, we show that NfsA preferentially reduces the 2-NO2 group, which leads to a greater bystander effect with cells expressing NfsA than with NfsB. NfsA is also more effective than NfsB for cell sensitisation to nitrofurans and to a selection of alternative, dinitrobenzamide mustard (DNBM) prodrugs.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables, or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4% as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open and 30-day mortality (AUROC = 0.903, 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. 
It provides a tool for reporting operative findings, disease severity and technical difficulty and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
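The AUROC figures quoted above measure how well the difficulty grade separates, for example, converted from non-converted cases: an AUROC of 0.903 means a randomly chosen converted case carries a higher grade than a randomly chosen non-converted case about 90% of the time. A minimal sketch of that rank-based (Mann–Whitney) computation, with hypothetical grades rather than the study's data:

```python
def auroc(scores_neg, scores_pos):
    # AUROC equals the probability that a randomly chosen positive
    # case scores above a randomly chosen negative case, counting
    # ties as half -- the Mann-Whitney U interpretation
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical Nassar grades (1-5) for cases without and with
# conversion to open surgery
no_conversion = [1, 1, 2, 2, 3, 1, 2]
conversion = [4, 5, 3, 4]
auc = auroc(no_conversion, conversion)
```

Library routines such as `sklearn.metrics.roc_auc_score` compute the same quantity from labels and scores directly.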

    Representing 3D Space in Working Memory: Spatial Images from Vision, Hearing, Touch, and Language

    The chapter deals with a form of transient spatial representation referred to as a spatial image. Like a percept, it is externalized, scaled to the environment, and can appear in any direction about the observer. It transcends the concept of modality, as it can be based on inputs from the three spatial senses, from language, and from long-term memory. Evidence is presented that supports each of the claimed properties of the spatial image, showing that it is quite different from a visual image. Much of the evidence presented is based on spatial updating. A major concern is whether spatial images from different input modalities are functionally equivalent: that is, whether, once instantiated in working memory, the spatial images from different modalities have the same functional characteristics with respect to subsequent processing, such as that involved in spatial updating. Going further, the research provides some evidence that spatial images are amodal (i.e., do not retain modality-specific features).

    Eighteenth-century genomes show that mixed infections were common at time of peak tuberculosis in Europe

    Tuberculosis (TB) was once a major killer in Europe, but it is unclear how the strains and patterns of infection at 'peak TB' relate to what we see today. Here we describe 14 genome sequences of M. tuberculosis, representing 12 distinct genotypes, obtained from human remains from eighteenth-century Hungary using metagenomics. All our historic genotypes belong to M. tuberculosis Lineage 4. Bayesian phylogenetic dating, based on samples with well-documented dates, places the most recent common ancestor of this lineage in the late Roman period. We find that most bodies yielded more than one M. tuberculosis genotype and we document an intimate epidemiological link between infections in two long-dead individuals. Our results suggest that metagenomic approaches usefully inform detection and characterization of historical and contemporary infections.

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb−1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
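The selection variable E_T^miss is the magnitude of the negative vector sum of the transverse momenta of all reconstructed objects in the event, so a hard jet with nothing visible recoiling against it yields large missing transverse momentum. A minimal sketch of that definition (the event below is hypothetical, not ATLAS data):

```python
import math

def missing_et(objects):
    # E_T^miss: magnitude of the negative vector sum of the
    # transverse momenta, each given as (pT, phi)
    px = -sum(pt * math.cos(phi) for pt, phi in objects)
    py = -sum(pt * math.sin(phi) for pt, phi in objects)
    return math.hypot(px, py)

# hypothetical monojet-like event: one hard jet plus a softer jet,
# recoiling against invisible particles
jets = [(450.0, 0.0), (80.0, 2.8)]  # (pT in GeV, azimuthal angle phi)
met = missing_et(jets)
```

Two exactly back-to-back jets of equal pT give zero E_T^miss, which is why balanced dijet events do not enter the signal regions.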