    Spinal anesthesia: should everyone receive a urinary catheter?: a randomized, prospective study of patients undergoing total hip arthroplasty.

    BACKGROUND: The objective of this randomized prospective study was to determine whether a urinary catheter is necessary for all patients undergoing total hip arthroplasty under spinal anesthesia. METHODS: Consecutive patients undergoing total hip arthroplasty under spinal anesthesia were randomized to treatment with or without insertion of an indwelling urinary catheter. All patients received spinal anesthesia with 15 to 30 mg of 0.5% bupivacaine. The catheter group was subjected to a standard postoperative protocol, with removal of the indwelling catheter within forty-eight hours postoperatively. The experimental group was monitored for urinary retention and, if necessary, underwent straight catheterization up to two times before placement of an indwelling catheter. RESULTS: Two hundred patients were included in the study. There was no significant difference between the two groups in the prevalence of urinary retention, the prevalence of urinary tract infection, or the length of stay. Nine patients in the no-catheter group and three patients in the catheter group (following removal of the catheter) required straight catheterization because of urinary retention. Three patients in the catheter group and no patients in the no-catheter group developed a urinary tract infection. CONCLUSIONS: Patients undergoing total hip arthroplasty under spinal anesthesia appear to be at low risk for urinary retention. Thus, a routine indwelling catheter is not required for such patients.

    Everglades Ecological Forecasting II: Utilizing NASA Earth Observations to Enhance the Capabilities of Everglades National Park to Monitor & Predict Mangrove Extent to Aid Current Restoration Efforts

    Mangroves act as a transition zone between freshwater and saltwater habitats, filtering and indicating salinity levels along the coast of the Florida Everglades. However, dredging and canals built in the early 1900s depleted the Everglades of much of its freshwater resources. In an attempt to help maintain the health of threatened habitats, efforts have been made within Everglades National Park to rebalance the ecosystem and manage mangrove forests sustainably. The Everglades Ecological Forecasting II team utilized the Google Earth Engine API and satellite imagery from Landsat 5, 7, and 8 to create land-change maps spanning a 25-year period and to allow park officials to continue producing maps in the future. To make the process replicable for project partners at Everglades National Park, the team applied a supervised classification approach to map mangrove extent in 1995, 2000, 2005, 2010, and 2015. As freshwater was depleted, mangroves encroached further inland and freshwater marshes declined. The current extent map, along with the transition maps, informed forecasting models that project further inland mangrove encroachment by 2030. This project highlights the changes to Everglades habitats in relation to a changing climate and hydrological changes throughout the park.
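
    A minimal sketch of the kind of supervised classification workflow described above, written against the Google Earth Engine Python API. The park boundary, training points, asset path, collection ID, and band names below are illustrative assumptions, not details taken from the project.

        # Sketch of a supervised land-cover classification in Google Earth Engine,
        # loosely following the workflow in the abstract (Landsat composite ->
        # training samples -> classifier). All inputs below are hypothetical.
        import ee

        ee.Initialize()

        # Hypothetical inputs: a rough park boundary and labeled training points
        # with a 'landcover' property (e.g. 0 = mangrove, 1 = freshwater marsh).
        park_boundary = ee.Geometry.Rectangle([-81.2, 25.0, -80.2, 25.8])
        training_points = ee.FeatureCollection('users/example/everglades_training')

        bands = ['SR_B1', 'SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B7']

        # Median Landsat 5 surface-reflectance composite for one mapping year.
        composite = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
                     .filterBounds(park_boundary)
                     .filterDate('1995-01-01', '1995-12-31')
                     .median()
                     .select(bands))

        # Sample the composite at the labeled points and train a classifier.
        training = composite.sampleRegions(collection=training_points,
                                           properties=['landcover'],
                                           scale=30)
        classifier = ee.Classifier.smileRandomForest(50).train(
            features=training, classProperty='landcover', inputProperties=bands)

        # Classified map for 1995; repeat per epoch (2000, 2005, 2010, 2015).
        classified_1995 = composite.classify(classifier)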

    Evaluating case studies of community-oriented integrated care.

    This paper summarises a ten-year conversation within London Journal of Primary Care about the nature of community-oriented integrated care (COIC) and how to develop and evaluate it. COIC means integration of efforts for combined disease-treatment and health-enhancement at local, community level. COIC is similar to the World Health Organisation concept of a Community-Based Coordinating Hub: both require a local geographic area where different organisations align their activities for whole-system integration and develop local communities for health. COIC is a necessary part of an integrated system for health and care because it enables multiple insights into 'wicked problems' and allows multiple services to integrate their activities for people with complex conditions, while at the same time helping everyone to collaborate for the health of the local population. The conversation concludes with seven aspects of COIC that warrant further attention.

    The course of pain drawings during a 10-week treatment period in patients with acute and sub-acute low back pain

    BACKGROUND: Pain drawings are widely used to assess subjective pain in low back pain patients being considered for surgery. Less work has been done on primary health care patients. Moreover, the possible correlation between pain drawing modalities and other pain assessment methods, such as pain score and functional variables, needs to be described. Thus, the objective was to describe the course of pain drawings during treatment of low back pain patients in primary health care. METHODS: 160 primary health care outpatients with acute or sub-acute low back pain were studied during 10 weeks of treatment with a stay-active concept alone versus manual therapy in addition to the stay-active concept. The patients filled out 3 pain drawings each, at baseline and after 5 and 10 weeks of treatment. In addition, the patients reported pain and functional variables at the 3 measurement points. RESULTS: The proportion of areas marked, the mean number of areas marked (pain drawing score), the mean number of modalities used (area score), and the proportion of patients with pain radiation all decreased during the 10-week treatment period. Most of the improvement occurred during the first half of the period. The seven different pain modalities in the pain drawing were correlated with pain and functional variables. In cases without radiation, some modalities were associated with more pain and disability than others, a finding that grew stronger over time. For patients with pain radiation, the modality differences were smaller and inconsistent. CONCLUSION: Pain modalities are significantly correlated with pain and functional variables. There is a shift from painful modalities to less painful ones over time.
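
    As an illustration of the two drawing-based scores mentioned above, here is a hedged sketch of how they could be computed from a single pain drawing represented as a mapping from body areas to the pain modalities marked in them; the representation, names, and example values are assumptions, not the study's scoring code.

        # Illustrative only: compute the number of areas marked and the number of
        # distinct modalities used from one pain drawing, represented here as
        # {body_area: set of pain modalities marked in that area}.
        def pain_drawing_scores(drawing):
            """Return (areas_marked, modalities_used) for one pain drawing."""
            areas_marked = sum(1 for modalities in drawing.values() if modalities)
            modalities_used = len(set().union(*drawing.values())) if drawing else 0
            return areas_marked, modalities_used

        # Hypothetical example: two areas marked, three distinct modalities used.
        example = {"lower_back": {"aching", "burning"}, "left_leg": {"numbness"}}
        print(pain_drawing_scores(example))  # (2, 3)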

    Differential Gene Expression from Microarray Analysis Distinguishes Woven and Lamellar Bone Formation in the Rat Ulna following Mechanical Loading

    Formation of woven and lamellar bone in the adult skeleton can be induced through mechanical loading. Although much is known about the morphological appearance and structural properties of the newly formed bone, the molecular responses to loading are still not well understood. The objective of our study was to use a microarray to distinguish the molecular responses between woven and lamellar bone formation induced through mechanical loading. Rat forelimb loading was completed in a single bout to induce the formation of woven bone (WBF loading) or lamellar bone (LBF loading). A set of normal (non-loaded) rats were used as controls. Microarrays were performed at three timepoints after loading: 1 hr, 1 day, and 3 days. Microarray results were confirmed for a select group of genes using quantitative real-time PCR (qRT-PCR). The microarray identified numerous genes and pathways that were differentially regulated for woven, but not lamellar, bone formation. Few changes in gene expression were evident when comparing lamellar bone formation to normal controls. A total of 395 genes were differentially expressed between formation of woven and lamellar bone 1 hr after loading, while 5883 and 5974 genes were differentially expressed on days 1 and 3, respectively. Results suggest that not only are the levels of expression different for each type of bone formation, but that distinct pathways are activated only for woven bone formation. A strong early inflammatory response preceded an increase in angiogenic and osteogenic gene expression for woven bone formation. Furthermore, at later timepoints there was evidence of bone resorption after WBF loading. In summary, the vast coverage of the microarray offers a comprehensive characterization of the early differences in expression between woven and lamellar bone formation.
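
    A hedged sketch of one generic way to flag differentially expressed genes between two groups of arrays (for example, WBF-loaded versus non-loaded control) using a per-gene t-test and a fold-change cut-off; this is a common textbook approach run on simulated data, not the statistical pipeline used in the study.

        # Generic differential-expression screen on simulated log2 expression data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_genes = 1000
        loaded = rng.normal(0.0, 1.0, size=(n_genes, 4))   # 4 hypothetical loaded arrays
        control = rng.normal(0.0, 1.0, size=(n_genes, 4))  # 4 hypothetical control arrays

        # Per-gene two-sample t-test and mean log2 fold change.
        t_stat, p_val = stats.ttest_ind(loaded, control, axis=1)
        log2_fc = loaded.mean(axis=1) - control.mean(axis=1)

        # Genes passing both an (uncorrected) p-value and a fold-change threshold.
        differential = np.where((p_val < 0.05) & (np.abs(log2_fc) > 1.0))[0]
        print(len(differential), "genes flagged as differentially expressed")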

    Using the entomological inoculation rate to assess the impact of vector control on malaria parasite transmission and elimination

    Prior studies have shown that annual entomological inoculation rates (EIRs) must be reduced to less than one to substantially reduce the prevalence of malaria infection. In this study, EIR values were used to quantify the impact of insecticide-treated bed nets (ITNs), indoor residual spraying (IRS), and source reduction (SR) on malaria transmission. The EIR analysis was extended by determining whether available vector control tools can ultimately eradicate malaria. The analysis is based primarily on a review of all controlled studies that used ITN, IRS, and/or SR and reported their effects on the EIR. To compare EIRs between studies, the percent difference in EIR between the intervention and control groups was calculated. Eight vector control intervention studies that measured EIR were found: four ITN studies, one IRS study, one SR study, and two studies with separate ITN and IRS intervention groups. In both the Tanzania study and the Solomon Islands study, one community received ITNs and one received IRS. In the second year of the Tanzania study, EIR was 90% lower in the ITN community and 93% lower in the IRS community, relative to the community without intervention; the ITN and IRS effects were not significantly different. In contrast, in the Solomon Islands study, EIR was 94% lower in the ITN community and 56% lower in the IRS community. The one SR study, in Dar es Salaam, reported a lower EIR reduction (47%) than the ITN and IRS studies. All of these vector control interventions reduced EIR, but none reduced it to zero. These studies indicate that current vector control methods alone cannot ultimately eradicate malaria, because no intervention sustained an annual EIR of less than one. While researchers develop new tools, integrated vector management may make the greatest impact on malaria transmission. There are many gaps in the entomological malaria literature, and recommendations for future research are provided.
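
    The between-study comparison described above rests on a simple percent-difference calculation; a minimal sketch with made-up EIR values, purely for illustration.

        # Percent difference in EIR between the intervention and control arms.
        def eir_percent_reduction(control_eir, intervention_eir):
            return 100.0 * (control_eir - intervention_eir) / control_eir

        # Hypothetical annual EIRs: 30 infective bites/person/year without
        # intervention vs. 3 with ITNs -> a 90% reduction, yet still above one.
        print(eir_percent_reduction(30.0, 3.0))  # 90.0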

    The contribution of metacognitions and attentional control to decisional procrastination

    Earlier research has implicated metacognitions and attentional control in procrastination and self-regulatory failure. This study tested several hypotheses: (1) that metacognitions would be positively correlated with decisional procrastination; (2) that attentional control would be negatively correlated with decisional procrastination; (3) that metacognitions would be negatively correlated with attentional control; and (4) that metacognitions and attentional control would predict decisional procrastination when controlling for negative affect. One hundred and twenty-nine participants completed the Depression Anxiety Stress Scale 21, the Meta-Cognitions Questionnaire 30, the Attentional Control Scale, and the Decisional Procrastination Scale. Significant relationships were found between all three attentional control factors (focusing, shifting, and flexible control of thought) and two metacognitions factors (negative beliefs concerning thoughts about uncontrollability and danger, and cognitive confidence). Results also revealed that decisional procrastination was significantly associated with negative affect, all measured metacognitions factors, and all attentional control factors. In the final step of a hierarchical regression analysis, only stress, cognitive confidence, and attention shifting were independent predictors of decisional procrastination. Overall, these findings support the hypotheses and are consistent with the Self-Regulatory Executive Function model of psychological dysfunction. The implications of these findings are discussed.
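
    A hedged sketch of the kind of hierarchical (block-wise) regression the abstract describes, with negative affect entered first and a metacognition factor and an attentional control factor added in a second block; the variable names and simulated data are assumptions for illustration, not the study's dataset.

        # Block-wise OLS regression on simulated questionnaire scores.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 129  # sample size matching the abstract; data itself is simulated
        df = pd.DataFrame({
            "stress": rng.normal(size=n),
            "cognitive_confidence": rng.normal(size=n),
            "attention_shifting": rng.normal(size=n),
        })
        df["decisional_procrastination"] = (
            0.4 * df["stress"] + 0.3 * df["cognitive_confidence"]
            - 0.3 * df["attention_shifting"] + rng.normal(size=n))

        y = df["decisional_procrastination"]
        # Block 1: negative affect only; Block 2: add metacognition and attention.
        step1 = sm.OLS(y, sm.add_constant(df[["stress"]])).fit()
        step2 = sm.OLS(y, sm.add_constant(df[["stress", "cognitive_confidence",
                                              "attention_shifting"]])).fit()

        # The R-squared increment is what the second block explains beyond stress.
        print(step1.rsquared, step2.rsquared, step2.rsquared - step1.rsquared)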

    The Mobile Phone in the Diffusion of Knowledge for Institutional Quality in Sub-Saharan Africa

    This study assesses the mobile phone in the diffusion of knowledge for better governance in Sub-Saharan Africa from 2000 to 2012. For this purpose, we employ the Generalised Method of Moments with forward orthogonal deviations. The empirical evidence is based on three complementary knowledge diffusion variables (innovation, internet penetration, and educational quality) and 10 governance indicators that are bundled and unbundled. The main findings are as follows. First, there is an unconditional positive effect of mobile phone penetration on good governance. Second, the net effects on political, economic, and institutional governance that are associated with the interaction of the mobile phone with knowledge diffusion variables are positive for the most part. Third, countries with low levels of governance are catching up with their counterparts with higher levels of governance. These findings are broadly consistent with theoretical underpinnings on the relevance of mobile phones in mitigating bad governance in Africa. The evidence of some insignificant net effects and decreasing marginal impacts may be an indication that the mobile phone could also be employed to decrease government quality. Overall, this study has established net positive effects for the most part. Five rationales could account for the positive net effects on good governance from the interaction between mobile phones and knowledge diffusion: among others, the knowledge variables enhance reach, access, adoption, cost-effectiveness, and interaction. In a nutshell, the positive net effects are apparent because the knowledge diffusion variables complement mobile phones in reducing the information asymmetry and monopoly that create conditions conducive to bad governance. The contribution of the findings to existing theories and the justifications for the underlying positive net effects are discussed.
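
    A hedged arithmetic sketch of how a 'net effect' is commonly read off an interactive regression of this kind: the unconditional coefficient on mobile phone penetration plus the interaction coefficient multiplied by the sample mean of the knowledge-diffusion variable. The coefficient and mean values below are made up for illustration and are not the study's estimates.

        # Net effect from an interactive regression: unconditional coefficient on
        # mobile phones plus the interaction coefficient times the sample mean of
        # the moderating knowledge-diffusion variable. Numbers are hypothetical.
        def net_effect(beta_mobile, beta_interaction, mean_knowledge):
            return beta_mobile + beta_interaction * mean_knowledge

        # e.g. unconditional effect 0.12, interaction with internet penetration
        # 0.004, mean internet penetration 10 per 100 people -> net effect 0.16.
        print(net_effect(0.12, 0.004, 10.0))  # 0.16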