    Can patterns of urban biodiversity be predicted using simple measures of green infrastructure?

    Urban species and habitats provide important ecosystem services, such as summertime cooling, recreation, and pollination, at a variety of scales. Many studies have assessed how biodiversity responds to urbanization, but little work has been done to create recommendations that can be easily applied to urban planning, design and management practice. Urban planning often operates at broad spatial scales, typically using relatively simplistic targets for land cover mix to influence biodiversity and ecosystem service provision. Would more complicated, but still easily created, prescriptions for urban vegetation be beneficial? Here we assess the importance of vegetation measures (percentage vegetation cover, tree canopy cover and variation in canopy height) across four taxonomic groups (bats, bees, hoverflies and birds) at multiple spatial scales (100, 250, 500 and 1000 m) within a major urban area (Birmingham, United Kingdom). We found that small-scale (100–250 m radius) measures of vegetation were important predictors for hoverflies and bees, and that bats were sensitive to vegetation at a medium spatial scale (250–500 m). In contrast, birds responded to vegetation characteristics at both small (100 m) and large (1000 m) scales. Vegetation cover, tree cover and variation in canopy height were expected to decrease with built surface cover; however, only vegetation height showed this expected trend. The results indicate the importance of relatively small patches of vegetation cover for supporting urban biodiversity, and show that relatively simple measures of vegetation characteristics can be useful predictors of species richness (or activity density, in the case of bats). They also highlight the danger of relying upon percentage built surface cover as an indicator of urban biodiversity potential.
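    The analysis described above lends itself to a simple multi-scale workflow: compute vegetation metrics within buffers of several radii around each survey site and compare how well each scale predicts richness. The sketch below illustrates that idea with synthetic data and hypothetical column names (it is not the authors' code); one ordinary least-squares fit per radius is compared by AIC.

```python
# Minimal sketch of a multi-scale vegetation-richness analysis.
# All data and column names are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
radii = [100, 250, 500, 1000]  # buffer radii in metres, as in the study
n_sites = 60                   # hypothetical number of survey sites

# One row per site: percentage vegetation cover within each buffer radius,
# plus an observed richness count for one taxonomic group (e.g. bees).
sites = pd.DataFrame({f"veg_cover_{r}m": rng.uniform(0, 100, n_sites) for r in radii})
sites["bee_richness"] = rng.poisson(10, n_sites)

# Fit one simple model per spatial scale and compare the fits by AIC.
for r in radii:
    X = sm.add_constant(sites[[f"veg_cover_{r}m"]])
    fit = sm.OLS(sites["bee_richness"], X).fit()
    print(f"{r:>4} m buffer: AIC = {fit.aic:.1f}")
```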

    Preserving the impossible: conservation of soft-sediment hominin footprint sites and strategies for three-dimensional digital data capture.

    Human footprints provide some of the most publicly emotive and tangible evidence of our ancestors. To the scientific community they provide evidence of stature, presence and behaviour and, in the case of early hominins, potential evidence with respect to the evolution of gait. While rare in the geological record, the number of footprint sites has increased in recent years, along with the analytical tools available for their study. Many of these sites are at risk from rapid erosion, including the Ileret footprints in northern Kenya, which are second only in age to those at Laetoli (Tanzania). Unlithified, soft-sediment footprint sites such as these pose a significant geoconservation challenge. In the first part of this paper, conservation and preservation options are explored, leading to the conclusion that to 'record and digitally rescue' provides the only viable approach. Key to such strategies is the increasing availability of three-dimensional data capture, either via optical laser scanning and/or digital photogrammetry. Within the discipline there is a developing schism between those who favour one approach over the other, and geoconservationists and the scientific community require some form of objective appraisal of these alternatives. Consequently, in the second part of this paper we evaluate these alternative approaches and the role they can play in a 'record and digitally rescue' conservation strategy. Using modern footprint data, digital models created via optical laser scanning are compared to those generated by state-of-the-art photogrammetry. Both methods give comparable, although subtly different, results. These data are evaluated alongside a review of field deployment issues to provide guidance to the community with respect to the factors which need to be considered in the digital conservation of human and hominin footprints.
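    One concrete way to compare a laser-scanned model with a photogrammetric one, in the spirit of the evaluation described above, is to register both point clouds and measure nearest-neighbour surface deviations. The sketch below uses synthetic point clouds (not the authors' data or code) and assumes the two models are already aligned in the same coordinate frame.

```python
# Minimal sketch: nearest-neighbour deviation between two registered point clouds.
# Synthetic data stands in for the laser-scan and photogrammetry models.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
laser_scan = rng.random((5000, 3))                             # reference surface points
photogrammetry = laser_scan + rng.normal(0, 0.001, (5000, 3))  # slightly perturbed copy

# For every photogrammetry point, find the distance to the closest laser-scan point.
distances, _ = cKDTree(laser_scan).query(photogrammetry)

print(f"mean deviation: {distances.mean():.4f}")
print(f"95th percentile deviation: {np.percentile(distances, 95):.4f}")
```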

    Using Electronic Technology to Improve Clinical Care -- Results from a Before-after Cluster Trial to Evaluate Assessment and Classification of Sick Children According to Integrated Management of Childhood Illness (IMCI) Protocol in Tanzania.

    Poor adherence to the Integrated Management of Childhood Illness (IMCI) protocol reduces the potential impact on under-five morbidity and mortality. Electronic technology could improve adherence; however, there are few studies demonstrating the benefits of such technology in resource-poor settings. This study estimates the impact of electronic technology on adherence to the IMCI protocols as compared to the current paper-based protocols in Tanzania. In four districts in Tanzania, 18 clinics were randomly selected for inclusion. At each site, observers documented critical parts of the clinical assessment of children aged 2 months to 5 years. The first set of observations occurred during examination of children using the paper-based IMCI (pIMCI) and the next set of observations occurred during examination using the electronic IMCI (eIMCI). Children were re-examined by an IMCI expert and the diagnoses were compared. A total of 1221 children (671 paper, 550 electronic) were observed. For all ten critical IMCI items included in both systems, adherence to the protocol was greater for eIMCI than for pIMCI. The proportion assessed under pIMCI ranged from 61% to 98%, compared to 92% to 100% under eIMCI (p < 0.05 for each of the ten assessment items). Use of electronic systems improved the completeness of assessment of children with acute illness in Tanzania. Given the before-after nature of the design, potential for temporal confounding is the primary limitation. However, the data collection for both phases occurred over a short period (one month), so temporal confounding was expected to be minimal. The results suggest that the use of electronic IMCI protocols can improve the completeness and consistency of clinical assessments, and future studies will examine the long-term health and health systems impact of eIMCI.
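    The item-level comparison reported above (e.g. a given item assessed in 61% of children under pIMCI versus 92% under eIMCI) can be checked with a standard two-by-two test of proportions. The sketch below uses the group sizes from the abstract but illustrative per-item percentages; it is not the authors' analysis code.

```python
# Minimal sketch: chi-square test comparing adherence to one IMCI item
# between the paper (pIMCI) and electronic (eIMCI) phases.
from scipy.stats import chi2_contingency

n_paper, n_electronic = 671, 550                  # children observed (from the abstract)
assessed_paper = round(0.61 * n_paper)            # illustrative: item assessed in 61% of pIMCI visits
assessed_electronic = round(0.92 * n_electronic)  # illustrative: 92% of eIMCI visits

table = [
    [assessed_paper, n_paper - assessed_paper],
    [assessed_electronic, n_electronic - assessed_electronic],
]
chi2, p, _, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.2g}")
```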

    Hospital characteristics associated with highly automated and usable clinical information systems in Texas, United States

    Background: A hospital's clinical information system may require a specific environment in which to flourish. This environment is not yet well defined. We examined whether specific hospital characteristics are associated with highly automated and usable clinical information systems. Methods: This was a cross-sectional survey of 125 urban hospitals in Texas, United States, using the Clinical Information Technology Assessment Tool (CITAT), which measures a hospital's level of automation based on physician interactions with the information system. Physician responses were used to calculate a series of CITAT scores: automation and usability scores, four automation sub-domain scores, and an overall clinical information technology (CIT) score. A multivariable regression analysis was used to examine the relation between hospital characteristics and CITAT scores. Results: We received a sufficient number of physician responses at 69 hospitals (55% response rate). Teaching hospitals, hospitals with higher IT operating expenses (>$1 million annually), higher IT capital expenses (>$75,000 annually) and larger IT staff (≥10 full-time staff) had higher automation scores than hospitals that did not meet these criteria (p < 0.05 in all cases). These findings held after adjustment for bed size, total margin, and ownership (p < 0.05 in all cases). There were few significant associations between the hospital characteristics tested in this study and usability scores. Conclusion: Academic affiliation and larger IT operating, capital, and staff budgets are associated with more highly automated clinical information systems.
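    The multivariable analysis described above can be sketched as an ordinary least-squares regression of the automation score on hospital characteristics, adjusting for structural covariates such as bed size. The example below uses synthetic data and hypothetical variable names; it only illustrates the structure of such a model, not the study's actual dataset.

```python
# Minimal sketch: regressing a hospital automation score on hospital characteristics.
# Synthetic data; variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 69  # hospitals with sufficient physician responses, per the abstract
df = pd.DataFrame({
    "automation_score": rng.normal(50, 10, n),
    "teaching": rng.integers(0, 2, n),        # 1 = teaching hospital
    "it_budget_high": rng.integers(0, 2, n),  # 1 = IT operating expenses > $1M/year
    "beds": rng.integers(100, 900, n),
})

fit = smf.ols("automation_score ~ teaching + it_budget_high + beds", data=df).fit()
print(fit.summary().tables[1])  # coefficients, standard errors and p-values
```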

    Hospital deaths and adverse events in Brazil

    Background: Adverse events are considered a major international problem related to the performance of health systems. Evaluating the occurrence of adverse events involves, as with any other outcome measure, determining the extent to which the observed differences can be attributed to the patient's risk factors or to variations in the treatment process, and this in turn highlights the importance of measuring differences in the severity of the cases. The current study aims to evaluate the association between deaths and adverse events, adjusted for patient risk factors. Methods: The study is based on a random sample of 1103 patient charts from hospitalizations in the year 2003 in 3 teaching hospitals in the state of Rio de Janeiro, Brazil. The methodology involved a retrospective review of patient charts in two stages: a screening phase and an evaluation phase. Logistic regression was used to evaluate the relationship between hospital deaths and adverse events. Results: The overall mortality rate was 8.5%, while the rate related to the occurrence of an adverse event was 2.9% (32/1103) and that related to preventable adverse events was 2.3% (25/1103). Among the 94 deaths analyzed, 34% were related to cases involving adverse events, and 26.6% of deaths occurred in cases whose adverse events were considered preventable. The models tested showed good discriminatory capacity. The unadjusted odds ratio (OR 11.43) and the odds ratio adjusted for patient risk factors (OR 8.23) between death and preventable adverse event were high. Conclusions: Despite discussions in the literature regarding the limitations of evaluating preventable adverse events based on peer review, the results presented here emphasize that adverse events are not only prevalent, but are associated with serious harm and even death. These results also highlight the importance of risk adjustment and multivariate models in the study of adverse events.
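    The abstract reports an unadjusted and a risk-adjusted odds ratio from logistic regression. A minimal sketch of that style of analysis, using synthetic data and hypothetical risk-factor names, follows; the published odds ratios (11.43 unadjusted, 8.23 adjusted) come from the study's own data, not from this toy example.

```python
# Minimal sketch: unadjusted vs. risk-adjusted odds ratio between in-hospital
# death and a preventable adverse event. Synthetic data, hypothetical covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1103  # chart sample size from the abstract
df = pd.DataFrame({
    "preventable_ae": rng.binomial(1, 0.023, n),
    "age": rng.integers(18, 90, n),
    "comorbidity_index": rng.integers(0, 5, n),
})
# Simulate higher mortality when a preventable adverse event occurred (illustrative rates).
df["death"] = rng.binomial(1, np.where(df["preventable_ae"] == 1, 0.40, 0.07))

unadjusted = smf.logit("death ~ preventable_ae", data=df).fit(disp=False)
adjusted = smf.logit("death ~ preventable_ae + age + comorbidity_index", data=df).fit(disp=False)
print("unadjusted OR:", float(np.exp(unadjusted.params["preventable_ae"])))
print("adjusted OR:  ", float(np.exp(adjusted.params["preventable_ae"])))
```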

    Connecting the dots in pharmacy education: The FIP International Pharmaceutical Federation Global Competency Framework for Educators and Trainers in Pharmacy (FIP-GCFE)

    The FIP (International Pharmaceutical Federation) Global Competency Framework for Educators and Trainers in Pharmacy (FIP-GCFE) is an ongoing project of the Academic Pharmacy Section of FIP, in cooperation and collaboration with Sections, Special Interest Groups and Working Groups across the Federation. It was developed by a group of experts in pharmaceutical education to enable and promote the continuing professional development of pharmacists and pharmaceutical scientists who plan to advance their competence as educators and trainers in pharmacy and the pharmaceutical sciences, whether in a formal or informal context, and at all levels of education and professional development. The FIP-GCFE will be an essential resource for multiple stakeholders, including individual educators, faculties of pharmacy, and accreditation agencies. This article presents the introductory text of the first version of the GCFE, connecting previously launched concepts and tools and explaining the integration with all other FIP workforce support frameworks, to provide a holistic approach to global workforce development.

    A Biomimetic Model for Mineralization of Type-I Collagen Fibrils

    Bone and dentin mainly consist of type-I collagen fibrils mineralized by hydroxyapatite (HAP) nanocrystals. In vitro biomimetic models based on self-assembled collagen fibrils have been widely used in studying the mineralization mechanism of type-I collagen. In this chapter, the protocol we used to build a biomimetic model for the mechanistic study of type-I collagen mineralization is described. Type-I collagen extracted from rat tail tendon or horse tendon is self-assembled into fibrils and mineralized by HAP in vitro. The mineralization process is monitored by cryoTEM in combination with two-dimensional (2D) and three-dimensional (3D) stochastic optical reconstruction microscopy (STORM), which enables in situ and high-resolution visualization of the process.

    Impact of Continuous Axenic Cultivation in Leishmania infantum Virulence

    Experimental infections with visceral Leishmania spp. are frequently performed using stationary-phase parasite cultures that comprise a mixture of metacyclic and non-metacyclic parasites, often with little regard to time in culture or metacyclic purification. This may lead to misleading or irreproducible experimental data. It is known that the maintenance of Leishmania spp. in vitro results in a progressive loss of virulence that can be reverted by passage in a mammalian host. In the present study, we aimed to characterize the loss of virulence in culture by comparing the in vitro and in vivo infection and immunological profiles of L. infantum stationary promastigotes submitted to successive periods of in vitro cultivation. To evaluate the effect of axenic in vitro culture on parasite virulence, we submitted L. infantum promastigotes to 4, 21 or 31 successive in vitro passages. Our results demonstrated a rapid and significant loss of parasite virulence when parasites are maintained in axenic culture. Strikingly, the parasites' capacity to modulate macrophage activation decreased significantly as the number of in vitro passages increased. We validated these in vitro observations using an experimental murine model of infection. A significant correlation was found between higher parasite burdens and lower numbers of in vitro passages in infected BALB/c mice. Furthermore, we demonstrated that the virulence deficit caused by successive in vitro passages results from an inadequate capacity to differentiate into amastigote forms. In conclusion, our data demonstrate that the use of parasites with distinct periods of axenic in vitro culture induces distinct infection rates and immunological responses, and we correlate this phenotype with a rapid loss of promastigote differentiation capacity. These results highlight the need for a standard operating protocol (SOP) when studying Leishmania species.
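    The correlation mentioned above (higher parasite burdens with fewer in vitro passages) can be illustrated with a simple rank correlation. The numbers below are illustrative only, not the study's data.

```python
# Minimal sketch: rank correlation between the number of in vitro passages of the
# inoculum and parasite burden in infected mice. Illustrative values only.
from scipy.stats import spearmanr

passages = [4, 4, 4, 21, 21, 21, 31, 31, 31]
log_parasite_burden = [8.2, 7.9, 8.5, 5.1, 4.8, 5.6, 2.3, 2.9, 2.1]

rho, p = spearmanr(passages, log_parasite_burden)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```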

    Effects of automated alerts on unnecessarily repeated serology tests in a cardiovascular surgery department: a time series analysis

    Background: Laboratory testing is frequently unnecessary, particularly repetitive testing. Among the interventions proposed to reduce unnecessary testing, Computerized Decision Support Systems (CDSS) have been shown to be effective, but their impact depends on their technical characteristics. The objective of the study was to evaluate the impact of a serology CDSS providing point-of-care reminders of previously existing serology results, embedded in a Computerized Physician Order Entry system at a university teaching hospital in Paris, France. Methods: A CDSS was implemented in the Cardiovascular Surgery department of the hospital in order to decrease inappropriate repetitions of viral serology tests (HBV). A time series analysis was performed to assess the impact of the alert on physicians' practices. The study took place between January 2004 and December 2007. The primary outcome was the proportion of unnecessarily repeated HBs antigen tests over the periods of the study. A test was considered unnecessary when it was ordered within 90 days after a previous test for the same patient. A secondary outcome was the proportion of potentially unnecessary HBs antigen test orders cancelled after an alert display. Results: In the pre-intervention period, 3,480 viral serology tests were ordered, of which 538 (15.5%) were unnecessarily repeated. During the intervention period, of the 2,095 HBs antigen tests performed, 330 unnecessary repetitions (15.8%) were observed. Before the intervention, the mean proportion of unnecessarily repeated HBs antigen tests increased by 0.4% per month (absolute increase, 95% CI 0.2% to 0.6%, p < 0.001). After the intervention, a significant trend change occurred, with a monthly difference estimated at -0.4% (95% CI -0.7% to -0.1%, p = 0.02), resulting in a stable proportion of unnecessarily repeated HBs antigen tests. A total of 380 unnecessary tests were ordered among 500 alerts displayed (compliance rate 24%). Conclusions: The proportion of unnecessarily repeated tests immediately dropped after CDSS implementation and remained stable, contrasting with the significant continuous increase observed before. The compliance rate confirmed the effect of the alerts. It is necessary to continue experimentation with dedicated systems in order to improve understanding of the diversity of CDSS and their impact on clinical practice.
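    The time series analysis described above is typically implemented as a segmented (interrupted time series) regression: the monthly proportion of unnecessarily repeated tests is modelled with a baseline trend plus a change in level and trend after the alerts go live. The sketch below uses synthetic monthly data and an assumed intervention month purely for illustration; it is not the study's analysis code.

```python
# Minimal sketch: segmented regression for an interrupted time series.
# Synthetic monthly proportions; the intervention month is an assumption.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
months = np.arange(48)                      # Jan 2004 - Dec 2007
intervention = (months >= 30).astype(int)   # hypothetical month the CDSS went live
months_after = np.where(intervention == 1, months - 30, 0)

# Pre-intervention: ~0.4 percentage-point rise per month; roughly flat afterwards.
proportion = 10 + 0.4 * months - 0.4 * months_after + rng.normal(0, 0.8, 48)

df = pd.DataFrame({"proportion": proportion, "month": months,
                   "intervention": intervention, "months_after": months_after})
fit = smf.ols("proportion ~ month + intervention + months_after", data=df).fit()
print(fit.params.round(3))  # 'months_after' estimates the post-intervention change in trend
```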