
    Drone Technology in Agriculture Appraisal

    The purpose of this project is to conduct a cost-benefit analysis of implementing UAV/drone technology within agricultural appraisals. The project is organized into four sections: introduction, literature review, methodology, and conclusions. The methodology section is a capital budget analysis measuring the effectiveness of UAVs within the appraisal department, developed further through a Net Present Value (NPV) analysis. The NPV analysis spans five years and measures the changes in productivity and total revenue caused by UAV technology. Results were obtained with @Risk simulation, applied to data gathered from field experiments on drone effectiveness during the appraisal process. The simulated data were incorporated into various budgets and used to create the five-year NPV analysis. Three separate scenarios were created, representing a Best-Case, Average-Case, and Worst-Case scenario, and an NPV analysis was conducted for each. Data for all three scenarios came from the field experiments, and all three use a discount rate of 10%. For each scenario, Year 1 includes the initial investment of purchasing the drone. The conclusion section summarizes the project, points out its potential weaknesses, and states the final consensus about UAV technology within the agricultural appraisal industry. Throughout this project, the terms UAVs and drones are used interchangeably. The final consensus is that, despite the legal and time risks, UAVs are beneficial to agricultural appraisers. As discussed in the report, drone use can increase efficiency and accuracy when inspecting larger tracts of land, and UAVs allow appraisers to view areas of a property that are difficult to assess due to weather, accessibility, or terrain.
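
    The core calculation behind the report is easy to illustrate. The sketch below is a minimal Python example computing a five-year NPV at the stated 10% discount rate for three scenarios; the cash-flow figures are invented placeholders (the report's actual inputs came from @Risk simulations of the field-experiment data), so only the structure of the calculation mirrors the project.

```python
# Minimal five-year NPV sketch mirroring the report's setup: a 10% discount
# rate and three scenarios, with Year 1 netting out the drone purchase.
# All cash-flow figures below are hypothetical placeholders, NOT project data.

def npv(rate, cash_flows):
    """Net Present Value; cash_flows[0] is the Year 1 flow."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

DISCOUNT_RATE = 0.10  # discount rate stated in the abstract

scenarios = {
    "Best-Case":    [-1500, 4000, 4500, 5000, 5500],
    "Average-Case": [-2000, 2500, 2750, 3000, 3250],
    "Worst-Case":   [-2500, 1000, 1100, 1200, 1300],
}

for name, flows in scenarios.items():
    print(f"{name}: NPV = ${npv(DISCOUNT_RATE, flows):,.2f}")
```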

    Brachial Artery Constriction during Brachial Artery Reactivity Testing Predicts Major Adverse Clinical Outcomes in Women with Suspected Myocardial Ischemia: Results from the NHLBI-Sponsored Women's Ischemia Syndrome Evaluation (WISE) Study

    Background: Limited brachial artery (BA) flow-mediated dilation during brachial artery reactivity testing (BART) has been linked to increased cardiovascular risk. We report on the phenomenon of BA constriction (BAC) following hyperemia. Objectives: To determine whether BAC predicts adverse cardiovascular outcomes and/or mortality in the Women's Ischemia Syndrome Evaluation (WISE) study and, as a secondary objective, to determine the risk factors associated with BAC. Methods: We performed BART on 377 women with chest pain referred for coronary angiography and followed for a median of 9.5 years. Forearm ischemia was induced with 4 minutes of occlusion by a cuff placed distal to the BA and inflated to 40 mm Hg above systolic pressure. BAC was defined as >4.8% artery constriction following release of the cuff. The main outcome was major adverse events (MACE), including all-cause mortality, non-fatal MI, non-fatal stroke, or hospitalization for heart failure. Results: BA diameter change ranged from -20.6% to +44.9%, and 41 (11%) women experienced BAC. Obstructive coronary artery disease (CAD) and traditional CAD risk factors were not predictive of BAC. Overall, 39% of women with BAC experienced MACE vs. 22% without BAC (p=0.004). In multivariate Cox proportional hazards regression, BAC was a significant independent predictor of MACE (p=0.018) when adjusting for obstructive CAD and traditional risk factors. Conclusions: BAC predicts almost double the risk of major adverse events compared to patients without BAC, and this risk was not accounted for by CAD or traditional risk factors. The novel risk marker of BAC requires further investigation in women. © 2013 Sedlak et al.
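
    To make the analysis concrete, here is a hedged sketch of the multivariate Cox proportional hazards step using the open-source lifelines package; the paper does not name its software, and the data below are synthetic (loosely calibrated to the reported 11% BAC prevalence and roughly doubled hazard), not WISE data.

```python
# Synthetic illustration of a multivariate Cox proportional hazards fit;
# using `lifelines` is an assumption, and none of this is WISE study data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 377  # cohort size from the abstract
df = pd.DataFrame({
    "bac": rng.binomial(1, 0.11, n),             # ~11% with BA constriction
    "obstructive_cad": rng.binomial(1, 0.4, n),  # placeholder prevalence
    "age": rng.normal(58, 10, n),                # placeholder risk factor
})
# Simulated follow-up: BAC roughly doubles the event hazard, as reported.
hazard = 0.03 * np.exp(0.7 * df["bac"])
df["years"] = rng.exponential(1 / hazard)
df["mace"] = (df["years"] < 9.5).astype(int)  # censor at median follow-up
df.loc[df["mace"] == 0, "years"] = 9.5

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="mace")
cph.print_summary()  # hazard ratio for `bac`, adjusted for the covariates
```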

    Role of healthcare workers in early epidemic spread of Ebola: policy implications of prophylactic compared to reactive vaccination policy in outbreak prevention and control

    Ebola causes severe illness in humans and has epidemic potential. How to deploy vaccines most effectively is a central policy question, since different strategies have implications for the ideal vaccine profile; more than one vaccine may be needed. A vaccine optimised for prophylactic vaccination in high-risk areas while the virus is not actively circulating should be safe, well tolerated, and provide long-lasting protection; a two- or three-dose strategy would be realistic. Conversely, a reactive vaccine deployed in an outbreak context for ring-vaccination strategies should have rapid onset of protection with one dose, but longevity of protection is less important. In the initial cases, before an outbreak is recognised, healthcare workers (HCWs) are at particular risk of acquiring and transmitting infection, thus potentially augmenting early epidemics. We hypothesise that many early outbreak cases could be averted, or epidemics aborted, by prophylactic vaccination of HCWs. This paper explores the potential impact of prophylactic versus reactive vaccination strategies for HCWs in preventing early epidemic transmissions. To do this, we use the limited data available from Ebola epidemics (current and historic) to reconstruct transmission trees and illustrate the theoretical impact of these vaccination strategies. Our data suggest a substantial potential benefit of prophylactic versus reactive vaccination of HCWs in preventing early transmissions. We estimate that prophylactic vaccination with coverage >99% and a theoretical 100% efficacy could avert nearly two-thirds of the cases studied; 75% coverage would still confer a clear benefit (40% of cases averted), but reactive vaccination would be of less value in the early epidemic. A prophylactic vaccination campaign for front-line HCWs is not a trivial undertaking; whether to prioritise long-lasting vaccines and provide prophylaxis to HCWs is a live policy question. Prophylactic vaccination is likely to have a greater impact on the mitigation of future epidemics than reactive strategies and, in some cases, might prevent them. However, in a confirmed outbreak, reactive vaccination would be an essential humanitarian priority. The value of HCW Ebola vaccination is often seen only in terms of personal protection of the HCW workforce; a prophylactic vaccination strategy is likely to bring substantial additional benefit by preventing early transmission and might abort some epidemics. This has implications both for policy and for the optimum product profile for vaccines currently in development.
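
    The transmission-tree argument can be made concrete with a toy example. The sketch below is not the paper's code: it builds a small hypothetical early-outbreak tree, vaccinates HCWs at a given coverage (with an assumed 100% efficacy), and counts as averted every case whose infection chain passes through a protected HCW.

```python
# Toy illustration of pruning a transmission tree by vaccinating HCWs.
# The tree and coverage below are hypothetical, not reconstructed outbreak data.
import random

# child -> parent links of an invented early-outbreak transmission tree
tree = {
    "index": None,
    "hcw1": "index", "c1": "hcw1", "c2": "hcw1",
    "c3": "index", "hcw2": "c3", "c4": "hcw2", "c5": "hcw2", "c6": "c4",
}
hcws = {"hcw1", "hcw2"}

def averted(tree, protected):
    """Count cases whose chain of infection passes through a protected
    (vaccinated, fully immune) individual, including that individual."""
    def blocked(case):
        while case is not None:
            if case in protected:
                return True
            case = tree[case]
        return False
    return sum(blocked(c) for c in tree)

random.seed(1)
coverage = 0.75  # vaccinate each HCW independently with this probability
protected = {h for h in sorted(hcws) if random.random() < coverage}
print(f"{averted(tree, protected)} of {len(tree)} cases averted")
```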

    Hypothermia and Fever After Organophosphorus Poisoning in Humans—A Prospective Case Series

    Many animal studies have examined the effects of organophosphorus pesticide (OP) poisoning on thermoregulation, with inconsistent results, and there have been no prospective human studies. Our aim was to document the changes in body temperature with OP poisoning. A prospective study was conducted in a rural hospital in Polonnaruwa, Sri Lanka. We collected data on sequential patients with OP poisoning and analyzed 12 patients, selected from 53 presentations, who had overt signs and symptoms of OP poisoning and who had not received atropine prior to arrival. All patients subsequently received specific management with atropine and/or pralidoxime and general supportive care. Tympanic temperature, ambient temperature, heart rate, and clinical examinations and interventions were recorded prospectively throughout hospitalization. Initial hypothermia as low as 32°C was observed in untreated patients. Tympanic temperature increased over time from early hypothermia (<35°C in 6/12 patients) to later fever (>38°C at some later point in 7/12 patients). While some of the late high temperatures occurred in the setting of marked tachycardia, in some cases fever was not accompanied by tachycardia, making excessive atropine or severe infection an unlikely explanation for all the fevers. In humans, OP poisoning causes an initial hypothermia, which is followed by a period of normal to high body temperature. Atropine and respiratory complications may contribute to fever but do not account for all cases.

    Choice of first-line antiretroviral therapy regimen and treatment outcomes for HIV in a middle income compared to a high income country: a cohort study

    BACKGROUND: The range of combination antiretroviral therapy (cART) regimens available in many middle-income countries differs from those suggested in international HIV treatment guidelines. We compared first-line cART regimens, timing of initiation, and treatment outcomes in a middle-income setting (HIV Centre, Belgrade, Serbia - HCB) with a high-income country (Royal Free London Hospital, UK - RFH). METHODS: All antiretroviral-naïve HIV-positive individuals from HCB and RFH starting cART between 2003 and 2012 were included. 12-month viral load and CD4 count responses were compared, considering the first available measurement 12-24 months post-cART. The percentage that had made an antiretroviral switch for any reason, or for toxicity, and the percentage that had died by 36 months (the latest time at which sufficient numbers remained under follow-up) were investigated using standard survival methods. RESULTS: 361/597 (61%) of individuals initiating cART at HCB had a prior AIDS diagnosis, compared to 337/1763 (19%) at RFH. Median pre-ART CD4 counts were 177 and 238 cells/mm³, respectively (p < 0.0001). The most frequently prescribed antiretrovirals were zidovudine with lamivudine (149; 25%) and efavirenz (329; 55%) at HCB, and emtricitabine with tenofovir (899; 51%) and efavirenz (681; 39%) at RFH. At HCB, a median of 2 CD4 count measurements was taken in the first year of cART, compared to 5 at RFH (p < 0.0001). The median (IQR) CD4 cell increase after 12 months was +211 (+86, +359) and +212 (+105, +318), respectively. 287 (48%) individuals from HCB and 1452 (82%) from RFH had an available viral load measurement, of which 271 (94%) and 1280 (88%), respectively, were <400 copies/mL (p < 0.0001). After 36 months, comparable percentages had made at least one antiretroviral switch (77% HCB vs. 78% RFH; p = 0.23); however, switches for toxicity/patient choice were more common at RFH. After 12 and 36 months of cART, 3% and 8% of individuals at HCB had died, versus 2% and 4% at RFH (p < 0.0001). CONCLUSION: In middle-income countries, cART is usually started at an advanced stage of HIV disease, resulting in higher mortality rates than in high-income countries. This supports improved testing campaigns for early detection of HIV infection and early introduction of newer cART regimens.
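
    As an illustration of the "standard survival methods" mentioned, the sketch below compares 36-month mortality between two simulated cohorts with Kaplan-Meier estimation and a log-rank test via the lifelines package (the paper does not state its software); the event times are simulated to match the reported 8% and 4% mortality figures and are not the HCB/RFH records.

```python
# Simulated Kaplan-Meier comparison of 36-month mortality; cohort sizes and
# mortality rates come from the abstract, but the event times are invented.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(42)

def simulate(n, mortality_36m):
    """Exponential death times calibrated to a 36-month mortality rate,
    censored at 36 months."""
    rate = -np.log(1 - mortality_36m) / 36.0
    t = rng.exponential(1 / rate, n)
    died = t <= 36
    return np.where(died, t, 36.0), died.astype(int)

t_hcb, d_hcb = simulate(597, 0.08)   # 8% mortality by 36 months (HCB)
t_rfh, d_rfh = simulate(1763, 0.04)  # 4% mortality by 36 months (RFH)

kmf = KaplanMeierFitter()
kmf.fit(t_hcb, d_hcb, label="HCB (Belgrade)")
print(kmf.predict(36.0))  # estimated survival probability at 36 months

res = logrank_test(t_hcb, t_rfh, event_observed_A=d_hcb, event_observed_B=d_rfh)
print(f"log-rank p = {res.p_value:.4g}")
```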

    Primary Progressive Aphasia: Toward a Pathophysiological Synthesis

    PURPOSE OF REVIEW: The term primary progressive aphasia (PPA) refers to a diverse group of dementias that present with prominent and early problems with speech and language, and that pose considerable challenges to clinicians and researchers. RECENT FINDINGS: Here, we review critical issues around diagnosis of the three major PPA variants (semantic variant PPA, nonfluent/agrammatic variant PPA, logopenic variant PPA), as well as considering 'fragmentary' syndromes. We next consider issues around assessing disease stage, before discussing physiological phenotyping of proteinopathies across the PPA spectrum. We also review evidence for core central auditory impairments in PPA, outline critical challenges associated with treatment, discuss pathophysiological features of each major PPA variant, and conclude with thoughts on key challenges that remain to be addressed. New findings elucidating the pathophysiology of PPA represent a major step forward in our understanding of these diseases, with implications for diagnosis, care, management, and therapies.