
    Near-source passive sampling for monitoring viral outbreaks within a university residential setting

    © 2024 Cambridge University Press. All rights reserved. Wastewater-based epidemiology (WBE) has proven to be a powerful tool for the population-level monitoring of pathogens, particularly SARS-CoV-2. For accurate and timely assessment, several wastewater sampling regimes and methods of viral concentration have been investigated, mainly targeting SARS-CoV-2. However, the use of passive samplers in near-source environments for a range of viruses in wastewater remains under-investigated. To address this, near-source passive samples were taken at four locations targeting student halls of residence, chosen as an exemplar because of their high population density and perceived risk of disease transmission. Viruses investigated were SARS-CoV-2 and its variants of concern (VOCs), influenza A and B viruses, and enteroviruses. Sampling was conducted either in the morning, with passive samplers in place overnight (17 h), or during the day, with samplers remaining in the sewer for 7 h. We demonstrated the usefulness of near-source passive sampling for the detection of VOCs using qPCR and next-generation sequencing. Furthermore, several outbreaks of influenza A and sporadic outbreaks of enteroviruses (some associated with enterovirus D68 and coxsackieviruses) were identified amongst the resident student population, providing evidence of the usefulness of near-source, in-sewer sampling for monitoring the health of high-population-density communities.

    Prey resources are equally important as climatic conditions for predicting the distribution of a broad-ranged apex predator

    Aim: A current biogeographic paradigm states that climate regulates species distributions at continental scales and that biotic interactions are undetectable at coarse-grain extents. However, advances in spatial modelling show that incorporating food resource distributions is important for improving model predictions at large distribution scales. This is particularly relevant for understanding the factors limiting the distribution of widespread apex predators, whose diets are likely to vary across their range. Location: Neotropical Central and South America. Methods: The harpy eagle (Harpia harpyja) is a large raptor whose diet largely comprises arboreal mammals, all with broad distributions across Neotropical lowland forest. Here, we used a hierarchical modelling approach to determine the relative importance of abiotic factors and prey resource distribution for harpy eagle range limits. Our hierarchical approach consisted of the following modelling sequence of explanatory variables: (a) abiotic covariates; (b) prey resource distributions, each predicted by an equivalent model for that prey species; (c) the combination of (a) and (b); and (d) as in (c), but with prey resources considered as a single prediction equivalent to prey species richness. Results: Incorporating prey distributions improved model predictions, but using solely biotic covariates still resulted in a high-performing model. In the Abiotic model, Climatic Moisture Index (CMI) was the most important predictor, contributing 76% to model prediction. Three-toed sloths (Bradypus spp.) were the most important prey resource, contributing 64% in a combined Abiotic-Biotic model, followed by CMI contributing 30%. Harpy eagle distribution had high environmental overlap with all individual prey distributions, with highest coincidence through Central America, eastern Colombia, and across the Guiana Shield into northern Amazonia. Main conclusions: Given this strong reliance on prey distributions across its range, harpy eagle conservation programs must consider its most important food resources as a key element in the protection of this threatened raptor.
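
    The four-step covariate sequence in the Methods amounts to fitting the same model family on different covariate sets and comparing predictive performance. The following is a minimal sketch of that comparison, assuming hypothetical presence/background data; scikit-learn logistic regression stands in for whatever SDM algorithm the study actually used, and every variable and value here is illustrative.

```python
# Sketch of the hierarchical covariate comparison: one model family fitted
# on abiotic, prey, combined, and prey-richness covariate sets.
# All data are randomly generated placeholders; logistic regression stands in
# for the SDM algorithm actually used in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
X_abiotic = rng.normal(size=(n, 3))  # e.g. CMI plus other climate covariates
X_prey = rng.normal(size=(n, 4))     # predicted suitability for each prey species
y = rng.integers(0, 2, size=n)       # 1 = harpy eagle presence, 0 = background

covariate_sets = {
    "(a) abiotic": X_abiotic,
    "(b) prey": X_prey,
    "(c) combined": np.hstack([X_abiotic, X_prey]),
    "(d) richness": np.hstack([X_abiotic, X_prey.sum(axis=1, keepdims=True)]),
}
for name, X in covariate_sets.items():
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                          cv=5, scoring="roc_auc").mean()
    print(f"{name}: cross-validated AUC = {auc:.2f}")
```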

    Haemoglobin mass and running time trial performance after recombinant human erythropoietin administration in trained men

    Recombinant human erythropoietin (rHuEpo) increases haemoglobin mass (Hbmass) and maximal oxygen uptake (V̇O2max).

    Purpose: This study defined the time course of changes in Hbmass and V̇O2max as well as running time trial performance following 4 weeks of rHuEpo administration to determine whether the laboratory observations would translate into actual improvements in running performance in the field.

    Methods: 19 trained men received rHuEpo injections of 50 IU·kg⁻¹ body mass every two days for 4 weeks. Hbmass was determined weekly using the optimized carbon monoxide rebreathing method until 4 weeks after administration. V̇O2max and 3,000 m time trial performance were measured pre administration, post administration and at the end of the study.

    Results: Relative to baseline, running performance significantly improved by ~6% after administration (10:30 ± 1:07 min:sec vs. 11:08 ± 1:15 min:sec, p < 0.001) and remained significantly enhanced by ~3% 4 weeks after administration (10:46 ± 1:13 min:sec, p < 0.001), while V̇O2max was also significantly increased post administration (60.7 ± 5.8 mL·min⁻¹·kg⁻¹ vs. 56.0 ± 6.2 mL·min⁻¹·kg⁻¹, p < 0.001) and remained significantly increased 4 weeks after rHuEpo (58.0 ± 5.6 mL·min⁻¹·kg⁻¹, p = 0.021). Hbmass was significantly increased at the end of administration compared to baseline (15.2 ± 1.5 g·kg⁻¹ vs. 12.7 ± 1.2 g·kg⁻¹, p < 0.001). The rate of decrease in Hbmass toward baseline values post rHuEpo was similar to that of the increase during administration (−0.53 g·kg⁻¹·wk⁻¹, 95% confidence interval (CI) (−0.68, −0.38) vs. 0.54 g·kg⁻¹·wk⁻¹, CI (0.46, 0.63)), but Hbmass was still significantly elevated 4 weeks after administration compared to baseline (13.7 ± 1.1 g·kg⁻¹, p < 0.001).

    Conclusion: Running performance was improved following 4 weeks of rHuEpo and remained elevated 4 weeks after administration compared to baseline. These field performance effects coincided with rHuEpo-induced elevated V̇O2max and Hbmass.
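
    The washout figure quoted above (−0.53 g·kg⁻¹·wk⁻¹, 95% CI (−0.68, −0.38)) is a weekly linear slope with its confidence interval. As a worked illustration of how such an estimate can be obtained, here is a minimal sketch using scipy's linear regression; the weekly Hbmass values are invented placeholders, not the study's measurements.

```python
# Sketch: weekly Hbmass slope with a 95% CI via ordinary least squares.
# Weekly values are invented placeholders, not the study's data.
import numpy as np
from scipy import stats

weeks = np.array([0, 1, 2, 3, 4])                  # weeks after last injection
hbmass = np.array([15.2, 14.7, 14.2, 13.9, 13.7])  # g/kg, hypothetical means

fit = stats.linregress(weeks, hbmass)
t_crit = stats.t.ppf(0.975, df=len(weeks) - 2)     # two-sided 95% critical value
ci = (fit.slope - t_crit * fit.stderr, fit.slope + t_crit * fit.stderr)
print(f"slope = {fit.slope:.2f} g/kg/wk, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```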

    Could sound be used as a strategy for reducing symptoms of perceived motion sickness?

    Background: Working while exposed to motion affects a person both physically and psychologically. Traditionally, motion sickness symptom reduction has implied the use of medication, which can lead to detrimental effects on performance. Non-pharmaceutical strategies, in turn, often require cognitive and perceptual attention. Hence, for people working in high-demand environments where it is impossible to reallocate the focus of attention, other strategies are called for. The aim of the study was to investigate the possible impact of a mitigation strategy on perceived motion sickness and psychophysiological responses, based on an artificial sound horizon compared with a non-positioned sound source.

    Methods: Twenty-three healthy subjects were seated on a motion platform in an artificial sound horizon or in non-positioned sound, in random order with a one-week interval between the trials. Perceived motion sickness (Mal), maximum duration of exposure (ST), skin conductance, blood volume pulse, temperature, respiration rate, eye movements and heart rate were measured continuously throughout the trials.

    Results: Mal scores increased over time in both sound conditions, but the artificial sound horizon, applied as a mitigation strategy for perceived motion sickness, showed no significant effect on Mal scores or ST. The number of fixations increased with time in the non-positioned sound condition. Moreover, fixation time was longer in the non-positioned sound condition than with the sound horizon, indicating that the subjects used more time to fixate and hence presumably made fewer saccades.

    Conclusion: A subliminally presented artificial sound horizon did not significantly affect perceived motion sickness, psychophysiological variables or the time the subjects endured the motion-sickness-triggering stimuli. The number of fixations and fixation times increased over time in the non-positioned sound condition.

    Zebrafish Larvae Exhibit Rheotaxis and Can Escape a Continuous Suction Source Using Their Lateral Line

    Zebrafish larvae show a robust behavior called rheotaxis, whereby they use their lateral line system to orient upstream in the presence of a steady current. At 5 days post-fertilization, rheotactic larvae can detect and initiate a swimming burst away from a continuous point source of suction. Burst distance and velocity increase when fish initiate bursts closer to the suction source, where flow velocity is higher. We suggest that either the magnitude of the burst reflects the initial flow stimulus, or fish may continually sense flow during the burst to determine where to stop. By removing specific neuromasts of the posterior lateral line along the body, we show how the location and number of flow sensors play a role in detecting a continuous suction source. We show that the burst response critically depends on the presence of neuromasts on the tail. Flow information relayed by neuromasts appears to be involved in the selection of appropriate behavioral responses. We hypothesize that, at an early age, caudally located neuromasts may be preferentially connected to fast-swimming spinal motor networks while rostrally located neuromasts are connected to slow-swimming motor networks.

    Statin Induced Myopathy and Myalgia: Time Trend Analysis and Comparison of Risk Associated with Statin Class from 1991–2006

    BACKGROUND: Statins are widely used as cholesterol-lowering medications; they reduce cardiovascular mortality and morbidity in high-risk patients and only rarely cause serious adverse drug reactions (ADRs). UK primary care databases of morbidity and prescription data, which now cover several million people, have potential for more powerful analytical approaches to studying ADRs, including adjusting for confounders and examining temporal effects. METHODS: We used a case-crossover design to detect the statin-associated myopathy ADR in 93,831 patients, using two independent primary care databases (1991-2006). We analysed risk by drug class, by disease code and by cumulative year, exploring different cut-off exposure times and confounding by temporality. RESULTS: Using 12- and 26-week exposure periods, large risk ratios (RR) are associated with all classes of statins and fibrates for myopathy: RR 10.6 (9.8-11.4) and 19.9 (17.6-22.6), respectively. At 26 weeks, the largest risks are with fluvastatin, RR 33.3 (95% CI 16.8-66.0), and ciprofibrate (with previous statin use), RR 40.5 (95% CI 13.4-122.0). At 12 weeks, the differences in myopathy RR between cerivastatin and atorvastatin were found to be significant, RR 2.05 (95% CI 1.2-3.5), as were those between rosuvastatin and fluvastatin, RR 3.0 (95% CI 1.6-5.7). After 12 months of statin initiation, the relative risk of myopathy for all statins and fibrates increased to 25.7 (95% CI 21.8-30.3). Furthermore, this signal was detected within 2 years of the first events being recorded. Our data suggest an annual incidence of statin-induced myopathy or myalgia of around 11.4 per 16,591 patients, or 689 per million per year. CONCLUSION: There may be differential risks associated with some classes of statin and fibrate. Myopathy related to statin or fibrate use may persist after a long exposure time (12 months or more). These methods could be applied for early detection of harmful drug side effects, using similar primary care diagnostic and prescribing data.
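
    A case-crossover analysis treats each patient as his or her own control, so the effect estimate comes from patients whose exposure differs between the hazard window (just before the event) and an earlier control window, analogous to a matched-pair odds ratio. A minimal sketch of that estimate and its confidence interval follows; the counts are invented for illustration and are not the study's data.

```python
# Sketch of the matched-pair estimate behind a case-crossover analysis.
# Counts are invented for illustration, not taken from the study.
import math

exposed_hazard_only = 530   # exposed in the 12 weeks before the event only
exposed_control_only = 50   # exposed in the earlier control window only

or_estimate = exposed_hazard_only / exposed_control_only
se_log_or = math.sqrt(1 / exposed_hazard_only + 1 / exposed_control_only)
ci_low = math.exp(math.log(or_estimate) - 1.96 * se_log_or)
ci_high = math.exp(math.log(or_estimate) + 1.96 * se_log_or)
print(f"OR = {or_estimate:.1f}, 95% CI ({ci_low:.1f}, {ci_high:.1f})")
```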

    The Red Sea, Coastal Landscapes, and Hominin Dispersals

    This chapter provides a critical assessment of environment, landscape and resources in the Red Sea region over the past five million years in relation to archaeological evidence of hominin settlement, and of current hypotheses about the role of the region as a pathway or obstacle to population dispersals between Africa and Asia and the possible significance of coastal colonization. The discussion assesses the impact of factors such as topography and the distribution of resources on land and on the seacoast, taking account of geographical variation and changes in geology, sea levels and palaeoclimate. The merits of northern and southern routes of movement at either end of the Red Sea are compared. All the evidence indicates that there has been no land connection at the southern end since the beginning of the Pliocene, but that short sea crossings would have been possible at the lowest sea-level stands with little or no technical aid. More important than the possibility of crossing the southern channel is the nature of the resources available in the adjacent coastal zones. There were many climatic episodes wetter than today, and during these periods water draining from the Arabian escarpment provided productive conditions for large mammals and human populations in coastal regions and eastwards into the desert. During drier episodes the coastal region would have provided important refugia, both in upland areas and on the emerged shelves exposed by lowered sea level, especially in the southern sector and on both sides of the Red Sea. Marine resources may have offered an added advantage in coastal areas, but evidence for their exploitation is very limited, and their role has been exaggerated in hypotheses of coastal colonization.

    Reflections on integrating bioinformatics into the undergraduate curriculum: The Lancaster experience

    Bioinformatics is an essential discipline for biologists. It also has a reputation for being difficult for those without a strong quantitative and computer science background. At Lancaster University, we have developed modules for the integration of bioinformatics skills training into our undergraduate biology degree portfolio. This article describes those modules, situating them in the context of the accumulated quarter-century of literature on bioinformatics education. The constant evolution of bioinformatics as a discipline is emphasized, drawing attention to the continual need to revise and upgrade the skills being taught, even at undergraduate level. Our overarching aim is to equip students both with a portfolio of skills in the currently most essential bioinformatics tools and with the confidence to continue their own bioinformatics skills development at postgraduate or professional level.

    Natriuretic peptide vs. clinical information for diagnosis of left ventricular systolic dysfunction in primary care

    Background: Screening of primary care patients at risk of left ventricular systolic dysfunction by a simple blood test might reduce referral rates for echocardiography. Whether or not natriuretic peptide testing is a useful and cost-effective diagnostic instrument in primary care settings, however, is still a matter of debate.

    Methods: N-terminal pro-brain natriuretic peptide (NT-proBNP) levels, clinical information, and echocardiographic data on left ventricular systolic function were collected in 542 family practice patients with at least one cardiovascular risk factor. We determined the diagnostic power of the NT-proBNP assessment in ruling out left ventricular systolic dysfunction and compared it to a risk score derived from a logistic regression model of easily acquired clinical information.

    Results: 23 of 542 patients showed left ventricular systolic dysfunction. Both NT-proBNP and the clinical risk score, consisting of dyspnea at exertion, ankle swelling, coronary artery disease and diuretic treatment, showed excellent diagnostic power for ruling out left ventricular systolic dysfunction. The AUC of NT-proBNP was 0.83 (95% CI, 0.75 to 0.92), with a sensitivity of 0.91 (95% CI, 0.71 to 0.98) and a specificity of 0.46 (95% CI, 0.41 to 0.50). The AUC of the clinical risk score was 0.85 (95% CI, 0.79 to 0.91), with a sensitivity of 0.91 (95% CI, 0.71 to 0.98) and a specificity of 0.64 (95% CI, 0.59 to 0.67). 148 misclassifications using NT-proBNP versus 55 using the clinical risk score revealed a significant difference (McNemar test, p < 0.001) based on the higher specificity of the clinical risk score.

    Conclusion: The evaluation of clinical information is at least as effective as NT-proBNP testing in ruling out left ventricular systolic dysfunction in family practice patients at risk. If these results are confirmed in larger cohorts and in different samples, family physicians should be encouraged to rely on the diagnostic power of the clinical information from their patients.
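
    The head-to-head comparison above rests on sensitivity, specificity, and McNemar's test applied to the two tests' misclassifications. Below is a minimal sketch of those computations, assuming hypothetical binary rule-out decisions; all arrays are randomly generated placeholders, not the study's data.

```python
# Sketch: sensitivity/specificity per test, then McNemar's exact test on
# discordant misclassifications. All data below are random placeholders.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 200)      # echocardiography: 1 = LV systolic dysfunction
pred_bnp = rng.integers(0, 2, 200)    # hypothetical NT-proBNP rule-out decision
pred_score = rng.integers(0, 2, 200)  # hypothetical clinical risk score decision

def sens_spec(y, p):
    """Sensitivity and specificity of binary predictions p against truth y."""
    return np.mean(p[y == 1] == 1), np.mean(p[y == 0] == 0)

for name, p in [("NT-proBNP", pred_bnp), ("risk score", pred_score)]:
    sens, spec = sens_spec(y_true, p)
    print(f"{name}: sensitivity = {sens:.2f}, specificity = {spec:.2f}")

# McNemar's exact test: a binomial test on the discordant misclassifications.
wrong_bnp = pred_bnp != y_true
wrong_score = pred_score != y_true
b = int(np.sum(wrong_bnp & ~wrong_score))  # only NT-proBNP misclassifies
c = int(np.sum(~wrong_bnp & wrong_score))  # only the risk score misclassifies
print(f"McNemar exact p = {binomtest(b, b + c, 0.5).pvalue:.3f}")
```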