
    Tilting the balance between RNA interference and replication eradicates Leishmania RNA virus 1 and mitigates the inflammatory response.

    Many Leishmania (Viannia) parasites harbor the double-stranded RNA virus Leishmania RNA virus 1 (LRV1), which has been associated with increased disease severity in animal models and humans and with drug treatment failures in humans. Remarkably, LRV1 survives in the presence of an active RNAi pathway, which in many organisms controls RNA viruses. We found significant levels (0.4 to 2.5%) of small RNAs derived from LRV1 in both Leishmania braziliensis and Leishmania guyanensis, mapping across both strands and with properties consistent with Dicer-mediated cleavage of the dsRNA genome. LRV1 lacks cis- or trans-acting RNAi inhibitory activities, suggesting that virus retention must be maintained by a balance between RNAi activity and LRV1 replication. To tilt this balance toward elimination, we targeted LRV1 using long-hairpin/stem-loop constructs similar to those effective against chromosomal genes. LRV1 was eliminated completely and at high efficiency, accompanied by a massive overproduction of LRV1-specific siRNAs, representing as much as 87% of the total. For both L. braziliensis and L. guyanensis, RNAi-derived LRV1-negative lines were no longer able to induce a Toll-like receptor 3-dependent hyperinflammatory cytokine response in infected macrophages. We demonstrate in vitro a role for LRV1 in the virulence of L. braziliensis, the Leishmania species responsible for the vast majority of mucocutaneous leishmaniasis cases. These findings establish a targeted method for elimination of LRV1, and potentially of other Leishmania viruses, which will facilitate mechanistic dissection of LRV1-mediated virulence. Moreover, our data establish a third paradigm for RNAi-viral relationships in evolution: one of balance rather than elimination.
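
    A back-of-the-envelope way to obtain numbers like the 0.4-2.5% (and, after hairpin targeting, up to 87%) figures is to count small RNA reads aligning to the LRV1 genome on each strand. The sketch below is a minimal illustration using pysam; the BAM file name and the "LRV1" contig name are hypothetical placeholders, not details from the study:

```python
# Sketch: estimate the share of small RNA reads mapping to LRV1, per strand.
# Assumes reads were aligned to a reference that includes an "LRV1" contig;
# the file name and contig name are hypothetical.
import pysam

total = 0
lrv1_forward = 0
lrv1_reverse = 0

with pysam.AlignmentFile("smallRNA_vs_reference.bam", "rb") as bam:
    for read in bam.fetch(until_eof=True):
        if read.is_unmapped or read.is_secondary or read.is_supplementary:
            continue
        total += 1
        if read.reference_name == "LRV1":
            if read.is_reverse:
                lrv1_reverse += 1
            else:
                lrv1_forward += 1

lrv1_total = lrv1_forward + lrv1_reverse
print(f"LRV1-derived small RNAs: {100 * lrv1_total / total:.2f}% of mapped reads")
print(f"  plus strand:  {lrv1_forward}")
print(f"  minus strand: {lrv1_reverse}")
```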

    Biomechanical risk factors for lower extremity stress fracture

    Objectives: Stress fracture injuries disproportionately affect athletes and military service members, and little is known about the modifiable biomechanical risk factors associated with these injuries. The purpose of this study was to prospectively examine the association between neuromuscular and biomechanical factors upon entry to military service and the subsequent incidence of lower-extremity stress fracture injury during four years of follow-up. Methods: We analyzed data from the JUMP-ACL cohort, an existing prospective cohort study of military cadets. JUMP-ACL conducted detailed motion analysis during a jump-landing task at the initiation of each subject's military career. We limited our analyses to the class years 2009-2013 (i.e., subjects who completed baseline testing in 2005-2008). There were 1895 subjects available for analysis. Fifty-two subjects reported a history of stress fracture at baseline and were excluded from further analysis, leaving 1843 subjects. Incident lower-extremity stress fracture cases were identified through the Defense Medical Surveillance System and the Cadet Injury and Illness Tracking System during the follow-up period. The electronic medical records of each potential incident case were reviewed, and each case was confirmed by an adjudication committee consisting of two sports medicine fellowship-trained orthopaedic surgeons. The primary outcome of interest was the incidence rate of lower-extremity stress fracture during the follow-up period. The association between incident stress fracture and sagittal, frontal, and transverse plane hip and knee kinematics during the jump-landing task was examined at initial contact (IC) and at 15% (T15), 50% (T50), 85% (T85), and 100% (T100) of the stance phase. Descriptive plots of all biomechanical variables, along with 95% confidence intervals (CI), were generated for the stance phase of the jump-landing task. Univariate and multivariable Poisson regression models were used to estimate the association between baseline biomechanical factors and the incidence rate of lower-extremity stress fracture during follow-up. Results: Overall, 94 (5.1%; 95% CI: 4.14, 6.21) subjects sustained an incident stress fracture during the follow-up period. The incidence rate for stress fracture injuries among females was nearly three times that among males (IRR=2.86, 95% CI: 1.88, 4.34, p<0.001). Compared to those with greater than 5° of knee valgus, subjects with neutral or varus knee alignment experienced incidence rates for stress fracture that were 43%-53% lower at IC (IRR=0.57, 95% CI: 0.29, 1.11, p=0.10), T50 (IRR=0.47, 95% CI: 0.23, 1.00, p=0.05), and T85 (IRR=0.53, 95% CI: 0.29, 0.98, p=0.04). Subjects with greater than 5° of internal knee rotation exhibited rates for stress fracture that were 2-4 times higher at T15 (IRR=2.31, 95% CI: 1.01, 5.27, p=0.05), T50 (IRR=3.98, 95% CI: 0.99, 16.00, p=0.05), and T85 (IRR=2.31, 95% CI: 0.86, 6.23, p=0.10), when compared to those with neutral or external knee rotation alignment. Conclusion: Several potentially modifiable biomechanical factors at the time of entry into military service appear to be associated with the subsequent rate of stress fracture. It is possible that injury prevention programs targeted at these biomechanical movement patterns may reduce the risk of stress fracture injury in athletes and military service members.
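
    As an illustration of how incidence rate ratios like those above can be estimated, the sketch below fits a Poisson regression model with person-time as the exposure; all variable names and the synthetic data are hypothetical and are not drawn from the JUMP-ACL dataset:

```python
# Sketch: Poisson regression yielding incidence rate ratios (IRR), as used in the
# abstract. Data are synthetic; column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1843
df = pd.DataFrame({
    # 1 = neutral/varus knee alignment at initial contact, 0 = >5 degrees of valgus
    "neutral_or_varus": rng.integers(0, 2, size=n),
    "female": rng.integers(0, 2, size=n),
    "follow_up_years": rng.uniform(1.0, 4.0, size=n),  # person-time at risk
})
# Simulate a rare 0/1 outcome (incident stress fracture) for illustration only
base_rate = 0.02 * np.where(df["neutral_or_varus"] == 1, 0.5, 1.0)
df["stress_fracture"] = rng.poisson(base_rate * df["follow_up_years"]).clip(0, 1)

# Poisson model with person-time as exposure; exp(coefficient) is the IRR
model = smf.glm(
    "stress_fracture ~ neutral_or_varus + female",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["follow_up_years"],
).fit()

irr = np.exp(model.params)
irr_ci = np.exp(model.conf_int())
print(pd.concat([irr.rename("IRR"),
                 irr_ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```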

    Machine-learning-based calving prediction from activity, lying, and ruminating behaviors in dairy cattle

    The objective of this study was to use automated activity, lying, and rumination monitors to characterize prepartum behavior and predict calving in dairy cattle. Data were collected from 20 primiparous and 33 multiparous Holstein dairy cattle from September 2011 to May 2013 at the University of Kentucky Coldstream Dairy. The HR Tag (SCR Engineers Ltd., Netanya, Israel) automatically collected neck activity and rumination data in 2-h increments. The IceQube (IceRobotics Ltd., South Queensferry, United Kingdom) automatically collected number of steps, lying time, standing time, number of transitions from standing to lying (lying bouts), and total motion, summed in 15-min increments. IceQube data were summed in 2-h increments to match HR Tag data. All behavioral data were collected for 14 d before the predicted calving date. Retrospective data analysis was performed using mixed linear models to examine behavioral changes by day in the 14 d before calving. Bihourly behavioral differences from baseline values over the 14 d before calving were also evaluated using mixed linear models. Changes in daily rumination time, total motion, lying time, and lying bouts occurred in the 14 d before calving. In the bihourly analysis, extreme values for all behaviors occurred in the final 24 h, indicating that the monitored behaviors may be useful in calving prediction. To determine whether the technologies were useful for predicting calving, random forest, linear discriminant analysis, and neural network machine-learning models were constructed and implemented using R version 3.1.0 (R Foundation for Statistical Computing, Vienna, Austria). These methods were used on variables from each technology and all combined variables from both technologies. A neural network analysis that combined variables from both technologies at the daily level yielded 100.0% sensitivity and 86.8% specificity. A neural network analysis that combined variables from both technologies in bihourly increments was used to identify 2-h periods in the 8 h before calving with 82.8% sensitivity and 80.4% specificity. Changes in behavior and machine-learning alerts indicate that commercially marketed behavioral monitors may have calving prediction potential.
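
    The study built its classifiers in R 3.1.0; as a rough illustration of the evaluation it describes (train a neural network on behavioral features, then report sensitivity and specificity), here is a minimal Python/scikit-learn sketch on synthetic data, with entirely hypothetical feature values:

```python
# Sketch: train a small neural network to flag "calving within 24 h" from daily
# behavior summaries, then report sensitivity and specificity. Synthetic data only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
n = 600
# Hypothetical daily features: rumination time, lying time, lying bouts, total motion
X = rng.normal(size=(n, 4))
# Label: 1 = calving within the next 24 h; made weakly dependent on the features
y = (X[:, 0] - X[:, 1] + rng.normal(scale=1.5, size=n) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
clf.fit(X_train, y_train)

tn, fp, fn, tp = confusion_matrix(y_test, clf.predict(X_test)).ravel()
print(f"sensitivity = {tp / (tp + fn):.3f}")  # true-positive rate
print(f"specificity = {tn / (tn + fp):.3f}")  # true-negative rate
```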

    Importance of polyphosphate in the <i>Leishmania</i> life cycle.

    Protozoan parasites contain negatively charged polymers of a few up to several hundred phosphate residues. In other organisms, these polyphosphate (polyP) chains serve as an energy source and phosphate reservoir, and have been implicated in adaptation to stress and virulence of pathogenic organisms. In this study, we first confirmed that the polyP polymerase vacuolar transporter chaperone 4 (<i>VTC4</i>) is responsible for polyP synthesis in <i>Leishmania</i> parasites. During <i>Leishmania</i> <i>in vitro</i> culture, polyP accumulates in the logarithmic growth phase and is subsequently consumed once stationary phase is reached. However, polyP is not essential, since VTC4-deficient (<i>vtc4<sup>-</sup></i>) <i>Leishmania</i> proliferated normally in culture and differentiated into infective metacyclic parasites and into intracellular and axenic amastigotes. In <i>in vivo</i> mouse infections, the <i>L. major</i> <i>VTC4</i> knockout showed a delay in lesion formation but ultimately gave rise to strong pathology, although we were unable to restore virulence by complementation to confirm this phenotype. Knockdown of <i>VTC4</i> did not alter the course of <i>L. guyanensis</i> infections in mice, suggesting that polyP is not required for infection, or that very low levels of it suffice for lesion development. At higher temperatures, <i>Leishmania</i> promastigotes consumed polyP extensively, and both knockdown and deletion of <i>VTC4</i> diminished parasite survival. Thus, although polyP is not essential in the life cycle of the parasite, our data suggest a role for polyP in increasing parasite survival at higher temperatures, a situation faced by the parasite when transmitted to humans.

    Landing Error Scoring System (LESS) Items are Associated with the Incidence Rate of Lower Extremity Stress Fracture

    Objectives: Lower-extremity stress fracture injuries are a major cause of morbidity in physically active populations. The ability to efficiently screen for modifiable risk factors associated with injury is critical in developing and implementing effective injury prevention programs. The purpose of this study was to determine if baseline Landing Error Scoring System (LESS) scores were associated with the incidence rate of lower-extremity stress fracture during four years of follow-up. Methods: To accomplish this objective, we conducted a prospective cohort study at a US Service Academy. A total of 1772 eligible subjects with complete baseline data and no history of lower-extremity stress fracture were included in this study. At baseline, we conducted motion analysis during a jump-landing task using the LESS. Incident lower-extremity stress fracture cases were identified during the four-year follow-up period using the injury surveillance systems at our institution. The primary outcome of interest was the incidence rate of lower-extremity stress fracture during follow-up. The electronic medical records of each potential incident case were reviewed, and case status was determined by an adjudication committee consisting of two sports medicine fellowship-trained orthopaedic surgeons who were blinded to baseline LESS data. The association between baseline LESS scores and the incidence rate of lower-extremity stress fracture was examined for total LESS score and for each individual LESS item. Univariate and multivariable Poisson regression models were used to estimate the association between baseline LESS scores and the incidence rate of lower-extremity stress fracture during follow-up. Results: During the follow-up period, 94 incident lower-extremity stress fractures were documented in the study cohort, and the cumulative incidence of stress fracture was 5.3% (95% CI: 4.3%, 6.5%). In univariate analyses, the total LESS score at baseline was associated with the incidence rate of lower-extremity stress fracture during follow-up. For every additional movement error documented at baseline there was a 15% increase in the incidence rate of lower-extremity stress fracture during follow-up (IRR=1.15; 95% CI: 1.02, 1.31, p=0.025). Based on univariate analyses, several individual LESS items at baseline were also associated with the incidence rate of stress fracture during follow-up: ankle flexion at initial contact (p=0.055), stance width at initial contact (p=0.026), asymmetrical landing at initial contact (p=0.003), trunk flexion at initial contact (p=0.036), and overall impression (p=0.021). In multivariable analyses controlling for sex and year of entry into the cohort, subjects who consistently landed flat-footed or heel-to-toe were 2.33 times (IRR=2.33; 95% CI: 1.36, 3.97, p=0.002) more likely to sustain a lower-extremity stress fracture during follow-up. Similarly, subjects who consistently demonstrated asymmetric landing at initial contact were 2.53 times (IRR=2.53; 95% CI: 1.34, 4.74, p=0.004) more likely to sustain a stress fracture during follow-up. Conclusion: These data suggest that specific LESS items may be predictive of lower-extremity stress fracture risk and may be helpful in injury screening and prevention.
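
    The reported cumulative incidence and its confidence interval follow directly from the raw counts; a short sketch is below (the choice of a Clopper-Pearson interval is an assumption, since the abstract does not state which interval method was used):

```python
# Sketch: cumulative incidence of stress fracture with a 95% CI from raw counts.
# 94 incident cases among 1772 subjects, as reported in the abstract.
from statsmodels.stats.proportion import proportion_confint

cases, n = 94, 1772
incidence = cases / n
low, high = proportion_confint(cases, n, alpha=0.05, method="beta")  # Clopper-Pearson
print(f"cumulative incidence = {100 * incidence:.1f}% "
      f"(95% CI: {100 * low:.1f}%, {100 * high:.1f}%)")
# Prints roughly 5.3% (95% CI: ~4.3%, ~6.5%), matching the reported values.
```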

    Association Between Landing Error Scoring System (LESS) Items and the Incidence Rate of Lower Extremity Stress Fracture

    Background: Lower extremity stress fracture injuries are a major cause of morbidity in physically active populations. The ability to screen for modifiable risk factors associated with injury is critical in developing injury-prevention programs. Purpose: To determine if baseline Landing Error Scoring System (LESS) scores are associated with the incidence rate of lower extremity stress fracture. Study Design: Cohort study; Level of evidence, 2. Methods: A total of 1772 participants with no history of lower extremity stress fracture were included. At preinjury baseline, the authors conducted a lower extremity movement assessment during a jump-landing task using the LESS. Incident lower extremity stress fractures were identified during a 4-year follow-up period. Potential incident cases were reviewed by 2 sports medicine fellowship-trained orthopaedic surgeons blinded to baseline LESS data. Univariate and multivariable Poisson regression models were used to estimate the association between baseline total LESS scores, individual LESS items, and the incidence rate ratio (IRR) of lower extremity stress fracture. Results: A total of 94 incident lower extremity stress fractures were documented, for a 5.3% (95% CI, 4.3%-6.5%) cumulative incidence. The overall LESS score was associated with the incidence rate of lower extremity stress fracture. For every additional movement error documented at baseline, there was a 15% increase in the incidence rate of lower extremity stress fracture (IRR, 1.15 [95% CI, 1.02-1.31]; P = .025). In univariate analyses, ankle flexion, stance width, asymmetrical landing, and trunk flexion at initial contact, in addition to overall impression, were associated with the incidence rate of stress fracture. After controlling for sex and year of entry into the study cohort, participants who consistently landed flat-footed or heel-to-toe were 2.33 times (95% CI, 1.36-3.97; P = .002) more likely to sustain a lower extremity stress fracture. Similarly, participants who consistently demonstrated asymmetric landing at initial contact were 2.53 times (95% CI, 1.34-4.74; P = .004) more likely to sustain a stress fracture. Conclusion: Components of the LESS may be associated with increased lower extremity stress fracture risk and may be helpful in efficiently assessing high-risk lower extremity biomechanics in large groups.
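
    Because the 15% increase is per additional movement error, differences of several LESS errors compound multiplicatively; a brief illustration of that arithmetic (the error counts used here are made up):

```python
# Sketch: turning a per-error incidence rate ratio into a rate multiplier.
# IRR = 1.15 per additional LESS error, as reported above; example counts are hypothetical.
irr_per_error = 1.15

def rate_multiplier(extra_errors: int, irr: float = irr_per_error) -> float:
    """Multiplicative change in the stress fracture incidence rate for a
    participant with `extra_errors` more LESS errors than a comparator."""
    return irr ** extra_errors

# e.g., a participant scoring 8 errors vs. one scoring 4 errors (hypothetical values)
print(f"4 extra errors -> rate x {rate_multiplier(4):.2f}")  # ~1.75
print(f"6 extra errors -> rate x {rate_multiplier(6):.2f}")  # ~2.31
```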

    Origins of the Ambient Solar Wind: Implications for Space Weather

    The Sun's outer atmosphere is heated to temperatures of millions of degrees, and solar plasma flows out into interplanetary space at supersonic speeds. This paper reviews our current understanding of these interrelated problems: coronal heating and the acceleration of the ambient solar wind. We also discuss where the community stands in its ability to forecast how variations in the solar wind (i.e., fast and slow wind streams) impact the Earth. Although the last few decades have seen significant progress in observations and modeling, we still do not have a complete understanding of the relevant physical processes, nor do we have a quantitatively precise census of which coronal structures contribute to specific types of solar wind. Fast streams are known to be connected to the central regions of large coronal holes. Slow streams, however, appear to come from a wide range of sources, including streamers, pseudostreamers, coronal loops, active regions, and coronal hole boundaries. Complicating our understanding even more is the fact that processes such as turbulence, stream-stream interactions, and Coulomb collisions can make it difficult to unambiguously map a parcel measured at 1 AU back down to its coronal source. We also review recent progress in theoretical modeling, observational data analysis, and forecasting techniques that sit at the interface between data and theory, all of which gives us hope that the above problems are indeed solvable. Comment: Accepted for publication in Space Science Reviews, special issue connected with a 2016 ISSI workshop on "The Scientific Foundations of Space Weather." 44 pages, 9 figures.
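
    As context for the mapping problem described above, a first-order approach is ballistic back-mapping: assume a constant radial wind speed and rotate the observed longitude back by the angle the Sun turns during the parcel's transit. The sketch below is a deliberately simplified illustration of that estimate, not the full procedure used in practice:

```python
# Sketch: first-order ballistic back-mapping of a solar wind parcel observed at 1 AU.
# Assumes a constant radial speed and rigid solar rotation; real mappings must also
# handle stream interactions, acceleration close to the Sun, measurement gaps, etc.
AU_KM = 1.495978707e8          # 1 AU in km
SIDEREAL_ROT_DAYS = 25.38      # sidereal (Carrington) solar rotation period

def backmapped_longitude_shift(v_sw_km_s: float, r_km: float = AU_KM) -> float:
    """Degrees of solar rotation during the parcel's transit from the Sun to r_km.

    The estimated source longitude is the observed longitude plus this shift,
    since the Sun rotates beneath the parcel while it travels outward.
    """
    transit_days = (r_km / v_sw_km_s) / 86400.0
    omega_deg_per_day = 360.0 / SIDEREAL_ROT_DAYS
    return omega_deg_per_day * transit_days

for v in (750.0, 400.0):  # representative fast and slow wind speeds in km/s
    print(f"v = {v:5.0f} km/s -> back-map shift ~ {backmapped_longitude_shift(v):5.1f} deg")
```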

    The 22-Year Hale Cycle in cosmic ray flux: evidence for direct heliospheric modulation

    The ability to predict times of greater galactic cosmic ray (GCR) fluxes is important for reducing the hazards caused by these particles to satellite communications, aviation, or astronauts. The 11-year solar-cycle variation in cosmic rays is highly correlated with the strength of the heliospheric magnetic field. Differences in GCR flux during alternate solar cycles yield a 22-year cycle, known as the Hale Cycle, which is thought to be due to different particle drift patterns when the northern solar pole has predominantly positive (denoted a qA>0 cycle) or negative (qA<0) polarity. However, the GCR flux is more sharply peaked for qA<0 than for qA>0 solar cycles, and we find that the heliospheric magnetic field itself differs systematically between the two polarities at the times when the difference in GCR flux is most apparent. This suggests that particle drifts may not be the sole mechanism responsible for the Hale Cycle in GCR flux at Earth. However, we also demonstrate that these polarity-dependent heliospheric differences are evident during the space age but are much less clear in earlier data: using geomagnetic reconstructions, we show that for the period of 1905-1965, alternate polarities do not give as significant a difference during the declining phase of the solar cycle. Thus we suggest that the 22-year cycle in cosmic-ray flux is at least partly the result of direct modulation by the heliospheric magnetic field, and that this effect may be primarily limited to the grand solar maximum of the space age.
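
    As a toy illustration of the polarity comparison discussed above, the sketch below splits a GCR record by the sign of qA and compares group means; the data are synthetic placeholders, and a real analysis would use neutron monitor counts restricted to comparable phases of the solar cycle:

```python
# Sketch: compare GCR counts between qA>0 and qA<0 polarity states.
# The data here are synthetic placeholders; a real analysis would use a neutron
# monitor record annotated with the solar magnetic polarity (sign of qA).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 240  # e.g., 20 years of monthly values
df = pd.DataFrame({
    "qA": np.where(np.arange(n) % 264 < 132, 1, -1),  # crude 22-year alternation
    "gcr_counts": rng.normal(loc=6000, scale=150, size=n),
})

summary = df.groupby("qA")["gcr_counts"].agg(["mean", "std", "count"])
print(summary)

# Difference of means as a crude measure of the polarity-dependent (Hale) modulation
diff = summary.loc[1, "mean"] - summary.loc[-1, "mean"]
print(f"mean GCR difference (qA>0 minus qA<0): {diff:.1f} counts")
```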