Coevolution in a One Predator–Two Prey System
Background: Our understanding of coevolution in predator–prey systems is based mostly on pairwise interactions. Methodology and Principal Findings: Here I analyze a one-predator–two-prey system in which the predator's attack ability and the defense abilities of both prey evolve. The coevolutionary consequences can differ dramatically depending on the initial trait values and the timing of the alternative prey's invasion into the original system. If the invading prey species has relatively low defense ability when it invades, its defense is likely to evolve to a lower level, stabilizing the population dynamics. In contrast, if its defense ability at invasion is close to that of the resident prey, its defense can evolve to a higher level while that of the resident prey may suddenly cease to evolve, destabilizing the population dynamics. Destabilization due to invasion is likely when the invading prey is adaptively superior (evolution of its defense is less constrained and faster), but it can also occur under a broad range of conditions even when the invading prey is adaptively inferior. In addition, invasion into a resident system far from equilibrium, characterized by population oscillations, is likely to cause further destabilization.
The intestinal expulsion of the roundworm Ascaris suum is associated with eosinophils, intra-epithelial T cells and decreased intestinal transit time
Ascaris lumbricoides remains the most common endoparasite in humans, yet there is still very little information available about the immunological principles of protection, especially those directed against larval stages. Due to the natural host–parasite relationship, pigs infected with A. suum make an excellent model to study the mechanisms of protection against this nematode. In pigs, a self-cure reaction eliminates most larvae from the small intestine between 14 and 21 days post infection. In this study, we investigated the mucosal immune response leading to the expulsion of A. suum and the contribution of the hepato-tracheal migration. Self-cure was independent of previous passage through the liver or lungs, as infection with lung-stage larvae did not impair self-cure. When animals were infected with 14-day-old intestinal larvae, the larvae were driven distally in the small intestine around 7 days post infection, but by 18 days post infection they re-inhabited the proximal part of the small intestine, indicating that more developed larvae can counter the expulsion mechanism. Self-cure was consistently associated with eosinophilia and intra-epithelial T cells in the jejunum. Furthermore, we identified increased gut movement as a possible mechanism of self-cure, as the small intestinal transit time was markedly decreased at the time of expulsion of the worms. Taken together, these results shed new light on the mechanisms of self-cure that occur during A. suum infections.
Advantages and Limitations of Commercially Available Electrocuting Grids for Studying Mosquito Behaviour.
Mosquito feeding behaviour plays a major role in determining malaria transmission intensity and the impact of specific prevention measures. Human Landing Catch (HLC) is currently the only method that can directly and consistently measure the biting rates of anthropophagic mosquitoes, both indoors and outdoors. However, this method exposes the participant to mosquito-borne pathogens, so new exposure-free methods are needed to replace it. Commercially available electrocuting grids (EGs) were evaluated as an alternative to HLC using a Latin square experimental design in Dar es Salaam, Tanzania. Both HLC and EGs were used to estimate the proportion of human exposure to mosquitoes occurring indoors (πi), as well as its two underlying parameters: the proportion of mosquitoes caught indoors (Pi) and the proportion of mosquitoes caught between the first and last hours when most people are indoors (Pfl). The HLC and EG methods accounted for 69% and 31%, respectively, of the total number of female mosquitoes caught, and both methods caught more mosquitoes outdoors than indoors. Results from the gold-standard HLC suggest that An. gambiae s.s. in Dar es Salaam is neither exophagic nor endophagic (Pi ≈ 0.5), whereas An. arabiensis is exophagic (Pi ≪ 0.5). Both species prefer to feed after 10 pm, when most people are indoors (Pfl ≫ 0.5). EGs yielded estimates of Pi for An. gambiae s.s., An. arabiensis and An. coustani that were approximately equivalent to those with HLC, but significantly underestimated Pfl for An. gambiae s.s. and An. coustani. The relative sampling sensitivity of EGs declined over the course of the night (p ≤ 0.001) for all mosquito taxa except An. arabiensis. Commercial EGs sample human-seeking mosquitoes with high sensitivity both indoors and outdoors and accurately measure the propensity of Anopheles malaria vectors to bite indoors rather than outdoors. However, further modifications are needed to stabilize sampling sensitivity over a full nocturnal cycle so that they can be used to survey patterns of human exposure to mosquitoes.
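The two behavioural indices defined in the abstract can be computed directly from trap counts. The sketch below is illustrative only: the counts, the assumed indoor hours, and the function names are hypothetical, not the study's data or its published estimator.

```python
def proportion_indoors(indoor_total, outdoor_total):
    """Pi: proportion of mosquitoes caught indoors."""
    return indoor_total / (indoor_total + outdoor_total)

def proportion_when_indoors(catches_by_hour, indoor_hours):
    """Pfl: proportion of catches during hours when most people are indoors."""
    total = sum(catches_by_hour.values())
    during = sum(c for h, c in catches_by_hour.items() if h in indoor_hours)
    return during / total

# Hypothetical counts for one trap-night (hour of day -> mosquitoes caught)
hourly = {18: 5, 19: 8, 20: 12, 21: 15, 22: 30, 23: 25, 0: 20, 1: 10}
indoor_hours = {22, 23, 0, 1}          # assume people are indoors 10 pm-2 am
Pi = proportion_indoors(40, 85)        # 40 caught indoors, 85 outdoors
Pfl = proportion_when_indoors(hourly, indoor_hours)
print(round(Pi, 2), round(Pfl, 2))     # -> 0.32 0.68
```

Under these made-up counts, Pi ≈ 0.32 (exophagic, like the An. arabiensis result above) while Pfl ≈ 0.68 (mostly biting during indoor hours).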
Efficacy of tranexamic acid in reducing blood loss in posterior lumbar spine surgery for degenerative spinal stenosis with instability: a retrospective case control study
Background: Degenerative spinal stenosis and instability requiring multilevel spine surgery has been associated with large blood losses. Factors that affect perioperative blood loss include duration of surgery, surgical procedure, patient height, combined anterior/posterior approaches, number of levels fused, blood salvage techniques, and the use of anti-fibrinolytic medications. This study was done to evaluate the efficacy of tranexamic acid in reducing blood loss in spine surgery. Methods: This retrospective case-control study includes 97 patients who underwent surgery for degenerative lumbar spinal stenosis and instability. All operations included spinal decompression, interbody fusion and posterior instrumentation (4–5 segments). Forty-six patients received 1 g tranexamic acid intravenously, preoperatively and six and twelve hours postoperatively; 51 patients without tranexamic acid administration were evaluated as a control group. Based on the records, the intra- and postoperative blood losses were measured by evaluating the drainage and cell-saver systems 6, 12 and 24 hours post operation. Additionally, hemoglobin concentration and platelet concentration were reviewed. Furthermore, the number of red cell transfusions given and complications associated with tranexamic acid were assessed. Results: The postoperative hemoglobin concentration demonstrated a statistically significant difference (p = 0.0130) in favour of tranexamic acid use (tranexamic acid group: 11.08 g/dl, SD 1.68; control group: 10.29 g/dl, SD 1.39). The intraoperative cell-saver volume and the drainage volume after 24 hours also demonstrated a significant difference, indicating less blood loss in the tranexamic acid group than in the control group. The postoperative drainage volume at 12 hours showed no significant difference, nor did the platelet concentration. Allogeneic blood transfusion (two red cell units) was needed for eight patients in the tranexamic acid group and nine in the control group because of postoperative anemia. Complications associated with the administration of tranexamic acid, e.g. renal failure, deep vein thrombosis or pulmonary embolism, did not occur. Conclusions: This study suggests reduced blood loss when administering tranexamic acid in posterior lumbar spine surgery, as demonstrated by the higher postoperative hemoglobin concentration and the lower measured blood loss. However, given the relatively small volume of blood loss in the patients of this study, it is underpowered to show a difference in transfusion rates.
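The hemoglobin comparison reported in this abstract can be reproduced up to the test statistic from the group summaries alone. A minimal sketch of Welch's t-statistic using the reported means, SDs and group sizes; the p-value step, which requires the t-distribution CDF, is omitted, and Welch's test is an assumption here (the abstract does not name the test used).

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t-statistic for two independent samples from summary stats."""
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Group summaries from the study: postoperative hemoglobin (g/dl)
# tranexamic acid group: mean 11.08, SD 1.68, n = 46
# control group:         mean 10.29, SD 1.39, n = 51
t = welch_t(11.08, 1.68, 46, 10.29, 1.39, 51)
print(round(t, 2))  # -> 2.51
```

A t-statistic around 2.5 with roughly 90 degrees of freedom is consistent with the significant p-value the study reports.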
Phase III randomised trial of doxorubicin-based chemotherapy compared with platinum-based chemotherapy in small-cell lung cancer
This randomised trial compared platinum-based with anthracycline-based chemotherapy in patients with small-cell lung cancer (limited or extensive stage) and ⩽2 adverse prognostic factors. Patients were randomised to receive six cycles of either ACE (doxorubicin 50 mg/m2 i.v., cyclophosphamide 1 g/m2 i.v. and etoposide 120 mg/m2 i.v. on day 1, then etoposide 240 mg/m2 orally for 2 days) or PE (cisplatin 80 mg/m2 and etoposide 120 mg/m2 i.v. on day 1, then etoposide 240 mg/m2 orally for 2 days), given every 3 weeks. For patients for whom cisplatin was not suitable, carboplatin (AUC6) was substituted. A total of 280 patients were included (139 ACE, 141 PE). The response rates were 72% for ACE and 77% for PE. One-year survival rates were 34 and 38% (P=0.497), respectively, and 2-year survival was the same (12%) for both arms. For LD patients, the median survival was 10.9 months for ACE and 12.6 months for PE (P=0.51); for ED patients, median survival was 8.3 months and 7.5 months, respectively. More grade 3 and 4 neutropenia (90 vs 57%, P<0.005) and grade 3 and 4 infections (73 vs 29%, P<0.005) occurred with ACE, resulting in more days of hospitalisation and greater i.v. antibiotic use. ACE was associated with a higher risk of neutropenic sepsis than PE and with a trend towards worse outcome in patients with LD, and should not be studied further in this group of patients.
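The per-m2 doses in the regimens above translate into absolute doses via body surface area. A minimal sketch, assuming the Mosteller BSA formula and a hypothetical 170 cm, 70 kg patient; neither the formula choice nor the patient comes from the trial.

```python
import math

def bsa_mosteller(height_cm, weight_kg):
    """Body surface area (m^2) by the Mosteller formula."""
    return math.sqrt(height_cm * weight_kg / 3600)

def absolute_dose(dose_per_m2, bsa):
    """Convert a per-m^2 protocol dose to an absolute dose (mg)."""
    return dose_per_m2 * bsa

# Hypothetical patient: 170 cm, 70 kg -> BSA ~1.82 m^2
bsa = bsa_mosteller(170, 70)
print(round(absolute_dose(80, bsa)))    # cisplatin 80 mg/m2, day 1 -> 145
print(round(absolute_dose(120, bsa)))   # etoposide 120 mg/m2 i.v., day 1 -> 218
```

The same conversion applies to each per-m2 component of the ACE and PE regimens; only carboplatin is dosed differently (by AUC, via renal function rather than BSA).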
Evolution of Competitive Ability: An Adaptation Speed vs. Accuracy Tradeoff Rooted in Gene Network Size
Ecologists have increasingly come to understand that evolutionary change on short time-scales can alter ecological dynamics (and vice versa), and this idea is being incorporated into community ecology research programs. Previous research has suggested that the size and topology of the gene network underlying a quantitative trait should constrain or facilitate adaptation and thereby alter population dynamics. Here, I consider a scenario in which two species with different genetic architectures compete and evolve in fluctuating environments. An important trade-off emerges between adaptive accuracy and adaptive speed, driven by the size of the gene network underlying the ecologically critical trait and the rate of environmental change. Smaller, scale-free networks confer a competitive advantage in rapidly changing environments, but larger networks permit increased adaptive accuracy when environmental change is sufficiently slow to allow a species time to adapt. As the differences in network characteristics increase, the time-to-resolution of competition decreases. These results augment and refine previous conclusions about the ecological implications of the genetic architecture of quantitative traits, emphasizing a role of adaptive accuracy. Along with previous work, in particular that considering the role of gene network connectivity, these results provide a set of expectations for what we may observe as the field of ecological genomics develops.
Survival of patients treated with intra-aortic balloon counterpulsation at a tertiary care center in Pakistan – patient characteristics and predictors of in-hospital mortality
BACKGROUND: Intra-aortic balloon counterpulsation (IABC) has an established role in the treatment of patients presenting with critical cardiac illnesses, including cardiogenic shock, refractory ischemia, and for prophylaxis and treatment of complications of percutaneous coronary interventions (PCI). Patients requiring IABC represent a high-risk subset with an expected high mortality. There are virtually no data on usage patterns or outcomes of patients in the Indo-Pakistan subcontinent who require IABC. This is the first report on a sizeable experience with IABC from Pakistan. METHODS: Hospital charts of 95 patients (mean age 58.8 (± 10.4) years; 78.9% male) undergoing IABC between 2000 and 2002 were reviewed. Logistic regression was used to determine univariate and multivariate predictors of in-hospital mortality. RESULTS: The most frequent indications for IABC were cardiogenic shock (48.4%) and refractory ischemia (24.2%). Revascularization (surgical or PCI) was performed in 74 patients (77.9%). The overall in-hospital mortality rate was 34.7%. Univariate predictors of in-hospital mortality included (odds ratio [95% CI]) age (OR 1.06 [1.01–1.11] for every year increase in age), diabetes (OR 3.68 [1.51–8.92]) and cardiogenic shock at presentation (OR 4.85 [1.92–12.2]). Furthermore, prior CABG (OR 0.12 [0.04–0.34]) and in-hospital revascularization (OR 0.05 [0.01–0.189]) were protective against mortality. In the multivariate analysis, independent predictors of in-hospital mortality were age (OR 1.13 [1.05–1.22] for every year increase in age), diabetes (OR 6.35 [1.61–24.97]) and cardiogenic shock at presentation (OR 10.0 [2.33–42.95]). Again, revascularization during hospitalization (OR 0.02 [0.003–0.12]) conferred a protective effect. The overall complication rate was low (8.5%). CONCLUSIONS: Patients requiring IABC represent a high-risk group with substantial in-hospital mortality. Despite this, over two-thirds of patients leave the hospital alive, suggesting that IABC is a feasible therapeutic device, even in a developing country.
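Odds ratios like those reported above come from exponentiating logistic-regression coefficients, with confidence limits obtained the same way from the coefficient's standard error. A minimal sketch using a hypothetical coefficient and standard error, not values taken from the study.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for a binary predictor: beta = 1.30, SE = 0.35
or_est, lo, hi = odds_ratio_ci(1.30, 0.35)
print(round(or_est, 2), round(lo, 2), round(hi, 2))  # -> 3.67 1.85 7.29
```

Because the interval is built on the log-odds scale and then exponentiated, it is asymmetric around the point estimate, as in the intervals reported above.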
Charged-Higgs phenomenology in the Aligned two-Higgs-doublet model
The alignment in flavour space of the Yukawa matrices of a general two-Higgs-doublet model results in the absence of tree-level flavour-changing neutral currents. In addition to the usual fermion masses and mixings, the aligned Yukawa structure only contains three complex parameters, which are potential new sources of CP violation. For particular values of these three parameters, all known specific implementations of the model based on discrete Z_2 symmetries are recovered. One of the most distinctive features of the two-Higgs-doublet model is the presence of a charged scalar. In this work, we discuss its main phenomenological consequences in flavour-changing processes at low energies and derive the corresponding constraints on the parameters of the aligned two-Higgs-doublet model.
Comment: 46 pages, 19 figures. Version accepted for publication in JHEP. References added. Discussion slightly extended. Conclusions unchanged.
Impact of progressive global warming on the global-scale yield of maize and soybean
Global surface temperature is projected to warm over the coming decades, with regional differences expected in temperature change, rainfall and the frequency of extreme events. Temperature is a major determinant of crop growth and development, affecting planting date, growing-season length and yield. We investigated the effects of increments of mean global temperature warming from 0.5 °C to 4 °C on soybean and maize development and yield, both globally and for the main producing countries, and simulated adaptation through changing planting date and variety. Increasing temperature resulted in reduced growing-season lengths and ultimately reduced yields for both crops. The global yield for maize decreased as temperature increased, although the severity of the decrease was dependent on geographic region. Small temperature increases of 0.5 °C had no effect on soybean yield, although yield decreased as temperature increased further. These negative effects, however, were partly compensated for by the implementation of adaptation strategies, including planting earlier in the season and changing variety. The degree of compensation was dependent on geographical area and crop: maize adaptation delayed the negative effects of temperature on yield, whereas soybean adaptation increased yield in China, India and Korea DPR and delayed the effects in the remaining countries. The results of this paper indicate the degree to which farmer-controlled adaptation strategies can alleviate the negative impacts of increasing temperature on two major crop species.
Genetic factors influencing drug-induced liver injury: do they have a role in prevention and diagnosis?