Numerical comparison of mathematical and computational models for the simulation of stochastic neutron kinetics problems
This paper concerns numerical comparisons between five mathematical models capable of modelling the stochastic behaviour of neutrons in low extraneous (extrinsic or fixed) neutron source applications. These models include analog Monte-Carlo (AMC), forward probability balance equations (FPB), the generating function form of the forward probability balance equations (FGF), the generating function form of the backward probability balance equations (Pál-Bell), and an Itô calculus model using both explicit and implicit Euler-Maruyama discretization schemes. Results such as the survival probability, extinction probability, neutron population mean and standard deviation, and neutron population cumulative distribution function have all been compared. The least computationally demanding mathematical model has been found to be the Pál-Bell equations, which on average take four orders of magnitude less time to compute than the other methods in this study. The accuracy of the AMC and FPB models has been found to be strongly linked to the computational efficiency of the models, and that efficiency decreases significantly as the maximum allowable neutron population is approached. The Itô calculus methods, utilising explicit and implicit Euler-Maruyama discretization schemes, have been found to be unsuitable for modelling very low neutron populations. However, improved results, using the Itô calculus methods, have been achieved for systems containing a greater number of neutrons.
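The explicit Euler-Maruyama scheme mentioned above can be illustrated with a minimal sketch. This is not the paper's actual Itô model; it assumes an illustrative population SDE of the form dN = αN dt + √(βN) dW, with hypothetical parameter values, purely to show how the discretization steps forward in time:

```python
import numpy as np

def euler_maruyama(n0, alpha, beta, dt, steps, rng):
    """Explicit Euler-Maruyama for an illustrative population SDE
    dN = alpha*N dt + sqrt(beta*N) dW (not the paper's exact model).
    The population is clipped at zero, since sqrt(beta*N) is only
    defined for non-negative populations."""
    n = np.empty(steps + 1)
    n[0] = n0
    for k in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt))          # Wiener increment ~ N(0, dt)
        drift = alpha * n[k]                        # deterministic kinetics term
        diffusion = np.sqrt(beta * max(n[k], 0.0))  # state-dependent noise amplitude
        n[k + 1] = max(n[k] + drift * dt + diffusion * dw, 0.0)
    return n

rng = np.random.default_rng(0)
path = euler_maruyama(n0=100.0, alpha=-0.1, beta=0.5, dt=0.01, steps=1000, rng=rng)
```

The state-dependent √N noise term hints at why such schemes struggle at very low populations: the relative fluctuation per step grows as N shrinks, and a continuous-state approximation of an inherently discrete population breaks down.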
The Mannose Receptor Mediates Dengue Virus Infection of Macrophages
Macrophages (MØ) and mononuclear phagocytes are major targets of infection by dengue virus (DV), a mosquito-borne flavivirus that can cause haemorrhagic fever in humans. To our knowledge, we show for the first time that the MØ mannose receptor (MR) binds to all four serotypes of DV and specifically to the envelope glycoprotein. Glycan analysis, ELISA, and blot overlay assays demonstrate that MR binds via its carbohydrate recognition domains to mosquito and human cell–produced DV antigen. This binding is abrogated by deglycosylation of the DV envelope glycoprotein. Surface expression of recombinant MR on NIH3T3 cells confers DV binding. Furthermore, DV infection of primary human MØ can be blocked by anti-MR antibodies. MR is a prototypic marker of alternatively activated MØ, and pre-treatment of human monocytes or MØ with type 2 cytokines (IL-4 or IL-13) enhances their susceptibility to productive DV infection. Our findings indicate a new functional role for the MR in DV infection.
The liver transplant waiting list—a single-center analysis
At this transplant center, 1340 patients were entered on the liver transplant waiting list during the first 25 months (October 1987 to November 1989) after the initiation of the UNOS allocation system for liver grafts. Of these, 972 (72.5%) received a graft, 120 (9.0%) died waiting for a graft, 109 (8.1%) remained on the active list as of the study endpoint of December 15, 1989, 123 (9.2%) were withdrawn from candidacy, and 16 (1.2%) received a transplant at another center. A total of 1201 patients were candidates for a first graft. Of the 812 primary candidates who received a graft, 64.8% received their graft within one month of entry on the waiting list. Of the 109 primary candidates who died before a graft could be found, 79.0% died within a month of entry onto the waiting list. At time of transplantation, 135 (16.6%) primary recipients of a graft were UNOS class 1, 326 (40.1%) were UNOS class 2, 190 (23.4%) were UNOS class 3, and 161 (19.8%) were UNOS class 4. Actuarial survival rates (percentage) at 6 months for recipients in UNOS class 1, class 2, class 3, and class 4 were 88.7±2.9, 82.6±2.1, 78.4±3.2, and 68.4±3.9, respectively (P<0.001). At the time of death of recipients who failed to get a graft, 6 (5.5%) were UNOS class 1, 14 (12.8%) were UNOS class 2, 23 (21.1%) were UNOS class 3, and 66 (60.6%) were UNOS class 4. These results indicate that a high proportion of liver transplant candidates are in urgent need of a graft and that the UNOS system succeeds in giving these patients high priority. However, patient mortality on the waiting list and after transplantation would lessen significantly if more patients with end-stage liver disease were referred to the transplant center in a timely manner, before their condition reaches the point where the probability of survival is diminished. © 1991 by Williams & Wilkins
Gestational age at delivery and special educational need: retrospective cohort study of 407,503 schoolchildren
Background: Previous studies have demonstrated an association between preterm delivery and increased risk of special educational need (SEN). The aim of our study was to examine the risk of SEN across the full range of gestation.
Methods and Findings: We conducted a population-based, retrospective study by linking school census data on the 407,503 eligible school-aged children resident in 19 Scottish Local Authority areas (total population 3.8 million) to their routine birth data. SEN was recorded in 17,784 (4.9%) children: 1,565 (8.4%) of those born preterm and 16,219 (4.7%) of those born at term. The risk of SEN increased across the whole range of gestation from 40 to 24 wk: 37–39 wk adjusted odds ratio (OR) 1.16, 95% confidence interval (CI) 1.12–1.20; 33–36 wk adjusted OR 1.53, 95% CI 1.43–1.63; 28–32 wk adjusted OR 2.66, 95% CI 2.38–2.97; 24–27 wk adjusted OR 6.92, 95% CI 5.58–8.58. There was no interaction with elective versus spontaneous delivery. Overall, gestation at delivery accounted for 10% of the adjusted population attributable fraction of SEN. Because of their high frequency, early term deliveries (37–39 wk) accounted for 5.5% of cases of SEN, compared with preterm deliveries (<37 wk), which accounted for only 3.6% of cases.
Conclusions: Gestation at delivery had a strong, dose-dependent relationship with SEN that was apparent across the whole range of gestation. Because early term delivery is more common than preterm delivery, the former accounts for a higher percentage of SEN cases. Our findings have important implications for clinical practice in relation to the timing of elective delivery.
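The odds ratios above come from adjusted regression models, but the basic unadjusted calculation can be sketched from a 2×2 table. This is a generic illustration with hypothetical counts, not the study's data or its covariate-adjusted analysis; it uses the standard Wald confidence interval on the log odds ratio:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the summed reciprocal counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, for illustration only.
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```

Adjusted ORs such as those reported in the study instead come from exponentiating logistic-regression coefficients, which lets the effect of gestation be separated from covariates.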
Caspase-2 is upregulated after sciatic nerve transection and its inhibition protects dorsal root ganglion neurons from Apoptosis after serum withdrawal
Sciatic nerve (SN) transection-induced apoptosis of dorsal root ganglion neurons (DRGN) is one factor determining the efficacy of peripheral axonal regeneration and the return of sensation. Here, we tested the hypothesis that caspase-2 (CASP2) orchestrates apoptosis of axotomised DRGN both in vivo and in vitro by disrupting the local neurotrophic supply to DRGN. We observed significantly elevated levels of cleaved CASP2 (C-CASP2), compared to cleaved caspase-3 (C-CASP3), within TUNEL+ DRGN and DRG glia (satellite and Schwann cells) after SN transection. A serum withdrawal cell culture model, which induced 40% apoptotic death in DRGN and 60% in glia, was used to model DRGN loss after neurotrophic factor withdrawal. Elevated C-CASP2 and TUNEL were observed in both DRGN and DRG glia, with C-CASP2 localisation shifting from the cytosol to the nucleus, a required step for induction of direct CASP2-mediated apoptosis. Furthermore, siRNA-mediated downregulation of CASP2 protected 50% of DRGN from apoptosis after serum withdrawal, while downregulation of CASP3 had no effect on DRGN or DRG glia survival. We conclude that CASP2 orchestrates the death of SN-axotomised DRGN directly and also indirectly through loss of DRG glia and their local neurotrophic factor support. Accordingly, inhibiting CASP2 expression is a potential therapy for improving both the SN regeneration response and peripheral sensory recovery.
Outcome of ATP-based tumor chemosensitivity assay directed chemotherapy in heavily pre-treated recurrent ovarian carcinoma
BACKGROUND: We wished to evaluate the clinical response following ATP-Tumor Chemosensitivity Assay (ATP-TCA) directed salvage chemotherapy in a series of UK patients with advanced ovarian cancer. The results are compared with those of a similar assay used in a different country in terms of evaluability and clinical endpoints. METHODS: From November 1998 to November 2001, 46 patients with pre-treated, advanced ovarian cancer were given a total of 56 courses of chemotherapy based on in-vitro ATP-TCA responses obtained from fresh tumor samples or ascites. Forty-four patients were evaluable for results. Of these, 18 patients had clinically platinum-resistant disease (relapse < 6 months after first course of chemotherapy). There was evidence of cisplatin resistance in 31 patients from their first ATP-TCA. Response to treatment was assessed by radiology, clinical assessment and tumor marker level (CA 125). RESULTS: The overall response rate was 59% (33/56) per course of chemotherapy, including 12 complete responses, 21 partial responses, 6 with stable disease, and 15 with progressive disease. Two patients were not evaluable for response, having received just one cycle of chemotherapy: if these are excluded, the response rate is 61%. Fifteen patients are still alive. Median progression-free survival (PFS) was 6.6 months per course of chemotherapy; median overall survival (OAS) for each patient following the start of TCA-directed therapy was 10.4 months (95% confidence interval 7.9-12.8 months). CONCLUSION: The results show similar response rates to previous studies using ATP-TCA directed therapy in recurrent ovarian cancer. The assay shows high evaluability, and this study adds weight to the reproducibility of results from different centres.
Clinician and patient perspectives on the barriers and facilitators to physical rehabilitation in intensive care: a qualitative interview study
Data availability statement: No data are available beyond what is reported in this manuscript and the supplemental files due to participant confidentiality requirements.
Objectives: The objective of this study is to explore patient, relative/carer and clinician perceptions of barriers to early physical rehabilitation in intensive care units (ICUs) within an associated group of hospitals in the UK and how they can be overcome.
Design: Qualitative study using semi-structured interviews and thematic framework analysis.
Setting: Four ICUs over three hospital sites in London, UK.
Participants: Former ICU patients or their relatives/carers with personal experience of ICU rehabilitation. ICU clinicians, including doctors, nurses, physiotherapists and occupational therapists, involved in the delivery of physical rehabilitation or decisions over its initiation.
Primary and secondary outcome measures: Views and experiences on the barriers and facilitators to ICU physical rehabilitation.
Results: Interviews were carried out with 11 former patients, 3 family members and 16 clinicians. The themes generated related to: safety and physiological concerns, patient participation and engagement, clinician experience and knowledge, teamwork, equipment and environment and risks and benefits of rehabilitation in intensive care. The overarching theme for overcoming barriers was a change in working model from ICU clinicians having separate responsibilities (a multidisciplinary approach) to one where all parties have a shared aim of providing patient-centred ICU physical rehabilitation (an interdisciplinary approach).
Conclusions: The results have revealed barriers that can be modified to improve rehabilitation delivery in an ICU. Interdisciplinary working could overcome many of these barriers to optimise recovery from critical illness.
Funding: Clinical Doctoral Research Fellowship awarded to HRW (ICA-CDRF-2015-01-026), supported by the National Institute for Health Research (NIHR) and Health Education England. We acknowledge the support of the NIHR Clinical Research Network; infrastructure support for this research was provided by the NIHR Imperial Biomedical Research Centre (BRC). HRW is supported by the NIHR Imperial BRC.
The Interstellar Medium In Galaxies Seen A Billion Years After The Big Bang
Evolution in the measured rest-frame ultraviolet spectral slope and ultraviolet-to-optical flux ratios indicates a rapid evolution in the dust obscuration of galaxies during the first 3 billion years of cosmic time (z>4). This evolution implies a change in the average interstellar medium properties, but the measurements are systematically uncertain due to untested assumptions and the inability to measure heavily obscured regions of the galaxies. Previous attempts to directly measure the interstellar medium in normal galaxies at these redshifts have failed for a number of reasons, with one notable exception. Here we report measurements of the [CII] gas and dust emission in 9 typical (~1-4L*) star-forming galaxies ~1 billion years after the big bang (z~5-6). We find these galaxies have >12x less thermal emission compared with similar systems ~2 billion years later, and enhanced [CII] emission relative to the far-infrared continuum, confirming a strong evolution in the interstellar medium properties in the early universe. The gas is distributed over scales of 1-8 kpc, and shows diverse dynamics within the sample. These results are consistent with early galaxies having significantly less dust than typical galaxies seen at z<3 and being comparable to local low-metallicity systems.
Comment: Submitted to Nature, under review after referee report. 22 pages, 4 figures, 4 Extended Data Figures, 5 Extended Data tables