Expression of an anti-CD33 single-chain antibody by Pichia pastoris
CD33 is a cell surface glycoprotein expressed on cells of the myelomonocytic lineage and on leukaemic cells, but not on haematopoietic stem cells. By virtue of its expression pattern, CD33 has become a popular target for new immunotherapeutic approaches to treat acute myeloid leukaemia. The methylotrophic yeast Pichia pastoris strain KM71H was used to produce an anti-CD33 single-chain variable fragment (scFv), with the intention of conjugating it to a radioisotope for therapeutic use. To direct secreted expression of the anti-CD33 scFv, the alpha-mating factor secretory signal sequence (alpha-MF) was used, with constructs containing a complete (CS) and an incomplete (INCS) cleavage site to accommodate the potential outcomes of processing by the dibasic endopeptidase Kex2 and the dipeptidyl aminopeptidase Ste13. The anti-CD33 scFv was expressed in BMMY cultures using both constructs, with final yields of 48 mg/l (CS) and 11 mg/l (INCS). N-terminal sequencing showed that the CS-scFv had not been cleaved by Ste13, leaving the amino acids EAEA at the N-terminus. The INCS construct produced a mixture of 50% authentic scFv and 50% scFv with 11 amino acids of the alpha-MF remaining at the N-terminus. Despite these aberrations in alpha-MF processing, the anti-CD33 scFvs produced from both constructs were functional: flow cytometry and Biacore analysis demonstrated binding to the target antigen CD33 on the surface of the human leukaemic cell line HL-60 and to recombinant soluble CD33, respectively.
Interpolation in waveform space: enhancing the accuracy of gravitational waveform families using numerical relativity
Matched filtering for the identification of compact object mergers in
gravitational-wave antenna data involves the comparison of the data stream to a
bank of template gravitational waveforms. Typically the template bank is
constructed from phenomenological waveform models since these can be evaluated
for an arbitrary choice of physical parameters. Recently it has been proposed
that singular value decomposition (SVD) can be used to reduce the number of
templates required for detection. As we show here, another benefit of SVD is
its removal of biases from the phenomenological templates along with a
corresponding improvement in their ability to represent waveform signals
obtained from numerical relativity (NR) simulations. Using these ideas, we
present a method that calibrates a reduced SVD basis of phenomenological
waveforms against NR waveforms in order to construct a new waveform approximant
with improved accuracy and faithfulness compared to the original
phenomenological model. The new waveform family is given numerically through
the interpolation of the projection coefficients of NR waveforms expanded onto
the reduced basis and provides a generalized scheme for enhancing
phenomenological models.
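To make the construction concrete, here is a minimal numerical sketch of the idea: take the SVD of a template bank, truncate to a reduced basis, project calibration waveforms onto that basis, and interpolate the projection coefficients across parameter space. The sine-Gaussian "waveforms", the single mass-like parameter, and the truncation tolerance below are illustrative stand-ins, not the paper's actual templates or pipeline.

```python
# Toy sketch of SVD reduction + coefficient interpolation. Everything
# here (waveform model, parameter grid, tolerance) is an illustrative
# assumption, not the pipeline from the paper.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)           # time samples
masses = np.linspace(10.0, 50.0, 40)     # toy physical parameter

def toy_waveform(m):
    # Stand-in for a phenomenological template evaluated at parameter m.
    return np.sin(2.0 * np.pi * (60.0 / m) * t) * np.exp(-4.0 * (t - 0.5) ** 2)

# Template bank: one row per parameter value.
bank = np.array([toy_waveform(m) for m in masses])

# SVD of the bank; keep enough singular vectors to capture ~99.99% of
# the squared singular values.
U, s, Vt = np.linalg.svd(bank, full_matrices=False)
k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.9999)) + 1
basis = Vt[:k]                           # reduced basis, shape (k, len(t))

# "NR" calibration waveforms at a few parameter values (noisy toys here).
nr_masses = masses[::5]
nr_waveforms = np.array([toy_waveform(m) + 1e-3 * rng.standard_normal(t.size)
                         for m in nr_masses])

# Project onto the reduced basis, then interpolate each coefficient in mass.
coeffs = nr_waveforms @ basis.T          # shape (n_calibration, k)

def interpolated_waveform(m):
    c = np.array([np.interp(m, nr_masses, coeffs[:, j]) for j in range(k)])
    return c @ basis                     # new approximant at parameter m

h = interpolated_waveform(23.0)          # waveform at an unseen parameter
```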
High Performance P3M N-body code: CUBEP3M
This paper presents CUBEP3M, a publicly-available high performance
cosmological N-body code and describes many utilities and extensions that have
been added to the standard package. These include a memory-light runtime SO
halo finder, a non-Gaussian initial conditions generator, and a system of
unique particle identification. CUBEP3M is fast, its accuracy is tuneable to optimize speed or memory, and it has been run on more than 27,000 cores, achieving within a factor of two of ideal weak scaling even at this problem size. The code can be run in an extra-lean mode where the peak memory footprint for large runs is as low as 37 bytes per particle, which is almost two times leaner than other widely used N-body codes. However, load imbalances can increase this requirement by a factor of two, such that fast configurations with all the utilities enabled and load imbalances factored in require between 70 and 120 bytes per particle. CUBEP3M is well designed to study large-scale cosmological systems, where imbalances are not too large and adaptive time-stepping is not essential. It has already been used for a broad range of science applications that require either large samples of non-linear realizations or very large dark matter N-body simulations, including cosmological reionization, halo formation, baryonic acoustic oscillations, weak lensing, and non-Gaussian statistics. We discuss the structure, the accuracy,
known systematic effects and the scaling performance of the code and its
utilities, when applicable.
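As a quick sanity check on what those per-particle figures mean at scale, the snippet below converts them into total memory for an assumed particle count; only the bytes-per-particle numbers come from the abstract, and the 5488^3 run size is hypothetical.

```python
# Back-of-envelope memory budget from the quoted per-particle figures.
# The byte counts are from the abstract; the particle count is an
# assumed example, not a documented CUBEP3M run.
n_particles = 5488 ** 3  # hypothetical large run

for label, bytes_per_particle in [("extra-lean", 37),
                                  ("loaded, low", 70),
                                  ("loaded, high", 120)]:
    total_tib = n_particles * bytes_per_particle / 2 ** 40
    print(f"{label:>12}: {total_tib:7.1f} TiB total")
```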
Improving initialization and evolution accuracy of cosmological neutrino simulations
Neutrino mass constraints are a primary focus of current and future
large-scale structure (LSS) surveys. Non-linear LSS models rely heavily on
cosmological simulations -- the impact of massive neutrinos should therefore be
included in these simulations in a realistic, computationally tractable, and
controlled manner. A recent proposal to reduce the related computational cost
employs a symmetric neutrino momentum sampling strategy in the initial
conditions. We implement a modified version of this strategy into the
Hardware/Hybrid Accelerated Cosmology Code (HACC) and perform convergence tests
on its internal parameters. We illustrate that this method can imprint numerical artifacts on the total matter field on small scales, similar to previous findings, and present a method to remove these artifacts using Fourier-space filtering of the neutrino density field. Moreover, we show that the converged neutrino power spectrum does not follow linear theory predictions on relatively large scales at early times, prompting a more careful study of systematics in particle-based
neutrino simulations. We also present an improved method for backscaling linear
transfer functions for initial conditions in massive neutrino cosmologies that
is based on achieving the same relative neutrino growth as computed with
Boltzmann solvers. Our self-consistent backscaling method yields sub-percent
accuracy in the total matter growth function. Comparisons of the non-linear power spectrum with the Mira-Titan emulator at a fixed neutrino mass are in very good agreement with the expected level of errors in the emulator and in the direct N-body simulation.
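The Fourier-space filtering step can be sketched as follows. The grid size, box size, Gaussian filter shape, and cutoff wavenumber below are illustrative assumptions, not the choices made in the paper.

```python
# Sketch: low-pass filter a neutrino overdensity field in Fourier space
# to suppress small-scale artifacts. Filter shape and k_cut are assumed.
import numpy as np

ng, box = 128, 1000.0       # grid cells per side, box size in Mpc/h (assumed)
rng = np.random.default_rng(1)
delta_nu = rng.standard_normal((ng, ng, ng))   # stand-in overdensity field

# Wavenumber magnitude on the FFT grid.
k1d = 2.0 * np.pi * np.fft.fftfreq(ng, d=box / ng)
kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
kmag = np.sqrt(kx**2 + ky**2 + kz**2)

# Gaussian roll-off above an illustrative cutoff.
k_cut = 0.2                 # h/Mpc, assumed
window = np.exp(-0.5 * (kmag / k_cut) ** 2)

delta_k = np.fft.fftn(delta_nu)
delta_nu_filtered = np.real(np.fft.ifftn(delta_k * window))
```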
Effect of statins on atrial fibrillation: collaborative meta-analysis of published and unpublished evidence from randomised controlled trials
Objective To examine whether statins can reduce the risk of atrial fibrillation.
Design Meta-analysis of published and unpublished results from larger scale statin trials, with comparison of the findings against the published results from smaller scale or shorter duration studies.
Data sources Medline, Embase, and Cochrane CENTRAL, up to October 2010. Unpublished data from longer term trials were obtained through contact with investigators.
Study selection Randomised controlled trials comparing statin with no statin or comparing high dose versus standard dose statin; all longer term trials had at least 100 participants and at least six months' follow-up.
Results In published data from 13 short term trials (4414 randomised patients, 659 events), statin treatment seemed to reduce the odds of an episode of atrial fibrillation by 39% (odds ratio 0.61, 95% confidence interval 0.51 to 0.74; P<0.001), but there was significant heterogeneity (P<0.001) between the trials. In contrast, among 22 longer term and mostly larger trials of statin versus control (105 791 randomised patients, 2535 events), statin treatment was not associated with a significant reduction in atrial fibrillation (0.95, 0.88 to 1.03; P=0.24) (P<0.001 for test of difference between the two sets of trials). Seven longer term trials of more intensive versus standard statin regimens (28 964 randomised patients and 1419 events) also showed no evidence of a reduction in the risk of atrial fibrillation (1.00, 0.90 to 1.12; P=0.99).
Conclusions The suggested beneficial effect of statins on atrial fibrillation from published shorter term studies is not supported by a comprehensive review of published and unpublished evidence from larger scale trials.
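For orientation, pooled odds ratios such as "0.95, 0.88 to 1.03" come from standard fixed-effect inverse-variance pooling of per-trial log odds ratios. The sketch below shows that calculation on made-up 2x2 counts; the numbers are not data from any of the trials.

```python
# Fixed-effect inverse-variance meta-analysis of log odds ratios.
# The 2x2 counts are invented for illustration only.
import math

# (events on statin, n statin, events on control, n control) per trial
trials = [(30, 500, 40, 500), (55, 1000, 60, 1000), (12, 250, 18, 250)]

num = den = 0.0
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))
    var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d   # Woolf's variance
    weight = 1.0 / var
    num += weight * log_or
    den += weight

pooled = math.exp(num / den)
se = math.sqrt(1.0 / den)
lo = math.exp(num / den - 1.96 * se)
hi = math.exp(num / den + 1.96 * se)
print(f"pooled OR {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```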
Numerical Discreteness Errors in Multi-Species Cosmological N-body Simulations
We present a detailed analysis of numerical discreteness errors in
two-species, gravity-only, cosmological simulations using the density power
spectrum as a diagnostic probe. In a simple setup where both species are
initialized with the same total matter transfer function, biased growth of
power forms on small scales when the solver force resolution is finer than the
mean interparticle separation. The artificial bias is more severe when
individual density and velocity transfer functions are applied. In particular,
significant large-scale offsets in power are measured between simulations with
conventional offset grid initial conditions when compared against converged
high-resolution results where the force resolution scale is matched to the
interparticle separation. These offsets persist even when the cosmology is
chosen so that the two particle species have the same mass, indicating that the
error is sourced from discreteness in the total matter field as opposed to
unequal particle mass. We further investigate two mitigation strategies to
address discreteness errors: the frozen potential method and softened
interspecies short-range forces. The former evolves particles under the
approximately "frozen" total matter potential in linear theory at early times,
while the latter filters cross-species gravitational interactions on small
scales in low density regions. By modeling closer to the continuum limit, both
mitigation strategies demonstrate considerable reductions in large-scale power
spectrum offsets.
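Because the density power spectrum is the diagnostic used throughout, here is a minimal sketch of how such a measurement is made from particle positions. The nearest-grid-point assignment, normalization, and random positions are simplifications for illustration, not the paper's measurement pipeline.

```python
# Sketch: density power spectrum from particle positions via
# nearest-grid-point assignment and an FFT. No aliasing or shot-noise
# corrections; positions are random, so P(k) is pure shot noise here.
import numpy as np

ng, box = 64, 100.0                     # grid size, box size (assumed units)
rng = np.random.default_rng(2)
pos = rng.uniform(0.0, box, size=(10_000, 3))  # stand-in particle positions

# NGP mass assignment -> overdensity field.
idx = np.floor(pos / box * ng).astype(int) % ng
counts = np.zeros((ng, ng, ng))
np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
delta = counts / counts.mean() - 1.0

# |delta(k)|^2 with a conventional volume normalization.
delta_k = np.fft.rfftn(delta)
power = np.abs(delta_k) ** 2 * box**3 / ng**6

# Bin in |k| between the fundamental mode and the Nyquist frequency.
kf = 2.0 * np.pi / box
kx = np.fft.fftfreq(ng, d=1.0 / ng) * kf
kz = np.fft.rfftfreq(ng, d=1.0 / ng) * kf
kmag = np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2 + kz[None, None, :]**2)

edges = np.arange(kf, kf * ng / 2, kf)
which = np.digitize(kmag.ravel(), edges)
sums = np.bincount(which, weights=power.ravel(), minlength=edges.size + 1)
counts_k = np.bincount(which, minlength=edges.size + 1)
pk = sums[1:-1] / np.maximum(counts_k[1:-1], 1)   # mean P(k) per bin
```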
On the road to percent accuracy III: non-linear reaction of the matter power spectrum to massive neutrinos
We analytically model the non-linear effects induced by massive neutrinos on the total matter power spectrum using the halo model reaction framework of Cataneo et al. In this approach, the halo model is used to determine the relative change to the matter power spectrum caused by new physics beyond the concordance cosmology. Using standard fitting functions for the halo abundance and the halo mass-concentration relation, the total matter power spectrum in the presence of massive neutrinos is predicted to per cent-level accuracy, out to k = 10 h Mpc⁻¹. We find that refining the prescriptions for the halo properties using N-body simulations improves the recovered accuracy to better than 1 per cent. This paper serves as another demonstration of how the halo model reaction framework, in combination with a single suite of standard Λ cold dark matter (ΛCDM) simulations, can recover per cent-level accurate predictions for beyond-ΛCDM matter power spectra, well into the non-linear regime.
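Schematically, the reaction framework writes the target spectrum as a scale-dependent reaction R(k) multiplying a "pseudo" ΛCDM spectrum whose linear clustering matches the target cosmology. The sketch below illustrates only this structure; both functional forms are placeholders, not the Cataneo et al. expressions.

```python
# Structure of the halo model reaction: P(k) = R(k) * P_pseudo(k).
# Both functions below are placeholder shapes for illustration only.
import numpy as np

k = np.logspace(-2, 1, 200)            # wavenumber in h/Mpc

def p_pseudo(k):
    # Stand-in for the non-linear pseudo spectrum (from LCDM simulations
    # or an emulator in practice).
    return 1.0e4 * k / (1.0 + (k / 0.1) ** 2.7)

def reaction(k, f_nu=0.01):
    # Placeholder reaction: no change on large scales, rolling over to a
    # ~ -10 * f_nu suppression on small scales (shape assumed, chosen
    # only to mimic the qualitative effect of massive neutrinos).
    x = (k / 0.2) ** 2
    return 1.0 - 10.0 * f_nu * x / (1.0 + x)

p_target = reaction(k) * p_pseudo(k)   # massive-neutrino spectrum estimate
```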
Neural Signatures of Spatial Statistical Learning: Characterizing the Extraction of Structure from Complex Visual Scenes
Behavioral evidence has shown that humans automatically develop internal representations adapted to the temporal and spatial statistics of the environment. Building on prior fMRI studies that have focused on statistical learning of temporal sequences, we investigated the neural substrates and mechanisms underlying statistical learning from scenes with a structured spatial layout. Our goals were twofold: (1) to determine discrete brain regions in which the degree of learning (i.e., behavioral performance) was a significant predictor of neural activity during acquisition of spatial regularities and (2) to examine how connectivity between this set of areas and the rest of the brain changed over the course of learning. Univariate activity analyses indicated a diffuse set of dorsal striatal and occipitoparietal activations correlated with individual differences in participants' ability to acquire the underlying spatial structure of the scenes. In addition, bilateral medial-temporal activation was linked to participants' behavioral performance, suggesting that spatial statistical learning recruits additional resources from the limbic system. Connectivity analyses examined, across the time course of learning, psychophysiological interactions with peak regions defined by the initial univariate analysis. Generally, we find that task-based connectivity with these regions was significantly greater in early relative to later periods of learning. Moreover, in certain cases, decreased task-based connectivity between time points was predicted by overall posttest performance. Results suggest a narrowing mechanism whereby the brain, confronted with a novel structured environment, initially boosts overall functional integration and then reduces interregional coupling over time.
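For readers unfamiliar with psychophysiological interaction (PPI) analysis, the toy sketch below shows its core construction: an interaction regressor formed as the product of a (centered) task regressor and a seed region's timecourse, entered into a GLM alongside both main effects. It uses simulated data and skips HRF convolution and deconvolution, so it is conceptual only, not the study's actual pipeline.

```python
# Toy PPI: detect task-modulated coupling between a seed timecourse and
# a target voxel. Simulated data; HRF modeling is deliberately omitted.
import numpy as np

rng = np.random.default_rng(3)
n_vols = 200
task = np.tile(np.repeat([0.0, 1.0], 10), 10)    # toy block design
seed = rng.standard_normal(n_vols)               # seed ROI timecourse
# Target couples to the seed only during task blocks.
target = 0.4 * seed * task + rng.standard_normal(n_vols)

ppi = (task - task.mean()) * seed                # interaction regressor
X = np.column_stack([np.ones(n_vols), task, seed, ppi])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
print(f"PPI beta (task-modulated coupling): {beta[3]:.2f}")
```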