Warm Dark Matter versus Bumpy Power Spectra
In this paper we explore the differences between a Warm Dark Matter (WDM)
model and a CDM model in which the power on a certain scale is reduced by
introducing a narrow negative feature ("dip"). The dip is placed so as to
mimic the loss of power in the WDM model: both models have the same
integrated power out to the scale where the power of the Dip model rises back
to the level of the unperturbed CDM spectrum.
Using N-body simulations we show that some of the large-scale clustering
patterns of this new model follow more closely the usual CDM scenario while
simultaneously suppressing small scale structures (within galactic halos) even
more efficiently than WDM. The analysis in the paper shows that the new Dip
model appears to be a viable alternative to WDM, but one based on different
physics: where WDM requires the introduction of a new particle species, the Dip
model is based on a non-standard inflationary period. If we are looking for an
alternative to the currently challenged standard LCDM structure formation
scenario, neither the LWDM nor the new Dip model can be ruled out based on the
analysis presented in this paper. They both make very similar predictions and
the degeneracy between them can only be broken with observations yet to come.
Comment: 7 pages, 8 figures, replaced with MNRAS accepted version (minor revisions), high-resolution figures at http://astronomy.swin.edu.au/staff/aknebe
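The dip construction described above can be illustrated numerically. In the sketch below, the Gaussian dip shape, the power-law "CDM" spectrum, and all scale and depth choices are toy assumptions for illustration only, not values from the paper:

```python
import numpy as np

def dip_spectrum(k, p_cdm, k_dip, width, depth):
    """Toy CDM spectrum with a narrow negative Gaussian feature ('dip')
    in log-k; away from k_dip the spectrum recovers the unperturbed
    CDM level, as in the Dip model described above."""
    return p_cdm * (1.0 - depth * np.exp(-0.5 * (np.log(k / k_dip) / width) ** 2))

k = np.logspace(-1, 2, 2000)   # toy wavenumber grid [h/Mpc] (assumed)
p_cdm = k ** -2.0              # toy power-law stand-in for a CDM spectrum (assumed)
p_dip = dip_spectrum(k, p_cdm, k_dip=5.0, width=0.3, depth=0.8)

# Fraction of integrated power removed by the dip, relative to unperturbed CDM
dk = np.diff(k)
removed = np.sum((p_cdm - p_dip)[:-1] * dk) / np.sum(p_cdm[:-1] * dk)
print(f"fraction of integrated power removed: {removed:.3f}")
```

In the actual Dip model the depth and width would be tuned so that the integrated power matches that of the WDM spectrum out to the recovery scale; here they are arbitrary.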
Quantum criticality of dipolar spin chains
We show that a chain of Heisenberg spins interacting with long-range dipolar
forces in a magnetic field h perpendicular to the chain exhibits a quantum
critical point belonging to the two-dimensional Ising universality class.
Within linear spin-wave theory the magnon dispersion for small momenta k is
[Delta^2 + v_k^2 k^2]^{1/2}, where Delta^2 \propto |h - h_c| and v_k^2 \propto
|ln k|. For fields close to h_c linear spin-wave theory breaks down and we
investigate the system using density-matrix and functional renormalization
group methods. The Ginzburg regime where non-Gaussian fluctuations are
important is found to be rather narrow on the ordered side of the transition,
and very broad on the disordered side.
Comment: 6 pages, 5 figures
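The small-momentum dispersion quoted above can be set out as a display equation, using the same symbols as the abstract:

```latex
\epsilon(k) = \left[\Delta^2 + v_k^2\, k^2\right]^{1/2}, \qquad
\Delta^2 \propto |h - h_c|, \qquad
v_k^2 \propto |\ln k| ,
```

so the gap vanishes as |h - h_c|^{1/2} at the critical field, while the effective spin-wave velocity grows logarithmically as k -> 0.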
Millihertz Quasi-Periodic Oscillations from Marginally Stable Nuclear Burning on an Accreting Neutron Star
We investigate marginally stable nuclear burning on the surface of accreting
neutron stars as an explanation for the mHz quasi-periodic oscillations (QPOs)
observed from three low mass X-ray binaries. At the boundary between unstable
and stable burning, the temperature dependence of the nuclear heating rate and
cooling rate almost cancel. The result is an oscillatory mode of burning, with
an oscillation period close to the geometric mean of the thermal and accretion
timescales for the burning layer. We describe a simple one-zone model which
illustrates this basic physics, and then present detailed multizone
hydrodynamical calculations of nuclear burning close to the stability boundary
using the KEPLER code. Our models naturally explain the characteristic 2 minute
period of the mHz QPOs, and why they are seen only in a very narrow range of
X-ray luminosities. The oscillation period is sensitive to the accreted
hydrogen fraction and the surface gravity, suggesting a new way to probe these
parameters. A major puzzle is that the accretion rate at which the oscillations
appear in the theoretical models is an order of magnitude larger than the rate
implied by the X-ray luminosity when the mHz QPOs are seen. We discuss the
implications for our general understanding of nuclear burning on accreting
neutron stars. One possibility is that the accreted material covers only part
of the neutron star surface at luminosities Lx > ~1E37 erg/s.
Comment: 10 pages, 9 figures, submitted to Ap
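The geometric-mean relation described above can be illustrated with a toy calculation. The timescale values below are purely hypothetical placeholders chosen for illustration, not numbers from the paper:

```python
import math

def oscillation_period(t_thermal, t_accretion):
    """Period of marginally stable burning, taken as the geometric mean
    of the thermal and accretion timescales of the burning layer."""
    return math.sqrt(t_thermal * t_accretion)

# Purely illustrative (assumed) timescales, chosen only to show how
# second-scale and longer inputs combine into a minute-scale period.
t_th = 10.0    # thermal timescale of the burning layer [s] (assumed)
t_acc = 1.5e3  # accretion timescale of the burning layer [s] (assumed)
print(f"oscillation period ~ {oscillation_period(t_th, t_acc):.0f} s")
```

With these assumed inputs the geometric mean lands in the vicinity of the characteristic ~2 minute period quoted in the abstract, showing how two very different timescales can combine into an intermediate one.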
Precision nomenclature for the new genomics
The confluence of two scientific disciplines may lead to nomenclature conflicts that require new terms while respecting historical definitions. This is the situation with the current state of cytology and genomics, which offer examples of distinct nomenclature and vocabularies that require reconciliation. In this article, we propose the new terms C-scaffold (for chromosome-scale assemblies of sequenced DNA fragments, commonly named scaffolds) and scaffotype (the resulting collection of C-scaffolds that represent an organism's genome). This nomenclature avoids conflict with the historical definitions of the terms chromosome (a microscopic body made of DNA and protein) and karyotype (the collection of images of all chromosomes of an organism or species). As large-scale sequencing projects progress, adoption of this nomenclature will assist end users to properly classify genome assemblies, thus facilitating genomic analysis.
Derivative corrections to the Born-Infeld action through beta-function calculations in N=2 boundary superspace
We calculate the beta-functions for an open string sigma-model in the
presence of a U(1) background. Passing to N=2 boundary superspace, in which the
background is fully characterized by a scalar potential, significantly
facilitates the calculation. Performing the calculation through three loops
yields the equations of motion up to five derivatives on the fieldstrengths,
which upon integration gives the bosonic sector of the effective action for a
single D-brane in trivial bulk background fields through four derivatives and
to all orders in alpha'. Finally, the present calculation shows that demanding
ultra-violet finiteness of the non-linear sigma-model can be reformulated as
the requirement that the background is a deformed stable holomorphic U(1)
bundle.
Comment: 25 pages, numerous figures
Expression and reactivation of HIV in a chemokine induced model of HIV latency in primary resting CD4+ T cells
Background: We recently described that HIV latent infection can be established in vitro following incubation of resting CD4+ T-cells with chemokines that bind to CCR7. The main aim of this study was to fully define the post-integration blocks to virus replication in this model of CCL19-induced HIV latency.
Results: High levels of integrated HIV DNA but low production of reverse transcriptase (RT) were found in CCL19-treated CD4+ T-cells infected with either wild type (WT) NL4.3 or single round envelope deleted NL4.3 pseudotyped virus (NL4.3-Δenv). Supernatants from CCL19-treated cells infected with either WT NL4.3 or NL4.3-Δenv did not induce luciferase expression in TZM-bl cells, and there was no expression of intracellular p24. Following infection of CCL19-treated CD4+ T-cells with NL4.3 carrying enhanced green fluorescent protein (EGFP) inserted into the nef open reading frame (NL4.3-Δnef-EGFP), no EGFP expression was detected. These data are consistent with non-productive latent infection of CCL19-treated infected CD4+ T-cells. Treatment of cells with phytohemagglutinin (PHA)/IL-2 or CCL19, prior to infection with WT NL4.3, resulted in a mean fold change in unspliced (US) RNA at day 4 compared to day 0 of 21.2 and 1.1 respectively (p = 0.01; n = 5), and the mean expression of multiply spliced (MS) RNA was 56,000 and 5,000 copies/million cells respectively (p = 0.01; n = 5). In CCL19-treated infected CD4+ T-cells, MS RNA was detected in the nucleus and not in the cytoplasm, in contrast to PHA/IL-2 activated infected cells, where MS RNA was detected in both. Virus could be recovered from CCL19-treated infected CD4+ T-cells following mitogen stimulation (with PHA and phorbol myristate acetate (PMA)) as well as with TNFα, IL-7, prostratin and vorinostat.
Conclusions: In this model of CCL19-induced HIV latency, we demonstrate HIV integration without spontaneous production of infectious virus, detection of MS RNA in the nucleus only, and the induction of virus production with multiple activating stimuli. These data are consistent with ex vivo findings from latently infected CD4+ T-cells from patients on combination antiretroviral therapy, and therefore provide further support for this model as an excellent in vitro model of HIV latency.
Predicting radiotherapy patient outcomes with real-time clinical data using mathematical modelling
Longitudinal tumour volume data from head-and-neck cancer patients show that
tumours of comparable pre-treatment size and stage may respond very differently
to the same radiotherapy fractionation protocol. Mathematical models are often
proposed to predict treatment outcome in this context, and have the potential
to guide clinical decision-making and inform personalised fractionation
protocols. Hindering effective use of models in this context is the sparsity of
clinical measurements juxtaposed with the model complexity required to produce
the full range of possible patient responses. In this work, we present a
compartment model of tumour volume and tumour composition, which, despite
relative simplicity, is capable of producing a wide range of patient responses.
We then develop novel statistical methodology and leverage a cohort of existing
clinical data to produce a predictive model of both tumour volume progression
and the associated level of uncertainty that evolves throughout a patient's
course of treatment. To capture inter-patient variability, all model parameters
are patient specific, with a bootstrap particle filter-like Bayesian approach
developed to model a set of training data as prior knowledge. We validate our
approach against a subset of unseen data, and demonstrate both the predictive
ability of our trained model and its limitations.
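The bootstrap-particle-filter style of patient-specific calibration described above can be sketched as follows. The one-compartment exponential-decay volume model, the noise levels, and the prior range are assumptions for illustration, not the paper's actual compartment model or training data:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=1000, obs_noise=5.0):
    """Toy bootstrap (SIR) particle filter for a patient-specific decay
    rate k in a one-compartment model V_t = V_{t-1} * exp(-k).
    Each particle carries its own k, mimicking patient-specific parameters."""
    # Prior over the decay rate (assumed; stands in for a training-data prior)
    k = rng.uniform(0.0, 0.2, n_particles)
    volume = np.full(n_particles, observations[0])
    for obs in observations[1:]:
        # Predict: propagate each particle's tumour volume one step forward
        volume *= np.exp(-k)
        # Weight: Gaussian likelihood of the new volume measurement
        w = np.exp(-0.5 * ((obs - volume) / obs_noise) ** 2)
        w /= w.sum()
        # Resample: multinomial resampling concentrates particles on plausible k
        idx = rng.choice(n_particles, size=n_particles, p=w)
        k, volume = k[idx], volume[idx]
    return k  # posterior sample of the patient-specific parameter

# Synthetic patient: true decay rate 0.1, noisy weekly volume measurements
true_k = 0.1
vols = [100.0 * np.exp(-true_k * t) + rng.normal(0.0, 2.0) for t in range(8)]
posterior_k = bootstrap_particle_filter(vols)
print(f"posterior mean k ~ {posterior_k.mean():.3f}")
```

The surviving particle cloud plays the role of the evolving posterior: its spread at any treatment day quantifies the prediction uncertainty for that patient, which narrows as further volume measurements arrive.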
Sequence analysis and editing for bisulphite genomic sequencing projects
Bisulphite genomic sequencing is a widely used technique for detailed analysis of the methylation status of a region of DNA. It relies upon the selective deamination of unmethylated cytosine to uracil after treatment with sodium bisulphite, usually followed by PCR amplification of the chosen target region. Since this two-step procedure replaces all unmethylated cytosine bases with thymine, PCR products derived from unmethylated templates contain only three types of nucleotide, in unequal proportions. This can create a number of technical difficulties (e.g. for some base-calling methods) and impedes manual analysis of sequencing results (since the long runs of T or A residues are difficult to align visually with the parent sequence). To facilitate the detailed analysis of bisulphite PCR products (particularly using multiple cloned templates), we have developed a visually intuitive program that identifies the methylation status of CpG dinucleotides by analysis of raw sequence data files produced by MegaBACE or ABI sequencers as well as Staden SCF trace files and plain text files. The program then also collates and presents data derived from independent templates (e.g. separate clones). This results in a considerable reduction in the time required for completion of a detailed genomic methylation project.
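The conversion logic described above (unmethylated C deaminates to U and reads as T, while methylated CpG cytosines are protected) can be sketched with two hypothetical helpers. This is an illustrative toy on the forward strand only, not the program described in the article:

```python
def bisulphite_convert(seq, methylated_cpg_positions):
    """In-silico bisulphite conversion of the forward strand: every
    cytosine deaminates to T (via U) unless its position is listed as
    a methylated CpG cytosine, which is protected from conversion."""
    out = []
    for i, base in enumerate(seq):
        if base == "C" and i not in methylated_cpg_positions:
            out.append("T")
        else:
            out.append(base)
    return "".join(out)

def call_cpg_methylation(reference, read):
    """Compare an aligned bisulphite read to the untreated reference:
    a CpG cytosine retaining C was methylated; one reading T was not."""
    calls = {}
    for i in range(len(reference) - 1):
        if reference[i : i + 2] == "CG":
            calls[i] = (read[i] == "C")
    return calls

reference = "ACGTCGACCA"
# Assume only the CpG cytosine at position 1 is methylated
read = bisulphite_convert(reference, methylated_cpg_positions={1})
print(read)                                  # converted read
print(call_cpg_methylation(reference, read)) # per-CpG methylation calls
```

This also makes the alignment difficulty noted above concrete: every non-CpG cytosine in the read has become T, so the converted sequence no longer matches the parent sequence by eye.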