The role of p53 in atherosclerosis
Although the role of the tumour suppressor gene p53 is well known in cancer, recent studies have highlighted a fundamental role for p53 in regulating cells in the advanced atherosclerotic plaque, the major cause of heart attacks and stroke. In particular, p53 is activated in the complex environment of the plaque, in part by DNA damage within the lesion, and regulates growth arrest, cell senescence and apoptosis of vascular smooth muscle cells (VSMCs). The role of endogenous p53 has been determined using p53 knockout in mice that develop advanced atherosclerosis, with bone marrow transplantation used to separate effects on blood cells from those on vessel wall cells. These studies have produced apparently contradictory and surprising results. In particular, recent studies have identified roles for endogenous p53 in protecting VSMCs from apoptosis, in the trans-differentiation of bone marrow stromal cells into VSMCs in atherosclerosis, and in altering the mode of cell death in the plaque.
Structure of Pairs in Heavy Weakly-Bound Nuclei
We study the structure of nucleon pairs within a simple model consisting of a
square well in three dimensions and a delta-function residual interaction
between two weakly-bound particles at the Fermi surface. We include the
continuum by enclosing the entire system in a large spherical box. To a good
approximation, the continuum can be replaced by a small set of
optimally-determined resonance states, suggesting that in many nuclei far from
stability it may be possible to incorporate continuum effects within
traditional shell-model-based approximations. Comment: REVTeX format, 9 pages, 2 figures, 2 tables
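As a rough illustration of the box-discretization idea described above, the sketch below (with assumed, purely illustrative well parameters and a plain finite-difference method; none of the numbers or names are taken from the paper) computes s-wave single-particle levels of a finite square well enclosed in a large spherical box, so that states above the well depth stand in for the discretized continuum.

```python
import numpy as np

# Illustrative sketch only: s-wave levels of a 3D square well enclosed in a large
# spherical box, obtained by finite-difference diagonalization of the radial
# equation for u(r) = r*R(r) with u(0) = u(R_box) = 0. Parameters are assumed.
hbar2_2m = 20.7            # MeV fm^2, roughly hbar^2 / 2m for a nucleon
V0, R_well = -50.0, 3.0    # well depth (MeV) and radius (fm), illustrative values
R_box, n = 30.0, 1500      # enclosing box radius (fm) and number of grid points

r = np.linspace(R_box / n, R_box, n)
h = r[1] - r[0]
V = np.where(r < R_well, V0, 0.0)

# Tridiagonal Hamiltonian: -(hbar^2/2m) u'' + V u, via central differences.
diag = 2.0 * hbar2_2m / h**2 + V
off = -hbar2_2m / h**2 * np.ones(n - 1)
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)
print("bound s-wave levels (MeV):", E[E < 0])
print("first box-discretized continuum levels (MeV):", E[E >= 0][:5])
```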
New, efficient and robust, fiber-based quantum key distribution schemes
We present a new fiber-based quantum key distribution (QKD) scheme which can
be regarded as a modification of an idea proposed by Inoue, Waks and Yamamoto
(IWY) [1]. The scheme described here uses a single phase modulator and two
differential delay elements in series at the transmitter that form an
interferometer when combined with a third differential delay element at the
receiver. The protocol is characterized by a high efficiency, reduced exposure
to an attack by an eavesdropper, and higher sensitivity to such an attack when
compared to other QKD schemes. For example, the efficiency with which
transmitted data contribute to the private key is 3/4 compared with 1/4 for
BB84 [2]. Moreover, an eavesdropper can acquire a maximum of 1/3 of the key
which leads to an error probability in the private key of 1/3. This can be
compared to 1/2 and 1/4 for these same parameters in both BB84 and IWY. The
combination of these considerations should lead to increased range and key
distribution rate over present fiber-based QKD schemes. Comment: 4 pages, 5 figures, 1 equation
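To make the quoted comparison concrete, here is a small back-of-envelope sketch that simply tabulates the figures stated in the abstract (the pulse count and any value marked None are illustrative assumptions; this is not an independent security analysis):

```python
# Figures quoted in the abstract above (None = not quoted there).
# efficiency : fraction of transmitted data contributing to the private key
# eve_key    : maximum fraction of the key an eavesdropper can acquire
# eve_error  : error probability such an attack imprints on the private key
schemes = {
    "this scheme": {"efficiency": 3 / 4, "eve_key": 1 / 3, "eve_error": 1 / 3},
    "BB84":        {"efficiency": 1 / 4, "eve_key": 1 / 2, "eve_error": 1 / 4},
    "IWY":         {"efficiency": None,  "eve_key": 1 / 2, "eve_error": 1 / 4},
}

pulses = 1_000_000  # hypothetical number of transmitted symbols
for name, p in schemes.items():
    key = "n/a" if p["efficiency"] is None else f"{int(pulses * p['efficiency'])} bits"
    print(f"{name:12s} sifted key from {pulses} symbols: {key:>12s}; "
          f"Eve's max key fraction {p['eve_key']:.2f}, induced error {p['eve_error']:.2f}")
```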
Perfect state distinguishability and computational speedups with postselected closed timelike curves
Bennett and Schumacher's postselected quantum teleportation is a model of
closed timelike curves (CTCs) that leads to results physically different from
Deutsch's model. We show that even a single qubit passing through a
postselected CTC (P-CTC) is sufficient to do any postselected quantum
measurement, and we discuss an important difference between "Deutschian" CTCs
(D-CTCs) and P-CTCs in which the future existence of a P-CTC might affect the
present outcome of an experiment. Then, based on a suggestion of Bennett and
Smith, we explicitly show how a party assisted by P-CTCs can distinguish a set
of linearly independent quantum states, and we prove that it is not possible
for such a party to distinguish a set of linearly dependent states. The power
of P-CTCs is thus weaker than that of D-CTCs because the Holevo bound still
applies to circuits using them regardless of their ability to conspire in
violating the uncertainty principle. We then discuss how different notions of a
quantum mixture that are indistinguishable in linear quantum mechanics lead to
dramatically differing conclusions in a nonlinear quantum mechanics involving
P-CTCs. Finally, we give explicit circuit constructions that can efficiently
factor integers, efficiently solve any decision problem in the intersection of
NP and coNP, and probabilistically solve any decision problem in NP. These
circuits accomplish these tasks with just one qubit traveling back in time, and
they exploit the ability of postselected closed timelike curves to create
grandfather paradoxes for invalid answers. Comment: 15 pages, 4 figures; Foundations of Physics (2011)
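For intuition about how a single qubit through a P-CTC implements a postselected measurement, here is a minimal numerical sketch. It assumes the commonly used P-CTC rule (the chronology-respecting system evolves as rho -> C rho C†/Tr(C rho C†), with C the joint unitary partial-traced over the CTC qubit); the choice of a CNOT interaction and the variable names are illustrative, not the paper's constructions.

```python
import numpy as np

def ptrace_ctc(U, d_sys=2, d_ctc=2):
    """C = sum_k <k|_CTC U |k>_CTC: trace the joint unitary over the CTC qubit."""
    return np.einsum('ikjk->ij', U.reshape(d_sys, d_ctc, d_sys, d_ctc))

# Interaction: CNOT with the system qubit as control and the CTC qubit as target.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

C = ptrace_ctc(CNOT)                # proportional to |0><0|: a postselection onto |0>
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())   # input state |+><+|

out = C @ rho @ C.conj().T
norm = np.trace(out).real           # would vanish for a "grandfather paradox" input |1>
print("effective operator C:\n", C.real)
print("postselected output state:\n", (out / norm).real)   # collapses onto |0><0|
```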
Inducing safer oblique trees without costs
Decision tree induction has been widely studied and applied. In safety applications, such as determining whether a chemical process is safe or whether a person has a medical condition, the cost of misclassification in one of the classes is significantly higher than in the other class. Several authors have tackled this problem by developing cost-sensitive decision tree learning algorithms or have suggested ways of changing the
distribution of training examples to bias the decision tree learning process so as to take account of costs. A prerequisite for applying such algorithms is the availability of costs of misclassification.
Although this may be possible for some applications, obtaining reasonable estimates of costs of misclassification is not easy in the area of safety.
This paper presents a new algorithm for applications where the cost of misclassifications cannot be quantified, although the cost of misclassification in one class is known to be significantly higher than in another class. The algorithm utilizes linear discriminant analysis to identify oblique relationships between continuous attributes and then carries out an appropriate modification to ensure that the resulting tree errs on the side of safety. The algorithm is evaluated against one of the best-known cost-sensitive algorithms (ICET), a well-known oblique decision tree algorithm (OC1), and an algorithm that utilizes robust linear programming.
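A toy sketch of the general idea (an assumption-laden illustration, not the paper's algorithm or its evaluation): scikit-learn's LinearDiscriminantAnalysis supplies an oblique direction over two continuous attributes, and the split threshold is then shifted so that no training example of the high-cost "unsafe" class falls on the "safe" side.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic two-class data: class 1 is the high-cost "unsafe" class.
rng = np.random.default_rng(0)
X_safe   = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))
X_unsafe = rng.normal(loc=[2.5, 2.5], scale=1.0, size=(40, 2))
X = np.vstack([X_safe, X_unsafe])
y = np.array([0] * len(X_safe) + [1] * len(X_unsafe))

# Oblique direction from LDA: an axis that is a linear combination of attributes.
lda = LinearDiscriminantAnalysis().fit(X, y)
scores = X @ lda.coef_[0] + lda.intercept_[0]

# Safety-biased threshold: label "safe" only below the lowest score of any unsafe
# training example, so the split errs on the side of safety by construction.
threshold = scores[y == 1].min()
pred_unsafe = scores >= threshold
print("unsafe examples labelled safe :", int(((y == 1) & ~pred_unsafe).sum()))
print("safe examples flagged unsafe  :", int(((y == 0) & pred_unsafe).sum()))
```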
Targeted free energy perturbation
A generalization of the free energy perturbation identity is derived, and a
computational strategy based on this result is presented. A simple example
illustrates the efficiency gains that can be achieved with this method. Comment: 8 pages + 1 color figure
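For background, a hedged sketch of the identity being generalized, written in the notation usually used for targeted free energy perturbation (the symbols and the specific form below are standard-usage assumptions, not quotations from the paper):

```latex
% Standard free energy perturbation (FEP) between systems A and B:
%   e^{-\beta \Delta F} = \langle e^{-\beta [U_B(x) - U_A(x)]} \rangle_A .
% The targeted generalization applies an invertible map x -> M(x) to each
% configuration sampled from A before evaluating the energy difference:
\[
  e^{-\beta \Delta F}
  = \big\langle e^{-\beta \Phi(x)} \big\rangle_A ,
  \qquad
  \Phi(x) = U_B\!\big(M(x)\big) - U_A(x) - \beta^{-1}\ln\big|J_M(x)\big| ,
\]
% where J_M is the Jacobian determinant of the map. Choosing M so that the mapped
% A-ensemble overlaps well with the B-ensemble is the source of the efficiency gain.
```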
Evolution of cosmic string configurations
We extend and develop our previous work on the evolution of a network of
cosmic strings. The new treatment is based on an analysis of the probability
distribution of the end-to-end distance of a randomly chosen segment of
left-moving string of given length. The description involves three distinct
length scales: $\xi$, related to the overall string density, $\bar{\xi}$, the
persistence length along the string, and $\zeta$, describing the small-scale
structure, which is an important feature of the numerical simulations that have
been done of this problem. An evolution equation is derived describing how the
distribution develops in time due to the combined effects of the universal
expansion, of intercommuting and loop formation, and of gravitational
radiation. With plausible assumptions about the unknown parameters in the
model, we confirm the conclusions of our previous study, that if gravitational
radiation and small-scale structure effects are neglected, the two dominant
length scales both scale in proportion to the horizon size. When the extra
effects are included, we find that while $\xi$ and $\bar{\xi}$ grow, $\zeta$
initially does not. Eventually, however, it does appear to scale, at a much
lower level, due to the effects of gravitational back-reaction. Comment: 61
pages, requires RevTeX v3.0, SUSSEX-TH-93/3-4, IMPERIAL/TP/92-93/4
Where and when to revegetate: a quantitative method for scheduling landscape reconstruction
Restoration of native vegetation is required in many regions of the world, but determining priority locations for revegetation is a complex problem. We consider the problem of determining spatial and temporal priorities for revegetation to maximize habitat for 62 bird species within a heavily cleared agricultural region, 11,000 km² in area. We show how a reserve-selection framework can be applied to a complex, large-scale restoration-planning problem to account for multi-species objectives and connectivity requirements at a spatial extent and resolution relevant to management. Our approach explicitly accounts for time lags in planting and development of habitat resources, which is intended to avoid future population bottlenecks caused by delayed provision of critical resources, such as tree hollows. We coupled species-specific models of expected habitat quality and fragmentation effects with the dynamics of habitat suitability following replanting to produce species-specific maps for future times. Spatial priorities for restoration were determined by ranking locations (150-m grid cells) by their expected contribution to species habitat through time using the conservation planning tool, "Zonation." We evaluated solutions by calculating expected trajectories of habitat availability for each species. We produced a spatially explicit revegetation schedule for the region that resulted in a balanced increase in habitat for all species. Priority areas for revegetation generally were clustered around existing vegetation, although not always. Areas on richer soils and with high rainfall were more highly ranked, reflecting their potential to support high-quality habitats that have been disproportionately cleared for agriculture. Accounting for delayed development of habitat resources altered the rank-order of locations in the derived revegetation plan and led to improved expected outcomes for fragmentation-sensitive species. This work demonstrates the potential for systematic restoration planning at large scales that accounts for multiple objectives, which is urgently needed by land and natural resource managers.
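A highly simplified sketch of the kind of spatial ranking such an analysis produces (an assumed toy stand-in for an iterative, Zonation-style prioritization, not the actual tool, its algorithm, or the study's data): cells are removed one at a time, always dropping the cell whose loss does the least harm to the worst-off species, and the reversed removal order then reads as a restoration priority list.

```python
import numpy as np

# Toy prioritization over random "habitat value" data (assumed, illustrative only).
rng = np.random.default_rng(1)
n_cells, n_species = 60, 5
habitat = rng.random((n_cells, n_species))   # value of each grid cell to each species
totals = habitat.sum(axis=0)

remaining = set(range(n_cells))
removal_order = []
current = totals.copy()
while remaining:
    # Remove the cell whose loss keeps the minimum (across species) remaining
    # habitat fraction as high as possible.
    cell = max(remaining, key=lambda c: ((current - habitat[c]) / totals).min())
    removal_order.append(cell)
    current -= habitat[cell]
    remaining.remove(cell)

priority = removal_order[::-1]               # cells removed last rank highest
print("top 10 priority cells:", priority[:10])
```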
Reducing the communication complexity with quantum entanglement
We propose a probabilistic two-party communication complexity scenario with a
prior nonmaximally entangled state, which results in less communication than
is required with only classical random correlations. A simple all-optical
implementation of this protocol is presented and demonstrates our conclusion. Comment: 4 pages, 2 figures
Cosmological gravitino problem confronts electroweak physics
A generic feature of gauge-mediated supersymmetry breaking models is that the
gravitino is the lightest supersymmetric particle (LSP). In order not to
overclose the universe, the gravitino LSP should be light enough (~ 1 keV), or
appropriately heavy (~ 1 GeV). We study further constraints on the mass of the
gravitino imposed by electroweak experiments, i.e., muon g-2 measurements,
electroweak precision measurements, and direct searches for supersymmetric
particles at LEP2. We find that a heavy gravitino is strongly disfavored by
the lower mass bound on the next-to-LSP. A sufficiently light gravitino, on
the other hand, has rather sizable allowed regions in the model parameter
space. Comment: 11 pages, 8 figures, version to appear in PR
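As background for the "~1 keV" side of the overclosure statement, here is a hedged back-of-envelope estimate using the textbook hot-relic abundance formula (the formula, the effective degrees of freedom, and the resulting numbers are standard-lore assumptions, not taken from this paper):

```latex
% Hot-relic abundance of a light gravitino that was once in thermal equilibrium
% (Kolb-Turner-style estimate; g_eff ~ 3/2 for its two goldstino helicity states,
% g_*S evaluated at its early decoupling temperature T_f):
\[
  \Omega_{3/2} h^{2} \;\approx\; 7.8\times 10^{-2}\,
  \frac{g_{\rm eff}}{g_{*S}(T_{\rm f})}\,\frac{m_{3/2}}{1\,\mathrm{eV}}
  \;\sim\; \frac{m_{3/2}}{1\ \mathrm{keV}}
  \left(\frac{100}{g_{*S}(T_{\rm f})}\right),
\]
% so requiring \Omega_{3/2} h^2 \lesssim 1 with g_{*S} \sim 100 gives
% m_{3/2} \lesssim 1 keV, the "light enough" branch quoted in the abstract.
```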
