Fragilities of Liquids Predicted from the Random First Order Transition Theory of Glasses
A microscopically motivated theory of glassy dynamics based on an underlying
random first order transition is developed to explain the magnitude of free
energy barriers for glassy relaxation. A variety of empirical correlations
embodied in the concept of liquid "fragility" are shown to be quantitatively
explained by such a model. The near universality of a Lindemann ratio
characterizing the maximal amplitude of thermal vibrations within an amorphous
minimum explains the variation of fragility with a liquid's configurational
heat capacity density. Furthermore the numerical prefactor of this correlation
is well approximated by the microscopic calculation. The size of heterogeneous
reconfiguring regions in a viscous liquid is inferred and the correlation of
nonexponentiality of relaxation with fragility is qualitatively explained. Thus
the wide variety of kinetic behavior in liquids of quite disparate chemical
nature reflects quantitative rather than qualitative differences in their
energy landscapes.
Comment: 10 pages including 4 eps figures
The Origin of the Boson Peak and the Thermal Conductivity Plateau in Low Temperature Glasses
We argue that the intrinsic glassy degrees of freedom in amorphous solids
giving rise to the thermal conductivity plateau and the ``boson peak'' in the
heat capacity at moderately low temperatures are directly connected to those
motions giving rise to the two-level like excitations seen at still lower
temperatures. These degrees of freedom can be thought of as strongly anharmonic
transitions between the local minima of the glassy energy landscape that are
accompanied by ripplon-like domain wall motions of the glassy mosaic structure
predicted to occur by the random first order transition theory. The
energy spectrum of the vibrations of the mosaic depends on the glass transition
temperature, the Debye frequency and the molecular length scale. The resulting
spectrum reproduces the experimental low temperature Boson peak. The
``non-universality'' of the thermal conductivity plateau depends on and
arises from calculable interactions with the phonons.
Comment: 4 pages, submitted to PR
Simulations of an energy dechirper based on dielectric lined waveguides
Terahertz frequency wakefields can be excited by ultra-short relativistic
electron bunches travelling through dielectric lined waveguide (DLW)
structures. These wakefields can either accelerate a witness bunch with high
gradient, or modulate the energy of the driving bunch. In this paper, we study
a passive dechirper based on the DLW to compensate the correlated energy spread
of the bunches accelerated by the laser plasma wakefield accelerator (LWFA). A
rectangular waveguide structure was employed taking advantage of its
continuously tunable gap during operation. The assumed 200 MeV driving bunch
had a Gaussian distribution with a bunch length of 3.0 {\mu}m, a relative
correlated energy spread of 1%, and a total charge of 10 pC. Both the CST
Wakefield Solver and the PIC Solver were used to simulate and optimize such a
dechirper. The effect of the time-dependent self-wake on the driving bunch was
analyzed in terms of the energy modulation and the transverse phase space.
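As a toy illustration of the dechirping idea (not the paper's CST simulation: the linear-wake model and any numerical values beyond those quoted in the abstract are illustrative assumptions), a linearly correlated energy spread can be removed by an energy kick proportional to longitudinal position within the bunch:

```python
import numpy as np

# Toy dechirper sketch: a Gaussian bunch with a 1% linearly correlated
# energy spread (as quoted in the abstract) plus a small assumed
# uncorrelated spread, dechirped by an idealized linear self-wake kick.
rng = np.random.default_rng(0)
n = 100_000
e0 = 200.0          # mean energy, MeV (from the abstract)
sigma_z = 3.0e-6    # rms bunch length, m (3.0 um, from the abstract)

z = rng.normal(0.0, sigma_z, n)              # longitudinal positions
energy = e0 * (1.0 + 0.01 * z / sigma_z)     # 1% correlated chirp
energy += rng.normal(0.0, 0.1, n)            # assumed uncorrelated spread

def apply_linear_wake(z, energy, kick_per_sigma):
    """Subtract an energy kick proportional to position along the bunch."""
    return energy - kick_per_sigma * (z / sigma_z)

before = energy.std() / energy.mean()
dechirped = apply_linear_wake(z, energy, 0.01 * e0)  # cancel the 1% chirp
after = dechirped.std() / dechirped.mean()
print(f"relative energy spread: {before:.4%} -> {after:.4%}")
```

With the chirp cancelled, only the assumed uncorrelated spread survives, which is why the residual relative spread is far below the initial 1%.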
Open vs Closed Access Femtocells in the Uplink
Femtocells are assuming an increasingly important role in the coverage and
capacity of cellular networks. In contrast to existing cellular systems,
femtocells are end-user deployed and controlled, randomly located, and rely on
third party backhaul (e.g. DSL or cable modem). Femtocells can be configured to
be either open access or closed access. Open access allows an arbitrary nearby
cellular user to use the femtocell, whereas closed access restricts the use of
the femtocell to users explicitly approved by the owner. Seemingly, the network
operator would prefer an open access deployment since this provides an
inexpensive way to expand their network capabilities, whereas the femtocell
owner would prefer closed access, in order to keep the femtocell's capacity and
backhaul to himself. We show mathematically and through simulations that the
reality is more complicated for both parties, and that the best approach
depends heavily on whether the multiple access scheme is orthogonal (TDMA or
OFDMA, per subband) or non-orthogonal (CDMA). In a TDMA/OFDMA network,
closed-access is typically preferable at high user densities, whereas in CDMA,
open access can provide gains of more than 200% for the home user by reducing
the near-far problem experienced by the femtocell. The results of this paper
suggest that the interests of the femtocell owner and the network operator are
more compatible than typically believed, and that CDMA femtocells should be
configured for open access whereas OFDMA or TDMA femtocells should adapt to the
cellular user density.
Comment: 21 pages, 8 figures, 2 tables, submitted to IEEE Trans. on Wireless
Communication
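The near-far effect driving the CDMA result above can be sketched with a toy uplink power-budget calculation (the geometry, transmit powers, and path-loss exponent are illustrative assumptions, not the paper's system model):

```python
# Toy uplink sketch: one home user on a CDMA femtocell, and one nearby
# cellular user that is either rejected (closed access) or admitted
# (open access). All numbers below are illustrative assumptions.
def rx_power(p_tx, d, alpha=3.5):
    """Received power under a simple power-law path-loss model."""
    return p_tx / d**alpha

noise = 1e-13           # W, assumed receiver noise floor
p_macro_user = 0.1      # W, power needed to reach the distant macrocell
p_femto_user = 1e-4     # W, low power suffices to reach the femtocell
d_home = 5.0            # m, home user to femtocell
d_nearby = 10.0         # m, nearby cellular user to femtocell

sig = rx_power(p_femto_user, d_home)

# Closed access: the rejected nearby user transmits at high power toward
# the macrocell, jamming the femtocell uplink (the near-far problem).
sinr_closed = sig / (rx_power(p_macro_user, d_nearby) + noise)

# Open access: the nearby user is served by the femtocell at low power,
# so its interference drops by the ratio of the two transmit powers.
sinr_open = sig / (rx_power(p_femto_user, d_nearby) + noise)

print(f"home-user SINR gain from open access: {sinr_open / sinr_closed:.0f}x")
```

In this toy model the open-access gain is set by the ratio of the macrocell-bound to femtocell-bound transmit powers, which is the mechanism behind the large home-user gains the abstract reports for CDMA.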
Fundamentals of Inter-cell Overhead Signaling in Heterogeneous Cellular Networks
Heterogeneous base stations (e.g. picocells, microcells, femtocells and
distributed antennas) will become increasingly essential for cellular network
capacity and coverage. Up until now, little basic research has been done on the
fundamentals of managing so much infrastructure -- much of it unplanned --
together with the carefully planned macro-cellular network. Inter-cell
coordination is in principle an effective way of ensuring different
infrastructure components behave in a way that increases, rather than
decreases, the key quality of service (QoS) metrics. The success of such
coordination depends heavily on how the overhead is shared, and the rate and
delay of the overhead sharing. We develop a novel framework to quantify
overhead signaling for inter-cell coordination, which is usually ignored in
traditional 1-tier networks, and assumes even more importance in multi-tier
heterogeneous cellular networks (HCNs). We derive the overhead quality contour
for general K-tier HCNs -- the achievable set of overhead packet rate, size,
delay and outage probability -- in closed-form expressions or computable
integrals under general assumptions on overhead arrivals and different overhead
signaling methods (backhaul and/or wireless). The overhead quality contour is
further simplified for two widely used models of overhead arrivals: Poisson and
deterministic arrival process. This framework can be used in the design and
evaluation of any inter-cell coordination scheme. It also provides design
insights on backhaul and wireless overhead channels to handle specific overhead
signaling requirements.
Comment: 21 pages, 9 figures
Adaptive confidence intervals for regression functions under shape constraints
Adaptive confidence intervals for regression functions are constructed under
shape constraints of monotonicity and convexity. A natural benchmark is
established for the minimum expected length of confidence intervals at a given
function in terms of an analytic quantity, the local modulus of continuity.
This bound depends not only on the function but also on the assumed function
class. These benchmarks show that the constructed confidence intervals have
near minimum expected length for each individual function, while maintaining a
given coverage probability for functions within the class. Such adaptivity is
much stronger than adaptive minimaxity over a collection of large parameter
spaces.
Comment: Published at http://dx.doi.org/10.1214/12-AOS1068 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org/)
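The local modulus of continuity invoked above can be written, in the notation standard for this line of work (a sketch; the paper's exact definition may differ in norm or scaling), as:

```latex
% Local modulus of continuity of the functional f \mapsto f(x_0)
% over the shape-constrained class \mathcal{F} (sketch)
\omega(\varepsilon, f, \mathcal{F})
  = \sup\bigl\{\, |g(x_0) - f(x_0)| \;:\;
      \|g - f\|_2 \le \varepsilon,\ g \in \mathcal{F} \,\bigr\}
```

Roughly speaking, no confidence interval that covers over all of \(\mathcal{F}\) can have expected length at \(f\) much shorter than this modulus evaluated at the estimation-error scale, which is the per-function benchmark the abstract refers to.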