Breaking the self-averaging properties of spatial galaxy fluctuations in the Sloan Digital Sky Survey - Data Release Six
Statistical analyses of finite sample distributions usually assume that
fluctuations are self-averaging, i.e. that they are statistically similar in
different regions of the given sample volume. By using the scale-length method,
we test whether this assumption is satisfied in several samples of the Sloan
Digital Sky Survey Data Release Six. We find that the probability density
function (PDF) of conditional fluctuations, filtered on large enough spatial
scales (i.e., r>30 Mpc/h), shows significant systematic variations in different
sub-volumes of the survey. Instead, for scales r<30 Mpc/h the PDF is statistically stable, and its first moment exhibits power-law scaling with a negative exponent close to one. Thus, while galaxy structures have
well-defined power-law correlations, on larger scales it is not possible to
consider whole sample average quantities as meaningful and useful statistical
descriptors. This situation is due to the fact that galaxy structures
correspond to density fluctuations which are too large in amplitude and too
extended in space to be self-averaging on such large scales inside the sample
volumes: the galaxy distribution is inhomogeneous up to the largest scales probed by the SDSS samples, i.e. r ~ 100 Mpc/h. We show that cosmological corrections, such as K-corrections and standard evolutionary corrections, do not qualitatively change these behaviors. Finally, we show that the large-amplitude galaxy
fluctuations observed in the SDSS samples are at odds with the predictions of
the standard LCDM model of structure formation. (Abridged version.)
Comment: 32 pages, 28 figures, accepted for publication in Astronomy and Astrophysics. A higher resolution version is available at http://pil.phys.uniroma1.it/~sylos/fsl_highlights.html . Version v2 has been corrected to match the published version.
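The scale-length analysis above rests on the conditional density <n(r)>, measured in spheres centered on galaxies. A minimal illustrative sketch (toy Poisson sample, not SDSS data; all parameters are hypothetical): a homogeneous sample gives a flat <n(r)>, whereas the structures discussed in the abstract give <n(r)> ~ r^-1 up to ~30 Mpc/h.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy sample: Poisson points in a (100 Mpc/h)^3 box, purely illustrative.
gal = rng.uniform(0.0, 100.0, size=(1500, 3))
dist = np.linalg.norm(gal[:, None, :] - gal[None, :, :], axis=-1)
# Restrict centers to an inner region so spheres of r <= 20 fit in the box.
inner = np.all((gal > 20.0) & (gal < 80.0), axis=1)

def conditional_density(r):
    """<n(r)>: mean neighbor count within r of each center / sphere volume."""
    counts = (dist[inner] < r).sum(axis=1) - 1   # exclude the center itself
    return counts.mean() / (4.0 / 3.0 * np.pi * r**3)

radii = np.array([5.0, 10.0, 20.0])
nr = np.array([conditional_density(r) for r in radii])
# Slope of log <n(r)> vs log r: ~0 for a homogeneous (Poisson) sample;
# the clustered structures in the abstract instead show an exponent near one.
gamma = -np.polyfit(np.log(radii), np.log(nr), 1)[0]
```

In practice a KD-tree would replace the brute-force distance matrix, and the estimator would be applied to volume-limited survey sub-samples rather than a toy box.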
A Lightweight Distributed Solution to Content Replication in Mobile Networks
Performance and reliability of content access in mobile networks is
conditioned by the number and location of content replicas deployed at the
network nodes. Facility location theory has been the traditional, centralized
approach to study content replication: computing the number and placement of
replicas in a network can be cast as an uncapacitated facility location
problem. The endeavour of this work is to design a distributed, lightweight
solution to the above joint optimization problem, while taking into account the
network dynamics. In particular, we devise a mechanism that lets nodes share
the burden of storing and providing content, so as to achieve load balancing,
and decide whether to replicate or drop the information so as to adapt to a
dynamic content demand and time-varying topology. We evaluate our mechanism
through simulation, by exploring a wide range of settings and studying
realistic content access mechanisms that go beyond the traditional assumption of matching demand points to their closest content replica. Results show
that our mechanism, which uses local measurements only, is: (i) extremely
precise in approximating an optimal solution to content placement and
replication; (ii) robust against network mobility; (iii) flexible in
accommodating various content access patterns, including variation in time and
space of the content demand.
Comment: 12 pages
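The abstract casts replica placement as an uncapacitated facility location (UFL) problem. As a point of reference only, the centralized greedy heuristic that such distributed mechanisms are typically compared against can be sketched as follows (costs and names are illustrative, not the paper's mechanism):

```python
def greedy_ufl(open_cost, access_cost):
    """Greedy heuristic for uncapacitated facility location.

    open_cost[i]   -- cost of hosting a replica at node i.
    access_cost[j][i] -- cost for demand point j to fetch content from node i.
    Repeatedly opens the replica site that most reduces total cost,
    stopping when no single addition helps.
    """
    n = len(open_cost)
    opened = set()

    def total(s):
        if not s:
            return float("inf")          # some replica must exist
        return (sum(open_cost[i] for i in s)
                + sum(min(row[i] for i in s) for row in access_cost))

    while True:
        best, best_cost = None, total(opened)
        for i in range(n):
            if i in opened:
                continue
            c = total(opened | {i})
            if c < best_cost:
                best, best_cost = i, c
        if best is None:
            return sorted(opened), total(opened)
        opened.add(best)

# Three co-located node/demand pairs with cheap hosting: opening all
# three replicas is optimal here.
sites, cost = greedy_ufl([3, 3, 3], [[0, 10, 10], [10, 0, 10], [10, 10, 0]])
```

The distributed mechanism in the paper approximates this optimum using local measurements only, without the global cost table the greedy heuristic assumes.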
TAO Conceptual Design Report: A Precision Measurement of the Reactor Antineutrino Spectrum with Sub-percent Energy Resolution
The Taishan Antineutrino Observatory (TAO, also known as JUNO-TAO) is a
satellite experiment of the Jiangmen Underground Neutrino Observatory (JUNO). A
ton-level liquid scintillator detector will be placed at about 30 m from a core
of the Taishan Nuclear Power Plant. The reactor antineutrino spectrum will be
measured with sub-percent energy resolution, to provide a reference spectrum
for future reactor neutrino experiments, and to provide a benchmark measurement
to test nuclear databases. A spherical acrylic vessel containing 2.8 tons of gadolinium-doped liquid scintillator will be viewed by 10 m^2 of Silicon Photomultipliers (SiPMs) with >50% photon detection efficiency and almost full coverage. The photoelectron yield is about 4500 per MeV, an order of magnitude higher than
that of any existing large-scale liquid scintillator detector. The detector operates at -50 degrees C to lower the dark noise of the SiPMs to an acceptable level. The
detector will measure about 2000 reactor antineutrinos per day, and is designed
to be well shielded from cosmogenic backgrounds and ambient radioactivity to achieve a background-to-signal ratio of about 10%. The experiment is expected to start operation in 2022.
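The quoted light yield fixes the photon-counting floor of the energy resolution. A back-of-envelope sketch of the stochastic term only (the real detector must also control noise and non-uniformity contributions, which are ignored here):

```python
import math

PE_PER_MEV = 4500  # photoelectron yield quoted in the abstract

def stochastic_resolution(energy_mev):
    """Poisson photon-counting limit: sigma_E / E = 1 / sqrt(N_pe).
    The full resolution also includes noise and non-uniformity terms."""
    return 1.0 / math.sqrt(PE_PER_MEV * energy_mev)

# ~1.5% at 1 MeV; the stochastic term alone drops below 1% above ~2.2 MeV,
# consistent with a sub-percent target over much of the reactor spectrum.
res_1mev = stochastic_resolution(1.0)
```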
On the Sensor Pattern Noise Estimation in Image Forensics: A Systematic Empirical Evaluation
Extracting a fingerprint of a digital camera has valuable applications in image forensics, such as source camera identification and image authentication. Over the last decade, Photo Response Non-Uniformity (PRNU) has been well established as a reliable, unique fingerprint of digital imaging devices. The PRNU noise appears in every image as a very weak signal, and its reliable estimation is crucial for the success rate of the forensic application. In this paper, we present a novel methodical evaluation of 21 state-of-the-art PRNU estimation/enhancement techniques that have been proposed in the literature in various frameworks. The techniques are classified and systematically compared based on their role/stage in the PRNU estimation procedure, revealing their intrinsic impact. The performance of each technique is extensively demonstrated in a large-scale experiment to conclude this comparative study. The experiments were conducted on a database we created and on a public image database, the 'Dresden Image Database'.
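The baseline PRNU pipeline that the surveyed techniques refine can be sketched minimally: compute a noise residual W = I - denoise(I) per image, then combine residuals with the maximum-likelihood-style estimator K = sum(W_i * I_i) / sum(I_i^2). The toy below uses a mean filter as a stand-in for the wavelet denoiser common in the literature, and synthetic flat-field frames rather than real camera data:

```python
import numpy as np

rng = np.random.default_rng(1)

def denoise(img):
    """3x3 mean filter -- a stand-in; PRNU work typically uses a
    wavelet-based denoiser instead."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0

def estimate_prnu(images):
    """Basic ML-style PRNU estimate: K = sum(W_i * I_i) / sum(I_i^2),
    with noise residual W_i = I_i - denoise(I_i)."""
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img in images:
        w = img - denoise(img)
        num += w * img
        den += img * img
    return num / den

# Synthetic check: flat frames modulated by a known multiplicative
# pattern k plus shot-like noise; the estimate should correlate with k.
k = 0.02 * rng.standard_normal((32, 32))
frames = [100.0 * (1.0 + k) + rng.standard_normal((32, 32)) for _ in range(50)]
k_hat = estimate_prnu(frames)
corr = np.corrcoef(k.ravel(), k_hat.ravel())[0, 1]
```

The enhancement techniques compared in the paper intervene at different stages of exactly this pipeline (denoising, residual weighting, post-processing of K), which is why the survey classifies them by role/stage.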