WOODCUT BLOCKS OF THE "SCUOLA DEL LIBRO" OF URBINO: A SCIENTIFIC APPROACH FOR CONSERVATION
M.L. Amadori, N. Macchioni, G. Adami, C. Capretti
Cluster luminosity function and n^th ranked magnitude as a distance indicator
We define a standard candle to determine the distances of clusters of
galaxies and to investigate their peculiar velocities, based on the n^{th} ranked
galaxy (magnitude m). We address the question of the universality of the
luminosity function for a sample of 28 rich clusters of galaxies, in order to
model the influence of cluster richness. This luminosity function is found to
be universal, and the fit of a Schechter profile gives the parameters in the
magnitude range [-21,-17]. The uncorrected distance indicator is more efficient
for the first ranks n. With n=5, we obtain a dispersion of 0.61 magnitude for
the (m, 5 log(cz)) relation. When we correct for the richness effect and
subtract the background galaxies, we reduce the uncertainty to 0.21 magnitude
with n=15. Simulations show that a large part of this dispersion originates
from the intrinsic scatter of the standard candle itself. These provide upper
bounds on the amplitude of cluster radial peculiar motions: at a confidence
level of 90%, the dispersion is 0.13 magnitude and the radial peculiar velocity
is limited to 1200 km/s for our sample of clusters.
Comment: 9 pages, 7 postscript figures, LaTeX A&A, accepted in A&A
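The standard-candle idea above can be sketched numerically. The following toy simulation is illustrative only (the cluster count, intrinsic scatter of 0.3 mag, and H0 = 70 km/s/Mpc are assumptions, not the paper's values): clusters are placed on the Hubble flow, the n-th ranked magnitude carries some intrinsic scatter, and the residuals of the (m, 5 log cz) relation recover that scatter.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 3.0e5  # speed of light, km/s

# Toy clusters: redshifts and an absolute magnitude for the n-th ranked
# galaxy with injected intrinsic scatter (all values illustrative).
n_clusters = 200
z = rng.uniform(0.02, 0.1, n_clusters)
M_n = -21.0 + rng.normal(0.0, 0.3, n_clusters)   # intrinsic scatter 0.3 mag

# Apparent n-th ranked magnitude from the Hubble-flow distance modulus:
# m = M + 25 + 5 log10(c z / H0), with H0 in km/s/Mpc (assumed value).
H0 = 70.0
m_n = M_n + 25.0 + 5.0 * np.log10(c * z / H0)

# Residuals about the (m, 5 log cz) relation: their dispersion is the
# standard candle's intrinsic scatter, up to sampling noise.
resid = m_n - 5.0 * np.log10(c * z)
print(resid.std())  # ~0.3 mag, the injected scatter
```

In this idealized setup (no richness effect, no background contamination) the relation's dispersion comes entirely from the candle itself, which is the situation the paper's simulations isolate.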
Reduction criterion for separability
We introduce a separability criterion based on the positive map Λ: σ → (Tr σ)·1 − σ, where σ is a trace-class Hermitian operator. Any separable state is mapped by the tensor product of Λ and the identity into a non-negative operator, which provides a simple necessary condition for separability. This condition is generally not sufficient, because it is vulnerable to the dilution of entanglement. In the special case where one subsystem is a quantum bit, Λ reduces to time reversal, so that this separability condition is equivalent to partial transposition. It is therefore also sufficient for 2×2 and 2×3 systems. Finally, a simple connection is displayed between this map for two qubits and complex conjugation in the "magic" basis [Phys. Rev. Lett. 78, 5022 (1997)].
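The criterion is easy to check numerically: for a bipartite state ρ, separability requires ρ_A ⊗ 1 − ρ ≥ 0, where ρ_A is the reduced state. A minimal numpy sketch (the state choices are our own examples, not the paper's) verifies this for a product state and shows the violation for a Bell state:

```python
import numpy as np

def reduction_test(rho, dA, dB):
    """Minimum eigenvalue of rho_A (x) 1 - rho.

    Non-negative for every separable state (reduction criterion);
    a negative value certifies entanglement.
    """
    r = rho.reshape(dA, dB, dA, dB)
    rho_A = np.trace(r, axis1=1, axis2=3)         # partial trace over B
    op = np.kron(rho_A, np.eye(dB)) - rho
    return np.linalg.eigvalsh(op).min()

# Separable product state |0><0| (x) |+><+|
ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_sep = np.kron(np.outer(ket0, ket0), np.outer(plus, plus))

# Maximally entangled Bell state (|00> + |11>)/sqrt(2)
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_bell = np.outer(bell, bell)

print(reduction_test(rho_sep, 2, 2))   # >= 0: criterion satisfied
print(reduction_test(rho_bell, 2, 2))  # -0.5: entanglement detected
```

For the Bell state, ρ_A = 1/2, so ρ_A ⊗ 1 − ρ has eigenvalue 1/2 − 1 = −1/2 on the Bell vector, consistent with the 2×2 case being equivalent to partial transposition.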
Holographic Heat engine within the framework of massive gravity
Heat engine models are constructed within the framework of massive gravity in
this paper. For the four-dimensional charged black holes in massive gravity, it
is shown that the heat engines have a higher efficiency when the graviton mass
is nonzero than when it vanishes; for a specific example, we compare the
maximum attainable efficiency with that of the massless case. The existence of
a graviton mass improves the heat engine efficiency significantly. The
situation is more complicated for the five-dimensional neutral black holes: not
only does the graviton mass exert influence on the efficiency, but the coupling
constant corresponding to the third massive potential also contributes to the
efficiency. By studying the ratio of the engine efficiency to the Carnot
efficiency, we also probe how massive gravity influences how closely the heat
engine efficiency approaches the Carnot efficiency.
Comment: 9 pages, 4 figures
Redshifts in the Southern Abell Redshift Survey Clusters. I. The Data
The Southern Abell Redshift Survey contains 39 clusters of galaxies with
redshifts in the range 0.0 < z < 0.31 and a median redshift depth of z =
0.0845. SARS covers a region of the southern sky (while avoiding the LMC and
SMC) at Galactic latitude |b| > 40. Cluster locations were chosen from the
Abell and Abell-Corwin-Olowin catalogs while galaxy positions were selected
from the Automatic Plate Measuring Facility galaxy catalog with
extinction-corrected magnitudes in the range 15 <= b_j < 19. SARS utilized the
Las Campanas 2.5 m du Pont telescope, observing either 65 or 128 objects
concurrently over a 1.5 sq deg field. New redshifts for 3440 galaxies are
reported in the fields of these 39 clusters of galaxies.
Comment: 20 pages, 5 figures, accepted for publication in the Astronomical
Journal. Table 2 can be downloaded in its entirety from
http://trotsky.arc.nasa.gov/~mway/SARS1/sars1-table2.cs
The use of Minimal Spanning Tree to characterize the 2D cluster galaxy distribution
We use the Minimal Spanning Tree to characterize the aggregation level of
given sets of points. We test 3 distances based on the histogram of the MST
edges to discriminate between the distributions. We calibrate the method using
artificial sets following Poisson, King or NFW distributions. The distance
using the mean, the dispersion and the skewness of the histogram of MST edges
provides the most efficient results. We apply this distance to a subsample of
the ENACS clusters and show that the bright galaxies are significantly more
aggregated than the faint ones. The contamination from uniformly distributed
field galaxies is negligible. On the other hand, we show that the presence of
clustered groups along the same cluster line of sight masks the variation of
the distance with the considered magnitude.
Comment: 9 pages, 7 postscript figures, LaTeX A&A, accepted in A&A
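The MST edge-length statistics used above are straightforward to compute with scipy. The sketch below (our own illustration; the point counts, group sizes, and group scale 0.02 are assumptions) contrasts a uniform Poisson field with a clustered set: clustering shortens the typical MST edge and skews the edge-length histogram through the few long bridges between groups.

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_edge_stats(points):
    """Mean, dispersion and skewness of the MST edge-length histogram."""
    d = distance_matrix(points, points)
    mst = minimum_spanning_tree(d)       # sparse matrix with N-1 edges
    edges = mst.data
    m, s = edges.mean(), edges.std()
    skew = ((edges - m) ** 3).mean() / s ** 3
    return m, s, skew

rng = np.random.default_rng(1)
poisson = rng.uniform(0, 1, (300, 2))                  # uniform field
centers = rng.uniform(0, 1, (10, 2))
clustered = (centers[rng.integers(0, 10, 300)]
             + rng.normal(0, 0.02, (300, 2)))          # 10 tight groups

print(mst_edge_stats(poisson))    # longer mean edge, mild skewness
print(mst_edge_stats(clustered))  # shorter edges, strongly skewed tail
```

A (mean, dispersion, skewness) triple per point set is exactly the kind of summary on which a discriminating distance between distributions can be defined, as in the calibration against Poisson, King and NFW sets described above.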
Image Coding with Face Descriptors Embedding
Content descriptors, useful for browsing and retrieval tasks, are generally extracted and treated as an entity separate from the content itself. At the same time, conventional coding processes do not take into account the information carried by content descriptors. Content descriptors are closely related to the content itself, and they can potentially be used to exploit redundancy in entropy coding processes. Embedding content descriptors in the bitstream can reduce the content-description extraction load and, at the same time, reduce the rate associated with the compressed content and its description. In this paper an effective implementation of this approach is presented, where image descriptors are actively used in the coding process to exploit redundancy. First, image areas containing faces are detected and encoded using a scalable method, where the base layer is represented by the corresponding eigenface and the enhancement layer is formed by the prediction error. The remaining areas are then encoded using a traditional approach. Simulations show that the achievable compression performance is comparable with that of conventional approaches, making the proposed approach very convenient for source coding and content description.
A. Boschetti, N. Adami, R. Leonardi, M. Okuda
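The eigenface base layer plus prediction-error enhancement layer can be sketched with a small PCA, shown below. This is our own minimal illustration under stated assumptions (random 16×16 "face" patches, 8 eigenfaces), not the paper's codec: the base layer carries only the eigenface coefficients, and adding the residual restores the block losslessly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "faces": 50 training patches of 16x16 pixels, flattened (assumed data).
train = rng.normal(0, 1, (50, 256))
mean_face = train.mean(axis=0)

# Eigenfaces: top-k principal components of the mean-centered training set.
k = 8
_, _, vt = np.linalg.svd(train - mean_face, full_matrices=False)
eigenfaces = vt[:k]                              # shape (k, 256)

def encode(face):
    """Split a face block into a base layer (eigenface coefficients)
    and an enhancement layer (prediction error)."""
    coeffs = eigenfaces @ (face - mean_face)     # base layer: k numbers
    base = mean_face + eigenfaces.T @ coeffs     # decoder-side prediction
    residual = face - base                       # enhancement layer
    return coeffs, residual

face = train[0]
coeffs, residual = encode(face)
# Base layer alone gives a coarse face; base + residual is exact.
restored = mean_face + eigenfaces.T @ coeffs + residual
assert np.allclose(restored, face)
```

The rate saving comes from the residual being far smaller than the raw block once the eigenface prediction is subtracted, while the coefficients double as a reusable content descriptor.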
Validation instruments for health promotion in the community pharmacy setting
The developments of the past fifty years have resulted in a complete shift in the role of the
community pharmacist, from mainly compounding medicines to acting as an advisor on health-related issues (Schaefer, 1998). This shift highlighted the role of the
pharmacist as the initial contact point for the provision of primary health care. An initiative undertaken in the United Kingdom in 1995, 'Pharmacy in a New Age', identified health promotion as one
of the areas that community pharmacists should focus more on (Royal Pharmaceutical Society of
Great Britain, 1996). In this day and age of cost containment, evidence-based practice is required to confirm the provision of professional services, including the provision of health promotion (Rupp, 1997).
This prompted the development of the Validation Method for Community Pharmacy, a
process carried out to confirm the effectiveness of the pharmacist in the community setting
(Azzopardi, 2000).