
    The “Pieve di Santa Maria” in Arezzo (Italy). From the Laser Scanner Survey to the Knowledge of the Architectural Structure

    The parish church of “Santa Maria” is considered one of the most important medieval buildings of Arezzo. Although the church is attested from the 11th century, it was between the 13th and 14th centuries that it reached its current form, characterized by a distinctive façade with small columns on several levels and an imposing bell tower. Later, from the 16th to the 18th century, the church underwent profound transformations, which were almost completely erased by extensive restoration works in the second half of the 19th century. The architectural survey of the parish church of “Santa Maria” was carried out with a phase-shift laser scanner (Z+F 5006h) and a digital reflex camera. 189 scans were performed to generate the 3D model of the church: 180 of them at high density and normal quality, lasting 5-6 minutes each; the remaining ones at super high density and high quality, lasting 13-14 minutes each. Vectorial drawings of plans and sections were then created from the 3D model. Thanks to the laser scanner survey of the church, it was possible to highlight the singularity of the structure of the basilical body and the transept: the tilt of walls and columns, the variations in wall thickness, the considerable deformations of some arches, and the cracks and textures of the wall facing were thus shown. The information obtained attests to an architectural structure shaped by the complex construction events that have affected this building over time. These constructive singularities involve the medieval genesis of the building, its transformations during the following centuries, and the subsequent restoration works. Such composite features are specific to, and common to, every ancient building. This peculiar epistemological condition eschews simplification and requires deep, complex studies closely linked to the problems of conserving the structures.
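    The scan counts and per-scan durations quoted above imply a rough acquisition-time budget; as a small sketch (using the midpoints of the quoted duration ranges, which are assumptions, and treating the nine remaining scans as the super-high-density ones):

```python
# Rough acquisition-time budget for the 189 scans, using the midpoints
# of the per-scan durations quoted in the abstract (5-6 min and 13-14 min).
normal_scans, normal_minutes = 180, 5.5   # high density, normal quality
high_scans, high_minutes = 9, 13.5        # super high density, high quality

total_minutes = normal_scans * normal_minutes + high_scans * high_minutes
print(f"Total scanning time: {total_minutes:.0f} min (~{total_minutes/60:.1f} h)")
```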

    Bulk Etch Rate Measurements and Calibrations of Plastic Nuclear Track Detectors

    New calibrations of CR39 and Makrofol nuclear track detectors have been obtained using 158 A GeV Pb (82+) and In (49+) ions; a new method for bulk etch rate determination, using both cone height and base diameter measurements, was developed. The CR39 charge resolution based on the etch-pit base area measurement is adequate to identify nuclear fragments in the interval 7 <= Z/beta <= 49. For CR39 the detection threshold is at REL ~ 50 MeV cm^2/g, corresponding to a nuclear fragment with Z/beta ~ 7. Base cone area distributions for Makrofol foils exposed to Pb (82+) ions have shown for the first time all peaks due to nuclear fragments with Z > 50; the distribution of the etched cone heights shows well separated individual peaks for Z/beta = 78 - 83 (charge pickup). The Makrofol detection threshold is at REL ~ 2700 MeV cm^2/g, corresponding to a nuclear fragment with Z/beta ~ 50. Comment: 11 pages, 5 EPS figures. Submitted to Nucl. Instr. Meth.
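    For a conical etch pit, standard track-etch geometry (h = v_B t (V - 1), D = 2 v_B t sqrt((V-1)/(V+1)), with V = v_T/v_B) can be inverted to give the bulk etch rate from the measured cone height h and base diameter D. The closed form below is a reconstruction from that geometry, not necessarily the exact formula of the paper:

```python
import math

def bulk_etch_rate(h, D, t):
    """Bulk etch rate v_B from cone height h, base diameter D, and
    etching time t, assuming standard conical etch-pit geometry:
    h = v_B*t*(V-1), D = 2*v_B*t*sqrt((V-1)/(V+1)), V = v_T/v_B.
    Eliminating V yields v_B = D*(D + sqrt(D^2 + 4h^2)) / (4*h*t)."""
    return D * (D + math.sqrt(D**2 + 4 * h**2)) / (4 * h * t)

# Consistency check: with v_B = 1 um/h, t = 1 h and V = 2, the forward
# relations give h = 1 um and D = 2/sqrt(3) um; inverting recovers v_B.
h, D = 1.0, 2 / math.sqrt(3)
print(round(bulk_etch_rate(h, D, 1.0), 6))  # 1.0
```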

    New MACRO results on atmospheric neutrino oscillations

    The final results of the MACRO experiment on atmospheric neutrino oscillations are presented and discussed. The data concern different event topologies with average neutrino energies of ~3 and ~50 GeV. Multiple Coulomb Scattering of the high energy muons in absorbers was used to estimate the neutrino energy of each event. The angular distributions, the L/E_nu distribution, the particle ratios and the absolute fluxes all favour nu_mu --> nu_tau oscillations with maximal mixing and Delta m^2 = 0.0023 eV^2. The Monte Carlo simulations used for the atmospheric neutrino flux are discussed. Some results on neutrino astrophysics are also briefly discussed. Comment: Invited Paper at the NANP03 Int. Conf., Dubna, 2003
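    The quoted best-fit parameters can be plugged into the standard two-flavour oscillation formula, where the probability depends on L/E (baseline over neutrino energy), which is why the L/E_nu distribution is a key observable. A minimal sketch:

```python
import math

def p_mutau(L_over_E, sin2_2theta=1.0, dm2=0.0023):
    """Two-flavour nu_mu -> nu_tau oscillation probability,
    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L/E),
    with L/E in km/GeV and Delta m^2 in eV^2.
    Defaults are the maximal-mixing best fit quoted above."""
    return sin2_2theta * math.sin(1.27 * dm2 * L_over_E) ** 2

# First oscillation maximum at L/E = pi / (2 * 1.27 * dm2) ~ 538 km/GeV:
print(round(p_mutau(math.pi / (2 * 1.27 * 0.0023)), 6))  # 1.0
```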

    Results of the Search for Strange Quark Matter and Q-balls with the SLIM Experiment

    The SLIM experiment at the Chacaltaya high altitude laboratory was sensitive to nuclearites and Q-balls, which could be present in the cosmic radiation as possible Dark Matter components. It was also sensitive to strangelets, i.e. small lumps of Strange Quark Matter predicted at such altitudes by various phenomenological models. The analysis of 427 m^2 of Nuclear Track Detectors exposed for 4.22 years showed no candidate event. New upper limits on the flux of downgoing nuclearites and Q-balls at the 90% C.L. were established. The null result also restricts models of strangelet propagation through the Earth's atmosphere. Comment: 14 pages, 11 EPS figures
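    As a rough order-of-magnitude sketch (not the published analysis, which folds in detection efficiency and the zenith-angle dependence of the acceptance), the zero-candidate exposure above translates into a 90% C.L. flux limit via the Feldman-Cousins factor of 2.44 events, here hypothetically assuming full efficiency and a 2*pi sr downgoing acceptance:

```python
import math

# Exposure quoted above: 427 m^2 of detectors for 4.22 years.
area_cm2 = 427 * 1e4            # 427 m^2 in cm^2
time_s = 4.22 * 3.156e7         # 4.22 yr in seconds
solid_angle = 2 * math.pi       # downgoing hemisphere (assumption)

# 2.44 = Feldman-Cousins 90% C.L. upper limit on the mean for 0 observed events.
flux_limit = 2.44 / (area_cm2 * solid_angle * time_s)
print(f"{flux_limit:.1e} cm^-2 s^-1 sr^-1")  # ~6.8e-16
```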

    A Complete Skull of an Early Cretaceous Sauropod and the Evolution of Advanced Titanosaurians

    Advanced titanosaurian sauropods, such as nemegtosaurids and saltasaurids, were diverse and one of the most important groups of herbivores in the terrestrial biotas of the Late Cretaceous. However, little is known about their rise and diversification prior to the Late Cretaceous. Furthermore, study of the evolution of their highly modified skull anatomy has been largely hindered by the scarcity of well-preserved cranial remains. A new sauropod dinosaur from the Early Cretaceous of Brazil represents the earliest advanced titanosaurian known to date, demonstrating that the initial diversification of advanced titanosaurians was well under way at least 30 million years before their known radiation in the latest Cretaceous. The new taxon also preserves the most complete skull among titanosaurians, further revealing that their low and elongated diplodocid-like skull morphology appeared much earlier than previously thought.

    Euclid preparation. XXV. The Euclid Morphology Challenge -- Towards model-fitting photometry for billions of galaxies

    The ESA Euclid mission will provide high-quality imaging for about 1.5 billion galaxies. A software pipeline to automatically process and analyse such a huge amount of data in real time is being developed by the Science Ground Segment of the Euclid Consortium; this pipeline will include a model-fitting algorithm, which will provide photometric and morphological estimates of paramount importance for the core science goals of the mission and for legacy science. The Euclid Morphology Challenge is a comparative investigation of the performance of five model-fitting software packages on simulated Euclid data, aimed at providing the baseline to identify the best suited algorithm to be implemented in the pipeline. In this paper we describe the simulated data set, and we discuss the photometry results. A companion paper (Euclid Collaboration: Bretonnière et al. 2022) is focused on the structural and morphological estimates. We created mock Euclid images simulating five fields of view of 0.48 deg^2 each in the I_E band of the VIS instrument, each with three realisations of galaxy profiles (single and double Sérsic, and 'realistic' profiles obtained with a neural network); for one of the fields in the double Sérsic realisation, we also simulated images for the three near-infrared Y_E, J_E and H_E bands of the NISP-P instrument, and five Rubin/LSST optical complementary bands (u, g, r, i, and z). To analyse the results we created diagnostic plots and defined ad-hoc metrics. Five model-fitting software packages (DeepLeGATo, Galapagos-2, Morfometryka, ProFit, and SourceXtractor++) were compared, all typically providing good results. (cut) Comment: 29 pages, 33 figures. Euclid pre-launch key paper. Companion paper: Bretonnière et al. 2022
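    The single-Sérsic galaxy profiles mentioned above follow the standard Sérsic surface-brightness law; a minimal sketch, using the common approximation b_n ≈ 2n − 1/3 (adequate for n ≳ 1, and an assumption here — the challenge simulations may use the exact b_n):

```python
import math

def sersic(R, I_e, R_e, n):
    """Sersic surface-brightness profile
    I(R) = I_e * exp(-b_n * ((R/R_e)**(1/n) - 1)),
    where R_e is the half-light radius, I_e the intensity at R_e,
    n the Sersic index, and b_n ~ 2n - 1/3 (approximation)."""
    b_n = 2 * n - 1.0 / 3.0
    return I_e * math.exp(-b_n * ((R / R_e) ** (1.0 / n) - 1.0))

# By construction, I(R_e) = I_e for any Sersic index n:
print(sersic(1.0, I_e=100.0, R_e=1.0, n=4))  # 100.0
```

A model-fitting code varies (I_e, R_e, n) plus position and ellipticity until the convolved model matches the pixel data; the challenge compares five such implementations.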

    Euclid preparation: XXII. Selection of Quiescent Galaxies from Mock Photometry using Machine Learning

    The Euclid Space Telescope will provide deep imaging at optical and near-infrared wavelengths, along with slitless near-infrared spectroscopy, across ~15,000 sq deg of the sky. Euclid is expected to detect ~12 billion astronomical sources, facilitating new insights into cosmology, galaxy evolution, and various other topics. To optimally exploit the expected very large data set, there is the need to develop appropriate methods and software. Here we present a novel machine-learning based methodology for selection of quiescent galaxies using broad-band Euclid I_E, Y_E, J_E, H_E photometry, in combination with multiwavelength photometry from other surveys. The ARIADNE pipeline uses meta-learning to fuse decision-tree ensembles, nearest-neighbours, and deep-learning methods into a single classifier that yields significantly higher accuracy than any of the individual learning methods separately. The pipeline has `sparsity-awareness', so that missing photometry values are still informative for the classification. Our pipeline derives photometric redshifts for galaxies selected as quiescent, aided by the `pseudo-labelling' semi-supervised method. After application of the outlier filter, our pipeline achieves a normalized mean absolute deviation of ~< 0.03 and a fraction of catastrophic outliers of ~< 0.02 when measured against the COSMOS2015 photometric redshifts. We apply our classification pipeline to mock galaxy photometry catalogues corresponding to three main scenarios: (i) Euclid Deep Survey with ancillary ugriz, WISE, and radio data; (ii) Euclid Wide Survey with ancillary ugriz, WISE, and radio data; (iii) Euclid Wide Survey only. Our classification pipeline outperforms UVJ selection, in addition to the Euclid I_E-Y_E, J_E-H_E and u-I_E, I_E-J_E colour-colour methods, with improvements in completeness and the F1-score of up to a factor of 2. (Abridged) Comment: 37 pages (including appendices), 26 figures; accepted for publication in Astronomy & Astrophysics
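    The normalized mean absolute deviation quoted above is a standard photo-z scatter metric; one common definition (assumed here, without the median-offset subtraction some authors apply) is 1.48 × median(|z_phot − z_ref| / (1 + z_ref)):

```python
import statistics

def nmad(z_phot, z_ref):
    """Normalised median absolute deviation of photo-z residuals:
    1.48 * median(|z_phot - z_ref| / (1 + z_ref)).
    The 1.48 factor makes it comparable to a Gaussian sigma."""
    scaled = [abs(zp - zr) / (1 + zr) for zp, zr in zip(z_phot, z_ref)]
    return 1.48 * statistics.median(scaled)

# Hypothetical example: four galaxies with small photo-z errors.
z_ref = [0.5, 1.0, 1.5, 2.0]
z_phot = [0.52, 0.98, 1.55, 1.97]
print(round(nmad(z_phot, z_ref), 4))  # 0.0173
```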

    Euclid preparation. XXXI. The effect of the variations in photometric passbands on photometric-redshift accuracy

    The technique of photometric redshifts has become essential for the exploitation of multi-band extragalactic surveys. While the requirements on photo-zs for the study of galaxy evolution mostly pertain to the precision and to the fraction of outliers, the most stringent requirement in their use in cosmology is on the accuracy, with a level of bias at the sub-percent level for the Euclid cosmology mission. A separate, and challenging, calibration process is needed to control the bias at this level of accuracy. The bias in photo-zs has several distinct origins that may not always be easily overcome. We identify here one source of bias linked to the spatial or time variability of the passbands used to determine the photometric colours of galaxies. We first quantified the effect as observed on several well-known photometric cameras, and found in particular that, due to the properties of optical filters, the redshifts of off-axis sources are usually overestimated. We show using simple simulations that the detailed and complex changes in the shape can be mostly ignored and that it is sufficient to know the mean wavelength of the passbands of each photometric observation to correct almost exactly for this bias; the key point is that this mean wavelength is independent of the spectral energy distribution of the source. We use this property to propose a correction that can be efficiently implemented in some photo-z algorithms, in particular template-fitting. We verified that our algorithm, implemented in the new photo-z code Phosphoros, can effectively reduce the bias in photo-zs on real data using the CFHTLS T007 survey, with an average measured bias Delta z over the redshift range 0.4<z<0.7 decreasing by about 0.02, specifically from Delta z ~ 0.04 to Delta z ~ 0.02 around z=0.5. Our algorithm is also able to produce corrected photometry for other applications. Comment: 19 pages, 13 figures; Accepted for publication in A&A
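    The key quantity above, the throughput-weighted mean wavelength of a passband, is <lambda> = ∫ lambda T(lambda) dlambda / ∫ T(lambda) dlambda. A minimal sketch using a discrete sum over a uniformly sampled, hypothetical filter curve:

```python
def mean_wavelength(wavelengths, throughput):
    """Throughput-weighted mean wavelength of a passband,
    <lambda> = sum(lambda * T) / sum(T), for a uniformly
    sampled filter transmission curve."""
    num = sum(w * t for w, t in zip(wavelengths, throughput))
    den = sum(throughput)
    return num / den

# Hypothetical top-hat filter from 550 to 700 nm: the mean wavelength
# is the band centre, and it shifts if the filter edges shift off-axis.
wl = list(range(500, 751, 10))
T = [1.0 if 550 <= w <= 700 else 0.0 for w in wl]
print(mean_wavelength(wl, T))  # 625.0
```

A photo-z code can then tag each observation with this single number per source position, rather than carrying the full position-dependent filter shape.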

    Study of the effects induced by lead on the emulsion films of the OPERA experiment

    The OPERA neutrino oscillation experiment is based on the use of the Emulsion Cloud Chamber (ECC). In the OPERA ECC, nuclear emulsion films acting as very high precision tracking detectors are interleaved with lead plates providing a massive target for neutrino interactions. We report on studies of the effects arising from the contact between emulsion and lead. A low radioactivity lead is required in order to minimize the number of background tracks in the emulsions and to achieve the required performance in the reconstruction of neutrino events. It was observed that adding other chemical elements to the lead, in order to improve its mechanical properties, may significantly increase the level of radioactivity affecting the emulsions. A detailed study was made in order to choose a lead alloy with good mechanical properties and an appropriate packing technique so as to achieve a sufficiently low effective radioactivity. Comment: 19 pages, 11 figures