3,008 research outputs found

    Star Formation Rates for photometric samples of galaxies using machine learning methods

    Star formation rates (SFRs) are crucial for constraining theories of galaxy formation and evolution. SFRs are usually estimated via spectroscopic observations, which require large amounts of telescope time. We explore an alternative approach based on the photometric estimation of global SFRs for large samples of galaxies, using methods such as automatic parameter-space optimisation and supervised machine learning models. We demonstrate that, with such an approach, accurate multi-band photometry allows reliable SFRs to be estimated. We also investigate how the use of photometric rather than spectroscopic redshifts affects the accuracy of the derived global SFRs. Finally, we provide a publicly available catalogue of SFRs for more than 27 million galaxies extracted from the Sloan Digital Sky Survey Data Release 7. The catalogue is available through the VizieR facility at the following link: ftp://cdsarc.u-strasbg.fr/pub/cats/J/MNRAS/486/1377
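    The workflow the abstract describes — training a supervised regressor on multi-band photometry against known (spectroscopic) SFRs — can be sketched as below. This is a minimal illustration with fully synthetic data: the feature choices, the toy colour–SFR relation, and the random forest model are assumptions for demonstration, not the paper's actual method.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic multi-band photometry: u, g, r, i, z magnitudes for 2000 galaxies.
    n = 2000
    mags = rng.uniform(16.0, 22.0, size=(n, 5))

    # Toy "ground truth": log SFR as a smooth function of two colours plus noise
    # (purely illustrative; not the relation used in the paper).
    colours = mags[:, :-1] - mags[:, 1:]              # u-g, g-r, r-i, i-z
    log_sfr = 1.5 * colours[:, 1] - 0.5 * colours[:, 0] + rng.normal(0.0, 0.1, n)

    # Train a supervised regressor on magnitudes plus colours, as one would on
    # real photometric features with spectroscopically derived SFRs as labels.
    X = np.hstack([mags, colours])
    X_train, X_test, y_train, y_test = train_test_split(X, log_sfr, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    r2 = model.score(X_test, y_test)                  # R^2 on held-out galaxies
    ```

    On real survey data the labels would come from a spectroscopic subsample, and the trained model would then be applied to the purely photometric catalogue.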

    Seeking for the rational basis of the median model: the optimal combination of multi-model ensemble results

    In this paper we present an approach for the statistical analysis of multi-model ensemble results. The models considered here are operational long-range transport and dispersion models, also used for the real-time simulation of pollutant dispersion or of accidental releases of radioactive nuclides. We first introduce the theoretical basis, rooted in Bayes' theorem, and then apply this approach to the analysis of model results obtained during the ETEX-1 exercise. We recover some interesting results, supporting the heuristic approach called the "median model", originally introduced in Galmarini et al. (2004a, b). This approach also provides a way to systematically reduce (and quantify) model uncertainties, thus supporting decision-making and/or regulatory activities in a very effective manner.
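    The "median model" heuristic itself is simple to state: at each receptor, take the member-wise median of the ensemble predictions rather than the mean, which damps the influence of outlier models. A minimal sketch with hypothetical concentration values:

    ```python
    import numpy as np

    # Hypothetical concentration predictions from 5 dispersion models
    # at 4 receptor sites (arbitrary units; values invented for illustration).
    predictions = np.array([
        [1.0, 0.2, 3.1, 0.0],
        [1.4, 0.1, 2.7, 0.1],
        [0.9, 0.3, 3.5, 0.0],
        [5.0, 0.2, 2.9, 0.2],   # one outlier model
        [1.1, 0.2, 3.0, 0.1],
    ])

    # "Median model": the receptor-wise median across ensemble members.
    median_model = np.median(predictions, axis=0)

    # The plain ensemble mean, by contrast, is pulled toward the outlier.
    ensemble_mean = np.mean(predictions, axis=0)
    ```

    At the first receptor the outlier model drags the mean well above the bulk of the ensemble, while the median stays with the majority — the robustness property the paper puts on a Bayesian footing.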

    AIDA, a Modular Web Application for Astronomical Data Analysis and Instrument Monitoring Services

    In the last decade, astronomy has seen the realization of panchromatic surveys, with sophisticated instruments acquiring huge quantities of exceptional-quality data. This creates the need to integrate advanced data-driven science methodologies for the automatic exploration of huge data archives, together with the need for efficient short- and long-term monitoring and diagnostics systems. The goal is to keep the quality of the observations under control and to detect and circumscribe anomalies and malfunctions, facilitating rapid and effective corrections and ensuring the correct maintenance of all components and the good health of the scientific data over time. This requirement is particularly crucial, in both logistical and economic terms, for space-borne observation systems. AIDA (Advanced Infrastructure for Data Analysis) is a portable and modular web application designed to provide an efficient and intuitive software infrastructure supporting the long-term monitoring of data-acquisition systems, diagnostics, and both scientific and engineering data-quality analysis; it is particularly suited to astronomical instruments. Thanks to its modular design, its functionalities can be extended by integrating and customizing monitoring and diagnostics systems, as well as scientific data analysis solutions, including machine/deep learning and data mining techniques and methods. A specialized version of AIDA has recently been adopted as the focal-plane instrument operation diagnostics, analytics and monitoring service within the Science Ground Segment of the Euclid space mission.
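    The kind of modular extension mechanism the abstract alludes to — plugging custom monitoring checks into a common infrastructure — is often implemented with a plugin registry. The sketch below is a generic illustration of that pattern; all names (`register_monitor`, `dark_current_check`, the frame dictionary) are hypothetical and not AIDA's actual API.

    ```python
    from typing import Callable, Dict

    # Registry mapping a monitor name to a check function (hypothetical design).
    MONITORS: Dict[str, Callable[[dict], dict]] = {}

    def register_monitor(name: str):
        """Decorator that registers a diagnostic check under a given name."""
        def wrapper(func: Callable[[dict], dict]) -> Callable[[dict], dict]:
            MONITORS[name] = func
            return func
        return wrapper

    @register_monitor("dark_current")
    def dark_current_check(frame: dict) -> dict:
        # Flag frames whose (illustrative) dark level exceeds a threshold.
        level = frame.get("dark_level", 0.0)
        return {"status": "ok" if level < 0.05 else "alert", "value": level}

    # The hosting infrastructure dispatches incoming telemetry to every
    # registered monitor without knowing their internals.
    result = MONITORS["dark_current"]({"dark_level": 0.01})
    ```

    New diagnostics — including ML-based ones — can then be added by registering another function, without touching the core application.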

    Overview of the Current Literature on the Most Common Neurological Diseases in Dogs with a Particular Focus on Rehabilitation

    Simple Summary: This paper provides an overview of the most common neurological diseases in dogs (intervertebral disc herniation, degenerative myelopathy, fibrocartilaginous embolism, and polyradiculoneuritis), with a main focus on the rehabilitative options and outcomes reported in the recent veterinary literature. The literature appears positively oriented toward the efficacy of the rehabilitation approach, while recommending a careful and prudent choice of the protocol to be applied for the correct recovery of the patient. However, blinded, controlled, prospective studies are still necessary, above all for degenerative myelopathy, fibrocartilaginous embolism, and polyradiculoneuritis. Intervertebral disc herniation, degenerative myelopathy, fibrocartilaginous embolism and polyradiculoneuritis often affect dogs, and physiotherapy may improve the patient's quality of life and/or reduce recovery times. The aim of this review was to evaluate the current scientific evidence on these four neurological diseases and on their physiotherapy approaches. From the analysis of the published articles, it emerged that intervertebral disc herniation can be treated, with different rates of success, through a conservative or a surgical approach followed by physiotherapy. The literature is generally oriented toward the efficacy of the rehabilitation approach in this specific canine disease, often proposing intensive post-surgery physiotherapy for the most severe cases, those with an absence of deep pain perception. When degenerative myelopathy, fibrocartilaginous embolism or polyradiculoneuritis occur, the existing literature supports the use of a physiotherapeutic approach, which can delay the onset and worsening of clinical signs in degenerative myelopathy and bring physical improvement and, sometimes, complete remission in fibrocartilaginous embolism or acute idiopathic polyradiculoneuritis. However, papers on rehabilitation in dogs affected by polyradiculoneuritis are currently limited to single clinical cases, and further blinded, controlled, prospective studies are still advisable for all four neurological diseases.

    On the use of kinetic energy for additive manufacturing: first fatigue test results and a comparison with SLM processing

    Cold spray (CS) is a cold-coating technique in which powder deposition occurs through the high-velocity impact of particles against a substrate and the resulting severe plastic deformation, with the onset of adiabatic shear instability conditions. In the present work, In718 specimens produced by CS and by SLM were considered, subjected to different heat treatments after specimen machining. Specimen characterisation comprised microstructural analysis and the measurement of residual stresses and porosity, while mechanical testing included static tensile tests and axial fatigue tests. The results show characteristics and strength comparable to those of the SLM specimens, suggesting that CS, thanks to its lower process temperature and reduced energy requirements, may become an additive technology alternative or complementary to the more established laser-based technologies.

    Promoting Work in Public Housing: The Effectiveness of Job-Plus

    Measures the effectiveness of employment-related assistance, of rent breaks used as an incentive to work more, and of activities that promote neighbor-to-neighbor support for work in Baltimore, Chattanooga, Dayton, Los Angeles, St. Paul, and Seattle.

    Cross-influence between intra-laminar damages and fibre bridging at the skin-stringer interface in stiffened composite panels under compression

    In this paper, the skin-stringer separation phenomenon that occurs in stiffened composite panels under compression is studied numerically. Since the mode I fracture toughness, and consequently the skin-stringer separation, can be influenced by fibre bridging at the skin-stringer interface, this study compares three material systems with different sensitivities to fibre bridging: a reference material system has been compared, in terms of toughness performance, against two materials with different degrees of sensitivity to fibre bridging. A robust numerical procedure for delamination assessment has been used to simulate the skin-stringer separation. When analysing the global compressive behaviour of the stiffened panel, intra-laminar damage has been considered in conjunction with skin-stringer debonding, in order to evaluate the effect of fibre and matrix breakage on the separation between the skin and the stringer for the three material systems, whose different fibre bridging sensitivities result in different material toughnesses.
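    The effect of fibre bridging on mode I toughness is commonly represented as an R-curve: the apparent toughness rises with crack extension as the bridging zone develops. The sketch below uses a generic exponential R-curve form with invented parameter values; it illustrates the qualitative behaviour only, not the material data or the numerical procedure of the paper.

    ```python
    import numpy as np

    # Illustrative R-curve parameters (hypothetical values, kJ/m^2 and mm).
    G0 = 0.25     # initiation toughness, before bridging develops
    G_ss = 0.90   # steady-state toughness with fully developed bridging
    l_b = 5.0     # characteristic bridging length

    def g_ic(delta_a):
        """Apparent mode I toughness as a function of crack extension Δa."""
        da = np.asarray(delta_a, dtype=float)
        return G0 + (G_ss - G0) * (1.0 - np.exp(-da / l_b))

    # A material insensitive to bridging would keep G_Ic ≈ G0 at all Δa;
    # a bridging-sensitive one climbs toward G_ss as the crack grows.
    ```

    Varying `G_ss` and `l_b` is one simple way to parameterize the "different degrees of sensitivity to fibre bridging" compared in the study.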

    Statistical Characterization and Classification of Astronomical Transients with Machine Learning in the era of the Vera C. Rubin Observatory

    Astronomy has entered the multi-messenger data era, and machine learning has found widespread use in a large variety of applications. The exploitation of synoptic (multi-band and multi-epoch) surveys, such as LSST (the Legacy Survey of Space and Time), requires extensive use of automatic methods for data processing and interpretation. With data volumes in the petabyte domain, the discrimination of time-critical information already exceeds the capabilities of human operators, and crowds of scientists have extreme difficulty managing such amounts of data in multi-dimensional domains. This work focuses on an analysis of critical aspects of the machine learning approach to the classification of variable sky sources, with particular attention to the various types of supernovae, one of the most important subjects of time-domain astronomy due to their crucial role in cosmology. The work is based on a test campaign performed on simulated data. The classification was carried out by comparing the performance of several machine learning algorithms on statistical parameters extracted from the light curves. The results highlight some critical aspects related to data quality and parameter-space characterization, preparatory to building the processing machinery for real-data exploitation in the coming decade.
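    The pipeline the abstract describes — extract statistical parameters from each light curve, then compare classifiers on those features — can be sketched as follows. The light-curve generators, feature set, and classifier choices are all illustrative assumptions on synthetic data, not the paper's simulation set or algorithm list.

    ```python
    import numpy as np
    from scipy.stats import skew
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 100.0, 50)           # observation epochs (days)

    def supernova_like():
        # Toy transient: quiescence, then an exponential decline after onset.
        t0 = rng.uniform(10.0, 30.0)
        flux = np.where(t > t0, np.exp(-(t - t0) / 25.0), 0.0)
        return flux + rng.normal(0.0, 0.05, t.size)

    def periodic_variable():
        # Toy variable star: sinusoid with random period and phase.
        p = rng.uniform(5.0, 20.0)
        return 0.5 + 0.4 * np.sin(2.0 * np.pi * t / p + rng.uniform(0, 2 * np.pi))

    curves = [supernova_like() for _ in range(100)] + \
             [periodic_variable() for _ in range(100)]
    labels = np.array([0] * 100 + [1] * 100)  # 0 = transient, 1 = variable

    # Statistical parameters per light curve: mean, std, amplitude, skewness.
    features = np.array([[c.mean(), c.std(), c.max() - c.min(), skew(c)]
                         for c in curves])

    # Compare algorithms via cross-validation, as in the test campaign.
    scores = {name: cross_val_score(clf, features, labels, cv=5).mean()
              for name, clf in [("RandomForest", RandomForestClassifier(random_state=0)),
                                ("kNN", KNeighborsClassifier())]}
    ```

    On realistic simulations the classes overlap far more in feature space, which is exactly why the characterization of the parameter space becomes the critical step.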