
    Cosmological models with linearly varying deceleration parameter

    We propose a new law for the deceleration parameter that varies linearly with time and includes Berman's law, in which it is constant, as a special case. Our law not only allows one to generalize many exact solutions that were obtained assuming a constant deceleration parameter, but also gives a better fit to the data (from SNIa, BAO, and CMB), particularly concerning the late-time behavior of the universe. According to our law, only spatially closed and flat universes are allowed; in both cases the cosmological fluid we obtain exhibits quintom-like behavior and the universe ends with a big rip, a result consistent with recent cosmological observations. (Comment: 12 pages, 7 figures; some typo corrections; to appear in International Journal of Theoretical Physics.)
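    For orientation, the block below writes out the schematic form such a law can take; the constants q_0 and q_1 are illustrative placeholders rather than the paper's notation.

        % Definition of the deceleration parameter and a linearly varying law:
        \[
          q(t) \equiv -\frac{\ddot a \, a}{\dot a^{2}} = q_0 + q_1 t .
        \]
        % q_1 = 0 recovers Berman's constant-deceleration law q = q_0;
        % q_1 < 0 lets q(t) drop below -1 at late times, the super-accelerated
        % (big-rip-like) regime described in the abstract.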

    QCD ghost f(T)-gravity model

    Within the framework of modified teleparallel gravity, we reconstruct an f(T) model corresponding to the QCD ghost dark energy scenario. For a spatially flat FRW universe containing only pressureless matter, we obtain the time evolution of the torsion scalar T (or the Hubble parameter). We then calculate the effective torsion equation-of-state parameter of the QCD ghost f(T)-gravity model, as well as the deceleration parameter of the universe. Furthermore, we fit the model parameters using the latest observational data, including SNeIa, CMB, and BAO. We also check the viability of our model using a cosmographic analysis approach. Moreover, we investigate the validity of the generalized second law (GSL) of gravitational thermodynamics for our model. Finally, we study the growth rate of matter density perturbations. We conclude that in the QCD ghost f(T)-gravity model the universe begins in a matter-dominated phase and approaches a de Sitter regime at late times, as expected. This model is also consistent with current data, passes the cosmographic test, satisfies the GSL, and fits the growth-factor data as well as the LCDM model does. (Comment: 19 pages, 9 figures, 2 tables. arXiv admin note: substantial text overlap with arXiv:1111.726)
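    As background for this reconstruction, the sketch below records its two standard ingredients; the conventions are the usual ones in the f(T) literature and are stated here as assumptions, not as the paper's exact notation.

        % In a spatially flat FRW background (diagonal tetrad), the torsion
        % scalar reduces to a function of the Hubble parameter alone:
        \[ T = -6 H^{2} . \]
        % The QCD ghost dark-energy density is proportional to H, with a
        % constant alpha of order Lambda_QCD^3:
        \[ \rho_{\rm DE} = \alpha H . \]
        % Reconstructing f(T) then means choosing f so that the torsion sector
        % of the modified Friedmann equation reproduces this rho_DE.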

    Machine learning techniques for analysis of photometric data from the Open Supernova catalog

    The next generation of astronomical surveys will revolutionize our understanding of the Universe, raising unprecedented data challenges in the process. One of them is the impossibility of relying on human scanning to identify unusual or unpredicted astrophysical objects. Moreover, given that most of the available data will be in the form of photometric observations, such characterization cannot rely on the existence of high-resolution spectroscopic observations. The goal of this project is to detect anomalies in the Open Supernova Catalog (http://sne.space/) using machine learning. We will develop a pipeline in which human expertise and modern machine learning techniques can complement each other. Using supernovae as a case study, our proposal is divided into two parts: a first phase in which a strategy and pipeline for identifying anomalous objects are developed, and a second phase in which such anomalous objects are submitted to careful individual analysis. The strategy requires an initial data set with spectroscopic information available for training purposes, but can be applied to a much larger data set for which we only have photometric observations. This project represents an effective strategy to guarantee we shall not overlook exciting new science hidden in the data we fought so hard to acquire.
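    The abstract does not commit to a specific algorithm, so the sketch below shows one common realization of the idea: reduce each light curve to a feature vector and flag the most isolated objects for expert inspection with an isolation forest. The feature matrix here is synthetic and the hyperparameters are arbitrary.

        # A minimal anomaly-hunting sketch, assuming light curves have already
        # been reduced to one fixed-length feature vector per object.
        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(0)
        # Placeholder features: rows = objects, columns = light-curve summaries
        # (e.g. fitted amplitudes, rise/decline times, colors).
        features = rng.normal(size=(1000, 12))

        model = IsolationForest(n_estimators=300, contamination=0.01,
                                random_state=0).fit(features)
        scores = model.score_samples(features)   # lower = more anomalous
        candidates = np.argsort(scores)[:20]     # 20 most isolated objects
        print("objects to inspect by eye:", candidates)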

    Enabling the discovery of fast transients: A kilonova science module for the Fink broker

    We describe the fast transient classification algorithm at the center of the kilonova (KN) science module currently implemented in the Fink broker, and report classification results based on simulated catalogs and real data from the ZTF alert stream. We used noiseless, homogeneously sampled simulations to construct a basis of principal components (PCs). All light curves from a more realistic ZTF simulation were then written as a linear combination of this basis, and the corresponding coefficients were used as features in training a random forest classifier. The same method was applied to long (>30 days) and medium (<30 days) light curves, the latter aimed at simulating the data situation found within the ZTF alert stream. Classification based on long light curves achieved 73.87% precision and 82.19% recall. The medium-baseline analysis resulted in 69.30% precision and 69.74% recall, confirming the robustness of the precision results when limited to 30 days of observations. In both cases, dwarf flares and point Type Ia supernovae were the most frequent contaminants. The final trained model was integrated into the Fink broker and has been distributing fast transients, tagged as KN_candidates, to the astronomical community, especially through the GRANDMA collaboration. We showed that features specifically designed to capture different light-curve behaviors provide enough information to separate fast (KN-like) from slow (non-KN-like) evolving events. This module represents one crucial link in an intricate chain of infrastructure elements for multi-messenger astronomy that is currently being put in place by the Fink broker team in preparation for the arrival of data from the Vera Rubin Observatory Legacy Survey of Space and Time.
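    The pipeline described above has a simple skeleton, sketched below with synthetic arrays: build a PCA basis from smooth templates, project each observed light curve onto it, and train a random forest on the projection coefficients. Shapes, hyperparameters, and labels are illustrative placeholders, not Fink's actual configuration.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import precision_score, recall_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n_epochs = 30                            # one flux value per night

        # Step 1: PCA basis from noiseless, homogeneously sampled templates.
        templates = rng.normal(size=(500, n_epochs))
        pca = PCA(n_components=3).fit(templates)

        # Step 2: each (simulated) light curve becomes its PCA coefficients.
        light_curves = rng.normal(size=(2000, n_epochs))
        labels = rng.integers(0, 2, size=2000)   # 1 = KN-like, 0 = other
        coeffs = pca.transform(light_curves)

        # Step 3: random forest on the coefficients, scored with the same
        # precision/recall metrics quoted in the abstract.
        X_tr, X_te, y_tr, y_te = train_test_split(coeffs, labels, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        print("precision:", precision_score(y_te, pred))
        print("recall:   ", recall_score(y_te, pred))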

    Spatial field reconstruction with INLA: application to IFU galaxy data

    Astronomical observations of extended sources, such as integral field spectroscopy (IFS) cubes, encode autocorrelated spatial structures that cannot be optimally exploited by standard methodologies. This work introduces a novel technique to model IFS data sets, which treats the observed galaxy properties as realizations of an unobserved Gaussian Markov random field. The method is computationally efficient, resilient to the presence of low signal-to-noise regions, and uses the Integrated Nested Laplace Approximation, a fast alternative to Markov chain Monte Carlo, for Bayesian inference. As a case study, we analyse 721 IFS data cubes of nearby galaxies from the CALIFA and PISCO surveys, for which we retrieve maps of the following physical properties: age, metallicity, mass, and extinction. The proposed Bayesian approach, built on a generative representation of the galaxy properties, enables the creation of synthetic images, the recovery of areas with bad pixels, and an increased power to detect structures in data sets subject to substantial noise and/or sparse sampling. A code snippet reproducing the analysis of this paper is available in the COIN toolbox, together with the field reconstructions of the CALIFA and PISCO samples.
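    INLA itself is typically used through its R interface; to convey the underlying idea in the same language as the other sketches here, the toy example below performs exact Gaussian conditioning on a small grid-structured Gaussian Markov random field, filling in unobserved ("bad") pixels from noisy observations. The CAR precision and all parameter values are generic stand-ins, not the model of the paper.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import spsolve

        n = 32                                   # latent field on an n x n grid
        N = n * n
        idx = np.arange(N).reshape(n, n)

        # 4-neighbour adjacency matrix W of the pixel grid.
        rows, cols = [], []
        for i in range(n):
            for j in range(n):
                if i + 1 < n:
                    rows += [idx[i, j], idx[i + 1, j]]
                    cols += [idx[i + 1, j], idx[i, j]]
                if j + 1 < n:
                    rows += [idx[i, j], idx[i, j + 1]]
                    cols += [idx[i, j + 1], idx[i, j]]
        W = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(N, N))
        D = sp.diags(np.asarray(W.sum(axis=1)).ravel())

        tau, rho, sigma = 1.0, 0.95, 0.3         # arbitrary illustrative values
        Q = tau * (D - rho * W)                  # sparse CAR precision of the field

        # Smooth ground truth, observed with noise on a random 60% of pixels.
        rng = np.random.default_rng(2)
        xx, yy = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
        truth = np.sin(3 * xx) * np.cos(2 * yy)
        y = truth.ravel() + sigma * rng.normal(size=N)
        obs = np.flatnonzero(rng.random(N) < 0.6)

        # Exact Gaussian conditioning: posterior precision and mean of the
        # field given the observed pixels (A selects the observed entries).
        A = sp.eye(N, format="csr")[obs]
        Q_post = (Q + (A.T @ A) / sigma**2).tocsc()
        mu = spsolve(Q_post, (A.T @ y[obs]) / sigma**2)
        recon = mu.reshape(n, n)                 # bad pixels are now filled in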

    Transient Classifiers for Fink: Benchmarks for LSST

    The upcoming Legacy Survey of Space and Time (LSST) at the Vera Rubin Observatory is expected to detect a few million transients per night, generating a live alert stream for the entire 10 years of the survey. This stream will be distributed via community brokers, whose task is to select subsets of it and direct them to scientific communities. Given the volume and complexity of the data, machine learning (ML) algorithms will be paramount for this task. We present the infrastructure tests and classification methods developed within the Fink broker in preparation for LSST. This work aims to provide detailed information regarding the underlying assumptions and methods behind each classifier, enabling users to make informed follow-up decisions from Fink photometric classifications. Using simulated data from the Extended LSST Astronomical Time-series Classification Challenge (ELAsTiCC), we showcase the performance of the binary and multi-class ML classifiers available in Fink. These include tree-based classifiers coupled with tailored feature-extraction strategies, as well as deep learning algorithms. We introduce the CBPF Alert Transient Search (CATS), a deep learning architecture specifically designed for this task. Results show that Fink classifiers are able to handle the extra complexity expected from LSST data. CATS achieved 97% accuracy on a multi-class classification task, while our best-performing binary classifier achieved 99% when classifying the Periodic class. ELAsTiCC was an important milestone in preparing the Fink infrastructure to deal with LSST-like data. Our results demonstrate that Fink classifiers are well prepared for the arrival of the new stream; this experience also highlights that transitioning from current infrastructures to Rubin will require significant adaptation of currently available tools.
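    For readers who want to reproduce this kind of benchmark bookkeeping, the sketch below computes overall accuracy and per-class metrics for a multi-class classifier on held-out alerts. The class names and the synthetic 90%-correct predictions are placeholders; the real benchmark uses ELAsTiCC simulations and Fink's own feature pipelines.

        import numpy as np
        from sklearn.metrics import accuracy_score, classification_report

        rng = np.random.default_rng(3)
        classes = ["SN-like", "Fast", "Long", "Periodic", "AGN"]  # placeholder taxonomy
        y_true = rng.integers(0, len(classes), size=5000)
        # Synthetic predictions that agree with the truth ~90% of the time.
        y_pred = np.where(rng.random(5000) < 0.9, y_true,
                          rng.integers(0, len(classes), size=5000))

        print("overall accuracy:", accuracy_score(y_true, y_pred))
        print(classification_report(y_true, y_pred, target_names=classes))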