40 research outputs found

    Cosmological models with linearly varying deceleration parameter

    Full text link
    We propose a new law for the deceleration parameter that varies linearly with time and covers Berman's law, in which it is constant. Our law not only allows one to generalize many exact solutions that were obtained assuming a constant deceleration parameter, but also gives a better fit to data (from SNIa, BAO, and CMB), particularly concerning the late-time behavior of the universe. According to our law, only spatially closed and flat universes are allowed; in both cases the cosmological fluid we obtain exhibits quintom-like behavior and the universe ends with a big rip. This result is consistent with recent cosmological observations.
    Comment: 12 pages, 7 figures; some typo corrections; to appear in International Journal of Theoretical Physics
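
    As a minimal sketch of the law described above (using illustrative symbols q_0 and q_1, not necessarily the authors' notation), the linear law and its constant-q limit read:

    ```latex
    % Deceleration parameter of a FRW scale factor a(t), with the
    % linearly varying law; q_1 = 0 recovers Berman's constant-q law.
    \[
      q(t) \equiv -\frac{a\,\ddot{a}}{\dot{a}^{2}} = q_0 + q_1 t ,
      \qquad
      q_1 = 0 \;\Longrightarrow\; a(t) \propto t^{1/(1+q_0)} \quad (q_0 > -1).
    \]
    ```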

    QCD ghost f(T)-gravity model

    Full text link
    Within the framework of modified teleparallel gravity, we reconstruct an f(T) model corresponding to the QCD ghost dark energy scenario. For a spatially flat FRW universe containing only pressureless matter, we obtain the time evolution of the torsion scalar T (or the Hubble parameter). Then, we calculate the effective torsion equation-of-state parameter of the QCD ghost f(T)-gravity model as well as the deceleration parameter of the universe. Furthermore, we fit the model parameters using the latest observational data, including SNeIa, CMB, and BAO data. We also check the viability of our model using a cosmographic analysis approach. Moreover, we investigate the validity of the generalized second law (GSL) of gravitational thermodynamics for our model. Finally, we study the growth rate of matter density perturbations. We conclude that in the QCD ghost f(T)-gravity model, the universe begins in a matter-dominated phase and approaches a de Sitter regime at late times, as expected. This model is also consistent with current data, passes the cosmographic test, satisfies the GSL, and fits the growth-factor data as well as the LCDM model.
    Comment: 19 pages, 9 figures, 2 tables. arXiv admin note: substantial text overlap with arXiv:1111.726
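
    For orientation, a hedged sketch of the two standard ingredients behind such a reconstruction (sign conventions for T vary in the literature; alpha is a constant of order Lambda_QCD^3):

    ```latex
    % Flat-FRW torsion scalar and QCD ghost dark energy density:
    \[
      T = -6H^{2},
      \qquad
      \rho_{D} = \alpha H = \alpha \sqrt{-T/6} ,
    \]
    % so reproducing rho_D(T) is the condition the reconstructed
    % f(T) model must satisfy.
    ```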

    Superluminous supernova search with PineForest

    No full text
    The advent of large astronomical surveys has made available large and complex data sets. However, the process of discovery and interpretation of each potentially new astronomical source is often still handcrafted. In this context, machine learning algorithms have emerged as a powerful tool to mine large data sets and lower the burden on the domain expert. Active learning strategies are especially well suited to this task. In this report, we used the PineForest algorithm to search for superluminous supernova (SLSN) candidates in the Zwicky Transient Facility. We showcase how the use of previously confirmed sources can provide important information to boost the convergence of the active learning algorithm. Starting from a data set of ~14 million objects, and using 8 previously confirmed SLSN light curves as priors, we scrutinized 120 candidates and found 8 SLSN candidates, 2 of which had not been reported before (AT 2018moa and AT 2018mob). These results demonstrate how existing spectroscopic samples can be used to improve the efficiency of active learning strategies in searching for rare astronomical sources.
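
    A minimal sketch of an expert-in-the-loop search of this kind, with scikit-learn's IsolationForest standing in for PineForest (the actual algorithm, from the SNAD coniferest package, additionally uses expert labels to filter its trees). The ranking heuristic and the `expert_label` callback are illustrative assumptions, not the paper's method:

    ```python
    # Hypothetical active anomaly search seeded with known SLSNe.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    def active_search(features, priors_idx, expert_label, budget=120):
        """features: (n_objects, n_features) array; priors_idx: indices of
        known SLSNe used to seed the search; budget: expert labelings."""
        model = IsolationForest(n_estimators=300, random_state=0)
        model.fit(features)
        scores = -model.score_samples(features)   # higher = more anomalous
        # Rank candidates that are both anomalous and near the known SLSNe.
        prior_centre = features[priors_idx].mean(axis=0)
        dist = np.linalg.norm(features - prior_centre, axis=1)
        order = np.argsort(dist / (scores + 1e-9))
        confirmed = list(priors_idx)
        for idx in order:
            if idx in confirmed:
                continue
            if budget == 0:
                break
            budget -= 1
            if expert_label(idx):       # expensive human inspection
                confirmed.append(idx)
        return confirmed
    ```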

    Machine learning techniques for analysis of photometric data from the Open Supernova catalog

    No full text
    The next generation of astronomical surveys will revolutionize our understanding of the Universe, raising unprecedented data challenges in the process. One of them is the impossibility of relying on human scanning for the identification of unusual/unpredicted astrophysical objects. Moreover, given that most of the available data will be in the form of photometric observations, such characterization cannot rely on the existence of high-resolution spectroscopic observations. The goal of this project is to detect anomalies in the Open Supernova Catalog (http://sne.space/) with the use of machine learning. We will develop a pipeline where human expertise and modern machine learning techniques can complement each other. Using supernovae as a case study, our proposal is divided into two parts: a first phase in which anomalous objects are identified, and a second phase in which such anomalous objects are submitted to careful individual analysis. The strategy requires an initial data set for which spectroscopy is available for training purposes, but it can be applied to a much larger data set for which we only have photometric observations. This project represents an effective strategy to guarantee we shall not overlook exciting new science hidden in the data we fought so hard to acquire.
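
    A minimal sketch of the anomaly-identification phase described above. Light curves of unequal length are interpolated onto a fixed time grid so every object yields a fixed-length feature vector, and an isolation forest then ranks objects for expert follow-up; the grid size, normalization, and function names are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.ensemble import IsolationForest

    def to_features(times, fluxes, grid_size=64):
        """Resample one light curve (times sorted) onto a common grid,
        peak-normalized, so all objects share a feature space."""
        grid = np.linspace(times.min(), times.max(), grid_size)
        flux = np.interp(grid, times, fluxes)
        return flux / np.abs(flux).max()

    def rank_anomalies(light_curves, top_n=100):
        """light_curves: iterable of (times, fluxes) array pairs."""
        X = np.vstack([to_features(t, f) for t, f in light_curves])
        forest = IsolationForest(n_estimators=500, random_state=0).fit(X)
        scores = forest.score_samples(X)      # lower = more anomalous
        return np.argsort(scores)[:top_n]     # candidates for expert eyes
    ```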

    Enabling the discovery of fast transients: A kilonova science module for the Fink broker

    No full text
    We describe the fast transient classification algorithm at the center of the kilonova (KN) science module currently implemented in the Fink broker and report classification results based on simulated catalogs and real data from the ZTF alert stream. We used noiseless, homogeneously sampled simulations to construct a basis of principal components (PCs). All light curves from a more realistic ZTF simulation were written as a linear combination of this basis. The corresponding coefficients were used as features in training a random forest classifier. The same method was applied to long (>30 days) and medium (<30 days) light curves. The latter aimed to simulate the data situation found within the ZTF alert stream. Classification based on long light curves achieved 73.87% precision and 82.19% recall. Medium-baseline analysis resulted in 69.30% precision and 69.74% recall, thus confirming the robustness of the precision results when observations are limited to 30 days. In both cases, dwarf flares and point Type Ia supernovae were the most frequent contaminants. The final trained model was integrated into the Fink broker and has been distributing fast transients, tagged as KN_candidates, to the astronomical community, especially through the GRANDMA collaboration. We showed that features specifically designed to grasp different light curve behaviors provide enough information to separate fast (KN-like) from slow (non-KN-like) evolving events. This module represents one crucial link in an intricate chain of infrastructure elements for multi-messenger astronomy that is currently being put in place by the Fink broker team in preparation for the arrival of data from the Vera Rubin Observatory Legacy Survey of Space and Time.
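
    A sketch of the feature pipeline described above: a PCA basis is fit on noiseless, homogeneously sampled simulated light curves, every observed light curve is expressed as a linear combination of those components, and the coefficients feed a random forest. The number of components and all names are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier

    def train_kn_classifier(sim_curves, obs_curves, labels, n_components=3):
        """sim_curves, obs_curves: (n, n_epochs) flux arrays on a common
        time grid; labels: 1 for kilonova-like, 0 otherwise."""
        pca = PCA(n_components=n_components).fit(sim_curves)
        coeffs = pca.transform(obs_curves)    # features = PC coefficients
        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        clf.fit(coeffs, labels)
        return pca, clf

    def is_kn_candidate(pca, clf, curve):
        """Score one new light curve on the same time grid."""
        return clf.predict(pca.transform(curve[None, :]))[0] == 1
    ```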

    Spatial field reconstruction with INLA: application to IFU galaxy data

    No full text
    Astronomical observations of extended sources, such as integral field spectroscopy (IFS) cubes, encode autocorrelated spatial structures that cannot be optimally exploited by standard methodologies. This work introduces a novel technique to model IFS data sets, which treats the observed galaxy properties as realizations of an unobserved Gaussian Markov random field. The method is computationally efficient, resilient to the presence of low-signal-to-noise regions, and uses an alternative to Markov chain Monte Carlo for fast Bayesian inference: the Integrated Nested Laplace Approximation (INLA). As a case study, we analyse 721 IFS data cubes of nearby galaxies from the CALIFA and PISCO surveys, for which we retrieve maps of the following physical properties: age, metallicity, mass, and extinction. The proposed Bayesian approach, built on a generative representation of the galaxy properties, enables the creation of synthetic images, the recovery of areas with bad pixels, and an increased power to detect structures in data sets subject to substantial noise and/or sparse sampling. A code snippet to reproduce the analysis of this paper is available in the COIN toolbox, together with the field reconstructions of the CALIFA and PISCO samples.
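
    The paper's inference is done with INLA (an R package); as a language-agnostic illustration of the underlying model, this sketch computes the exact posterior mean of a latent Gaussian Markov random field on a regular grid given noisy, partially observed pixels. Hyperparameters (tau, sigma) are held fixed here, whereas INLA would integrate over them; all names are illustrative:

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spsolve

    def laplacian_1d(k):
        """1-D graph Laplacian with Neumann (free) boundaries."""
        main = 2.0 * np.ones(k)
        main[[0, -1]] = 1.0
        return sp.diags([main, -np.ones(k - 1), -np.ones(k - 1)], [0, 1, -1])

    def gmrf_posterior_mean(ny, nx, obs_idx, obs_val, tau=1.0, sigma=0.1):
        """Reconstruct an (ny, nx) field from noisy values obs_val at
        row-major flat indices obs_idx."""
        n = ny * nx
        # Prior precision: scaled 2-D grid Laplacian + jitter (first-order GMRF).
        lap = sp.kronsum(laplacian_1d(nx), laplacian_1d(ny), format="csc")
        Q = tau * lap + 1e-4 * sp.eye(n, format="csc")
        # Observation operator A picks out the observed pixels.
        m = len(obs_idx)
        A = sp.csc_matrix((np.ones(m), (np.arange(m), obs_idx)), shape=(m, n))
        Q_post = Q + (A.T @ A) / sigma**2          # posterior precision
        b = (A.T @ obs_val) / sigma**2
        return spsolve(Q_post, b).reshape(ny, nx)  # posterior mean field
    ```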