
    Using gamma regression for photometric redshifts of survey galaxies

    Machine learning techniques offer a plethora of opportunities for tackling big data within the astronomical community. We present generalized linear models (GLMs) as a fast alternative for determining photometric redshifts of galaxies, a set of tools not commonly applied within astronomy despite being widely used in other fields. With this technique we achieve catastrophic outlier rates of order ~1%, computed in a matter of seconds on datasets of ~1,000,000 galaxies. To make these techniques easily accessible to the astronomical community, we have developed a set of libraries and tools that are publicly available. Comment: Refereed proceeding of "The Universe of Digital Sky Surveys" conference held at the INAF - Observatory of Capodimonte, Naples, on 25th-28th November 2014, to be published in the Astrophysics and Space Science Proceedings, edited by Longo, Napolitano, Marconi, Paolillo, Iodice; 6 pages, 1 figure
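
    The core of the approach is an ordinary GLM fit with a Gamma family, which keeps the predicted redshift strictly positive. The sketch below shows what such a fit could look like with statsmodels; the column names, the log link, and the outlier threshold are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch: gamma-family GLM for photometric redshifts.
# Column names (u_g, g_r, r_i, i_z, z_spec) and the log link are
# illustrative assumptions, not the exact recipe of the paper.
import numpy as np
import pandas as pd
import statsmodels.api as sm

train = pd.read_csv("train.csv")   # hypothetical training catalogue
test = pd.read_csv("test.csv")     # hypothetical test catalogue

features = ["u_g", "g_r", "r_i", "i_z"]          # broad-band colours
X_train = sm.add_constant(train[features])
X_test = sm.add_constant(test[features])

# The Gamma family keeps the response strictly positive, which suits redshift.
model = sm.GLM(train["z_spec"], X_train,
               family=sm.families.Gamma(link=sm.families.links.Log()))
fit = model.fit()

z_phot = fit.predict(X_test)

# Catastrophic outliers: |z_phot - z_spec| / (1 + z_spec) > 0.15 (a common cut)
frac = np.abs(z_phot - test["z_spec"]) / (1 + test["z_spec"])
print("catastrophic outlier rate: %.2f%%" % (100 * np.mean(frac > 0.15)))
```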

    The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Recent and forthcoming advances in instrumentation, together with giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have been addressed by other disciplines such as applied mathematics, statistics and machine learning, and their solutions have been utilized by other sciences such as space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found only in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments, some "old" and some new. These are generally unknown to most of the astronomical community, but are vital to the analysis and visualization of complex datasets and images. In order for astronomers to take advantage of the richness and complexity of the new era of data, and to be able to identify, adopt, and apply new solutions, the astronomical community needs a certain degree of awareness and understanding of the new concepts. One of the goals of this paper is to help bridge the gap between applied mathematics, artificial intelligence and computer science on the one side and astronomy on the other. Comment: 24 pages, 8 figures, 1 table. Accepted for publication in "Advances in Astronomy", special issue "Robotic Astronomy"

    Predicting Stellar Angular Sizes

    Reliable prediction of stellar diameters, particularly angular diameters, is a useful and necessary tool for the increasing number of milliarcsecond-resolution studies being carried out in the astronomical community. A new and accurate technique for predicting angular sizes is presented for main sequence stars, giant and supergiant stars, and more evolved sources such as carbon stars and Mira variables. This technique uses observed K and either V or B broad-band photometry to predict V=0 or B=0 zero-magnitude angular sizes, which are then readily scaled to apparent angular sizes using the V or B photometry. The spread in the relationship is 2.2% for main sequence stars; 11-12% for giant and supergiant stars; and 20-26% for evolved sources. Compared to other simple predictions of angular size, such as linear radius-distance methods or black-body estimates, zero-magnitude angular size predictions can provide apparent angular sizes with errors that are 2 to 5 times smaller. Comment: 28 pages, 4 figures, accepted by PAS
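
    The scaling step described above amounts to predicting a zero-magnitude angular size from the V-K colour and then rescaling it with the apparent V magnitude, theta = theta_{V=0} * 10^(-0.2 V). The sketch below illustrates that two-step procedure; the linear colour relation and its coefficients are placeholders, not the calibration derived in the paper.

```python
# Sketch of the zero-magnitude angular-size approach described above.
# The colour-dependent fit coefficients (a, b) below are placeholders;
# the actual calibration must be taken from the paper.

def theta_v_zero(v_mag, k_mag, a=0.5, b=0.26):
    """Predict the V=0 zero-magnitude angular size (mas) from V-K colour.
    log10(theta_{V=0}) = a + b * (V - K)  -- illustrative linear form."""
    return 10.0 ** (a + b * (v_mag - k_mag))

def apparent_size(v_mag, k_mag):
    """Scale the zero-magnitude size to the apparent angular size:
    theta = theta_{V=0} * 10**(-0.2 * V)."""
    return theta_v_zero(v_mag, k_mag) * 10.0 ** (-0.2 * v_mag)

# Example: a hypothetical giant with V = 5.0 and K = 2.0
print("predicted angular diameter: %.2f mas" % apparent_size(5.0, 2.0))
```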

    The Overlooked Potential of Generalized Linear Models in Astronomy-III: Bayesian Negative Binomial Regression and Globular Cluster Populations

    In this paper, the third in a series illustrating the power of generalized linear models (GLMs) for the astronomical community, we elucidate the potential of the class of GLMs that handles count data. The size of a galaxy's globular cluster population, $N_{\rm GC}$, is a long-standing puzzle in the astronomical literature. It falls in the category of count data analysis, yet it is usually modelled as if it were a continuous response variable. We have developed a Bayesian negative binomial regression model to study the connection between $N_{\rm GC}$ and the following galaxy properties: central black hole mass, dynamical bulge mass, bulge velocity dispersion, and absolute visual magnitude. The methodology introduced herein naturally accounts for heteroscedasticity, intrinsic scatter, and measurement errors in both axes (whether discrete or continuous), and allows the globular cluster population to be modelled on its natural scale as a non-negative integer variable. 99% prediction intervals around the trend for the expected $N_{\rm GC}$ comfortably envelop the data, notably including the Milky Way, which has hitherto been considered a problematic outlier. Finally, we demonstrate how random intercept models can incorporate information about each galaxy's morphological type. Bayesian variable selection allows galaxy types with different GC production to be identified automatically, suggesting that on average S0 galaxies have a GC population 35% smaller than other types of similar brightness. Comment: 14 pages, 12 figures. Accepted for publication in MNRAS
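
    For readers unfamiliar with count-data GLMs, the sketch below shows what a Bayesian negative binomial regression of $N_{\rm GC}$ on a single galaxy property looks like in PyMC. The predictor, the priors and the toy data are illustrative assumptions; the paper's full model additionally treats measurement errors, intrinsic scatter and random intercepts per morphological type.

```python
# A minimal sketch of Bayesian negative binomial regression with PyMC.
# The single predictor (absolute visual magnitude M_V), the priors and the
# toy data are illustrative assumptions, not the paper's full model.
import numpy as np
import pymc as pm
import arviz as az

# Hypothetical data: absolute V magnitude and globular cluster counts
M_V = np.array([-19.5, -20.1, -21.3, -22.0, -20.7])
N_GC = np.array([150, 300, 1200, 4000, 600])

with pm.Model() as model:
    beta0 = pm.Normal("beta0", mu=0.0, sigma=10.0)
    beta1 = pm.Normal("beta1", mu=0.0, sigma=10.0)
    alpha = pm.Exponential("alpha", 1.0)           # over-dispersion parameter

    mu = pm.math.exp(beta0 + beta1 * M_V)          # log link keeps mu > 0
    pm.NegativeBinomial("N_GC_obs", mu=mu, alpha=alpha, observed=N_GC)

    trace = pm.sample(1000, tune=1000, chains=2)

print(az.summary(trace)[["mean", "hdi_3%", "hdi_97%"]])
```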

    Dispelling the myth of robotic efficiency: why human space exploration will tell us more about the Solar System than will robotic exploration alone

    There is a widely held view in the astronomical community that unmanned robotic space vehicles are, and will always be, more efficient explorers of planetary surfaces than astronauts (e.g. Coates, 2001; Clements 2009; Rees 2011). Partly this stems from a common assumption that robotic exploration is cheaper than human exploration (although, as we shall see, this isn't necessarily true if like is compared with like), and partly from the expectation that continued developments in technology will relentlessly increase the capability, and reduce the size and cost, of robotic missions to the point that human exploration will not be able to compete. I will argue below that the experience of human exploration during the Apollo missions, more recent field analogue studies, and trends in robotic space exploration actually all point to exactly the opposite conclusion. Comment: 12 pages; 5 figures. Published, with minor modifications, in Astronomy and Geophysics, Vol. 53, pp. 2.22-2.26, 201

    Astroinformatics : Image processing and analysis of digitized astronomical data with web-based implementation

    The young field of Astroinformatics has arisen as an interdisciplinary domain combining Astronomy with Information and Communication Technologies (ICT). Recently, four institutes of the Bulgarian Academy of Sciences launched a joint project called “Astroinformatics”. The main goal of the project is the development of methods and techniques for the preservation and exploitation of the scientific, cultural and historic heritage of astronomical observations. The Wide-Field Plate Data Base is an ICT project of the Institute of Astronomy that was launched in 1991 with the support of the International Astronomical Union [1]. Over two million photographic plates, obtained with professional telescopes at 125 observatories worldwide, have now been identified and collected in this database. Digitization of the photographic plates is a pressing task for the astronomical community. So far 150,000 plates have been digitized through several European research programs, producing a large amount of data (about 2 TB in size). The access, manipulation and data mining of these data pose a serious challenge for the ICT community. In this article we discuss the goals, tasks and achievements in the area of Astroinformatics.

    Unleashing the Power of Distributed CPU/GPU Architectures: Massive Astronomical Data Analysis and Visualization case study

    Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing the data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single-machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with the goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a "software as a service" manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure. Comment: 4 pages, 1 figure. To appear in the proceedings of ADASS XXI, ed. P. Ballester and D. Egret, ASP Conf. Series
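
    The batched analysis tasks mentioned above (histograms, global minima/maxima) decompose naturally into per-chunk partial results that are merged afterwards. The sketch below illustrates that map-reduce pattern with local processes only; the actual framework distributes the same idea across GPU and many-core CPU nodes, so this code is a conceptual stand-in rather than the framework's API.

```python
# Illustrative map-reduce pattern behind the batched analysis tasks mentioned
# above: each worker computes a partial histogram and min/max on one chunk,
# and the partial results are merged. The real framework distributes this
# across GPU/CPU cluster nodes; this sketch only uses local processes.
import numpy as np
from multiprocessing import Pool

BINS = np.linspace(0.0, 1.0, 101)   # assumed fixed, shared bin edges

def partial_stats(chunk):
    """Per-chunk work: histogram counts plus local min/max."""
    counts, _ = np.histogram(chunk, bins=BINS)
    return counts, chunk.min(), chunk.max()

if __name__ == "__main__":
    # Stand-in for a terabyte-scale dataset read in chunks
    chunks = [np.random.rand(1_000_000) for _ in range(8)]

    with Pool(processes=4) as pool:
        results = pool.map(partial_stats, chunks)

    total_counts = sum(counts for counts, _, _ in results)
    data_min = min(lo for _, lo, _ in results)
    data_max = max(hi for _, _, hi in results)
    print("global min/max:", data_min, data_max)
    print("counts in first 5 bins:", total_counts[:5])
```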

    Focal reducer/wide-field corrector for the C. E. Kenneth Mees telescope

    The introduction of CCD astronomical imaging has had a tremendous impact on the astrophysical community. The greatest advantage of CCD astronomical imaging is the phenomenal increase in efficiency compared to photographic plate imaging: images which previously required exposure times of several hours can be captured in a matter of seconds with a CCD imaging system. The disadvantage of using a CCD for astronomical imaging is the loss of information capture due to the smaller detector area. Originally equipped with a Boller and Chivens 3 x 5 inch photographic plate, the Mees telescope provided a 0.4 degree full-field angle. The purpose of this research was to design a focal reducer to lower the focal ratio of the telescope system from f/13.5 to f/4.5, enabling a larger field of view to be imaged onto the smaller CCD detector area with its 9-um pixel size. Optics Software for Layout and Optimization (OSLO), a lens design package, was used to design the system. The original goal was to design the system using only readily available lens components from the OSLO database, eliminating the need for expensive special-order optics. Several focal reducer/wide-field corrector designs were configured using OSLO, but none of the designs restricted to readily available lens database components met the design criteria. It was therefore decided to abandon the restriction to lens database components in favor of a customized focal reducer/wide-field corrector design. Preliminary research suggests two possible configurations for the customized focal reducer/wide-field corrector. The first design would consist of a 6-inch diameter, f/2 collimating lens located at the telescope mounting flange, followed by an f/4.5 fully corrected camera system. The second design would consist of a symmetrical Biotar focal reducer. The Biotar design would require extensive optimization using OSLO, but preliminary research suggests it might provide a more fully corrected system.
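
    A quick back-of-the-envelope calculation shows why the focal reduction matters: the plate scale in arcsec/mm is 206265 divided by the focal length in mm, so dropping from f/13.5 to f/4.5 triples the angular field that fits on a fixed-size detector. The aperture and detector format in the sketch below are assumed values for illustration only; the abstract states only the focal ratios and the 9-um pixel size.

```python
# Back-of-the-envelope check of what the f/13.5 -> f/4.5 focal reduction buys.
# The 0.61 m aperture and the 1024 x 1024, 9-um CCD format are assumed values
# for illustration only; the abstract does not state them.
APERTURE_M = 0.61          # assumed primary aperture
PIXEL_UM = 9.0             # pixel size quoted in the abstract
NPIX = 1024                # assumed detector format (pixels per side)

def plate_scale_arcsec_per_mm(f_ratio, aperture_m=APERTURE_M):
    """Plate scale = 206265 / focal length (mm)."""
    focal_length_mm = f_ratio * aperture_m * 1000.0
    return 206265.0 / focal_length_mm

def fov_arcmin(f_ratio):
    """Field of view across the assumed detector, in arcminutes."""
    detector_mm = NPIX * PIXEL_UM / 1000.0
    return plate_scale_arcsec_per_mm(f_ratio) * detector_mm / 60.0

for f in (13.5, 4.5):
    print("f/%.1f: %.2f arcsec/mm, %.1f arcmin field" %
          (f, plate_scale_arcsec_per_mm(f), fov_arcmin(f)))
```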