
    Towards More Precise Survey Photometry for PanSTARRS and LSST: Measuring Directly the Optical Transmission Spectrum of the Atmosphere

    Motivated by the recognition that variation in the optical transmission of the atmosphere is probably the main limitation to the precision of ground-based CCD measurements of celestial fluxes, we review the physical processes that attenuate the passage of light through the Earth's atmosphere. The next generation of astronomical surveys, such as PanSTARRS and LSST, will greatly benefit from dedicated apparatus to obtain atmospheric transmission data that can be associated with each survey image. We review and compare various approaches to this measurement problem, including photometry, spectroscopy, and LIDAR. In conjunction with careful measurements of instrumental throughput, atmospheric transmission measurements should allow next-generation imaging surveys to produce photometry of unprecedented precision. Our primary concerns are the real-time determination of aerosol scattering and absorption by water along the line of sight, both of which can vary over the course of a night's observations. Comment: 41 pages, 14 figures. Accepted PAS
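The attenuation the abstract describes follows the Beer-Lambert law: transmission falls exponentially with the product of optical depth and airmass. The sketch below is a minimal illustration, not the paper's method: the Rayleigh term uses a common sea-level approximation, the aerosol term assumes an Angstrom power law with hypothetical default parameters, and the water-absorption bands the authors worry about are omitted entirely.

```python
import math

def transmission(wavelength_nm, airmass, aod_500=0.05, angstrom_alpha=1.3):
    """Fractional atmospheric transmission via the Beer-Lambert law.

    Illustrative only: Rayleigh optical depth from an approximate
    sea-level fit, aerosols from an Angstrom power law anchored at
    500 nm; molecular (e.g. water) absorption bands are ignored.
    """
    wl_um = wavelength_nm / 1000.0
    tau_rayleigh = 0.0088 * wl_um ** -4.05                      # ~0.14 at 500 nm
    tau_aerosol = aod_500 * (wavelength_nm / 500.0) ** -angstrom_alpha
    return math.exp(-(tau_rayleigh + tau_aerosol) * airmass)
```

Because the aerosol optical depth (`aod_500`) and Angstrom exponent vary from night to night, they are exactly the quantities a dedicated real-time monitor would have to supply per image.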

    Benchmark of machine learning methods for classification of a Sentinel-2 image

    Thanks mainly to ESA and USGS, a large volume of free images of the Earth is readily available nowadays. One of the main goals of remote sensing is to label images according to a set of semantic categories, i.e. image classification. This is a very challenging issue, since land cover of a specific class may present large spatial and spectral variability, and objects may appear at different scales and orientations. In this study, we report the results of benchmarking 9 machine learning algorithms tested for accuracy and speed in training and classification of land-cover classes in a Sentinel-2 dataset. The following machine learning methods (MLM) have been tested: linear discriminant analysis, k-nearest neighbour, random forests, support vector machines, multi layered perceptron, multi layered perceptron ensemble, ctree, boosting, logarithmic regression. The validation is carried out using a control dataset which consists of an independent classification in 11 land-cover classes of an area of about 60 km², obtained by manual visual interpretation of high-resolution images (20 cm ground sampling distance) by experts. In this study five out of the eleven classes are used, since the others have too few samples (pixels) for the testing and validation subsets. The classes used are the following: (i) urban (ii) sowable areas (iii) water (iv) tree plantations (v) grasslands. Validation is carried out using three different approaches: (i) using pixels from the training dataset (train), (ii) using pixels from the training dataset and applying cross-validation with the k-fold method (kfold) and (iii) using all pixels from the control dataset. Five accuracy indices are calculated for the comparison between the values predicted with each model and control values over three sets of data: the training dataset (train), the whole control dataset (full) and with k-fold cross-validation (kfold) with ten folds.
Results from validation of predictions of the whole dataset (full) show that the random forests method attains the highest values, with the kappa index ranging from 0.55 to 0.42 for the largest and smallest numbers of training pixels, respectively. The two neural networks (multi layered perceptron and its ensemble) and the support vector machines - with default radial basis function kernel - follow closely with comparable performance.
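The kappa index quoted above is Cohen's kappa: classification agreement corrected for the agreement expected by chance given the class frequencies, which matters here because land-cover classes are strongly imbalanced. A minimal pure-Python computation (equivalent in spirit to, e.g., scikit-learn's `cohen_kappa_score`) looks like this:

```python
from collections import Counter

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement between reference and predicted
    labels, corrected for the agreement expected by chance."""
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts = Counter(y_true)                  # class frequencies in reference
    pred_counts = Counter(y_pred)                  # class frequencies in prediction
    expected = sum(true_counts[c] * pred_counts.get(c, 0)
                   for c in true_counts) / n ** 2  # chance agreement
    return (observed - expected) / (1.0 - expected)
```

Kappa is 1 for perfect agreement and 0 when a classifier does no better than chance, so the reported 0.42-0.55 indicates moderate skill on the five-class problem.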

    Detection Techniques of Microsecond Gamma-Ray Bursts using Ground-Based Telescopes

    Gamma-ray observations above 200 MeV are conventionally made by satellite-based detectors. The EGRET detector on the Compton Gamma Ray Observatory (CGRO) has provided good sensitivity for the detection of bursts lasting for more than 200 ms. Theoretical predictions of high-energy gamma-ray bursts produced by quantum-mechanical decay of primordial black holes (Hawking 1971) suggest the emission of bursts on shorter time scales. The final stage of a primordial black hole results in a burst of gamma-rays, peaking around 250 MeV and lasting for a tenth of a microsecond or longer, depending on particle physics. In this work we show that there is an observational window using ground-based imaging Cherenkov detectors to measure gamma-ray burst emission at energies E greater than 200 MeV. This technique, with a sensitivity for bursts lasting nanoseconds to several microseconds, is based on the detection of multi-photon-initiated air showers. Comment: accepted for publication in the Astrophysical Journal
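The sub-microsecond time scale comes from how little mass a primordial black hole has left at the end of its life. As a rough order-of-magnitude sketch (deliberately ignoring the particle-physics model dependence the abstract notes, which also sets the ~250 MeV spectral peak), the standard Hawking estimate for the remaining evaporation time, t = 5120 π G² M³ / (ħ c⁴), can be inverted for the mass that survives for a given burst duration:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.0546e-34    # reduced Planck constant, J s
C = 2.998e8          # speed of light, m s^-1

def remaining_lifetime(mass_kg):
    """Remaining evaporation time of a black hole of the given mass,
    from the leading-order Hawking estimate t = 5120*pi*G^2*M^3/(hbar*c^4)."""
    return 5120 * math.pi * G ** 2 * mass_kg ** 3 / (HBAR * C ** 4)

def mass_for_burst(duration_s):
    """Black-hole mass whose remaining lifetime equals the burst duration."""
    return (duration_s * HBAR * C ** 4 / (5120 * math.pi * G ** 2)) ** (1 / 3)
```

For a 0.1 µs burst this gives a remaining mass of order 10³ kg, i.e. the entire final flash is powered by roughly a tonne of matter converted to radiation.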

    Breakup of diminutive Rayleigh jets

    Discharging a liquid from a nozzle at sufficiently large velocity leads to a continuous jet that, due to capillary forces, breaks up into droplets. Here we investigate the formation of microdroplets from the breakup of micron-sized jets with ultra-high-speed imaging. The diminutive size of the jet implies a fast breakup time scale $\tau_\mathrm{c} = \sqrt{\rho r^3 / \gamma}$ of the order of 100 ns, and requires imaging at 14 million frames per second. We directly compare these experiments with a numerical lubrication approximation model that incorporates inertia, surface tension, and viscosity [Eggers and Dupont, J. Fluid Mech. 262, 205 (1994); Shi, Brenner, and Nagel, Science 265, 219 (1994)]. The lubrication model allows us to efficiently explore the parameter space to investigate the effect of jet velocity and liquid viscosity on the formation of satellite droplets. In the phase diagram we identify regions where the formation of satellite droplets is suppressed. We compare the shape of the droplet at pinch-off between the lubrication approximation model and a boundary integral (BI) calculation, showing deviations at the final moment of the pinch-off. In spite of this discrepancy, the results on pinch-off times and droplet and satellite droplet velocities obtained from the lubrication approximation agree with the high-speed imaging results.
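The quoted ~100 ns figure follows directly from the capillary time scale formula. A one-line check, with default density and surface tension assumed to approximate water at room temperature (the abstract does not specify the liquid):

```python
import math

def capillary_time(radius_m, density=998.0, surface_tension=0.072):
    """Capillary (Rayleigh) breakup time scale tau_c = sqrt(rho * r^3 / gamma).

    Defaults assume a water-like liquid (rho in kg/m^3, gamma in N/m);
    for a 1 micron jet radius this yields ~1.2e-7 s, i.e. order 100 ns.
    """
    return math.sqrt(density * radius_m ** 3 / surface_tension)
```

The r^{3/2} scaling also explains the imaging requirement: shrinking the jet radius by a factor of 100 shortens the breakup time a thousandfold, pushing it below the frame interval of conventional high-speed cameras.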

    Dark Matter Structures in the Universe: Prospects for Optical Astronomy in the Next Decade

    The Cold Dark Matter theory of gravitationally-driven hierarchical structure formation has earned its status as a paradigm by explaining the distribution of matter over large spans of cosmic distance and time. However, its central tenet, that most of the matter in the universe is dark and exotic, is still unproven; the dark matter hypothesis is sufficiently audacious as to continue to warrant a diverse battery of tests. While local searches for dark matter particles or their annihilation signals could prove the existence of the substance itself, studies of cosmological dark matter in situ are vital to fully understand its role in structure formation and evolution. We argue that gravitational lensing provides the cleanest and farthest-reaching probe of dark matter in the universe, which can be combined with other observational techniques to answer the most challenging and exciting questions that will drive the subject in the next decade: What is the distribution of mass on sub-galactic scales? How do galaxy disks form and bulges grow in dark matter halos? How accurate are CDM predictions of halo structure? Can we distinguish between a need for a new substance (dark matter) and a need for new physics (departures from General Relativity)? What is the dark matter made of anyway? We propose that the central tool in this program should be a wide-field optical imaging survey, whose true value is realized with support in the form of high-resolution, cadenced optical/infra-red imaging, and massive-throughput optical spectroscopy. Comment: White paper submitted to the 2010 Astronomy & Astrophysics Decadal Survey

    Max '91: Flare research at the next solar maximum

    To address the central scientific questions surrounding solar flares, coordinated observations of electromagnetic radiation and energetic particles must be made from spacecraft, balloons, rockets, and ground-based observatories. A program to enhance capabilities in these areas in preparation for the next solar maximum in 1991 is recommended. The major scientific issues are described, and the required observations and coordination of observations and analyses are detailed. A program plan and conceptual budgets are provided.