
    The CFEPS Kuiper Belt Survey: Strategy and Pre-survey Results

    We present the data acquisition strategy and characterization procedures for the Canada-France Ecliptic Plane Survey (CFEPS), a sub-component of the Canada-France-Hawaii Telescope Legacy Survey. The survey began in early 2003 and as of summer 2005 has covered 430 square degrees of sky within a few degrees of the ecliptic. Moving objects beyond the orbit of Uranus are detected to a magnitude limit of m_R = 23 -- 24 (depending on the image quality). To track as large a sample as possible and avoid introducing followup bias, we have developed a multi-epoch observing strategy that is spread over several years. We present the evolution of the uncertainties in ephemeris position and orbital elements as the objects progress through the epochs. We then present a small 10-object sample that was tracked in this manner as part of a preliminary survey starting a year before the main CFEPS project. We describe the CFEPS survey simulator, to be released in 2006, which allows theoretical models of the Kuiper Belt to be compared with the survey discoveries since CFEPS has a well-documented pointing history with characterized detection efficiencies as a function of magnitude and rate of motion on the sky. Using the pre-survey objects we illustrate the usage of the simulator in modeling the classical Kuiper Belt. Comment: to be submitted to Icarus
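The core of a survey simulator like the one described is to pass a model population through the survey's characterized detection efficiency. A minimal sketch, assuming a smooth efficiency curve that rolls off near the survey's magnitude limit (the functional form and parameter values here are illustrative, not CFEPS's published calibration):

```python
import random

def detection_efficiency(m_R, m_limit=23.5, width=0.4, eta_max=0.9):
    """Hypothetical smooth efficiency curve: near eta_max for bright objects,
    dropping to zero around the survey limiting magnitude."""
    return eta_max / (1.0 + 10.0 ** ((m_R - m_limit) / width))

def simulate_detections(model_mags, seed=42):
    """Draw which objects of a model population the survey would detect."""
    rng = random.Random(seed)
    return [m for m in model_mags if rng.random() < detection_efficiency(m)]

eta_bright = detection_efficiency(20.0)   # essentially eta_max
eta_faint = detection_efficiency(25.0)    # essentially zero
detected = simulate_detections([20.0] * 50 + [26.0] * 50)
```

Comparing the simulated detections with the real discovery list (magnitudes, rates of motion) is what lets a model of the Kuiper Belt be accepted or rejected.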

    Semi-Strong Form Market Hypothesis: Evidence from CNBC's Jim Cramer's Mad Money Stock Recommendations

    Mad Money has become one of the most popular shows on CNBC. The host, Jim Cramer, has an outlandish style and personality that viewers find intoxicating. Cramer's goal for the show is to make people money. Does he succeed? This paper finds that investors can expect to gain above-average, risk-adjusted returns by following Cramer's stock recommendations and trading accordingly. These findings challenge the semi-strong form market hypothesis, which holds that investors should not be able to profit from trading on public information, since the market has already adjusted prices for that information. The paper also contributes to the current literature by analyzing the different segments of the Mad Money program and serving as a jumping-off point for future research on a possible Jim-Cramer-Mad-Money hedge fund strategy.
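The "risk-adjusted returns" test is typically an event study: each recommendation's realised return is compared with what a market model predicts, and the differences are accumulated over the event window. A minimal sketch (the beta and the return numbers below are made up for illustration):

```python
def abnormal_return(stock_ret, market_ret, alpha=0.0, beta=1.0):
    """Market-model abnormal return: realised return minus the return
    predicted by the market model (alpha + beta * market return)."""
    return stock_ret - (alpha + beta * market_ret)

def cumulative_abnormal_return(stock_rets, market_rets, beta=1.0):
    """CAR over an event window, e.g. the days after a recommendation airs."""
    return sum(abnormal_return(s, m, beta=beta)
               for s, m in zip(stock_rets, market_rets))

# Three days of hypothetical post-recommendation returns vs. the market:
car = cumulative_abnormal_return([0.031, -0.004, 0.012],
                                 [0.010, 0.002, 0.005], beta=1.1)
```

Under the semi-strong form hypothesis the average CAR across many recommendations should be indistinguishable from zero; a persistently positive average is the challenge the paper reports.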

    Locating and quantifying gas emission sources using remotely obtained concentration data

    We describe a method for detecting, locating and quantifying sources of gas emissions to the atmosphere using remotely obtained gas concentration data; the method is applicable to gases of environmental concern. We demonstrate its performance using methane data collected from aircraft. Atmospheric point concentration measurements are modelled as the sum of a spatially and temporally smooth atmospheric background concentration, augmented by concentrations due to local sources. We model source emission rates with a Gaussian mixture model and use a Markov random field to represent the atmospheric background concentration component of the measurements. A Gaussian plume atmospheric eddy dispersion model represents gas dispersion between sources and measurement locations. Initial point estimates of background concentrations and source emission rates are obtained using mixed L2-L1 optimisation over a discretised grid of potential source locations. Subsequent reversible jump Markov chain Monte Carlo inference provides estimated values and uncertainties for the number, emission rates and locations of sources unconstrained by a grid. Source area, atmospheric background concentrations and other model parameters are also estimated. We investigate the performance of the approach first using a synthetic problem, then apply the method to real data collected from an aircraft flying first over a 1600 km^2 area containing two landfills, then over a 225 km^2 area containing a gas flare stack.
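The forward model linking an emission rate to a downwind concentration measurement is the standard ground-reflected Gaussian plume. A self-contained sketch follows; the linear growth of the dispersion widths with downwind distance is an illustrative stability parameterisation, not necessarily the one used in the paper:

```python
import math

def gaussian_plume(Q, u, x, y, z, H, a=0.08, b=0.06):
    """Concentration (kg/m^3) at (x, y, z) downwind of a point source of
    emission rate Q (kg/s) at stack height H (m), wind speed u (m/s).
    x is downwind, y crosswind, z height above ground."""
    if x <= 0:
        return 0.0                      # plume only exists downwind
    sy, sz = a * x, b * x               # dispersion widths (illustrative)
    crosswind = math.exp(-y ** 2 / (2 * sy ** 2))
    # Image source below ground enforces zero flux through the surface:
    vertical = (math.exp(-(z - H) ** 2 / (2 * sz ** 2))
                + math.exp(-(z + H) ** 2 / (2 * sz ** 2)))
    return Q / (2 * math.pi * u * sy * sz) * crosswind * vertical

c_axis = gaussian_plume(Q=1.0, u=5.0, x=1000.0, y=0.0, z=0.0, H=50.0)
c_off = gaussian_plume(Q=1.0, u=5.0, x=1000.0, y=500.0, z=0.0, H=50.0)
```

Because the concentration is linear in Q, stacking one such term per candidate source over a grid yields the linear system to which the mixed L2-L1 optimisation is applied.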

    Evolution of the Milky Way in Semi-Analytic Models: Detecting Cold Gas at z=3 with ALMA and SKA

    We forecast the abilities of the Atacama Large Millimeter/submillimeter Array (ALMA) and the Square Kilometer Array (SKA) to detect CO and HI emission lines in galaxies at redshift z=3. A particular focus is set on Milky Way (MW) progenitors at z=3, since their detection within 24 h constitutes a key science goal of ALMA. The analysis relies on a semi-analytic model, which permits the construction of a MW progenitor sample by backtracking the cosmic history of all simulated present-day galaxies similar to the real MW. Results: (i) ALMA can best observe a MW at z=3 by looking at CO(3-2) emission. The probability of detecting a random model MW at 3-sigma in 24 h using 75 km/s channels is roughly 50%, and these odds can be increased by co-adding the CO(3-2) and CO(4-3) lines. These lines fall into ALMA band 3, which therefore represents the optimal choice towards MW detections at z=3. (ii) Higher CO transitions contained in ALMA bands 6 and above will be invisible, unless the considered MW progenitor coincidentally hosts a major starburst or an active black hole. (iii) The high-frequency array of SKA, fitted with 28.8 GHz receivers, would be a powerful instrument for observing CO(1-0) at z=3, able to detect nearly all simulated MWs in 24 h. (iv) HI detections in MWs at z=3 using the low-frequency array of SKA will be impossible in any reasonable observing time. (v) SKA will nonetheless be a supreme HI survey instrument through its enormous instantaneous field-of-view (FoV). A one year pointed HI survey with an assumed FoV of 410 square degrees would reveal at least 10^5 galaxies at z=2.95-3.05. (vi) If the positions and redshifts of those galaxies are known from an optical/infrared spectroscopic survey, stacking allows the detection of HI at z=3 in less than 24 h. Comment: 14 pages, 5 figures, 5 tables
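The gain from co-adding the CO(3-2) and CO(4-3) lines follows from optimally weighted combination of independent measurements: individual signal-to-noise ratios add in quadrature. A sketch (the SNR values are illustrative, not the paper's numbers):

```python
import math

def coadd_snr(snrs):
    """SNR of an inverse-variance-weighted co-add of independent line
    detections: individual SNRs add in quadrature (assumes uncorrelated
    noise between the lines)."""
    return math.sqrt(sum(s * s for s in snrs))

# Two individually sub-threshold lines can jointly cross 3 sigma:
combined = coadd_snr([2.4, 2.0])
```

This is why a model MW that falls just short of 3-sigma in CO(3-2) alone can still be counted as detected once CO(4-3) is folded in.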

    Surveys of Galaxy Clusters with the Sunyaev Zel'dovich Effect

    We have created mock Sunyaev-Zel'dovich effect (SZE) surveys of galaxy clusters using high resolution N-body simulations. To the pure surveys we add `noise' contributions appropriate to the instruments and to primary CMB anisotropies. Applying various cluster finding strategies to these mock surveys we generate catalogues which can be compared to the known positions and masses of the clusters in the simulations. We thus show that the completeness and efficiency that can be achieved depend strongly on the frequency coverage, noise and beam characteristics of the instruments, as well as on the candidate threshold. We study the effects of matched filtering techniques on completeness and bias. We suggest a gentler filtering method than matched filtering in single frequency analyses. We summarize the complications that arise when analyzing the SZE signal at a single frequency, and assess the limitations of such an analysis. Our results suggest that some sophistication is required when searching for `clusters' within an SZE map. Comment: 8 pages, 7 figures
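A matched filter weights each Fourier mode of the map by the expected cluster profile over the noise power spectrum, maximizing the SNR of a source with that profile. A minimal 1-D sketch, normalized so the filtered map reads off the source amplitude (illustrative, not the paper's pipeline; a real SZE analysis works in 2-D with a measured noise spectrum):

```python
import numpy as np

def matched_filter_1d(data, profile, noise_power):
    """Filter `data` for a source of known shape `profile`, given the
    noise power spectrum per Fourier mode. Returns an amplitude map:
    at a source location the output equals the source amplitude."""
    d_k = np.fft.fft(data)
    p_k = np.fft.fft(profile)
    psi = np.conj(p_k) / noise_power          # matched-filter weights
    norm = np.sum(np.abs(p_k) ** 2 / noise_power).real
    return np.real(np.fft.ifft(psi * d_k)) * len(data) / norm

# Example: recover the amplitude of a known profile in a noiseless map
x = np.arange(64)
profile = np.exp(-0.5 * ((x - 32) / 3.0) ** 2)
amplitude_map = matched_filter_1d(2.0 * profile, profile, np.ones(64))
```

The "gentler" single-frequency alternative the abstract alludes to would down-weight the profile mismatch less aggressively; the trade-off is between bias (from filtering with the wrong profile) and completeness.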

    BDGS: A Scalable Big Data Generator Suite in Big Data Benchmarking

    Data generation is a key issue in big data benchmarking that aims to generate application-specific data sets to meet the 4V requirements of big data. Specifically, big data generators need to generate scalable data (Volume) of different types (Variety) under controllable generation rates (Velocity) while keeping the important characteristics of raw data (Veracity). This raises new challenges in how to design such generators efficiently and effectively. To date, most existing techniques can only generate limited types of data and support specific big data systems such as Hadoop. Hence we develop a tool, called Big Data Generator Suite (BDGS), to efficiently generate scalable big data while employing data models derived from real data to preserve data veracity. The effectiveness of BDGS is demonstrated by developing six data generators covering three representative data types (structured, semi-structured and unstructured) and three data sources (text, graph, and table data).
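The 4V requirements can be seen concretely in a toy text generator: target size gives Volume, a throttle gives Velocity, and sampling from word frequencies fitted to real data preserves a simple form of Veracity. This sketch is illustrative only; BDGS's actual data models and interfaces may differ:

```python
import random
import time

def text_generator(word_freqs, volume_mb, rate_mb_s, seed=0):
    """Yield chunks of synthetic text until `volume_mb` megabytes are
    produced (Volume), throttled to `rate_mb_s` MB/s (Velocity), sampling
    words from frequencies derived from real data (Veracity)."""
    rng = random.Random(seed)
    words, weights = zip(*word_freqs.items())
    written, target = 0, volume_mb * 1_000_000
    while written < target:
        chunk = " ".join(rng.choices(words, weights=weights, k=1000))
        written += len(chunk)
        yield chunk
        # sleep long enough that the cumulative rate matches the request
        time.sleep(len(chunk) / (rate_mb_s * 1_000_000))

chunks = list(text_generator({"alpha": 5, "beta": 1},
                             volume_mb=0.01, rate_mb_s=1000.0))
```

Scaling out is then a matter of running many such generators in parallel with different seeds, which keeps generation embarrassingly parallel.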

    Astrophotonic micro-spectrographs in the era of ELTs

    The next generation of Extremely Large Telescopes (ELTs), with diameters up to 39 meters, will start operation in the next decade and promises new challenges in the development of instruments. The growing field of astrophotonics (the use of photonic technologies in astronomy) can partly solve this problem by allowing mass production of fully integrated and robust instruments combining various optical functions, with the potential to reduce the size, complexity and cost of instruments. In this paper, we focus on developments in integrated micro-spectrographs and their potential for ELTs. We take an inventory of the identified technologies currently in development, and compare the performance of the different concepts. We show that in the current context of single-mode instruments, integrated spectrographs making use of, e.g., a photonic lantern can be a solution to reach the desired performance. However, in the longer term, there is a clear need to develop multimode devices to improve the overall throughput and sensitivity, while decreasing the instrument complexity. Comment: 9 pages, 2 figures. Proceedings of SPIE 9147, "Ground-based and Airborne Instrumentation for Astronomy V"

    High-Redshift Galaxies: Their Predicted Size and Surface Brightness Distributions and Their Gravitational Lensing Probability

    Direct observations of the first generation of luminous objects will likely become feasible over the next decade. The advent of the Next Generation Space Telescope (NGST) will allow imaging of numerous galaxies and mini-quasars at redshifts z>5. We apply semi-analytic models of structure formation to estimate the rate of multiple imaging of these sources by intervening gravitational lenses. Popular CDM models for galaxy formation yield a lensing optical depth of about 1% for sources at redshift 10. The expected slope of the luminosity function of the early sources implies an additional magnification bias of about 5, bringing the fraction of lensed sources at z=10 to about 5%. We estimate the angular size distribution of high-redshift disk galaxies and find that most of them are more extended than the resolution limit of NGST, roughly 0.06 arcseconds. We also show that there is only a modest redshift evolution in the mean surface brightness of galaxies at z>2. The expected increase by 1-2 orders of magnitude in the number of resolved sources on the sky, due to observations with NGST, will dramatically improve the statistical significance of existing weak lensing measurements. We show that, despite this increase in the density of sources, confusion noise from z>2 galaxies is expected to be small for NGST observations. Comment: 27 pages, 8 PostScript figures (of which two are new), revised version accepted for ApJ
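The step from a 1% optical depth to a 5% lensed fraction is simple arithmetic: magnification bias makes lensed sources over-represented in a flux-limited sample by a factor B, so among observed sources the lensed fraction is B*tau / (B*tau + (1 - tau)). A worked sketch with the abstract's numbers:

```python
def lensed_fraction(tau, bias):
    """Fraction of sources in a flux-limited sample that are multiply
    imaged, when lensed sources are over-represented by the magnification
    bias factor `bias` relative to their intrinsic optical depth `tau`."""
    return bias * tau / (bias * tau + (1.0 - tau))

f = lensed_fraction(0.01, 5.0)   # about 0.048, i.e. roughly the quoted 5%
```

For small tau this reduces to roughly bias * tau, which is why "1% optical depth times a bias of 5" is quoted as about 5%.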