13 research outputs found

    The DECam Ecliptic Exploration Project (DEEP) II. Observational Strategy and Design

    We present the DECam Ecliptic Exploration Project (DEEP) survey strategy, including observing cadence for orbit determination, exposure times, field pointings, and filter choices. The overall goal of the survey is to discover and characterize the orbits of a few thousand Trans-Neptunian Objects (TNOs) using the Dark Energy Camera (DECam) on the Cerro Tololo Inter-American Observatory (CTIO) Blanco 4 meter telescope. The experiment is designed to collect a very deep series of exposures totaling a few hours on sky for each of several 2.7 square degree DECam fields-of-view to achieve a magnitude of about 26.2 using a wide VR filter that encompasses both the V and R bandpasses. In the first year, several nights were combined to achieve a sky area of about 34 square degrees. In subsequent years, the fields have been re-visited to allow TNOs to be tracked for orbit determination. When complete, DEEP will be the largest survey of the outer solar system ever undertaken in terms of newly discovered object numbers, and the most prolific at producing multi-year orbital information for the population of minor planets beyond Neptune at 30 au. Comment: 29 pages, 4 figures, and 4 tables

    Deep Drilling in the Time Domain with DECam: Survey Characterization

    This paper presents a new optical imaging survey of four deep drilling fields (DDFs), two Galactic and two extragalactic, with the Dark Energy Camera (DECam) on the 4 meter Blanco telescope at the Cerro Tololo Inter-American Observatory (CTIO). During the first year of observations in 2021, >4000 images covering 21 square degrees (7 DECam pointings), with ~40 epochs (nights) per field and 5 to 6 images per night per filter in g, r, i, and/or z, have become publicly available (the proprietary period for this program is waived). We describe the real-time difference-image pipeline and how alerts are distributed to brokers via the same distribution system as the Zwicky Transient Facility (ZTF). In this paper, we focus on the two extragalactic deep fields (COSMOS and ELAIS-S1), characterizing the detected sources and demonstrating that the survey design is effective for probing the discovery space of faint and fast variable and transient sources. We describe and make publicly available 4413 calibrated light curves based on difference-image detection photometry of transients and variables in the extragalactic fields. We also present preliminary scientific analysis regarding Solar System small bodies, stellar flares and variables, Galactic anomaly detection, fast-rising transients and variables, supernovae, and active galactic nuclei. Comment: 22 pages, 17 figures, 2 tables. Accepted to MNRAS
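
    The difference-image detections described above can be illustrated with a minimal sketch: subtract a template from a science frame and flag pixels above a noise threshold. This is a toy illustration only; the arrays and threshold here are hypothetical, and the survey's real pipeline additionally handles astrometric alignment, PSF matching, and photometric calibration.

```python
import numpy as np

def detect_in_difference(science, template, nsigma=5.0):
    """Return (row, col) pixel coordinates exceeding nsigma in the difference image."""
    diff = science - template
    # Robust noise estimate via the median absolute deviation (MAD).
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    ys, xs = np.where(diff > nsigma * sigma)
    return list(zip(ys.tolist(), xs.tolist()))

rng = np.random.default_rng(0)
template = rng.normal(0.0, 1.0, (64, 64))            # static sky
science = template + rng.normal(0.0, 0.1, (64, 64))  # repeat visit, new noise
science[32, 32] += 50.0                              # inject a bright transient
print(detect_in_difference(science, template))       # recovers the injected pixel
```

    Because the static sky cancels in the subtraction, only sources that changed between visits survive, which is what makes the approach effective for faint, fast transients.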

    From Data to Software to Science with the Rubin Observatory LSST

    The Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) dataset will dramatically alter our understanding of the Universe, from the origins of the Solar System to the nature of dark matter and dark energy. Much of this research will depend on the existence of robust, tested, and scalable algorithms, software, and services. Identifying and developing such tools ahead of time has the potential to significantly accelerate the delivery of early science from LSST. Developing these collaboratively, and making them broadly available, can enable more inclusive and equitable collaboration on LSST science. To facilitate such opportunities, a community workshop entitled "From Data to Software to Science with the Rubin Observatory LSST" was organized by the LSST Interdisciplinary Network for Collaboration and Computing (LINCC) and partners, and held at the Flatiron Institute in New York, March 28-30th 2022. The workshop included over 50 in-person attendees invited from over 300 applications. It identified seven key software areas of need: (i) scalable cross-matching and distributed joining of catalogs, (ii) robust photometric redshift determination, (iii) software for determination of selection functions, (iv) frameworks for scalable time-series analyses, (v) services for image access and reprocessing at scale, (vi) object image access (cutouts) and analysis at scale, and (vii) scalable job execution systems. This white paper summarizes the discussions of this workshop. It considers the motivating science use cases, identified cross-cutting algorithms, software, and services, their high-level technical specifications, and the principles of inclusive collaborations needed to develop them. 
We provide it as a useful roadmap of needs, as well as to spur action and collaboration between groups and individuals looking to develop reusable software for early LSST science. Comment: White paper from "From Data to Software to Science with the Rubin Observatory LSST" workshop
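
    The first identified need, scalable cross-matching of catalogs, can be sketched in miniature as pairing sources by angular separation. The catalogs and match radius below are hypothetical, and a scalable implementation would use a spatial index (e.g. HEALPix partitioning or k-d trees) rather than the brute-force nearest-neighbor loop shown here.

```python
import numpy as np

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Haversine angular separation in degrees between sky positions."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    a = (np.sin((dec2 - dec1) / 2) ** 2
         + np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2)
    return np.degrees(2 * np.arcsin(np.sqrt(a)))

def crossmatch(cat_a, cat_b, radius_arcsec=1.0):
    """Return (i, j) pairs where cat_a[i]'s nearest cat_b source is within radius."""
    matches = []
    for i, (ra_a, dec_a) in enumerate(cat_a):
        seps = angular_sep_deg(ra_a, dec_a, cat_b[:, 0], cat_b[:, 1])
        j = int(np.argmin(seps))
        if seps[j] * 3600.0 <= radius_arcsec:
            matches.append((i, j))
    return matches

cat_a = np.array([[150.000, 2.200], [150.100, 2.300]])   # (RA, Dec) in degrees
cat_b = np.array([[150.0001, 2.2001], [151.000, 3.000]])
print(crossmatch(cat_a, cat_b))  # → [(0, 0)]: only the first pair is within 1"
```

    The distributed-join problem the workshop raises is essentially this operation applied across billions of rows, where the partitioning of the sky across workers dominates the design.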

    Sifting Through the Static: Exploring the Space Beyond Neptune with Digital Tracking

    Thesis (Ph.D.)--University of Washington, 2022. Trans-Neptunian Objects (TNOs) provide a window into the history of the Solar System, but they can be challenging to observe due to their distance from the Sun and relatively low brightness. Digital tracking helps address these challenges by algorithmically searching many possible TNO trajectories in a stack of images, enabling detection of TNOs too faint to detect in single images. Here we report the detection and characterization of 86 classical TNOs, 5 detached TNOs, 6 resonant TNOs, and 2 scattering TNOs that we could not link to any other known objects. We report measurements of semi-major axis, eccentricity, inclination, longitude of ascending node, argument of pericenter, and time of pericenter passage for these 99 objects. We also report values for the absolute magnitude H in the VR band with a largest measured H value of H=9.63. These objects are dynamically classified using 10 Myr REBOUND orbital integrations. Additionally, we report the detection of 75 moving objects with short ~4 day arcs that we could not link to any other known objects and place constraints on the barycentric distance, inclination, and longitude of ascending node of these objects. We describe extensions to the Kernel-Based Moving Object Detection (KBMOD) software that helped enable these detections, including an in-line graphics processing unit (GPU) filter, a convolutional neural network (CNN) stamp filter, and an astrometric and photometric post-processing tool. These tools enable KBMOD to take advantage of difference images and help ready KBMOD for deployment on future big data surveys such as LSST
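
    The core shift-and-stack idea behind digital tracking can be sketched as co-adding frames along a trial motion vector so a moving source's flux accumulates at one pixel. Everything below (frame counts, the (dx, dy) vector, the injected source) is illustrative; real tools such as KBMOD search many candidate vectors, use sub-pixel shifts, and run the search on GPUs.

```python
import numpy as np

def shift_and_stack(images, dx, dy):
    """Co-add frames after shifting frame k by (-k*dy, -k*dx) whole pixels."""
    stack = np.zeros_like(images[0], dtype=float)
    for k, img in enumerate(images):
        stack += np.roll(img, shift=(-round(k * dy), -round(k * dx)), axis=(0, 1))
    return stack / len(images)

rng = np.random.default_rng(1)
frames = []
for k in range(10):
    frame = rng.normal(0.0, 1.0, (32, 32))  # sky noise, ~1 sigma per frame
    frame[10 + k, 5 + 2 * k] += 3.0         # faint mover: 2 px/frame in x, 1 in y
    frames.append(frame)

stacked = shift_and_stack(frames, dx=2, dy=1)
y, x = np.unravel_index(np.argmax(stacked), stacked.shape)
print(y, x)  # the mover's co-added flux peaks at its frame-0 position, (10, 5)
```

    A source too faint to stand out in any single frame (3 sigma here) becomes a clear detection in the stack, because the noise averages down as roughly the square root of the number of frames while the aligned signal does not.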

    The DECam Ecliptic Exploration Project (DEEP). II. Observational Strategy and Design

    We present the DECam Ecliptic Exploration Project (DEEP) survey strategy, including observing cadence for orbit determination, exposure times, field pointings, and filter choices. The overall goal of the survey is to discover and characterize the orbits of a few thousand Trans-Neptunian objects (TNOs) using the Dark Energy Camera (DECam) on the Cerro Tololo Inter-American Observatory Blanco 4 m telescope. The experiment is designed to collect a very deep series of exposures totaling a few hours on sky for each of several 2.7 square degree DECam fields-of-view to achieve approximate depths of magnitude 26.2 using a wide VR filter that encompasses both the V and R bandpasses. In the first year, several nights were combined to achieve a sky area of about 34 square degrees. In subsequent years, the fields have been re-visited to allow TNOs to be tracked for orbit determination. When complete, DEEP will be the largest survey of the outer solar system ever undertaken in terms of newly discovered object numbers, and the most prolific at producing multiyear orbital information for the population of minor planets beyond Neptune at 30 au

    The DECam Ecliptic Exploration Project (DEEP). I. Survey Description, Science Questions, and Technical Demonstration

    We present here the DECam Ecliptic Exploration Project (DEEP), a 3 yr NOAO/NOIRLab Survey that was allocated 46.5 nights to discover and measure the properties of thousands of trans-Neptunian objects (TNOs) to magnitudes as faint as VR ∼ 27 mag, corresponding to sizes as small as 20 km diameter. In this paper we present the science goals of this project, the experimental design of our survey, and a technical demonstration of our approach. The core of our project is “digital tracking,” in which all collected images are combined at a range of motion vectors to detect unknown TNOs that are fainter than the single exposure depth of VR ∼ 23 mag. Through this approach, we reach a depth that is approximately 2.5 mag fainter than the standard LSST “wide fast deep” nominal survey depth of 24.5 mag. DEEP will more than double the number of known TNOs with observational arcs of 24 hr or more, and increase by a factor of 10 or more the number of known small (<50 km) TNOs. We also describe our ancillary science goals, including measuring the mean shape distribution of very small main-belt asteroids, and briefly outline a set of forthcoming papers that present further aspects of and preliminary results from the DEEP program
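
    The step from the single-exposure depth of VR ~ 23 mag to the stacked depths quoted above follows from point-source signal-to-noise growing as the square root of the number of co-added frames, giving a magnitude gain of 2.5 log10(sqrt(N)) = 1.25 log10(N). A quick check with illustrative exposure counts (not the survey's actual numbers):

```python
import math

def depth_gain_mag(n_exposures):
    """Magnitude gain from co-adding n exposures, assuming sqrt(N) S/N scaling."""
    return 1.25 * math.log10(n_exposures)

# ~100 stacked frames buy ~2.5 mag over a single exposure; a few hours of
# short exposures is enough to push VR ~ 23 toward the quoted faint limits.
print(round(depth_gain_mag(100), 2))  # → 2.5
```

    In practice the gain is somewhat smaller than this ideal scaling, since systematics and imperfect alignment do not average down like pure sky noise.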