
    A Data Exploration Tool for Large Sets of Spectra

    We present an exploration tool for very large spectrum data sets such as the SDSS (Sloan Digital Sky Survey), LAMOST (Large Sky Area Multi-Object Fiber Spectroscopic Telescope), and 4MOST (4-meter Multi-Object Spectroscopic Telescope) data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. The latter employs the NASA hyperwall, a configuration of 128 workstation displays (8 by 16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. The stellar subset of the Sloan Digital Sky Survey, DR10, was chosen to show how our tool may be used. In stage one, SDSS files for 569,740 stars are processed through our data pipeline. The pipeline fits each spectrum using an iterative continuum algorithm, distinguishing emission from absorption and handling molecular absorption bands correctly. It then measures 1659 discrete atomic and molecular spectral features that were carefully preselected based on their likelihood of being visible at some spectral type. The depths relative to the local continuum at each feature wavelength are determined for each spectrum: these depths, the local S/N (signal-to-noise ratio) level, and DR10-supplied variables such as magnitudes, colors, positions, and radial velocities are the basic measured quantities used on the hyperwall. In stage two, each hyperwall panel is used to display a 2-D scatter plot showing the depth of feature A vs. the depth of feature B for all of the stars. A and B change from panel to panel. The relationships between the various (A, B) strengths and any distinctive clustering are immediately apparent when examining and inter-comparing the different panels on the hyperwall. The interactive software allows the user to select the stars in any interesting region of any 2-D plot on the hyperwall, immediately rendering the same stars on all the other 2-D plots in a unique color. The process may be repeated multiple times, each selection displaying a distinctive color on all the plots. At any time, the spectra of the selected stars may be examined in detail on a connected workstation display. We illustrate how our approach allows us to quickly isolate and examine such interesting stellar subsets as EMP (Extremely Metal-Poor) stars, CV (Cataclysmic Variable) stars, and C (Carbon)-rich stars.
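
    As a rough illustration of the stage-one measurement, the sketch below computes one feature depth relative to a local continuum. It is a minimal Python sketch, assuming the continuum fit is already available; the function name, window width, and example wavelength are illustrative, not the paper's actual pipeline code.

```python
import numpy as np

def feature_depth(wavelength, flux, continuum, line_wl, half_width=2.0):
    # Depth of one spectral feature relative to the local continuum.
    # Positive values indicate absorption (flux below the continuum),
    # negative values indicate emission. half_width shares the units
    # of the wavelength array (e.g. Angstroms).
    mask = np.abs(wavelength - line_wl) <= half_width
    residual = flux[mask] / continuum[mask]   # continuum-normalized flux
    return 1.0 - residual.min()

# e.g. depth of the Ca II K line near 3933.7 Angstroms in one spectrum:
# depth = feature_depth(wl, fl, cont, 3933.7)
```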

    Multi-Star Wavefront Control for the Wide-Field Infrared Survey Telescope

    The Wide-Field Infrared Survey Telescope (WFIRST) is planned to have a coronagraphic instrument (CGI) to enable high-contrast direct imaging of exoplanets around nearby stars. The majority of nearby FGK stars are located in multi-star systems, including the Alpha Centauri stars, which may represent the best-quality targets for the CGI on account of their proximity and brightness, potentially allowing the direct imaging of rocky planets. However, a binary system exhibits additional leakage from the off-axis companion star that may be brighter than the target exoplanet. Multi-Star Wavefront Control (MSWC) is a wavefront-control technique that allows suppression of the starlight of both stars in a binary system, thus enabling direct imaging of circumstellar planets in binary star systems such as Alpha Centauri. We explore the capabilities of the WFIRST CGI instrument to directly image multi-star systems using MSWC. We consider several simulated scenarios using the WFIRST CGI's Shaped Pupil Coronagraph Disk Mask. First, we consider close binaries such as Mu Cassiopeiae that require no modifications to the WFIRST CGI instrument and can be handled as a purely algorithmic solution. Second, we consider wide binaries such as Alpha Centauri that require a diffraction grating to enable suppression of the off-axis starlight leakage at Super-Nyquist separations. We demonstrate via simulation dark holes in 10 percent broadband light compatible with the WFIRST CGI.
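
    The sub- vs. Super-Nyquist distinction above follows from the deformable mirror's actuator count: a DM with n actuators across the pupil can place controlled speckles only out to roughly (n/2) lambda/D. Below is a minimal sketch of that separation check; all numerical values are assumptions for illustration, not actual CGI specifications.

```python
import numpy as np

n_act = 48            # actuators across the deformable mirror (assumed)
wavelength = 575e-9   # m, notional band center (assumed)
diameter = 2.4        # m, WFIRST primary aperture

# Convert one lambda/D to milliarcseconds, then find the DM's
# controllable (Nyquist) limit of ~(n_act/2) lambda/D.
mas_per_lod = np.degrees(wavelength / diameter) * 3.6e6
nyquist_mas = (n_act / 2) * mas_per_lod

for name, sep in [("close binary", 800.0), ("Alpha Cen-like", 8000.0)]:
    regime = "Super-Nyquist" if sep > nyquist_mas else "sub-Nyquist"
    print(f"{name}: {sep:.0f} mas -> {regime} (limit {nyquist_mas:.0f} mas)")
```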

    Interactive Visualization of High-Dimensional Petascale Ocean Data

    We describe an application for interactive visualization of 5 petabytes of time-varying multivariate data from a high-resolution global ocean circulation model. The input data are 10311 hourly (ocean time) time steps of various 2D and 3D fields from a 22-billion-point 1/48-degree lat-lon cap configuration of the MIT General Circulation Model (MITgcm). We map the global horizontal model domain onto our 128-screen (8x16) tiled display wall to produce a canonical tiling with approximately one MITgcm grid point per display pixel, and using this tiling we encode the entire time series for multiple native and computed scalar quantities at a collection of ocean depths. We reduce disk bandwidth requirements by converting the model's floating-point data to 16-bit fixed-point values, and compressing those values with a lossless video encoder, which together allow synchronized playback at 24 time steps per second across all 128 displays. The application allows dynamic assignment of any two encoded tiles to any display, and has multiple interfaces for quickly specifying various orderly arrangements of tiles. All subsequent rendering is done on the fly, with run-time control of colormaps, transfer functions, histogram equalization, and labeling. The two data streams on each screen can be rendered independently and combined in various ways, including blending, differencing, horizontal/vertical wipes, and checkerboarding. The two data streams on any screen can optionally be displayed as a scatterplot in their joint attribute space. All scatterplots and map-view plots from the same x/y location and depth are linked so they all show the current brushable selection. Ocean scientists have used the system, and have found previously unidentified features in the data.
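
    The bandwidth-reduction step lends itself to a short sketch: quantize each floating-point field to 16-bit fixed point over a stored value range, then hand the stream to a lossless encoder. A minimal Python version of the quantization follows, assuming a fixed per-field range; the application's actual mapping and encoder settings are not given in the abstract.

```python
import numpy as np

def to_fixed16(field, vmin, vmax):
    # Quantize a float field to 16-bit fixed point on [vmin, vmax].
    # This halves storage relative to float32 and yields a stream a
    # lossless video encoder compresses well. The range endpoints must
    # be stored alongside the data to invert the mapping.
    scale = 65535.0 / (vmax - vmin)
    q = np.clip((field - vmin) * scale, 0, 65535)
    return np.round(q).astype(np.uint16)

def from_fixed16(q, vmin, vmax):
    # Invert the quantization (to within one quantization step).
    return q.astype(np.float32) * (vmax - vmin) / 65535.0 + vmin
```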

    LightForce Photon-Pressure Collision Avoidance: Updated Efficiency Analysis Utilizing a Highly Parallel Simulation Approach

    This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground-based, commercial off-the-shelf lasers. Past research has shown that a few ground-based systems consisting of 10-kilowatt-class lasers directed by 1.5-meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that regularly updates the LightForce engagement strategy, as it would be updated during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance, and the resulting expected efficiency of the LightForce collision avoidance system. Results indicate that, utilizing a network of four LightForce stations with 20-kilowatt lasers, 85% of all conjunctions with a probability of collision Pc > 10^-6 can be mitigated.
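
    A minimal sketch of the screening-and-engagement step and its embarrassingly parallel structure, in Python; the data structures and the per-step stub are illustrative, not the paper's code. Only the Pc threshold is taken from the abstract.

```python
from dataclasses import dataclass
from multiprocessing import Pool

PC_THRESHOLD = 1e-6   # engagement threshold quoted in the paper

@dataclass
class Conjunction:
    object_a: int     # catalog IDs of the two objects
    object_b: int
    pc: float         # estimated probability of collision

def screen(conjunctions):
    # Keep only conjunctions that exceed the engagement threshold.
    return [c for c in conjunctions if c.pc > PC_THRESHOLD]

def process_step(step):
    # Placeholder for one simulation time step: the real system would
    # propagate the TLE catalogue and compute Pc for all close pairs.
    conjunctions = [Conjunction(1, 2, 3e-6), Conjunction(3, 4, 1e-8)]
    return screen(conjunctions)

if __name__ == "__main__":
    # Time steps are independent, so they map cleanly onto many
    # processors (here scaled down to one machine).
    with Pool(processes=8) as pool:
        engagements = pool.map(process_step, range(100))
    print(sum(len(e) for e in engagements), "engagements flagged")
```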

    Implementation of an Open-Scenario, Long-Term Space Debris Simulation Approach

    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris remediation, including the LightForce space debris collision avoidance concept that diverts objects using photon pressure [9]. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps on the order of several days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions and the orbital and physical parameters of the objects involved in close encounters (conjunctions). Furthermore, maneuvers take place on timescales much smaller than days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in Low Earth Orbit (LEO) and propagates all objects with high precision and variable time steps as small as one second. It allows the assessment of the (potential) impact of physical or orbital changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space-track object catalog in LEO. We then use a high-precision propagator to propagate all objects over the entire simulation duration. If collisions are detected, the appropriate number of debris objects are created and inserted into the simulation framework. Depending on the scenario, further objects, e.g., due to new launches, can be added. At the end of the simulation, the total number of objects above a cut-off size and the number of detected collisions provide benchmark parameters for the comparison between scenarios. The simulation approach is computationally intensive as it involves tens of thousands of objects; hence we use a highly parallel approach employing up to a thousand cores on the NASA Pleiades supercomputer for a single run. This paper describes our simulation approach, the status of its implementation, the approach to developing scenarios, and examples of first test runs.
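
    A skeleton of the variable time-step loop described above, with placeholder hooks for the conjunction screen and collision handling. All names are illustrative; the paper's actual propagator and screening logic are not specified in the abstract.

```python
def any_close_approach(objects, t):
    # Placeholder conjunction screen; the real simulation computes
    # pairwise miss distances from the propagated states.
    return False

def detect_collisions(objects, t):
    # Placeholder; on a detected collision the real framework inserts
    # an appropriate number of new debris objects.
    return []

def run_simulation(objects, t_end, dt_max=60.0, dt_min=1.0):
    # Coarse steps normally, refined down to one-second steps (dt_min)
    # around close approaches, as described in the abstract.
    t = 0.0
    while t < t_end:
        dt = dt_min if any_close_approach(objects, t) else dt_max
        for obj in objects:
            obj.propagate(dt)   # high-precision propagator call
        objects.extend(detect_collisions(objects, t))
        t += dt
    return objects
```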

    LightForce Photon-pressure Collision Avoidance: Efficiency Analysis in the Current Debris Environment and Long-Term Simulation Perspective

    This work provides an efficiency analysis of the LightForce space debris collision avoidance scheme in the current debris environment and describes a simulation approach to assess its impact on the long-term evolution of the space debris environment. LightForce aims to provide just-in-time collision avoidance by utilizing photon pressure from ground-based industrial lasers. These ground stations impart small accelerations to increase the miss distance for a predicted conjunction between two objects. In the first part of this paper we present research that investigates the short-term effect of a few systems consisting of 10 kW class lasers directed by 1.5 m diameter telescopes using adaptive optics. The results show that such a network of ground stations could mitigate more than 85 percent of conjunctions and lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. While these are impressive numbers that indicate LightForce's utility in the short term, the remaining 15 percent of possible collisions include (among others) conjunctions between two massive objects that would add a large amount of debris if they collided. Still, conjunctions between massive objects and smaller objects can be mitigated. Hence we choose to expand the capabilities of the simulation software to investigate the overall effect of a network of LightForce stations on the long-term debris evolution. In the second part of this paper, we present the planned simulation approach for that effort.

    A Habitable-zone Earth-sized Planet Rescued from False Positive Status

    We report the discovery of an Earth-sized planet in the habitable zone of a low-mass star called Kepler-1649. The planet, Kepler-1649 c, is 1.06^{+0.15}_{-0.10} times the size of Earth and transits its 0.1977 +/- 0.0051 Msun mid-M-dwarf host star every 19.5 days. It receives 74 +/- 3 percent of the incident flux of Earth, giving it an equilibrium temperature of 234 +/- 20 K and placing it firmly inside the circumstellar habitable zone. Kepler-1649 also hosts a previously known inner planet that orbits every 8.7 days and is roughly equivalent to Venus in size and incident flux. Kepler-1649 c was originally classified as a false positive by the Kepler pipeline, but was rescued as part of a systematic visual inspection of all automatically dispositioned Kepler false positives. This discovery highlights the value of human inspection of planet candidates even as automated techniques improve, and hints that terrestrial planets around mid-to-late M dwarfs may be more common than those around more massive stars.
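
    The quoted equilibrium temperature can be sanity-checked from the quoted incident flux. The sketch below assumes an Earth-like Bond albedo, which may differ from the value the paper adopts.

```python
S = 0.74    # incident flux relative to Earth (from the abstract)
A = 0.3     # assumed Bond albedo (Earth-like); the paper's value may differ
T_EQ_EARTH_ZERO_ALBEDO = 278.6   # K, equilibrium temperature at 1 au, A = 0

# Equilibrium temperature scales as the fourth root of absorbed flux.
T_eq = T_EQ_EARTH_ZERO_ALBEDO * ((1 - A) * S) ** 0.25
print(f"T_eq ~ {T_eq:.0f} K")    # ~236 K, consistent with 234 +/- 20 K
```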

    Prospectus, February 16, 1978

    STU-GO VACANCIES FILLED TODAY, TOMORROW; Trail is unopposed for presidential post; Ballje, Berry, Henze vie for veep's job; Staff editorial: Should Parkland foot the bill for vets' education?; Parkland College News in Brief: CHI helps you understand the doctor, SWAMP meets, Land lab has good season, SIU rep here today; Return of the Hilltoppers: Clambering up Mount Parkland -- 'Because it was there!'; Treasury post draws two hopefuls; One running for secretary; Davis seeks PR position; Bundy unopposed in bid for student services post; Three candidates in race for convocations senator; Swanson pursuing office of day senator; Life spanning draws large crowd; Snow day melts extra study time; Will the big bands ever return?; Taped artist interviews at U of I; Toll free tax answers for Illinois residents; 'Furry friends' contest; Classifieds; State basketball tourney schedule; Women win 10th: Cobras take victory number 20; Long life program lists classes; Cherry Orchard opening is apple of Krannert's eye; Bookworms invited to U of I; It's tourney time; Women beat Kankakee, top .500; Bouncing Bob Basketball Bonanza: If you think LAST week was tough...; Bouncing Bob Basketball Bonanza; Men grab two more wins

    Planetary Candidates Observed by Kepler VI: Planet Sample from Q1-Q16 (47 Months)

    We present the sixth catalog of Kepler candidate planets based on nearly 4 years of high-precision photometry. This catalog builds on the legacy of previous catalogs released by the Kepler project and includes 1493 new Kepler Objects of Interest (KOIs), of which 554 are planet candidates; 131 of these candidates have best-fit radii < 1.5 R_earth. This brings the total number of KOIs and planet candidates to 7305 and 4173, respectively. We suspect that many of these new candidates at the low signal-to-noise limit may be false alarms created by instrumental noise, and discuss our efforts to identify such objects. We re-evaluate all previously published KOIs with orbital periods of > 50 days to provide a consistently vetted sample that can be used to improve planet occurrence rate calculations. We discuss the performance of our planet detection algorithms and the consistency of our vetting products. The full catalog is publicly available at the NASA Exoplanet Archive.
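
    Counts like these can be reproduced from the archive's cumulative KOI table. A minimal sketch, assuming a CSV export and the archive's usual column names; both the file path and columns should be verified against the actual download.

```python
import pandas as pd

kois = pd.read_csv("cumulative_koi_table.csv")   # hypothetical export path

# Select dispositioned planet candidates, then apply the radius cut
# (koi_prad is in Earth radii in the archive's conventions).
candidates = kois[kois["koi_disposition"] == "CANDIDATE"]
small = candidates[candidates["koi_prad"] < 1.5]

print(len(candidates), "candidates;", len(small), "with R < 1.5 R_earth")
```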

    Kepler Certified False Positive Table

    This document describes the Kepler Certified False Positive table hosted at the Exoplanet Archive, herein referred to as the CFP table. This table is the result of detailed examination by the Kepler False Positive Working Group (FPWG) of declared false positives in the Kepler Object of Interest (KOI) tables (see, for example, Batalha et al. (2012); Burke et al. (2014); Rowe et al. (2015); Mullally et al. (2015); Coughlin et al. (2015b)) at the Exoplanet Archive. A KOI is considered a false positive if it is not due to a planet orbiting the KOI's target star. The CFP table contains all KOIs in the Exoplanet Archive cumulative KOI table. The purpose of the CFP table is to provide a list of certified false positive KOIs. A KOI is certified as a false positive when, in the judgment of the FPWG, there is no plausible planetary interpretation of the observational evidence, which we summarize by saying that the evidence for a false positive is compelling. This certification process involves detailed examination using all available data for each KOI, establishing a high-reliability ground-truth set. The CFP table can be used to estimate the reliability of, for example, the KOI tables, which are created using only Kepler photometric data, so the disposition of individual KOIs may differ between the KOI and CFP tables. Follow-up observers may find the CFP table useful to avoid observing false positives.
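
    A minimal sketch of the reliability estimate the CFP table enables: treat certified false positives as ground truth and measure how often the photometry-only KOI dispositions agree. File paths, column names, and disposition strings below are illustrative and should be checked against the archive's actual exports.

```python
import pandas as pd

koi = pd.read_csv("koi_table.csv")   # photometry-only dispositions
cfp = pd.read_csv("cfp_table.csv")   # FPWG certifications (ground truth)

# Join the two tables on the KOI identifier, then ask how often the
# KOI table flags a KOI that the FPWG certified as a false positive.
merged = koi.merge(cfp[["kepoi_name", "fpwg_disposition"]], on="kepoi_name")
certified = merged[merged["fpwg_disposition"] == "CERTIFIED FP"]
agrees = certified["koi_disposition"] == "FALSE POSITIVE"

print(f"KOI table flags {agrees.mean():.1%} of certified false positives")
```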