
    System Applications Software Development and Testing for the Spaceport Command and Control System

    Known as "America's Spaceport," one of Kennedy Space Center's (KSC) primary responsibilities is the successful preparation for and launch of rockets into space. KSC's Engineering Software Branch has been tasked with creating a new command and control system that will provide check-out and launch control for future rockets and spacecraft. While work on the software began several years ago, development is ongoing and the operators who use the software on a daily basis have requested several features to improve their user experience. My internship in the fall of 2018 involved developing the source code and unit tests for two of these requested features: "Display Data with Persistence" (DDP) and "Save Events Button" (SEB). DDP's primary goal is to aid with ergonomics. Currently, users must press-and-hold on the mouse button to display information about points on a data plot. Once DDP is integrated, users will have the ability to double-click on a data plot to display that same information with persistence. Independent from DDP, the SEB provides users the ability to take information about different events that occur in the control system and save that data into a simple Comma Separated File (.csv) file format for easier analysis at a future time

    A Search for Technosignatures Around 11,680 Stars with the Green Bank Telescope at 1.15-1.73 GHz

    We conducted a search for narrowband radio signals over four observing sessions in 2020-2023 with the L-band receiver (1.15-1.73 GHz) of the 100 m diameter Green Bank Telescope. We pointed the telescope in the directions of 62 TESS Objects of Interest, capturing radio emissions from a total of ~11,680 stars and planetary systems in the ~9 arcminute beam of the telescope. All detections were either automatically rejected or visually inspected and confirmed to be of anthropogenic nature. In this work, we also quantified the end-to-end efficiency of radio SETI pipelines with a signal injection and recovery analysis. The UCLA SETI pipeline recovers 94.0% of the injected signals over the usable frequency range of the receiver and 98.7% of the injections when regions of dense RFI are excluded. In another pipeline that uses incoherent sums of 51 consecutive spectra, the recovery rate is ~15 times smaller, at ~6%. The pipeline efficiency affects calculations of transmitter prevalence and SETI search volume. Accordingly, we developed an improved Drake Figure of Merit and a formalism to place upper limits on transmitter prevalence that take the pipeline efficiency and transmitter duty cycle into account. Based on our observations, we can state at the 95% confidence level that fewer than 6.6% of stars within 100 pc host a transmitter that is detectable in our search (EIRP > 1e13 W). For stars within 20,000 ly, the fraction of stars with detectable transmitters (EIRP > 5e16 W) is at most 3e-4. Finally, we showed that the UCLA SETI pipeline natively detects the signals detected with AI techniques by Ma et al. (2023).
    Comment: 22 pages, 9 figures, submitted to AJ, revise
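
    To illustrate how a pipeline's recovery efficiency enters an upper limit on transmitter prevalence, the sketch below uses the standard zero-detection binomial bound. It is an assumption-laden stand-in, not the paper's formalism (which also folds in transmitter duty cycle), and the numbers shown are illustrative rather than the survey's actual star counts:

        def prevalence_upper_limit(n_stars, efficiency, confidence=0.95):
            # With zero detections among n_stars, and a pipeline that recovers a fraction
            # `efficiency` of real signals, require (1 - f * efficiency)**n_stars = 1 - confidence
            # and solve for f, the largest prevalence consistent with detecting nothing.
            return (1.0 - (1.0 - confidence) ** (1.0 / n_stars)) / efficiency

        # Illustrative only: 1,000 stars searched with a 94% efficient pipeline.
        print(prevalence_upper_limit(n_stars=1000, efficiency=0.94))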

    Planning Satellite Swarm Measurements for Earth Science Models: Comparing Constraint Processing and MILP Methods

    We compare two planner solutions for a challenging Earth science application: planning coordinated measurements (observations) for a constellation of satellites. This problem is combinatorially explosive, involving many degrees of freedom for planner choices. Each satellite carries two different sensors and is maneuverable to 61 pointing angle options. The sensors collect data to update the predictions made by a high-fidelity global soil moisture prediction model. Soil moisture is an important geophysical variable whose knowledge is used in applications such as crop health monitoring and predictions of floods, droughts, and fires. The global soil-moisture model produces soil-moisture predictions, with associated prediction errors, over the globe, represented by a grid of 1.67 million Ground Positions (GPs). The prediction error varies over space and time and can change drastically with events such as rain or fire. The planner's goal is to select measurements that reduce prediction errors and thereby improve future predictions. This is done by targeting high-quality observations at locations with high prediction error. Observations can be made in multiple ways, such as by using one or more instruments or different pointing angles; the planner seeks to select the way with the lowest measurement error (highest observation quality). In this paper we compare two planning approaches to this problem: Dynamic Constraint Processing (DCP) and Mixed Integer Linear Programming (MILP). We match inputs and metrics for both the DCP and MILP algorithms to enable a direct apples-to-apples comparison. DCP uses domain heuristics to find solutions within a reasonable time for our application but cannot be proven optimal, while MILP produces provably optimal solutions. We demonstrate and discuss the trade-offs between DCP's flexibility and performance and MILP's promise of provable optimality.
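
    To make the MILP side of the comparison concrete, here is a toy observation-selection model written with the open-source PuLP library; the satellite list, pointing angles, timesteps, and error-reduction values are invented for illustration, and the model is far simpler than the formulation in the paper:

        import random
        from pulp import LpProblem, LpVariable, LpMaximize, lpSum

        satellites = ["sat1", "sat2"]
        angles = range(3)          # toy stand-in for the 61 pointing-angle options
        timesteps = range(2)

        # Assumed benefit (prediction-error reduction) of satellite s using angle a at time t.
        random.seed(0)
        error_reduction = {(s, a, t): random.random()
                           for s in satellites for a in angles for t in timesteps}

        prob = LpProblem("swarm_observation_selection", LpMaximize)
        x = {(s, a, t): LpVariable(f"select_{s}_{a}_{t}", cat="Binary")
             for (s, a, t) in error_reduction}

        # Objective: maximize the total prediction-error reduction of the chosen observations.
        prob += lpSum(error_reduction[k] * x[k] for k in error_reduction)

        # Constraint: each satellite points in at most one direction per timestep.
        for s in satellites:
            for t in timesteps:
                prob += lpSum(x[(s, a, t)] for a in angles) <= 1

        prob.solve()
        chosen = [k for k in error_reduction if x[k].value() and x[k].value() > 0.5]
        print(chosen)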