
    Paper Session III-B - Mars Transportation System Synthesis

    President George Bush's 1989 challenge to America to support the Space Exploration Initiative (SEI) of Back to the Moon and Manned Mission to Mars gives the space industry an opportunity to achieve effective and efficient space transportation systems (STSs). This paper presents performance and requirements synthesized to support the manned Mars mission of the SEI. The information presented focuses primarily on the Mars transportation system (MTS), which uses nuclear thermal rocket (NTR) propulsion technology to accomplish the manned Mars mission. Data are also shown for a propulsion-system options comparison of chemical/aerobrake and nuclear electric propulsion (NEP) systems.

    Paper Session I-B - Nuclear Thermal Rocket Propulsion Application to Mars Missions

    This paper discusses vehicle configuration options using nuclear thermal rocket (NTR) propulsion application to Mars missions. A reference mission in 2016 using an opposition-class Mars transfer trajectory is assumed. The total mission duration is 435 days. A single 75,000-lb-thrust nuclear engine is used for all major propulsive maneuvers. The studies indicate that three perigee kick burns upon leaving Earth result in the lowest stage weights required in low Earth orbit (LEO). The stay time on Mars is assumed to be 30 days. On the interplanetary return leg en route to Earth, a gravity assist by Venus is employed. The reference mission assumes that the nuclear engine delivers a specific impulse of 925 s with an engine thrust-to-weight ratio of 4. The total stage thrust-to-weight ratio was 0.06. To determine which engine parameters were most critical to good mission performance, calculations were performed over a range of specific impulses and thrust-to-weight ratios. One of the major conclusions resulting from this study is that engine specific impulse is the single most important engine parameter in reducing overall stage weight, provided the engine thrust-to-weight ratio is above approximately 4. Lower engine thrust-to-weight ratios were found to incur severe performance penalties.
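The conclusion that specific impulse dominates stage weight follows from the Tsiolkovsky rocket equation. A minimal illustration: only the 925 s NTR figure comes from the abstract; the 4 km/s burn and the 450 s chemical specific impulse are illustrative assumptions, not values from the paper.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def mass_ratio(delta_v_ms: float, isp_s: float) -> float:
    """Initial-to-final mass ratio from the Tsiolkovsky rocket equation."""
    return math.exp(delta_v_ms / (isp_s * G0))

# Compare an NTR stage (Isp = 925 s, as in the reference mission) with an
# assumed chemical stage (Isp = 450 s) for an illustrative 4 km/s burn.
ntr = mass_ratio(4000.0, 925.0)
chem = mass_ratio(4000.0, 450.0)
print(f"NTR mass ratio:      {ntr:.2f}")
print(f"Chemical mass ratio: {chem:.2f}")
```

Because the mass ratio is exponential in delta-v divided by specific impulse, a roughly doubled specific impulse sharply reduces the propellant fraction, and hence the stage weight required in LEO.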

    Comparing index-based vulnerability assessments in the Mississippi Delta: Implications of contrasting theories, indicators, and aggregation methodologies

    There are many index-based approaches for assessing vulnerability to socio-natural hazards with differences in underlying theory, indicator selection and aggregation methodology. Spatially explicit output scores depend on these characteristics and contrasting approaches can therefore lead to very different policy implications. These discrepancies call for more critical reflection on index design and utility, a discussion that has not kept pace with the impetus for vulnerability assessments and respective index creation and application following the Hyogo Framework for Action 2005–2015. Comparing index outputs is an effective approach in this regard. Here, the Social Vulnerability Index (SoVI®) and the vulnerability component of the Global Delta Risk Index (GDRI) are applied at census tract level in the Mississippi Delta and visually and quantitatively compared. While the SoVI® is grounded in the hazard/risk research paradigm with primarily socio-economic indicators and an inductive principal component methodology, the GDRI incorporates advancements from sustainability science with ecosystem-based indicators and a modular hierarchical design. Maps, class rank changes, and correlations are used to assess the convergence and divergence of these indexes across the delta. Results show that while very different theoretical frameworks influence scores through indicator selection, methodology of index calculation has an even greater effect on output. Within aggregative methodology, the treatment of inter-indicator correlation is decisive. Implications include the need for an increased focus on index methodology and validation of results, transparency, and critical reflection regarding assessment limitations, as our results imply that contradictory risk reduction policies could be considered depending on the assessment methodology used.
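The claim that aggregation methodology, and in particular the treatment of inter-indicator correlation, can drive rankings more than indicator choice can be illustrated with a toy example. The tract values and the crude "grouping" of correlated indicators below are invented for illustration; neither SoVI® nor the GDRI is implemented here.

```python
import statistics

def zscores(values):
    """Standardize indicator values to zero mean, unit (population) variance."""
    mu = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mu) / sd for v in values]

# Toy tracts with three indicators (hypothetical values, not from the study):
# poverty rate, share of elderly residents, wetland loss fraction.
tracts = {
    "A": [0.35, 0.30, 0.1],
    "B": [0.10, 0.10, 0.9],
    "C": [0.20, 0.20, 0.5],
}

# Standardize each indicator across tracts.
cols = list(zip(*tracts.values()))
zcols = [zscores(list(c)) for c in cols]
zrows = {name: [zcols[j][i] for j in range(len(cols))]
         for i, name in enumerate(tracts)}

# Scheme 1: equal-weight additive aggregation (every indicator counts once).
equal = {t: sum(z) for t, z in zrows.items()}

# Scheme 2: treat the two correlated socio-economic indicators as a single
# component (a crude stand-in for a PCA-style grouping) before adding.
grouped = {t: 0.5 * (z[0] + z[1]) + z[2] for t, z in zrows.items()}

rank = lambda s: sorted(s, key=s.get, reverse=True)
print("equal-weight ranking:", rank(equal))
print("grouped ranking:     ", rank(grouped))
```

With identical indicators and identical data, the two aggregation schemes rank the tracts differently, which is exactly the kind of divergence that can flip the policy implications of an assessment.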

    The SKA Particle Array Prototype: The First Particle Detector at the Murchison Radio-astronomy Observatory

    We report on the design, deployment, and first results from a scintillation detector deployed at the Murchison Radio-astronomy Observatory (MRO). The detector is a prototype for a larger array, the Square Kilometre Array Particle Array (SKAPA), planned to allow the radio-detection of cosmic rays with the Murchison Widefield Array and the low-frequency component of the Square Kilometre Array. The prototype design has been driven by stringent limits on radio emissions at the MRO, and by the need to ensure survivability in a desert environment. Using data taken from Nov. 2018 to Feb. 2019, we characterize the detector response while accounting for the effects of temperature fluctuations, and calibrate the sensitivity of the prototype detector to through-going muons. This verifies the feasibility of cosmic ray detection at the MRO. We then estimate the required parameters of a planned array of eight such detectors to be used to trigger radio observations by the Murchison Widefield Array.

    PEACE: Parallel Environment for Assembly and Clustering of Gene Expression

    We present PEACE, a stand-alone tool for high-throughput ab initio clustering of transcript fragment sequences produced by Next Generation or Sanger Sequencing technologies. It is freely available from www.peace-tools.org. Installed and managed through a downloadable user-friendly graphical user interface (GUI), PEACE can process large data sets of transcript fragments of length 50 bases or greater, grouping the fragments by gene associations with a sensitivity comparable to leading clustering tools. Once clustered, the user can employ the GUI's analysis functions, facilitating the easy collection of statistics and allowing the user to single out specific clusters for more comprehensive study or assembly. Using a novel minimum spanning tree-based clustering method, PEACE is the equal of leading tools in the literature, with an interface making it accessible to any user. It produces results of quality virtually identical to those of the WCD tool when applied to Sanger sequences, significantly improved results over WCD and TGICL when applied to the products of Next Generation Sequencing technology, and significantly improved results over Cap3 in both cases. In short, PEACE provides an intuitive GUI and a feature-rich, parallel clustering engine that proves to be a valuable addition to the leading cDNA clustering tools.
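The core idea of minimum-spanning-tree clustering can be sketched in a few lines: connect fragments by increasing pairwise distance and never add an edge heavier than a cutoff, so the surviving forest's components are the clusters. The k-mer Jaccard distance and the cutoff below are illustrative stand-ins, not PEACE's actual distance measure or parameters.

```python
from itertools import combinations

def kmer_set(seq, k=3):
    """Set of overlapping k-mers in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def distance(a, b, k=3):
    """Jaccard distance between k-mer sets: a toy stand-in for the
    alignment-free distances used by real clustering tools."""
    sa, sb = kmer_set(a, k), kmer_set(b, k)
    return 1.0 - len(sa & sb) / len(sa | sb)

def mst_clusters(seqs, cutoff):
    """Kruskal's algorithm, skipping edges heavier than `cutoff`: the
    resulting forest equals the full MST with its long edges cut, and
    its connected components are the clusters."""
    parent = list(range(len(seqs)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    edges = sorted(
        (distance(seqs[i], seqs[j]), i, j)
        for i, j in combinations(range(len(seqs)), 2)
    )
    for d, i, j in edges:
        if d <= cutoff:
            parent[find(i)] = find(j)  # union the two components
    groups = {}
    for i in range(len(seqs)):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Two toy "genes": fragments 0-1 overlap each other, as do 2-3.
frags = ["ACGTACGTGG", "GTACGTGGTT", "TTTTCCCCAA", "TTCCCCAAGG"]
print(mst_clusters(frags, cutoff=0.7))
```

The appeal of the MST formulation is that the clustering for every possible cutoff is implicit in one tree, which also makes the computation easy to distribute across processors.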

    Gridded and direct Epoch of Reionisation bispectrum estimates using the Murchison Widefield Array

    We apply two methods to estimate the 21 cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly-spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 hours of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 hours, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21 cm bispectrum may be accessible in less time than the 21 cm power spectrum for some wave modes, with detections in hundreds of hours. Accepted for publication in PASA.
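For a single closed triangle of baselines (b1 + b2 + b3 = 0), the direct estimator reduces to averaging the triple product of the three visibilities. A minimal sketch with simulated visibilities (not MWA data); the amplitudes, phases, and noise level are invented. Because the three phases are chosen to sum to zero, the triple product recovers unit amplitude and zero closure phase despite the per-baseline noise.

```python
import cmath
import random

def direct_bispectrum(v1, v2, v3):
    """Direct bispectrum estimate for a closed baseline triangle
    (b1 + b2 + b3 = 0): the average of the visibility triple products."""
    return sum(a * b * c for a, b, c in zip(v1, v2, v3)) / len(v1)

random.seed(1)
n = 10000

def noisy(phase):
    """Unit-amplitude visibility at a fixed phase plus complex noise."""
    return [cmath.exp(1j * phase)
            + 0.1 * complex(random.gauss(0, 1), random.gauss(0, 1))
            for _ in range(n)]

# Three simulated visibility streams whose phases sum to zero, as they
# must around a closed triangle.
v1, v2, v3 = noisy(0.7), noisy(1.1), noisy(-1.8)
b = direct_bispectrum(v1, v2, v3)
print(abs(b), cmath.phase(b))
```

Averaging the triple product over many samples, and over the redundant copies of each triangle in a compact array, is what beats the noise down toward a detection.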

    A new layout optimization technique for interferometric arrays, applied to the MWA

    Antenna layout is an important design consideration for radio interferometers because it determines the quality of the snapshot point spread function (PSF, or array beam). This is particularly true for experiments targeting the 21 cm Epoch of Reionization signal, as the quality of the foreground subtraction depends directly on the spatial dynamic range and thus the smoothness of the baseline distribution. Nearly all sites have constraints on where antennas can be placed: even at the remote Australian location of the MWA (Murchison Widefield Array) there are rock outcrops, flood zones, heritage areas, emergency runways and trees. These exclusion areas can introduce spatial structure into the baseline distribution that enhances the PSF sidelobes and reduces the angular dynamic range. In this paper we present a new method of constrained antenna placement that reduces the spatial structure in the baseline distribution. This method not only outperforms random placement algorithms that avoid exclusion zones, but surprisingly also outperforms random placement algorithms without constraints, providing what we believe are the smoothest constrained baseline distributions developed to date. We use our new algorithm to determine antenna placements for the originally planned MWA, and present the antenna locations, baseline distribution, and snapshot PSF for this array choice. Accepted for publication in MNRAS.
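For contrast with the paper's method, the baseline approach it outperforms, random placement that simply avoids exclusion zones, can be sketched as follows. The array size, antenna count, and the single circular exclusion zone are invented for illustration; the paper's actual algorithm is not implemented here.

```python
import math
import random

def in_exclusion(x, y, zones):
    """True if (x, y) falls inside any circular exclusion zone (cx, cy, r)."""
    return any(math.hypot(x - cx, y - cy) < r for cx, cy, r in zones)

def random_constrained_layout(n, size, zones, rng):
    """Rejection sampling: draw uniform positions in a size x size field and
    discard any that land inside an exclusion zone."""
    antennas = []
    while len(antennas) < n:
        x, y = rng.uniform(0, size), rng.uniform(0, size)
        if not in_exclusion(x, y, zones):
            antennas.append((x, y))
    return antennas

def baselines(antennas):
    """All pairwise baseline lengths; their distribution sets the PSF."""
    return sorted(
        math.hypot(ax - bx, ay - by)
        for i, (ax, ay) in enumerate(antennas)
        for bx, by in antennas[i + 1:]
    )

rng = random.Random(42)
zones = [(500.0, 500.0, 150.0)]  # one illustrative circular exclusion zone
layout = random_constrained_layout(128, 1000.0, zones, rng)
bl = baselines(layout)
print(len(layout), len(bl))  # 128 antennas give 128*127/2 = 8128 baselines
```

The weakness of this approach, which motivates the paper, is that the hole punched in the antenna distribution imprints structure on the baseline distribution; the proposed method instead optimizes placements so the constrained baseline distribution stays smooth.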