New Predictions for Neutrino Telescope Event Rates
Recent measurements of the small-x deep-inelastic regime at HERA translate
to new expectations for the neutrino-nucleon cross section at ultrahigh
energies. We present event rates for large underground neutrino telescopes
based on the new cross section for a variety of models of neutrino production
in Active Galactic Nuclei, and we compare these rates with earlier cross
section calculations.
Comment: Talk presented by I. Sarcevic at the VIth International Workshop on Theoretical Aspects of Underground Physics, Toledo, Spain, September 17-21, 1995; 3 pages
A Multi-Faceted Approach to Enabling Large-Scale Science in a Microsat Constellation
The Polarimeter to UNify the Corona and Heliosphere (PUNCH) mission is a constellation of microsatellites that combines advances in several areas of technology, enabling the use of simple imaging instrumentation to measure to-date inaccessible aspects of the outer corona and solar wind. The primary PUNCH measurement is the brightness and polarization state of light scattered by electrons entrained in solar wind features. This measurement is made possible within a Small Explorer budget by leveraging a combination of three key elements: (a) a constellation of four small satellites conducting synchronized observations, (b) the availability of low-cost off-the-shelf components, and (c) advanced and rigorous science data processing that enables the four microsats to produce 3D images as a single virtual observatory. This paper discusses the contribution of each of these key enablers and presents the overall status of this NASA Small Explorer mission, scheduled for launch in 2025.
Square Root Compression and Noise Effects in Digitally Transformed Images
We report on a particular example of noise and data representation interacting to introduce systematic error into scientific measurements. Many instruments collect integer digitized values and apply nonlinear coding, in particular square root coding, to compress the data for transfer or downlink; this can introduce surprising systematic errors when the data are decoded for analysis. Square root coding and subsequent decoding typically introduce a variable, value-dependent systematic bias of up to ±1 count in the data after reconstitution. This is significant when large numbers of measurements (e.g., image pixels) are averaged together. Using direct modeling of the probability distribution of particular coded values in the presence of instrument noise, one may apply Bayes' theorem to construct a decoding table that reduces this error source to a very small fraction of a digitizer step; in our example, systematic error from square root coding is reduced by a factor of 20, from 0.23 to 0.012 count rms. The method is suitable both for new experiments such as the upcoming PUNCH mission and for post facto application to existing data sets, even if the instrument noise properties are only loosely known. Further, the method does not depend on the specifics of the coding formula, and may be applied to other forms of nonlinear coding or representation of data values.
lowderchris/FRoDO: v0.4
The Flux Rope Detection and Organization (FRoDO) code has a few goals in mind:
Automated and consistent detection of magnetic flux rope structures
Operation independent of dataset, such that the end user only needs to provide datacubes of magnetic field components and associated grids
Tracking of detected flux ropes in time, outputting locations and associated statistics
By importing the FRoDO library of routines, there are several subroutines for use:
FRoDO.prep() - Computes magnetic vector potential values in the deVore gauge from input magnetic field data.
FRoDO.FRoDO() - Detects and organizes time histories for magnetic flux ropes.
FRoDO.erupt() - Detects and flags erupting flux rope signatures.
FRoDO.plot() - Creates a standard set of plot outputs for detected flux ropes.
FRoDO.stats() - Computes a series of statistics for erupting and non-erupting magnetic flux ropes.
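As a sketch of the kind of computation FRoDO.prep() performs, the DeVore-gauge vector potential (A_z = 0) can be built by cumulatively integrating the horizontal field components in height, A_x(z) = A_x(z0) + ∫ B_y dz' and A_y(z) = A_y(z0) - ∫ B_x dz'. The helper below is illustrative only: the function name and argument layout are assumptions, not the FRoDO API, and the z0 boundary terms are simply taken as zero, whereas a full implementation derives them from the lower-boundary B_z.

```python
import numpy as np

def devore_vector_potential(bx, by, z):
    """Vector potential in the DeVore gauge (A_z = 0), by trapezoidal
    integration of the horizontal field along height. Arrays are ordered
    (nz, ny, nx); z is the 1-D height grid. Boundary terms at z[0] are
    set to zero for simplicity (hypothetical helper, not FRoDO.prep)."""
    ax = np.zeros(bx.shape, dtype=float)
    ay = np.zeros(by.shape, dtype=float)
    for k in range(1, len(z)):
        dz = z[k] - z[k - 1]
        # B_y = dA_x/dz and B_x = -dA_y/dz in this gauge convention
        ax[k] = ax[k - 1] + 0.5 * dz * (by[k] + by[k - 1])
        ay[k] = ay[k - 1] - 0.5 * dz * (bx[k] + bx[k - 1])
    return ax, ay

# sanity check: a uniform horizontal field B = (1, 0, 0)
# should give A_y = -z with A_x = 0 everywhere
z = np.linspace(0.0, 1.0, 11)
bx = np.ones((11, 3, 3))
by = np.zeros((11, 3, 3))
ax, ay = devore_vector_potential(bx, by, z)
```

The trapezoid rule is exact here because the integrand is constant; on real datacubes the same cumulative sum runs over each horizontal layer of the model volume.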
A sample dataset is available on request for testing at this stage, along with assistance in adapting the code to user data sets.
Magnetic Flux Rope Identification and Characterization from Observationally Driven Solar Coronal Models
Formed through magnetic field shearing and reconnection in the solar corona, magnetic flux ropes are structures of twisted magnetic field, threaded along an axis. Their evolution and potential eruption are of great importance for space weather. Here we describe a new methodology for the automated detection of flux ropes in simulated magnetic fields, utilizing field-line helicity. Our Flux Rope Detection and Organization (FRoDO) code, which measures the magnetic flux and helicity content of pre-erupting flux ropes over time, as well as detecting eruptions, is publicly available. As a first demonstration, the code is applied to the output from a time-dependent magnetofrictional model, spanning 1996 June 15–2014 February 10. Over this period, 1561 erupting and 2099 non-erupting magnetic flux ropes are detected, tracked, and characterized. For this particular model data, erupting flux ropes have a mean net helicity magnitude of Mx², while non-erupting flux ropes have a significantly lower mean of Mx², although there is overlap between the two distributions. Similarly, the mean unsigned magnetic flux for erupting flux ropes is Mx, significantly higher than the mean value of Mx for non-erupting ropes. These values for erupting flux ropes are within the broad range expected from observational and theoretical estimates, although the eruption rate in this particular model is lower than that of observed coronal mass ejections. In the future, the FRoDO code will prove to be a valuable tool for assessing the performance of different non-potential coronal simulations and comparing them with observations.
Coronal Holes and Open Magnetic Flux over Cycles 23 and 24
As the observational signature of the footprints of solar magnetic field lines open into the heliosphere, coronal holes provide a critical measure of the structure and evolution of these lines. Using a combination of Solar and Heliospheric Observatory/Extreme ultraviolet Imaging Telescope (SOHO/EIT), Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA), and Solar Terrestrial Relations Observatory/Extreme Ultraviolet Imager (STEREO/EUVI A/B) extreme ultraviolet (EUV) observations spanning 1996–2015 (nearly two solar cycles), coronal holes are automatically detected and characterized. Coronal hole area distributions show distinct behavior in latitude, defining the domain of polar and low-latitude coronal holes. The northern and southern polar regions show a clear asymmetry, with a lag between hemispheres in the appearance and disappearance of polar coronal holes.
A Unified Framework for Manipulating N-dimensional Astronomical Data and Coordinate Transformations in Python: The NDCube 2 and Astropy APE-14 World Coordinate System APIs
The NDCube 2 API is a Python application programming interface (API) for storing and manipulating N-dimensional coordinate-aware astronomical data. While there are Python packages for handling astronomical data and coordinate transformations separately and for handling specific combinations of dimensions and transformations, none provide a unified and agnostic way of handling them simultaneously. This leads to a proliferation of different APIs for conducting the same analysis tasks on similar types of observations and introduces technical barriers between multi-instrument studies and cross-community collaboration. In this paper, we outline how the NDCube 2 API and its implementation in the open-source, community-developed ndcube package, together with the Astropy WCS API, help to solve this problem. We discuss the guiding principles underpinning the API design and provide examples of how it is already being used to serve broad sections of the astronomy community, including agency-funded missions. The aim of this paper is to help users better understand the purpose and potential of the NDCube 2 API and ndcube package and hence how to more effectively deploy them in scientific analyses and software development.