MC80: Quantifying Effect of Fleet Health on Sortie Execution in F-16 Fleet
By order of the Secretary of Defense, all of the US Air Force's F-16 units were tasked to improve their fleet health to a Mission Capability (MC) rate of 80 percent, as part of a Department of Defense-wide push to make its Critical Aviation Platforms, and the units that employ them, more ready and lethal. This study uses historical fleet health and sortie execution data captured from LIMS-EV (Weapon System Viewer, 2020) to create a multiple regression model that quantifies the value of increased fleet health, defined as either MC rate or Aircraft Availability (AA) rate, in terms of increasing sortie output. It also uses forecasted near-future sortie demand to assess the utility of the 80 percent MC rate standard toward achieving desired sortie execution levels. This research concludes that both AA rate and MC rate correlate with increased aircraft utilization and that an increase in either fleet health metric correlates to increased annual utilization of roughly five sorties per aircraft. It also identifies AA rate as a more significant input to sortie execution than MC rate. Furthermore, it suggests that an AA rate standard of 71 percent is most appropriate for achieving the aircraft utilization levels needed to satisfy pilot training requirements.
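As an illustration of the kind of multiple regression described above, the sketch below fits annual sorties per aircraft against AA and MC rates. All numbers, coefficients, and variable names here are synthetic inventions for illustration; they are not the study's LIMS-EV data or its fitted model.

```python
# Toy multiple regression in the spirit of the study: annual sorties per
# aircraft regressed on fleet-health rates. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 300
aa = rng.uniform(0.55, 0.85, n)   # Aircraft Availability (AA) rate
mc = rng.uniform(0.60, 0.90, n)   # Mission Capability (MC) rate
# Synthetic "truth" in which AA matters more than MC, echoing the finding
# that AA rate is the more significant input to sortie execution.
sorties = 40 + 500 * aa + 200 * mc + rng.normal(0, 5, n)

# Ordinary least squares fit: design matrix with an intercept column.
X = np.column_stack([np.ones(n), aa, mc])
beta, *_ = np.linalg.lstsq(X, sorties, rcond=None)
intercept, b_aa, b_mc = beta
print(f"sorties ~ {intercept:.1f} + {b_aa:.1f}*AA + {b_mc:.1f}*MC")
```

Dividing a fitted coefficient by 100 gives the expected change in annual sorties per aircraft per percentage point of fleet health, which is how a "roughly five sorties" figure can be read off such a model.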
Invest to Save: Report and Recommendations of the NSF-DELOS Working Group on Digital Archiving and Preservation
Digital archiving and preservation are important areas for research and development, but there is no agreed-upon set of priorities or coherent plan for research in this area. Research projects tend to be small and driven by particular institutional problems or concerns. As a consequence, proposed solutions from experimental projects and prototypes tend not to scale to millions of digital objects, nor do the results from disparate projects readily build on each other. It is also unclear whether it is worthwhile to seek general solutions or whether different strategies are needed for different types of digital objects and collections. The lack of coordination in both research and development means that researchers are reinventing the wheel in some areas while other areas are neglected.
Digital archiving and preservation will benefit from an exercise in analysis, priority setting, and planning for future research. The Working Group aims to survey current research activities, identify gaps, and develop a white paper proposing future research directions in digital preservation. Potential areas for research include repository architectures and interoperability among digital archives; automated tools for capture, ingest, and normalization of digital objects; and harmonization of preservation formats and metadata. There may also be opportunities for development of commercial products in the areas of mass storage systems, repositories and repository management systems, and data management software and tools.
The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST
The focus of this report is on the opportunities enabled by the combination
of LSST, Euclid and WFIRST, the optical surveys that will be an essential part
of the next decade's astronomy. The sum of these surveys has the potential to
be significantly greater than the contributions of the individual parts. As is
detailed in this report, the combination of these surveys should give us
multi-wavelength high-resolution images of galaxies and broadband data covering
much of the stellar energy spectrum. These stellar and galactic data have the
potential of yielding new insights into topics ranging from the formation
history of the Milky Way to the mass of the neutrino. However, enabling the
astronomy community to fully exploit this multi-instrument data set is a
challenging technical task: for much of the science, we will need to combine
the photometry across multiple wavelengths with varying spectral and spatial
resolution. We identify some of the key science enabled by the combined surveys
and the key technical challenges in achieving the synergies.
Comment: Whitepaper developed at June 2014 U. Penn Workshop; 28 pages, 3 figures
Health Recommender Systems Development, Usage, and Evaluation from 2010 to 2022: A Scoping Review
A health recommender system (HRS) provides a user with personalized medical information based on the user's health profile. This scoping review aims to identify and summarize HRS development in the most recent decade by focusing on five key aspects: health domain, user, recommended item, recommendation technology, and system evaluation. We searched PubMed, ACM Digital Library, IEEE Xplore, Web of Science, and Scopus databases for English literature published between 2010 and 2022. Our study selection and data extraction followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews. The following are the primary results: sixty-three studies met the eligibility criteria and were included in the data analysis. These studies involved twenty-four health domains, with both patients and the general public as target users and ten major recommended items. The most adopted algorithm of recommendation technologies was the knowledge-based approach. In addition, fifty-nine studies reported system evaluations, in which two types of evaluation methods and three categories of metrics were applied. However, despite existing research progress on HRSs, the health domains, recommended items, and sample sizes of system evaluation have been limited. In the future, HRS research should focus on dynamic user modelling, utilizing open-source knowledge bases, and evaluating the efficacy of HRSs using a large sample size. In conclusion, this study summarized the research activities and evidence pertinent to HRSs in the most recent ten years and identified gaps in the existing research landscape. Further work should address the gaps and continue improving the performance of HRSs to empower users in terms of healthcare decision making and self-management.
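Since the knowledge-based approach was the most adopted recommendation technology among the reviewed systems, a minimal sketch of that idea follows: items are matched to a user's profile through explicit rules rather than learned ratings. The conditions, items, and rules below are hypothetical examples, not drawn from any reviewed HRS.

```python
# Minimal knowledge-based recommender: explicit condition -> item rules.
# All conditions and items are invented for illustration.
KNOWLEDGE_BASE = {
    "hypertension": ["low-sodium meal plan", "home blood-pressure monitoring guide"],
    "type2_diabetes": ["glucose self-monitoring tutorial", "carbohydrate-counting guide"],
    "sedentary": ["beginner walking program"],
}

def recommend(profile):
    """Return items whose rule conditions appear in the user's health profile."""
    items = []
    for condition in profile:
        items.extend(KNOWLEDGE_BASE.get(condition, []))
    # De-duplicate while preserving rule order.
    return list(dict.fromkeys(items))

print(recommend(["hypertension", "sedentary"]))
```

Real knowledge-based HRSs layer richer ontologies and constraint reasoning on top of this matching step, but the rule-lookup core is the same.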
Reducing Zero-point Systematics in Dark Energy Supernova Experiments
We study the effect of filter zero-point uncertainties on future supernova
dark energy missions. Fitting for calibration parameters using simultaneous
analysis of all Type Ia supernova standard candles achieves a significant
improvement over more traditional fit methods. This conclusion is robust under
diverse experimental configurations (number of observed supernovae, maximum
survey redshift, inclusion of additional systematics). This approach to
supernova fitting considerably eases otherwise stringent mission calibration
requirements. As an example we simulate a space-based mission based on the
proposed JDEM satellite; however the method and conclusions are general and
valid for any future supernova dark energy mission, ground or space-based.
Comment: 30 pages, 8 figures, 5 tables, one reference added, submitted to Astroparticle Physics
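The core idea, fitting calibration nuisance parameters simultaneously with the science parameters, can be sketched on a toy model. The two-filter setup, the linear "Hubble diagram," and all numbers below are invented simplifications, not the paper's simulation.

```python
# Toy simultaneous fit: solve for a Hubble-diagram slope AND a filter
# zero-point offset in one least-squares problem, rather than calibrating
# first and fitting afterwards. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.uniform(0.1, 1.7, n)
band = rng.integers(0, 2, n)            # which of two filters observed each SN
true_offsets = np.array([0.03, -0.02])  # unknown zero-point errors (mag)
# Simplified "Hubble diagram": magnitude linear in log z, plus offset + noise.
m = 24.0 + 5.0 * np.log10(z) + true_offsets[band] + rng.normal(0, 0.1, n)

# Design matrix: intercept, slope, and the second filter's zero-point.
# An overall common offset is degenerate with the intercept, so only the
# RELATIVE zero-point between filters is identifiable in this toy.
X = np.column_stack([np.ones(n), np.log10(z), (band == 1).astype(float)])
beta, *_ = np.linalg.lstsq(X, m, rcond=None)
print(f"fitted relative zero-point: {beta[2]:.3f} mag")
```

The fitted `beta[2]` recovers the relative offset (here -0.05 mag) from the supernova data themselves, which is the sense in which self-calibration eases external calibration requirements.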
Weak Lensing from Space I: Instrumentation and Survey Strategy
A wide field space-based imaging telescope is necessary to fully exploit the
technique of observing dark matter via weak gravitational lensing. This first
paper in a three part series outlines the survey strategies and relevant
instrumental parameters for such a mission. As a concrete example of hardware
design, we consider the proposed Supernova/Acceleration Probe (SNAP). Using
SNAP engineering models, we quantify the major contributions to this
telescope's Point Spread Function (PSF). These PSF contributions are relevant
to any similar wide field space telescope. We further show that the PSF of SNAP
or a similar telescope will be smaller than current ground-based PSFs, and more
isotropic and stable over time than the PSF of the Hubble Space Telescope. We
outline survey strategies for two different regimes: a "wide" 300 square
degree survey and a "deep" 15 square degree survey that will accomplish
various weak lensing goals including statistical studies and dark matter
mapping.
Comment: 25 pages, 8 figures, 1 table, replaced with Published Version
Supernova / Acceleration Probe: A Satellite Experiment to Study the Nature of the Dark Energy
The Supernova / Acceleration Probe (SNAP) is a proposed space-based
experiment designed to study the dark energy and alternative explanations of
the acceleration of the Universe's expansion by performing a series of
complementary systematics-controlled measurements. We describe a
self-consistent reference mission design for building a Type Ia supernova
Hubble diagram and for performing a wide-area weak gravitational lensing study.
A 2-m wide-field telescope feeds a focal plane consisting of a 0.7
square-degree imager tiled with equal areas of optical CCDs and near infrared
sensors, and a high-efficiency low-resolution integral field spectrograph. The
SNAP mission will obtain high-signal-to-noise calibrated light-curves and
spectra for several thousand supernovae at redshifts between z=0.1 and 1.7. A
wide-field survey covering one thousand square degrees resolves ~100 galaxies
per square arcminute. If we assume we live in a cosmological-constant-dominated
Universe, the matter density, dark energy density, and flatness of space can
all be measured with SNAP supernova and weak-lensing measurements to a
systematics-limited accuracy of 1%. For a flat universe, the
density-to-pressure ratio of dark energy can be similarly measured to 5% for
the present value w0 and ~0.1 for the time variation w'. The large survey area,
depth, spatial resolution, time-sampling, and nine-band optical to NIR
photometry will support additional independent and/or complementary dark-energy
measurement approaches as well as a broad range of auxiliary science programs.
(Abridged)
Comment: 40 pages, 18 figures, submitted to PASP, http://snap.lbl.go
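The quoted precisions on w0 and w' refer to an expansion of the dark-energy equation of state about the present epoch. A common linear parameterization from SNAP-era forecasts, assumed here as background rather than quoted from the abstract, is:

```latex
w(z) \;=\; \frac{p_{\mathrm{DE}}}{\rho_{\mathrm{DE}}\,c^{2}} \;\approx\; w_0 + w'\,z,
\qquad w' \equiv \left.\frac{\mathrm{d}w}{\mathrm{d}z}\right|_{z=0}
```

A cosmological constant corresponds to w0 = -1 and w' = 0, so measured deviations from those values at the quoted precision would signal dynamical dark energy.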
A Study of Time-Dependent CP-Violating Asymmetries and Flavor Oscillations in Neutral B Decays at the Upsilon(4S)
We present a measurement of time-dependent CP-violating asymmetries in
neutral B meson decays collected with the BABAR detector at the PEP-II
asymmetric-energy B Factory at the Stanford Linear Accelerator Center. The data
sample consists of 29.7 fb^-1 recorded at the Upsilon(4S) resonance and 3.9
fb^-1 off-resonance. One of the neutral B mesons, which are produced in pairs
at the Upsilon(4S), is fully reconstructed in charmonium CP decay modes or in
flavor-eigenstate modes involving charm mesons. The flavor of the other neutral
B meson is tagged at the time of its decay, mainly with the charge of
identified leptons and kaons. The proper time elapsed between the decays is
determined by measuring the distance between the decay vertices. A
maximum-likelihood fit to this flavor-eigenstate sample determines the B0-B0bar
oscillation frequency. The value of the asymmetry amplitude sin2beta is
determined from a simultaneous maximum-likelihood fit to the time-difference
distribution of the flavor-eigenstate sample and about 642 tagged decays in the
CP-eigenstate modes. We find a value of sin2beta significantly different from
zero, demonstrating that CP violation exists in the neutral B meson system.
(abridged)
Comment: 58 pages, 35 figures, submitted to Physical Review
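In the standard formalism, and neglecting the experimental dilution and resolution effects the fit must model, the time-dependent asymmetry measured in charmonium CP-eigenstate modes takes the form below (up to a sign convention); this equation is supplied here as background, not quoted from the abstract:

```latex
\mathcal{A}_{CP}(\Delta t)
  = \frac{N\!\left(\bar{B}^0(\Delta t) \to f_{CP}\right) - N\!\left(B^0(\Delta t) \to f_{CP}\right)}
         {N\!\left(\bar{B}^0(\Delta t) \to f_{CP}\right) + N\!\left(B^0(\Delta t) \to f_{CP}\right)}
  = \sin 2\beta \, \sin(\Delta m_d \, \Delta t)
```

This is why the analysis needs both the oscillation frequency Delta m_d from the flavor-eigenstate sample and the vertex separation that determines Delta t: the CP asymmetry amplitude sin2beta multiplies an oscillation in the proper-time difference.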
Measurement of the Branching Fraction for B- --> D0 K*-
We present a measurement of the branching fraction for the decay B- --> D0
K*- using a sample of approximately 86 million BBbar pairs collected by the
BaBar detector from e+e- collisions near the Y(4S) resonance. The D0 is
detected through its decays to K- pi+, K- pi+ pi0 and K- pi+ pi- pi+, and the
K*- through its decay to K0S pi-. We measure the branching fraction to be
B.F.(B- --> D0 K*-) = (6.3 +/- 0.7(stat.) +/- 0.5(syst.)) x 10^{-4}.
Comment: 7 pages, 1 postscript figure, submitted to Phys. Rev. D (Rapid Communications)
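For readers combining the quoted uncertainties, adding the statistical and systematic errors in quadrature (a common convention, assumed here rather than stated in the abstract) gives the total uncertainty:

```python
# Combine the quoted stat. and syst. uncertainties on the branching
# fraction in quadrature (all values in units of 10^-4).
import math

bf, stat, syst = 6.3, 0.7, 0.5
total = math.sqrt(stat**2 + syst**2)
print(f"BF = ({bf} +/- {total:.2f}) x 10^-4")
```

This yields a total uncertainty of about 0.86 in the same units, i.e. a roughly 14 percent measurement.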