Topographic Signatures in Aquarius Radiometer/Scatterometer Response: Initial Results
The effect of topography on remote sensing at L-band is examined using co-located Aquarius radiometer and scatterometer observations over land. A correlation with slope standard deviation is demonstrated at topographic scales for both the radiometer and the scatterometer. Although the goal of Aquarius is remote sensing of sea surface salinity, the radiometer and scatterometer operate continuously and collect data useful for remote sensing research over land. The research reported here uses these land data to determine whether topography affects passive remote sensing at L-band. We report observations from two study regions: North Africa between 15 deg and 30 deg north latitude, and Australia excluding Tasmania. Both regions share a semi-arid climate and low population density, conditions favorable for isolating the effect of topography from other sources of scatter and emission such as vegetation and urban areas. Over these study regions, topographic-scale slopes within each Aquarius pixel are computed, and their standard deviations are compared with Aquarius scatterometer and radiometer observations over a 36-day period between days 275 and 311 of 2011.
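A minimal sketch of this kind of per-pixel slope statistic, assuming a fine-grid digital elevation model and a simple finite-difference slope (the abstract does not specify the authors' exact slope definition or aggregation scheme):

```python
import numpy as np

def slope_std_per_pixel(dem, cell_size, block):
    """Standard deviation of terrain slope inside coarse radiometer pixels.

    dem       : 2-D array of elevations (m) on a fine grid
    cell_size : fine-grid spacing (m)
    block     : number of fine cells per coarse (Aquarius-scale) pixel side
    """
    # Finite-difference slope magnitude (degrees) on the fine grid.
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    # Standard deviation of slope within each coarse pixel via block reshaping.
    ny = (slope.shape[0] // block) * block
    nx = (slope.shape[1] // block) * block
    tiles = slope[:ny, :nx].reshape(ny // block, block, nx // block, block)
    return tiles.std(axis=(1, 3))

# Compared against co-located observations on the same coarse grid, e.g.:
# r = np.corrcoef(slope_std.ravel(), sigma0.ravel())[0, 1]
```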
Increased Accuracy in the Measurement of the Dielectric Constant of Seawater at 1.413 GHz
This paper describes the latest results for measurements of the dielectric constant of seawater at 1.413 GHz using a resonant cavity technique. The purpose of these measurements is to develop an accurate relationship for the dependence of the dielectric constant of seawater on temperature and salinity, which is needed by the Aquarius inversion algorithm to retrieve salinity. Aquarius is the major instrument on the Aquarius/SAC-D observatory, a NASA/CONAE satellite mission launched in June of 2011 with the primary mission of measuring global sea surface salinity to an accuracy of 0.2 psu. Aquarius measures salinity with a 1.413 GHz radiometer and uses a scatterometer to compensate for the effects of surface roughness. The core of the seawater dielectric constant measurement system is a brass microwave cavity that is resonant at 1.413 GHz. The seawater is introduced into the cavity through a capillary glass tube with an inner diameter of 0.1 mm. The change in resonance frequency and the cavity Q value are used to determine the real and imaginary parts of the dielectric constant of the seawater in the tube. Measurements are automated with software developed at the George Washington University. In this talk, new results from measurements made since September 2010 will be presented for salinities of 30, 35 and 38 psu over a temperature range of 0 C to 35 C in intervals of 5 C. These measurements are more accurate than the earlier measurements made in 2008 because of a new method for measuring the calibration constant using methanol. In addition, the variance of repeated seawater measurements has been reduced by letting the system stabilize overnight between temperature changes. The new results are compared to the Klein-Swift and Meissner-Wentz model functions. The importance of an accurate model function will be illustrated by using these model functions to invert the Aquarius brightness temperature to obtain salinity values, which will be compared to co-located in situ data collected by Argo buoys.
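For a small sample volume, the measured frequency shift and Q change map to the complex permittivity through the standard cavity perturbation relations; a sketch of the general form (the geometry factors A and B stand in for calibration constants of the GWU system, whose exact working equations are not given in the abstract):

```latex
\frac{f_0 - f_s}{f_s} \;\approx\; A\,(\varepsilon' - 1)\,\frac{V_s}{V_c},
\qquad
\frac{1}{Q_s} - \frac{1}{Q_0} \;\approx\; B\,\varepsilon''\,\frac{V_s}{V_c}
```

Here f_0 and Q_0 refer to the empty cavity, f_s and Q_s to the cavity with the sample, and V_s/V_c is the sample-to-cavity volume ratio; the methanol measurement mentioned above fixes the calibration constants.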
Comparisons of Aquarius Measurements over Oceans with Radiative Transfer Models at L-Band
The Aquarius/SAC-D spacecraft includes three L-band (1.4 GHz) radiometers dedicated to measuring sea surface salinity. It was launched in June 2011 by NASA and CONAE (the Argentine space agency). We report detailed comparisons of Aquarius measurements with radiative transfer model predictions. These comparisons are used as part of the initial assessment of Aquarius data and to estimate the radiometer calibration bias and stability. Comparisons are also being performed to assess the performance of the models used in the retrieval algorithm to correct for various sources of geophysical "noise" (e.g. Faraday rotation, surface roughness). Such corrections are critical in bringing the error in retrieved salinity down to the required 0.2 practical salinity units on monthly global maps at 150 km by 150 km resolution.
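One of the corrections named above, Faraday rotation, is commonly estimated at L-band directly from the polarimetric measurements themselves; a standard sketch, assuming the third Stokes parameter emitted by the surface is negligible so that the measured T_3 is produced entirely by the rotation:

```latex
\Omega \;\simeq\; \tfrac{1}{2}\arctan\!\left(\frac{T_3}{T_v - T_h}\right),
\qquad
I = T_v + T_h,
\quad
Q = \frac{T_v - T_h}{\cos 2\Omega},
\quad
T_v^{\mathrm{corr}} = \frac{I + Q}{2},
\quad
T_h^{\mathrm{corr}} = \frac{I - Q}{2}
```

where T_v, T_h and T_3 are the antenna-level brightness temperatures; the first Stokes parameter I is invariant under the rotation, which is what makes the correction well posed.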
A U-Net Deep Learning Framework for High Performance Vessel Segmentation in Patients With Cerebrovascular Disease
Brain vessel status is a promising biomarker for better prevention and treatment in cerebrovascular disease. However, classic rule-based vessel segmentation algorithms must be hand-crafted and are insufficiently validated. A specialized deep learning method, the U-net, is a promising alternative. Using labeled data from 66 patients with cerebrovascular disease, the U-net framework was optimized and evaluated with three metrics: Dice coefficient, 95% Hausdorff distance (95HD) and average Hausdorff distance (AVD). Model performance was compared with the traditional graph-cuts segmentation method. Training and reconstruction were performed using 2D patches. A full architecture and a reduced architecture with fewer parameters were trained. We performed both quantitative and qualitative analyses. The U-net models yielded high performance for both the full and the reduced architecture: a Dice value of ~0.88, a 95HD of ~47 voxels and an AVD of ~0.4 voxels. The visual analysis revealed excellent performance in large vessels and sufficient performance in small vessels. Pathologies such as cortical laminar necrosis and a rete mirabile limited segmentation performance in a few patients. The U-net outperformed the traditional graph-cuts method (Dice ~0.76, 95HD ~59, AVD ~1.97). Our work strongly encourages the development of clinically applicable segmentation tools based on deep learning. Future work should focus on improved segmentation of small vessels and on methodologies for dealing with specific pathologies.
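The three evaluation metrics are standard and straightforward to compute for binary masks; a minimal sketch using common definitions (95HD as the 95th percentile of symmetric surface distances, AVD as their mean; the paper may use a slightly different implementation):

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice(pred, truth):
    """Dice coefficient between two boolean masks."""
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

def surface_metrics(pred, truth):
    """(95HD, AVD) in voxels from symmetric surface distances."""
    sp = pred & ~binary_erosion(pred)      # boundary voxels of prediction
    st = truth & ~binary_erosion(truth)    # boundary voxels of ground truth
    dt_to_truth = distance_transform_edt(~st)  # distance to truth surface
    dt_to_pred = distance_transform_edt(~sp)   # distance to pred surface
    d = np.concatenate([dt_to_truth[sp], dt_to_pred[st]])
    return np.percentile(d, 95), d.mean()
```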
Indirect study of 19Ne states near the 18F+p threshold
The early E < 511 keV gamma-ray emission from novae depends critically on the 18F(p,a)15O reaction. Unfortunately, the rate of the 18F(p,a)15O reaction is still largely uncertain due to the unknown strengths of low-lying proton resonances near the 18F+p threshold, which play an important role in the nova temperature regime. We report here our latest results concerning the study of the d(18F,p)19F(alpha)15N transfer reaction. We show in particular that two of these low-lying resonances cannot be neglected. These results are then used to perform a careful study of the remaining uncertainties associated with the 18F(p,a)15O and 18F(p,g)19Ne reaction rates. Comment: 18 pages, 8 figures. Accepted in Nuclear Physics.
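For context, the quantity these resonance strengths feed into is the standard narrow-resonance stellar reaction rate; for a single narrow resonance of energy E_r and strength ωγ (standard nuclear astrophysics, not spelled out in the abstract):

```latex
N_A \langle \sigma v \rangle
\;=\;
N_A \left( \frac{2\pi}{\mu k T} \right)^{3/2} \hbar^{2}\,
\omega\gamma\;
e^{-E_r / kT}
```

with μ the reduced mass of the 18F+p system. The exponential dependence on E_r is why resonances just above the threshold dominate at nova temperatures, and why their unknown strengths dominate the rate uncertainty.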
Quantifying Robotic Swarm Coverage
In the field of swarm robotics, the design and implementation of spatial density control laws has received much attention, with less emphasis placed on performance evaluation. This work fills that gap by introducing an error metric that provides a quantitative measure of coverage for use with any control scheme. The proposed error metric is continuously sensitive to changes in the swarm distribution, unlike commonly used discretization methods. We analyze the theoretical and computational properties of the error metric and propose two benchmarks to which error metric values can be compared. The first uses the realizable extrema of the error metric to compute the relative error of an observed swarm distribution. We also show that the error metric extrema can be used to help choose the swarm size and effective radius of each robot required to achieve a desired level of coverage. The second benchmark compares the observed distribution of error metric values to the probability density function of the error metric when robot positions are randomly sampled from the target distribution. We demonstrate the utility of this benchmark in assessing the performance of stochastic control algorithms. We prove that the error metric obeys a central limit theorem, develop a streamlined method for performing computations, and place the standard statistical tests used here on a firm theoretical footing. We provide rigorous theoretical development, computational methodologies, numerical examples, and MATLAB code for both benchmarks. Comment: To appear in the Springer series Lecture Notes in Electrical Engineering (LNEE). This book contribution is an extension of our ICINCO 2018 conference paper arXiv:1806.02488. 27 pages, 8 figures, 2 tables.
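The abstract does not define the metric itself; one plausible instance of a coverage error that varies continuously with robot positions (unlike bin-count methods) is an L2 distance between a kernel-smoothed swarm density and the target density. A Python sketch under that assumption (a hypothetical form, not the authors' MATLAB implementation, which accompanies the paper):

```python
import numpy as np

def coverage_error(positions, target_pdf, radius, grid):
    """L2 coverage error between a kernel-smoothed swarm and a target pdf.

    positions  : (N, 2) array of robot positions
    target_pdf : callable target_pdf(X, Y) returning the desired density
    radius     : effective robot radius, used as the kernel width
    grid       : (xs, ys) 1-D coordinate arrays defining the evaluation grid
    """
    xs, ys = grid
    X, Y = np.meshgrid(xs, ys)
    dx, dy = xs[1] - xs[0], ys[1] - ys[0]

    # Empirical swarm density: one Gaussian "blob" per robot.
    rho = np.zeros_like(X, dtype=float)
    for px, py in positions:
        rho += np.exp(-((X - px) ** 2 + (Y - py) ** 2) / (2 * radius ** 2))
    rho /= rho.sum() * dx * dy        # normalize to integrate to 1

    # Integrated squared difference from the target density.
    diff = rho - target_pdf(X, Y)
    return np.sqrt(np.sum(diff ** 2) * dx * dy)
```

Because each robot enters through a smooth kernel, an infinitesimal move of any robot changes the error infinitesimally, which is the continuity property the abstract emphasizes.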
Aquarius Radiometer Status
Aquarius was launched on June 10, 2011 as part of the Aquarius/SAC-D observatory, and the instrument has been operating continuously since being turned on in August of the same year. The initial map of sea surface salinity was released one month later (September), and the quality of the retrieval has improved continuously since then. The Aquarius radiometers include several special features, such as measurement of the third Stokes parameter, fast sampling, careful thermal control, and a combined passive/active design. Aquarius is working well, and in addition to supporting the salinity measurement, these special radiometer features are generating new results.
Les Houches 2015: Physics at TeV Colliders Standard Model Working Group Report
This Report summarizes the proceedings of the 2015 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high-precision Standard Model calculations, (II) the new PDF4LHC parton distributions, (III) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate it to experimental measurements, (IV) a host of phenomenological studies essential for comparing LHC data from Run I with theoretical predictions and projections for future measurements in Run II, and (V) new developments in Monte Carlo event generators. Comment: Proceedings of the Standard Model Working Group of the 2015 Les Houches Workshop, Physics at TeV Colliders, Les Houches, 1-19 June 2015. 227 pages.
Explaining the decline in coronary heart disease mortality in Turkey between 1995 and 2008.
BACKGROUND: Coronary heart disease (CHD) mortality rates have been decreasing in Turkey since the early 1990s. Our study aimed to determine how much of the CHD mortality decrease in Turkey between 1995 and 2008 could be attributed to temporal trends in major risk factors and how much to advances in medical and surgical treatments.
METHODS: The validated IMPACT CHD mortality model was used to combine and analyse data on the uptake and effectiveness of CHD treatments and on risk factor trends in Turkey in adults aged 35-84 years between 1995 and 2008. Data sources on population, mortality and major CHD risk factors for adults aged 35-84 years were identified, searched and appraised. Official statistics, electronic databases, national registers, surveys and published trials were screened from 1995 onwards.
RESULTS: Between 1995 and 2008, coronary heart disease mortality rates in Turkey decreased by 34% in men and 28% in women aged 35 years and over. This resulted in 35,720 fewer deaths in 2008. Approximately 47% of this mortality decrease was attributed to treatments in individuals (including approximately 16% to secondary prevention, 3% to angina treatments, 9% to heart failure treatments, 5% to initial treatments of acute myocardial infarction, and 5% to hypertension treatments), and approximately 42% was attributable to population risk factor reductions (notably blood pressure, 29%; smoking, 27%; and cholesterol, 1%). Adverse trends were seen for obesity and diabetes (potentially increasing mortality by approximately 11% and 14%, respectively). The model explained almost 90% of the mortality fall.
CONCLUSION: Reductions in major cardiovascular risk factors explained approximately 42%, and improvements in medical and surgical treatments explained some 47%, of the CHD mortality fall. These findings emphasize the complementary value of primary prevention and evidence-based medical treatments in controlling coronary heart disease.
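A quick illustrative calculation makes the attribution concrete, using only the figures quoted above (rounded; illustrative arithmetic, not taken from the paper's tables):

```python
# Illustrative arithmetic from the figures quoted in the abstract.
fewer_deaths_2008 = 35_720           # CHD deaths prevented or postponed in 2008

share_treatments   = 0.47            # medical and surgical treatments
share_risk_factors = 0.42            # population risk factor reductions

print(round(fewer_deaths_2008 * share_treatments))    # ~16,790 deaths
print(round(fewer_deaths_2008 * share_risk_factors))  # ~15,000 deaths
# Together these account for ~89-90% of the fall, matching the
# "model explained almost 90%" statement; the remainder is unexplained.
```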
