The end-to-end testbed of the Optical Metrology System on-board LISA Pathfinder
LISA Pathfinder is a technology demonstration mission for the Laser
Interferometer Space Antenna (LISA). The main experiment on-board LISA
Pathfinder is the so-called LISA Technology Package (LTP), whose aim is to
measure the differential acceleration between two free-falling test masses with
an accuracy of 3x10^(-14) m s^(-2)/sqrt[Hz] between 1 mHz and 30 mHz. This
measurement is performed interferometrically by the Optical Metrology System
(OMS) on-board LISA Pathfinder. In this paper we present the development of an
experimental end-to-end testbed of the entire OMS. It includes the
interferometer and its sub-units, the interferometer back-end, which is a
phasemeter, and the processing of the phasemeter output data. Furthermore,
three-axis piezo-actuated mirrors are used instead of the free-falling test
masses to characterise the dynamic behaviour of the system and of some parts
of the Drag-free and Attitude Control System (DFACS), which controls the test
masses and the satellite. The end-to-end testbed includes all parts of the LTP
that can reasonably be tested on Earth without free-falling test masses. At
present it consists mainly of breadboard components. Some of these have
already been replaced by Engineering Models of the LTP experiment. In the next
steps, further Engineering Models and Flight Models will also be inserted into
this testbed and tested against well-characterised breadboard components. The
presented testbed is an important reference for the unit tests and can also be
used for validation of the on-board experiment during the mission.
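As a rough illustration of the quoted requirement, the following sketch (Python) estimates the amplitude spectral density of a processed differential-acceleration time series with Welch's method and compares its worst in-band value against 3x10^(-14) m s^(-2)/sqrt[Hz] between 1 mHz and 30 mHz. The sampling rate, segment length and synthetic noise level are illustrative assumptions, not parameters of the OMS testbed.

```python
# Hedged sketch: check a differential-acceleration time series against the
# LTP requirement of 3e-14 m s^-2/sqrt(Hz) between 1 mHz and 30 mHz.
# Sampling rate, segment length and the synthetic data are assumptions,
# not values taken from the testbed described in the paper.
import numpy as np
from scipy.signal import welch

REQUIREMENT = 3e-14          # m s^-2 / sqrt(Hz)
BAND = (1e-3, 30e-3)         # Hz
FS = 10.0                    # Hz, assumed sampling rate of the phasemeter output

rng = np.random.default_rng(0)
# Placeholder for the processed phasemeter output: white acceleration noise
# with an amplitude spectral density of roughly 1e-14 m s^-2/sqrt(Hz).
accel = 1e-14 * np.sqrt(FS / 2.0) * rng.standard_normal(1_000_000)

# Amplitude spectral density via Welch's method (long segments for mHz resolution).
f, psd = welch(accel, fs=FS, nperseg=2**19)
asd = np.sqrt(psd)

in_band = (f >= BAND[0]) & (f <= BAND[1])
worst = asd[in_band].max()
print(f"worst in-band ASD: {worst:.2e} m s^-2/sqrt(Hz) "
      f"({'meets' if worst < REQUIREMENT else 'exceeds'} the requirement)")
```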
Unbiased likelihood-free inference of the Hubble constant from light standard sirens
Multimessenger observations of binary neutron star mergers offer a promising path toward resolution of the Hubble constant (H_0) tension, provided their constraints are shown to be free from systematics such as the Malmquist bias. In the traditional Bayesian framework, accounting for selection effects in the likelihood requires calculation of the expected number (or fraction) of detections as a function of the parameters describing the population and cosmology; a potentially costly and/or inaccurate process. This calculation can, however, be bypassed completely by performing the inference in a framework in which the likelihood is never explicitly calculated, but instead fit using forward simulations of the data, which naturally include the selection. This is likelihood-free inference (LFI). Here, we use density-estimation LFI, coupled to neural-network-based data compression, to infer H_0 from mock catalogues of binary neutron star mergers, given noisy redshift, distance and peculiar velocity estimates for each object. We demonstrate that LFI yields statistically unbiased estimates of H_0 in the presence of selection effects, with precision matching that of sampling the full Bayesian hierarchical model. Marginalizing over the bias increases the H_0 uncertainty by only 6% for training sets consisting of O(10^4) populations. The resulting LFI framework is applicable to population-level inference problems with selection effects across astrophysics.
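To make the forward-simulation idea concrete, here is a minimal sketch in which plain rejection ABC with a hand-picked summary statistic stands in for the paper's density-estimation LFI and neural-network compression. The prior range, noise levels, selection cut and catalogue size are illustrative assumptions; the key point is that the selection enters only through the simulator, never through an explicit likelihood.

```python
# Minimal likelihood-free inference sketch for H0 from mock standard sirens.
# Rejection ABC with a hand-picked summary replaces the paper's
# density-estimation LFI and neural compression; all numbers are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N_EVENTS = 100             # mergers per mock catalogue
D_MAX = 120.0              # Mpc, distance-based selection cut (Malmquist-like)
SIGMA_D = 10.0             # Mpc, distance uncertainty
SIGMA_V = 300.0            # km/s, peculiar-velocity scatter

def simulate(h0):
    """Forward-simulate a detected catalogue; the selection effect is built in."""
    d_true = 200.0 * rng.random(N_EVENTS) ** (1 / 3)       # uniform in volume
    cz_obs = h0 * d_true + SIGMA_V * rng.standard_normal(N_EVENTS)
    d_obs = d_true + SIGMA_D * rng.standard_normal(N_EVENTS)
    detected = d_obs < D_MAX                               # Malmquist-like cut
    return cz_obs[detected], d_obs[detected]

def summary(cz, d):
    """Crude compression of a catalogue to a single slope-like statistic."""
    return np.sum(cz * d) / np.sum(d * d)

# "Observed" catalogue generated at a fiducial H0 of 70 km/s/Mpc.
s_obs = summary(*simulate(70.0))

# Rejection ABC: draw H0 from a flat prior, keep draws whose summary is close.
h0_prior = rng.uniform(50.0, 90.0, 50_000)
accepted = [h0 for h0 in h0_prior if abs(summary(*simulate(h0)) - s_obs) < 0.5]
print(f"H0 = {np.mean(accepted):.1f} +/- {np.std(accepted):.1f} km/s/Mpc "
      f"({len(accepted)} accepted draws)")
```

Because the same selection cut is applied when generating both the "observed" and the simulated catalogues, the accepted samples are automatically corrected for the Malmquist-like bias, which is the property the paper exploits.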
Effect of ELF e.m. fields on metalloprotein redox-active sites
The peculiar distribution and geometry of metallic ions in enzymes led us to
hypothesize that the metallic ions in the active site act like tiny antennas
able to pick up very feeble e.m. signals. The enzymatic activity of Cu2+/Zn2+
superoxide dismutase (SOD1) and Fe2+ xanthine oxidase (XO) has been studied by
following the in vitro generation and removal of free radicals. We observed
that superoxide radical generation by XO is increased by a weak field at the
Larmor frequency fL of Fe2+, while the SOD1 kinetics is appreciably reduced by
exposure to a weak field at the Larmor frequency fL of the Cu2+ ion.
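For orientation, the snippet below evaluates the classical Larmor formula f_L = qB/(4*pi*m) for the two ions mentioned above in an assumed static field of 45 microtesla (a geomagnetic-like value). Both the choice of formula and the field strength are assumptions for illustration and may differ from the exposure conditions used in the paper.

```python
# Back-of-the-envelope Larmor frequencies f_L = q*B / (4*pi*m) for Fe2+ and Cu2+.
# The 45 microtesla static field is an assumed, geomagnetic-like value.
import math

E_CHARGE = 1.602176634e-19       # elementary charge, C
ATOMIC_MASS = 1.66053906660e-27  # atomic mass unit, kg
B_STATIC = 45e-6                 # assumed static field, T

IONS = {            # ion: (charge number, atomic mass in u)
    "Fe2+": (2, 55.845),
    "Cu2+": (2, 63.546),
}

for name, (z, mass_u) in IONS.items():
    f_larmor = z * E_CHARGE * B_STATIC / (4 * math.pi * mass_u * ATOMIC_MASS)
    print(f"{name}: f_L ~ {f_larmor:.1f} Hz")   # lands in the ELF range
```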
A new dataset and empirical relationships between magnitude/intensity and epicentral distance for liquefaction in central-eastern Sicily
Strong earthquakes can trigger several phenomena inducing soil deformation, such as liquefaction, ground fracturing
and landslides, which can often cause more damage than the seismic shaking itself. A survey of numerous
historical accounts describing seismogeological effects in central-eastern Sicily allowed the authors to
update the previous liquefaction datasets. In total, 75 liquefaction-induced phenomena observed at 26 sites
and triggered by 14 earthquakes have been used to define relationships between intensity/magnitude values
and epicentral distance from the liquefied sites. The proposed upper-bound curves, at regional scale for
central-eastern Sicily, are derived from the updated liquefaction dataset and the new CPTI04 Italian
parametric earthquake catalogue. These relationships can be useful in hazard assessment to evaluate the
minimum energy of an earthquake capable of inducing liquefaction.
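As an illustration of how such a bound can be built, the sketch below fits a relation of the form M = a + b*log10(Re) to hypothetical (magnitude, farthest liquefied site) pairs and shifts the intercept so that every observation falls inside the bound. The functional form, the fitting strategy and the data points are assumptions for illustration, not the curves or catalogue values of this study.

```python
# Sketch: fit an upper-bound curve of the form M = a + b*log10(Re) to a
# liquefaction dataset. The data below are hypothetical, not the study's values.
import numpy as np

# (magnitude, farthest liquefied site from the epicentre in km) per earthquake.
events = np.array([
    (5.1, 6.0), (5.6, 14.0), (6.1, 30.0), (6.6, 55.0), (7.0, 95.0), (7.4, 160.0),
])
mags, r_max = events[:, 0], events[:, 1]

# Least-squares fit of M = a + b*log10(Re) through the farthest sites ...
b, a = np.polyfit(np.log10(r_max), mags, 1)
# ... then shift the intercept so that M_i >= a_bound + b*log10(Re_i) for all
# events, i.e. no observed liquefaction lies beyond the bound.
a_bound = a + np.min(mags - (a + b * np.log10(r_max)))
print(f"bound: M = {a_bound:.2f} + {b:.2f} * log10(Re [km])")

# Read one way: minimum magnitude expected to trigger liquefaction at Re = 20 km.
print(f"M_min at 20 km: {a_bound + b * np.log10(20.0):.1f}")
# Read the other way: maximum epicentral distance expected for an M 6.0 event.
print(f"Re_max for M 6.0: {10 ** ((6.0 - a_bound) / b):.0f} km")
```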
Can't pay or won't pay? Unemployment, negative equity, and strategic default
Prior research has found that job loss, as proxied for by regional unemployment rates, is a weak predictor of mortgage default. In contrast, using micro data from the PSID, this paper finds that job loss and adverse financial shocks are important determinants of mortgage default. Households with an unemployed head are approximately three times as likely to default as households with an employed head. Similarly, households that experience divorce, report large outstanding medical expenses, or have had any other severe income loss are much more likely to default. While household-level employment and financial shocks are important drivers of mortgage default, our analysis shows that the vast majority of financially distressed households do not default. More than 80 percent of unemployed households with less than one month of mortgage payment in savings are current on their payments. We argue that this has important implications for theoretical models of mortgage default as well as for loss mitigation policies. Finally, this paper provides some of the first direct evidence on the extent of strategic default. Wealth data suggest a limited scope for strategic default, with only one-third of underwater defaulters having enough liquid assets to cover one month's mortgage payment.
Evaluation of the efficacy of animal-assisted therapy based on the reality orientation therapy protocol in Alzheimer's disease patients: a pilot study.
BACKGROUND: The aim of this study was to evaluate the efficacy of animal-assisted therapy (AAT) in elderly patients affected by Alzheimer's disease based on the formal reality orientation therapy (ROT) protocol.
METHODS: Our study was carried out at an Alzheimer's centre over 6 months. A homogeneous sample (by age, Mini-Mental State Examination (MMSE) and 15-item Geriatric Depression Scale (GDS) scores) of 50 patients was selected randomly and consecutively. Patients were divided into three groups: (i) 20 patients received a course of AAT based on the ROT protocol (AAT group); (ii) 20 patients engaged exclusively in ROT activities (ROT group); and (iii) 10 patients received no stimulation (control group). The MMSE and GDS were administered at time 0 (T0) and time 1 (T1) to all three groups. Differences within groups between T0 and T1 for GDS and MMSE scores were analyzed by Student's t-test. Differences between group means were analyzed using ANOVA with the Bonferroni-Dunn test for post-hoc comparisons.
RESULTS: Both the AAT group and ROT group had improved GDS scores and showed a slight improvement in terms of mood. On the GDS, the AAT group improved from 11.5 (T0) to 9.5 (T1), and the ROT group improved from 11.6 (T0) to 10.5 (T1). At the same time, a slight improvement in cognitive function, as measured by the MMSE, was observed. In the AAT group, mean MMSE was 20.2 at T0 and 21.5 at T1, and in the ROT group, it was 19.9 at T0 and 20.0 at T1. In the control group, the average values of both the GDS and MMSE remained unchanged. The Bonferroni-Dunn results showed statistically significant differences between groups, particularly between the AAT group and the other two (P < 0.001).
CONCLUSIONS: Pet therapy interventions based on the formal ROT protocol were effective and, compared with ROT alone, provided encouraging and statistically significant results.
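For readers who want to reproduce this kind of analysis, the sketch below mirrors the described workflow on synthetic GDS scores: paired t-tests within each group, a one-way ANOVA on the change scores between groups, and Bonferroni-corrected pairwise t-tests standing in for the Bonferroni-Dunn post-hoc. All numbers are simulated for illustration and are not the study data.

```python
# Sketch of the described analysis on synthetic GDS scores: paired t-tests
# within groups, one-way ANOVA between groups, and Bonferroni-corrected
# post-hoc comparisons standing in for Bonferroni-Dunn. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def mock_group(n, mean_t0, mean_t1, sd=2.0):
    """Correlated T0/T1 scores for one group (illustrative assumption)."""
    t0 = rng.normal(mean_t0, sd, n)
    t1 = t0 + rng.normal(mean_t1 - mean_t0, 1.0, n)
    return t0, t1

groups = {
    "AAT":     mock_group(20, 11.5, 9.5),
    "ROT":     mock_group(20, 11.6, 10.5),
    "control": mock_group(10, 11.5, 11.5),
}

# Within-group change, T0 vs T1.
for name, (t0, t1) in groups.items():
    p = stats.ttest_rel(t0, t1).pvalue
    print(f"{name}: GDS {t0.mean():.1f} -> {t1.mean():.1f}, paired t-test p = {p:.3g}")

# Between-group comparison of the change scores (T1 - T0).
changes = {name: t1 - t0 for name, (t0, t1) in groups.items()}
print(f"ANOVA p = {stats.f_oneway(*changes.values()).pvalue:.3g}")

# Pairwise comparisons with a Bonferroni correction for 3 comparisons.
names = list(changes)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        p = stats.ttest_ind(changes[names[i]], changes[names[j]]).pvalue
        print(f"{names[i]} vs {names[j]}: corrected p = {min(1.0, 3 * p):.3g}")
```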
Unemployment, negative equity, and strategic default
Using new household-level data, we quantitatively assess the roles that job loss, negative equity, and wealth (including unsecured debt, liquid assets, and illiquid assets) play in default decisions. In sharp contrast to prior studies that proxy for individual unemployment status using regional unemployment rates, we find that individual unemployment is the strongest predictor of default. We find that individual unemployment increases the probability of default by 5-13 percentage points, ceteris paribus, compared with the sample average default rate of 3.9 percent. We also find that only 13.9 percent of defaulters have both negative equity and enough liquid or illiquid assets to make one month's mortgage payment. This finding suggests that "ruthless" or "strategic" default during the 2007-09 recession was relatively rare and that policies designed to promote employment, such as payroll tax cuts, are more likely to stem defaults in the long run than policies that temporarily modify mortgages.
Discrimination of tsunami sources (Earthquake vs. Landslide) on the basis of historical data in Eastern Sicily and southern Calabria
The source mechanisms responsible for large historical tsunamis that
have struck eastern Sicily and southern Calabria are a topic of robust debate. We have
compiled a database of historical coeval descriptions of three large tsunamis: 11 January
1693, 6 February 1783, and 28 December 1908. By using accounts of run-up and
inundation and employing an approach proposed by Okal and Synolakis in 2004, we
can provide discriminants to define the nature of the near-field tsunami sources (fault
dislocation or landslide).
Historical reports for the 1908 event describe affected localities, maximum run-ups,
and inundation areas. However, for the 1693 and 1783 tsunamis, reports are
limited to inundation and occasional run-up estimates. We calculated run-up values
for these events using available relations between inundation and run-up and
applied the Okal and Synolakis model to the resulting profiles of tsunami
run-up along the inundated shorelines. The 1908 run-up data distribution confirms
that the tsunami is compatible with a seismic dislocation source, whereas the
1783 data support contemporary observations and recent offshore investigations suggesting
that the tsunami was produced by an earthquake-triggered submarine landslide.
Analysis of the 1693 event data suggests that the tsunami was generated by a
tectonic event and thus that a seismogenic source should be sought offshore.
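To indicate how such a discriminant can be computed in practice, the sketch below fits a bell-shaped run-up profile runup(y) = b / (((y - c)/a)^2 + 1) to synthetic along-shore observations and reports the aspect ratio I2 = b/a. This parametrization is our reading of the Okal and Synolakis (2004) approach; the exact form, any threshold values and the data points here are assumptions to be checked against the original paper.

```python
# Sketch: fit a bell-shaped run-up profile along the shoreline and compute the
# aspect ratio I2 = b/a as a source discriminant. The parametrization and the
# synthetic observations are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def runup_profile(y, a, b, c):
    """Run-up (m) versus along-shore position y (km); a, c in km, b in m."""
    return b / (((y - c) / a) ** 2 + 1.0)

# Hypothetical digitized observations: along-shore position (km), run-up (m).
y_obs = np.array([0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
r_obs = np.array([0.3, 0.6, 1.4, 3.1, 5.8, 6.4, 5.2, 2.9, 1.2, 0.5, 0.3])

(a_fit, b_fit, c_fit), _ = curve_fit(runup_profile, y_obs, r_obs, p0=(20.0, 5.0, 50.0))

# Dimensionless aspect ratio (convert a from km to m so both lengths share units).
i2 = b_fit / (a_fit * 1_000.0)
print(f"a = {a_fit:.1f} km, b = {b_fit:.1f} m, c = {c_fit:.1f} km, I2 = {i2:.2e}")
# Narrow, high profiles (large I2) point to a localized source such as a landslide;
# broad, low profiles are more consistent with a fault dislocation.
```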