The orbit of the close spectroscopic binary epsilon Lupi and the intrinsic variability of its early B-type components
We subjected 106 new high-resolution spectra of the double-lined
spectroscopic close binary epsilon Lupi, obtained in a time-span of 17 days
from two different observatories, to a detailed study of orbital and intrinsic
variations. We derived accurate values of the orbital parameters. We refined
the sidereal orbital period to 4.55970 days and the eccentricity to e=0.277. By
adding old radial velocities, we discovered the presence of apsidal motion with
a period of the rotation of apses of about 430 years. Such a value agrees with
theoretical expectations. Additional data is needed to confirm and refine this
value. Our dataset did not allow us to derive the orbit of the third body,
which is known to orbit the close system in approximately 64 years. We present
the secondary of epsilon Lupi as a new beta Cephei variable, while the primary
is a beta Cephei suspect. A first detailed analysis of line-profile variations
of both primary and secondary led to detection of one pulsation frequency near
10.36 c/d in the variability of the secondary, while no clear periodicity was
found in the primary, although low-amplitude periodicities are still suspected.
The limited accuracy and extent of our dataset did not allow any further
analysis, such as mode identification.
Comment: 13+3 pages, 20 figures. Astronomy and Astrophysics, accepted.
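The frequency detection described above can be sketched with a simple least-squares periodogram. The data below are synthetic (the sampling, amplitude, and noise level are invented for illustration), not the actual epsilon Lupi spectra:

```python
import numpy as np

def ls_power(t, y, freqs):
    """Power of the best-fitting sinusoid at each trial frequency
    (a simple stand-in for a Lomb-Scargle periodogram)."""
    y = y - y.mean()
    power = np.empty(freqs.size)
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f * t
        power[i] = (y @ np.cos(w)) ** 2 + (y @ np.sin(w)) ** 2
    return power

# Synthetic radial velocities: one mode at 10.36 c/d sampled at 106
# irregular epochs over 17 days (all numbers invented for illustration).
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 17.0, 106))                 # days
y = 5.0 * np.sin(2.0 * np.pi * 10.36 * t) + rng.normal(0.0, 1.0, t.size)

freqs = np.linspace(8.0, 12.0, 4000)                     # trial grid, c/d
f_best = freqs[np.argmax(ls_power(t, y, freqs))]         # peaks near 10.36
```

With irregular time sampling, the least-squares peak is far less affected by daily aliasing than a plain FFT, which is why variants of this approach are standard in asteroseismic frequency analysis.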
Estimating stellar oscillation-related parameters and their uncertainties with the moment method
The moment method is a well known mode identification technique in
asteroseismology (where `mode' is to be understood in an astronomical rather
than in a statistical sense), which uses a time series of the first 3 moments
of a spectral line to estimate the discrete oscillation mode parameters l and
m. The method, contrary to many other mode identification techniques, also
provides estimates of other important continuous parameters such as the
inclination angle alpha, and the rotational velocity v_e. We developed a
statistical formalism for the moment method based on so-called generalized
estimating equations (GEE). This formalism allows the estimation of the
uncertainty of the continuous parameters taking into account that the different
moments of a line profile are correlated and that the uncertainty of the
observed moments also depends on the model parameters. Furthermore, we set up a
procedure to take into account the mode uncertainty, i.e., the fact that often
several modes (l,m) can adequately describe the data. We also introduce a new
lack of fit function which works at least as well as a previous discriminant
function, and which in addition allows us to identify the sign of the azimuthal
order m. We applied our method to the star HD181558, using several numerical
methods, from which we learned that numerically solving the estimating
equations is an intensive task. We report on the numerical results, from which
we gain insight in the statistical uncertainties of the physical parameters
involved in the moment method.
Comment: The electronic online version from the publisher can be found at
http://www.blackwell-synergy.com/doi/abs/10.1111/j.1467-9876.2005.00487.
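The observable quantities the moment method starts from can be illustrated with a minimal sketch computing the first three moments of a toy Gaussian line profile. The line parameters are invented; a real application fits the time series of these moments for the discrete mode parameters (l, m) and the continuous parameters:

```python
import numpy as np

def line_moments(v, flux):
    """First three moments of a spectral line profile on a uniform
    velocity grid v (km/s), normalised by the equivalent width."""
    depth = 1.0 - flux               # absorption depth acts as the weight
    dv = v[1] - v[0]
    m0 = depth.sum() * dv            # equivalent width
    m1 = (v * depth).sum() * dv / m0                # centroid (radial velocity)
    m2 = ((v - m1) ** 2 * depth).sum() * dv / m0    # width (variance)
    m3 = ((v - m1) ** 3 * depth).sum() * dv / m0    # skewness
    return m1, m2, m3

# Toy Gaussian absorption line: centre +12 km/s, sigma = 20 km/s.
v = np.linspace(-150.0, 150.0, 3001)
flux = 1.0 - 0.3 * np.exp(-0.5 * ((v - 12.0) / 20.0) ** 2)
m1, m2, m3 = line_moments(v, flux)
```

Because m1, m2, and m3 are integrals of the same profile, they are mutually correlated, which is precisely the complication the GEE formalism above is designed to handle when propagating uncertainties.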
Interpretation of the variability of the β Cephei star λ Scorpii. I. The multiple character
We derive accurate values of the orbital parameters of the close binary β Cephei star λ Scorpii. Moreover, we present the first determination of the properties of the triple system to which λ Scorpii belongs. Our analysis is based on a time series of 815 high-resolution spectra, covering a timespan of 14 years. We find a close orbit of 5.9525 days (e=0.26) and a wide orbit of approximately 1082 days (e=0.23). The orbital parameters of the triple star and a spectrum synthesis lead us to conclude that the system is composed of two early-type B stars and a low-mass pre-main-sequence star, rather than containing an ultra-massive white dwarf as claimed before. Our proposed configuration is compatible with population synthesis. The radial velocity variations of the primary allow us to confirm the presence of at least one pulsation mode with frequency 4.679410 c/d, which is subject to the light-time effect in the triple system. A detailed analysis of the complex line-profile variations is described in a subsequent paper.
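The light-time effect on the pulsation can be sketched numerically. The sketch below assumes a circular wide orbit and an invented projected semi-major axis of 2 AU, so the numbers are illustrative only (the real wide orbit is eccentric, e ≈ 0.23):

```python
import math

C_AU_PER_DAY = 173.144632674   # speed of light in astronomical units per day

def light_time_delay(t, a_sini_au, p_orb_days, t0=0.0):
    """Light-travel-time delay (days) of a pulsator on a *circular*
    wide orbit with projected semi-major axis a*sin(i). A toy model:
    the actual orbit would need the eccentric anomaly as well."""
    phase = 2.0 * math.pi * (t - t0) / p_orb_days
    return (a_sini_au / C_AU_PER_DAY) * math.sin(phase)

# Illustrative numbers: a 2 AU projected orbit with a ~1082 d period
# shifts pulse arrival times by up to a*sin(i)/c (of order 1000 s),
# enough to modulate the apparent phase of a 4.68 c/d pulsation.
tau_max_days = 2.0 / C_AU_PER_DAY
print(tau_max_days * 86400.0, "seconds peak-to-centre delay")
```

Detecting this periodic phase shift in the pulsation is what ties the oscillation to the orbit and independently confirms the triple-system geometry.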
The PLATO End-to-End CCD Simulator -- Modelling space-based ultra-high precision CCD photometry for the assessment study of the PLATO Mission
The PLATO satellite mission project is a next generation ESA Cosmic Vision
satellite project dedicated to the detection of exo-planets and to
asteroseismology of their host-stars using ultra-high precision photometry. The
main goal of the PLATO mission is to provide a full statistical analysis of
exo-planetary systems around stars that are bright and close enough for
detailed follow-up studies. Many aspects concerning the design trade-off of a
space-based instrument and its performance can best be tackled through
realistic simulations of the expected observations. The complex interplay of
various noise sources in the course of the observations made such simulations
an indispensable part of the assessment study of the PLATO Payload Consortium.
We created an end-to-end CCD simulation software-tool, dubbed PLATOSim, which
simulates photometric time-series of CCD images by including realistic models
of the CCD and its electronics, the telescope optics, the stellar field, the
pointing uncertainty of the satellite (or Attitude Control System [ACS]
jitter), and all important natural noise sources. The main questions that were
addressed with this simulator were the noise properties of different
photometric algorithms, the selection of the optical design, the allowable
jitter amplitude, and the expected noise budget of light-curves as a function
of the stellar magnitude for different parameter conditions. The results of our
simulations showed that the proposed multi-telescope concept of PLATO can
fulfil the defined scientific goal of measuring more than 20000 cool dwarfs
brighter than mV = 11 with a precision better than 27 ppm/h, which is essential
for the study of Earth-like exo-planetary systems using the transit method.
Comment: 5 pages, submitted for the Proceedings of the 4th HELAS International
Conference: Seismological Challenges for Stellar Structure.
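The quoted precision target is, at its core, a photon-noise scaling with stellar magnitude and telescope count. A minimal sketch, with an assumed (purely illustrative) detected photon rate for an mV = 0 star and a hypothetical count of 28 co-pointed telescopes:

```python
import math

def photon_noise_ppm_per_hour(m_v, n_tel, rate_m0=1.2e12):
    """Photon-noise floor (ppm over 1 h) for n_tel identical co-pointed
    telescopes. rate_m0 is an *assumed* detected photon rate (photons
    per hour, per telescope) for an m_V = 0 star; the value here is
    illustrative, not a PLATO design number."""
    n_phot = rate_m0 * 10.0 ** (-0.4 * m_v) * n_tel
    return 1.0e6 / math.sqrt(n_phot)

# With these invented numbers, an m_V = 11 dwarf observed by a
# hypothetical 28-telescope configuration lands near the ~27 ppm/h goal.
print(photon_noise_ppm_per_hour(11.0, 28))
```

The sketch also makes the design trade-off explicit: adding telescopes buys precision only as the square root of their number, while jitter and electronic noise (the terms PLATOSim models in detail) add on top of this floor.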
Quantum Particles as Conceptual Entities: A Possible Explanatory Framework for Quantum Theory
We put forward a possible new interpretation and explanatory framework for
quantum theory. The basic hypothesis underlying this new framework is that
quantum particles are conceptual entities. More concretely, we propose that
quantum particles interact with ordinary matter, nuclei, atoms, molecules,
macroscopic material entities, measuring apparatuses, ..., in a similar way to
how human concepts interact with memory structures, human minds or artificial
memories. We analyze the most characteristic aspects of quantum theory, i.e.
entanglement and non-locality, interference and superposition, identity and
individuality in the light of this new interpretation, and we put forward a
specific explanation and understanding of these aspects. The basic hypothesis
of our framework gives rise in a natural way to a Heisenberg uncertainty
principle which introduces an understanding of the general situation of 'the
one and the many' in quantum physics. A specific view on macro and micro
different from the common one follows from the basic hypothesis and leads to an
analysis of Schrodinger's Cat paradox and the measurement problem different
from the existing ones. We reflect about the influence of this new quantum
interpretation and explanatory framework on the global nature and evolutionary
aspects of the world and human worldviews, and point out potential explanations
for specific situations, such as the generation problem in particle physics,
the confinement of quarks and the existence of dark matter.
Comment: 45 pages, 10 figures.
Strong increases in flood frequency and discharge of the River Meuse over the late Holocene: impacts of long-term anthropogenic land use change and climate variability
In recent years the frequency of high-flow events on the Meuse (northwest Europe) has been relatively great, and flooding has become a major research theme. To date, research has focused on observed discharge records of the last century and simulations of the coming century. However, it is difficult to delineate changes caused by human activities (land use change and greenhouse gas emissions) and natural fluctuations on these timescales. To address this problem we coupled a climate model (ECBilt-CLIO-VECODE) and a hydrological model (STREAM) to simulate daily Meuse discharge in two time-slices: 4000–3000 BP (natural situation), and 1000–2000 AD (includes anthropogenic influence). For 4000–3000 BP the basin is assumed to be almost fully forested; for 1000–2000 AD we reconstructed land use based on historical sources. For 1000–2000 AD the simulated mean annual discharge (260.9 m³ s⁻¹) is significantly higher than for 4000–3000 BP (244.8 m³ s⁻¹), and the frequency of large high-flow events (discharge >3000 m³ s⁻¹) is higher (recurrence time decreases from 77 to 65 years). On a millennial timescale almost all of this increase can be ascribed to land use changes (especially deforestation); the effects of climatic change are insignificant. For the 20th century, the simulated mean discharge (270.0 m³ s⁻¹) is higher than in any other century studied, and is ca. 2.5% higher than in the 19th century (despite an increase in evapotranspiration). Furthermore, the recurrence time of large high-flow events is almost twice as short as under natural conditions (decreasing from 77 to 40 years). On this timescale climate change (a strong increase in annual and winter precipitation) overwhelmed land use change as the dominant forcing mechanism.
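Recurrence times like those quoted above are straightforward to estimate empirically from a simulated discharge series: the record length divided by the number of threshold exceedances. A minimal sketch with invented annual-maximum data:

```python
import numpy as np

def recurrence_time_years(annual_max_q, threshold):
    """Empirical recurrence time (years) of discharges above a
    threshold, from a series of annual maximum discharges (m^3/s)."""
    exceedances = np.count_nonzero(annual_max_q > threshold)
    if exceedances == 0:
        return float("inf")
    return annual_max_q.size / exceedances

# Invented 1000-year series of annual maxima drawn from a Gumbel law
# (a common model for annual flood peaks); the parameters are
# illustrative and not fitted to the Meuse.
rng = np.random.default_rng(0)
annual_max_q = rng.gumbel(loc=1265.0, scale=400.0, size=1000)
t_recur = recurrence_time_years(annual_max_q, 3000.0)
```

With a millennium-long simulated record, even rare events above 3000 m³ s⁻¹ occur often enough for this simple counting estimator to be usable, which is exactly the advantage of the long coupled-model time-slices over the observed century of data.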
Risk allocation in a public-private catastrophe insurance system: an actuarial analysis of deductibles, stop-loss, and premiums
A public-private (PP) partnership could be a viable arrangement for providing insurance coverage for catastrophe events, such as floods and earthquakes. The objective of this paper is to obtain insights into efficient and practical allocations of risk in a PP insurance system. In particular, this study examines how the deductible and stop-loss levels (retentions) for, respectively, the insured and the insurer, relate to the corresponding maximum required coverage and premium amounts under the 99.9% tail value at risk (TVaR) damage constraint. A practical example of flood insurance in the Netherlands is studied in which the (re)insurance could be provided either by a risk-averse (private) or a risk-neutral (public) agency, which could result in large differences in premiums.
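The deductible/stop-loss layering and the tail value at risk can be sketched as follows; the loss distribution and the layer levels are invented for illustration, not taken from the Dutch case study:

```python
import numpy as np

def tvar(losses, level=0.999):
    """Tail value at risk: the mean loss in the worst (1 - level) tail."""
    q = np.quantile(losses, level)
    return losses[losses >= q].mean()

def split_layers(loss, deductible, stop_loss):
    """Split a single loss among the insured (up to the deductible),
    the private insurer (the layer from deductible to stop-loss), and
    the public reinsurer (everything above the stop-loss)."""
    insured = min(loss, deductible)
    insurer = min(max(loss - deductible, 0.0), stop_loss - deductible)
    public = max(loss - stop_loss, 0.0)
    return insured, insurer, public

# Invented annual flood losses (in millions): a 2% chance per year of
# an event with a heavy-tailed (Pareto) severity.
rng = np.random.default_rng(7)
event = rng.random(50_000) < 0.02
losses = np.where(event, (rng.pareto(2.0, 50_000) + 1.0) * 20.0, 0.0)

# Actuarially fair premium for the insurer's layer (deductible 5,
# stop-loss 100): the expected annual loss falling in that layer.
layer = np.array([split_layers(x, 5.0, 100.0)[1] for x in losses])
fair_premium = layer.mean()
```

A risk-averse private insurer would load this fair premium with a margin that grows with the layer's TVaR, whereas a risk-neutral public agency need not, which is the mechanism behind the premium differences the paper analyses.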
Preparing the COROT space mission: new variable stars in the galactic Anticenter direction
The activities related to the preparation of the asteroseismic, photometric
space mission COROT are described. Photoelectric observations, wide-field CCD
photometry, uvbyβ calibrations and further time-series have been obtained at
different observatories and telescopes. They have been planned to complete the
COROT programme in the direction of the galactic Anticenter. In addition to
suitable asteroseismic targets covering the different evolutionary stages
between ZAMS and TAMS, we discovered several other variable stars, both
pulsating and geometrical. We compared results on the incidence of variability
in the galactic Center and Anticenter directions. Physical parameters have been
obtained and evolutionary tracks fitting them have been calculated. The
peculiarities of some individual stars are pointed out. Paper based on
observations collected at the San Pedro Martir, Sierra Nevada, Teide, La Silla,
Haute-Provence and Roque de Los Muchachos (Telescopio Nazionale Galileo and
Mercator telescopes) observatories.
Comment: 13 pages, 9 figures. Accepted for The Astronomical Journal (2005 May
volume).