Accurate masses and radii of normal stars: modern results and applications
This paper presents and discusses a critical compilation of accurate,
fundamental determinations of stellar masses and radii. We have identified 95
detached binary systems containing 190 stars (94 eclipsing systems, and alpha
Centauri) that satisfy our criterion that the mass and radius of both stars be
known to 3% or better. To these we add interstellar reddening, effective
temperature, metal abundance, rotational velocity and apsidal motion
determinations when available, and we compute a number of other physical
parameters, notably luminosity and distance. We discuss the use of this
information for testing models of stellar evolution. The amount and quality of
the data also allow us to analyse the tidal evolution of the systems in
considerable depth, testing prescriptions of rotational synchronisation and
orbital circularisation in greater detail than possible before. The new data
also enable us to derive empirical calibrations of M and R for single (post-)
main-sequence stars above 0.6 M(Sun). Simple, polynomial functions of T(eff),
log g and [Fe/H] yield M and R with errors of 6% and 3%, respectively.
Excellent agreement is found with independent determinations for host stars of
transiting extrasolar planets, and good agreement with determinations of M and
R from stellar models as constrained by trigonometric parallaxes and
spectroscopic values of T(eff) and [Fe/H]. Finally, we list a set of 23
interferometric binaries with masses known to better than 3%, but without
fundamental radius determinations (except alpha Aur). We discuss the prospects
for improving these and other stellar parameters in the near future.
Comment: 56 pages including figures and tables. To appear in The Astronomy and Astrophysics Review. Ascii versions of the tables will appear in the online version of the article.
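As a concrete illustration of how a polynomial calibration of this kind can be applied, the sketch below evaluates a generic fit of M and R as functions of T(eff), log g and [Fe/H]. The functional form and the coefficient arrays are placeholders for whatever the published tables provide, not the paper's fitted values.

```python
import numpy as np

def calibrated_mass_radius(teff, logg, feh, coeff_m, coeff_r):
    """Evaluate a generic polynomial calibration of stellar mass and radius.

    The cubic in X = log10(Teff) - 4.1 plus low-order terms in log g and
    [Fe/H] is an illustrative functional form only; coeff_m and coeff_r
    must be taken from the published calibration tables.
    """
    x = np.log10(teff) - 4.1                      # assumed pivot temperature
    terms = np.array([1.0, x, x**2, x**3, logg**2, logg**3, feh])
    log_m = np.dot(coeff_m, terms)                # log10(M / Msun)
    log_r = np.dot(coeff_r, terms)                # log10(R / Rsun)
    return 10**log_m, 10**log_r

# Usage with solar-like input, once real coefficients are supplied:
# M, R = calibrated_mass_radius(5777.0, 4.44, 0.0, coeff_m, coeff_r)
```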
Productive restructuring and the reallocation of work and employment: a survey of the “new” forms of social inequality
The scope of this paper is to question the inevitability of the processes of segmentation and increased precariousness of the relations of labor and employment, which are responsible for the introduction of “new” forms of social inequality that underpin the current model of development of economies and societies. It seeks to criticize the limits of global financial and economic logic, which constitute a “new spirit of capitalism,” namely a kind of reverence for the natural order of things. It is therefore necessary to conduct an analytical survey of the ongoing changes in the labor market, accompanied by the epistemological vigilance that makes it possible to see neoliberal (di)visions and dominant technodeterministic theses in context. The enunciation of scenarios on the future of work will conclude this survey and will make it possible to draw attention to the historical and temporal constraints, to the urgent need to unveil what is ideological and political in the prevailing logic of rationalization, and to the processes that reinstate work and employment as a “central social experience” in contemporary times.
Noiseless Linear Amplification and Distillation of Entanglement
The idea of signal amplification is ubiquitous in the control of physical
systems, and the ultimate performance limit of amplifiers is set by quantum
physics. Increasing the amplitude of an unknown quantum optical field, or more
generally any harmonic oscillator state, must introduce noise. This linear
amplification noise prevents the perfect copying of the quantum state, enforces
quantum limits on communications and metrology, and is the physical mechanism
that prevents the increase of entanglement via local operations. It is known
that non-deterministic versions of ideal cloning and local entanglement
increase (distillation) are allowed, suggesting the possibility of
non-deterministic noiseless linear amplification. Here we introduce, and
experimentally demonstrate, such a noiseless linear amplifier for
continuous-variable states of the optical field, and use it to demonstrate
entanglement distillation of field-mode entanglement. This simple but powerful
circuit can form the basis of practical devices for enhancing quantum
technologies. The idea of noiseless amplification unifies approaches to cloning
and distillation, and will find applications in quantum metrology and
communications.
Comment: Submitted 10 June 200
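A minimal numerical sketch of the underlying idea (not the experimental scheme of the paper): an ideal noiseless linear amplifier of gain g multiplies the n-photon amplitude by g^n, so applying that operation to a truncated coherent state and renormalising should reproduce a coherent state of amplitude g*alpha. The parameter values below are arbitrary.

```python
import numpy as np
from math import factorial

def coherent_amplitudes(alpha, nmax):
    """Fock-basis amplitudes of a coherent state |alpha>, truncated at nmax photons."""
    n = np.arange(nmax + 1)
    fact = np.array([float(factorial(k)) for k in n])
    return np.exp(-abs(alpha)**2 / 2) * alpha**n / np.sqrt(fact)

alpha, g, nmax = 0.5, 2.0, 40                 # small alpha keeps the truncation harmless
psi_in = coherent_amplitudes(alpha, nmax)

# Ideal noiseless linear amplification: weight the n-photon amplitude by g**n,
# then renormalise (the physical operation is non-deterministic, hence the lost norm).
psi_out = g ** np.arange(nmax + 1) * psi_in
psi_out /= np.linalg.norm(psi_out)

target = coherent_amplitudes(g * alpha, nmax)
target /= np.linalg.norm(target)
print("fidelity with |g*alpha>:", abs(np.vdot(target, psi_out))**2)   # close to 1
```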
Trend following, risk parity and momentum in commodity futures
We show that combining momentum and trend following strategies for individual commodity futures can lead to portfolios offering attractive risk-adjusted returns that are superior to simple momentum strategies; when we expose these returns to a wide array of sources of systematic risk, we find that robust alpha survives. Experimenting with risk parity portfolio weightings has limited impact on our results, though it is particularly beneficial to long–short strategies; the marginal impact of applying trend following methods far outweighs momentum and risk parity adjustments in terms of risk-adjusted returns and limiting downside risk. Overall this leads to an attractive strategy for investing in commodity futures and emphasises the importance of trend following as an investment strategy in the commodity futures context.
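A schematic of the kind of signal construction described above, in Python with pandas; the column layout, window lengths and inverse-volatility sizing are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np
import pandas as pd

def momentum_trend_positions(prices: pd.DataFrame,
                             mom_window: int = 252,
                             trend_window: int = 200,
                             vol_window: int = 60) -> pd.DataFrame:
    """Toy long/short positions for a panel of daily commodity futures prices.

    prices : DataFrame with one column per contract (hypothetical input).
    Momentum signal : sign of the trailing mom_window-day return.
    Trend filter    : hold only when price sits on the same side of its
                      trend_window-day moving average as the momentum signal.
    Sizing          : naive risk parity via inverse rolling volatility.
    """
    momentum = np.sign(prices.pct_change(mom_window))
    trend = np.sign(prices - prices.rolling(trend_window).mean())
    signal = momentum.where(momentum == trend, 0.0)      # trade only when signals agree

    vol = prices.pct_change().rolling(vol_window).std()
    inv_vol = 1.0 / vol
    weights = inv_vol.div(inv_vol.sum(axis=1), axis=0)   # inverse-volatility weights
    return (signal * weights).shift(1)                   # lag to avoid look-ahead bias
```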
Can We Really Prevent Suicide?
Every year, suicide is among the top 20 leading causes of death globally for all ages. Unfortunately, suicide is difficult to prevent, in large part because the prevalence of risk factors is high among the general population. In this review, clinical and psychological risk factors are examined and methods for suicide prevention are discussed. Prevention strategies found to be effective in suicide prevention include means restriction, responsible media coverage, and general public education, as well as identification methods such as screening, gatekeeper training, and primary care physician education. Although treatment to prevent suicide is difficult, follow-up that includes pharmacotherapy, psychotherapy, or both may be useful. However, prevention methods cannot be restricted to the individual. Community, social, and policy interventions will also be essential.
Impact Factor: outdated artefact or stepping-stone to journal certification?
A review of Garfield's journal impact factor and its specific implementation
as the Thomson Reuters Impact Factor reveals several weaknesses in this
commonly-used indicator of journal standing. Key limitations include the
mismatch between citing and cited documents, the deceptive display of three
decimals that belies the real precision, and the absence of confidence
intervals. These are minor issues that are easily amended and should be
corrected, but more substantive improvements are needed. There are indications
that the scientific community seeks and needs better certification of journal
procedures to improve the quality of published science. Comprehensive
certification of editorial and review procedures could help ensure adequate
procedures to detect duplicate and fraudulent submissions.
Comment: 25 pages, 12 figures, 6 tables
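To make the precision point concrete, the sketch below computes a two-year impact factor as the mean of per-item citation counts and attaches a bootstrap 95% confidence interval, the kind of uncertainty estimate the review finds missing. The citation counts are purely illustrative.

```python
import numpy as np

def impact_factor_with_ci(citations, n_boot=10_000, seed=0):
    """Two-year impact factor (mean citations per citable item) with a
    bootstrap 95% confidence interval.

    citations: year-Y citation counts to each citable item published in
    years Y-1 and Y-2. The real Thomson Reuters numerator also counts
    citations to non-citable front matter, one of the mismatches noted above.
    """
    citations = np.asarray(citations, dtype=float)
    rng = np.random.default_rng(seed)
    point = citations.mean()
    boot = rng.choice(citations, size=(n_boot, citations.size), replace=True).mean(axis=1)
    lo, hi = np.percentile(boot, [2.5, 97.5])
    return point, (lo, hi)

# Skewed counts of the sort typical for a journal's citation distribution:
counts = [0] * 40 + [1] * 25 + [2] * 15 + [3] * 8 + [5] * 6 + [12] * 4 + [40] * 2
print(impact_factor_with_ci(counts))   # point estimate 2.37, with a wide interval
```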
The stellar and sub-stellar IMF of simple and composite populations
The current knowledge on the stellar IMF is documented. It appears to become
top-heavy when the star-formation rate density surpasses about 0.1Msun/(yr
pc^3) on a pc scale and it may become increasingly bottom-heavy with increasing
metallicity and in increasingly massive early-type galaxies. It declines quite
steeply below about 0.07Msun with brown dwarfs (BDs) and very low mass stars
having their own IMF. The most massive star of mass mmax formed in an embedded
cluster with stellar mass Mecl correlates strongly with Mecl, this being a result of
gravitation-driven but resource-limited growth and fragmentation-induced
starvation. There is no convincing evidence whatsoever that massive stars do
form in isolation. Various methods of discretising a stellar population are
introduced: optimal sampling leads to a mass distribution that perfectly
represents the exact form of the desired IMF and the mmax-to-Mecl relation,
while random sampling results in statistical variations of the shape of the
IMF. The observed mmax-to-Mecl correlation and the small spread of IMF
power-law indices together suggest that optimally sampling the IMF may be the
more realistic description of star formation than random sampling from a
universal IMF with a constant upper mass limit. Composite populations on galaxy
scales, which are formed from many pc scale star formation events, need to be
described by the integrated galactic IMF. This IGIMF varies systematically from
top-light to top-heavy, depending on galaxy type and star formation rate,
with dramatic implications for theories of galaxy formation and evolution.
Comment: 167 pages, 37 figures, 3 tables, published in Stellar Systems and Galactic Structure, Vol.5, Springer. This revised version is consistent with the published version and includes additional references and minor additions to the text as well as a recomputed Table 1. ISBN 978-90-481-8817-
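As an illustration of the "random sampling" alternative discussed above, the sketch below draws stellar masses from a single power-law IMF by inverse-transform sampling; the Salpeter-like slope and the mass limits are generic choices, and the deterministic optimal-sampling scheme is not reproduced here.

```python
import numpy as np

def sample_powerlaw_imf(n_stars, alpha=2.35, m_min=0.08, m_max=150.0, seed=None):
    """Randomly draw stellar masses (Msun) from dN/dm ~ m**(-alpha) between
    m_min and m_max by inverting the cumulative distribution function."""
    rng = np.random.default_rng(seed)
    u = rng.random(n_stars)
    k = 1.0 - alpha                                   # exponent of the integrated IMF
    return (m_min**k + u * (m_max**k - m_min**k)) ** (1.0 / k)

masses = sample_powerlaw_imf(10_000, seed=42)
print("most massive star drawn:", masses.max())       # scatters from draw to draw
print("total stellar mass:     ", masses.sum())       # compare with the mmax-Mecl relation
```

Random draws like this scatter the mass of the most massive star at fixed total cluster mass, which is exactly the variation that the optimal-sampling description removes.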
Converting simulated total dry matter to fresh marketable yield for field vegetables at a range of nitrogen supply levels
Simultaneous analysis of economic and environmental performance of horticultural crop production requires qualified assumptions on the effect of management options, and particularly of nitrogen (N) fertilisation, on the net returns of the farm. Dynamic soil-plant-environment simulation models for agro-ecosystems are frequently applied to predict crop yield, generally as dry matter per area, and the environmental impact of production. Economic analysis requires conversion of yields to fresh marketable weight, which is not easy to calculate for vegetables, since different species have different properties and special market requirements. Furthermore, the marketable part of many vegetables is dependent on N availability during growth, which may lead to complete crop failure under sub-optimal N supply in tightly calculated N fertiliser regimes or low-input systems. In this paper we present two methods for converting simulated total dry matter to marketable fresh matter yield for various vegetables and European growth conditions, taking into consideration the effect of N supply: (i) a regression-based function for vegetables sold as bulk or bunching ware and (ii) a population approach for piecewise sold row crops. For both methods, to be used in the context of a dynamic simulation model, parameter values were compiled from a literature survey. Implemented in such a model, both algorithms were tested against experimental field data, yielding an Index of Agreement of 0.80 for the regression strategy and 0.90 for the population strategy. Furthermore, the population strategy was capable of reflecting rather well the effect of crop spacing on yield and the effect of N supply on product grading.
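A schematic of the first (regression-based) conversion route, with entirely hypothetical parameter values and response shape, since the fitted coefficients are not given in the abstract:

```python
def fresh_marketable_yield(total_dm, dm_content, harvest_index,
                           n_supply_ratio, n_threshold=0.6):
    """Convert simulated total dry matter (t/ha) to fresh marketable yield (t/ha).

    total_dm       : simulated above-ground dry matter, t/ha
    dm_content     : dry-matter fraction of the marketable product (e.g. 0.08)
    harvest_index  : marketable share of total dry matter under ample N
    n_supply_ratio : actual / optimal N supply (1.0 = fully sufficient)
    n_threshold    : below this ratio the marketable fraction collapses

    The linear penalty below the N threshold stands in for the regression-based
    response described in the paper and is not its actual functional form.
    """
    if n_supply_ratio >= n_threshold:
        marketable_fraction = harvest_index
    else:
        marketable_fraction = harvest_index * n_supply_ratio / n_threshold
    marketable_dm = total_dm * marketable_fraction
    return marketable_dm / dm_content        # fresh weight = dry weight / DM content

# Example: 6 t/ha dry matter, 8% DM in the product, harvest index 0.7, N slightly short:
print(fresh_marketable_yield(6.0, 0.08, 0.7, 0.9))   # about 52.5 t/ha fresh marketable yield
```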
Buses, cars, bicycles and walkers: the influence of the type of human transport on the flight responses of waterbirds
One way to manage disturbance to waterbirds in natural areas where humans require access is to promote the occurrence of stimuli for which birds tolerate closer approaches, and so cause fewer responses. We conducted 730 experimental approaches to 39 species of waterbird, using five stimulus types (single walker, three walkers, bicycle, car and bus) selected to mimic different human management options available for a controlled-access, Ramsar-listed wetland. Across species, where differences existed (56% of 25 cases), motor vehicles always evoked shorter flight-initiation distances (FID) than humans on foot. The influence of stimulus type on FID varied across four species for which enough data were available for complete cross-stimulus analysis. All four varied FID in relation to stimuli, differing in 4 to 7 of 10 possible comparisons. Where differences occurred, the effect size was generally modest, suggesting that managing stimulus type (e.g. by requiring people to use vehicles) may have species-specific, modest benefits, at least for the waterbirds we studied. However, different stimulus types have different capacities to reduce the frequency of disturbance (i.e. by carrying more people) and vary in their capacity to travel around important habitat.
