13,660 research outputs found
The microlensing rate and distribution of free-floating planets towards the Galactic bulge
Ground-based optical microlensing surveys have provided tantalising, if
inconclusive, evidence for a significant population of free-floating planets
(FFPs). Both ground- and space-based facilities are being used and developed
that will be able to probe the distribution of FFPs with much better
sensitivity. It is also vital to develop a high-precision microlensing
simulation framework to evaluate the completeness of such surveys. We present
the first signal-to-noise limited calculations of the FFP microlensing rate
using the Besancon Galactic model. The microlensing distribution towards the
Galactic centre is simulated for wide-area ground-based optical surveys such as
OGLE or MOA, a wide-area ground-based near-IR survey, and a targeted
space-based near-IR survey which could be undertaken with Euclid or WFIRST. We
present a calculation framework for the computation of the optical and
near-infrared microlensing rate and optical depth for simulated stellar
catalogues which are signal-to-noise limited, and take account of extinction,
unresolved stellar background light and finite source size effects, which can
be significant for FFPs. We find that the global ground-based I-band yield over
a central 200 deg^2 region covering the Galactic centre ranges from 20
Earth-mass FFPs year^-1 up to 3,500 year^-1 for Jupiter-mass FFPs in the limit
of 100% detection efficiency, and almost an order of magnitude larger for a
K-band survey. For ground-based surveys we find that the inclusion of finite
source effects and the unresolved background reveals a mass-dependent variation
in the spatial distribution of FFPs. For a space-based H-band survey covering
2 deg^2, the yield depends on the target field but maximizes close to the
Galactic centre, with around 76 Earth-mass through to 1,700 Jupiter-mass FFPs
year^-1. For near-IR space-based
surveys the spatial distribution of FFPs is found to be largely insensitive to
the FFP mass scale. Comment: 14 pages, submitted to A&A and accepted.
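As a point of reference (standard textbook expressions, not the paper's signal-to-noise limited versions), the quantities such a framework evaluates are the angular Einstein radius, the microlensing optical depth, and the event rate:

\theta_\mathrm{E} = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_\mathrm{S}-D_\mathrm{L}}{D_\mathrm{L} D_\mathrm{S}}},
\qquad
\tau(D_\mathrm{S}) = \frac{4\pi G}{c^{2}} \int_{0}^{D_\mathrm{S}} \rho(D_\mathrm{L})\,
\frac{D_\mathrm{L}\,(D_\mathrm{S}-D_\mathrm{L})}{D_\mathrm{S}}\,\mathrm{d}D_\mathrm{L},
\qquad
\Gamma \simeq \frac{2}{\pi}\,\frac{\tau}{\langle t_\mathrm{E} \rangle},

where M is the lens mass, D_L and D_S are the lens and source distances, \rho is the lens mass density along the line of sight, and \langle t_\mathrm{E} \rangle is the mean Einstein-radius crossing time. Finite source effects enter once the angular source radius becomes comparable to \theta_\mathrm{E}, which happens readily at FFP masses.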
Beyond Anticommunism: The Fragility of Class Analysis in Romania
Socio-economic inequality and class have become increasingly important topics in mainstream academic and political debates. This article shows that during the late 2000s class analysis was rediscovered in Romania both as an analytical category and as a category of practice. The evidence suggests that this was the result of two converging processes: the deepening crisis of Western capitalism after 2008 and the country’s increasingly transnational networks of young scholars, journalists, and civil society actors. Although a steady and focused interest in class analysis is a novelty in Romania’s academia, media, and political life and has the potential to change the political conversation in the future, the social fields where this analysis is practiced have so far remained relatively marginal.
On the general problem of quantum phase estimation
The problem of estimating a generic phase-shift experienced by a quantum
state is addressed for a generally degenerate phase shift operator. The optimal
positive operator-valued measure is derived along with the optimal input state.
Two relevant examples are analyzed: i) a multi-mode phase shift operator for
multipath interferometry; ii) two-mode heterodyne phase detection. Comment: 11 pages. Elsart package used.
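For context, a standard result for the non-degenerate case only (not this paper's general construction): when the phase shift is generated by a non-degenerate operator with eigenstates |n\rangle, the optimal covariant measurement is the canonical (Holevo) phase POVM,

\mathrm{d}\Pi(\phi) = |e(\phi)\rangle\langle e(\phi)|\,\frac{\mathrm{d}\phi}{2\pi},
\qquad
|e(\phi)\rangle = \sum_{n} e^{\,i n \phi}\,|n\rangle .

The degenerate setting addressed in the abstract generalizes this benchmark.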
Political economy and the ghosts of the past: revisiting the Spanish and Romanian transitions to democracy
Juan Linz and Alfred Stepan’s opus on democratic transition and consolidation put Spain and Romania at the extreme ends of these processes and paid little attention to the domestic and external economic constraints on the transition process. This paper interrogates these claims. It shows that in retrospect Spain looks a lot less exemplary and Romania a lot less hopeless than this iconic contribution suggested at the time. Moreover, while external economic shocks and local attempts to buffer them through social compensation shaped both transitions, Romanian governments faced balance-of-payments crises and international policy conditionality constraints, while their Spanish counterparts did not. This difference invites a greater appreciation of the role of political economy analyses when comparing the policy options of political elites ruling in times of democratic transition and consolidation.
The Development of LOX-Based Magnetic Fluid Technology and its Impact on Small Satellites
A magnetic fluid system could potentially replace mechanically moving parts in a satellite as a means of increasing system reliability and mission lifetime, but rather than a standard ferrofluid with suspended magnetic particles, liquid oxygen (LOX) may be a more suitable working fluid. As a pure paramagnetic cryogen, LOX is already heavily used in space, but it still requires basic research before being integrated into system development. The objectives of the research were to verify LOX as a magnetic working fluid through experiment and to establish a theoretical model describing its behavior. This paper presents the theoretical, experimental, and numerical results for a slug of LOX pulsed by a 1.1 T solenoid in a quartz tube with an inner diameter of 1.9 mm. The slug oscillated about the solenoid at 6-8 Hz, producing a pressure change of up to 1.2 kPa. System efficiency based on the Mason number was also studied for various geometric setups, and a one-dimensional, finite-differenced model in Matlab 2008a confirmed the theoretical model. The research provides groundwork for future applied studies of more complex designs using Comsol Multiphysics 3.5a.
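A minimal sketch of the kind of one-dimensional finite-difference model the abstract describes, not the authors' Matlab code: the slug is treated as a lumped mass driven by the paramagnetic body force of an on-axis field and integrated with a semi-implicit Euler scheme. The field profile, susceptibility, slug length, and damping are illustrative assumptions; only the 1.1 T field strength and 1.9 mm bore come from the abstract.

# Hedged 1-D sketch: LOX slug as a lumped mass in an on-axis solenoid field.
import numpy as np

MU0   = 4e-7 * np.pi        # vacuum permeability [H/m]
CHI   = 3.5e-3              # approx. volume susceptibility of LOX (SI), assumed
RHO   = 1141.0              # LOX density [kg/m^3]
R     = 0.95e-3             # tube inner radius [m] (1.9 mm ID, from abstract)
L     = 10e-3               # slug length [m], assumed
B0    = 1.1                 # peak solenoid field [T], from abstract
A     = 5e-3                # field fall-off length scale [m], assumed
C_VIS = 2e-3                # linear damping coefficient [N s/m], assumed

VOL  = np.pi * R**2 * L     # slug volume [m^3]
MASS = RHO * VOL            # slug mass [kg]

def B(z):
    """On-axis field, single-loop approximation centred at z = 0 (assumed profile)."""
    return B0 / (1.0 + (z / A) ** 2) ** 1.5

def dBdz(z, h=1e-6):
    """Central finite difference of the field gradient."""
    return (B(z + h) - B(z - h)) / (2.0 * h)

def simulate(z0=8e-3, t_end=1.0, dt=1e-5):
    """Integrate the slug position from an initial offset z0 [m]."""
    n = int(t_end / dt)
    z, v = np.empty(n), np.empty(n)
    z[0], v[0] = z0, 0.0
    for k in range(n - 1):
        # Paramagnetic body force: (chi/mu0) * B * dB/dz per unit volume.
        f_mag = (CHI / MU0) * VOL * B(z[k]) * dBdz(z[k])
        a_k = (f_mag - C_VIS * v[k]) / MASS
        v[k + 1] = v[k] + a_k * dt          # semi-implicit Euler step
        z[k + 1] = z[k] + v[k + 1] * dt
    return z, v

z, v = simulate()   # slug oscillates about the field maximum and slowly damps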
The Big Data Newsvendor: Practical Insights from Machine Learning
We investigate the data-driven newsvendor problem when one has n observations of p features related to the demand as well as historical demand data. Rather than a two-step process of first estimating a demand distribution and then optimizing the order quantity, we propose solving the “Big Data” newsvendor problem via single-step machine learning algorithms. Specifically, we propose algorithms based on the Empirical Risk Minimization (ERM) principle, with and without regularization, and an algorithm based on Kernel-weights Optimization (KO). The ERM approaches, equivalent to high-dimensional quantile regression, can be solved as convex optimization problems, and the KO approach by a sorting algorithm.
We analytically justify the use of features by showing that their omission yields inconsistent decisions. We then derive finite-sample performance bounds on the out-of-sample costs of the feature-based algorithms, which quantify the effects of dimensionality and cost parameters. Our bounds, based on algorithmic stability theory, generalize known analyses for the newsvendor problem without feature information. Finally, we apply the feature-based algorithms to nurse staffing in a hospital emergency room, using a data set from a large UK teaching hospital, and find that (i) the best ERM and KO algorithms beat the best-practice benchmark by 23% and 24%, respectively, in out-of-sample cost, and (ii) the best KO algorithm is faster than the best ERM algorithm by three orders of magnitude and than the best-practice benchmark by two orders of magnitude.
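A minimal sketch under the equivalence stated above, not the authors' implementation: the unregularized ERM newsvendor with a linear decision rule is a quantile regression at the critical fractile b/(b+h), so it can be prototyped with an off-the-shelf quantile-regression solver (here scikit-learn's QuantileRegressor; the cost parameters and data below are purely illustrative).

# Hedged sketch: feature-based newsvendor via ERM = quantile regression.
import numpy as np
from sklearn.linear_model import QuantileRegressor

def fit_erm_newsvendor(X, d, underage_cost_b, overage_cost_h):
    """Fit a linear feature-to-order-quantity rule by empirical risk minimization.

    X : (n, p) feature matrix, d : (n,) observed demands.
    The newsvendor-optimal quantile is the critical fractile b / (b + h).
    """
    tau = underage_cost_b / (underage_cost_b + overage_cost_h)
    # alpha=0 disables the L1 penalty, i.e. plain ERM; alpha > 0 gives the
    # regularized variant mentioned in the abstract.
    model = QuantileRegressor(quantile=tau, alpha=0.0)
    return model.fit(X, d)

# Illustrative usage with synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                                   # p = 3 demand features
d = 50 + X @ np.array([5.0, -3.0, 2.0]) + rng.normal(scale=4.0, size=500)
model = fit_erm_newsvendor(X, d, underage_cost_b=9.0, overage_cost_h=1.0)
orders = model.predict(X[:5])                                   # feature-based order quantities

The KO approach, by contrast, reduces roughly to a feature-weighted sample quantile of the historical demands, which is why a sorting algorithm suffices.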
Distinguishing mixed quantum states: Minimum-error discrimination versus optimum unambiguous discrimination
We consider two different optimized measurement strategies for the
discrimination of nonorthogonal quantum states. The first is conclusive
discrimination with a minimum probability of inferring an erroneous result, and
the second is unambiguous, i.e. error-free, discrimination with a minimum
probability of getting an inconclusive outcome, where the measurement fails to
give a definite answer. For distinguishing between two mixed quantum states, we
investigate the relation between the minimum error probability achievable in
conclusive discrimination, and the minimum failure probability that can be
reached in unambiguous discrimination of the same two states. The latter turns
out to be at least twice as large as the former for any two given states. As an
example, we treat the case that the state of the quantum system is known to be,
with arbitrary prior probability, either a given pure state, or a uniform
statistical mixture of any number of mutually orthogonal states. For this case
we derive an analytical result for the minimum probability of error and perform
a quantitative comparison to the minimum failure probability. Comment: Replaced by final version, accepted for publication in Phys. Rev. A. Revtex4, 6 pages, 3 figures.
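For reference, the two benchmark quantities compared here have the standard forms below (the Helstrom bound for minimum-error discrimination, and the relation quoted in the abstract), where \eta_1, \eta_2 are the prior probabilities and \|\cdot\|_1 is the trace norm:

P_E^{\min} = \frac{1}{2}\left(1 - \left\|\eta_1\rho_1 - \eta_2\rho_2\right\|_1\right),
\qquad
Q_F^{\min} \;\ge\; 2\,P_E^{\min}.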
A Simulation Model Outline for the Hungarian Forest Sector
The model presented in this paper describes the structure of the Hungarian forest sector. The planning of the sector at the national and company levels and the mechanisms of regulation concerning production, investment, and consumption are also investigated, and exports and imports are linked into the model.
One of the most important objectives in creating this model is to study the behavior of the system in order to aid decision making in both strategic and tactical areas. Apart from forestry, the model also includes wood-processing activities.