
    Astrometry and exoplanets in the Gaia era: a Bayesian approach to detection and parameter recovery

    (abridged) We develop Bayesian methods and detection criteria for orbital fitting, and revise the detectability of exoplanets in light of the in-flight properties of Gaia. Limiting ourselves to one-planet systems as a first step of the development, we simulate Gaia data for exoplanet systems over a grid of S/N, orbital period, and eccentricity. The simulations are then fit using Markov chain Monte Carlo methods. We investigate the detection rate according to three information criteria and the delta chi^2. For the delta chi^2, the effective number of degrees of freedom depends on the mission length. We find that the choice of the Markov chain starting point can affect the quality of the results; we therefore consider two limiting possibilities: an ideal case, and a very simple method that finds the starting point assuming circular orbits. Using Jeffreys' scale of evidence, the fraction of false positives passing a strong evidence criterion is < ~0.2% (0.6%) when considering a 5 yr (10 yr) mission and using the Akaike information criterion or the Watanabe-Akaike information criterion, and <0.02% (<0.06%) when using the Bayesian information criterion. We find that there is a 50% chance of detecting a planet with a minimum S/N = 2.3 (1.7). This sets the maximum distance to which a planet is detectable to ~70 pc and ~3.5 pc for a Jupiter-mass and Neptune-mass planet, respectively, assuming a 10 yr mission, a 4 au semi-major axis, and a 1 M_sun star. The period is the orbital parameter that can be determined with the best accuracy, with a median relative difference between input and output periods of 4.2% (2.9%) assuming a 5 yr (10 yr) mission. The semi-major axis of the orbit can be recovered with a median relative error of 7% (6%). The eccentricity can also be recovered with a median absolute accuracy of 0.07 (0.06). Comment: 18 pages, 11 figures. New version accepted by A&A for publication.
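
    The detection step sketched in this abstract boils down to comparing a single-star astrometric model with a one-planet model using delta chi^2 and information criteria. As a minimal illustration (not the authors' code; the parameter counts and any thresholds applied on Jeffreys' scale are assumptions), the statistics could be computed along these lines:

```python
import numpy as np

def detection_statistics(resid_single, resid_orbit, sigma, k_single=5, k_orbit=12):
    """Illustrative model-comparison statistics for astrometric planet detection.

    resid_single / resid_orbit : residuals of the single-star and one-planet fits
    sigma                      : per-observation uncertainty
    k_single, k_orbit          : free parameters in each model (12 = 5 astrometric
                                 + 7 orbital elements; these counts are assumptions,
                                 not taken from the paper)
    """
    n = len(resid_single)
    chi2_s = np.sum((resid_single / sigma) ** 2)
    chi2_o = np.sum((resid_orbit / sigma) ** 2)

    delta_chi2 = chi2_s - chi2_o  # improvement from adding the planet
    # Gaussian-likelihood information criteria (up to a common additive constant)
    aic_s, aic_o = chi2_s + 2 * k_single, chi2_o + 2 * k_orbit
    bic_s, bic_o = chi2_s + k_single * np.log(n), chi2_o + k_orbit * np.log(n)

    return {
        "delta_chi2": delta_chi2,
        "delta_AIC": aic_s - aic_o,   # > 0 favours the one-planet model
        "delta_BIC": bic_s - bic_o,
    }
```

    The WAIC used in the paper requires the full set of posterior samples rather than a single best fit, so it is omitted from this sketch.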

    The Tycho-Gaia astrometric solution. How to get 2.5 million parallaxes with less than one year of Gaia data

    Context. The first release of astrometric data from Gaia will contain the mean stellar positions and magnitudes from the first year of observations, and proper motions from the combination of Gaia data with Hipparcos prior information (HTPM). Aims. We study the potential of using the positions from the Tycho-2 Catalogue as additional information for a joint solution with early Gaia data. We call this the Tycho-Gaia astrometric solution (TGAS). Methods. We adapt Gaia's Astrometric Global Iterative Solution (AGIS) to incorporate Tycho information, and use simulated Gaia observations to demonstrate the feasibility of TGAS and to estimate its performance. Results. Using six to twelve months of Gaia data, TGAS could deliver positions, parallaxes and annual proper motions for the 2.5 million Tycho-2 stars, with sub-milliarcsecond accuracy. TGAS overcomes some of the limitations of the HTPM project and allows its execution half a year earlier. Furthermore, if the parallaxes from Hipparcos are not incorporated in the solution, they can be used as a consistency check of the TGAS/HTPM solution. Comment: Accepted for publication in A&A, 24 Dec 2014.
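
    The gain from combining Tycho-2 positions (epoch ~1991) with early Gaia positions comes largely from the long temporal baseline. The sketch below is only a toy two-epoch error-propagation argument, not the joint AGIS/TGAS solution described above, and the quoted positional uncertainties are illustrative assumptions:

```python
import numpy as np

def proper_motion_from_two_epochs(pos_early_mas, sig_early_mas, t_early,
                                  pos_late_mas, sig_late_mas, t_late):
    """Toy illustration of why a long temporal baseline helps: a proper motion
    estimated from two position measurements, with its propagated uncertainty.
    (The real TGAS is a joint global solution, not a two-point difference.)"""
    dt = t_late - t_early                             # years
    pm = (pos_late_mas - pos_early_mas) / dt          # mas/yr
    sig_pm = np.hypot(sig_early_mas, sig_late_mas) / dt
    return pm, sig_pm

# e.g. an assumed ~30 mas Tycho-2 position around epoch 1991.5 and a ~1 mas
# early Gaia position around 2015.0 give a ~1.3 mas/yr proper-motion error:
pm, sig = proper_motion_from_two_epochs(0.0, 30.0, 1991.5, 120.0, 1.0, 2015.0)
```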

    Abundance of Belugas, Delphinapterus leucas, in Cook Inlet, Alaska, 1994–2000

    Annual abundance estimates of belugas, Delphinapterus leucas, in Cook Inlet were calculated from counts made by aerial observers and aerial video recordings. Whale group-size estimates were corrected for subsurface whales (availability bias) and for whales that were at the surface but were missed (detection bias). Logistic regression was used to estimate the probability that entire groups were missed during the systematic surveys, and the results were used to calculate a correction to account for the whales in these missed groups (1.015, CV = 0.03 in 1994–98; 1.021, CV = 0.01 in 1999–2000). Calculated abundances were 653 (CV = 0.43) in 1994, 491 (CV = 0.44) in 1995, 594 (CV = 0.28) in 1996, 440 (CV = 0.14) in 1997, 347 (CV = 0.29) in 1998, 367 (CV = 0.14) in 1999, and 435 (CV = 0.23, 95% CI = 279–679) in 2000. For management purposes the current N_best = 435 and N_min = 360. These estimates replace preliminary estimates of 749 for 1994 and 357 for 1999. Monte Carlo simulations indicate a 47% probability that the Cook Inlet beluga stock was depleted by 50% between June 1994 and June 1998. The decline appears to have stopped in 1998.
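
    The abstract does not spell out how N_min is obtained from N_best, but marine mammal stock assessments conventionally take N_min as the 20th percentile of a log-normal distribution around the point estimate. A quick sketch under that assumption (not necessarily the authors' exact procedure) reproduces the quoted value to within rounding:

```python
import math

def n_min(n_best, cv, z=0.842):
    """20th percentile of a log-normal abundance estimate.
    z = 0.842 is the 20th percentile of a standard normal."""
    return n_best / math.exp(z * math.sqrt(math.log(1.0 + cv * cv)))

print(round(n_min(435, 0.23)))  # ~359, consistent with the quoted N_min = 360 up to rounding
```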

    Beluga, Delphinapterus leucas, Group Sizes in Cook Inlet, Alaska, Based on Observer Counts and Aerial Video

    Beluga, Delphinapterus leucas, groups were videotaped concurrently with observer counts during annual NMFS aerial surveys of Cook Inlet, Alaska, from 1994 to 2000. The videotapes provided permanent records of whale groups that could be examined and compared to group-size estimates made by aerial observers. Examination of the video recordings resulted in 275 counts of 79 whale groups. The McLaren formula was used to account for whales missed while they were underwater (average correction factor 2.03; SD = 0.64). A correction for whales missed due to video resolution was developed by using a second, paired video camera that magnified images relative to the standard video. This analysis showed that some whales were missed either because their image size fell below the resolution of the standard video recording or because two whales surfaced so close to each other that their images appeared to be one large whale. The resulting correction method depended on knowing the average whale image size in the videotapes. Image sizes were measured for 2,775 whales from 275 different passes over whale groups. Corrected group sizes were calculated as the product of the original count from video, the correction factor for whales missed underwater, and the correction factor for whales missed due to video resolution (average 1.17; SD = 0.06). A regression formula was developed to estimate group sizes from aerial observer counts; the independent variables were the aerial counts and an interaction term involving encounter rate (whales per second during the counting of a group), which were regressed against the respective group sizes as calculated from the videotapes. Significant effects of encounter rate, either positive or negative, were found for several observers. This formula was used to estimate group size when video was not available. The estimated group sizes were used in the annual abundance estimates.
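
    Since the corrected group size is simply the product of the raw video count and the two correction factors, a small sketch makes the arithmetic concrete (the default factors below are the reported averages; the paper applies per-group corrections):

```python
def corrected_group_size(video_count,
                         availability_correction=2.03,  # whales missed underwater (reported mean)
                         resolution_correction=1.17):   # whales missed due to video resolution (reported mean)
    """Corrected group size as the product described in the abstract."""
    return video_count * availability_correction * resolution_correction

print(corrected_group_size(10))  # ~23.8 whales for a raw video count of 10
```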

    Congestion Management in European Power Networks: Criteria to Assess the Available Options

    EU Member States are pursuing large-scale investment in renewable generation in order to meet a 2020 target of sourcing 20% of total energy from renewables. As the location of this new generation differs from that of existing generation sources, and is often on the extremities of the electricity network, it will create new flow patterns and transmission needs. While congestion already exists between European countries, increasing the penetration of variable sources of energy will change the current cross-border congestion profile. It therefore becomes increasingly important for the power market design to foster full use of existing transmission capacity and allow for robust operation even in the presence of system congestion. After identifying five criteria that an effective congestion management scheme for European countries must meet, this paper critically assesses to what extent the various approaches satisfy these requirements. Keywords: power market design, integrating renewables, congestion management.

    Gaia astrometry for stars with too few observations - a Bayesian approach

    Gaia's astrometric solution aims to determine at least five parameters for each star, together with appropriate estimates of their uncertainties and correlations. This requires at least five distinct observations per star. In the early data reductions the number of observations may be insufficient for a five-parameter solution, and even after the full mission many stars will remain under-observed, including faint stars at the detection limit and transient objects. In such cases it is reasonable to determine only the two position parameters. Their formal uncertainties would, however, grossly underestimate the actual errors, due to the neglected parallax and proper motion. We aim to develop a recipe to calculate sensible formal uncertainties that can be used in all cases of under-observed stars. Prior information about the typical ranges of stellar parallaxes and proper motions is incorporated in the astrometric solution by means of Bayes' rule. Numerical simulations based on the Gaia Universe Model Snapshot (GUMS) are used to investigate how the prior influences the actual errors and formal uncertainties when different amounts of Gaia observations are available. We develop a criterion for the optimum choice of priors, apply it to a wide range of cases, and derive a global approximation of the optimum prior as a function of magnitude and galactic coordinates. The feasibility of the Bayesian approach is demonstrated through global astrometric solutions of simulated Gaia observations. With an appropriate prior it is possible to derive sensible positions with realistic error estimates for any number of available observations. Although this recipe also works for well-observed stars, it should not be used where a good five-parameter astrometric solution can be obtained without a prior. Parallaxes and proper motions from a solution using priors are always biased and should not be used. Comment: Revised version, accepted 21st of August 2015 for publication in A&A.
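
    Incorporating a Gaussian prior on the astrometric parameters via Bayes' rule is, for a linear model, equivalent to adding the inverse prior covariance to the normal equations of the least-squares solution. The following sketch shows that structure only; the design matrix, prior means and prior widths are placeholders, not the GUMS-derived priors developed in the paper:

```python
import numpy as np

def solve_with_prior(A, b, sigma_obs, prior_mean, prior_sigma):
    """Five-parameter astrometric solution regularised by a Gaussian prior.

    A           : (n_obs, 5) design matrix (partials of the along-scan abscissa
                  w.r.t. position, parallax and proper motion)
    b           : (n_obs,) observed-minus-computed abscissae
    sigma_obs   : per-observation uncertainty
    prior_mean  : (5,) prior expectation (e.g. zero parallax/proper-motion offsets)
    prior_sigma : (5,) prior standard deviations; very large values = effectively no prior
    """
    W = np.eye(len(b)) / sigma_obs ** 2
    P_inv = np.diag(1.0 / np.asarray(prior_sigma) ** 2)
    N = A.T @ W @ A + P_inv                 # the prior acts like extra "observations"
    rhs = A.T @ W @ b + P_inv @ prior_mean
    x_hat = np.linalg.solve(N, rhs)
    cov = np.linalg.inv(N)                  # formal covariance including the prior
    return x_hat, cov
```

    With only two or three observations, A.T @ W @ A alone is singular; the prior term keeps the system solvable and inflates the formal position uncertainties to realistic values, which is the behaviour the abstract describes.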

    Implicit contracts, takeovers and corporate governance: in the shadow of the city code

    This paper offers a qualitative, case-study-based analysis of hostile takeover bids mounted in the UK in the mid-1990s under the regime of the City Code on Takeovers and Mergers. It is shown that during bids, directors of bid targets focus on the concerns of target shareholders to the exclusion of other stakeholder groups. A review of the case studies five years on finds that, almost without exception, the mergers led to large-scale job losses and asset disposals. However, almost none of the bids was considered by financial commentators, at this point, to have generated shareholder value for investors in the merged company. While there is therefore clear evidence that the Takeover Code is effective in protecting the interests of target shareholders, the implications of the Code for efficiency in corporate performance are much less certain. Keywords: hostile takeovers, stakeholding, implicit contracts, breach of trust.

    ConStance: Modeling Annotation Contexts to Improve Stance Classification

    Manual annotations are a prerequisite for many applications of machine learning. However, weaknesses in the annotation process itself are easy to overlook. In particular, scholars often choose what information to give to annotators without examining these decisions empirically. For subjective tasks such as sentiment analysis, sarcasm, and stance detection, such choices can impact results. Here, for the task of political stance detection on Twitter, we show that providing too little context can result in noisy and uncertain annotations, whereas providing too strong a context may cause the context to outweigh other signals. To characterize and reduce these biases, we develop ConStance, a general model for reasoning about annotations across information conditions. Given conflicting labels produced by multiple annotators seeing the same instances with different contexts, ConStance simultaneously estimates gold standard labels and also learns a classifier for new instances. We show that the classifier learned by ConStance outperforms a variety of baselines at predicting political stance, while the model's interpretable parameters shed light on the effects of each context. Comment: To appear at EMNLP 2017.
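
    ConStance itself jointly models the context shown to each annotator; the sketch below is explicitly not that model, but a minimal Dawid-Skene-style EM that illustrates the underlying setting of estimating gold-standard labels from conflicting annotations:

```python
import numpy as np

def dawid_skene(labels, n_classes=2, n_iter=50):
    """Minimal Dawid-Skene-style EM for aggregating conflicting annotations.
    labels : (n_items, n_annotators) int array, -1 where an annotator skipped an item
    returns: (n_items, n_classes) posterior over the latent "gold" labels
    """
    n_items, n_annot = labels.shape
    # initialise posteriors with smoothed per-item vote proportions
    post = np.ones((n_items, n_classes))
    for c in range(n_classes):
        post[:, c] += (labels == c).sum(axis=1)
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class prior and per-annotator confusion matrices (Laplace smoothed)
        prior = post.mean(axis=0)
        conf = np.ones((n_annot, n_classes, n_classes))
        for a in range(n_annot):
            for c in range(n_classes):
                conf[a, :, c] += post[labels[:, a] == c].sum(axis=0)
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: recompute posteriors over the latent labels
        log_post = np.log(prior)[None, :].repeat(n_items, axis=0)
        for a in range(n_annot):
            seen = labels[:, a] >= 0
            log_post[seen] += np.log(conf[a][:, labels[seen, a]]).T
        post = np.exp(log_post - log_post.max(axis=1, keepdims=True))
        post /= post.sum(axis=1, keepdims=True)
    return post
```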