
    Study of the impact of cruise speed on scheduling and productivity of commercial transport aircraft

    A comparison is made between airplane productivity and utilization levels derived from commercial airline-type schedules developed for two subsonic and four supersonic cruise-speed aircraft. Cruise speed is the only difference between the schedules, which are based on 1995 passenger demand forecasts. Productivity-to-speed relationships were determined for three discrete route systems: North Atlantic, Trans-Pacific, and North-South America. Selected combinations of these route systems were also studied. Other factors affecting the productivity-to-speed relationship, such as aircraft design range and scheduled turn time, were also examined.

    Asymmetry, Loss Aversion and Forecasting

    Conditional volatility models, such as GARCH, have been used extensively in financial applications to capture predictable variation in the second moment of asset returns. However, with recent theoretical literature emphasising the loss-averse nature of agents, this paper considers models which capture time variation in the second lower partial moment. Utility-based evaluation is carried out on several approaches to modelling the conditional second-order lower partial moment (or semi-variance), including distribution- and regime-based models. The findings show that when agents are loss averse, there are utility gains to be made from using models which explicitly capture this feature (rather than trying to approximate it using symmetric volatility models). In general, direct approaches to modelling the semi-variance are preferred to distribution-based models. These results are relevant to risk management and help to link the theoretical discussion on loss aversion to empirical modelling.
    Keywords: asymmetry, loss aversion, semi-variance, volatility models.
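    As a hedged illustration (not the authors' code), the second lower partial moment that the abstract contrasts with ordinary variance can be sketched as the average squared shortfall of returns below a target; the function name and defaults here are assumptions for exposition.

    ```python
    import numpy as np

    def lower_partial_moment(returns, target=0.0, order=2):
        """Lower partial moment of a return series: the mean of
        shortfalls below `target`, raised to `order`. With order=2
        and target=0 this is the (downside) semi-variance."""
        shortfalls = np.minimum(returns - target, 0.0)
        return np.mean(np.abs(shortfalls) ** order)

    r = np.array([0.02, -0.01, 0.03, -0.04, 0.01])
    semi_variance = lower_partial_moment(r)  # penalises only the negative returns
    ```

    Unlike the symmetric variance, only the two negative observations contribute here, which is what makes the measure suitable for loss-averse evaluation.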

    Studying and Modeling the Connection between People's Preferences and Content Sharing

    People regularly share items using online social media. However, people's decisions around sharing---who shares what to whom and why---are not well understood. We present a user study involving 87 pairs of Facebook users to understand how people make their sharing decisions. We find that even when sharing to a specific individual, people's own preference for an item (individuation) dominates over the recipient's preferences (altruism). People's open-ended responses about how they share, however, indicate that they do try to personalize shares based on the recipient. To explain these contrasting results, we propose a novel process model of sharing that takes into account people's preferences and the salience of an item. We also present encouraging results for a sharing prediction model that incorporates both the senders' and the recipients' preferences. These results suggest improvements to both algorithms that support sharing in social media and to information diffusion models.
    Comment: CSCW 201

    Mixed tenure orthodoxy: practitioner reflections on policy effects

    This article examines mixed tenure as a policy orthodoxy. It first sets out how mixed tenure may be considered to constitute an orthodoxy within planning, being generally accepted in theory and practice even in the absence of supporting evidence. Five elements of this orthodoxy are identified, relating to (1) housing and the environment, (2) social change, (3) economic impacts, (4) sustainable communities, and (5) sociospatial integration. Interviews with practitioners involved with three social housing estates that have experienced mixed-tenure policy interventions are reported to consider why the implementation and effects of mixed tenure might not correspond with the orthodox understanding. It is argued that policy ambiguity and weaknesses in policy theory and specification, alongside practical constraints, lie behind incomplete and counterproductive policy implementation, but a belief in pursuing the policy orthodoxy persists nevertheless.

    A Way to Dynamically Overcome the Cosmological Constant Problem

    The Cosmological Constant problem can be solved once we require that the full standard Einstein-Hilbert Lagrangian, gravity plus matter, is multiplied by a total derivative. We analyze such a picture, writing the total derivative as the covariant gradient of a new vector field (b_mu). The dynamics of this b_mu field can play a key role in explaining the present cosmological acceleration of the Universe.
    Comment: 5 page
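    A schematic form of the construction the abstract describes might be written as below; this is a hedged reading of the abstract, not an equation taken from the paper, and the normalization and symbols are assumptions.

    ```latex
    S = \int d^4x \,\sqrt{-g}\,\left(\nabla_\mu b^\mu\right)
        \left[\frac{R}{16\pi G} + \mathcal{L}_{\rm matter}\right]
    ```

    The key point is that the overall multiplier is a total derivative, so a constant shift of the Lagrangian (a cosmological constant term) integrates to a boundary term, while the dynamics of b_mu remain nontrivial.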

    Swift/UVOT Photometry of the Planetary Nebula WeBo 1: Unmasking A Faint Hot Companion Star

    We present an analysis of over 150 ks of data on the planetary nebula WeBo 1 (PN G135.6+01.0) obtained with the Swift Ultraviolet Optical Telescope (UVOT). The central object of this nebula has previously been described as a late-type K giant barium star with a possible hot companion, most likely a young pre-white dwarf. UVOT photometry shows that while the optical photometry is consistent with a large cool object, the near-ultraviolet (UV) photometry shows far more UV flux than could be produced by any late-type object. Using model stellar atmospheres and a comparison to UVOT photometry for the pre-white dwarf PG 1159-035, we find that the companion has a temperature of at least 40,000 K and a radius of, at most, 0.056 R_sun. While the temperature and radius are consistent with a hot compact stellar remnant, they are, respectively, lower and larger than expected for a typical young pre-white dwarf. This likely indicates a deficiency in the assumed UV extinction curve. We find that higher temperatures, more consistent with expectations for a pre-white dwarf, can be derived if the foreground dust has a strong "blue bump" at 2175 Å and a lower R_V. Our results demonstrate the ability of Swift both to uncover and characterize hot hidden companion stars and to constrain the UV extinction properties of foreground dust based solely on UVOT photometry.
    Comment: 26 pages, 9 figures, accepted to Astronomical Journa

    A multi-method approach to delineate and validate migratory corridors

    Context: Managers are faced with numerous methods for delineating wildlife movement corridors, and often must make decisions with limited data. Delineated corridors should be robust to different data and models.
    Objectives: We present a multi-method approach for delineating and validating wildlife corridors using multiple data sources, which can be used to conserve landscape connectivity. We used this approach to delineate and validate migration corridors for wildebeest (Connochaetes taurinus) in the Tarangire Ecosystem of northern Tanzania.
    Methods: We used two types of locational data (distance-sampling detections and GPS collar locations) and three modeling methods (negative binomial regression, logistic regression, and Maxent) to generate resource selection functions (RSFs) and define resistance surfaces. We compared two corridor detection algorithms (cost-distance and circuit theory) to delineate corridors. We validated corridors by comparing random and wildebeest locations that fell within corridors, and cross-validated by data type.
    Results: Both data types produced similar RSFs. Wildebeest consistently selected migration habitat in flatter terrain farther from human settlements. Validation indicated that three of the combinations of data type, modeling method, and corridor detection algorithm (detection data with Maxent modeling, GPS collar data with logistic regression modeling, and GPS collar data with Maxent modeling, all using cost-distance) far outperformed the other seven. We merged the predictive corridors from these three data-method combinations to reveal habitat with the highest probability of use.
    Conclusions: The use of multiple methods ensures that planners are able to prioritize conservation of migration corridors based on all available information.
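    As a hedged sketch of the cost-distance step named in the Methods (not the authors' implementation), the accumulated least-cost surface over a resistance grid can be computed with Dijkstra's algorithm; the grid, 4-connectivity, and step-cost convention here are illustrative assumptions.

    ```python
    import heapq

    def cost_distance(resistance, source):
        """Accumulated least-cost distance from `source` over a 2-D
        resistance grid, moving in 4 directions; each step costs the
        mean resistance of the two cells it connects."""
        rows, cols = len(resistance), len(resistance[0])
        dist = [[float("inf")] * cols for _ in range(rows)]
        dist[source[0]][source[1]] = 0.0
        pq = [(0.0, source)]
        while pq:
            d, (r, c) = heapq.heappop(pq)
            if d > dist[r][c]:
                continue  # stale queue entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + 0.5 * (resistance[r][c] + resistance[nr][nc])
                    if nd < dist[nr][nc]:
                        dist[nr][nc] = nd
                        heapq.heappush(pq, (nd, (nr, nc)))
        return dist
    ```

    A corridor between two seasonal ranges can then be delineated by summing the cost surfaces computed from each endpoint and keeping cells where the sum falls below a chosen threshold above the minimum.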