
    The Predictive Utility of Generalized Expected Utility Theories

    Many alternative theories have been proposed to explain violations of expected utility (EU) theory observed in experiments. Several recent studies test some of these alternative theories against each other. Formal tests used to judge the theories usually count the number of responses consistent with a theory, ignoring systematic variation in the responses that are inconsistent. We develop a maximum-likelihood estimation method that uses all the information in the data, creates test statistics that can be aggregated across studies, and enables one to judge the predictive utility (the fit and parsimony) of competing utility theories. Analyses of 23 data sets, comprising several thousand choices, suggest a menu of theories that sacrifice the least parsimony for the biggest improvement in fit: mixed fanning, prospect theory, EU, and expected value. Which theories fit best is highly sensitive to whether the gambles in a pair have the same support (EU fits better) or not (EU fits poorly). Our method may also apply to other domains in which different theories predict different subsets of choices (e.g., refinements of Nash equilibrium in noncooperative games).
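
    As a hedged illustration of the estimation idea above (not the authors' actual specification), the sketch below fits a one-parameter logit-style error model to binary gamble choices by maximum likelihood and reports a parsimony-penalized score; the toy data, the noise parameterization, and the use of BIC are assumptions made for this example.

```python
# Hypothetical sketch: fit a logit-style error model to binary gamble choices
# and score a theory by log-likelihood penalized for its parameter count.
import numpy as np
from scipy.optimize import minimize_scalar

# Toy data: each entry is the utility difference a given theory predicts
# between the two gambles in a pair, plus the observed choice (1 = first gamble).
utility_diff = np.array([0.4, -0.2, 0.1, -0.5, 0.3])  # hypothetical predictions
choices = np.array([1, 0, 1, 0, 1])

def neg_log_likelihood(noise, diffs, y):
    """Logistic choice rule: P(choose first) = 1 / (1 + exp(-diff / noise))."""
    p = 1.0 / (1.0 + np.exp(-diffs / noise))
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Estimate the noise parameter by maximum likelihood.
fit = minimize_scalar(neg_log_likelihood, bounds=(1e-3, 10.0),
                      args=(utility_diff, choices), method="bounded")
log_lik = -fit.fun
n_params = 1  # only the noise parameter here; richer theories add more
bic = n_params * np.log(len(choices)) - 2 * log_lik  # penalizes lost parsimony
print(f"noise={fit.x:.3f}  logL={log_lik:.3f}  BIC={bic:.3f}")
```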

    Box truss analysis and technology development. Task 1: Mesh analysis and control

    An analytical tool was developed to model, analyze, and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. It can determine the RF performance of antennas under any type of manufacturing or operating environment by integrating the various disciplines of design, finite element analysis, surface best-fit analysis, and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver, and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration (i.e., surface adjustment) on a typical box truss antenna.
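
    As a purely hypothetical sketch of the architecture, the code below chains six stand-in stages bearing the same names as the programs listed above into one pipeline; every function body, field, and number is an invented placeholder, not the actual Integrated Mesh Analysis System.

```python
# Invented placeholder pipeline mirroring the six stages named in the abstract.
def mesh_tie_model_generator(design):
    # Stand-in for the Mesh Tie System Model Generator.
    return {"nodes": design["facets"] * 4, "cords": design["facets"] * 6}

def loadcase_generator(model, thermal_gradients):
    # Stand-in for the Loadcase Generator.
    return [{"thermal_gradient": g} for g in thermal_gradients]

def model_optimizer(model):
    # Stand-in for the Model Optimizer (e.g., balancing cord pretensions).
    model["pretension_balanced"] = True
    return model

def model_solver(model, loadcase):
    # Stand-in for the Model Solver; a real one would run a finite element analysis.
    return {"deflections": [0.0] * model["nodes"], "case": loadcase}

def surface_topography_solver(solution):
    # Stand-in for the Surface Topography Solver (best-fit surface error).
    return {"rms_surface_error_mm": 0.0, "case": solution["case"]}

def rf_performance_solver(topography, frequency_ghz):
    # Stand-in for the RF Performance Solver (gain loss from surface error).
    return {"frequency_ghz": frequency_ghz, "gain_loss_db": 0.0}

def run_pipeline(design, thermal_gradients, frequency_ghz):
    model = model_optimizer(mesh_tie_model_generator(design))
    results = []
    for case in loadcase_generator(model, thermal_gradients):
        solution = model_solver(model, case)
        topography = surface_topography_solver(solution)
        results.append(rf_performance_solver(topography, frequency_ghz))
    return results

print(run_pipeline({"facets": 24}, thermal_gradients=[10.0, 50.0], frequency_ghz=12.0))
```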

    Kalman-filter control schemes for fringe tracking. Development and application to VLTI/GRAVITY

    The implementation of fringe tracking for optical interferometers is inevitable when optimal exploitation of the instrumental capacities is desired. Fringe tracking allows continuous fringe observation, considerably increasing the sensitivity of the interferometric system. In addition to correcting atmospheric path-length differences, a good control algorithm should correct for disturbances introduced by instrumental vibrations and deal with other errors propagating in the optical trains. We attempt to construct control schemes based on Kalman filters. Kalman filtering is an optimal data-processing algorithm for tracking and correcting a system on which observations are performed. As a direct application, control schemes are designed for GRAVITY, a future four-telescope near-infrared beam combiner for the Very Large Telescope Interferometer (VLTI). We base our study on recent work in adaptive-optics control. The technique is to describe the perturbations of fringe phases in terms of an a priori model, which allows us to optimize the fringe tracking in that it is adapted to the prevailing perturbations. Since the model is parametric, a parameter-identification step needs to be included. Different possibilities exist for generalizing to the four-telescope fringe tracking needed for GRAVITY. On the basis of a two-telescope Kalman-filtering control algorithm, a set of two properly working control algorithms for four-telescope fringe tracking is constructed. The control schemes are designed to take flux problems and low-signal baselines into account. First simulations of the fringe-tracking process indicate that the defined schemes meet the requirements for GRAVITY and allow their performance to be distinguished. In a future paper, we will compare the performance of classical fringe tracking to that of our Kalman-filter control.
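
    As a rough, scalar stand-in for the approach (not the GRAVITY control scheme itself), the sketch below runs a Kalman filter on a simulated fringe-phase disturbance described by an a priori AR(1) model; the model coefficients and noise levels are arbitrary assumptions.

```python
# Scalar Kalman-filter sketch: track a fringe-phase disturbance modeled as AR(1).
import numpy as np

rng = np.random.default_rng(0)
a, q, r = 0.99, 0.01, 0.05  # AR(1) coefficient, process noise, measurement noise
n_steps = 500

# Simulate the "true" phase disturbance and noisy phase measurements.
true_phase = np.zeros(n_steps)
for k in range(1, n_steps):
    true_phase[k] = a * true_phase[k - 1] + rng.normal(0.0, np.sqrt(q))
measurements = true_phase + rng.normal(0.0, np.sqrt(r), n_steps)

# Kalman recursion: predict with the a priori model, update with the measurement.
x_hat, p = 0.0, 1.0
estimates = np.empty(n_steps)
for k in range(n_steps):
    x_hat, p = a * x_hat, a * a * p + q          # predict
    gain = p / (p + r)
    x_hat += gain * (measurements[k] - x_hat)    # update
    p *= 1.0 - gain
    estimates[k] = x_hat

print("residual rms:", np.std(estimates - true_phase))
```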

    Data compression techniques applied to high resolution high frame rate video technology

    An investigation is presented of video data compression applied to microgravity space experiments using High Resolution High Frame Rate Video Technology (HHVT). An extensive survey was conducted of video data compression methods described in the open literature, focusing on methods that employ digital computing. The results of the survey are presented; they include a description of each method and an assessment of image degradation and video data parameters. An assessment is also made of present and near-term future technology for implementing video data compression in a high-speed imaging system, and the results are discussed and summarized. The results of a study of a baseline HHVT video system, along with approaches for implementing video data compression, are presented. Case studies of three microgravity experiments are presented, and specific compression techniques and implementations are recommended.
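
    As a toy illustration of the trade-off such an assessment weighs, the sketch below computes compression ratio against PSNR for simple uniform quantization of a synthetic 8-bit frame; the scheme and the numbers are assumptions for illustration, not results from the survey.

```python
# Toy degradation assessment: compression ratio vs. PSNR for uniform quantization.
import numpy as np

rng = np.random.default_rng(1)
frame = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)  # synthetic 8-bit frame

bits_kept = 4                          # quantize 8-bit pixels down to 4 bits
step = 2 ** (8 - bits_kept)
reconstructed = (frame // step) * step + step // 2

compression_ratio = 8 / bits_kept      # fixed-rate coding assumption
mse = np.mean((frame.astype(float) - reconstructed.astype(float)) ** 2)
psnr_db = 10 * np.log10(255.0 ** 2 / mse)  # common image-degradation metric

print(f"ratio={compression_ratio:.1f}:1  PSNR={psnr_db:.1f} dB")
```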

    Why are Prices Sticky? Evidence from Business Survey Data

    This paper offers new insights into the price-setting behaviour of German retail firms using a novel dataset consisting of a large panel of monthly business surveys from 1991 to 2006. The firm-level data allow changes in firms' prices to be matched to several other firm characteristics. Moreover, information on price expectations allows an analysis of the determinants of price updating. Using univariate and bivariate ordered probit specifications, empirical menu cost models are estimated relating the probability of price adjustment and price updating, respectively, to both time- and state-dependent variables. First, the results suggest an important role for state dependence; changes in the macroeconomic and institutional environment as well as firm-specific factors are significantly related to the timing of price adjustment. These findings imply that price-setting models should endogenize the timing of price adjustment in order to generate realistic predictions concerning the transmission of monetary policy. Second, an analysis of price expectations yields similar results, providing evidence in favour of state-dependent sticky plan models. Third, intermediate input cost changes are among the most important determinants of price adjustment, suggesting that pricing models should explicitly incorporate price setting at different production stages. However, the results show that adjustment to input cost changes takes time, indicating "additional stickiness" at the last stage of processing.
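
    The sketch below sets up an ordered probit of the kind described, but on simulated data with invented covariate names rather than the paper's survey variables; it assumes statsmodels' OrderedModel is available.

```python
# Ordered probit of price-change direction (down / no change / up) on one
# state-dependent and one time-dependent covariate; data are simulated.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "input_cost_change": rng.normal(0, 1, n),             # state-dependent driver
    "months_since_last_change": rng.integers(1, 25, n),   # time-dependent driver
})
latent = (0.8 * df["input_cost_change"]
          + 0.05 * df["months_since_last_change"]
          + rng.normal(0, 1, n))
df["price_change"] = pd.cut(latent, bins=[-np.inf, -0.5, 0.8, np.inf],
                            labels=["down", "none", "up"], ordered=True)

model = OrderedModel(df["price_change"],
                     df[["input_cost_change", "months_since_last_change"]],
                     distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```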

    Understanding Trading Behavior in 401(k) Plans

    We use a new database covering 1.2 million active participants to study trading activity in 1,530 defined contribution retirement plans. Descriptive statistics and regression analysis reveal several interesting trading patterns. First, we show that trading activity in 401(k) accounts is very limited: only 20% of participants ever reshuffled their portfolios over two years. Second, demographic characteristics are strongly associated with trading activity: traders are older, wealthier, more highly paid, male employees with longer plan tenure. Finally, we find that plan design factors, such as the number of funds offered, loan availability, and the specific fund families offered, have significant impacts on 401(k) plan participants’ trading behavior. Moreover, on-line access channels stimulate participants to trade more frequently, although they do not increase turnover ratios to the same extent. We conclude that plan design features are crucial in shaping trading patterns in 401(k) plans.
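
    For illustration only, the sketch below estimates a logit of whether a participant ever traded on demographic and plan-design covariates; the data are simulated and the variable names invented, not the paper's specification.

```python
# Logit of "ever traded" on demographics and plan design, using simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 5000
df = pd.DataFrame({
    "age": rng.integers(22, 65, n),
    "male": rng.integers(0, 2, n),
    "salary_thousands": rng.normal(60.0, 20.0, n),
    "n_funds_offered": rng.integers(5, 40, n),
    "online_access": rng.integers(0, 2, n),
})
latent = (-4.0 + 0.03 * df["age"] + 0.4 * df["male"]
          + 0.01 * df["salary_thousands"] + 0.02 * df["n_funds_offered"]
          + 0.5 * df["online_access"])
df["traded"] = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-latent))).astype(int)

X = sm.add_constant(df.drop(columns="traded"))
print(sm.Logit(df["traded"], X).fit(disp=False).summary())
```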

    Lumpy Price Adjustments: A Microeconometric Analysis

    This paper presents a simple model of state-dependent pricing that allows identification of the relative importance of the degree of price rigidity that is inherent to the price-setting mechanism (intrinsic) and that which is due to the price’s driving variables (extrinsic). Using two data sets consisting of a large fraction of the price quotes used to compute the Belgian and French CPI, we assess the role of intrinsic and extrinsic price stickiness in explaining the occurrence and magnitude of price changes at the outlet level. We find that infrequent price changes are not necessarily associated with large adjustment costs; indeed, extrinsic rigidity appears to be significant in many cases. We also find that asymmetry in price adjustment could be due to trends in marginal costs and/or desired mark-ups rather than to asymmetric adjustment-cost bands.
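
    A minimal simulation sketch of the intrinsic/extrinsic distinction, with invented parameters and an assumed (S,s)-style trigger rule: price changes can be rare either because the adjustment band is wide (intrinsic rigidity) or because the driving cost moves slowly (extrinsic rigidity).

```python
# Price changes are triggered only when the price-cost gap leaves a fixed band.
import numpy as np

def change_frequency(band_width, cost_volatility, n_periods=10_000, seed=0):
    rng = np.random.default_rng(seed)
    log_cost, log_price, n_changes = 0.0, 0.0, 0
    for _ in range(n_periods):
        log_cost += rng.normal(0.0, cost_volatility)   # random-walk marginal cost
        if abs(log_cost - log_price) > band_width:     # adjust only outside the band
            log_price = log_cost
            n_changes += 1
    return n_changes / n_periods

# Infrequent changes can come from a wide band (intrinsic) or a calm cost (extrinsic).
print("wide band, volatile cost:", change_frequency(band_width=0.10, cost_volatility=0.02))
print("narrow band, calm cost  :", change_frequency(band_width=0.02, cost_volatility=0.004))
```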