
    Counterfactual Mean-variance Optimization

    We study a new class of estimands in causal inference, which are the solutions to a stochastic nonlinear optimization problem that in general cannot be obtained in closed form. The optimization problem describes the counterfactual state of a system after an intervention, and the solutions represent the optimal decisions in that counterfactual state. In particular, we develop a counterfactual mean-variance optimization approach, which can be used for optimal allocation of resources after an intervention. We propose a doubly-robust nonparametric estimator for the optimal solution of the counterfactual mean-variance program. We go on to analyze rates of convergence and provide a closed-form expression for the asymptotic distribution of our estimator. Our analysis shows that the proposed estimator is robust against nuisance model misspecification, and can attain fast √n rates with tractable inference even when using nonparametric methods. This result is applicable to general nonlinear optimization problems subject to linear constraints whose coefficients are unknown and must be estimated. In this way, our findings contribute to the literature in optimization as well as causal inference. We further discuss the problem of calibrating our counterfactual covariance estimator to improve the finite-sample properties of our proposed optimal solution estimators. Finally, we evaluate our methods via simulation, and apply them to problems in healthcare policy and portfolio construction.
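
    Once the counterfactual mean and covariance have been estimated, the mean-variance program itself is a standard constrained quadratic problem. The following is a minimal sketch, assuming hypothetical plug-in estimates mu_hat and Sigma_hat and a generic solver, not the paper's doubly-robust estimator:

```python
# Minimal sketch: mean-variance allocation from plug-in estimates (illustrative only).
import numpy as np
from scipy.optimize import minimize

def mean_variance_weights(mu_hat, Sigma_hat, risk_aversion=1.0):
    """Maximize mu'w - (risk_aversion/2) * w'Sigma w  subject to  sum(w) = 1, w >= 0."""
    k = len(mu_hat)
    objective = lambda w: -(mu_hat @ w) + 0.5 * risk_aversion * (w @ Sigma_hat @ w)
    constraints = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
    bounds = [(0.0, 1.0)] * k
    w0 = np.full(k, 1.0 / k)  # start from an equal allocation
    res = minimize(objective, w0, bounds=bounds, constraints=constraints)
    return res.x

# Illustrative (made-up) plug-in estimates for three assets or resource categories.
mu_hat = np.array([0.05, 0.08, 0.03])
Sigma_hat = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.02]])
print(mean_variance_weights(mu_hat, Sigma_hat, risk_aversion=3.0))
```

    Swapping the plug-in estimates for doubly-robust ones, as the abstract proposes, changes the statistical properties of the resulting allocation but not the shape of this optimization step.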

    Design of a High-Speed Architecture for Stabilization of Video Captured Under Non-Uniform Lighting Conditions

    Video captured under shaky conditions is often degraded by vibrations. A robust algorithm that stabilizes the video by compensating for vibrations arising from the physical setting of the camera is presented in this dissertation. A very high performance hardware architecture on Field Programmable Gate Array (FPGA) technology is also developed for the implementation of the stabilization system. Stabilization of video sequences captured under non-uniform lighting conditions begins with a nonlinear enhancement process. This improves the visibility of the scene captured by physical sensing devices, which have limited dynamic range. This physical limitation causes saturated regions of the image to shadow out the rest of the scene. It is therefore desirable to recover a more uniform scene in which the shadows are eliminated to a certain extent. Stabilization of video requires the estimation of global motion parameters. By obtaining reliable background motion, the video can be spatially transformed to the reference sequence, thereby eliminating the unintended motion of the camera. A reflectance-illuminance model for video enhancement is used in this research work to improve the visibility and quality of the scene. With fast color space conversion, the computational complexity is reduced to a minimum. The basic video stabilization model is formulated and configured for hardware implementation. Such a model involves evaluation of reliable features for tracking, motion estimation, and affine transformation to map the display coordinates of the stabilized sequence. The multiplications, divisions and exponentiations are replaced by simple arithmetic and logic operations using improved log-domain computations in the hardware modules. On Xilinx's Virtex II 2V8000-5 FPGA platform, the prototype system consumes 59% of the logic slices, 30% of the flip-flops, 34% of the lookup tables, 35% of the embedded RAMs and two ZBT frame buffers. The system is capable of rendering 180.9 million pixels per second (mpps) and consumes approximately 30.6 watts of power at 1.5 volts. With a 1024×1024 frame, this throughput is equivalent to 172 frames per second (fps). Future work will optimize the performance-resource trade-off to meet the specific needs of target applications, and will extend the model to extraction and tracking of moving objects, as the model inherently encapsulates the attributes of spatial distortion and motion prediction to reduce complexity. With these parameters to narrow down the processing range, it is possible to achieve a minimum of 20 fps on desktop computers with Intel Core 2 Duo or Quad Core CPUs and 2 GB of DDR2 memory without dedicated hardware.
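
    The log-domain trick mentioned above can be illustrated in a few lines; this is a software sketch of the general idea only, not the dissertation's FPGA modules, and the function names are hypothetical:

```python
# Minimal sketch of log-domain arithmetic: products, quotients and powers become
# additions, subtractions and scalings of logarithms, which map to cheap hardware.
import numpy as np

def log_domain_multiply(a, b, eps=1e-12):
    """a * b for non-negative inputs, computed as an addition in the log domain."""
    return np.exp(np.log(a + eps) + np.log(b + eps))

def log_domain_divide(a, b, eps=1e-12):
    """a / b for positive inputs, computed as a subtraction in the log domain."""
    return np.exp(np.log(a + eps) - np.log(b + eps))

def log_domain_power(a, gamma, eps=1e-12):
    """a ** gamma (e.g. a gamma-style enhancement curve) via a scaled logarithm."""
    return np.exp(gamma * np.log(a + eps))

pixel, gain = 0.42, 1.8  # a normalized intensity value and an enhancement gain
print(log_domain_multiply(pixel, gain), pixel * gain)  # the two values agree closely
```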

    Heterogeneity aware fault tolerance for extreme scale computing

    Upcoming Extreme Scale, or Exascale, Computing Systems are expected to deliver a peak performance of at least 10^18 floating point operations per second (FLOPS), primarily through significant expansion in scale. A major concern for such large scale systems, however, is how to deal with failures in the system. This is because the impact of failures on system efficiency, even when existing fault tolerance techniques are utilized, generally increases with scale. Hence, current research effort in this area has been directed at optimizing various aspects of fault tolerance techniques to reduce their overhead at scale. One characteristic that has been overlooked so far, however, is heterogeneity, specifically in the rate at which individual components of the underlying system fail, and in the execution profile of a parallel application running on such a system. In this thesis, we investigate the implications of these types of heterogeneity for fault tolerance in large scale high performance computing (HPC) systems. To that end, we 1) study how knowledge of heterogeneity in system failure likelihoods can be utilized to make current fault tolerance schemes more efficient, 2) assess the feasibility of utilizing application imbalance for improved fault tolerance at scale, and 3) propose and evaluate changes to system level resource managers in order to achieve reliable job placement over resources with unequal failure likelihoods. The results in this thesis, taken together, demonstrate that heterogeneity in failure likelihoods significantly changes the landscape of fault tolerance for large scale HPC systems.
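
    As a rough illustration of why failure-rate heterogeneity matters, a classical checkpoint-interval rule such as the Young/Daly approximation already prescribes different checkpoint periods for node groups with different mean times between failures. The sketch below is a generic illustration under assumed numbers, not the thesis's actual schemes:

```python
# Minimal sketch: Young/Daly checkpoint intervals for node groups with unequal MTBFs.
import math

def young_daly_interval(checkpoint_cost_s, mtbf_s):
    """Approximate optimal checkpoint interval sqrt(2 * C * MTBF), in seconds."""
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

checkpoint_cost = 300.0                               # assumed 5-minute checkpoint write
node_group_mtbf = {"reliable": 30 * 24 * 3600.0,      # assumed ~30-day MTBF
                   "failure_prone": 3 * 24 * 3600.0}  # assumed ~3-day MTBF

for group, mtbf in node_group_mtbf.items():
    hours = young_daly_interval(checkpoint_cost, mtbf) / 3600.0
    print(f"{group}: checkpoint roughly every {hours:.1f} hours")
```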

    Monte Carlo experiments of market demand theory

    This study investigated the present theory of market demand and discussed the limitations and restrictions of the theory from an empirical perspective. The objective of the study was to set up a Monte Carlo model to analyze the relationship between consumer demand and market demand and to investigate the market approximation characteristics in light of aggregation conditions. Assuming that income and prices follow lognormal distributions, individual optimal allocations were computed and aggregated to market data. Various market demand systems were then applied, and their approximation characteristics were studied in terms of bias and variance of elasticities. The analyses were carried out in several experiments based on different individual demand systems under various assumptions about the distributions of income and prices. The results indicated that bias in elasticities may be considerable. The numerical exercises also indicated that the rejection rate of Slutsky restrictions increased as the assumption of constant variance of the distributions of income and prices was relaxed. Quadratic response surfaces for bias as a percent of true elasticities and for variance of Slutsky restrictions were also fitted, based on experiments following a Central Composite design. The design factors included some of the individual demand coefficients and the variance of the income and price distributions. The fitted quadratic response surfaces for bias in elasticities indicated that the design factors were not important. On the other hand, the response surfaces fitted for variance of Slutsky restrictions indicated that the design factors were important.
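
    The aggregation step described above, drawing individual incomes from a lognormal distribution and summing individual demands into a market observation, can be sketched as follows. The Cobb-Douglas demand system and all parameter values are illustrative stand-ins, not the demand systems used in the study:

```python
# Minimal sketch: aggregate individual demands under lognormal incomes to market data.
import numpy as np

rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=10.0, sigma=0.8, size=10_000)  # assumed income distribution

def market_demand(price, budget_share=0.3):
    """Aggregate Cobb-Douglas demands q_i = share * income_i / price over all consumers."""
    return np.sum(budget_share * incomes / price)

# Market demand at two prices gives a log-difference estimate of the own-price
# elasticity, which is exactly -1 for Cobb-Douglas individual demands.
q1, q2 = market_demand(2.0), market_demand(2.2)
elasticity = (np.log(q2) - np.log(q1)) / (np.log(2.2) - np.log(2.0))
print(round(elasticity, 3))
```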

    Augmented Computational Design: Methodical Application of Artificial Intelligence in Generative Design

    This chapter presents methodological reflections on the necessity and utility of artificial intelligence in generative design. Specifically, the chapter discusses how generative design processes can be augmented by AI to deliver on a few outcomes of interest or performance indicators while dealing with hundreds or thousands of small decisions. The core of the performance-based generative design paradigm is making statistical or simulation-driven associations between these choices and their consequences, for mapping and navigating such a complex decision space. This chapter will discuss promising directions in Artificial Intelligence for augmenting decision-making processes in architectural design for mapping and navigating complex design spaces. Comment: This is the author's version of the book chapter Augmented Computational Design: Methodical Application of Artificial Intelligence in Generative Design. In Artificial Intelligence in Performance-Driven Design: Theories, Methods, and Tools Towards Sustainability, edited by Narjes Abbasabadi and Mehdi Ashayeri. Wiley, 202

    Essays on the environmental effects of agricultural production

    This dissertation is devoted to the study of the environmental effects of agricultural production. Recent periods of high demand for agricultural products and the increase in world commodity prices result, in part, from the implementation of biofuel policies and the growth of per-capita income in developing countries. The extent to which food, feed, and fuel demands are satisfied depends on the ability of agricultural supply to react to these events. In economics, supply response models are used as the framework to analyze these types of problems by providing estimated magnitudes of the mentioned effects. The accuracy with which these magnitudes are calculated affects the measurement of environmental effects of agricultural production, such as greenhouse gas emissions and land use change at a global scale, with important consequences for country-level accounting. Chapter 2 analyzes the econometric applications of the Neoclassical duality theory of the firm intended to measure the response of production quantities to price changes. We find that the use of real-world market-based data, which is typically available to practitioners but includes features that contradict some hypotheses of the theory, induces bias in the estimated supply response values. In light of these results, Chapter 3 proposes an alternative approach that overcomes the problems encountered when duality theory is applied to real-world data. This novel approach combines market-based data with information about production functions, which are simultaneously used in the econometric estimation of the supply response parameters. The methodology employs Bayesian econometric methods and bases the complementarity among the various datasets on underlying theoretical relationships. An application of this approach to U.S. agriculture provides updated measures of crop yield elasticities with respect to prices. Chapter 4 takes on the issue of direct environmental effects of agricultural production. In particular, it documents and quantifies the effects on nitrous oxide emissions of cutting nitrogen fertilizer applications when farmers face a market instrument intended to discourage the excessive use of nitrogen in soils. An expected utility maximization problem is specified in which the farmer chooses the optimal nitrogen application facing a nonlinear market instrument. The nonlinearity captures the nonlinear relationship between nitrogen applications and nitrous oxide emissions and is arguably more efficient than linear schemes. Simulation results for U.S. corn show that farmers are induced to significantly reduce their fertilization (and consequently emissions) with only minor effects on expected crop yields.
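
    The farmer's decision problem in Chapter 4 has the familiar structure of expected utility maximization over a single input. The sketch below uses made-up functional forms (a concave yield response, CARA utility, and a quadratic emissions charge standing in for the nonlinear instrument) purely to illustrate the setup:

```python
# Minimal sketch: optimal nitrogen rate under a nonlinear (quadratic) emissions charge.
import numpy as np
from scipy.optimize import minimize_scalar

def expected_utility(N, price=180.0, n_cost=1.0, tau=0.001, risk_aversion=0.001,
                     n_draws=5_000, seed=0):
    """Monte Carlo expected CARA utility of profit at nitrogen rate N (kg/ha)."""
    rng = np.random.default_rng(seed)
    weather = rng.normal(1.0, 0.1, n_draws)                   # multiplicative yield shock
    crop_yield = (8.0 + 0.03 * N - 0.00008 * N**2) * weather  # concave response, tons/ha
    profit = price * crop_yield - n_cost * N - tau * N**2     # quadratic emissions charge
    return np.mean(-np.exp(-risk_aversion * profit))          # CARA utility

res = minimize_scalar(lambda N: -expected_utility(N), bounds=(0.0, 300.0), method="bounded")
print(f"optimal nitrogen rate under the assumed parameters: {res.x:.1f} kg/ha")
```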