
    New models to estimate costs of US farm programs

    In this study, I extended the stochastic model built by Babcock and Paulson (2012) to conduct a one-year cost projection for crop insurance, in order to investigate whether it could feasibly be provided solely by private firms. Based on 52 years of yield data from 1961 to 2012, the risk consequences of insuring crop yields and revenue against losses are estimated to be far beyond what private insurers could bear on their own. Reinsurance from the government, however, provides an attractive incentive to insurance firms. Among the six insurance policies examined in this study, the minimum expected net underwriting gain to private firms with government reinsurance in 2013 was $289.9 million, about 9.30% of retained premiums. The maximum loss that firms could have borne in 2013 was $4.9 billion. In addition, the impact of a proposal to eliminate the premium subsidy for the harvest price option is also estimated. The total savings for taxpayers are estimated to be $1.3 billion, about $400 million more than CBO's estimate in 2013 but only 67% of its estimate in 2015. Based on the three-crop competitive storage model initiated by Lence and Hayes (2002), I also develop an improved approach for multiple-year cost projection by modeling the demand shock as a random walk. This approach preserves the correlations between national yields and prices, maintains the relationships among national, county, and farm yields, retains the spatial correlations of yields across crops, and incorporates inter-temporal price correlations as well. More importantly, this approach can simulate price draws with a desired volatility pattern: increasing over time, but at a slower rate than the square root of time t, as stated in Lence, Hart and Hayes (2009). Preserving these correlations and price-related features is crucial for precise cost estimation and valid policy analysis.
My analysis shows that the payments from Price Loss Coverage (PLC), with its time-invariant fixed guarantees, would be significantly underestimated if both the price serial correlation and the increasing, concave price volatilities are ignored. For Agriculture Risk Coverage (ARC) and the Supplemental Coverage Option (SCO), the guarantees are adjusted to reflect market conditions, so the difference in estimated payments is modest. An easy fix for estimating the cost of PLC is to inflate the price volatilities used to generate random prices for budget scoring purposes.
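The concave volatility pattern described above can be illustrated with a minimal sketch. Assuming, purely for illustration, an AR(1) process for log-price deviations (hypothetical parameters `rho` and `sigma`, not the dissertation's calibrated storage model), the cross-sectional standard deviation of simulated prices rises with the horizon but more slowly than the square root of time:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, for illustration only (not the paper's calibration):
rho, sigma = 0.8, 0.10          # price serial correlation, shock volatility
n_paths, horizon = 20000, 10    # simulated price paths, years ahead

# AR(1) log-price deviations: x_t = rho * x_{t-1} + sigma * eps_t
x = np.zeros((n_paths, horizon + 1))
for t in range(1, horizon + 1):
    x[:, t] = rho * x[:, t - 1] + sigma * rng.standard_normal(n_paths)

# Cross-sectional std by horizon: it rises with t (shocks accumulate),
# but slower than sqrt(t) because old shocks decay at rate rho.
std = x.std(axis=0)[1:]
t = np.arange(1, horizon + 1)
print(np.round(std, 3))              # increasing, concave
print(np.round(std / np.sqrt(t), 3)) # decreasing: growth is sub-sqrt(t)
```

For serially uncorrelated shocks (a pure random walk in levels), `std / sqrt(t)` would instead be flat, which is why ignoring the serial correlation and the concave volatility pattern biases PLC cost estimates.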

    Automatic compensating cleanup operation

    Journal Article
    Today's part geometries are becoming ever more complex and require more accurate tool paths to manufacture. Machining process efficiency is also a major consideration for designers as well as manufacturing engineers. Although current advanced CAD/CAM systems have greatly improved the efficiency and accuracy of machining with the introduction of Numerically Controlled (NC) machining, excess material may still be left on the finished part due to machining constraints, including the inaccessibility of the designed part geometry with respect to the cutter, machine motion constraints such as ramp angles, specific cutting patterns, etc. Polishing operations such as grinding and hand finishing are quite time consuming and expensive, and may damage the surface of the part or introduce inaccuracies through human error. Although most existing machining approaches attempt to reduce such excess rest material by modifying NC tool paths, none of them is satisfactory: they can be time consuming, error prone, computationally intensive, too complicated to implement, or limited to certain problem domains. In this research, a compensating cleanup tool path is developed to automatically remove this excess material from the finished part. The method greatly reduces the burden of hand finishing and polishing, and also reduces the errors and complexity introduced when cleanup tool paths are generated manually on the shop floor. More importantly, the tool path generated by this method reduces machining time and increases tool life compared with an optimized tool path that leaves no excess material behind.
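As a toy 2D illustration of the rest-material problem (not the dissertation's algorithm, and with hypothetical profiles and tolerance), one can flag the intervals where a machined profile exceeds the design profile by more than a finishing tolerance; those intervals are the candidates for a compensating cleanup pass:

```python
import numpy as np

x = np.linspace(0.0, 10.0, 201)      # sample positions along the part (toy units)
design = np.full_like(x, 1.0)        # flat design surface at height 1.0
# Machined surface with a leftover ridge between x = 3 and x = 5:
machined = design + np.where((x > 3.0) & (x < 5.0), 0.2, 0.0)

tol = 0.05                           # assumed finishing tolerance
rest = (machined - design) > tol     # True where rest material remains

# Group contiguous True samples into cleanup intervals in x.
padded = np.concatenate(([0], rest.astype(int), [0]))
edges = np.flatnonzero(np.diff(padded))
segments = [(x[s], x[e - 1]) for s, e in zip(edges[0::2], edges[1::2])]
print(segments)                      # one interval covering the ridge
```

A real cleanup tool path must of course also respect cutter accessibility and machine motion constraints; this sketch only shows the detection step.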

    Family Leadership Shift in China

    "This paper discusses traditional Chinese family leadership and its transformation during the process of modernisation. Since Confucianism shaped and influenced Chinese society and cultural values, the main Confucian ideas on family leadership are discussed in the first part. In particular, this part concentrates on the Confucian tradition concerning three relationships: between father and son, between husband and wife, and between younger and older brother. The second part then discusses the prodigious changes that have taken place in Chinese society in the last century, with an analysis of the three stages, or elements, which brought about these changes. In the third and fourth parts of this paper, I discuss the main factors of the new family leadership and its main challenges in the face of Christian values." (p. 23, Introduction)

    Investigations of supernovae and supernova remnants in the era of SKA

    Two main physical mechanisms are used to explain supernova explosions: the thermonuclear explosion of a white dwarf (Type Ia) and the core collapse of a massive star (Type II and Type Ib/Ic). Type Ia supernovae serve as distance indicators that led to the discovery of the accelerating expansion of the Universe. The exact nature of their progenitor systems, however, remains unclear. Radio emission from the interaction between the explosion shock front and the surrounding CSM or ISM provides an important probe into the progenitor star's last evolutionary stage. No radio emission has yet been detected from Type Ia supernovae by current telescopes. The SKA will hopefully detect radio emission from Type Ia supernovae thanks to its much better sensitivity and resolution. There is a 'supernova rate problem' for core-collapse supernovae, because optically dim ones are missed due to being intrinsically faint and/or due to dust obscuration. A number of dust-enshrouded, optically hidden supernovae should be discovered via the SKA1-MID/survey, especially those located in the innermost regions of their host galaxies. Meanwhile, the detection of intrinsically dim SNe will also benefit from SKA1. The detection rate will provide unique information about the current star formation rate and the initial mass function. A supernova explosion triggers a shock wave which expels and heats the surrounding CSM and ISM, forming a supernova remnant (SNR). It is expected that more SNRs will be discovered by the SKA, which may decrease the discrepancy between the expected and observed numbers of SNRs. Several SNRs have been confirmed to accelerate protons, the main component of cosmic rays, to very high energies with their shocks.
This brings hope of solving the puzzle of the origin of Galactic cosmic rays by combining observations of SNRs in the low frequency (SKA) and very high frequency (Cherenkov Telescope Array: CTA) bands.
Comment: To be published in "Advancing Astrophysics with the Square Kilometre Array", Proceedings of Science, PoS(AASKA14).

    Robust boolean set operations for manifold solids bounded by planar and natural quadric surfaces

    Journal Article
    This paper describes our latest effort in robust solid modeling. An algorithm for set operations on solids bounded by planar and natural quadric surfaces is described, one that handles all geometrically degenerate cases robustly. We identify the inconsistent handling of dependent relations, in which the dependencies are disregarded, as the main reason for the lack of robustness in geometric modeling. Instead of using explicit reasoning to make dependent decisions consistent, we show that redundant computation can be avoided by correctly ordering the operations, and that redundant data can be eliminated in the set operation algorithm, so that the result is guaranteed to be a valid two-manifold solid.

    Robust solid modeling by avoiding redundancy for manifold objects in boundary representation

    Journal Article
    This paper describes a new approach to the robustness problem in solid modeling. We identify as the main cause of the lack of robustness that interdependent topological relations are derived from approximate data. Disregarding the interdependencies very likely violates basic properties, such as reflexivity and transitivity, resulting in invalid data representations such as dangling edges, missing faces, etc. We show that the boundary of manifold objects can be represented without redundant relations, which avoids inconsistencies. An algorithm for regularized set operations on manifold solids, based on the principle of avoiding and eliminating redundancy, is described. This algorithm has been implemented for objects bounded by planar and natural quadric surfaces; it handles coincidence and incidence cases between surfaces and curves robustly.
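The redundancy-avoidance idea can be illustrated with a minimal sketch, in the spirit of the paper but not its actual data structure: in a half-edge style boundary representation, each face loop contributes its directed half-edges exactly once, and for a closed, consistently oriented two-manifold every half-edge must have exactly one oppositely directed twin. Manifoldness can then be checked from this single stored relation instead of being duplicated across redundant adjacency tables:

```python
def build_half_edges(faces):
    """faces: vertex-index loops, consistently oriented (e.g. CCW from outside)."""
    twin = {}
    for face in faces:
        n = len(face)
        for i in range(n):
            he = (face[i], face[(i + 1) % n])   # directed half-edge
            if he in twin:
                # The same directed edge appearing twice means >2 faces
                # meet at an edge, or orientations are inconsistent.
                raise ValueError("non-manifold: duplicate half-edge %r" % (he,))
            twin[he] = None
    for a, b in list(twin):
        if (b, a) not in twin:
            raise ValueError("open boundary: missing twin of %r" % ((a, b),))
        twin[(a, b)] = (b, a)                   # derived relation, stored once
    return twin

# A tetrahedron: four consistently oriented triangular faces.
tetra = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]
half_edges = build_half_edges(tetra)

V = len({v for f in tetra for v in f})
E = len(half_edges) // 2
F = len(tetra)
print(V - E + F)   # Euler characteristic: 2 for a sphere-like solid
```

Dangling edges or missing faces of the kind mentioned above would surface here as duplicate or unpaired half-edges, i.e. as a violation of the one stored invariant rather than as silent inconsistency between redundant tables.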