
    IMPACTS OF LIBERALIZING THE JAPANESE PORK MARKET

    The Japanese pork market is protected by a complex set of restrictions, including a variable levy and an import tariff. The combination of these policies distorts the quantity, price, and form of Japanese pork imports. An important issue relevant to the liberalization of the Japanese pork market is the accurate measurement of the price wedge between Japanese and world pork prices. The analysis indicates that the tariff equivalent of the price wedge over the 1986-88 period was 44%. If the tariff equivalent of the price wedge is reduced over a ten-year period, Japanese pork imports are projected to increase by over 39% initially and by over 215% compared to baseline projections by the year 2000. Producer welfare can be maintained by a deficiency payment scheme. A less costly alternative is an industry buffer scheme, which maintains the level of the pork industry for two years and then implements a declining deficiency payment scheme that limits the decrease in production levels to 5% per year.
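    The abstract reports only the resulting tariff equivalent, not the underlying prices; as a minimal sketch of how an ad valorem tariff equivalent of a price wedge is typically computed, the following uses hypothetical domestic and world prices (the function name and figures are illustrative, not taken from the study).

    ```python
    # Minimal sketch: tariff equivalent of the price wedge between a protected
    # domestic market and the world market.  Prices are hypothetical
    # placeholders, not figures from the study.

    def tariff_equivalent(domestic_price: float, world_price: float) -> float:
        """Ad valorem tariff equivalent: the percentage mark-up of the
        domestic price over the world price."""
        return (domestic_price - world_price) / world_price * 100.0

    # Illustrative values only: a 44% wedge, matching the 1986-88 estimate
    # quoted in the abstract, arises whenever the domestic price is 1.44x
    # the world price.
    japan_price = 720.0   # hypothetical yen/kg domestic wholesale price
    world_price = 500.0   # hypothetical yen/kg import price before levies

    print(f"Tariff equivalent: {tariff_equivalent(japan_price, world_price):.0f}%")
    ```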

    A sequential real-time refinement calculus

    We present a comprehensive refinement calculus for the development of sequential, real-time programs from real-time specifications. A specification may include not only execution time limits, but also requirements on the behaviour of outputs over the duration of the execution of the program. The approach allows refinement steps that separate timing constraints and functional requirements. New rules are provided for handling timing constraints, but the refinement of components implementing functional requirements is essentially the same as in the standard refinement calculus. The product of the refinement process is a program in the target programming language extended with timing deadline directives. The extended language is a machine-independent, real-time programming language. To provide valid machine code for a particular model of machine, the machine code produced by a compiler must be analysed to guarantee that it meets the specified timing deadlines.
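    The calculus's target language and directive syntax are not shown in the abstract; as a loose, hypothetical analogue of attaching a timing deadline to a code fragment, the sketch below monitors elapsed time at run time in Python, whereas the approach described above discharges deadlines by static analysis of the compiled machine code.

    ```python
    # Hypothetical analogue of a timing deadline directive, checked at run
    # time.  (In the calculus above, deadlines are verified by analysing the
    # compiled machine code, not by run-time monitoring.)
    import time
    from contextlib import contextmanager

    @contextmanager
    def deadline(limit_seconds: float):
        """Time the enclosed block and complain if it overruns the limit."""
        start = time.perf_counter()
        yield
        elapsed = time.perf_counter() - start
        if elapsed > limit_seconds:
            raise RuntimeError(f"deadline missed: {elapsed:.4f}s > {limit_seconds}s")

    # Usage: the functional requirement (a summation) is stated independently
    # of the timing constraint (finish within 10 ms).
    with deadline(0.010):
        total = sum(range(10_000))
    ```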

    Radar sounding using the Cassini altimeter: waveform modeling and Monte Carlo approach for data inversion of observations of Titan's seas

    Recently, the Cassini RADAR has been used as a sounder to probe the depth and constrain the composition of hydrocarbon seas on Saturn's largest moon, Titan. Altimetry waveforms from observations over the seas are generally composed of two main reflections: the first from the surface of the liquid and the second from the seafloor. The time interval between these two peaks is a measure of sea depth, and the attenuation from propagation through the liquid is a measure of the liquid's dielectric properties, which are sensitive to its composition. Radar measurements are affected by uncertainties that can include saturation effects, possible receiver distortion, and processing artifacts, in addition to thermal noise and speckle. To rigorously treat these problems, we simulate the Ku-band altimetry echo received from Titan's seas using a two-layer model, where the surface is represented by a specular reflection and the seafloor is modeled using a facet-based synthetic surface. The simulation accounts for the thermal noise, speckle, analog-to-digital conversion, and block adaptive quantization and allows for possible receiver saturation. We use a Monte Carlo method to compare simulated and observed waveforms and retrieve the probability distributions of depth, surface/subsurface intensity ratio, and subsurface roughness for the individual double-peaked waveform of Ligeia Mare acquired by the Cassini spacecraft in May 2013. This new analysis provides an update to the Ku-band attenuation and results in a new estimate for its loss tangent and composition. We also demonstrate the ability to retrieve bathymetric information from saturated altimetry echoes acquired over Ontario Lacus in December 2008.
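    The full two-layer, facet-based simulation is beyond an abstract-level sketch, but the core depth retrieval (converting the delay between the surface and seafloor peaks into a depth, given an assumed permittivity for the liquid) can be illustrated as follows; the delay, its uncertainty, and the permittivity are placeholder values, not the quantities retrieved in the study.

    ```python
    # Minimal sketch (not the paper's full two-layer facet model): convert the
    # delay between the surface and seafloor echo peaks into a sea depth, and
    # propagate timing uncertainty with a simple Monte Carlo draw.  All numbers
    # below are illustrative placeholders, not retrieved values.
    import numpy as np

    C = 299_792_458.0           # speed of light in vacuum, m/s
    eps_r = 1.7                 # assumed relative permittivity of the liquid
    v_liquid = C / np.sqrt(eps_r)

    rng = np.random.default_rng(0)
    # Two-way delay between the two peaks (seconds), with Gaussian timing noise.
    delay = rng.normal(loc=1.4e-6, scale=0.05e-6, size=100_000)

    depth = v_liquid * delay / 2.0      # one-way path length equals the depth
    print(f"depth: {depth.mean():.0f} m +/- {depth.std():.0f} m")
    ```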

    A Search for Intrinsic Polarization in O Stars with Variable Winds

    New observations of 9 of the brightest northern O stars have been made with the Breger polarimeter on the 0.9 m telescope at McDonald Observatory and the AnyPol polarimeter on the 0.4 m telescope at Limber Observatory, using the Johnson-Cousins UBVRI broadband filter system. Comparison with earlier measurements shows no clearly defined long-term polarization variability. For all 9 stars the wavelength dependence of the degree of polarization in the optical range can be fit by a normal interstellar polarization law. The polarization position angles are practically constant with wavelength and are consistent with those of neighboring stars. Thus the simplest conclusion is that the polarization of all the program stars is primarily interstellar. The O stars chosen for this study are generally known from ultraviolet and optical spectroscopy to have substantial mass loss rates and variable winds, as well as occasional circumstellar emission. Their lack of intrinsic polarization in comparison with the similar Be stars may be explained by the dominance of radiation as a wind driving force due to higher luminosity, which results in lower density and less rotational flattening in the electron-scattering inner envelopes where the polarization is produced. However, time series of polarization measurements taken simultaneously with H-alpha and UV spectroscopy during several coordinated multiwavelength campaigns suggest two cases of possible small-amplitude, periodic short-term polarization variability, and therefore intrinsic polarization, which may be correlated with the more widely recognized spectroscopic variations. Comment: LaTeX2e, 22 pages including 11 tables; 12 separate gif figures; uses aastex.cls preprint package; accepted by The Astronomical Journal.
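    The abstract does not name the functional form of the "normal interstellar polarization law"; a common choice is the Serkowski relation, and the sketch below fits it to hypothetical UBVRI polarization fractions (the data points and starting parameters are illustrative, not the paper's measurements).

    ```python
    # Sketch of fitting a Serkowski-type interstellar polarization law,
    #   p(lambda) = p_max * exp(-K * ln^2(lambda_max / lambda)),
    # to broadband UBVRI polarization measurements.  The data points are
    # hypothetical placeholders, not the observations reported in the paper.
    import numpy as np
    from scipy.optimize import curve_fit

    def serkowski(wavelength_um, p_max, lam_max_um, K):
        return p_max * np.exp(-K * np.log(lam_max_um / wavelength_um) ** 2)

    # Approximate effective wavelengths (microns) of Johnson-Cousins U,B,V,R,I.
    bands = np.array([0.36, 0.44, 0.55, 0.64, 0.79])
    p_obs = np.array([0.80, 0.95, 1.00, 0.97, 0.88])   # per cent, illustrative

    popt, _ = curve_fit(serkowski, bands, p_obs, p0=[1.0, 0.55, 1.15])
    print("p_max=%.2f%%  lambda_max=%.2f um  K=%.2f" % tuple(popt))
    ```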

    Consumer responses to corporate social irresponsibility: The role of moral emotions, evaluations, and social cognitions

    We investigate the mediating roles of moral emotions and attitudes between perceptions of corporate irresponsible actions, on the one hand, and consumer responses, on the other hand, and further examine their contingencies based on consumer social cognitions. Our findings show that, for corporate transgressions, multiple social cognitions (moral identity, relational and collective self-concepts, and affective empathy) moderate the elicitation of negative moral emotions (contempt and anger) and overall evaluations (attitudes), which, in turn, lead to negative responses toward the company (negative word of mouth, complaint behaviors, and boycotting). Our study adds to extant research on corporate social irresponsibility by examining three generic reactions people have toward corporate social irresponsibility and demonstrating important boundary conditions. In addition, hypotheses are tested on a sample of adult consumers. Implications for communication by firms are considered.
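    The estimation method is not specified in the abstract; as a hypothetical sketch of the moderated-mediation structure it describes (perceived irresponsibility eliciting a negative moral emotion that drives negative word of mouth, with a social cognition such as empathy moderating the first path), the following fits two ordinary-least-squares regressions on synthetic data. Variable names and coefficients are invented for illustration; this is not the authors' model or dataset.

    ```python
    # Hypothetical sketch of a moderated-mediation structure on synthetic data:
    # irresponsibility -> anger -> negative word of mouth, with empathy
    # moderating the first path.  OLS only; not the authors' model or data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    irresp = rng.normal(size=n)             # perceived irresponsible action
    empathy = rng.normal(size=n)            # moderator (social cognition)
    anger = 0.5 * irresp + 0.3 * irresp * empathy + rng.normal(size=n)
    nwom = 0.6 * anger + 0.1 * irresp + rng.normal(size=n)

    df = pd.DataFrame(dict(irresp=irresp, empathy=empathy, anger=anger, nwom=nwom))

    # Path a (moderated): emotion regressed on perception x moderator.
    path_a = smf.ols("anger ~ irresp * empathy", data=df).fit()
    # Path b and direct effect: outcome regressed on emotion and perception.
    path_b = smf.ols("nwom ~ anger + irresp", data=df).fit()
    print(path_a.params, path_b.params, sep="\n")
    ```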

    Exoplanetary atmosphere target selection in the era of comparative planetology

    The large number of new planets expected from wide-area transit surveys means that follow-up transmission spectroscopy studies of their atmospheres will be limited by the availability of telescope assets. We argue that telescopes covering a broad range of apertures will be required, with even 1m-class instruments providing a potentially important contribution. Survey strategies that employ automated target selection will enable robust population studies. As part of such a strategy, we propose a decision metric to pair the best target to the most suitable telescope, and demonstrate its effectiveness even when only primary transit observables are available. Transmission spectroscopy target selection need not therefore be impeded by the bottleneck of requiring prior follow-up observations to determine the planet mass. The decision metric can be easily deployed within a distributed heterogeneous network of telescopes equipped to undertake either broadband photometry or spectroscopy. We show how the metric can be used either to optimise the observing strategy for a given telescope (e.g. choice of filter) or to enable the selection of the best telescope to optimise the overall sample size. Our decision metric can also provide the basis for a selection function to help evaluate the statistical completeness of follow-up transmission spectroscopy datasets. Finally, we validate our metric by comparing its ranked set of targets against lists of planets that have had their atmospheres successfully probed, and against some existing prioritised exoplanet lists. Comment: 20 pages, 16 figures, 3 tables. Revision 3, accepted by MNRAS. Improvements include always using planetary masses where available and reliable, treatment for sky backgrounds and out-of-transit noise, and a use case for defocused photometry.
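    The paper's decision metric itself is not reproduced in the abstract; a commonly used proxy for ranking transmission-spectroscopy targets from primary-transit observables alone scales the expected signal with the atmospheric scale height, as sketched below. The mass-radius guess, parameter values, and function name are assumptions for illustration, not the authors' metric.

    ```python
    # Hedged sketch of a transmission-spectroscopy ranking proxy built only
    # from primary-transit observables (not the paper's actual metric):
    #   signal ~ 2 * H * Rp / Rs^2, with scale height H = k T / (mu g),
    # and the planet mass guessed from a crude, assumed mass-radius relation.
    import math

    K_B = 1.380649e-23          # Boltzmann constant, J/K
    G = 6.674e-11               # gravitational constant, SI
    R_JUP, M_JUP = 7.1492e7, 1.898e27
    R_SUN = 6.957e8
    AMU = 1.66054e-27

    def transmission_signal(rp_rjup, rs_rsun, t_eq_K, mu_amu=2.3):
        rp = rp_rjup * R_JUP
        rs = rs_rsun * R_SUN
        # Crude constant-density mass guess so no measured mass is needed.
        mp = M_JUP * rp_rjup ** 3
        g = G * mp / rp ** 2
        H = K_B * t_eq_K / (mu_amu * AMU * g)      # scale height, m
        return 2.0 * H * rp / rs ** 2              # fractional depth change

    # Illustrative hot Jupiter: larger values rank higher for follow-up.
    print(f"{transmission_signal(1.3, 1.0, 1400):.2e}")
    ```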

    Abyssinia

