444 research outputs found

    A Theoretical Analysis of Multiproduct Mergers: Application in the Major Meat Processing Sectors

    The research is motivated by the significant increase in multiproduct mergers in the meat-protein processing sector, whereby the largest firms now process beef, pork, and chicken. This thesis conducts a theoretical merger analysis that accounts for both within- and across-submarket substitution among demand-related goods. The model developed is suitable for analyzing markets in which there are identifiable consumer submarkets within a larger market. The results indicate two primary findings. First, Bertrand firms have a unilateral incentive to merge: firms involved in a given merger increase their profit, as do firms not included in the merger. Second, without sufficient realized scope economies for the merged firm, significant anticompetitive price increases are likely; moreover, as the degrees of substitutability within and across submarkets approach each other in magnitude, the cost reductions required for welfare neutrality grow sharply. Guidelines for future empirical analysis are also discussed.
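
    The unilateral merger incentive described above can be illustrated with a toy Bertrand model. The sketch below is not the thesis model: it uses a single linear demand system with three symmetric single-product firms, illustrative parameter values (a, b, d, c), and a crude gradient-based best-response dynamic to find equilibrium prices. It only shows the qualitative pattern that, with prices as strategic complements, both the merging firms and the outsider gain.

    # Toy Bertrand model with linear differentiated demand (illustrative only).
    # q_i = a - b*p_i + d * (sum of rival prices); all parameter values are assumptions.
    import numpy as np

    a, b, d, c = 10.0, 1.0, 0.4, 1.0   # intercept, own-price slope, cross-price effect, marginal cost
    n = 3                              # three single-product firms

    def demand(p):
        return a - b * p + d * (p.sum() - p)

    def profits(p, owners):
        # total profit per owner, where owners[i] is the owner of product i
        pi = (p - c) * demand(p)
        return {o: sum(pi[i] for i in range(n) if owners[i] == o) for o in set(owners)}

    def equilibrium(owners, iters=4000, step=0.01):
        # crude ascent: each owner nudges each of its prices uphill on its own joint profit
        p = np.full(n, c + 1.0)
        for _ in range(iters):
            grad = np.zeros(n)
            for i in range(n):
                e = np.zeros(n); e[i] = 1e-5
                o = owners[i]
                grad[i] = (profits(p + e, owners)[o] - profits(p - e, owners)[o]) / 2e-5
            p = p + step * grad
        return p

    pre  = equilibrium([0, 1, 2])      # three independent firms
    post = equilibrium([0, 0, 2])      # firms 0 and 1 merge; firm 2 stays outside

    print("pre-merger  prices", pre.round(2),  "profits", profits(pre,  [0, 1, 2]))
    print("post-merger prices", post.round(2), "profits", profits(post, [0, 0, 2]))
    # Expected pattern: the merged entity's joint profit exceeds the sum of its
    # pre-merger profits, and the non-merging firm's profit rises as well.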

    Perspectives on climate change mitigation

    1. Warming and associated climate effects from CO2 emissions persist for decades to millennia. In the near term, changes in climate are determined by past and present greenhouse gas emissions, modified by natural variability. Reducing the total concentration of atmospheric CO2 is necessary to limit near-term climate change and stay below long-term warming targets (such as the oft-cited 3.6°F [2°C] goal). Other greenhouse gases (for example, methane) and black carbon aerosols exert stronger warming effects than CO2 on a per ton basis, but they do not persist as long in the atmosphere; therefore, mitigation of non-CO2 species contributes substantially to near-term cooling benefits but cannot be relied upon for ultimate stabilization goals. (Very high confidence)
    2. Stabilizing global mean temperature below long-term warming targets requires an upper limit on the accumulation of CO2 in the atmosphere. The relationship between cumulative CO2 emissions and global temperature response is estimated to be nearly linear. Nevertheless, in evaluating specific temperature targets, there are uncertainties about the exact amount of compatible anthropogenic CO2 emissions due to uncertainties in climate sensitivity, the response of the carbon cycle including feedbacks, the amount of past CO2 emissions, and the influence of past and future non-CO2 species. (Very high confidence)
    3. Stabilizing global mean temperature below 3.6°F (2°C) or lower relative to preindustrial levels requires significant reductions in net global CO2 emissions relative to present-day values before 2040, and likely requires net emissions to become zero or possibly negative later in the century. Accounting for the temperature effects of non-CO2 species, cumulative CO2 emissions are required to stay below about 800 GtC in order to provide a two-thirds likelihood of preventing 3.6°F (2°C) of warming, meaning approximately 230 GtC more could be emitted globally. Assuming global emissions follow the range between the RCP8.5 and RCP4.5 scenarios, emissions could continue for approximately two decades before this cumulative carbon threshold is exceeded. (High confidence)
    4. Successful implementation of the first round of Nationally Determined Contributions associated with the Paris Agreement will provide some likelihood of meeting the long-term temperature goal of limiting global warming to “well below” 3.6°F (2°C) above preindustrial levels; the likelihood depends strongly on the magnitude of global emission reductions after 2030. (High confidence)
    5. Climate intervention or geoengineering strategies, such as solar radiation management, are measures that attempt to limit or reduce global temperature increases. If interest in geoengineering increases with observed impacts and/or projected risks of climate change, interest will also increase in assessments of the technical feasibilities, costs, risks, co-benefits, and governance challenges of these additional measures, which are as yet unproven at scale. These assessments are a necessary step before judgments about the benefits and risks of these approaches can be made with high confidence. (High confidence)
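
    The carbon-budget arithmetic in finding 3 can be checked with a short back-of-the-envelope calculation. The 800 GtC budget and the roughly 230 GtC of remaining allowable emissions are taken from the text above; the present-day global emission rate of roughly 10 to 11.5 GtC per year used below is an assumed illustrative value, not a figure from the source.

    # Back-of-the-envelope check of the cumulative carbon budget in finding 3.
    budget_gtc    = 800.0                  # total budget for ~2/3 chance of staying below 2 degrees C (from the text)
    remaining_gtc = 230.0                  # remaining allowable emissions (from the text)
    emitted_gtc   = budget_gtc - remaining_gtc
    print(f"implied emissions to date: about {emitted_gtc:.0f} GtC")

    for rate_gtc_per_yr in (10.0, 11.5):   # assumed global emission rates (illustrative)
        years = remaining_gtc / rate_gtc_per_yr
        print(f"at {rate_gtc_per_yr:.1f} GtC/yr the remaining budget lasts about {years:.0f} years")
    # Roughly 20-23 years, consistent with the 'approximately two decades' stated above.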

    Quantifying uncertainties in projections of extremes—a perturbed land surface parameter experiment

    Uncertainties in the climate response to a doubling of atmospheric CO2 concentrations are quantified in a perturbed land surface parameter experiment. The ensemble of 108 members is constructed by systematically perturbing five poorly constrained land surface parameters of a global climate model, individually and in all possible combinations. The land surface parameters induce small uncertainties at the global scale, substantial uncertainties at regional and seasonal scales, and very large uncertainties in the tails of the distribution, i.e., the climate extremes. Climate sensitivity varies across the ensemble mainly due to the perturbation of the snow albedo parameterization, which controls the strength of the snow albedo feedback. The uncertainty range in the global response is small relative to perturbed physics experiments focusing on atmospheric parameters. However, land surface parameters are found to control the response not only of the mean but also of the variability of temperature. Major uncertainties are identified in the response of climate extremes to a doubling of CO2. During winter, the response of both the temperature mean and its daily variability relates to fractional snow cover. Cold extremes over high latitudes warm disproportionately in ensemble members with a strong snow albedo feedback and a large snow cover reduction. Reduced snow cover leads to more winter warming and a stronger decrease in variability. As a result, uncertainties in the mean and variability responses line up, with some members showing weak and others very strong warming of the cold tail of the distribution, depending on the snow albedo parameterization. The uncertainty across the ensemble regionally exceeds the CMIP3 multi-model range. For summer hot extremes, the uncertainties are larger than for mean summer warming but smaller than in multi-model experiments. The summer precipitation response to a doubling of CO2 is not robust over many regions: land surface parameter perturbations and natural variability alter the sign of the response even over subtropical regions.
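
    As a rough illustration of how a perturbed-parameter design of this kind can be enumerated, the sketch below builds every combination of default and perturbed settings for five hypothetical land surface parameters. The parameter names and values are placeholders, and the enumeration does not reproduce the 108-member design of the study; it only shows the "individually and in all possible combinations" construction.

    # Sketch of enumerating a perturbed-parameter ensemble: every combination of
    # default and perturbed settings, which automatically includes the cases where
    # a single parameter is perturbed on its own. Names and values are hypothetical.
    from itertools import product

    params = {                                 # parameter: (default, perturbed values)
        "snow_albedo":         (0.75, [0.65, 0.85]),
        "root_depth_m":        (1.0,  [0.5, 2.0]),
        "stomatal_resistance": (100., [50., 200.]),
        "soil_moisture_cap":   (0.30, [0.20, 0.40]),
        "surface_roughness":   (0.10, [0.05, 0.20]),
    }

    names   = list(params)
    choices = [[default] + perturbed for default, perturbed in params.values()]

    ensemble = [dict(zip(names, combo)) for combo in product(*choices)]
    members  = [m for m in ensemble
                if any(m[k] != params[k][0] for k in names)]   # drop the all-default control

    print(f"{len(members)} perturbed members plus 1 control run")  # 3**5 - 1 = 242 with these placeholder values
    print(members[0])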

    How Following Regulatory Guidance Can Increase Auditors’ Litigation Risk Exposure

    This study investigates how following explicit regulatory guidance can unintentionally increase auditors’ litigation risk exposure. We do so by examining the unique and specific context in which the PCAOB directly instructs auditors how to apply professional judgment: to rely on a client’s competent and objective internal audit function (IAF) during multi-location audits. Consistent with theoretical predictions based on numerosity heuristic processing and norm theory, we find that, holding all other factors constant, following explicit regulatory advice not only fails to limit auditors’ litigation risk but can actually increase jurors’ assessments of auditor negligence. Because the numerosity heuristic leads jurors to believe that there is a higher likelihood of misstatement on multi-location than on single-location audits, jurors perceive that auditor reliance on the IAF during multi-location audits is not normal. Accordingly, they judge auditors to be more negligent when they rely on the IAF in multi-location audits than when they do not, whereas IAF reliance does not affect negligence assessments on single-location audits. Our results suggest that auditor reluctance to use a qualified IAF, despite client and regulatory pressure, can be a rational and defensible strategy for limiting litigation risk exposure.

    Applying Machine Learning Techniques To Intermediate-Length Cascade Decays

    In the collider phenomenology of extensions of the Standard Model with partner particles, cascade decays occur generically, and they can be challenging to discover when the spectrum of new particles is compressed and the signal cross section is low. Achieving discovery-level significance and measuring the properties of the new particles appearing as intermediate states in the cascade decays is a longstanding problem, with analysis techniques for some decay topologies already optimized. We focus our attention on a benchmark decay topology with four final-state particles where there is room for improvement, and where multidimensional analysis techniques have been shown to be effective in the past. Using machine learning techniques, we identify the optimal kinematic observables for discovery, spin determination, and mass measurement. In agreement with past work, we confirm that the kinematic observable Δ4 is highly effective. We quantify the achievable accuracy of spin determination and the precision of mass measurements as a function of the signal size. Comment: 28 pages, 12 figures.
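
    The general strategy of letting a machine learning model single out the most discriminating kinematic observables can be sketched as follows. The example below is not the analysis of the paper: it trains a gradient-boosted classifier on synthetic signal and background samples with made-up observable names (including a stand-in for Δ4) and ranks the observables by permutation importance.

    # Schematic example: rank kinematic observables by how much a trained classifier
    # relies on them (permutation importance). Event samples and observable names
    # are synthetic placeholders, not the paper's benchmark topology.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    names = ["delta4_proxy", "m_inv", "pT_lead", "missing_ET"]

    # only the first two synthetic observables actually separate signal from background
    background = np.column_stack([rng.normal(0.0, 1.0, n), rng.exponential(1.0, n),
                                  rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)])
    signal     = np.column_stack([rng.normal(1.5, 1.0, n), rng.exponential(2.0, n),
                                  rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)])
    X = np.vstack([background, signal])
    y = np.concatenate([np.zeros(n), np.ones(n)])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
    imp = permutation_importance(clf, X_te, y_te, n_repeats=5, random_state=0)
    for name, score in sorted(zip(names, imp.importances_mean), key=lambda t: -t[1]):
        print(f"{name:12s} importance {score:.3f}")
    # The two discriminating observables should dominate the ranking; in the paper's
    # setting that role is played by optimized observables such as Δ4.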