10,772 research outputs found
Monopoly quality degradation and regulation in cable television
Using an empirical framework based on the Mussa-Rosen model of monopoly quality choice, we calculate the degree of quality degradation in cable television markets and the impact of regulation on those choices. We find lower bounds of quality degradation ranging from 11 to 45 percent of offered service qualities. Furthermore, cable operators in markets with local regulatory oversight offer significantly higher quality, less degradation, and greater quality per dollar, despite higher prices.
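To see where such degradation comes from, a minimal two-type version of the Mussa-Rosen screening problem is useful (a textbook sketch with assumed utility $\theta q - p$ and quality cost $c(q) = q^2/2$, not the paper's empirical framework). A fraction $\mu$ of subscribers value quality at $\theta_H$, the rest at $\theta_L < \theta_H$. Efficient qualities satisfy $c'(q_i) = \theta_i$, so $q_i^* = \theta_i$. A monopolist who cannot observe types solves

\[ \max_{q_L,\,q_H}\; (1-\mu)\Big[\theta_L q_L - \tfrac{1}{2} q_L^2\Big] + \mu\Big[\theta_H q_H - \tfrac{1}{2} q_H^2 - (\theta_H - \theta_L) q_L\Big], \]

where the last term is the information rent conceded to high-valuation subscribers. The first-order condition for $q_L$ yields

\[ q_L^m = \theta_L - \frac{\mu}{1-\mu}(\theta_H - \theta_L) \;<\; q_L^*, \qquad q_H^m = q_H^*, \]

so quality is degraded at the low end by the fraction $(q_L^* - q_L^m)/q_L^* = \frac{\mu}{1-\mu}\,\frac{\theta_H - \theta_L}{\theta_L}$, while there is no distortion at the top.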
Calculating Cost Savings Per Acre When Harvest Days are Stochastic
New cotton harvesters have been introduced that offer a higher performance rate and eliminate extra labor and accompanying equipment. The new machines build partial modules on board the harvester. Higher field efficiency (performance rate) lets a farmer harvest his cotton in a shorter period. Precipitation causes losses in both the quality and the quantity of the cotton. This paper seeks to measure cost per acre when harvest days are stochastic by using historic precipitation data. Cost per acre includes the cost of losses, derived from a precipitation loss function, and is adjusted for conventional versus new technology by quantifying the losses that contribute to the extra costs of extended harvesting.
Keywords: cotton, harvester, fieldwork days, stochastic, cost per acre, Agribusiness, Farm Management
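A minimal sketch of the kind of loss calculation the abstract describes (Python; the rain probability, crop value, and linear loss function are illustrative assumptions, not the paper's estimates):

    # Illustrative assumptions: rain frequency from historic precipitation
    # data, and a linear loss function charging a fraction of crop value
    # per rain day that falls inside the harvest window.
    p_rain = 0.25              # share of harvest-window days with rain (assumed)
    value_per_acre = 900.0     # gross cotton value, $/acre (assumed)
    loss_per_rain_day = 0.004  # fraction of value lost per rain day (assumed)

    def expected_loss_cost(harvest_days: int) -> float:
        """Expected precipitation loss, in $/acre, over a harvest window."""
        expected_rain_days = p_rain * harvest_days
        return value_per_acre * loss_per_rain_day * expected_rain_days

    # A higher-performance machine shortens the window and hence the loss:
    for days in (30, 45, 60):
        print(f"{days} harvest days -> ${expected_loss_cost(days):.2f}/acre expected loss")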
Estimating Cotton Harvest Cost per Acre When Harvest Days are Stochastic
The cotton harvesting industry is at the beginning of its next technological advance: cotton harvesters that form cotton modules inside the machine and then deposit them off the rows. These new machines eliminate the need for extra labor and equipment but are more expensive than conventional pickers. Increased field efficiency is also a benefit of the on-board module builders. The problem facing producers is determining the optimal number of acres to plan for harvest when deciding which harvester to purchase. This paper has two objectives: first, to determine the cost per acre of both conventional and on-board module harvester systems at different acreage levels, assuming harvest hours per year are fixed; second, to make harvest hours per season stochastic and determine the cost per acre under different farm sizes for each type of cotton picker. The results show that the maximum benefits of the new machines are realized on larger farms, when a larger number of acres must be harvested in the harvest period. The results should help farmers plan their cotton acreage estimates as well as their purchase decisions for new cotton pickers.
Keywords: cotton harvester, harvest hours, cost per acre, field efficiency, on-board module builder, Crop Production/Industries, Financial Economics, Risk and Uncertainty
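A rough Monte Carlo sketch of the second objective (Python; the machine costs, work rates, penalty, and the distribution of harvest hours are all assumptions for illustration, not the paper's data):

    import numpy as np

    rng = np.random.default_rng(0)

    # (annual ownership + operating cost in $, acres harvested per hour) -- assumed
    systems = {
        "conventional picker":    (150_000, 6.0),
        "on-board module picker": (230_000, 9.0),
    }

    def cost_per_acre(acres, annual_cost, rate, hours, penalty=25.0):
        """Cost per acre for a season offering a given number of harvest hours.
        Acres left unharvested when hours run out incur a $/acre penalty."""
        unharvested = max(acres - rate * hours, 0.0)
        return (annual_cost + penalty * unharvested) / acres

    # Stochastic harvest hours per season (weather data would calibrate this).
    hours = rng.normal(loc=300, scale=60, size=10_000).clip(min=0)

    for acres in (1_000, 2_000, 4_000):
        for name, (annual_cost, rate) in systems.items():
            avg = np.mean([cost_per_acre(acres, annual_cost, rate, h) for h in hours])
            print(f"{acres:>5} acres, {name}: ${avg:.2f}/acre")

The higher fixed cost of the on-board module machine is spread over more acres as farm size grows, while its higher work rate lowers the chance of running out of harvest hours; that is the trade-off being quantified.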
Dynamic Scoring: A Back-of-the-Envelope Guide
This paper uses the neoclassical growth model to examine the extent to which a tax cut pays for itself through higher economic growth. The model yields simple expressions for the steady-state feedback effect of a tax cut. The feedback is surprisingly large: for standard parameter values, half of a capital tax cut is self-financing. The paper considers various generalizations of the basic model, including elastic labor supply, departures from infinite horizons, and non-neoclassical production settings. It also examines how the steady-state results are modified when one considers the transition path to the steady state.
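A back-of-the-envelope version of the steady-state feedback can be derived in a stripped-down Ramsey model with Cobb-Douglas production $f(k) = k^\alpha$, discount rate $\rho$, and a capital tax $\tau$ only (a sketch under these assumptions, not the paper's full treatment). In steady state the after-tax return equals the discount rate, $(1-\tau) f'(k) = \rho$, which pins down $k(\tau) = [\alpha(1-\tau)/\rho]^{1/(1-\alpha)}$. Capital-tax revenue is $R(\tau) = \tau f'(k) k = \tau \alpha k^\alpha$, so

\[ \frac{dR}{d\tau} = \alpha k^\alpha \left[ 1 - \frac{\alpha}{1-\alpha}\,\frac{\tau}{1-\tau} \right], \]

whereas a static score holds $k$ fixed and gives $\partial R/\partial\tau = \alpha k^\alpha$. The self-financing share is therefore

\[ 1 - \frac{dR/d\tau}{\alpha k^\alpha} = \frac{\alpha}{1-\alpha}\,\frac{\tau}{1-\tau}, \]

about 17 percent for $\alpha = 1/3$ and $\tau = 1/4$; feedbacks as large as the one-half figure cited above arise once general-equilibrium effects on wages and labor-tax revenue are included.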
News or Noise? An Analysis of GNP Revisions
This paper studies the nature of the errors in preliminary GNP data. It first documents that these errors are large. For example, suppose the preliminary estimate indicates that real GNP did not change over the recent quarter; then one can be only 80 percent confident that the final estimate (annual rate) will be in the range from -2.8 percent to +2.8 percent. The paper also documents that the revisions in GNP data are not forecastable. This finding implies that the preliminary estimates are efficient given available information. Hence, the Bureau of Economic Analysis appears to follow efficient statistical procedures in making its preliminary estimates.
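The forecastability test has a simple regression form; here is a sketch on synthetic data (Python; all numbers are made up for illustration):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Under the "news" hypothesis the preliminary estimate is an efficient
    # forecast: final = preliminary + news, with news orthogonal to the
    # preliminary estimate (synthetic data below).
    preliminary = rng.normal(3.0, 2.0, size=200)
    news = rng.normal(0.0, 1.4, size=200)
    final = preliminary + news
    revision = final - preliminary

    # Regress the revision on the preliminary estimate. A slope of zero is
    # consistent with "news" (efficient preliminary data); a negative slope
    # would indicate "noise" (measurement error in the preliminary number).
    ols = sm.OLS(revision, sm.add_constant(preliminary)).fit()
    print(ols.params, ols.pvalues)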
Risk and Return: Consumption versus Market Beta
The interaction between the macroeconomy and asset markets is central to a variety of modern theories of the business cycle. Much recent work emphasizes the joint nature of the consumption decision and the portfolio allocation decision. In this paper, we compare two formulations of the Capital Asset Pricing Model. The traditional CAPM suggests that the appropriate measure of an asset's risk is the covariance of the asset's return with the market return. The consumption CAPM, on the other hand, implies that a better measure of risk is the covariance with aggregate consumption growth. We examine a cross section of 464 stocks and find that the beta measured with respect to a stock market index outperforms the beta measured with respect to consumption growth.
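Both betas reduce to a covariance ratio, so the comparison is easy to state; a minimal sketch on synthetic series (Python; the return and consumption processes are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(2)
    T = 160  # quarters of synthetic data

    market_ret = rng.normal(0.02, 0.08, size=T)                        # market index return
    cons_growth = 0.2 * market_ret + rng.normal(0.005, 0.01, size=T)   # consumption growth
    stock_ret = 1.1 * market_ret + rng.normal(0.0, 0.05, size=T)       # one stock's return

    def beta(asset, factor):
        """Beta of an asset with respect to a factor: cov(asset, factor) / var(factor)."""
        return np.cov(asset, factor, ddof=1)[0, 1] / np.var(factor, ddof=1)

    print("market beta:     ", beta(stock_ret, market_ret))
    print("consumption beta:", beta(stock_ret, cons_growth))

In the paper's cross-sectional test, average returns on the 464 stocks are then related to each kind of beta to see which one prices the cross section better.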
The Optimal Taxation of Height: A Case Study of Utilitarian Income Redistribution
Should the income tax include a credit for short taxpayers and a surcharge for tall ones? The standard Utilitarian framework for tax analysis answers this question in the affirmative. Moreover, a plausible parameterization using data on height and wages implies a substantial height tax: a tall person earning $50,000 should pay $4,500 more in tax than a short person. One interpretation is that personal attributes correlated with wages should be considered more widely for determining taxes. Alternatively, if policies such as a height tax are rejected, then the standard Utilitarian framework must fail to capture intuitive notions of distributive justice.
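The mechanism is the classic "tagging" argument (Akerlof 1978), sketched here in stripped-down form (a simplification with group-specific lump-sum transfers, not the paper's full optimal-tax computation). If height $g$ is exogenous and correlated with wages, the planner can attach a transfer $T_g$ to each height group; utilitarian optimality then equalizes the average marginal utility of consumption across groups,

\[ \mathbb{E}\big[\,u'(c) \mid g = \text{tall}\,\big] = \mathbb{E}\big[\,u'(c) \mid g = \text{short}\,\big]. \]

Because tall taxpayers have higher average wages, and hence lower average marginal utility before transfers, this condition pushes resources from tall to short: a height surcharge.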
Semantic Modeling of Analytic-based Relationships with Direct Qualification
Successfully modeling state- and analytics-based semantic relationships of documents enhances the representation, importance, relevancy, provenance, and priority of the document. These attributes are the core elements that form the machine-based knowledge representation for documents. However, modeling document relationships that can change over time can be inelegant, limited, complex, or overly burdensome for semantic technologies. In this paper, we present Direct Qualification (DQ), an approach for modeling any semantically referenced document, concept, or named graph with results from associated applied analytics. The proposed approach supplements the traditional subject-object relationships by providing a third leg to the relationship: the qualification of how and why the relationship exists. To illustrate, we show a prototype of an event-based system with a realistic use case for applying DQ to relevancy analytics of PageRank and Hyperlink-Induced Topic Search (HITS).
Comment: Proceedings of the 2015 IEEE 9th International Conference on Semantic Computing (IEEE ICSC 2015)
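As a rough illustration of qualifying a relationship with an analytics result, here is plain RDF reification via rdflib (Python); the vocabulary, resource names, and score are invented for the example, and this is not the paper's DQ ontology:

    from rdflib import Graph, Literal, Namespace, BNode
    from rdflib.namespace import RDF, XSD

    EX = Namespace("http://example.org/")
    g = Graph()

    # The base subject-predicate-object relationship.
    g.add((EX.doc42, EX.relevantTo, EX.topicElections))

    # The "third leg": a statement about that statement, recording how and
    # why the relationship holds -- here, a PageRank relevancy score.
    stmt = BNode()
    g.add((stmt, RDF.type, RDF.Statement))
    g.add((stmt, RDF.subject, EX.doc42))
    g.add((stmt, RDF.predicate, EX.relevantTo))
    g.add((stmt, RDF.object, EX.topicElections))
    g.add((stmt, EX.qualifiedBy, EX.pageRankAnalysis))
    g.add((stmt, EX.score, Literal(0.87, datatype=XSD.double)))

    print(g.serialize(format="turtle"))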