
    Does Information Matter? Assessing the Role of Information and Prices in the Nitrogen Fertilizer Management Decision

    This article investigates the impact of agronomic, environmental, and price information on the nitrogen fertilizer management decision. Because excessive nitrogen originating from agricultural production activities can cause environmental degradation, understanding how information influences the nutrient application decision in the field is important for developing strategies for nitrogen load mitigation. I investigate the value farmers place on the N management information they receive from several sources. In particular, I evaluate how farmers use information from soil N-tests to decide the rate of N to apply to the field. My results show that soil N-testing can be an effective management practice for reducing excess N applications. I find that farmers who use a soil test reduce their use of commercial N by up to 14 lbs/ac relative to non-testers. I also find new evidence that rising fertilizer prices encourage farmers to manage N more carefully, estimating a price elasticity of demand of between -0.6 and -1.29. I also show that prices play a role in other forms of N management behavior, including application method and timing.
    Keywords: Nitrogen Fertilizer Application, Soil N-testing, Agronomic Information, Best Management Practices, Nonpoint Source Pollution, Demand and Price Analysis, Environmental Economics and Policy; JEL: Q24, Q28
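    As a back-of-the-envelope illustration of what an elasticity in the estimated range implies, the sketch below converts a hypothetical price change into an approximate demand response (the 10% price rise is illustrative, not a figure from the study):

```python
# Illustrative only: what a price elasticity of demand between -0.6 and
# -1.29 implies for nitrogen use under a hypothetical price change.
def demand_change(price_change_pct, elasticity):
    """Approximate % change in quantity demanded for a given % price change."""
    return elasticity * price_change_pct

for e in (-0.6, -1.29):
    print(f"elasticity {e}: a 10% fertilizer price rise -> "
          f"{demand_change(10, e):.1f}% change in N applied")
```

    So even at the inelastic end of the range, a sustained 10% price rise would be expected to cut N applications by roughly 6%.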

    Disaster Relief through the Tax Code: Hurricane Katrina and the Gulf Opportunity Zone

    This project investigates the impact of geographically targeted Federal tax relief enacted in the wake of Hurricane Katrina in 2005. To facilitate administration of relief efforts and define eligibility for the temporary tax law changes, the Gulf Opportunity Zone (GO Zone) was created. We estimate the initial impacts of these tax incentives using propensity score matching (PSM) and Mahalanobis metric matching (MM) methods, combined with difference-in-difference (DD) estimation, to limit the confounding influences of observable and fixed unobservable differences between counties affected by these incentives and similarly storm-damaged counties in the region that were not included in the GO Zone. Results show that per capita personal income and net earnings increased more rapidly in GO Zone counties that experienced minimal storm damage than in similar non-GO Zone counties in the GO Zone States and neighboring States.
    Keywords: Public Economics; JEL: H2, H24, H25
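    The difference-in-difference logic described above can be sketched numerically; the county outcome values below are made up for illustration and are not the paper's data:

```python
import numpy as np

# Illustrative pre/post outcomes (e.g. per capita income, $1000s) for
# treated (GO Zone) and matched control counties -- made-up numbers.
pre_treat = np.array([30.1, 28.4, 31.0])
post_treat = np.array([33.2, 31.5, 34.8])
pre_ctrl = np.array([29.8, 27.9, 30.6])
post_ctrl = np.array([31.0, 29.2, 31.9])

# DD estimate: the treated group's pre/post change, net of the change the
# matched controls experienced over the same period.
did = (post_treat.mean() - pre_treat.mean()) - (post_ctrl.mean() - pre_ctrl.mean())
```

    Matching (PSM or Mahalanobis) serves to pick the control counties so that the second difference nets out common shocks to observably similar areas.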

    The Effect of the Housing Boom on Farm Land Values via Tax-Deferred Exchanges

    This project examines Section 1031 of the Internal Revenue Code and agricultural land exchanges. Stakeholders in rural communities and agriculture are particularly interested in Section 1031 because the recent growth in transaction values of farmland may, in part, have been stimulated by Section 1031 land exchanges. Further, although many have speculated that such exchanges are widely used, little empirical research exists on the provision. We examine the theory of exchanges and develop a theoretical premium value for exchanges. We also present the first evidence of like-kind exchanges involving farmland using Federal tax data.
    Keywords: Like-Kind Exchange, Capital Gains Tax, Agricultural Land, Land Economics/Use, Public Economics; JEL: Q15, H24

    Efficient calibration for high-dimensional computer model output using basis methods

    Calibration of expensive computer models with high-dimensional output fields can be approached via history matching. If the entire output field is matched, with patterns or correlations between locations or time points represented, calculating the distance metric between observational data and model output for a single input setting requires a time-intensive inversion of a high-dimensional matrix. By using a low-dimensional basis representation rather than emulating each output individually, we define a metric in the reduced space that allows the implausibility for the field to be calculated efficiently, with only small matrix inversions required, using a projection that is consistent with the variance specifications in the implausibility. We show that projection using the L2 norm can result in different conclusions, with the ordering of points not maintained on the basis, with implications for both history matching and probabilistic methods. We demonstrate the scalability of our method through history matching of the Canadian atmosphere model, CanAM4, comparing basis methods to emulation of each output individually, and showing that the basis approach can be more accurate whilst also being more efficient.
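    A rough sketch of the kind of reduced-space implausibility calculation described above. The field size, basis size, diagonal variance, and variance-weighted projection are illustrative assumptions, not the paper's exact formulation; the point is that only small (basis-sized) systems are ever solved:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a 10,000-point output field reduced to a 5-vector basis
n_field, n_basis = 10_000, 5

# Orthonormal basis (columns), standing in for e.g. leading SVD directions
# extracted from an ensemble of model runs
basis, _ = np.linalg.qr(rng.standard_normal((n_field, n_basis)))

z = rng.standard_normal(n_field)      # observations on the field
f_x = rng.standard_normal(n_field)    # model output at one input setting
var = np.full(n_field, 0.5)           # diagonal variance (obs error + discrepancy)

# Variance-weighted projection onto the basis: c = (B' V^-1 B)^-1 B' V^-1 y.
# Only n_basis x n_basis systems need solving, never an n_field x n_field one.
Vinv_B = basis / var[:, None]
W = basis.T @ Vinv_B                  # B' V^-1 B, here just 5 x 5

def project(y):
    return np.linalg.solve(W, Vinv_B.T @ y)

# Implausibility in the reduced space: the coefficient-space variance is
# W^-1, so the Mahalanobis distance between coefficients uses W directly.
diff = project(z) - project(f_x)
implausibility = float(np.sqrt(diff @ W @ diff))
```

    With a dense field-level variance matrix the naive metric would need an n_field-sized inversion per candidate input; here the cost per input is dominated by two matrix-vector products and a 5 x 5 solve.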

    A 3/2-approximation algorithm for some minimum-cost graph problems

    We consider a class of graph problems, introduced in a paper by Goemans and Williamson, that involve finding forests of minimum edge cost. This class includes a number of location/routing problems; it also includes a problem in which we are given a parameter k as input and want to find a forest such that each component has at least k vertices. Goemans and Williamson gave a 2-approximation algorithm for this class of problems. We give an improved 3/2-approximation algorithm.

    Extraction-scintillation medium and method of use

    An extraction-scintillation medium of substantially free-flowing, porous, solid particulate matter having one or more fluors retained within the particulate matter and an extraction agent adsorbed on or bound to the surface of the particulate matter. The medium is capable of extracting a selected radionuclide from an aqueous stream and permits transmission of light therethrough, which light is emitted from the one or more fluors in response to radiation absorbed thereby from the selected radionuclide. A sensor system using the extraction-scintillation medium for real-time or near real-time detection of the selected radionuclide is also disclosed.

    Quantifying spatio-temporal boundary condition uncertainty for the North American deglaciation

    Ice sheet models are used to study the deglaciation of North America at the end of the last ice age (the past 21,000 years), so that we might understand whether and how existing ice sheets may shrink or disappear under climate change. Though ice sheet models have a few parameters controlling the physical behaviour of the ice mass, they also require boundary conditions for climate (spatio-temporal fields of temperature and precipitation, typically on regular grids and at monthly intervals). The behaviour of the ice sheet is highly sensitive to these fields, and there is relatively little data from geological records to constrain them, as the land was covered with ice. We develop a methodology for generating a range of plausible boundary conditions, using a low-dimensional basis representation of the spatio-temporal input. We derive this basis by combining key patterns, extracted from a small ensemble of climate model simulations of the deglaciation, with sparse spatio-temporal observations. By jointly varying the ice sheet parameters and basis vector coefficients, we run ensembles of the Glimmer ice sheet model that simultaneously explore both climate and ice sheet model uncertainties. We use these to calibrate the ice sheet physics and boundary conditions for Glimmer, by ruling out regions of the joint coefficient and parameter space via history matching. We use binary ice/no ice observations from reconstructions of past ice sheet margin position to constrain this space, introducing a novel metric for history matching to binary data.