ECONOMIC AND TECHNICAL ANALYSIS OF ETHANOL DRY MILLING: MODEL DESCRIPTION
Ethanol, the common name for ethyl alcohol, is fuel-grade alcohol that is predominantly produced through the fermentation of simple carbohydrates by yeasts. In the United States, the carbohydrate feedstock most commonly used in the commercial production of ethanol is yellow dent corn (YDC). The use of ethanol in combustion engines emits fewer greenhouse gases than its petroleum equivalent, and it is widely hoped that the increased substitution of ethanol for petroleum will reduce US dependence on imported oil and decrease greenhouse gas emissions. Production of ethanol within the United States is expected to double, from 3.4 billion gallons in 2004 to about seven billion gallons in the next five years. Two processes currently used to produce ethanol from YDC are dry milling and wet milling. The wet mill process is more versatile than the dry mill process in that it produces a greater variety of products (starch, corn syrup, ethanol, Splenda, etc.), which allows a wet mill to react better to market conditions. However, the costs of constructing and operating a wet mill are much greater than those of a dry mill. If ethanol is the target product, it can be produced at lower cost and more efficiently in a dry mill plant than in a wet mill plant under current economic conditions. Of the more than 70 US ethanol plants currently in production, only a few are of the wet mill variety. A descriptive engineering spreadsheet model (DM model) was developed to represent the dry mill ethanol production process. This model was created to better understand the economics of the ethanol dry mill production process and how the profitability of dry mill plants is affected under different conditions. It was also developed to determine the economic and environmental costs and benefits of utilizing new and different technologies in the dry mill process.
Specifically, this model was constructed to conduct an economic analysis of novel processes for obtaining greater alcohol yields in the dry mill process by conducting a secondary fermentation of sugars converted from lignocellulosics found in the dry mill co-product, distiller's grains. This research is being conducted at Purdue University, Michigan State, Iowa State, USDA, and NCAUR under a grant from the US Department of Energy. The DM model is more technically precise, and more transparent, than other models of the dry mill process that have been constructed for similar purposes. The Tiffany and Eidman model (TE model) uses broad generalities of the dry mill process, based on the current state of production, to approximate the sensitivities of the process to changes in variables. The TE model's parameters were well researched, but the model suffers from several drawbacks, the main one being that its results are very sensitive to the input values chosen by the user. Unlike with the DM model, complex manipulations, such as determining the effect of new technologies, would require accurate parameter estimates using the TE model. The McAloon model [11] uses highly technical engineering software (ASPEN) that acts essentially as a 'black box' for the dry mill production process. This very exact model does not allow for a more general examination of the dry mill process; changes in the production process would necessitate precise engineering plans. Like the TE and McAloon models, the DM model is a spreadsheet model, but unlike the McAloon model it is completely self-contained. The DM model is a feed-backward model: input requirements (corn, enzymes, chemicals, utilities, etc.) are calculated from the user-entered values for annual production and process parameters. The mass flow rates, in pounds per hour, are then calculated and used to estimate the size, in dimension or power, of each major piece of equipment.
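The feed-backward calculation described above can be sketched in a few lines; the plant capacity, per-bushel yield, and operating hours below are illustrative assumptions, not the DM model's actual parameter values:

```python
# Feed-backward sketch: work from the user-entered annual ethanol target
# back to corn input and hourly mass flow. All values are illustrative
# assumptions, not the DM model's actual parameters.

ANNUAL_GALLONS = 40e6      # user-entered plant capacity, gal/yr
GAL_PER_BUSHEL = 2.75      # typical dry-grind ethanol yield per bushel of corn
LB_PER_BUSHEL = 56.0       # standard weight of a bushel of shelled corn
HOURS_PER_YEAR = 350 * 24  # operating hours, assuming ~15 days of downtime

bushels_per_year = ANNUAL_GALLONS / GAL_PER_BUSHEL
corn_lb_per_hr = bushels_per_year * LB_PER_BUSHEL / HOURS_PER_YEAR
print(f"corn: {bushels_per_year:,.0f} bu/yr -> {corn_lb_per_hr:,.0f} lb/hr")
```

Hourly mass flows obtained this way are what the model then feeds into its equipment-sizing step.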
The cost associated with each piece of major equipment was then calculated as an exponential function of its corresponding size. Total capital costs associated with a dry mill plant were then estimated using the percentage-of-equipment-costs method [13]. It was found that the DM model estimates of the total capital costs associated with medium to large dry mill plants (those with the capacity to produce between 10 and 100 million gallons of ethanol a year) were within 5% of the total fixed costs estimated by BBI [2]. Operating costs were compared with the 2002 USDA survey results and were also found to be very close [15]. A companion document, "Economic and Technical Analysis of Dry Milling: Model User's Manual," staff paper no. 06-05, explains how the model is used to conduct analysis of dry milling alternatives.
Keywords: Ethanol, DDGS, Dry Milling, Biochemical Process Engineering, Economic Modeling, Financing, Fermentation Process Modeling
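The two costing steps described above (power-law scaling of equipment cost with size, followed by a factored estimate of total capital) can be sketched as follows; the exponent, reference sizes and costs, and overall factor are illustrative assumptions, not the DM model's fitted values:

```python
def equipment_cost(size, base_size, base_cost, exponent=0.6):
    # Exponential (power-law) scaling of cost with size; with an
    # exponent of 0.6 this is the classic "six-tenths rule"
    return base_cost * (size / base_size) ** exponent

# Illustrative equipment list: (size, reference size, reference cost in $)
equipment = [
    (750_000, 500_000, 1_200_000),  # fermenter, gal
    (150, 100, 900_000),            # distillation column, kW reboiler duty
    (80, 50, 400_000),              # decanter centrifuge, m^3/h
]
purchased = sum(equipment_cost(s, bs, bc) for s, bs, bc in equipment)

# Percentage-of-equipment-costs method: total capital is the purchased
# equipment cost times a factor covering installation, piping,
# instrumentation, buildings, engineering, and contingency (illustrative)
overall_factor = 4.0
total_capital = overall_factor * purchased
print(f"purchased equipment: ${purchased:,.0f}, total capital: ${total_capital:,.0f}")
```

Because the scaling exponent is below one, doubling a unit's size less than doubles its cost, which is what makes larger dry mill plants cheaper per gallon of capacity.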
Economic and Technical Analysis of Ethanol Dry Milling: Model User's Manual
Using the DM model is not complex: the user changes input values of interest (plant size, conversion rates, etc.) and examines the effect of these changes on output values (annual profits, feedstock requirements, etc.). There are nine worksheets in four modules in the Excel workbook: assumptions, process, economics, and technology assessment. All user inputs are entered in the assumptions module of the model, which consists of three worksheets denoted with bright yellow tabs: process assumptions, economic assumptions, and physical assumptions. The values entered on this page are then used in each of the subsequent modules to calculate hourly flow rates, equipment size and cost, total costs, loan terms, and annual profits. At the top of each page is a title bar that describes the page, the color coding of the cells, and pertinent information from the other pages. Before each of the pages is discussed, an explanation of the different types of cells in the model is in order.
Microwave cavity light shining through a wall optimization and experiment
It has been proposed that microwave cavities can be used in a photon
regeneration experiment to search for hidden sector photons. Using two isolated
cavities, the presence of hidden sector photons could be inferred from a 'light
shining through a wall' phenomenon. The sensitivity of the experiment has
a strong dependence on the geometric construction and electromagnetic mode
properties of the two cavities. In this paper we perform an in-depth
investigation to determine the optimal setup for such an experiment. We also
describe the results of our first microwave cavity experiment to search for
hidden sector photons. The experiment consisted of two cylindrical copper
cavities stacked axially inside a single vacuum chamber. At a hidden sector
photon mass of 37.78 μeV we place an upper limit on the kinetic mixing
parameter chi = 2.9 * 10^(-5). Whilst this result lies within already
established limits our experiment validates the microwave cavity `light shining
through a wall' concept. We also show that the experiment has great scope for
improvement, potentially able to reduce the current upper limit on the mixing
parameter chi by several orders of magnitude.
Comment: To be published in PR
Land sparing to make space for species dependent on natural habitats and high nature value farmland.
Empirical evidence from four continents indicates that human food demand may be best reconciled with biodiversity conservation through sparing natural habitats by boosting agricultural yields. This runs counter to the conservation paradigm of wildlife-friendly farming, which is influential in Europe, where many species depend on low-yielding high nature value farmland threatened by both intensification and abandonment. In the first multi-taxon population-level test of land-sparing theory in Europe, we quantified how population densities of 175 bird and sedge species varied with farm yield across 26 squares (each with an area of 1 km²) in eastern Poland. We discovered that, as in previous studies elsewhere, simple land sparing, with only natural habitats on spared land, markedly outperformed land sharing in its effect on region-wide projected population sizes. However, a novel 'three-compartment' land-sparing approach, in which about one-third of spared land is assigned to very low-yield agriculture and the remainder to natural habitats, resulted in the least-reduced projected future populations for more species. Implementing the three-compartment model would require significant reorganization of current subsidy regimes, but would mean high-yield farming could release sufficient land for species dependent on both natural and high nature value farmland to persist.
Supported by a NERC CASE studentship to C.F
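The comparison of allocation strategies described in this abstract can be illustrated with a toy calculation. The density-yield functions and compartment areas below are invented for illustration and bear no relation to the Polish field data; the point is only how a projected population is a density-weighted sum over land compartments that must jointly meet a production target:

```python
def projected_population(density_fn, allocation):
    # allocation: list of (area_fraction, yield) compartments;
    # projected population = sum of area-weighted densities
    return sum(area * density_fn(y) for area, y in allocation)

TARGET_YIELD = 1.0  # every strategy must average this yield region-wide

def natural_specialist(y):
    # toy species: declines steeply with yield, needs natural habitat
    return max(0.0, 1.0 - 2.0 * y)

def farmland_specialist(y):
    # toy species: requires farmed land (y > 0) but tolerates only low yields
    return max(0.0, 0.8 - 0.5 * y) if y > 0 else 0.0

# Land sharing: all land farmed at the target yield
sharing = [(1.0, TARGET_YIELD)]
# Simple sparing: half natural (yield 0), half farmed at double yield
sparing = [(0.5, 0.0), (0.5, 2.0)]
# Three-compartment: natural + very low-yield farmland + high-yield farmland
three = [(0.35, 0.0), (0.15, 0.3), (0.5, 1.91)]  # area-weighted yield ~1.0

for name, alloc in [("sharing", sharing), ("sparing", sparing), ("3-comp", three)]:
    print(name,
          projected_population(natural_specialist, alloc),
          projected_population(farmland_specialist, alloc))
```

In this toy example sharing eliminates the natural-habitat specialist and simple sparing eliminates the low-yield-farmland specialist, while the three-compartment allocation retains both, mirroring the qualitative result reported above.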
The biodiversity intactness index may underestimate losses.
The Biodiversity Intactness Index (BII) is a high-profile metric of an area's average abundance of wild species relative to that in pre-modern times [1] or in primary vegetation under current climatic conditions [2]. It has been endorsed by the Group on Earth Observations Biodiversity Observation Network, adopted by the Intergovernmental Platform on Biodiversity and Ecosystem Services as a "core" indicator of progress towards the Convention on Biological Diversity's Aichi targets 12 and 14, and accepted by the Biodiversity Indicators Partnership as an indicator for Aichi target 5. We strongly support the development of spatially explicit indicators such as the BII, which can be used to prioritise areas for conservation interventions. However, it is important that the metric is as robust as possible, and we have noticed several unusual features of the BII that concern us.
Are conservation actions reducing the threat to India's vulture populations?
Research Communications
Veterinary use of the non-steroidal anti-inflammatory drug diclofenac is responsible for the population collapse of resident vulture species in India. Conservation efforts, including a ban on veterinary diclofenac and the identification of a vulture-safe alternative (meloxicam), were introduced in 2006 in order to address the threat. Sampling of domesticated ungulate carcasses available to vultures in India was undertaken in three surveys conducted before, around the time of, and 1-2 years after the ban, in order to quantify the prevalence of diclofenac and meloxicam residues. In total, 1445, 1488 and 1251 liver tissue samples were collected from nine states and analysed with a validated LC-ESI/MS methodology. Overall diclofenac prevalence declined by almost half over the three surveys, and meloxicam prevalence increased between the second and third surveys, although some states showed little change. These surveys indicate that two of the key conservation actions to counter the threat faced by vultures - banning veterinary diclofenac and promoting meloxicam as a safe alternative - are beginning to take effect. However, because only a small proportion of diclofenac-contaminated carcasses is sufficient to cause vulture population declines, further efforts are needed to eliminate diclofenac from the food supply of India's vultures.
The research was funded by the UK Government's Darwin Initiative programme and by the Royal Society for the Protection of Birds, UK. Peer Reviewed
Accounting for variability in ULF wave radial diffusion models
Many modern outer radiation belt models simulate the long-time behavior of high-energy electrons by solving a three-dimensional Fokker-Planck equation for the drift- and bounce-averaged electron phase space density that includes radial, pitch-angle, and energy diffusion. Radial diffusion is an important process, often characterized by a deterministic diffusion coefficient. One widely used parameterization is based on the median of statistical ultralow frequency (ULF) wave power for a particular geomagnetic index Kp. We perform idealized numerical ensemble experiments on radial diffusion, introducing temporal and spatial variability to the diffusion coefficient through stochastic parameterization, constrained by statistical properties of its underlying observations. Our results demonstrate the sensitivity of radial diffusion over a long time period to the full distribution of the radial diffusion coefficient, highlighting that information is lost when only using median ULF wave power. When temporal variability is included, ensembles exhibit greater diffusion with more rapidly varying diffusion coefficients, larger variance of the diffusion coefficients, and for distributions with heavier tails. When we introduce spatial variability, the variance in the set of all ensemble solutions increases with larger spatial scales of variability. Our results demonstrate that the variability of diffusion affects the temporal evolution of phase space density in the outer radiation belt. We discuss the need to identify important temporal and length scales to constrain variability in diffusion models. We suggest that the application of stochastic parameterization techniques in the diffusion equation may allow the inclusion of natural variability and uncertainty in modeling of wave-particle interactions in the inner magnetosphere.
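The temporal-variability experiment described above can be sketched as a small ensemble calculation: solve the radial diffusion equation df/dt = L² ∂/∂L (D_LL/L² ∂f/∂L) with a median D_LL, then repeat with a log-normal multiplicative factor redrawn on a fixed timescale. The power-law D_LL, noise amplitude, grid, and time step below are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Radial (L-shell) grid and an initial Gaussian phase space density bump
L = np.linspace(3.0, 7.0, 41)
dL = L[1] - L[0]
Lh = 0.5 * (L[:-1] + L[1:])            # interface L-shells
f0 = np.exp(-((L - 5.0) / 0.5) ** 2)

def step(f, DLL_h, dt):
    # One explicit step of df/dt = L^2 d/dL (D_LL / L^2 * df/dL),
    # with f held fixed (~0) at the boundaries
    flux = DLL_h / Lh**2 * np.diff(f) / dL
    out = f.copy()
    out[1:-1] += dt * L[1:-1] ** 2 * np.diff(flux) / dL
    return out

def run(sigma, tau_steps=100, n_steps=4000, dt=5e-4):
    # sigma: std of log-normal noise on the median D_LL (sigma=0 gives the
    # deterministic median run); tau_steps sets the temporal scale on which
    # the stochastic factor is redrawn
    D_med = 1e-2 * (Lh / 5.0) ** 10    # illustrative Kp-style power law, /day
    f, factor = f0.copy(), 1.0
    for n in range(n_steps):
        if sigma > 0 and n % tau_steps == 0:
            # mean chosen so the factor has expectation 1; extreme draws
            # are clipped to keep the explicit scheme stable
            factor = min(rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma), 10.0)
        f = step(f, factor * D_med, dt)
    return f

det = run(sigma=0.0)                                  # deterministic median run
ens = np.stack([run(sigma=1.0) for _ in range(20)])   # stochastic ensemble
print("peak after diffusion:", det.max())
print("ensemble spread at L=5:", ens[:, 20].std())
```

Comparing the ensemble spread against the single median-driven run is the basic diagnostic: any nonzero spread is information that a deterministic median-D_LL model discards.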