
    Introduction to forestry investment analysis: Part I. Basic investment characteristics and financial criteria

    Many forest landowners consider their forest to be an investment. Some of these landowners, however, and many new timberland investors, may not fully understand the basic ingredients that make up a forestry investment. Like all investments, forestry involves costs and revenues, and rates of return can be calculated. These rates of return can be compared with interest rates earned on other investments, but forest landowners should be sure to understand the unique characteristics of a forestry investment. Most of the cash flow from a forestry investment will result from timber sales. Timber sale revenue is, of course, a function of current stumpage prices, but it is also a function of the amount of wood removed from an acre, known as the forest yield.
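
    A minimal sketch of the financial criteria described above, assuming purely hypothetical per-acre cash flows (the figures are illustrative, not drawn from the article): discount each dated cost and revenue to present value and solve for the rate of return that sets the net present value to zero.

    # Hypothetical per-acre forestry cash flows (year, amount in dollars);
    # negative values are costs, positive values are timber sale revenue.
    cash_flows = [(0, -250.0),    # site preparation and planting (assumed)
                  (15, 400.0),    # thinning revenue (assumed)
                  (30, 3000.0)]   # final harvest: stumpage price x yield (assumed)

    def npv(rate, flows):
        """Net present value of dated cash flows at a given discount rate."""
        return sum(amount / (1.0 + rate) ** year for year, amount in flows)

    def rate_of_return(flows, lo=0.0, hi=1.0, tol=1e-6):
        """Internal rate of return found by bisection on the NPV function."""
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if npv(mid, flows) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    print(f"NPV at 5%: {npv(0.05, cash_flows):.2f}")
    print(f"Rate of return: {rate_of_return(cash_flows):.2%}")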

    Introduction to forestry investment analysis: Part II. Taxes, inflation, and other issues

    Part I of this article covered the basics of compounding and discounting and introduced forestry investment analysis. However, several complications were not discussed: what about inflation, taxes, and risk? Part II addresses these basic complications and includes more detailed forestry investment analyses.
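
    As a rough illustration of two of the complications mentioned, the sketch below applies the standard Fisher relation to convert a nominal rate to a real (inflation-adjusted) rate, and a flat marginal tax rate to periodic returns; the rates used are assumptions, not figures from the article.

    def real_rate(nominal, inflation):
        """Fisher relation: convert a nominal rate to an inflation-adjusted (real) rate."""
        return (1.0 + nominal) / (1.0 + inflation) - 1.0

    def after_tax_rate(rate, marginal_tax_rate):
        """Crude approximation: returns taxed each period at a flat marginal rate."""
        return rate * (1.0 - marginal_tax_rate)

    nominal = 0.08      # assumed nominal rate of return
    inflation = 0.03    # assumed inflation rate
    tax = 0.25          # assumed marginal tax rate

    print(f"Real rate: {real_rate(nominal, inflation):.2%}")
    print(f"After-tax nominal rate: {after_tax_rate(nominal, tax):.2%}")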

    Constraining stellar assembly and AGN feedback at the peak epoch of star formation

    We study stellar assembly and feedback from active galactic nuclei (AGN) around the epoch of peak star formation (1<z<2), by comparing hydrodynamic simulations to rest-frame UV-optical galaxy colours from the Wide Field Camera 3 (WFC3) Early-Release Science (ERS) Programme. Our Adaptive Mesh Refinement simulations include metal-dependent radiative cooling, star formation, kinetic outflows due to supernova explosions, and feedback from supermassive black holes. Our model assumes that when gas accretes onto black holes, a fraction of the energy is used to form either thermal winds or sub-relativistic, momentum-imparting collimated jets, depending on the accretion rate. We find that the rest-frame UV-optical colours of galaxies predicted by the model that includes AGN feedback are in broad agreement with the observed colours of the WFC3 ERS sample at 1<z<2. The predicted number of massive galaxies also matches well with observations in this redshift range. However, the massive galaxies are predicted to show higher levels of residual star formation activity than the observational estimates, suggesting the need for further suppression of star formation without significantly altering the stellar mass function. We discuss possible improvements, involving faster stellar assembly through enhanced star formation during galaxy mergers while star formation at the peak epoch is still modulated by the AGN feedback. (6 pages, 4 figures; accepted for publication in MNRAS Letters.)
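
    The accretion-rate-dependent switch between thermal winds and collimated jets could be expressed, very schematically, as in the sketch below; the Eddington-ratio threshold and coupling efficiencies are placeholder assumptions, not the values adopted in the paper.

    # Schematic accretion-rate-dependent AGN feedback switch. The Eddington-ratio
    # threshold and coupling efficiencies are placeholder assumptions, not the
    # values adopted in the paper.
    C_CM_PER_S = 3.0e10          # speed of light
    EDDINGTON_THRESHOLD = 0.01   # assumed boundary between jet and wind modes
    EPSILON_WIND = 0.005         # assumed energy fraction for thermal winds
    EPSILON_JET = 0.005          # assumed energy fraction for collimated jets

    def agn_feedback(mdot_g_per_s, mdot_eddington_g_per_s):
        """Pick the feedback channel from the Eddington ratio and return (mode, erg/s)."""
        ratio = mdot_g_per_s / mdot_eddington_g_per_s
        if ratio > EDDINGTON_THRESHOLD:
            # High accretion rate: quasar-like thermal wind
            return "thermal_wind", EPSILON_WIND * mdot_g_per_s * C_CM_PER_S**2
        # Low accretion rate: sub-relativistic, momentum-imparting collimated jet
        return "jet", EPSILON_JET * mdot_g_per_s * C_CM_PER_S**2

    mode, energy_rate = agn_feedback(mdot_g_per_s=1.0e24, mdot_eddington_g_per_s=1.0e25)
    print(mode, f"{energy_rate:.3e} erg/s")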

    Tribological properties of room temperature fluorinated graphite heat-treated under fluorine atmosphere

    This work is concerned with the study of the tribological properties of room temperature fluorinated graphite heat-treated under a fluorine atmosphere. The fluorinated compounds all present good intrinsic friction properties (friction coefficient in the range 0.05–0.09). The tribological performance is optimized when the materials retain remaining graphitic domains (influenced by the presence of intercalated fluorinated species), whereas the perfluorinated compounds, in which the fluorocarbon layers are corrugated (armchair configuration of the saturated carbon rings), present higher friction coefficients. Raman analyses reveal that the friction process induces severe changes in the material structure, in particular the partial rebuilding of graphitic domains in the case of perfluorinated compounds, which explains the improvement of μ during the friction tests for the latter materials.

    Financing Direct Democracy: Revisiting the Research on Campaign Spending and Citizen Initiatives

    The conventional view in the direct democracy literature is that spending against a measure is more effective than spending in favor of a measure, but the empirical results underlying this conclusion have been questioned by recent research. We argue that the conventional finding is driven by the endogenous nature of campaign spending: initiative proponents spend more when their ballot measure is likely to fail. We address this endogeneity by using an instrumental variables approach to analyze a comprehensive dataset of ballot propositions in California from 1976 to 2004. We find that both support and opposition spending on citizen initiatives have strong, statistically significant, and countervailing effects. We confirm this finding by looking at time series data from early polling on a subset of these measures. Both analyses show that spending in favor of citizen initiatives substantially increases their chances of passage, just as opposition spending decreases this likelihood.
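
    The instrumental-variables logic can be illustrated with simulated data in which an unobserved factor drives both spending and support, as sketched below; the instrument and all coefficients are invented for the illustration and are not the instrument or estimates used in the study.

    import numpy as np

    # Simulated illustration of the endogeneity argument: an unobserved factor
    # raises proponent spending while lowering support, so naive OLS understates
    # the true effect. The instrument and coefficients are invented.
    rng = np.random.default_rng(0)
    n = 2000
    instrument = rng.normal(size=n)          # hypothetical exogenous shifter of spending
    latent_opposition = rng.normal(size=n)   # unobserved confounder
    spending = 0.8 * instrument + 0.9 * latent_opposition + rng.normal(size=n)
    support = 0.5 * spending - 1.5 * latent_opposition + rng.normal(size=n)

    def ols_slope(x, y):
        X = np.column_stack([np.ones_like(x), x])
        return np.linalg.lstsq(X, y, rcond=None)[0][1]

    # Stage 1: project spending onto the instrument; Stage 2: regress support
    # on the fitted, exogenous part of spending (two-stage least squares).
    Z = np.column_stack([np.ones(n), instrument])
    fitted_spending = Z @ np.linalg.lstsq(Z, spending, rcond=None)[0]

    print("Naive OLS estimate:", round(ols_slope(spending, support), 3))
    print("2SLS estimate:     ", round(ols_slope(fitted_spending, support), 3))

    With this setup the naive OLS slope is pulled toward zero or below by the confounder, while the two-stage estimate recovers the true coefficient of 0.5, mirroring the endogeneity argument above.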

    Carbon Efficiency of Humanitarian Supply Chains: Evidence from French Red Cross operations

    Natural catastrophes are often amplified by man-made impact on the environment. Sustainability is identified as a major gap in the humanitarian logistics research literature. Although humanitarian supply chains are designed for speed and sustainability is of minor concern, environmentally friendly behavior (e.g. through reduction of transportation emissions and avoidance of non-degradable materials) should be a long-term concern as it may ultimately affect more vulnerable regions. The purpose of this paper is to illustrate how greenhouse gas emissions can be measured using the supply chain of common relief items in humanitarian logistics. We analyze the CO2 emissions of selected supply chains by performing Life Cycle Assessments based on data provided by the French Red Cross. We calculate the CO2 emissions of the items from ‘cradle to grave’, including production, transportation, warehousing and disposal. Using these calculations, we show that transporting relief items causes the majority of emissions; however, transportation modes may not always be changed, as the main purpose of humanitarian supply chains is speed. Nevertheless, strategic and efficient pre-positioning of main items will translate into less transportation and thus reduce the environmental impact. The study also shows that initiatives for “greening” item production and disposal can improve the overall carbon efficiency of humanitarian supply chains.
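
    A minimal sketch of the cradle-to-grave accounting described above, for a single relief item; every quantity and emission factor is an invented placeholder rather than French Red Cross data.

    # Illustrative cradle-to-grave CO2 accounting for a single relief item.
    # Every quantity and emission factor below is an invented placeholder.
    item = {
        "production_kgCO2_per_unit": 4.0,
        "mass_kg_per_unit": 2.5,
        "transport_km": 6000.0,
        "transport_kgCO2_per_tonne_km": 0.6,   # assumed (air-freight-like) factor
        "warehousing_kgCO2_per_unit": 0.3,
        "disposal_kgCO2_per_unit": 0.8,
    }

    def cradle_to_grave_co2(item, units):
        transport = (item["mass_kg_per_unit"] / 1000.0) * item["transport_km"] \
            * item["transport_kgCO2_per_tonne_km"]
        per_unit = (item["production_kgCO2_per_unit"] + transport
                    + item["warehousing_kgCO2_per_unit"] + item["disposal_kgCO2_per_unit"])
        return {"per_unit_kgCO2": round(per_unit, 2),
                "total_tonnes_CO2": round(per_unit * units / 1000.0, 1),
                "transport_share": round(transport / per_unit, 2)}

    print(cradle_to_grave_co2(item, units=10000))

    With these placeholder factors, transport accounts for roughly two thirds of the per-unit footprint, which is the kind of pattern the study reports; shortening the transport leg through pre-positioning shrinks that share directly.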

    Stochastic and epistemic uncertainty propagation in LCA

    Purpose: When performing uncertainty propagation, most LCA practitioners choose to represent uncertainties by single probability distributions and to propagate them using stochastic methods. However, the selection of single probability distributions often appears arbitrary when faced with scarce information or expert judgement (epistemic uncertainty). Possibility theory has been developed over the last decades to address this problem. The objective of this study is to present a methodology that combines probability and possibility theories to represent stochastic and epistemic uncertainties in a consistent manner and to apply it to LCA. A case study is used to show the uncertainty propagation performed with the proposed method and compare it to propagation performed using probability and possibility theories alone. Methods: Basic knowledge of probability theory is first recalled, followed by a detailed description of epistemic uncertainty representation using fuzzy intervals. The propagation methods used are Monte Carlo analysis for probability distributions and an optimisation on alpha-cuts for fuzzy intervals. The proposed method (denoted IRS) generalizes the process of random sampling to probability distributions as well as fuzzy intervals, thus making the simultaneous use of both representations possible.
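
    The combined propagation idea can be sketched roughly as follows: a probabilistic input is drawn Monte Carlo-style while an epistemic input, represented as a triangular fuzzy interval, is propagated by evaluating the model on its alpha-cut endpoints (valid here because the toy model is monotone). The impact model and all numbers are placeholders, not the paper's case study.

    import random

    # Minimal sketch of combined stochastic/epistemic propagation on a toy model.
    def impact(energy_use, emission_factor):
        """Toy LCA model: impact = energy use x emission factor (monotone in both)."""
        return energy_use * emission_factor

    def triangular_alpha_cut(low, mode, high, alpha):
        """Interval of a triangular fuzzy number at membership level alpha."""
        return (low + alpha * (mode - low), high - alpha * (high - mode))

    random.seed(0)
    n_samples = 1000
    for alpha in (0.0, 0.5, 1.0):
        lo_results, hi_results = [], []
        for _ in range(n_samples):
            energy = random.gauss(100.0, 10.0)   # stochastic input (assumed distribution)
            ef_lo, ef_hi = triangular_alpha_cut(0.4, 0.5, 0.7, alpha)  # epistemic input (assumed)
            # Because the model is monotone, propagating the interval endpoints
            # gives the impact interval for this sample.
            lo_results.append(impact(energy, ef_lo))
            hi_results.append(impact(energy, ef_hi))
        lo_med = sorted(lo_results)[n_samples // 2]
        hi_med = sorted(hi_results)[n_samples // 2]
        print(f"alpha={alpha}: median impact interval [{lo_med:.1f}, {hi_med:.1f}]")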

    A comprehensive computational model of sound transmission through the porcine lung

    A comprehensive computational simulation model of sound transmission through the porcine lung is introduced and experimentally evaluated. This subject-specific model utilizes parenchymal and major airway geometry derived from X-ray CT images. The lung parenchyma is modeled as a poroviscoelastic material using Biot theory. A finite element (FE) mesh of the lung that includes airway detail is created and used in COMSOL FE software to simulate the vibroacoustic response of the lung to sound input at the trachea. The FE simulation model is validated by comparing simulation results to experimental measurements using scanning laser Doppler vibrometry on the surface of an excised, preserved lung. The FE model can also be used to calculate and visualize vibroacoustic pressure and motion inside the lung and its airways caused by the acoustic input. The effect of diffuse lung fibrosis and of a local tumor on the lung acoustic response is simulated and visualized using the FE model. In the future, this type of visualization can be compared and matched with experimentally obtained elastographic images to better quantify regional lung material properties, in order to noninvasively diagnose and stage disease and response to treatment.

    Overestimating Outcome Rates: Statistical Estimation When Reliability Is Suboptimal

    Objective: To demonstrate how failure to account for measurement error in an outcome (dependent) variable can lead to significant estimation errors, and to illustrate ways to recognize and avoid these errors. Data Sources: Medical literature and simulation models. Study Design/Data Collection: Systematic review of the published and unpublished epidemiological literature on the rate of preventable hospital deaths, and statistical simulation of potential estimation errors based on data from these studies. Principal Findings: Most estimates of the rate of preventable deaths in U.S. hospitals rely upon classifying cases using one to three physician reviewers (implicit review). Because this method has low to moderate reliability, estimates based on statistical methods that do not account for error in the measurement of a “preventable death” can result in significant overestimation. For example, relying on a majority-rule rating with three reviewers per case (reliability ∼0.45 for the average of three reviewers) can result in a 50–100 percent overestimation compared with an estimate based upon a reliably measured outcome (e.g., by using 50 reviewers per case). However, there are statistical methods that account for measurement error and can produce much more accurate estimates of outcome rates without requiring a large number of measurements per case. Conclusion: The statistical principles discussed in this case study are critically important whenever one seeks to estimate the proportion of cases belonging to specific categories (such as estimating how many patients have inadequate blood pressure control or identifying high-cost or low-quality physicians). When the true outcome rate is low (<20 percent), using an outcome measure that has low-to-moderate reliability will generally result in substantially overestimating the proportion of the population having the outcome unless statistical methods that adjust for measurement error are used.
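
    An illustrative simulation of the overestimation mechanism described above: three reviewers with imperfect sensitivity and specificity classify cases under a majority rule, and the resulting rate estimate substantially exceeds a low true rate. The sensitivity and specificity values are assumptions for illustration, not the reliability figures reported in the article.

    import random

    # Illustrative simulation of outcome-rate overestimation under imperfect review.
    # Reviewer sensitivity and specificity are assumed, purely for illustration.
    random.seed(1)
    TRUE_RATE = 0.05      # assumed true rate of the outcome (e.g. preventable death)
    SENSITIVITY = 0.70    # P(reviewer flags case | truly preventable)   (assumed)
    SPECIFICITY = 0.85    # P(reviewer clears case | not preventable)    (assumed)
    N_CASES = 100_000

    def reviewer_call(truly_positive):
        if truly_positive:
            return random.random() < SENSITIVITY
        return random.random() > SPECIFICITY   # false positive with prob 1 - specificity

    flagged = 0
    for _ in range(N_CASES):
        truly_positive = random.random() < TRUE_RATE
        votes = sum(reviewer_call(truly_positive) for _ in range(3))
        if votes >= 2:                         # majority rule with three reviewers
            flagged += 1

    print(f"True rate: {TRUE_RATE:.1%}, majority-rule estimate: {flagged / N_CASES:.1%}")

    With these assumed values the majority-rule estimate comes out near 10 percent, roughly double the 5 percent true rate, consistent with the 50–100 percent overestimation range quoted above.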

    Who pays and who benefits? How different models of shared responsibilities between formal and informal carers influence projections of costs of dementia management

    Background: The few studies that have attempted to estimate the future cost of caring for people with dementia in Australia are typically based on total prevalence and the cost per patient over the average duration of illness. However, costs associated with dementia care also vary according to the length of the disease, severity of symptoms and type of care provided. This study aimed to determine more accurately the future costs of dementia management by taking these factors into consideration. Methods: The current study estimated the prevalence of dementia in Australia (2010-2040). Data from a variety of sources was recalculated to distribute this prevalence according to the location (home/institution), care requirements (informal/formal), and dementia severity. The cost of care was attributed to redistributed prevalences and used in prediction of future costs of dementia. Results: Our computer modeling indicates that the ratio between the prevalence of people with mild/moderate/severe dementia will change over the three decades from 2010 to 2040 from 50/30/20 to 44/32/24. Taking into account the severity of symptoms, location of care and cost of care per hour, the current study estimates that the informal cost of care in 2010 is AU$3.2 billion and formal care at AU$5.0 billion per annum. By 2040 informal care is estimated to cost AU$11.6 billion and formal care AU$16.7 billion per annum. Interventions to slow disease progression will result in relative savings of 5% (AU$1.5 billion) per annum and interventions to delay disease onset will result in relative savings of 14% (AU$4 billion) of the cost per annum. With no intervention, the projected combined annual cost of formal and informal care for a person with dementia in 2040 will be around AU$38,000 (in 2010 dollars). An intervention to delay progression by 2 years will see this reduced to AU$35,000. Conclusions: These findings highlight the need to account for more than total prevalence when estimating the costs of dementia care. While the absolute values of cost of care estimates are subject to the validity and reliability of currently available data, dynamic systems modeling allows for future trends to be estimated.
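
    A minimal sketch of the cost-accumulation logic (prevalence split by severity, multiplied by weekly care hours and an hourly cost of care); apart from the 2040 severity split of 44/32/24 quoted above, every number is a placeholder rather than an input used in the study.

    # Minimal sketch of the cost-accumulation logic: prevalence split by severity,
    # multiplied by weekly care hours and an hourly cost of care. Apart from the
    # 2040 severity split quoted in the abstract, all numbers are placeholders.
    prevalence_2040 = 1_000_000                 # assumed number of people with dementia
    severity_split = {"mild": 0.44, "moderate": 0.32, "severe": 0.24}   # from the abstract
    weekly_care_hours = {"mild": 5, "moderate": 15, "severe": 35}       # assumed
    informal_share = {"mild": 0.8, "moderate": 0.6, "severe": 0.4}      # assumed
    hourly_cost = {"informal": 30.0, "formal": 45.0}                    # assumed AU$/hour

    def annual_cost(prevalence):
        informal = formal = 0.0
        for level, share in severity_split.items():
            hours = prevalence * share * weekly_care_hours[level] * 52
            informal += hours * informal_share[level] * hourly_cost["informal"]
            formal += hours * (1 - informal_share[level]) * hourly_cost["formal"]
        return informal, formal

    informal, formal = annual_cost(prevalence_2040)
    print(f"Informal: AU${informal / 1e9:.1f} bn per annum, formal: AU${formal / 1e9:.1f} bn per annum")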