
    The effect of biofuel on the international oil market

    This paper derives a method to quantify the short- to medium-run impact of biofuel on fuel markets, assuming that these markets are dominated by a cartel of oil-rich countries and that prices in these countries are set to maximize the sum of domestic consumer and producer surplus, leading to a wedge between domestic and international fuel prices. We model this behavior by applying the optimal export tax model (henceforth, the cartel-of-nations model) to the fuel markets. Using data from 2007 to calibrate the model, we show that the introduction of biofuels lowered global gasoline and diesel consumption and international fuel prices by about 1% and 2%, respectively. We identify large differences between the effects of introducing biofuels under the cartel-of-nations model and under the competitive or the standard cartel model (henceforth, the cartel-of-firms model). We illustrate that assessing the effect of biofuels under the assumption of competitive fuel markets overestimates the reduction in fuel prices and underestimates the reduction in gasoline and diesel consumption, and therefore the impact of biofuels on greenhouse gas emissions, compared to the effect under the cartel-of-nations model. Similar conclusions are derived with respect to the cartel-of-firms model. Finally, we illustrate that a 20% increase in fuel demand more than doubles the impact of biofuels on fuel markets compared to 2007.
    Keywords: Energy, OPEC, biofuel, fuel, carbon savings, optimal export tax model, cheap oil, Resource/Energy Economics and Policy, F1, Q4
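
    The wedge at the heart of the cartel-of-nations model can be summarized with the textbook optimal export tax condition; the sketch below is a generic form, not necessarily the paper's exact calibration:

        max_t  W(t) = CS(p_d) + PS(p_d) + t X(p_d),   with  p_d = p_w - t
        FOC:   t*/p_w = 1/\varepsilon^*

    where X denotes the cartel's fuel exports, \varepsilon^* the price elasticity of rest-of-world import demand, and t* the wedge between the domestic price p_d and the international price p_w; the less elastic world demand is, the larger the optimal wedge.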

    Food Safety, the Environment, and Trade

    Keywords: Distorted incentives, agricultural and trade policy reforms, national agricultural development, Agricultural and Food Policy, International Relations/Trade, F13, F14, Q17, Q18

    Linearly Dichroic Plasmonic Lens and Hetero-Chiral Structures

    We present a theoretical and experimental study of plasmonic hetero-chiral structures, composed of constituents with opposite chirality. We devise, simulate, and experimentally demonstrate different schemes featuring selective surface plasmon polariton focusing of orthogonal polarization states and standing plasmonic vortex fields.
    Comment: 9 pages, 6 figures
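
    For orientation, a plasmonic vortex launched by a chiral (spiral) slit is commonly described by a Bessel-type field; the relation below is this generic textbook picture, not necessarily the exact model used in the paper:

        E_z(r, \varphi) \propto J_\ell(k_SPP r) e^{i\ell\varphi},   \ell = m + \sigma

    where m is the geometric charge of the spiral, \sigma = ±1 the spin of circularly polarized illumination, and k_SPP the surface plasmon wavevector; superposing the fields launched by two constituents of opposite chirality gives a standing vortex with azimuthal dependence cos(\ell\varphi).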

    The role of inventory adjustments in quantifying factors causing food price inflation

    The food commodity price increases beginning in 2001 and culminating in the food crisis of 2007/08 reflected a combination of several factors, including economic growth, biofuel expansion, exchange rate fluctuations, and energy price inflation. To quantify these influences, the authors developed an empirical model that also included crop inventory adjustments. The study shows that, if inventory effects are not taken into account, the impacts of the various factors on food commodity price inflation would be overestimated. If the analysis ignores crop inventory adjustments, it indicates that prices of corn, soybean, rapeseed, rice, and wheat would have been, respectively, 42, 38, 52, and 45 percent lower than the corresponding observed prices in 2007. If inventories are properly taken into account, the contributions of the above-mentioned factors to those commodity prices are 36, 26, 26, and 35 percent, respectively. Those four factors, taken together, explain 70 percent of the price increase for corn, 55 percent for soybean, 54 percent for wheat, and 47 percent for rice during the 2001-2007 period. Other factors, such as speculation, trade policy, and weather shocks, which are not included in the analysis, might be responsible for the remaining contribution to the food commodity price increases.
    Keywords: Markets and Market Access, Economic Theory & Research, Food & Beverage Industry, Access to Markets, Currencies and Exchange Rates
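
    The role of inventories can be seen from a stylized market-clearing condition; the formulation below is a schematic illustration, not the paper's empirical specification:

        D_t(p_t, y_t, b_t) + \Delta I_t = S_t(p_t, c_t),   \Delta I_t = I_t - I_{t-1}

    where y_t stands for income growth, b_t for biofuel demand, c_t for energy and other production costs, and \Delta I_t for the change in stocks (exchange rates and other factors enter similarly). Because stockholders draw down inventories when prices rise, \Delta I_t absorbs part of any shock; a model that imposes \Delta I_t = 0 pushes the full shock onto the price p_t, which is why ignoring inventory adjustments overstates each factor's contribution to the observed price increases.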

    Water quality assessment, trophic classification and water resources management

    Quantification of water quality (WQ) is an integral part of scientifically based water resources management. The main objective of this study was a comparative analysis of two approaches applied for quantitative assessment of WQ: the trophic level index (TLI) and the Delphi method (DM). We analyzed the following features of these conceptually different approaches: (a) the similarity of their estimates of lake WQ; (b) their sensitivity in indicating disturbances to aquatic ecosystem structure and functioning; and (c) their capacity to reflect the impact of major management measures on the quality of water resources. We compared the DM and the TLI based on results from a series of lakes covering varying productivity levels, mixing regimes, and climatic zones. We assumed that conservation of the aquatic ecosystem in some predefined “reference” state is a major objective of sustainable water resources management in the study lakes. The comparison between the two approaches was quantified as a relationship between the DM ranks and the respective TLI values. We show that, being a classification system, the TLI does not account for the specific characteristics of aquatic ecosystems or for the array of different potential uses of the water resource. It indirectly assumes that oligotrophication is identical to WQ improvement and that reducing economic activity within the lake catchment area is the most effective way to improve WQ. WQ assessed with the TLI is therefore more suitable for the needs of natural water resources management if eutrophication is the major threat. The DM allows accounting for several water resource uses, and it may therefore serve as a more robust and comprehensive tool for WQ quantification and thus for sustainable water resources management.
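
    The relationship between DM ranks and TLI values across lakes can be quantified with a rank correlation; the sketch below is illustrative only and uses hypothetical numbers rather than the study's data:

        # Illustrative sketch (not the study's code): compare Delphi-method (DM)
        # water-quality ranks with Trophic Level Index (TLI) values across lakes.
        from scipy.stats import spearmanr

        # Hypothetical per-lake values; in the study these come from monitoring
        # data and expert elicitation for lakes of varying productivity.
        tli_values = [2.8, 3.4, 4.1, 4.9, 5.6, 6.2]   # trophic level index
        dm_ranks   = [1, 2, 4, 3, 5, 6]               # DM rank (1 = best WQ)

        rho, p_value = spearmanr(tli_values, dm_ranks)
        print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
        # A strong positive rho indicates that lower trophic state and better
        # DM-assessed WQ largely coincide; divergent lakes flag cases where the
        # TLI's oligotrophication-centred view misses use-specific aspects of WQ.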

    IoT Design Challenges and the Social IoT Solution

    The IoT (Internet of Things) promises to be the major phenomenon in information technology in the near term. By some forecasts, more than half of all new IT system deployments by 2020 will incorporate some form of IoT technology. Currently, however, there is no dominant IoT platform and no universal IoT design standard in use. This contributes to architectural heterogeneity, which in turn drives high integration costs and inhibits the realisation of IoT benefits. The use of universal design standards presents one solution to this problem. Social Internet of Things (SIoT) methods use the way people manage social relationships as a reference architecture for managing the interactions between the various Things in an IoT network. This paper discusses some of the current IoT design challenges and presents solutions, grounded in the SIoT approach, that can be used as standards for future IoT designs to reduce architectural heterogeneity.
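
    A minimal sketch of an SIoT-style relationship model is shown below; the relationship types follow those commonly cited in the SIoT literature, while the class names and methods are purely illustrative and not an API prescribed by the paper:

        # Sketch of an SIoT relationship graph between devices ("Things").
        from enum import Enum, auto
        from dataclasses import dataclass, field

        class Relationship(Enum):
            PARENTAL = auto()      # same manufacturer or production batch
            CO_LOCATION = auto()   # deployed in the same place
            CO_WORK = auto()       # cooperate in a common application
            OWNERSHIP = auto()     # belong to the same owner
            SOCIAL = auto()        # sporadic contact between owners' devices

        @dataclass(eq=False)  # identity-based hashing so Things can key a dict
        class Thing:
            name: str
            services: set = field(default_factory=set)
            friends: dict = field(default_factory=dict)  # Thing -> Relationship

            def befriend(self, other, rel):
                self.friends[other] = rel
                other.friends[self] = rel

            def find_service(self, service):
                # Navigate the social graph instead of a central registry; this
                # is how SIoT sidesteps heterogeneous, vendor-specific platforms.
                for friend in self.friends:
                    if service in friend.services:
                        return friend
                return None

        thermostat = Thing("thermostat", {"temperature"})
        camera = Thing("camera", {"motion"})
        thermostat.befriend(camera, Relationship.CO_LOCATION)
        print(thermostat.find_service("motion").name)  # -> camera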

    Abandon Statistical Significance

    We discuss problems the null hypothesis significance testing (NHST) paradigm poses for replication and more broadly in the biomedical and social sciences as well as how these problems remain unresolved by proposals involving modified p-value thresholds, confidence intervals, and Bayes factors. We then discuss our own proposal, which is to abandon statistical significance. We recommend dropping the NHST paradigm--and the p-value thresholds intrinsic to it--as the default statistical paradigm for research, publication, and discovery in the biomedical and social sciences. Specifically, we propose that the p-value be demoted from its threshold screening role and instead, treated continuously, be considered along with currently subordinate factors (e.g., related prior evidence, plausibility of mechanism, study design and data quality, real-world costs and benefits, novelty of finding, and other factors that vary by research domain) as just one among many pieces of evidence. We have no desire to "ban" p-values or other purely statistical measures. Rather, we believe that such measures should not be thresholded and that, thresholded or not, they should not take priority over the currently subordinate factors. We also argue that it seldom makes sense to calibrate evidence as a function of p-values or other purely statistical measures. We offer recommendations for how our proposal can be implemented in the scientific publication process as well as in statistical decision making more broadly.
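
    As a concrete illustration of treating the p-value continuously rather than as a screening threshold, the snippet below reports an estimate, its uncertainty interval, and the exact p-value side by side; it is a sketch using simulated data, not a workflow proposed in the paper:

        # Report evidence continuously instead of a significant/non-significant verdict.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        treatment = rng.normal(0.3, 1.0, 50)   # hypothetical outcome data
        control = rng.normal(0.0, 1.0, 50)

        t_stat, p = stats.ttest_ind(treatment, control)
        diff = treatment.mean() - control.mean()
        se = np.sqrt(treatment.var(ddof=1) / 50 + control.var(ddof=1) / 50)
        ci = (diff - 1.96 * se, diff + 1.96 * se)

        # The exact p-value is reported as one piece of evidence among many
        # (prior evidence, mechanism, design and data quality, costs/benefits),
        # rather than being thresholded at p < 0.05.
        print(f"difference = {diff:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), p = {p:.3f}")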

    The reliability of volume functions based on spruce tree species in Slovenia

    Various volume functions used for tree volume estimation were compared by the authors. Using section measurements (Newton's method), they calculated the real tree volumes of 88 felled spruce trees on the Pokljuka plateau. These real volumes were used to construct regional three-entry volume functions, (two-entry) volume tables, and tariff functions. The standard error of the estimated mean tree volume is lowest for the three-entry volume functions (5.0%), followed by the volume tables (11.7%), and highest for the tariffs (15.1%). The reliability and applicability of the developed regional and other volume functions were verified on two one-hectare research plots. It was established that the adapted German volume tables assign spruce trees volumes that are too high. As volumes from these tables are also used when determining the adapted tariffs in Slovenia, this procedure yields a tariff class whose tree volumes are at least 5% too high.
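
    The section measurements referred to above rest on Newton's (prismoidal) formula for the volume of a stem section; the sketch below illustrates the computation with hypothetical diameters and section lengths:

        # Newton's formula per section: V = L/6 * (A_base + 4*A_mid + A_top).
        import math

        def section_volume(d_base, d_mid, d_top, length):
            """Section volume in m^3 from diameters in cm and length in m."""
            def area(d_cm):
                return math.pi * (d_cm / 100.0) ** 2 / 4.0  # cross-section, m^2
            return length / 6.0 * (area(d_base) + 4.0 * area(d_mid) + area(d_top))

        # Hypothetical 2 m sections of one stem: (base, mid, top diameter, length)
        sections = [(38.0, 36.5, 35.0, 2.0), (35.0, 33.0, 31.5, 2.0), (31.5, 29.0, 26.0, 2.0)]
        tree_volume = sum(section_volume(*s) for s in sections)
        print(f"tree volume = {tree_volume:.3f} m^3")

    Summing the section volumes over the whole stem gives the reference ("real") tree volume against which the two-entry tables, three-entry functions, and tariffs are checked.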

    A photometric search for transients in galaxy clusters

    We have begun a program to search for supernovae and other transients in the fields of galaxy clusters with the 2.3m Bok Telescope on Kitt Peak. We present our automated photometric methods for data reduction, efficiency characterization, and initial spectroscopy. With this program, we aim to ultimately identify ~25-35 cluster SN Ia (~10 of which will be intracluster, hostless events) and constrain the SN Ia rate associated with old, passive stellar populations. With these measurements we will constrain the relative contribution of hostless and hosted SN Ia to the metal enrichment of the intracluster medium. In the current work, we have identified a central excess of transient events within 1.25 r_200 in our cluster fields after statistically subtracting out the 'background' transient rate taken from an off-cluster CCD chip. Based on the published rate of SN Ia for cluster populations, we estimate that ~20 percent of the excess cluster transients are due to cluster SN Ia, a fraction comparable to that of core-collapse (CC) supernovae; the remainder are likely to be active galactic nuclei. Interestingly, we have identified three intracluster SN candidates, all of which lie beyond R > r_200. These events, if truly associated with the cluster, indicate a large deficit of intracluster (IC) SN at smaller radii, and may be associated with the IC stars of infalling groups, or may indicate that the intracluster light (ICL) in the cluster outskirts is actively forming stars which contribute CC SN or prompt SN Ia.
    Comment: Updated to match accepted version; 26 pages, 14 figures; accepted to AJ
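
    The statistical background subtraction amounts to scaling the off-cluster counts by the relative areas surveyed; the sketch below uses hypothetical numbers, not the survey's actual counts:

        # Excess of cluster transients after subtracting the field ('background')
        # rate measured on an off-cluster CCD chip; all numbers are hypothetical.
        import math

        n_on = 45        # transients detected within 1.25 r_200
        area_on = 0.8    # solid angle of the on-cluster region (deg^2)
        n_off = 120      # transients detected on the off-cluster chip
        area_off = 3.2   # solid angle of the off-cluster chip (deg^2)

        scale = area_on / area_off
        background = n_off * scale            # expected field contaminants
        excess = n_on - background            # net cluster-associated events
        sigma = math.sqrt(n_on + n_off * scale ** 2)  # propagated Poisson error
        print(f"excess = {excess:.1f} +/- {sigma:.1f} cluster transients")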