
    Genre Analysis and Genre-based Approaches to EFL Writing: A Critical Analysis

    This paper discusses genre analysis and genre-based instruction for beginner-level EFL learners. Genre analysis is a method for analyzing how language is used in particular situations, with rhetorical styles and discourse types as the focus of study. This paper examines the effectiveness of genre analysis and genre-based approaches in writing instruction. In addition, drawing particularly on the genre-based EFL instruction used in Australian schools, it discusses and examines the application of genre-based instruction to beginner-level EFL university students in Japan.

    Abolishing green rates : the effects on cereals, sugar, and oilseeds in West Germany

    In 1987 the European Community began the ambitious task of forging a single market for goods and services across the national borders of its member states by 1992. Substantive reform of the Community's Common Agricultural Policy - necessary for the full integration of existing markets - has not yet been accomplished and has proven difficult to achieve. Creating a truly "common" agricultural policy in the European Community requires, at a minimum, eliminating price differences resulting from country- and commodity-specific exchange rates, known as "green rates." The authors discuss the various policy instruments that complicate the effects of these policy-determined price differences on crop production and the demand for inputs. They present a model that estimates the cross-commodity biases created by multiple policy instruments and that quantifies the effects of removing green-rate differentials in what was West Germany. The effects of price changes on domestic production are statistically significant in the model, although quantitatively small. This result suggests that eliminating green rates would lead primarily to a decline in farm income and a devaluation of fixed agricultural assets - which complicates the difficult task of attaining reform.
    Keywords: Environmental Economics & Policies, Economic Theory & Research, Access to Markets, Markets and Market Access, Crops & Crop Management Systems

    Bridge damage detection using spatiotemporal patterns extracted from dense sensor network

    The alarmingly degraded state of transportation infrastructure, combined with its key societal and economic importance, calls for automatic condition assessment methods to facilitate smart management of maintenance and repairs. With the advent of ubiquitous sensing and communication capabilities, scalable data-driven approaches are of great interest, as they can utilize large volumes of streaming data without requiring detailed physical models, which can be inaccurate and computationally expensive to run. Properly designed, a data-driven methodology could enable fast and automatic evaluation of infrastructure, discovery of causal dependencies among various sub-system dynamic responses, and decision making under uncertainty and a lack of labeled data. In this work, a spatiotemporal pattern network (STPN) strategy built on symbolic dynamic filtering (SDF) is proposed to explore spatiotemporal behaviors in a bridge network. Data from strain gauges installed on two bridges are generated using finite element simulation for three types of sensor networks from a density perspective (dense, nominal, sparse). Causal relationships among spatially distributed strain data streams are extracted and analyzed for vehicle identification and detection, and for localization of structural degradation in bridges. Multiple case studies show significant capabilities of the proposed approach in: (i) capturing spatiotemporal features to discover causality between (geographically close) bridges, (ii) robustness to noise in data for feature extraction, (iii) detecting and localizing damage via comparison of bridge responses to similar vehicle loads, and (iv) implementing a real-time health monitoring and decision-making workflow for bridge networks. The results also demonstrate increased sensitivity in detecting damage and higher reliability in quantifying the damage level as sensor network density increases.
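    The core of symbolic dynamic filtering, as described above, is discretizing a sensor stream into symbols and modeling it as a Markov chain. A minimal sketch (the uniform partitioning, alphabet size, and example strain values are illustrative assumptions, not the paper's calibrated setup):

    ```python
    # Minimal sketch of symbolic dynamic filtering (SDF), the building block
    # of the spatiotemporal pattern network (STPN) described above.
    # Partitioning scheme, alphabet size, and data are illustrative assumptions.

    def symbolize(series, n_symbols=4):
        """Map a real-valued series to symbols via uniform partitioning."""
        lo, hi = min(series), max(series)
        width = (hi - lo) / n_symbols or 1.0
        return [min(int((x - lo) / width), n_symbols - 1) for x in series]

    def transition_matrix(symbols, n_symbols=4):
        """Estimate state-transition probabilities of the symbol stream."""
        counts = [[0] * n_symbols for _ in range(n_symbols)]
        for a, b in zip(symbols, symbols[1:]):
            counts[a][b] += 1
        return [[c / max(sum(row), 1) for c in row] for row in counts]

    strain = [0.1, 0.4, 0.9, 0.7, 0.2, 0.5, 0.8, 0.3]  # hypothetical gauge data
    syms = symbolize(strain)
    P = transition_matrix(syms)
    ```

    In an STPN, analogous cross-transition matrices between two sensors' symbol streams are what capture the causal dependencies the abstract refers to.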

    Real-Time Quantitative Bronchoscopy

    The determination of motion within a sequence of images remains one of the fundamental problems in computer vision after more than 30 years of research. Despite this work, there have been relatively few applications of these techniques to practical problems outside the fields of robotics and video encoding. In this paper, we present continuing work to apply optical flow and egomotion recovery to the problem of measuring and navigating through the airway using a bronchoscope during a standard procedure, without the need for any additional data, localization systems, or other external components. The current implementation uses a number of techniques to provide a range of numerical measurements and estimations to physicians in real time, using standard computer hardware.
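    Optical flow rests on the brightness-constancy assumption: image intensity moves but does not change between frames, so spatial and temporal gradients jointly reveal motion. The 1D toy below (hypothetical signals, not the paper's implementation) solves the least-squares flow estimate for a single shift:

    ```python
    # Toy illustration of the brightness-constancy principle behind optical
    # flow, which the paper applies to bronchoscope video. 1D sketch only;
    # the signals and function name are hypothetical.

    def estimate_shift(frame1, frame2):
        """Least-squares 1D optical flow: minimize sum((Ix*u + It)^2) over u."""
        num = den = 0.0
        for i in range(1, len(frame1) - 1):
            ix = (frame1[i + 1] - frame1[i - 1]) / 2.0  # spatial gradient
            it = frame2[i] - frame1[i]                   # temporal gradient
            num += ix * it
            den += ix * ix
        return -num / den if den else 0.0

    # A smooth intensity profile shifted right by ~1 sample between frames.
    f1 = [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
    f2 = [0, 0, 1, 4, 9, 16, 25, 36, 49, 64]
    u = estimate_shift(f1, f2)  # close to +1
    ```

    Real systems solve this in 2D over local windows (e.g. Lucas-Kanade) and chain the per-frame flow into camera egomotion estimates.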

    Tax Incidence in Differentiated Product Oligopoly

    We analyze the incidence of ad valorem and unit excise taxes in an oligopolistic industry with differentiated products and price-setting (Bertrand) firms. Both taxes may be passed on to consumers by more than 100 percent, and an increase in the tax rate can increase short run firm profits (and hence the long run number of firms). We provide summary conditions for these effects to arise. The conditions depend on demand curvatures and are written in elasticity form. Surprisingly, the analysis largely corroborates Cournot results with homogeneous demand.
    Keywords: excise tax, unit tax, specific tax, ad valorem tax, imperfect competition, product differentiation, Bertrand, oligopoly, tax incidence, discrete choice models
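    Pass-through above 100 percent ("over-shifting") can be seen even in the simplest imperfect-competition setting. The sketch below is an illustrative monopoly example with constant-elasticity demand (not the paper's differentiated-products Bertrand model): with demand q = p^(-eps), the profit-maximizing price is p = c*eps/(eps-1), so a unit tax added to marginal cost is passed through at rate eps/(eps-1) > 1.

    ```python
    # Hedged numerical illustration of over-shifting, using a monopoly with
    # constant-elasticity demand rather than the paper's Bertrand model.
    # All numbers are hypothetical.

    def monopoly_price(marginal_cost, eps):
        """Optimal price under demand q = p**(-eps), eps > 1."""
        return marginal_cost * eps / (eps - 1)

    eps = 3.0
    t = 0.1                                  # unit tax per item
    p0 = monopoly_price(1.0, eps)            # pre-tax price
    p1 = monopoly_price(1.0 + t, eps)        # tax raises effective marginal cost
    pass_through = (p1 - p0) / t             # = eps/(eps-1) = 1.5, i.e. 150%
    ```

    The paper's contribution is to characterize when such effects arise under Bertrand competition with differentiated products, via conditions on demand curvature in elasticity form.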

    The Efficiency of Indirect Taxes under Imperfect Competition

    This paper considers the relative efficiency of ad valorem and unit taxes in imperfectly competitive markets. We provide a simple proof that ad valorem taxes are welfare-superior to unit taxes in the short run when production costs are identical across firms. The proof covers differentiated products and a wide range of market conduct. Cost asymmetries strengthen the case for ad valorem taxation under Cournot competition, but unit taxation may be welfare-superior under Bertrand competition with product differentiation. Ad valorem taxation is superior with free entry under Cournot competition, but not necessarily under price competition when consumers value variety.
    Keywords: excise tax, unit tax, specific tax, ad valorem tax, tax efficiency, product differentiation, imperfect competition

    The Survey of Lines in M31 (SLIM): The Drivers of the [CII]/TIR Variation

    The ratio of the [CII] 158 μm emission line over the total infrared emission (TIR) is often used as a proxy for the photoelectric (PE) heating efficiency (ε_PE) of the far-ultraviolet (FUV) photons absorbed by dust in the interstellar medium. In the nearby galaxy M31, we measure a strong radial variation of [CII]/TIR that we rule out as being due to an intrinsic variation in ε_PE. [CII]/TIR fails as a proxy for ε_PE because the TIR measures all dust heating, not just the contribution from FUV photons capable of ejecting electrons from dust grains. Using extensive multiwavelength coverage from the FUV to the far-infrared (FIR), we infer the attenuated FUV emission (UV_att) and the total attenuated flux (TOT_att). We find [CII]/TIR to be strongly correlated with UV_att/TOT_att, indicating that, in M31 at least, one of the dominant drivers of [CII]/TIR variation is the relative hardness of the absorbed stellar radiation field. We define ε_PE^UV = [CII]/UV_att, which should be more closely related to the actual PE efficiency, and which we find to be essentially constant (1.85 ± 0.8%) across all explored fields in M31. This suggests that part of the observed variation of [CII]/TIR in other galaxies is likely due to a change in the relative hardness of the absorbed stellar radiation field, caused by a combination of variations in the stellar population, dust opacity, and galaxy metallicity, although the PE efficiency may also vary across a wider range of environments.
    Comment: 19 pages, 16 figures, accepted for publication in Ap
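    The two diagnostics contrasted in this abstract can be written compactly; the second equality restates the definition given in the text:

    ```latex
    \epsilon_{\mathrm{PE}} \;\sim\; \frac{[\mathrm{C\,II}]\,158\,\mu\mathrm{m}}{\mathrm{TIR}}
    \quad\text{(standard proxy; TIR includes non-photoelectric heating)}
    \qquad
    \epsilon_{\mathrm{PE}}^{\mathrm{UV}} \;\equiv\; \frac{[\mathrm{C\,II}]}{\mathrm{UV_{att}}}
    \approx 1.85\% \pm 0.8\%
    ```

    Normalizing by the attenuated FUV flux alone removes the dependence on the hardness of the absorbed radiation field, which is why ε_PE^UV stays nearly constant across the M31 fields while [CII]/TIR varies radially.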

    Subseasonal Prediction for Bloom Dates of Tart Cherries in Utah and Michigan, USA: Merging Phenological Models With CFSv2 Forecast

    Temperate fruit trees require chilling for rest completion, followed by sufficient heat accumulation for the onset of growth and bloom. The application of phenological models to predict bloom dates has been widely used in orchard management. Examples of such applications include selecting adapted cultivars less prone to early bloom, predicting needs for frost protection, and preventing damage from late spring freezes. This study merged the Utah (chill) and ASYMCUR (forcing) phenological models by combining chill units and heat units (measured in growing degree hours) to predict bloom dates of tart cherries (Prunus cerasus L.) in Utah and Michigan, the top producing states of the USA. It was found that the modified Utah model improves the estimation of chill units compared with the original one, while the original Utah model may still be suitable for use in the colder winters of Michigan (with its later bloom dates than Utah). The combined models were driven with temperatures predicted by the Climate Forecast System v2 (CFSv2) model. Predictions were issued twice a month, from 1 February to 1 May. The Utah-ASYMCUR model using forecasted temperatures from CFSv2 exhibits subseasonal skill in predicting bloom dates up to 6 weeks in advance. The predictions can offer growers a way to mitigate extreme climate anomalies.
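    The chill-then-heat logic described above can be sketched as a simple two-phase accumulator. All thresholds and the chill-effectiveness window below are hypothetical placeholders, not the calibrated Utah or ASYMCUR parameters:

    ```python
    # Illustrative sketch of chill-then-forcing bloom prediction.
    # Thresholds, the crude chill window, and the test temperatures are
    # hypothetical; the real Utah/ASYMCUR models use calibrated curves.

    def predict_bloom_hour(hourly_temps, chill_req=100.0, heat_req=500.0,
                           chill_max_temp=12.0, base_temp=4.0):
        """Return the hour index at which predicted bloom occurs, else None."""
        chill = heat = 0.0
        resting = True
        for hour, t in enumerate(hourly_temps):
            if resting:
                if 0.0 < t <= chill_max_temp:     # chill-effective hour
                    chill += 1.0
                if chill >= chill_req:
                    resting = False               # rest completed
            else:
                heat += max(t - base_temp, 0.0)   # growing degree hours
                if heat >= heat_req:
                    return hour                   # predicted bloom time
        return None

    # 150 cool hours (5 C) complete rest, then warm hours (14 C) force bloom.
    bloom = predict_bloom_hour([5.0] * 150 + [14.0] * 100)
    ```

    In the study's setting, the hourly temperatures would come from CFSv2 forecasts rather than observations, which is what makes the bloom-date prediction subseasonal.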

    Fast calibrated additive quantile regression

    We propose a novel framework for fitting additive quantile regression models, which provides well-calibrated inference about the conditional quantiles and fast automatic estimation of the smoothing parameters, for model structures as diverse as those usable with distributional GAMs, while maintaining equivalent numerical efficiency and stability. The proposed methods are at once statistically rigorous and computationally efficient, because they are based on applying the general belief-updating framework of Bissiri et al. (2016) to loss-based inference, while computing with adaptations of the stable fitting methods of Wood et al. (2016). We show that the pinball loss is statistically suboptimal relative to a novel smooth generalisation, which also gives access to fast estimation methods. Further, we provide a novel calibration method for efficiently selecting the 'learning rate' balancing the loss with the smoothing priors during inference, thereby obtaining reliable quantile uncertainty estimates. Our work was motivated by a probabilistic electricity load forecasting application, used here to demonstrate the proposed approach. The methods described here are implemented in the qgam R package, available on the Comprehensive R Archive Network (CRAN).
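    The pinball loss mentioned above, and the idea of smoothing it to make it differentiable, can be sketched as follows. The log1p-based smoothing is an illustrative choice in the spirit of the abstract, not necessarily the paper's exact generalisation:

    ```python
    # Sketch of the pinball (check) loss for quantile level tau, plus a
    # smooth, differentiable approximation. The smoothing form is an
    # illustrative assumption, not the paper's exact loss.
    import math

    def pinball(residual, tau):
        """Classic check loss: tau*r for r >= 0, (tau-1)*r otherwise."""
        return tau * residual if residual >= 0 else (tau - 1.0) * residual

    def smooth_pinball(residual, tau, sigma=0.1):
        """Differentiable approximation; recovers pinball as sigma -> 0."""
        return ((tau - 1.0) * residual
                + sigma * math.log1p(math.exp(residual / sigma)))
    ```

    Minimizing the expected pinball loss over predictions yields the tau-th conditional quantile; the smooth version keeps that target (up to O(sigma)) while enabling the gradient-based fitting and fast smoothing-parameter estimation the abstract describes.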