
    NIR Calibrations for Soybean Seeds and Soy Food Composition Analysis: Total Carbohydrates, Oil, Proteins and Water Contents

    Conventional chemical analysis techniques are expensive, time consuming, and often destructive. Non-invasive Near Infrared (NIR) technology was introduced over recent decades for wide-scale, inexpensive chemical analysis of food and crop seed composition (see Williams and Norris, 1987; Wilcox and Cavins, 1995; and Buning and Diller, 2000 for reviews of the NIR technique's development prior to 1998, when diode arrays were introduced to NIR). NIR spectroscopic measurements obey the Beer-Lambert law, so quantitative measurements can be made with high speed and ease of operation. NIR has been used in a great variety of food applications; the products analyzed come from all sectors of the food industry, including meats, grains, and dairy products (Shadow, 1998).
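The quantitative basis mentioned above, the Beer-Lambert law A = ε·l·c, makes NIR calibration essentially a regression of reference concentrations on measured absorbances. The following is a minimal single-wavelength sketch with synthetic, illustrative numbers (the function names and data are not from the paper); real NIR calibrations use many wavelengths and multivariate methods such as PLS.

```python
# Illustrative Beer-Lambert calibration sketch: A = epsilon * l * c implies a
# linear relation between absorbance and concentration, so a least-squares fit
# against reference values yields a calibration line.  Synthetic data only.

def fit_calibration(absorbances, concentrations):
    """Ordinary least squares fit of c = intercept + slope * A."""
    n = len(absorbances)
    mean_a = sum(absorbances) / n
    mean_c = sum(concentrations) / n
    sxx = sum((a - mean_a) ** 2 for a in absorbances)
    sxy = sum((a - mean_a) * (c - mean_c)
              for a, c in zip(absorbances, concentrations))
    slope = sxy / sxx
    intercept = mean_c - slope * mean_a
    return intercept, slope

def predict(intercept, slope, absorbance):
    """Predict a constituent concentration from a measured absorbance."""
    return intercept + slope * absorbance

# Perfectly linear synthetic data with c = 2*A (i.e. epsilon * l = 0.5).
A = [0.10, 0.20, 0.30, 0.40]
C = [0.20, 0.40, 0.60, 0.80]
b0, b1 = fit_calibration(A, C)
```

In practice the calibration is validated on held-out reference samples before the instrument replaces wet-chemistry analysis.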

    A Conversation with Eric Ghysels

    Published in Econometric Theory, 2012, https://doi.org/10.1017/S026646661100017X

    Acceptance sampling plan for multiple manufacturing lines using EWMA process capability index

    The problem of developing a product acceptance procedure for multiple quality characteristics has attracted quality assurance practitioners. Because of high consumer demand, it may not be possible to deliver the quantity ordered on time from a single manufacturing line, so factories manufacture the product on multiple lines and combine the output. In this manuscript, we present the design of an acceptance sampling plan for products from multiple independent manufacturing lines using an exponentially weighted moving average (EWMA) statistic of the process capability index. The plan parameters, namely the sample size and the acceptance number, are determined by satisfying both the producer's and the consumer's risks. The efficiency of the proposed plan is compared with that of an existing sampling plan. Tables are given for industrial use and explained with the help of industrial examples. We conclude that using the proposed plan in these industries minimizes the cost and time of inspection, since a smaller sample size means lower inspection cost. Extending the proposed plan to non-normal distributions, and determining the sampling plan from a cost model, are areas for future research. © 2017 The Japan Society of Mechanical Engineers.
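The core mechanism the abstract describes, smoothing successive estimated capability indices with an EWMA and accepting a lot when the smoothed statistic clears a limit, can be sketched as follows. The smoothing weight `lam` and acceptance limit `k` here are illustrative placeholders, not the paper's tabulated plan parameters, which are chosen to satisfy the producer's and consumer's risks.

```python
# Hedged sketch of an EWMA-of-capability-index acceptance rule: smooth the
# per-lot estimated Cpk values with an exponentially weighted moving average,
# then accept a lot when the smoothed statistic meets a limit k.
# lam and k are illustrative values, not the paper's designed parameters.

def ewma_capability(cpk_estimates, lam=0.2, start=None):
    """Return the EWMA sequence: e_t = lam * cpk_t + (1 - lam) * e_{t-1}."""
    ewma = cpk_estimates[0] if start is None else start
    out = []
    for cpk in cpk_estimates:
        ewma = lam * cpk + (1 - lam) * ewma
        out.append(ewma)
    return out

def accept_lot(ewma_value, k=1.33):
    """Accept when the smoothed capability index meets the limit k."""
    return ewma_value >= k

# Example: five lots' estimated Cpk values (synthetic).
cpk_by_lot = [1.40, 1.35, 1.50, 1.20, 1.45]
smoothed = ewma_capability(cpk_by_lot)
decisions = [accept_lot(v) for v in smoothed]
```

Because the EWMA pools information across lots, a single low Cpk estimate (1.20 above) need not trigger rejection, which is what allows the designed plans to reach a given risk level with a smaller sample size.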

    Control of Four-Level Quantum Coherence via Discrete Spectral Shaping of an Optical Frequency Comb

    We present an experiment demonstrating high-resolution coherent control of a four-level atomic system in a closed (diamond) type configuration. A femtosecond frequency comb is used to establish phase coherence between a pair of two-photon transitions in cold Rb atoms. By controlling the spectral phase of the frequency comb we demonstrate the optical phase sensitive response of the diamond system. The high-resolution state selectivity of the comb is used to demonstrate the importance of the signs of dipole moment matrix elements in this type of closed-loop excitation. Finally, the pulse shape is optimized, resulting in a 256% increase in the two-photon transition rate by forcing constructive interference between the mode pairs detuned from an intermediate resonance. Comment: 5 pages, 4 figures; submitted to Physical Review Letters.

    Dating the Timeline of Financial Bubbles during the Subprime Crisis

    A new recursive regression methodology is introduced to analyze the bubble characteristics of various financial time series during the subprime crisis. The methods modify a technique proposed in Phillips, Wu and Yu (2010) and provide a technology for identifying bubble behavior and consistently dating its origination and collapse. The tests also serve as an early warning diagnostic of bubble activity. Seven relevant financial series are investigated, including three financial assets (the Nasdaq index, home price index and asset-backed commercial paper), two commodities (the crude oil price and platinum price), one bond rate (Baa), and one exchange rate (Pound/USD). Statistically significant bubble characteristics are found in all of these series. The empirical estimates of the origination and collapse dates suggest an interesting migration mechanism among the financial variables: a bubble first emerged in the equity market during mid-1995 lasting to the end of 2000, followed by a bubble in the real estate market between January 2001 and July 2007 and in the mortgage market between November 2005 and August 2007. After the subprime crisis erupted, the phenomenon migrated selectively into the commodity market and the foreign exchange market, creating bubbles which subsequently burst at the end of 2008, just as the effects on the real economy and economic growth became manifest. Our empirical estimates of the origination and collapse dates match well with the timeline of the crisis put forward in a recent study by Caballero, Farhi and Gourinchas (2008). Keywords: Financial bubbles, crashes, date stamping, explosive behavior, mildly explosive process, subprime crisis, timeline.
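The recursive idea behind this family of tests can be illustrated in a stripped-down form: run the ADF-style regression Δy_t = a + b·y_{t-1} + e_t over forward-expanding subsamples and track the t-statistic on b; a large positive supremum signals explosive (bubble-like) behavior. This sketch omits lag augmentation, the paper's date-stamping rule, and its critical values, so it is schematic rather than the authors' actual procedure.

```python
import math

# Simplified sup-ADF-style recursion: regress the first difference of y on a
# constant and the lagged level over expanding windows and take the largest
# t-statistic on the lag coefficient.  Illustrative only; no lag augmentation
# or formal critical values.

def adf_tstat(y):
    """t-statistic on b in the OLS regression dy_t = a + b*y_{t-1} + e_t."""
    x = y[:-1]                                        # lagged level
    z = [y[t + 1] - y[t] for t in range(len(y) - 1)]  # first difference
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxz = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z))
    b = sxz / sxx
    a = mz - b * mx
    resid = [zi - a - b * xi for xi, zi in zip(x, z)]
    s2 = sum(e * e for e in resid) / (n - 2)          # residual variance
    return b / math.sqrt(s2 / sxx)

def sup_recursive_stat(y, min_window=20):
    """Supremum of t-statistics over forward-expanding windows."""
    return max(adf_tstat(y[:tau]) for tau in range(min_window, len(y) + 1))

# Deterministic toy series: one mildly explosive, one mean-reverting.
y_explosive, y_stationary = [1.0], [1.0]
for t in range(1, 80):
    y_explosive.append(1.05 * y_explosive[-1] + 0.1 * math.sin(t))
    y_stationary.append(0.5 * y_stationary[-1] + math.sin(t))
```

On the explosive series the recursive statistic is large and positive, while on the mean-reverting series it stays negative, which is the contrast the date-stamping procedure exploits.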

    A Two-Stage Realized Volatility Approach to Estimation of Diffusion Processes with Discrete

    This paper motivates and introduces a two-stage method of estimating diffusion processes based on discretely sampled observations. In the first stage we make use of the feasible central limit theory for realized volatility, as developed in Jacod (1994) and Barndorff-Nielsen and Shephard (2002), to provide a regression model for estimating the parameters in the diffusion function. In the second stage the in-fill likelihood function is derived by means of the Girsanov theorem and then used to estimate the parameters in the drift function. Consistency and asymptotic distribution theory for these estimates are established in various contexts. The finite sample performance of the proposed method is compared with that of the approximate maximum likelihood method of Aït-Sahalia (2002). Keywords: Maximum likelihood, Girsanov theorem, discrete sampling, continuous record, realized volatility.
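The first-stage building block, realized variance, is simply the sum of squared intraday log-price increments, which estimates the integrated variance over the sampling period. A minimal sketch under the simplest assumption of a constant-diffusion model dX = μ dt + σ dW (my simplification, not the paper's general diffusion function):

```python
# Sketch of the first-stage quantity: realized variance, the sum of squared
# intraday log-price increments, estimates integrated variance.  Under a
# constant-volatility model, RV divided by the time span estimates sigma^2.

def realized_variance(log_prices):
    """Sum of squared log-price increments over one sampling period."""
    return sum((log_prices[i + 1] - log_prices[i]) ** 2
               for i in range(len(log_prices) - 1))

def sigma2_estimate(log_prices, horizon=1.0):
    """Constant-volatility estimate: RV scaled by the time span."""
    return realized_variance(log_prices) / horizon

# Deterministic check: 100 increments of +/-0.01 give RV = 100 * 0.0001.
log_prices = [0.0]
for i in range(100):
    log_prices.append(log_prices[-1] + (0.01 if i % 2 == 0 else -0.01))
```

In the paper's setting such RV measurements feed a regression that estimates the diffusion-function parameters; the drift parameters are then handled separately in the second, likelihood-based stage.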

    Comment on “Realized Variance and Market Microstructure Noise” by Peter R. Hansen and Asger Lunde

    We find ourselves very much in agreement with the thrust of HL’s message concerning the complexity induced by microstructure noise. In particular, we agree that noise is time dependent and correlated with the efficient price, features that in our view are a necessary consequence of the observed form of market transactions, as we have argued above, and that the properties of noise inevitably evolve over time, again just as the efficient price is itself evolutionary. We further agree that microstructure noise cannot be accommodated by simple specifications. Since microstructure noise at ultra-high infill sampling frequencies often offsets the actual transaction data from the latent efficient price, the complexity of microstructure noise includes local nonstationarity and perfect correlation with the efficient price. These are properties that are not permitted in the models and methods presently used in the literature. However, there are empirical procedures that are capable of addressing these additional complexities, as we have indicated in parts of our discussion. We join the authors in saying there is still much to do in this exciting field and we look forward to further developments that build on the work they and others have done recently.
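Why ultra-high-frequency sampling is dominated by noise can be shown with a deliberately schematic example of my own construction: a flat latent efficient price observed through a bid-ask-style alternating perturbation of size ω. Tick-by-tick realized variance is then pure noise (each squared return is 4ω²), while sparser sampling removes it entirely in this stylized case.

```python
# Toy illustration of noise-induced bias in realized variance: a flat
# efficient log price plus an alternating +/-omega perturbation (a crude
# bid-ask bounce).  Tick-by-tick RV grows linearly with the sample size,
# while sampling every other observation cancels the alternating noise.

def realized_variance(prices):
    """Sum of squared price increments."""
    return sum((prices[i + 1] - prices[i]) ** 2 for i in range(len(prices) - 1))

omega = 0.005                       # half spread (illustrative value)
efficient = [0.0] * 200             # flat latent efficient log price
observed = [p + (omega if i % 2 == 0 else -omega)
            for i, p in enumerate(efficient)]

rv_every_tick = realized_variance(observed)       # 199 returns of (2*omega)^2
rv_subsampled = realized_variance(observed[::2])  # alternating noise cancels
```

Real noise is of course not this tidy; the point of the discussion above is precisely that its time dependence, evolution, and correlation with the efficient price rule out such simple specifications.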