213 research outputs found

    Analog and digital worlds: Part 2. Fourier analysis in signals and data treatment

    The most direct purpose of the Fourier Transform (FT) is to give an alternative representation of a signal: from the original domain to the corresponding frequency domain. The original domain can be time, space or any other independent variable used as the domain of the function. This subject was treated in Part 1 [1]. In particular, the FT of a signal, also referred to as its frequency spectrum, was used to calculate the lowest sampling frequency that provides a correct representation of the signal itself. At the beginning of this contribution, it is illustrated how to apply the so-called windowing process to periodic sequences. Then, the meaning of the operations known as convolution and deconvolution is discussed. It is shown how the FT provides a very effective path to performing these operations in the alternative domain by employing the convolution theorem. Finally, the application of convolution and deconvolution to experimental signals associated with the 'spontaneous' convolution of two concurrent events is analysed through different examples.
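    The convolution theorem mentioned above can be sketched numerically: convolving two signals in the original domain equals multiplying their Fourier transforms in the frequency domain. The signal and kernel below are made-up examples, not data from the article.

```python
import numpy as np

# Illustrative signal and smoothing kernel (invented for this sketch).
x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([0.25, 0.5, 0.25])

# Zero-pad both to the full linear-convolution length so the circular
# FFT-based convolution matches the linear one.
n = len(x) + len(h) - 1
conv_fft = np.real(np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(h, n)))

conv_direct = np.convolve(x, h)   # direct convolution for comparison
print(np.allclose(conv_fft, conv_direct))  # True
```

    Deconvolution follows the same path in reverse: dividing, instead of multiplying, the transforms in the frequency domain (in practice with regularization, since noise is amplified where the kernel spectrum is small).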

    Expert System for Bomb Factory Detection by Networks of Advance Sensors

    Abstract: (1) Background: Police forces and security administrations nowadays consider improvised explosives (IEs) a major threat. The chemical substances used to prepare IEs are called precursors, and their presence could allow police forces to locate a bomb factory where the ongoing manufacturing of IEs is carried out. (2) Methods: An expert system was developed and tested for handling signals from a network of sensors, allowing an early warning. The expert system allows the detection of one precursor based on the signal provided by a single sensor, the detection of one precursor based on the signals provided by more than one sensor, and the production of a global alarm level based on data fusion from all the sensors of the network. (3) Results: The expert system was tested at the Italian Air Force base of Pratica di Mare (Italy) and at the Swedish Defence Research Agency (FOI) in Grindsjön (Sweden). (4) Conclusion: The performance of the expert system was successfully evaluated under relevant environmental conditions. The approach used in its development allows maximum flexibility in the integration of the response provided by any sensor, making it easy to include new sensors in the network.
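    The two-level alarm logic described in the abstract (per-precursor detection, then network-wide fusion) might be sketched as follows. The actual expert-system rules are not given in the abstract, so the fusion functions below (mean per precursor, maximum over precursors) are illustrative assumptions only.

```python
# Hypothetical sketch of a two-level alarm fusion; rules are assumed, not
# taken from the published expert system.
def precursor_alarm(sensor_levels):
    """Fuse the alarm levels (0..1) of all sensors responding to one precursor."""
    return sum(sensor_levels) / len(sensor_levels)

def global_alarm(network_readings):
    """Fuse per-precursor alarms from the whole network into one global level."""
    return max(precursor_alarm(levels) for levels in network_readings.values())

readings = {
    "precursor_A": [0.2, 0.4],   # two sensors respond to precursor A
    "precursor_B": [0.9],        # one sensor responds to precursor B
}
print(global_alarm(readings))    # 0.9
```

    Any real data-fusion rule (weighted voting, fuzzy inference, Bayesian update) could replace the mean/max pair without changing the overall structure, which is what gives the approach its flexibility toward new sensors.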

    Practical comparison of sparse methods for classification of Arabica and Robusta coffee species using near infrared hyperspectral imaging

    In the present work, sparse-based methods are applied to the analysis of hyperspectral images with the aim of studying their suitability for variable selection in a classification framework. The key aspect of sparse methods is the possibility of performing variable selection by forcing the model coefficients related to irrelevant variables to zero. In particular, two different sparse classification approaches, i.e. sPCA+kNN and sPLS-DA, were compared with the corresponding classical methods (PCA+kNN and PLS-DA) to classify Arabica and Robusta coffee species. Green coffee samples were analyzed using near infrared hyperspectral imaging, and the average spectra from each hyperspectral image were used to build training and test sets; furthermore, a test image was used to evaluate the performance of the considered methods at pixel level. In our case, sparse methods led to results similar to those of the classical methods, with the advantage of yielding more interpretable and parsimonious models. An important result to highlight is that variable selection performed with the two different sparse classification approaches converged on the same spectral regions, which implies the chemical relevance of those regions in the discrimination of Arabica and Robusta coffee species.
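    The mechanism that makes these methods "sparse" can be shown in a few lines: a soft-thresholding operator shrinks the model weights and forces the small ones to exactly zero, which is what turns a latent-variable model into a variable selector. The weights below are invented for illustration, not taken from the coffee data.

```python
import numpy as np

def soft_threshold(w, lam):
    """Shrink weights toward zero; entries with |w| <= lam become exactly 0."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([0.9, 0.05, -0.6, 0.02, 0.4])   # hypothetical loading weights
w_sparse = soft_threshold(w, lam=0.1)
# the 0.05 and 0.02 entries are forced to exactly zero: those variables
# drop out of the model, and the surviving ones are the selected regions
print(w_sparse)
```

    In sPCA and sPLS-DA this operator (or an L1 penalty equivalent to it) is applied to the loadings or weight vectors at each component extraction step.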

    Fast exploration and classification of large hyperspectral image datasets for early bruise detection on apples

    Hyperspectral imaging makes it possible to acquire tens of thousands of spectra for a single sample in a few seconds; though valuable, this data richness poses many problems due to the difficulty of handling a representative number of samples altogether. For this reason, we recently proposed an approach based on the idea of reducing each image to a one-dimensional signal, named a hyperspectrogram, which accounts for both spatial and spectral information. In this manner, a dataset of hyperspectral images can be easily and quickly converted into a set of signals (a 2D data matrix), which in turn can be analyzed using classical chemometric techniques. In this work, the hyperspectrograms obtained from a dataset of 800 NIR-hyperspectral images of two different apple varieties were used to discriminate bruised from sound apples, with iPLS-DA as the variable selection algorithm, which allowed the presence of bruises to be detected efficiently. Moreover, the reconstruction as images of the selected variables confirmed that the automated procedure led to the exact identification of the spatial features related to the onset of the bruise defect and to its subsequent evolution with time.
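    The reduction of an image cube to a single signal can be sketched on a synthetic cube. This is only an illustration of the principle (concatenating a spectral summary with a spatial one); the published hyperspectrogram is built from frequency distributions of score images, and the details below are simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)
cube = rng.random((32, 32, 50))           # synthetic: 32x32 pixels, 50 bands

pixels = cube.reshape(-1, cube.shape[2])  # one spectrum per pixel
mean_spectrum = pixels.mean(axis=0)       # spectral information

# First principal-component scores capture spatial contrast between pixels.
centred = pixels - mean_spectrum
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[0]

# The histogram of the scores summarises the spatial distribution.
hist, _ = np.histogram(scores, bins=20)

# One 1-D signal per image: a whole image dataset becomes a 2D data matrix.
hyperspectrogram = np.concatenate([mean_spectrum, hist / hist.sum()])
print(hyperspectrogram.shape)             # (70,)
```

    Stacking one such vector per image yields the samples-by-variables matrix on which classical chemometric tools (PCA, PLS-DA, iPLS-DA) operate directly.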

    Use of multivariate image analysis for the evaluation of total mixed rations in dairy cow feeding

    Multivariate image analysis was applied for the evaluation of total mixed rations (TMR) used in dairy cow feeding. The estimation of the correlations between images and chemical-physical traits of TMR was performed.

    Simulation of an experimental database of infrared spectra of complex gaseous mixtures for detecting specific substances. The case of drug precursors

    This work is motivated by the need to develop suitable databases in the absence of real experimental data, for instance when spectra measured with newly developed instrumentation on real samples are not yet available. Even so, the realization of the physical project should be guided by a starting database, which is also invaluable for testing its effectiveness. In this article we face the issue of simulating gas-mixture spectra for the development of a new sensor for External Cavity-Quantum Cascade Laser Photoacoustic Spectroscopy (EC-QCLPAS), starting from literature FT-IR spectra of pure components: a dataset is built that realistically represents the ensemble of spectra of the gas mixtures of interest. The informative data deriving from the literature spectra were combined with the stochastic component extracted from a sample spectrum recorded with a prototype instrument, allowing us to build a matrix containing thousands of simulated spectra of gaseous mixtures, accounting for the presence of different components at different concentrations. Signal processing and experimental design techniques were used along the whole path leading to the dataset of simulated spectra. In particular, the construction of the database is aimed at the development of a final system to detect drug precursors in the vapour phase. The comparison of some EC-QCLPAS spectra with the corresponding simulated signals confirms the validity of the proposed approach.
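    The simulation principle described above reduces to generating mixture spectra as concentration-weighted sums of pure-component spectra plus a stochastic component. The pure spectra, concentration levels and noise scale below are random stand-ins for the literature FT-IR data, the experimental design and the prototype-derived noise, respectively.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands, n_components, n_mixtures = 200, 3, 1000

pure = rng.random((n_components, n_bands))                  # stand-in pure spectra
concs = rng.uniform(0.0, 1.0, (n_mixtures, n_components))   # designed levels
noise = 0.01 * rng.standard_normal((n_mixtures, n_bands))   # stochastic part

# Beer-Lambert-style additive mixing: each row is one simulated mixture.
mixtures = concs @ pure + noise
print(mixtures.shape)   # (1000, 200)
```

    In the real workflow the concentration matrix would come from an experimental design rather than uniform sampling, and the noise term from the stochastic component extracted from a measured prototype spectrum.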

    Characterization of common wheat flours (Triticum aestivum L.) through multivariate analysis of conventional rheological parameters and gluten peak test indices

    The GlutoPeak test consists in high-speed mixing of a small amount of wheat flour (<10 g) with water, registering a torque vs. time curve in a very short time (<10 min). Peak torque, peak maximum time, and energy values are calculated from the curve and used to estimate the aggregation behavior of gluten. The information brought by the GlutoPeak indices is still difficult to interpret correctly, also in relation to the conventional approaches in the field of cereal science. A multivariate approach was used to investigate the correlations existing between the GlutoPeak indices and the conventional rheological parameters. 120 wheat flours - differing in protein, dough stability, extensibility, tenacity, strength, and end-uses - were analyzed using the GlutoPeak and conventional instrumentation. The parameters were first subjected to a data exploration step through Principal Component Analysis. Then, multivariate Partial Least Squares Regression (PLSR) models were developed using the GlutoPeak indices to predict the conventional parameters. The squared correlation coefficients in prediction of an external test set showed that acceptable to good results (0.61 ≤ R2PRED ≤ 0.96) were obtained for 18 out of the 26 conventional parameters considered here.
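    The R2PRED figure of merit used above can be computed from external test-set residuals. This is a generic sketch of the metric, not the article's code, and the values below are invented.

```python
import numpy as np

def r2_pred(y_true, y_pred):
    """Squared correlation coefficient in prediction on an external test set."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)       # prediction residuals
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = [1.0, 2.0, 3.0, 4.0]   # measured conventional parameter (invented)
y_pred = [1.1, 1.9, 3.2, 3.8]   # PLSR predictions (invented)
print(round(r2_pred(y_true, y_pred), 3))   # 0.98
```

    Values in the reported 0.61-0.96 range would indicate that the GlutoPeak indices carry from acceptable to nearly complete information about the corresponding conventional parameter.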

    An application of the Z-Box method in dairy cow feeding to estimate the relationships among peNDF, other feed variables and productive data

    Physically effective NDF (peNDF) is defined as the fraction of fibre that stimulates chewing and contributes to the floating mat of large particles in the rumen, and consequently to its regular activity. peNDF is calculated from a physical effectiveness factor (pef), varying from 0 (NDF stimulates no chewing) to 1 (maximum chewing), which may be obtained by laboratory-based particle sizing techniques, such as the Penn State Particle Separator, Mertens Separator, Z-Box and Cut Accuracy Test, based on the proportion of DM retained on sieves (by horizontal or vertical shaking). We chose the Z-Box method, thanks to its ease of use and applicability to as-is feed and total mixed rations (TMR), and we are trying to obtain an estimating equation which may predict milk fat content and/or other productive data from peNDF and other variables measured on TMR. To this aim, samples of TMR collected from several farms are sieved (3 sub-samples each) and undergo proximate analysis, NDF, ADF, ADL and starch determination. Milk yield, milk fat and water addition to TMR are collected on farm; qualitative data such as type of forage, breed, season, geographical origin and altitude (plain/hill/mountain) are also taken into account, to estimate their possible effect. As a first step, in order to investigate the complex relationships existing among this wide set of variables, Principal Component Analysis (PCA) is used as a data exploration tool. Two PCA models (presence or absence of silage in TMR) are calculated separately. For each PCA model, the overall correlations among all the considered variables and their relative importance are investigated by means of the loadings plots, paying particular attention to the correlations with peNDF and with milk fat. Moreover, it is also possible to identify how the different groups of samples depend on specific variables.
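    The PCA exploration step described above can be sketched in a few lines via the singular value decomposition of the autoscaled data matrix. The matrix below is a random stand-in for the TMR variables (peNDF, proximate analysis, milk traits, etc.).

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.random((40, 8))             # 40 TMR samples x 8 measured variables

Xc = X - X.mean(axis=0)             # mean-centre each variable
Xs = Xc / Xc.std(axis=0, ddof=1)    # autoscale to unit variance

u, s, vt = np.linalg.svd(Xs, full_matrices=False)
scores = u * s                      # sample coordinates (scores plot)
loadings = vt.T                     # variable contributions (loadings plot)
explained = s**2 / np.sum(s**2)     # variance fraction explained by each PC

print(scores.shape, loadings.shape) # (40, 8) (8, 8)
```

    Correlated variables sit close together (or diametrically opposite, if anticorrelated) in the loadings plot, which is how the correlations of peNDF with milk fat and the other variables would be read off.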

    Development of Quantitative Structure-Property Relationships (QSPR) using calculated descriptors for the prediction of the physico-chemical properties (nD, ρ, bp, ε and η) of a series of organic solvents.

    Quantitative structure-property relationship (QSPR) models were derived for predicting the boiling point (at 760 mmHg), density (at 25 °C), viscosity (at 25 °C), static dielectric constant (at 25 °C), and refractive index (at 20 °C) of a series of pure organic solvents of structural formula X-CH2CH2-Y. A very large number of calculated molecular descriptors were derived by quantum chemical methods, molecular topology, and molecular geometry using the CODESSA software package. A comparative analysis of the multiple linear regression techniques (heuristic and best multilinear regression) implemented in CODESSA with the multivariate PLS/GOLPE method was carried out. The performance of the different regression models was evaluated by the standard deviation of the prediction errors, calculated for the compounds of both the training set (internal validation) and the test set (external validation). Satisfactory QSPR models, from both the predictive and the interpretative points of view, were obtained for all the studied properties.
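    The validation strategy described above (fit on a training set, judge by the standard deviation of prediction errors on an external test set) can be sketched generically with a multiple linear regression. The descriptors and property values below are synthetic stand-ins, not CODESSA output.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.random((60, 4))                          # 60 molecules x 4 descriptors
beta_true = np.array([2.0, -1.0, 0.5, 3.0])      # underlying relation (invented)
y = X @ beta_true + 0.05 * rng.standard_normal(60)

X_train, y_train = X[:45], y[:45]                # training set
X_test, y_test = X[45:], y[45:]                  # external test set

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X_train)), X_train])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Standard deviation of the prediction errors on the external test set.
y_hat = np.column_stack([np.ones(len(X_test)), X_test]) @ coef
sdep = np.sqrt(np.mean((y_test - y_hat) ** 2))
print(sdep)
```

    In the actual study, variable selection over the very large descriptor pool (heuristic or best-multilinear search in CODESSA, or latent variables in PLS/GOLPE) precedes this fit-and-validate step.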