
    Multi-scale gapped smoothing algorithm for robust baseline-free damage detection in optical infrared thermography

    Flash thermography is a promising technique for rapid non-destructive testing of composite materials. However, the approach suffers from several inherent difficulties, such as non-uniform heating, measurement noise, and lateral heat-diffusion effects, so advanced signal-processing techniques are indispensable for analyzing the recorded dataset. One such technique is the Gapped Smoothing Algorithm, which predicts a pixel's value in the sound (undamaged) state from a measurement of the defective state by evaluating only the pixel's neighbors. The standard Gapped Smoothing Algorithm, however, uses a fixed spatial gap size, which makes it difficult to detect defects of varying size in a noisy dataset. In this paper, a Multi-Scale Gapped Smoothing Algorithm (MSGSA) is introduced as a baseline-free image-processing technique that extends the standard Gapped Smoothing Algorithm. The MSGSA evaluates a wide range of spatial gap sizes so that defects of widely different dimensions are identified. Moreover, it is shown that a weighted combination of all assessed gap sizes significantly improves the detectability of defects and yields an (almost) zero-reference background, effectively suppressing measurement noise and excitation non-uniformity. The efficiency of the MSGSA technique is evaluated and confirmed through numerical simulation and flash-thermography experiments on carbon fiber reinforced polymers with various defect sizes.
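    The gapped-smoothing idea lends itself to a compact illustration. The Python sketch below is a minimal rendition of the multi-scale scheme, assuming a box-average predictor over an annulus of neighbours rather than the polynomial surface fit of the standard algorithm; the peak-response weighting across scales is likewise an assumption, not the paper's exact formulation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def gapped_smoothing_index(image, gap):
    """Residual between each pixel (float image) and its prediction from an
    annulus of neighbours excluding a central (2*gap+1)^2 'gapped' region.
    Boundaries are handled by scipy's default reflect mode."""
    outer = 2 * gap + 3                        # window including the annulus
    inner = 2 * gap + 1                        # central gapped window
    sum_outer = uniform_filter(image, outer) * outer ** 2
    sum_inner = uniform_filter(image, inner) * inner ** 2
    predicted = (sum_outer - sum_inner) / (outer ** 2 - inner ** 2)
    return np.abs(image - predicted)

def multi_scale_gsa(image, gaps=(1, 2, 4, 8)):
    """Combine damage indices over many gap sizes. Weighting each scale by
    its peak response is an assumption, not the paper's formulation."""
    indices = [gapped_smoothing_index(image, g) for g in gaps]
    weights = [idx.max() for idx in indices]
    return sum(w * i for w, i in zip(weights, indices)) / sum(weights)
```

    Pixels whose combined residual stands out against the near-zero background are flagged as defect candidates, regardless of whether their size matches any single gap setting.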

    Adaptive Smoothing in fMRI Data Processing Neural Networks

    Functional Magnetic Resonance Imaging (fMRI) relies on multi-step data-processing pipelines to accurately determine brain activity, among them the crucial step of spatial smoothing. These pipelines are commonly suboptimal because they rely on a local optimisation strategy, treating each step in isolation. With the advent of new tools for deep learning, recent work has proposed turning these pipelines into end-to-end learning networks. This change of paradigm offers new avenues for improvement, as it allows for global optimisation. The current work aims to benefit from this paradigm shift by defining the smoothing step as a layer in these networks that adaptively modulates the degree of smoothing required by each brain volume to better accomplish a given data-analysis task. Viability is evaluated on real fMRI data in which subjects alternated between left- and right-hand finger-tapping tasks. Comment: 4 pages, 3 figures, 1 table, IEEE 2017 International Workshop on Pattern Recognition in Neuroimaging (PRNI).
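    As a rough illustration of such a layer, the PyTorch sketch below implements separable Gaussian smoothing whose width is a learnable parameter, so the task loss can modulate the degree of smoothing end to end. The module name, the fixed kernel support, and the single-channel volume input are assumptions for the sketch, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveSmoothing(nn.Module):
    """Gaussian smoothing with a learnable width: the task loss,
    back-propagated through the network, tunes how much each volume is blurred."""

    def __init__(self, init_sigma=1.0, kernel_size=7):
        super().__init__()
        self.log_sigma = nn.Parameter(torch.log(torch.tensor(init_sigma)))
        self.kernel_size = kernel_size

    def forward(self, x):                      # x: (batch, 1, D, H, W)
        sigma = self.log_sigma.exp()           # keep sigma positive
        half = self.kernel_size // 2
        coords = torch.arange(-half, half + 1, device=x.device, dtype=x.dtype)
        g = torch.exp(-coords ** 2 / (2 * sigma ** 2))
        g = g / g.sum()                        # normalised separable 1-D kernel
        x = F.conv3d(x, g.view(1, 1, -1, 1, 1), padding=(half, 0, 0))
        x = F.conv3d(x, g.view(1, 1, 1, -1, 1), padding=(0, half, 0))
        x = F.conv3d(x, g.view(1, 1, 1, 1, -1), padding=(0, 0, half))
        return x
```

    Parameterising the width through its logarithm keeps sigma strictly positive without constrained optimisation, which is a common trick for learnable scale parameters.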

    Regularized brain reading with shrinkage and smoothing

    Functional neuroimaging measures how the brain responds to complex stimuli. However, sample sizes are modest, noise is substantial, and stimuli are high-dimensional. Hence, direct estimates are inherently imprecise and call for regularization. We compare a suite of approaches that regularize via shrinkage: ridge regression, the elastic net (a generalization of ridge regression and the lasso), and a hierarchical Bayesian model based on small area estimation (SAE). We contrast regularization with spatial smoothing and with combinations of smoothing and shrinkage. All methods are tested on functional magnetic resonance imaging (fMRI) data from multiple subjects participating in two different experiments related to reading, both for predicting neural response to stimuli and for decoding stimuli from responses. Interestingly, when the regularization parameters are chosen by cross-validation independently for every voxel, weak regularization is chosen in voxels where classification accuracy is high and strong regularization where it is low, indicating that the regularization intensity is a good tool for identifying voxels relevant to the cognitive task. Surprisingly, all the regularization methods work about equally well, suggesting that beating basic smoothing and shrinkage will take not only clever methods, but also careful modeling. Comment: Published at http://dx.doi.org/10.1214/15-AOAS837 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
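    The per-voxel cross-validation finding is easy to reproduce in outline. The scikit-learn sketch below fits a ridge model for each voxel with its own cross-validated penalty; the data shapes and the median threshold are illustrative assumptions, and the elastic net or the paper's SAE model would slot into the same loop.

```python
import numpy as np
from sklearn.linear_model import RidgeCV

# Hypothetical shapes: X holds stimulus features per trial, Y the voxel
# responses; real data would come from the fMRI experiments in the paper.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 50))             # (n_trials, n_features)
Y = rng.standard_normal((120, 200))            # (n_trials, n_voxels)

alphas = np.logspace(-2, 4, 13)
chosen = np.empty(Y.shape[1])
for v in range(Y.shape[1]):
    # Cross-validated ridge penalty selected independently for this voxel
    chosen[v] = RidgeCV(alphas=alphas).fit(X, Y[:, v]).alpha_

# Per the paper's observation, voxels assigned a weak penalty are
# candidates for task-relevant voxels.
relevant = chosen < np.median(chosen)
```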

    Signal Processing Spreads a Voxel’s Temporal Frequency Task-Activated Peak and Induces Spatial Correlations in Dual-Task Complex-Valued fMRI
