    Molecular Surface Mesh Generation by Filtering Electron Density Map

    Bioinformatics applied to macromolecules is now widespread and in continuous expansion. In this context, representing external molecular surfaces such as the Van der Waals surface or the Solvent Excluded Surface is useful for several applications. We propose a fast, parameterizable algorithm that produces good visual quality meshes representing molecular surfaces. The mesh is obtained by isosurfacing a filtered electron density map. The density map is computed as the pointwise maximum of Gaussian functions placed around atom centers. This map is filtered by an ideal low-pass filter applied to the Fourier transform of the density map. Applying the marching cubes algorithm to the inverse transform provides a mesh representation of the molecular surface.
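    The pipeline described above (max-of-Gaussians density map, ideal low-pass filter in the Fourier domain, then isosurfacing) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the grid size, extent, Gaussian width, and cutoff parameters are assumptions.

    ```python
    import numpy as np

    def density_map(centers, grid_size=32, extent=8.0, sigma=1.0):
        """Electron-density-like map: pointwise max of Gaussians at atom centers."""
        axis = np.linspace(-extent / 2, extent / 2, grid_size)
        X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
        rho = np.zeros((grid_size,) * 3)
        for cx, cy, cz in centers:
            d2 = (X - cx) ** 2 + (Y - cy) ** 2 + (Z - cz) ** 2
            rho = np.maximum(rho, np.exp(-d2 / (2 * sigma ** 2)))
        return rho

    def ideal_lowpass(rho, cutoff=0.25):
        """Ideal low-pass filter: zero all Fourier coefficients whose radial
        frequency exceeds `cutoff` (as a fraction of the Nyquist frequency)."""
        F = np.fft.fftn(rho)
        freqs = np.fft.fftfreq(rho.shape[0])
        fx, fy, fz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
        mask = np.sqrt(fx ** 2 + fy ** 2 + fz ** 2) <= cutoff * 0.5
        return np.real(np.fft.ifftn(F * mask))
    ```

    The smoothed volume returned by `ideal_lowpass` would then be passed to a marching cubes routine (for instance `skimage.measure.marching_cubes` with a chosen iso-level) to extract the surface mesh.
    
    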

    Post-Reconstruction Deconvolution of PET Images by Total Generalized Variation Regularization

    Improving the quality of positron emission tomography (PET) images, which suffer from low resolution and a high level of noise, is a challenging task in nuclear medicine and radiotherapy. This work proposes a restoration method applied after tomographic reconstruction of the images, targeting clinical situations where raw data are often not accessible. Based on inverse problem methods, our contribution introduces the recently developed total generalized variation (TGV) norm to regularize PET image deconvolution. Moreover, we stabilize this procedure with additional image constraints such as positivity and photometry invariance. A criterion for automatically updating and adjusting the regularization parameter in the case of Poisson noise is also presented. Experiments are conducted on both synthetic data and real patient images.
    Comment: First published in the Proceedings of the 23rd European Signal Processing Conference (EUSIPCO-2015) in 2015, published by EURASI
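    To make the constrained-deconvolution idea concrete, here is a minimal sketch, assuming a Gaussian blur as a stand-in for the PET point spread function and plain smoothed total variation (TV) in place of the paper's TGV prior; the simple gradient scheme and all parameter values are assumptions, not the authors' algorithm. The positivity and photometry-invariance constraints from the abstract are enforced at every iteration.

    ```python
    import numpy as np

    def blur_transfer(shape, sigma=1.5):
        """Fourier transfer function of a periodic Gaussian blur (PSF stand-in)."""
        fy = np.fft.fftfreq(shape[0])
        fx = np.fft.fftfreq(shape[1])
        FY, FX = np.meshgrid(fy, fx, indexing="ij")
        return np.exp(-2.0 * (np.pi * sigma) ** 2 * (FX ** 2 + FY ** 2))

    def blur(img, Hf):
        """Apply the blur operator H via the FFT."""
        return np.real(np.fft.ifft2(Hf * np.fft.fft2(img)))

    def deconvolve_tv(y, Hf, lam=0.01, step=0.2, iters=50, eps=0.1):
        """Gradient descent on ||Hx - y||^2 + lam * TV_eps(x), with
        positivity and photometry (total intensity) constraints."""
        x = y.copy()
        total = y.sum()
        for _ in range(iters):
            grad = blur(blur(x, Hf) - y, Hf)        # H is symmetric, so H^T = H
            gx = np.roll(x, -1, axis=0) - x         # forward differences
            gy = np.roll(x, -1, axis=1) - x
            norm = np.sqrt(gx ** 2 + gy ** 2 + eps ** 2)
            px, py = gx / norm, gy / norm
            div = px - np.roll(px, 1, axis=0) + py - np.roll(py, 1, axis=1)
            x = x - step * (grad - lam * div)
            x = np.maximum(x, 0.0)                  # positivity constraint
            x *= total / x.sum()                    # photometry invariance
        return x
    ```
    
    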

    Forward Error Correction applied to JPEG-XS codestreams

    JPEG-XS offers low-complexity, low-latency image compression for applications with constrained but reasonable bit-rates. Our paper explores the deployment of JPEG-XS on lossy packet networks. To preserve low latency, Forward Error Correction (FEC) is envisioned as the protection mechanism of interest. Although the JPEG-XS codestream is not inherently scalable, we observe that the loss of a codestream fraction impacts the decoded image quality differently depending on whether that fraction corresponds to codestream headers, to coefficient significance information, or to low/high-frequency data. Hence, we propose a rate-distortion optimal unequal error protection scheme that adapts the redundancy level of Reed-Solomon codes to the rate of channel losses and the type of information protected by the code. Our experiments demonstrate that, at a 5% loss rate, it reduces the Mean Squared Error by up to 92% and 65% compared to a transmission without protection and with optimal but equal protection, respectively.
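    The unequal-protection idea can be illustrated with a greedy parity allocation: an RS(n, k) code over n packets fails when more than n - k packets are lost, so each extra parity packet goes to the information class where it most reduces expected distortion. This is a sketch of the principle only; the class names, sizes, and distortion values below are hypothetical, and the paper's actual rate-distortion optimization may differ.

    ```python
    import math

    def loss_prob(n, k, p):
        """P(more than n - k of n packets lost) = RS(n, k) decoding failure."""
        r = n - k
        return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
                   for i in range(r + 1, n + 1))

    def allocate_parity(classes, budget, p):
        """Greedy unequal error protection: each parity packet goes to the
        class with the largest drop in expected distortion.
        `classes` is a list of (name, k_data_packets, distortion_if_lost)."""
        parity = {name: 0 for name, k, d in classes}
        for _ in range(budget):
            best, best_gain = None, 0.0
            for name, k, d in classes:
                r = parity[name]
                gain = d * (loss_prob(k + r, k, p) - loss_prob(k + r + 1, k, p))
                if gain > best_gain:
                    best, best_gain = name, gain
            if best is None:
                break
            parity[best] += 1
        return parity
    ```

    With a 5% loss rate, such an allocator naturally protects headers (whose loss is catastrophic) more heavily than high-frequency coefficients.
    
    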

    Optimal measurement budget allocation for particle filtering

    Particle filtering is a powerful tool for target tracking. When the budget for observations is restricted, the measurements must be reduced to a limited number of carefully selected samples. A discrete stochastic nonlinear dynamical system is studied over a finite time horizon. The problem of selecting the optimal measurement times for particle filtering is formalized as a combinatorial optimization problem. We propose an approximate solution based on the nesting of a genetic algorithm, a Monte Carlo algorithm and a particle filter. First, an example demonstrates that the genetic algorithm outperforms a random trial optimization. Then, the interest of non-regular measurements over measurements performed at regular time intervals is illustrated, and the efficiency of our proposed solution is quantified: better filtering performance is obtained in 87.5% of the cases and, on average, the relative improvement is 27.7%.
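    The inner building block of the nested scheme, a bootstrap particle filter that only assimilates measurements at scheduled times, can be sketched as follows. The scalar nonlinear model below is a standard benchmark chosen for illustration, not necessarily the paper's system; in the full method, an outer genetic algorithm would search over candidate `schedule` sets, scoring each by Monte Carlo estimates of the filtering error.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(T, q=1.0, r=1.0):
        """Scalar nonlinear benchmark: x' = 0.5x + 25x/(1+x^2) + noise,
        observed through y = x^2/20 + noise."""
        xs, ys = [], []
        x = 0.0
        for _ in range(T):
            x = 0.5 * x + 25.0 * x / (1.0 + x ** 2) + rng.normal(0.0, np.sqrt(q))
            xs.append(x)
            ys.append(x ** 2 / 20.0 + rng.normal(0.0, np.sqrt(r)))
        return np.array(xs), np.array(ys)

    def particle_filter(ys, schedule, N=500, q=1.0, r=1.0):
        """Bootstrap particle filter assimilating y_t only for t in `schedule`."""
        x = rng.normal(0.0, 1.0, N)
        est = []
        for t, y in enumerate(ys):
            # propagate particles through the dynamics
            x = 0.5 * x + 25.0 * x / (1.0 + x ** 2) + rng.normal(0.0, np.sqrt(q), N)
            if t in schedule:                        # measurement at selected times only
                w = np.exp(-0.5 * (y - x ** 2 / 20.0) ** 2 / r) + 1e-300
                w /= w.sum()
                x = rng.choice(x, size=N, p=w)       # multinomial resampling
            est.append(x.mean())
        return np.array(est)
    ```
    
    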

    A Three-Level Computational Attention Model

    This article deals with a biologically motivated three-level computational attention model architecture based on rarity and the information-theory framework. It mainly focuses on a low-level step, which aims at quickly highlighting important areas, and a middle-level step, which analyses the behaviour of the detected areas. Their application to both still images and videos provides results to be used by the third, high-level step.
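    The rarity principle underlying the low-level step can be sketched as per-pixel self-information: pixels whose quantized intensity is rare in the image carry more information and are therefore salient. This is a minimal illustration of the rarity idea only; the quantization into 16 bins is an assumption, not the model's exact feature pipeline.

    ```python
    import numpy as np

    def rarity_map(img, bins=16):
        """Low-level attention sketch: score each pixel by the
        self-information -log2 p of its quantized intensity."""
        q = np.clip((img * bins).astype(int), 0, bins - 1)
        hist = np.bincount(q.ravel(), minlength=bins).astype(float)
        p = hist / hist.sum()
        return -np.log2(p[q])        # rare intensities -> high saliency
    ```
    
    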

    Security and robustness constraints for spread-spectrum Tardos fingerprinting

    This paper presents a practical analysis of the impact of robustness and security on Tardos' collusion-secure fingerprinting codes using spread-spectrum watermarking modulations. In this framework, we assume that the coalition has to face an embedding scheme of a given security level and consequently suffers a probability of wrongly estimating its embedded symbols. We recall the Worst Case Attack associated with this probability, i.e. the optimal attack which minimises the mutual information between the sequence of a colluder and the pirated one. For a given achievable rate of the Tardos fingerprinting model, we compare Improved Spread-Spectrum embedding with a new secure embedding (called rho-Circular Watermarking) over the AWGN channel. We show that secure embeddings are more immune to decoding errors than non-secure ones while keeping the same fingerprinting capacity.
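    For readers unfamiliar with Tardos codes, the construction and accusation step can be sketched as follows: each code position gets a bias p_i drawn from the arcsine density on [t, 1-t], user bits are Bernoulli(p_i), and users are scored against the pirated sequence with the symmetric (Skoric-style) linear score. This sketches the standard code, not the paper's watermarking-layer analysis; the user count, code length, and cutoff t below are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def tardos_code(n_users, m, t=0.01):
        """Binary Tardos fingerprints: per-position bias p_i drawn from the
        arcsine density on [t, 1-t], then user bits ~ Bernoulli(p_i)."""
        u = rng.uniform(np.arcsin(np.sqrt(t)), np.arcsin(np.sqrt(1 - t)), m)
        p = np.sin(u) ** 2
        X = (rng.random((n_users, m)) < p).astype(int)
        return X, p

    def accusation_scores(X, y, p):
        """Symmetric Tardos accusation: sum over positions of g(x, y, p),
        where agreement with the pirated bit y raises the score."""
        g1 = np.sqrt((1 - p) / p)        # contribution when x = 1
        g0 = -np.sqrt(p / (1 - p))       # contribution when x = 0
        g = np.where(X == 1, g1, g0)
        return np.where(y == 1, g, -g).sum(axis=1)
    ```

    Colluders, whose bits statistically agree with the pirated copy, accumulate large positive scores, while innocent users' scores concentrate around zero.
    
    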

    Computational Attention for Event Detection

    This article deals with a biologically motivated three-level computational attention model architecture based on rarity and the information-theory framework. It mainly focuses on the low-level and medium-level steps and their application to the pre-attentive detection of tumours in CT scans and of unusual events in audio recordings.

    Computational Attention for Defect Localisation

    This article deals with a biologically motivated three-level computational attention model architecture based on rarity and the information-theory framework. It mainly focuses on the low-level step and its application to pre-attentive defect localisation for apple quality grading and tumour localisation in medical images.