
    Post-Reconstruction Deconvolution of PET Images by Total Generalized Variation Regularization

    Improving the quality of positron emission tomography (PET) images, which suffer from low resolution and a high level of noise, is a challenging task in nuclear medicine and radiotherapy. This work proposes a restoration method applied after tomographic reconstruction of the images, targeting clinical situations where raw data are often not accessible. Based on inverse-problem methods, our contribution introduces the recently developed total generalized variation (TGV) norm to regularize PET image deconvolution. Moreover, we stabilize this procedure with additional image constraints such as positivity and photometry invariance. A criterion for automatically updating and adjusting the regularization parameter in the case of Poisson noise is also presented. Experiments are conducted on both synthetic data and real patient images. Comment: First published in the Proceedings of the 23rd European Signal Processing Conference (EUSIPCO-2015) in 2015, published by EURASIP
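    The constrained deconvolution loop described above can be sketched in plain NumPy. As a simpler stand-in for the paper's TGV norm, the sketch uses a smoothed total-variation penalty, and the regularization weight `lam` is kept fixed rather than updated automatically as the paper proposes; all function names and parameter values are illustrative.

```python
import numpy as np

def blur(img, psf_fft):
    """Circular convolution with the PSF, done in the Fourier domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * psf_fft))

def tv_grad(img, eps=1e-8):
    """Gradient of a smoothed isotropic total-variation penalty."""
    gx = np.roll(img, -1, axis=1) - img
    gy = np.roll(img, -1, axis=0) - img
    norm = np.sqrt(gx**2 + gy**2 + eps)
    px, py = gx / norm, gy / norm
    # adjoint of the forward-difference operator applied to the unit field
    return (np.roll(px, 1, axis=1) - px) + (np.roll(py, 1, axis=0) - py)

def deconvolve(obs, psf_fft, lam=0.001, step=0.5, n_iter=300):
    """Projected gradient descent on 0.5*||Hx - y||^2 + lam*TV(x)."""
    x = obs.copy()
    flux = obs.sum()                                  # photometry to preserve
    for _ in range(n_iter):
        resid = blur(x, psf_fft) - obs                # data-fidelity residual
        grad = blur(resid, np.conj(psf_fft)) + lam * tv_grad(x)
        x = x - step * grad
        x = np.maximum(x, 0.0)                        # positivity constraint
        x *= flux / max(x.sum(), 1e-12)               # photometry invariance
    return x
```

    Each iteration takes one gradient step on the regularized least-squares objective, then enforces the two constraints from the abstract by clipping negative intensities and rescaling the total flux back to that of the observation.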

    Video object watermarking robust to manipulations

    This paper presents a watermarking scheme that embeds a signature in video objects for the MPEG-4 video standard. The constraints associated with this standard differ markedly from those of classical video watermarking schemes: mark detection must remain possible after video object manipulations such as rotation or scaling. Principal component analysis and warping of the object's shape are used to resynchronize the mark after geometric manipulations. The mark is embedded by adding an oriented pseudo-random sequence, and it is detected using a correlation criterion. The results show that the presented scheme can detect the mark after bit-rate modification, sub-sampling of the object's shape mask, and geometric manipulations (scaling and rotations).
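    The additive embedding and correlation detection steps can be illustrated with a minimal spread-spectrum sketch. The PCA/warping synchronization stage that realigns the object before detection is omitted, and the function names and the `strength`/`threshold` values are illustrative, not the paper's.

```python
import numpy as np

def embed(host, key, strength=0.05):
    """Add a key-derived pseudo-random +/-1 sequence to the host signal."""
    rng = np.random.default_rng(key)
    w = rng.choice([-1.0, 1.0], size=host.shape)
    return host + strength * w

def detect(signal, key, threshold=0.01):
    """Regenerate the sequence from the key and correlate with the signal."""
    rng = np.random.default_rng(key)
    w = rng.choice([-1.0, 1.0], size=signal.shape)
    corr = float(np.mean(signal * w))
    return corr > threshold, corr
```

    With the correct key the correlation concentrates around `strength`, while with a wrong key it stays near zero, which is what makes a simple threshold test work.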

    Molecular Surface Mesh Generation by Filtering Electron Density Map

    Bioinformatics applied to macromolecules is now widespread and in continuous expansion. In this context, representing external molecular surfaces such as the Van der Waals surface or the solvent-excluded surface can be useful for several applications. We propose a fast and parameterizable algorithm that produces good visual-quality meshes representing molecular surfaces. The mesh is obtained by isosurfacing a filtered electron density map. The density map is computed as the pointwise maximum of Gaussian functions placed around atom centers, and it is filtered by an ideal low-pass filter applied to its Fourier transform. Applying the marching cubes algorithm to the inverse transform provides a mesh representation of the molecular surface.
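    The first two stages of this pipeline can be sketched in NumPy (grid size, atom radii, and cutoff below are illustrative). The final isosurfacing step, e.g. `skimage.measure.marching_cubes` on the filtered map, is noted in a comment but not implemented here.

```python
import numpy as np

def density_map(centers, radii, grid_n=32, extent=6.0):
    """Electron-density-like map: pointwise maximum of one Gaussian per atom."""
    axis = np.linspace(-extent, extent, grid_n)
    X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
    rho = np.zeros((grid_n,) * 3)
    for (cx, cy, cz), r in zip(centers, radii):
        g = np.exp(-((X - cx)**2 + (Y - cy)**2 + (Z - cz)**2) / (2 * r**2))
        rho = np.maximum(rho, g)
    return rho

def lowpass(rho, cutoff=0.2):
    """Ideal low-pass filter: zero every Fourier coefficient above the cutoff."""
    n = rho.shape[0]
    f = np.fft.fftfreq(n)
    FX, FY, FZ = np.meshgrid(f, f, f, indexing="ij")
    mask = np.sqrt(FX**2 + FY**2 + FZ**2) <= cutoff
    return np.real(np.fft.ifftn(np.fft.fftn(rho) * mask))

# A mesh would then be extracted from the smoothed map, e.g. with
# skimage.measure.marching_cubes(smooth, level=0.5).
```

    Filtering in the Fourier domain smooths the creases created by the max operation, so the subsequent marching-cubes mesh is visually cleaner.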

    Reversible watermarking scheme with image-independent embedding capacity

    Permanent distortion is one of the main drawbacks of all irreversible watermarking schemes. Attempts to recover the original signal after it has passed the authentication process began only a few years ago. Some common problems, such as salt-and-pepper artefacts caused by intensity wraparound and low embedding capacity, can now be resolved. However, significant problems remain. First, the embedding capacity is signal-dependent, i.e., capacity varies significantly with the nature of the host signal. The direct impact of this is compromised security for signals with low capacity; some signals may even be non-embeddable. Secondly, the well-known problem of block-wise dependence, which opens a security gap for the vector quantisation attack and the transplantation attack, has been seriously tackled in irreversible watermarking schemes but is not addressed by researchers of reversible schemes. This work proposes a reversible watermarking scheme with near-constant, signal-independent embedding capacity and immunity to the vector quantisation and transplantation attacks.

    Prognostic Power of Texture Based Morphological Operations in a Radiomics Study for Lung Cancer

    The importance of radiomics features for predicting patient outcome is now well established. Early study of prognostic features can lead to more efficient treatment personalisation. For this reason, new radiomics features obtained through mathematical morphology (MM) operations are proposed. Their study is conducted on an open database of patients suffering from Non-Small Cell Lung Carcinoma (NSCLC). The tumor features are extracted from the CT images and analyzed via PCA and a Kaplan-Meier survival analysis in order to select the most relevant ones. Among the 1,589 studied features, 32 are found relevant for predicting patient survival: 27 classical radiomics features and five MM features (including both granularity and morphological covariance features). These features will contribute to prognostic models, and eventually to clinical decision making and the course of treatment for patients. Comment: 9 pages, 3 tables, 3 figures, 31 references
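    The Kaplan-Meier estimator behind the survival analysis is simple to state: at each distinct event time the survival probability is multiplied by the fraction of at-risk patients who survive that time. A minimal implementation, shown here on hypothetical follow-up data rather than the study's cohort:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times:  observed follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (event_time, survival_probability) pairs."""
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events, dtype=int)[order]
    curve, s = [], 1.0
    for t in np.unique(times[events == 1]):     # distinct event times only
        at_risk = np.sum(times >= t)            # patients still under observation
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk             # conditional survival at t
        curve.append((float(t), s))
    return curve
```

    Censored patients (event flag 0) leave the risk set without triggering a drop in the curve, which is what distinguishes this estimator from a naive survival fraction.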

    The Agile UX Development Lifecycle: Combining Formative Usability and Agile Methods

    This paper contributes a method variation that helps cross-functional teams combine both formative usability and agile methods to develop interactive systems. Both methods are iterative, continuous and focus on delivering value to users, which makes their combination possible. The “agile UX development lifecycle” supports and facilitates the synchronization of the steps involved in both formative usability and agile sprints in an operable manner and is intended for design and development settings. We present a case study that illustrates the extent to which this tool meets the needs of real-world cross-functional teams, describing the gains in efficiency it can provide but also guidelines for increasing the benefits gained from this combination in design and development settings

    Forward Error Correction applied to JPEG-XS codestreams

    JPEG-XS offers low-complexity image compression for applications with constrained but reasonable bit-rate and low latency. Our paper explores the deployment of JPEG-XS on lossy packet networks. To preserve low latency, Forward Error Correction (FEC) is envisioned as the protection mechanism of interest. Although the JPEG-XS codestream is not inherently scalable, we observe that the loss of a codestream fraction impacts the decoded image quality differently depending on whether that fraction corresponds to codestream headers, to coefficient-significance information, or to low/high frequency data. Hence, we propose a rate-distortion optimal unequal error protection scheme that adapts the redundancy level of Reed-Solomon codes according to the channel loss rate and the type of information protected by each code. Our experiments demonstrate that, at a 5% loss rate, it reduces the mean squared error by up to 92% and 65% compared to transmission without protection and with optimal but equal protection, respectively.
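    The erasure arithmetic behind unequal error protection can be sketched: a Reed-Solomon code RS(n, k) over packets recovers whenever at most n − k packets are lost, so its failure probability under an i.i.d. loss rate is a binomial tail. The sketch below allocates parity per information class using fixed per-class failure targets; this is an illustration only, not the paper's rate-distortion optimal allocation, and the class names and targets are assumptions.

```python
from math import comb

def loss_prob(n, k, p):
    """P[more than n-k of n packets lost]: RS(n,k) decoding failure, iid loss p."""
    r = n - k
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(r + 1, n + 1))

def min_parity(k, p, target):
    """Smallest parity count r so that RS(k+r, k) fails with probability <= target."""
    r = 0
    while loss_prob(k + r, k, p) > target:
        r += 1
    return r

# Unequal protection: stricter failure targets for headers and significance
# information than for high-frequency data (illustrative targets).
classes = {"headers": 1e-6, "significance": 1e-4, "low_freq": 1e-3, "high_freq": 1e-2}
alloc = {name: min_parity(20, 0.05, t) for name, t in classes.items()}
```

    Running this for k = 20 data packets at a 5% loss rate yields more parity packets for the header class than for the high-frequency class, mirroring the paper's observation that different codestream fractions deserve different redundancy levels.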

    Public participation in science and technology and its normative context: the legacy of the participatory turn and the emerging European framework of 'Responsible Research and Innovation'

    Over the last two decades in Europe, the unforeseen impacts of science and technology led many STS scholars to plead for a 'participatory turn' that would make our democracies better able to handle sociotechnical controversies. Since the outset of this participatory turn, however, critiques sharing a common emphasis on the context in which public participation takes place have pointed to the risk of participation being either romanticized or instrumentalized. This thesis contributes to the critical scrutiny of public participation in science and technology. Drawing on a set of qualitative data-collection strategies and on a discourse analysis of the collected materials, it investigates the normative context in which public participation is currently conceived and promoted at the European level, and links it to historical perspectives in order to grasp how the participatory turn's legacy has been affected. As it shows, far from being opened up, public participation is strongly closed down by normative forces lying in the context in which its promotion currently takes place. Public participation appears instrumentalized in Horizon 2020 because of the increasing economization of policies and the steering of science and innovation toward tackling societal challenges. While these trends are characteristic of current developments, longer-term ones are also highlighted: as this research suggests, the instrumentalization of public participation goes well beyond Horizon 2020 alone. Already from the Sixth Framework Programme onward, the normative context in which public participation in science and technology has been conceived and promoted has tended to instrumentalize it and to close down the deliberative governance of science.