
    Présentation du numéro Enseignements universitaires francophones en milieux bi/plurilingues : légitimations et mises en oeuvre

    The thematic dossier of this issue stems from the work carried out by "Éducation et Plurilinguismes : mises en perspective historiques et sociales", a strand of the regional project Pluri-L (Pays de la Loire region), and is dedicated to the diverse legitimations and implementations of Francophone university teaching in bi/plurilingual settings. Taking into account the status of the languages involved, we offer a variety of perspectives on the development of a bi/plurilingual competence at university, whether from the point of view of the participants, the programmes put in place, or the difficulties encountered during their implementation. A varia section includes two studies that focus on the primary and secondary levels, respectively.

    Exact discrete minimization for TV+L0 image decomposition models

    Penalized maximum-likelihood denoising approaches seek a solution that strikes a compromise between fidelity to the data and agreement with a prior model. Penalization terms are generally chosen to enforce smoothness of the solution and to reject noise. Designing a proper penalization term is difficult, as it has to capture image variability. Decomposing an image into two components of different nature, each given its own penalty, is one way to enrich the modelling. We consider the decomposition of an image into a component of bounded variation and a sparse component; the corresponding penalization is the sum of the total variation of the first component and the L0 pseudo-norm of the second. The minimization problem is highly non-convex, but it can still be globally minimized by computing a minimum s-t cut on a graph. The decomposition model is applied to synthetic aperture radar (SAR) image denoising.
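The TV+L0 model can be illustrated with a small 1-D sketch. The code below is not the paper's exact graph-cut minimization: it alternates a hard-threshold step (the proximal operator of the L0 penalty) with a projected-gradient TV denoiser, a block-coordinate scheme that only approximates the global minimum. The function names (`tv_denoise_1d`, `tv_l0_decompose`) and the parameters `lam`, `mu` are illustrative choices, not from the paper.

```python
import numpy as np

def _adjoint_diff(p):
    """D^T p, where D is the forward-difference operator computed by np.diff."""
    out = np.zeros(len(p) + 1)
    out[:-1] -= p
    out[1:] += p
    return out

def tv_denoise_1d(f, lam, n_iter=500):
    """Solve min_u 0.5*||u - f||^2 + lam*TV(u) by projected gradient on the dual."""
    p = np.zeros(len(f) - 1)
    for _ in range(n_iter):
        u = f - lam * _adjoint_diff(p)                        # primal from dual
        p = np.clip(p + np.diff(u) / (4.0 * lam), -1.0, 1.0)  # safe step 1/(4*lam^2)
    return f - lam * _adjoint_diff(p)

def tv_l0_decompose(f, lam, mu, n_outer=12):
    """Block-coordinate descent on 0.5*||f-u-s||^2 + lam*TV(u) + mu*||s||_0:
    alternate the TV-smooth part u and the sparse part s."""
    s = np.zeros_like(f)
    u = np.zeros_like(f)
    for _ in range(n_outer):
        u = tv_denoise_1d(f - s, lam)
        r = f - u
        s = np.where(r ** 2 > 2.0 * mu, r, 0.0)               # prox of mu*||s||_0
    return u, s
```

On a piecewise-constant signal with a few added spikes, `u` should recover the smooth part and `s` the spikes.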

    Patch similarity under non Gaussian noise

    Many tasks in computer vision require matching image parts. While higher-level methods consider image features such as edges or robust descriptors, low-level approaches compare groups of pixels (patches) and provide dense matching. Patch similarity is a key ingredient of many techniques for image registration, stereo vision, change detection and denoising. A fundamental difficulty when comparing two patches from "real" data is deciding whether the differences should be ascribed to noise or to intrinsic dissimilarity. The Gaussian noise assumption leads to the classical definition of patch similarity based on squared intensity differences. When the noise departs from the Gaussian distribution, several similarity criteria have been proposed in the literature. We review seven such criteria taken from the fields of image processing, detection theory and machine learning, discuss their theoretical grounding, and provide a numerical comparison of their performance under gamma and Poisson noise.
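As a concrete instance of a non-Gaussian criterion, the generalized likelihood ratio under Poisson noise has a closed form: for counts x and y at the same pixel, the per-pixel negative log-GLR is x log x + y log y - (x+y) log((x+y)/2), with the convention 0 log 0 = 0. The sketch below sums this over a patch; the function name is illustrative, not from the paper.

```python
import numpy as np

def _xlogx(v):
    """Elementwise v*log(v) with the convention 0*log(0) = 0."""
    out = np.zeros_like(v, dtype=float)
    nz = v > 0
    out[nz] = v[nz] * np.log(v[nz])
    return out

def poisson_glr_dissimilarity(p1, p2):
    """-log GLR under Poisson noise: zero iff the patches are identical,
    larger values indicate stronger evidence of intrinsic dissimilarity."""
    x = np.asarray(p1, dtype=float)
    y = np.asarray(p2, dtype=float)
    # x log x + y log y - (x+y) log((x+y)/2), rewritten with xlogx terms
    return float(np.sum(_xlogx(x) + _xlogx(y) - _xlogx(x + y) + (x + y) * np.log(2.0)))
```

The criterion is symmetric in the two patches and handles zero counts without special-casing.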

    Sparse + smooth decomposition models for multi-temporal SAR images

    SAR images have distinctive characteristics compared to optical images: the speckle phenomenon produces strong fluctuations, and strong scatterers have radar signatures several orders of magnitude larger than their surroundings. We propose an image decomposition approach to account for these peculiarities. Several methods have been proposed in the field of image processing to decompose an image into components of different nature, such as a geometrical part and a textural part. They are generally stated as an energy minimization problem in which a specific penalty term is applied to each component of the sought decomposition. We decompose time series of SAR images into three components: speckle, strong scatterers and background. Our decomposition method is based on a discrete graph-cut optimization technique, and we apply it to change detection tasks.
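To make the three-component idea concrete, the sketch below decomposes a temporal stack with a much simpler per-pixel rule than the paper's graph-cut optimization: the temporal median serves as background, a robust k-sigma threshold on the residual picks out strong scatterers, and whatever remains is treated as speckle-like residual. All names and the parameter `k` are illustrative assumptions.

```python
import numpy as np

def decompose_stack(stack, k=3.0):
    """Split a (T, H, W) image stack into (background, scatterers, speckle).
    Simplified per-pixel sketch, NOT the paper's graph-cut method:
    - background: temporal median (robust to transient outliers)
    - scatterers: residuals beyond k robust-sigma (MAD-based scale)
    - speckle: the remaining residual."""
    stack = np.asarray(stack, dtype=float)
    background = np.median(stack, axis=0)
    residual = stack - background
    sigma = np.median(np.abs(residual)) / 0.6745 + 1e-12  # robust noise scale
    scatterers = np.where(np.abs(residual) > k * sigma, residual, 0.0)
    speckle = residual - scatterers
    return background, scatterers, speckle
```

By construction the three components sum back to the input stack (with the background broadcast over time).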

    Template Matching with Noisy Patches: A Contrast-Invariant GLR Test

    Matching patches from a noisy image to atoms in a dictionary of patches is a key ingredient of many techniques in image processing and computer vision. By representing with a single atom all patches that are identical up to a radiometric transformation, the dictionary can be kept small, thereby retaining good computational efficiency. Identifying the atom that best matches a given noisy patch then requires a contrast-invariant criterion. In the light of detection theory, we propose a new criterion that ensures contrast invariance and robustness to noise. We discuss its theoretical grounding and assess its performance under Gaussian, gamma and Poisson noise.
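The paper's own criterion is not reproduced here, but a classical baseline shows what contrast invariance means in practice: under Gaussian noise and an affine radiometric model (patch ≈ a·atom + b), the GLR statistic is a monotone function of the squared centered correlation, which is unchanged by any affine change of contrast. The sketch below uses that score; function names are illustrative.

```python
import numpy as np

def contrast_invariant_score(patch, atom):
    """Squared centered correlation in [0, 1]: invariant to patch -> a*patch + b.
    Under Gaussian noise this is (up to a monotone transform) the GLR statistic
    for the affine-contrast matching model; used here as an illustrative baseline."""
    p = patch.ravel() - patch.mean()
    a = atom.ravel() - atom.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(a)
    if denom == 0.0:
        return 0.0  # a constant patch or atom carries no contrast information
    return float((p @ a) ** 2 / denom ** 2)

def best_match(patch, atoms):
    """Return the index of the best-matching atom and all scores."""
    scores = [contrast_invariant_score(patch, at) for at in atoms]
    return int(np.argmax(scores)), scores
```

A patch equal to an atom up to any affine contrast change scores exactly 1 against that atom.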

    How to compare noisy patches? Patch similarity beyond Gaussian noise

    Many tasks in computer vision require matching image parts. While higher-level methods consider image features such as edges or robust descriptors, low-level (so-called image-based) approaches compare groups of pixels (patches) and provide dense matching. Patch similarity is a key ingredient of many techniques for image registration, stereo vision, change detection and denoising; recent progress in natural-image modelling also makes intensive use of patch comparison. A fundamental difficulty when comparing two patches from "real" data is deciding whether the differences should be ascribed to noise or to intrinsic dissimilarity. The Gaussian noise assumption leads to the classical definition of patch similarity based on squared intensity differences. For cases where the noise departs from the Gaussian distribution, several similarity criteria have been proposed in the image processing, detection theory and machine learning literature. By expressing patch (dis)similarity as a detection test under a given noise model, we introduce these criteria along with a new one and discuss their properties. We then assess their performance on different tasks (patch discrimination, image denoising, stereo matching and motion tracking) under gamma and Poisson noise. The proposed criterion, based on the generalized likelihood ratio, is shown to be both easy to derive and powerful across these diverse applications.
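The gamma case (speckle-like noise with L looks) also admits a closed-form GLR: per pixel, the negative log-GLR is L·log((x+y)²/(4xy)), which is non-negative and vanishes iff x = y. The sketch below sums it over a patch, assuming strictly positive intensities; the function name is illustrative. A notable property is invariance to a common multiplicative gain on both patches, which matches the multiplicative nature of speckle.

```python
import numpy as np

def gamma_glr_dissimilarity(p1, p2, L=1.0):
    """-log GLR under gamma noise with L looks; assumes positive intensities.
    Zero iff the two patches are identical; invariant to a common gain c > 0
    applied to both patches (ratio-based, as befits multiplicative speckle)."""
    x = np.asarray(p1, dtype=float)
    y = np.asarray(p2, dtype=float)
    return float(L * np.sum(np.log((x + y) ** 2 / (4.0 * x * y))))
```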

    Fusion of aerial images and sensor data from a ground vehicle for improved semantic mapping

    Get PDF
    This work investigates the use of semantic information to link ground-level occupancy maps and aerial images. A ground-level semantic map, which shows open ground and indicates the probability of cells being occupied by the walls of buildings, is obtained by a mobile robot equipped with an omnidirectional camera, GPS and a laser range finder. This semantic information is used for local and global segmentation of an aerial image. The result is a map in which the semantic information has been extended beyond the range of the robot's sensors and predicts where the mobile robot can find buildings and potentially drivable ground.
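The abstract does not detail the fusion step, but combining a per-cell occupancy estimate from the ground robot with one derived from aerial-image segmentation can be sketched with standard independent-sensor log-odds fusion. The function names, the assumption of independence, and the uniform prior below are illustrative, not taken from the paper.

```python
import numpy as np

def _logit(p):
    """Log-odds of a probability, clipped away from 0 and 1 for stability."""
    p = np.clip(p, 1e-6, 1.0 - 1e-6)
    return np.log(p / (1.0 - p))

def fuse_occupancy(p_ground, p_aerial, prior=0.5):
    """Bayesian log-odds fusion of two per-cell occupancy estimates assumed
    conditionally independent given the true cell state."""
    l = _logit(np.asarray(p_ground)) + _logit(np.asarray(p_aerial)) - _logit(np.full_like(np.asarray(p_ground, dtype=float), prior))
    return 1.0 / (1.0 + np.exp(-l))
```

Two agreeing confident estimates reinforce each other, an uninformative estimate (0.5) leaves the other unchanged, and two contradicting estimates cancel back to the prior.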