
    A mass-transportation approach to a one dimensional fluid mechanics model with nonlocal velocity

    We consider a one-dimensional transport model with nonlocal velocity given by the Hilbert transform and develop a global well-posedness theory of probability measure solutions. Both the viscous and non-viscous cases are analyzed. Both in original and in self-similar variables, we express the corresponding equations as gradient flows with respect to a free energy functional that includes a singular logarithmic interaction potential. Existence, uniqueness, self-similar asymptotic behavior and the inviscid limit of solutions are obtained in the space $\mathcal{P}_{2}(\mathbb{R})$ of probability measures with finite second moments, without any smallness condition. Our results are based on the abstract gradient flow theory developed in [Ambrosio]. An important byproduct of our results is that there is a unique, up to invariance and translations, global-in-time self-similar solution with initial data in $\mathcal{P}_{2}(\mathbb{R})$, which was already obtained in [Deslippe; Biler-Karch] by different methods. Moreover, this self-similar solution attracts all the dynamics in self-similar variables. The crucial monotonicity property of transport between measures in one dimension allows us to show that the singular logarithmic potential energy is displacement convex. We also extend the results to gradient flow equations with negative power-law, locally integrable interaction potentials.
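
    For orientation, the model and free energy described above can be sketched as follows; the precise signs, constants and Hilbert-transform normalization are assumptions and may differ from the paper's conventions. With $Hf(x) = \frac{1}{\pi}\,\mathrm{p.v.}\int_{\mathbb{R}} \frac{f(y)}{x-y}\,dy$ and viscosity $\nu \ge 0$, an equation of this type reads

        \partial_t \rho + \pi\,\partial_x\big(\rho\, H\rho\big) = \nu\,\partial_{xx}\rho,
        \qquad
        \mathcal{F}[\rho] = \nu \int_{\mathbb{R}} \rho \log\rho\, dx \;-\; \frac{1}{2}\iint_{\mathbb{R}^2} \log|x-y|\,\rho(x)\,\rho(y)\, dx\, dy,

    which is formally the gradient flow $\partial_t \rho = \partial_x\big(\rho\,\partial_x \frac{\delta\mathcal{F}}{\delta\rho}\big)$ of $\mathcal{F}$ in $\mathcal{P}_2(\mathbb{R})$: the entropy term carries the viscosity ($\nu = 0$ is the inviscid case) and the singular logarithmic double integral is the interaction energy whose displacement convexity is exploited above.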

    Statistical region based active contour using a fractional entropy descriptor: Application to nuclei cell segmentation in confocal microscopy images

    We propose an unsupervised statistical region-based active contour approach that integrates an original fractional entropy measure for image segmentation, with a particular application to single-channel, actin-tagged fluorescence confocal microscopy images. After describing statistical region-based active contour segmentation and giving the mathematical definition of the proposed fractional entropy descriptor, we present comparative segmentation results between the proposed approach and the standard Shannon entropy on synthetic and natural images. We also show that the proposed unsupervised statistical approach, integrating the fractional entropy measure, leads to very satisfactory segmentation of cell nuclei, from which shape characterization can be computed.
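
    The paper's exact fractional entropy definition is not reproduced here. As an illustration only, the sketch below contrasts the Shannon entropy of a region's intensity histogram with one common parametric (Tsallis-type) generalization that reduces to Shannon entropy as its order tends to 1; the choice of generalization and all function names are assumptions, not the authors' descriptor.

        import numpy as np

        def region_histogram(pixels, bins=256, value_range=(0.0, 1.0)):
            # Normalized intensity histogram (empirical pdf) of a region.
            hist, _ = np.histogram(pixels, bins=bins, range=value_range)
            p = hist.astype(float)
            return p / max(p.sum(), 1e-12)

        def shannon_entropy(p, eps=1e-12):
            # Standard Shannon entropy H = -sum(p * log p).
            p = p[p > eps]
            return -np.sum(p * np.log(p))

        def tsallis_entropy(p, q=0.8, eps=1e-12):
            # Parametric entropy of order q; tends to Shannon entropy as q -> 1.
            # (Assumption: stands in for the paper's fractional entropy descriptor.)
            p = p[p > eps]
            return (1.0 - np.sum(p ** q)) / (q - 1.0)

        # Toy usage: compare descriptors inside vs. outside a synthetic bright blob.
        rng = np.random.default_rng(0)
        inside = np.clip(rng.normal(0.7, 0.05, 5000), 0, 1)   # bright, low-variance region
        outside = np.clip(rng.normal(0.2, 0.15, 5000), 0, 1)  # darker, noisier background
        for name, pixels in (("inside", inside), ("outside", outside)):
            p = region_histogram(pixels)
            print(name, "Shannon:", round(shannon_entropy(p), 3),
                  "Tsallis(q=0.8):", round(tsallis_entropy(p), 3))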

    Computer-Assisted Segmentation of Videocapsule Images Using Alpha-Divergence-Based Active Contour in the Framework of Intestinal Pathologies Detection

    Visualization of the entire length of the gastrointestinal tract through natural orifices is a challenge for endoscopists. Videoendoscopy is currently the “gold standard” technique for the diagnosis of various pathologies of the intestinal tract. Wireless Capsule Endoscopy (WCE) was developed in the 1990s as an alternative to videoendoscopy, allowing direct examination of the gastrointestinal tract without any need for sedation. Nevertheless, the systematic post-examination review by a specialist of the 50,000 (small bowel) to 150,000 (colon) images produced by a complete WCE acquisition remains time-consuming and challenging, owing to the poor quality of WCE images. In this article, a semiautomatic segmentation method for the analysis of WCE images is proposed. Built on active contour segmentation, the proposed method introduces alpha-divergences, a family of flexible statistical similarity measures that can be adapted to different types of gastrointestinal pathologies. Segmentation results obtained with the proposed approach are shown on several types of real-case examinations, from (multiple) polyp segmentation to radiation enteritis delineation.
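
    As background, an alpha-divergence compares two probability distributions (here, intensity histograms inside and outside the evolving contour) and interpolates between classical divergences as alpha varies. The sketch below uses one common parameterization of the alpha-divergence; the histogram model and the function names are illustrative assumptions, not the paper's implementation.

        import numpy as np

        def alpha_divergence(p, q, alpha=0.5, eps=1e-12):
            # One common parameterization: alpha -> 1 approaches KL(p||q),
            # alpha -> 0 approaches KL(q||p), and alpha = 0.5 is proportional to
            # the squared Hellinger distance (assumed convention).
            p = np.clip(np.asarray(p, float), eps, None); p = p / p.sum()
            q = np.clip(np.asarray(q, float), eps, None); q = q / q.sum()
            if np.isclose(alpha, 1.0):
                return float(np.sum(p * np.log(p / q)))
            if np.isclose(alpha, 0.0):
                return float(np.sum(q * np.log(q / p)))
            return float((1.0 - np.sum(p**alpha * q**(1.0 - alpha))) / (alpha * (1.0 - alpha)))

        # Toy usage: intensity histograms inside vs. outside a candidate contour.
        rng = np.random.default_rng(1)
        inside = np.histogram(rng.normal(0.65, 0.10, 4000), bins=64, range=(0, 1))[0]
        outside = np.histogram(rng.normal(0.35, 0.12, 4000), bins=64, range=(0, 1))[0]
        for a in (0.2, 0.5, 0.8):
            print(f"alpha={a}: D={alpha_divergence(inside, outside, alpha=a):.3f}")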

    Machine Learning under the light of Phraseology expertise: use case of presidential speeches, De Gaulle - Hollande (1958-2016)

    Author identification and text genesis have always been hot topics for the statistical analysis of textual data community. Recent advances in machine learning have seen the emergence of models competing with state-of-the-art computational linguistics methods on specific natural language processing tasks (part-of-speech tagging, chunking, parsing, etc.). In particular, deep linguistic architectures build on knowledge of language specificities such as grammar or semantic structure, and these models are considered the most competitive thanks to their assumed ability to capture syntax. However, even though these methods have proven their efficiency, their underlying mechanisms remain hard to make explicit and to keep stable, from both a theoretical and an empirical point of view, which restricts their range of applications. Our work sheds light on the mechanisms involved in deep architectures when they are applied to Natural Language Processing (NLP) tasks. The Query-By-Dropout-Committee (QBDC) algorithm is an active learning technique we designed for deep architectures: it iteratively selects the most relevant samples to add to the training set, so that the model improves the most when retrained on the augmented set. In this article, however, we do not go into the details of the QBDC algorithm, which has already been studied in the original QBDC article; instead, we confront the relevance of the sentences chosen by our active strategy with state-of-the-art phraseology techniques. We have therefore conducted experiments on the presidential speeches of C. De Gaulle, N. Sarkozy and F. Hollande, in order to demonstrate the interest of our active deep learning method for discourse author identification and to compare the linguistic patterns extracted by our artificial approach with those of standard phraseology techniques.
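
    To make the committee idea above concrete, the sketch below illustrates a dropout-committee selection step: the same network is run several times with dropout kept active, the stochastic forward passes act as a committee, and the unlabeled samples on which the committee disagrees most are queried next. The toy network, the disagreement measure (variance of predicted class probabilities) and all names are illustrative assumptions, not the exact QBDC algorithm.

        import numpy as np

        rng = np.random.default_rng(42)

        def softmax(z):
            z = z - z.max(axis=-1, keepdims=True)
            e = np.exp(z)
            return e / e.sum(axis=-1, keepdims=True)

        def dropout_forward(x, W1, W2, p_drop=0.5):
            # One stochastic forward pass of a toy one-hidden-layer classifier,
            # with the dropout mask resampled at every pass (dropout kept on).
            h = np.maximum(x @ W1, 0.0)
            mask = rng.random(h.shape) > p_drop
            h = h * mask / (1.0 - p_drop)
            return softmax(h @ W2)

        def committee_select(X_pool, W1, W2, n_query=5, n_passes=20):
            # Query the pool samples with the largest committee disagreement,
            # measured here as the mean variance of predicted class probabilities.
            committee = np.stack([dropout_forward(X_pool, W1, W2) for _ in range(n_passes)])
            disagreement = committee.var(axis=0).mean(axis=-1)
            return np.argsort(disagreement)[-n_query:][::-1]

        # Toy usage: random "trained" weights and an unlabeled pool of 2-D samples.
        W1 = rng.normal(size=(2, 16)); W2 = rng.normal(size=(16, 3))
        X_pool = rng.normal(size=(200, 2))
        print("indices to label next:", committee_select(X_pool, W1, W2))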

    Feminine identity in the multicultural space: the narrative voice of Tereza Albues

    In a time and space in which identities are no longer fixed, feminine identities are represented strategically, forming the image of the contemporary woman who wants to free herself from male oppression. Tereza Albues' narrative shows how female identity has been affirmed and self-represented in multicultural space.

    Statistical Model of Shape Moments with Active Contour Evolution for Shape Detection and Segmentation

    This paper describes a novel method for shape representation and robust image segmentation. The proposed method combines two well-known methodologies, namely statistical shape models and active contours, implemented in a level-set framework. Shape detection is achieved by maximizing a posterior function that combines a prior shape probability model with an image likelihood function conditioned on shape. The statistical shape model is built through a learning process based on nonparametric probability estimation in a PCA-reduced feature space formed by the Legendre moments of training silhouette images. A greedy strategy optimizes the proposed cost function by iteratively evolving an implicit active contour in the image space and then performing constrained optimization of the evolved shape in the reduced shape feature space. Experimental results presented in the paper demonstrate that the proposed method, contrary to many other active contour segmentation methods, is highly resilient to the severe random and structural noise that can be present in the data.
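
    To make the shape-representation step concrete, the sketch below computes low-order Legendre moments of a binary silhouette and projects a stack of such moment vectors onto a PCA-reduced feature space. The moment order, normalization and function names are illustrative assumptions rather than the paper's exact pipeline.

        import numpy as np
        from numpy.polynomial.legendre import legval

        def legendre_moments(img, order=6):
            # Low-order Legendre moments of a binary silhouette (H x W image),
            # with pixel coordinates mapped to [-1, 1] (assumed normalization).
            H, W = img.shape
            y = np.linspace(-1.0, 1.0, H)
            x = np.linspace(-1.0, 1.0, W)
            Py = np.stack([legval(y, np.eye(order + 1)[p]) for p in range(order + 1)])
            Px = np.stack([legval(x, np.eye(order + 1)[p]) for p in range(order + 1)])
            norm = np.outer(2 * np.arange(order + 1) + 1, 2 * np.arange(order + 1) + 1) / 4.0
            lam = norm * (Py @ img @ Px.T) * (2.0 / H) * (2.0 / W)
            return lam.ravel()          # feature vector of (order + 1)**2 moments

        def pca_shape_space(features, n_components=2):
            # Project training shape descriptors onto the top principal components.
            mean = features.mean(axis=0)
            _, _, Vt = np.linalg.svd(features - mean, full_matrices=False)
            basis = Vt[:n_components]
            return mean, basis, (features - mean) @ basis.T

        # Toy usage: synthetic disk silhouettes of varying radius as "training shapes".
        def disk(H=64, W=64, r=0.5):
            yy, xx = np.mgrid[-1:1:complex(H), -1:1:complex(W)]
            return (xx**2 + yy**2 <= r**2).astype(float)

        train = np.stack([legendre_moments(disk(r=r)) for r in (0.3, 0.4, 0.5, 0.6)])
        mean, basis, coords = pca_shape_space(train)
        print("reduced shape coordinates:", coords.round(3))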

    Patterns of adherence to and compliance with the Portuguese smoke-free law in the leisure-hospitality sector

    CIEC – Research Centre on Child Studies, UM (FCT R&D 317)
    Background: In 2008, the Portuguese smoke-free law came into effect, including partial bans in the leisure-hospitality (LH) sector. The objective of the study is to assess the prevalence of smoking control policies (total ban, smoking permission and designated smoking areas) adopted by the LH sector in Portugal. The levels of noncompliance with each policy are investigated, as well as the main factors associated with smoking permission and noncompliance with the law. Methods: Cross-sectional study conducted between January 2010 and May 2011. A random sample of venues was selected from the Portuguese LH sector database, proportionally stratified according to type, size and geographical area. All venues were assessed in loco by an observer. The independent effects of venues' characteristics on smoking permission and the level of noncompliance with the law were explored using logistic regression. Results: Overall, 1,412 venues were included. A total ban policy was adopted by 75.9% of venues, while 8.4% had designated smoking areas. The smoking ban was more prevalent in restaurants (85.9%); only 29.7% of discos/bars/pubs opted for a complete ban. Full or partial smoking permission was higher in discos/bars/pubs (OR = 7.37; 95% CI 4.87 to 11.17). Noncompliance with the law was higher in venues allowing smoking and lower in places with a complete ban (33.6% and 7.6% respectively, p < 0.001). Discos/bars/pubs with full smoking permission had the highest level of noncompliance (OR = 3.31; 95% CI 1.40 to 7.83). Conclusions: Our findings show high adherence to the smoking ban policy by the Portuguese LH sector. Nonetheless, one quarter of the venues are fully or partially permissive towards smoking, with discos/bars/pubs contributing considerably to this situation. Venues with smoking permission policies were less compliant with the legislation. The implementation of a comprehensive smoke-free law, without any exceptions, is essential to effectively protect people from secondhand smoke.
    The work is part of a large Epidemiological Study on the Portuguese Tobacco Control Policy, developed by the Instituto de Medicina Preventiva da Faculdade de Medicina de Lisboa and supported, in its preliminary part, by the Direcção Geral da Saúde (DGS) and, in the second part, by the national funding institution Fundação para a Ciência e a Tecnologia (FCT). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
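
    Since the results above report adjusted odds ratios with 95% confidence intervals from logistic regression, the sketch below shows, on synthetic data with hypothetical column names, how such estimates are typically computed; it is not the study's actual model specification or data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical venue-level data: one row per venue.
        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "permits_smoking": rng.integers(0, 2, 300),                        # 0/1 outcome
            "venue_type": rng.choice(["restaurant", "disco_bar_pub", "cafe"], 300),
            "size": rng.choice(["small", "large"], 300),
        })

        # Logistic regression of smoking permission on venue characteristics;
        # exponentiated coefficients give odds ratios with 95% confidence intervals.
        fit = smf.logit("permits_smoking ~ C(venue_type) + C(size)", data=df).fit(disp=0)
        or_table = pd.DataFrame({
            "OR": np.exp(fit.params),
            "CI_low": np.exp(fit.conf_int()[0]),
            "CI_high": np.exp(fit.conf_int()[1]),
        })
        print(or_table.round(2))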

    Mass calibration and Relative Humidity compensation requirements for optical portable particulate matter monitors: the IMPASHS (Impact of smoke-free policies in EU Member States) WP2 preliminary results

    Better knowledge of particulate matter (PM) concentrations requires portable, reliable, user-friendly, low-cost, real-time mass analyzers of PM2.5 and PM10. Optical Particle Counters (OPC) that report mass rely on a manufacturer-calibrated specific gravity “K” factor referenced to polystyrene latex particles, which differ substantially from real-world aerosols; they therefore require specific mass calibrations. Measurements are also subject to heavy interference from Relative Humidity (RH).
    Fundação para a Ciência e a Tecnologia (FCT)
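
    As an illustration of the two corrections discussed above, the sketch below applies (i) an empirical hygroscopic-growth correction for relative humidity and (ii) a gravimetric calibration factor K obtained by co-location against a reference mass measurement. The functional form of the RH correction and all coefficients are common assumptions from the low-cost-sensor literature, not values derived in this work.

        import numpy as np

        def rh_correction(pm_raw, rh_percent, a=0.4):
            # Empirical hygroscopic-growth correction: divide the optical reading by
            # a growth factor that rises steeply at high RH. The coefficient 'a' is
            # an assumed placeholder, not a value from this study.
            rh = np.clip(np.asarray(rh_percent, float), 0.0, 99.0) / 100.0
            growth = 1.0 + a * rh**2 / (1.0 - rh)
            return np.asarray(pm_raw, float) / growth

        def gravimetric_k_factor(pm_optical, pm_reference):
            # Single multiplicative K factor from co-location with a gravimetric
            # (or equivalent reference) sampler: least-squares slope through the origin.
            pm_optical = np.asarray(pm_optical, float)
            pm_reference = np.asarray(pm_reference, float)
            return float(np.sum(pm_optical * pm_reference) / np.sum(pm_optical**2))

        # Toy usage: correct raw PM2.5 readings for RH, then rescale with the fitted K.
        raw = np.array([12.0, 18.0, 35.0, 60.0])   # ug/m3 from the OPC
        rh = np.array([35.0, 55.0, 80.0, 92.0])    # % relative humidity
        ref = np.array([10.0, 14.0, 20.0, 24.0])   # ug/m3 from a reference sampler
        dry = rh_correction(raw, rh)
        K = gravimetric_k_factor(dry, ref)
        print("K factor:", round(K, 2), "calibrated PM2.5:", (K * dry).round(1))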