
    Generalised Mutual Information: a Framework for Discriminative Clustering

    In the last decade, successes in deep clustering have largely relied on Mutual Information (MI) as an unsupervised objective for training neural networks, combined with increasingly heavy regularisations. While the quality of these regularisations has been widely discussed, little attention has been paid to the relevance of MI as a clustering objective. In this paper, we first highlight how the maximisation of MI does not lead to satisfying clusters. We identify the Kullback-Leibler divergence as the main reason for this behaviour. Hence, we generalise the mutual information by changing its core distance, introducing the Generalised Mutual Information (GEMINI): a set of metrics for unsupervised neural network training. Unlike MI, some GEMINIs do not require regularisations when training, as they are geometry-aware thanks to distances or kernels in the data space. Finally, we highlight that GEMINIs can automatically select a relevant number of clusters, a property that has been little studied in the deep discriminative clustering context, where the number of clusters is a priori unknown. Comment: Submitted for review at the IEEE Transactions on Pattern Analysis and Machine Intelligence. This article is an extension of an original NeurIPS 2022 article [arXiv:2210.06300].
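    As a sketch of the idea (a reading of the abstract rather than the paper's exact definitions), MI can be written as an expected Kullback-Leibler divergence between the cluster-conditional distributions and the data distribution, and GEMINI replaces that KL term with a geometry-aware discrepancy D, such as an MMD built from a kernel or a Wasserstein distance built from a metric in the data space:

```latex
% Mutual information as an expected KL divergence between the
% cluster-conditional distributions and the data marginal
\mathcal{I}(X;Y) = \mathbb{E}_{y \sim p(y)}\left[ D_{\mathrm{KL}}\!\left( p(x \mid y) \,\|\, p(x) \right) \right]

% GEMINI (sketch): swap the KL term for a geometry-aware discrepancy D,
% e.g. an MMD (kernel-based) or a Wasserstein distance (metric-based)
\mathrm{GEMINI}_{D}(X;Y) = \mathbb{E}_{y \sim p(y)}\left[ D\!\left( p(x \mid y),\, p(x) \right) \right]
```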

    Kernel KMeans clustering splits for end-to-end unsupervised decision trees

    Trees are convenient models for obtaining explainable predictions on relatively small datasets. Although there are many proposals for the end-to-end construction of such trees in supervised learning, learning a tree end-to-end for clustering without labels remains an open challenge. As most works focus on using trees to interpret the result of another clustering algorithm, we present here a novel end-to-end trained unsupervised binary tree for clustering: Kauri. This method performs a greedy maximisation of the kernel KMeans objective without requiring the definition of centroids. We compare this model against recent unsupervised trees on multiple datasets and show that Kauri performs identically when using a linear kernel. For other kernels, Kauri often outperforms the pipeline of kernel KMeans followed by a CART decision tree.
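    As a minimal sketch of the centroid-free form of the kernel KMeans objective that such a tree can greedily maximise (the standard identity, not necessarily Kauri's exact formulation or its split-search procedure), the score of a hard assignment is the sum over clusters of the within-cluster kernel similarities divided by the cluster size:

```python
import numpy as np

def kernel_kmeans_objective(K: np.ndarray, labels: np.ndarray) -> float:
    """Centroid-free kernel KMeans score for a hard assignment.

    Sums, over clusters, the within-cluster kernel similarities divided
    by the cluster size; maximising this score is equivalent to
    minimising the within-cluster variance in the kernel feature space.
    """
    score = 0.0
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        score += K[np.ix_(idx, idx)].sum() / len(idx)
    return score

# toy usage with a linear kernel on random points
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = X @ X.T                            # linear kernel matrix
labels = rng.integers(0, 2, size=20)   # an arbitrary 2-cluster assignment
print(kernel_kmeans_objective(K, labels))
```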

    Pearl shape classification using deep convolutional neural networks from Tahitian pearl rotation in Pinctada margaritifera

    Tahitian pearls, artificially cultivated from the black-lipped pearl oyster Pinctada margaritifera, are renowned for their unique color and large size, making the pearl industry vital for the French Polynesian economy. Understanding the mechanisms of pearl formation is essential for enabling quality and sustainable production. In this paper, we explore the process of pearl formation by studying pearl rotation. Here we show, using a deep convolutional neural network, a direct link between the rotation of the pearl during its formation in the oyster and its final shape. We propose a new method for non-invasive pearl monitoring and a model for predicting the final shape of the pearl from rotation data with 81.9% accuracy. These novel resources provide a fresh perspective to study and enhance our comprehension of the overall mechanism of pearl formation, with potential long-term applications for improving pearl production and quality control in the industry.
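    For illustration only, a minimal sketch of a convolutional classifier mapping a rotation recording to a shape class; the input format, the architecture, and the number of shape classes here are assumptions, not the model described in the paper:

```python
# Hypothetical 1D CNN over a rotation time series; all sizes and class
# counts below are placeholders, not the paper's actual setup.
import torch
import torch.nn as nn

N_CLASSES = 4      # assumed number of shape categories
SEQ_LEN = 256      # assumed length of one rotation recording
N_CHANNELS = 3     # assumed rotation features per time step

class PearlShapeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):              # x: (batch, N_CHANNELS, SEQ_LEN)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)      # unnormalised class scores

model = PearlShapeCNN()
logits = model(torch.randn(8, N_CHANNELS, SEQ_LEN))
print(logits.shape)                    # torch.Size([8, 4])
```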

    Best-worst scaling identified adequate statistical methods and literature search as the most important items of AMSTAR2 (A measurement tool to assess systematic reviews)

    Objective: To assess the relative importance of A MeaSurement Tool to Assess systematic Reviews 2 (AMSTAR2) items. Study design and setting: A best-worst scaling (object case) survey was conducted among a sample of experts in the field of systematic reviews (SRs) and meta-analyses (MAs). In a series of 15 choice tasks, respondents were asked to choose the most and the least important item from a set of four items drawn from the master list of the 16 AMSTAR2 items. Hierarchical Bayes analysis was used to generate a relative importance score for each item. Results: The items rated most important by our 242 experts for conducting overviews of reviews and critically assessing SRs/MAs were the appropriateness of statistical analyses and the adequacy of the literature search, followed by items regarding the assessment of risk of bias, the research protocol, and the assessment of heterogeneity (relative importance score >6.5). Items related to funding sources and to performing study selection and data extraction in duplicate were rated as least important. Conclusion: Although all AMSTAR2 items can be considered important, our results highlight the value of keeping the two items on the appropriateness of statistical analyses and the adequacy of the literature search among the critical items proposed by AMSTAR2 for appraising SRs/MAs.
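    As a simplified illustration of how best-worst scaling responses can be summarised (the study itself used hierarchical Bayes, not this count-based score), each item's relative importance can be approximated by how often it was chosen as best minus how often it was chosen as worst, normalised by how often it was shown:

```python
# Count-based best-minus-worst scores; a descriptive approximation only,
# not the hierarchical Bayes analysis used in the study.
from collections import Counter

def best_worst_scores(choice_tasks):
    """choice_tasks: iterable of (shown_items, best_item, worst_item).

    Returns a dict mapping each item to (#best - #worst) / #appearances.
    """
    best, worst, shown = Counter(), Counter(), Counter()
    for items, b, w in choice_tasks:
        shown.update(items)
        best[b] += 1
        worst[w] += 1
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}

# toy example with hypothetical item labels
tasks = [
    (("stats", "search", "rob", "funding"), "stats", "funding"),
    (("stats", "search", "protocol", "duplicate"), "search", "duplicate"),
    (("rob", "search", "funding", "duplicate"), "search", "funding"),
]
print(best_worst_scores(tasks))
```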

    3D microscopic reconstruction of pearls using combined optical microscopy and photogrammetry

    In this study, we introduce an affordable and accessible method that combines optical microscopy and photogrammetry to reconstruct 3D models of Tahitian pearls. We present a novel device, built on translational displacement stages, for acquiring microscopic images around a sphere, and we outline our method for reconstruction from these images. We successfully created 3D models of two individual pearl rings, each representing 6.3% of the pearl’s surface. Additionally, we generated a combined model representing 10.3% of the pearl’s surface. This showcases the potential for reconstructing entire pearls with appropriate instrumentation. We emphasize that our approach extends beyond pearls and spherical objects and can be adapted to various object types using appropriate acquisition devices. We provide a proof of concept demonstrating the feasibility of 3D photogrammetry using optical microscopy. Consequently, our method offers a practical and cost-effective alternative for generating 3D models at a microscopic scale, particularly when detailed internal structure information is unnecessary.

    Sparse GEMINI for joint discriminative clustering and feature selection

    Feature selection in clustering is a hard task that involves simultaneously discovering relevant clusters and the variables relevant to these clusters. While feature selection algorithms are often model-based, through optimised model selection or strong assumptions on p(x), we introduce a discriminative clustering model that maximises a geometry-aware generalisation of the mutual information, called GEMINI, with a simple ℓ1 penalty: the Sparse GEMINI. This algorithm avoids the burden of combinatorial feature subset exploration and scales easily to high-dimensional data and large numbers of samples while only designing a clustering model p_θ(y|x). We demonstrate the performance of Sparse GEMINI on synthetic datasets as well as large-scale datasets. Our results show that Sparse GEMINI is a competitive algorithm that can select relevant subsets of variables with respect to the clustering without using relevance criteria or prior hypotheses.
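    A generic sketch of the resulting objective, assuming the ℓ1-type penalty is applied to the weights W_j that connect input feature j to the clustering network (the paper's exact penalty and parametrisation may differ); features whose weights are driven to zero are effectively deselected:

```latex
% Sparse GEMINI (sketch): maximise a GEMINI clustering objective with a
% sparsity penalty on the per-feature input weights W_j; lambda controls
% how many features survive. The paper's exact penalty may differ.
\max_{\theta}\; \mathrm{GEMINI}\!\left( p_{\theta}(y \mid x) \right)
  \;-\; \lambda \sum_{j=1}^{d} \left\lVert W_{j} \right\rVert_{1}
```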

    Generalised Mutual Information for Discriminative Clustering

    In the last decade, successes in deep clustering have largely relied on mutual information (MI) as an unsupervised objective for training neural networks, combined with increasingly heavy regularisations. While the quality of these regularisations has been widely discussed, little attention has been paid to the relevance of MI as a clustering objective. In this paper, we first highlight how the maximisation of MI does not lead to satisfying clusters. We identify the Kullback-Leibler divergence as the main reason for this behaviour. Hence, we generalise the mutual information by changing its core distance, introducing the generalised mutual information (GEMINI): a set of metrics for unsupervised neural network training. Unlike MI, some GEMINIs do not require regularisations when training. Some of these metrics are geometry-aware thanks to distances or kernels in the data space. Finally, we highlight that GEMINIs can automatically select a relevant number of clusters, a property that has been little studied in the deep clustering context, where the number of clusters is a priori unknown.

    Bacterial membrane bilayer as drug target

    The widespread emergence of bacterial resistance has led to an urgent need to develop new strategies to regain the efficacy of antibacterials. One emerging concept is to target the bacterial membrane bilayer. Aminoglycosides are among the most potent antimicrobials for treating severe infections. In the search for new antibiotics, we synthesized derivatives of the small aminoglycoside neamine with the aim of obtaining amphiphilic antibiotics able to disturb the bacterial membrane bilayer. One to four hydroxyl functions of neamine were capped with phenyl, naphthyl, pyridyl, or quinolyl rings. The 3',4'-, 3',6- and 3',4',6-2-naphthylmethylene (2NM) derivatives were active against both sensitive and resistant S. aureus strains. The trisubstituted derivative also showed marked antibacterial activity against Gram-negative bacteria, including resistant strains (1). Regarding its mechanism of action, it showed only weak and non-specific binding to a model bacterial 16S rRNA as well as a lower ability to decrease 3H-leucine incorporation into proteins in P. aeruginosa, suggesting that it acts through a mechanism probably involving membrane destabilization. To understand the molecular mechanism involved, we determined the ability of 3',4',6-tri-2NM neamine to interact with the bacterial membranes of P. aeruginosa or with models mimicking these membranes. Using Atomic Force Microscopy (AFM), we observed a decrease in P. aeruginosa cell thickness. In models of bacterial lipid membranes, we showed lipid membrane permeabilization, in agreement with the deep insertion of 3',4',6-tri-2NM neamine within the lipid bilayer, as predicted by modeling. This new amphiphilic aminoglycoside bound to lipopolysaccharides and induced P. aeruginosa membrane depolarization. All these effects were compared to those obtained with neamine, the disubstituted neamine derivative (3',6-di-2NM neamine), conventional aminoglycosides (neomycin B and gentamicin), as well as compounds acting on lipid bilayers such as colistin and chlorhexidine. Altogether, the data showed that 3',4',6-tri-2NM neamine derivatives target the membrane of P. aeruginosa (2). This should offer promising prospects in the search for new antibacterials against drug- or biocide-resistant strains.