
    Matching Image Sets via Adaptive Multi Convex Hull

    Traditional nearest points methods use all the samples in an image set to construct a single convex or affine hull model for classification. However, strong artificial features and noisy data may be generated from combinations of training samples when significant intra-class variations and/or noise occur in the image set. Existing multi-model approaches extract local models by clustering each image set individually only once, with fixed clusters used for matching with various image sets. This may not be optimal for discrimination, as undesirable environmental conditions (e.g., illumination and pose variations) may result in the two closest clusters representing different characteristics of an object (e.g., a frontal face being compared to a non-frontal face). To address this problem, we propose a novel approach that enhances nearest points based methods by integrating affine/convex hull classification with an adapted multi-model approach. We first extract multiple local convex hulls from a query image set via maximum margin clustering to diminish the artificial variations and constrain the noise in local convex hulls. We then propose adaptive reference clustering (ARC) to constrain the clustering of each gallery image set by forcing the clusters to have resemblance to the clusters in the query image set. By applying ARC, noisy clusters in the query set can be discarded. Experiments on the Honda, MoBo and ETH-80 datasets show that the proposed method outperforms single-model approaches and other recent techniques, such as Sparse Approximated Nearest Points, Mutual Subspace Method and Manifold Discriminant Analysis.
    Comment: IEEE Winter Conference on Applications of Computer Vision (WACV), 201
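    For readers who want the core primitive, the sketch below (not the authors' code; the `convex_hull_distance` helper and its quadratic-programming formulation are illustrative assumptions) computes the nearest-points distance between the convex hulls of two image sets, the quantity that single-hull and multi-hull methods alike compare:

```python
# Minimal sketch: distance between conv(X) and conv(Y), where each column
# of X (d x m) and Y (d x n) is a feature vector from one image.
import numpy as np
from scipy.optimize import minimize

def convex_hull_distance(X, Y):
    """Nearest-points distance between the convex hulls of two sets."""
    m, n = X.shape[1], Y.shape[1]

    def objective(w):
        a, b = w[:m], w[m:]
        diff = X @ a - Y @ b
        return diff @ diff

    # Convex-combination constraints: weights non-negative, summing to 1.
    cons = [
        {"type": "eq", "fun": lambda w: w[:m].sum() - 1.0},
        {"type": "eq", "fun": lambda w: w[m:].sum() - 1.0},
    ]
    bounds = [(0.0, 1.0)] * (m + n)
    w0 = np.concatenate([np.full(m, 1.0 / m), np.full(n, 1.0 / n)])
    res = minimize(objective, w0, method="SLSQP", bounds=bounds, constraints=cons)
    return np.sqrt(max(res.fun, 0.0))
```

    The full method would apply such a distance between matched local hulls obtained via maximum margin clustering and ARC, rather than between whole sets as shown here.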

    On the exponential decay of the Euler-Bernoulli beam with boundary energy dissipation

    We study the asymptotic behavior of the Euler-Bernoulli beam which is clamped at one end and free at the other end. We apply a boundary control with memory at the free end of the beam and prove that the "exponential decay" of the memory kernel is a necessary and sufficient condition for the exponential decay of the energy.
    Comment: 13 pages
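    For orientation, a minimal formulation of the kind of system the abstract describes is sketched below (unit-length beam, standard normalisation, and sign conventions are assumptions; the paper's exact boundary conditions may differ):

```latex
% Euler-Bernoulli beam, clamped at x = 0, free at x = 1:
u_{tt}(x,t) + u_{xxxx}(x,t) = 0, \qquad 0 < x < 1,\ t > 0,
u(0,t) = u_x(0,t) = 0, \qquad u_{xx}(1,t) = 0,
% boundary control with memory acting on the shear force at the free end:
u_{xxx}(1,t) = \int_0^t k(t-s)\, u_t(1,s)\, \mathrm{d}s,
% energy whose exponential decay is characterised by that of the kernel k:
E(t) = \tfrac{1}{2} \int_0^1 \left( u_t(x,t)^2 + u_{xx}(x,t)^2 \right) \mathrm{d}x.
```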

    Hit Dexter 2.0: Machine-Learning Models for the Prediction of Frequent Hitters

    Assay interference caused by small molecules continues to pose a significant challenge for early drug discovery. A number of rule-based and similarity-based approaches have been derived that allow the flagging of potentially “badly behaving compounds”, “bad actors”, or “nuisance compounds”. These compounds are typically aggregators, reactive compounds, and/or pan-assay interference compounds (PAINS), and many of them are frequent hitters. Hit Dexter is a recently introduced machine learning approach that predicts frequent hitters independent of the underlying physicochemical mechanisms (including also the binding of compounds based on “privileged scaffolds” to multiple binding sites). Here we report on the development of a second generation of machine learning models which now covers both primary screening assays and confirmatory dose–response assays. Protein sequence clustering was newly introduced to minimize the overrepresentation of structurally and functionally related proteins. The models correctly classified compounds of large independent test sets as (highly) promiscuous or nonpromiscuous with Matthews correlation coefficient (MCC) values of up to 0.64 and area under the receiver operating characteristic curve (AUC) values of up to 0.96. The models were also utilized to characterize sets of compounds with specific biological and physicochemical properties, such as dark chemical matter, aggregators, compounds from a high-throughput screening library, drug-like compounds, approved drugs, potential PAINS, and natural products. Among the most interesting outcomes is that the new Hit Dexter models predict the presence of large fractions of (highly) promiscuous compounds among approved drugs. Importantly, predictions of the individual Hit Dexter models are generally in good agreement and consistent with those of Badapple, an established statistical model for the prediction of frequent hitters. The new Hit Dexter 2.0 web service, available at http://hitdexter2.zbh.uni-hamburg.de, provides user-friendly access not only to all machine learning models presented in this work but also to similarity-based methods for the prediction of aggregators and dark chemical matter, as well as a comprehensive collection of available rule sets for flagging frequent hitters and compounds containing undesired substructures.
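    As an illustration of the reported metrics only (synthetic data and a generic classifier standing in for the actual Hit Dexter descriptors and models), MCC and ROC AUC for a binary promiscuity classifier can be computed as follows:

```python
# Hedged sketch: score a stand-in promiscuity classifier with the two
# metrics the abstract reports, MCC and ROC AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import matthews_corrcoef, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 32))                 # stand-in molecular descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # stand-in promiscuity labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("MCC:", matthews_corrcoef(y_te, clf.predict(X_te)))
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```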

    Classification of Human Epithelial Type 2 Cell Indirect Immunofluorescence Images via Codebook Based Descriptors

    The Anti-Nuclear Antibody (ANA) clinical pathology test is commonly used to identify the existence of various diseases. A hallmark method for identifying the presence of ANAs is the Indirect Immunofluorescence method on Human Epithelial (HEp-2) cells, due to its high sensitivity and the large range of antigens that can be detected. However, the method suffers from numerous shortcomings, such as being subjective as well as time- and labour-intensive. Computer Aided Diagnostic (CAD) systems have been developed to address these problems; they automatically classify a HEp-2 cell image into one of its known patterns (e.g., speckled, homogeneous). Most of the existing CAD systems use handpicked features to represent a HEp-2 cell image, which may only work in limited scenarios. In this paper, we propose a cell classification system comprising a dual-region codebook-based descriptor combined with the Nearest Convex Hull Classifier. We evaluate the performance of several variants of the descriptor on two publicly available datasets: the ICPR HEp-2 cell classification contest dataset and the new SNPHEp-2 dataset. To our knowledge, this is the first time codebook-based descriptors have been applied and studied in this domain. Experiments show that the proposed system has consistently high performance and is more robust than two recent CAD systems.
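    A minimal sketch of the codebook idea, under standard bag-of-visual-words assumptions (the `build_codebook`/`describe` helpers are hypothetical; the paper's dual-region pooling and the Nearest Convex Hull Classifier are not reproduced here):

```python
# Hedged sketch: learn a visual codebook from local patches, then
# describe an image as a normalised histogram of codeword assignments.
import numpy as np
from sklearn.cluster import KMeans

def build_codebook(patches, k=128):
    """Learn k visual words from a (num_patches, patch_dim) array."""
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit(patches)

def describe(image_patches, codebook):
    """Quantise one image's patches and return a normalised histogram."""
    words = codebook.predict(image_patches)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)
```

    In a dual-region variant, one would compute such histograms separately for, say, inner and outer cell regions and concatenate them before classification.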

    An integrated approach to modelling the fluid-structure interaction of a collapsible tube

    The well-known collapsible tube experiment was conducted to obtain flow, pressure and material property data for steady-state conditions. These were then used as the boundary conditions for a fully coupled fluid-structure interaction (FSI) model using a proprietary computer code, LS-DYNA. The shape profiles of the tube were also recorded. In order to obtain collapse modes similar to those of the experiment, it was necessary to model the tube flat and then inflate it into a circular profile, leaving residual stresses in the walls. The profile shapes then agreed well with the experimental ones. Two departures from the physical properties were required to reduce computer time to an acceptable level. One was the lowering of the speed of sound by two orders of magnitude, which, due to the low velocities involved, still left the Mach number below 0.2. The other was to increase the thickness of the tube to prevent the numerical collapse of elements; this was compensated for by lowering the Young's modulus of the tube material. Overall the results are qualitatively good. They give an indication of the power of current FSI algorithms and of the need to combine experiments and computer models in order to maximise the information that can be extracted, both in terms of quantity and quality.
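    The speed-of-sound scaling argument can be checked with illustrative numbers (the flow velocity below is an assumption for the sketch, not a value from the paper):

```python
# Back-of-the-envelope check of the abstract's scaling argument: reducing
# the speed of sound by two orders of magnitude still keeps the flow in
# the incompressible regime (Mach < 0.2).
c_water = 1480.0           # speed of sound in water, m/s (textbook value)
c_model = c_water / 100.0  # reduced by two orders of magnitude -> 14.8 m/s
u_flow = 2.0               # assumed peak flow velocity, m/s

mach = u_flow / c_model
print(f"Mach = {mach:.2f}")  # ~0.14, below the 0.2 compressibility threshold
```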

    Protocol for a systematic review and network meta-analysis of the use of prophylactic antibiotics in hand trauma surgery

    Background: The use of prophylactic antibiotics in surgery is contentious. With the rise in antimicrobial resistance, evidence-based antibiotic use should be followed. This systematic review and network meta-analysis will assess the effectiveness of different antibiotics in preventing surgical site infection (SSI) following hand trauma surgery. Methods and analysis: The databases Embase, MEDLINE, CINAHL and CENTRAL, as well as ClinicalTrials.gov and the WHO International Clinical Trials Registry Platform, will be searched. Abstracts will be screened by two persons independently to identify eligible studies. This systematic review will include both randomised and non-randomised prospective comparative studies in participants with hand and/or wrist injuries requiring surgery; bite injuries will be excluded. The network meta-analysis will compare the use of different prophylactic antibiotics against each other, placebo and/or no antibiotics on the development of SSI within 30 days of surgery (or 90 days if there is an implanted device). The Cochrane risk-of-bias tool 2 will be used to assess the risk of methodological bias in randomised controlled trials, and the Newcastle-Ottawa scale (NOS) will be used to assess the risk of bias in non-randomised studies. A random-effects network meta-analysis will be conducted, along with subgroup analyses looking at antibiotic timing, injury type, and operation location. Sensitivity analyses including only low risk-of-bias studies will be conducted, and the confidence in the results will be assessed using Confidence in Network Meta-Analysis (CINeMA). Discussion: This systematic review and network meta-analysis aims to provide an up-to-date synthesis of the studies assessing the use of antibiotics following hand and wrist trauma to enable evidence-based peri-operative prescribing. Systematic review registration: PROSPERO CRD42023429618.
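    For orientation, the random-effects pooling step that such an analysis builds on can be sketched as follows (DerSimonian-Laird estimator on pairwise data; a network meta-analysis additionally combines direct and indirect comparisons across the treatment network, which this sketch omits):

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of study-level
# effect estimates (e.g. log odds ratios) with within-study variances.
import numpy as np

def dersimonian_laird(y, v):
    """Return pooled effect, its standard error, and tau^2."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = (w * y).sum() / w.sum()                # fixed-effect pooled estimate
    q = (w * (y - y_fe) ** 2).sum()               # Cochran's Q heterogeneity
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = (w_re * y).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return mu, se, tau2

# Illustrative data only: three studies' log odds ratios and variances.
mu, se, tau2 = dersimonian_laird([-0.4, -0.1, -0.6], [0.05, 0.08, 0.10])
print(f"pooled effect {mu:.3f} (SE {se:.3f}), tau^2 = {tau2:.3f}")
```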