
    Object Counting with Deep Learning

    This thesis explores various empirical aspects of deep learning, i.e. convolutional network based models, for efficient object counting. First, we train moderately large convolutional networks from scratch on comparatively small datasets containing a few hundred samples, using conventional image-processing-based data augmentation. Then, we extend this approach to unconstrained, outdoor images using more advanced architectural concepts. Additionally, we propose an efficient, randomized data augmentation strategy based on sub-regional pixel distribution for low-resolution images. Next, the effectiveness of depth-to-space shuffling of feature elements for efficient segmentation is investigated for simpler problems like binary segmentation, which is often required in the counting framework. This depth-to-space operation violates the basic assumption of encoder-decoder segmentation architectures; consequently, it allows the encoder model to be trained as a sparsely connected graph. Nonetheless, our depth-to-space models achieve accuracy comparable to that of standard encoder-decoder architectures. After that, the subtleties arising from the lack of localization information in the conventional scalar count loss for one-look models are illustrated. Without using additional annotations, a possible solution is proposed based on regulating a network-generated heatmap through a weak, subsidiary loss. Models trained with this auxiliary loss alongside the conventional loss perform much better than their baseline counterparts, both qualitatively and quantitatively. Lastly, the intricacies of tiled prediction for high-resolution images are studied in detail, and a simple and effective trick of eliminating the normalization factor in an existing computational block is demonstrated. All of the approaches employed here are thoroughly benchmarked across multiple heterogeneous datasets for object counting against previous state-of-the-art approaches.
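    The auxiliary heatmap loss described in this abstract can be illustrated with a short sketch. The snippet below is an assumption-laden illustration, not the thesis's actual code: it combines a scalar count loss with a weak subsidiary loss that ties the total mass of a network-generated heatmap to the ground-truth count, requiring no extra annotations. The function name, loss choices and weighting are hypothetical.

```python
# Hypothetical sketch (not the thesis's implementation): a scalar count loss
# plus a weak auxiliary loss on a network-generated heatmap.
import torch
import torch.nn.functional as F

def counting_loss(pred_heatmap, pred_count, true_count, aux_weight=0.1):
    """pred_heatmap: (B, 1, H, W) non-negative map; pred_count, true_count: (B,)."""
    # Primary scalar count loss (L1 here; the thesis may use a different criterion).
    count_loss = F.l1_loss(pred_count, true_count)
    # Weak subsidiary loss: encourage the heatmap's total mass to agree with the
    # ground-truth count, giving a coarse localization signal without extra
    # (e.g. dot or box) annotations.
    heatmap_mass = pred_heatmap.sum(dim=(1, 2, 3))
    aux_loss = F.l1_loss(heatmap_mass, true_count)
    return count_loss + aux_weight * aux_loss
```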

    Ground Truth Evaluation of Neural Network Explanations with CLEVR-XAI

    The rise of deep learning in today's applications has entailed an increasing need to explain a model's decisions beyond prediction performance in order to foster trust and accountability. Recently, the field of explainable AI (XAI) has developed methods that provide such explanations for already trained neural networks. In computer vision tasks, such explanations, termed heatmaps, visualize the contributions of individual pixels to the prediction. So far, XAI methods and their heatmaps have mainly been validated qualitatively via human-based assessment, or evaluated through auxiliary proxy tasks such as pixel perturbation, weak object localization or randomization tests. Due to the lack of an objective and commonly accepted quality measure for heatmaps, it has been debatable which XAI method performs best and whether explanations can be trusted at all. In the present work, we tackle this problem by proposing a ground-truth-based evaluation framework for XAI methods built on the CLEVR visual question answering task. Our framework provides a (1) selective, (2) controlled and (3) realistic testbed for the evaluation of neural network explanations. We compare ten different explanation methods, resulting in new insights about the quality and properties of XAI methods, sometimes contradicting conclusions from previous comparative studies. The CLEVR-XAI dataset and the benchmarking code can be found at https://github.com/ahmedmagdiosman/clevr-xai. Comment: 37 pages, 9 tables, 2 figures (plus a 14-page appendix).
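    One way such a ground-truth-based evaluation can work is to measure how much of a heatmap's positive relevance falls inside the annotated object mask. The sketch below illustrates this idea in the spirit of the framework; the metric definition and names are simplified assumptions rather than the paper's reference implementation.

```python
# Illustrative sketch of a "relevance mass" style metric: the fraction of total
# positive relevance that lands inside the ground-truth object mask.
import numpy as np

def relevance_mass_accuracy(heatmap: np.ndarray, gt_mask: np.ndarray) -> float:
    """heatmap: (H, W) pixel relevances; gt_mask: (H, W) boolean ground-truth mask."""
    relevance = np.clip(heatmap, 0, None)           # keep positive evidence only
    total = relevance.sum()
    if total == 0:
        return 0.0
    inside = relevance[gt_mask.astype(bool)].sum()  # relevance landing on the object
    return float(inside / total)
```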

    Unmasking Clever Hans Predictors and Assessing What Machines Really Learn

    Current learning machines have successfully solved hard application problems, reaching high accuracy and displaying seemingly "intelligent" behavior. Here we apply recent techniques for explaining the decisions of state-of-the-art learning machines and analyze various tasks from computer vision and arcade games. This showcases a spectrum of problem-solving behaviors ranging from naive and short-sighted to well-informed and strategic. We observe that standard performance evaluation metrics can be oblivious to distinguishing these diverse problem-solving behaviors. Furthermore, we propose our semi-automated Spectral Relevance Analysis, which provides a practically effective way of characterizing and validating the behavior of nonlinear learning machines. This helps to assess whether a learned model indeed delivers reliably for the problem it was conceived for. Finally, our work intends to add a voice of caution to the ongoing excitement about machine intelligence and pledges to evaluate and judge some of these recent successes in a more nuanced manner. Comment: Accepted for publication in Nature Communications.
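    Spectral Relevance Analysis groups per-sample relevance heatmaps so that recurring prediction strategies (including "Clever Hans" ones, e.g. a cluster driven by a watermark) become visible as clusters for manual inspection. The sketch below is a simplified illustration of that idea using off-the-shelf spectral clustering; the preprocessing and parameters are assumptions, not the authors' implementation.

```python
# Simplified sketch of the SpRAy idea: cluster per-sample relevance heatmaps
# and inspect the resulting clusters for suspicious strategies.
import numpy as np
from sklearn.cluster import SpectralClustering

def spray_clusters(heatmaps: np.ndarray, n_clusters: int = 5) -> np.ndarray:
    """heatmaps: (N, H, W) relevance maps for N samples; returns one cluster label per sample."""
    n = heatmaps.shape[0]
    features = heatmaps.reshape(n, -1).astype(np.float64)   # flatten each heatmap
    features /= np.linalg.norm(features, axis=1, keepdims=True) + 1e-12
    clustering = SpectralClustering(n_clusters=n_clusters,
                                    affinity="nearest_neighbors",
                                    random_state=0)
    return clustering.fit_predict(features)                 # inspect clusters manually
```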

    Single-cell analysis of ER-positive breast cancer treated with letrozole and ribociclib

    Breast cancer is the most widespread cancer in the world, accounting for 25% of all female cancers. There is high inter- and intra-tumor heterogeneity in breast cancer, which makes it challenging to optimize treatment for the individual patient. In recent years, the role of immune infiltration in tumor carcinogenesis and pathophysiology has been increasingly recognized. It has therefore become a priority to understand the interactions and cooperation between immune and cancer cells. Despite thorough attempts to match treatment options with clinicopathological features such as histological classification, grade, stage, biomarkers, molecular subtypes, and intrinsic subtypes, many patients show resistance to treatment. One attempt to overcome treatment resistance is combinatorial treatment, meaning treating patients with two drugs at the same time. CDK4/6 inhibitors are anti-cancer drugs that block cell growth and have shown promising results in combination with aromatase inhibitors for breast cancer patients with hormone receptor positive disease. This drug combination is not yet approved in Norway as standard neoadjuvant treatment. The NeoLetRib clinical trial gives patients access to the combination of an aromatase inhibitor and a CDK4/6 inhibitor. The study also provides an opportunity to investigate novel predictive biomarkers for more personalized treatment and to assess how the tumor microenvironment changes during treatment. Single-cell analysis is the method we used to capture each cell's transcriptome in the tumor microenvironment. We performed scRNA-seq of breast cancer biopsies from patients enrolled in the NeoLetRib clinical trial before the neoadjuvant treatment and after 21 days. This study shows that five cellular subtypes, including Tregs and four monocyte subtypes, exhibited a significant proportional change. These cell types have been associated with the promotion of a proinflammatory microenvironment and may be associated with tumor progression.
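    As an illustration of the proportional-change comparison mentioned in this abstract, the sketch below contrasts cell-type fractions between pre-treatment and day-21 samples from an annotated single-cell table. The column names and the absence of a statistical test are assumptions; this is not the study's pipeline.

```python
# Hypothetical sketch: per-timepoint cell-type proportions and their change.
import pandas as pd

def proportion_change(cells: pd.DataFrame) -> pd.DataFrame:
    """cells: one row per cell, with 'cell_type' and 'timepoint' ('pre' or 'day21') columns."""
    counts = cells.groupby(["timepoint", "cell_type"]).size().unstack(fill_value=0)
    proportions = counts.div(counts.sum(axis=1), axis=0)    # per-timepoint fractions
    change = (proportions.loc["day21"] - proportions.loc["pre"]).sort_values()
    return change.to_frame("delta_proportion")              # negative = decreased share
```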

    Counting and Locating High-Density Objects Using Convolutional Neural Network

    This paper presents a Convolutional Neural Network (CNN) approach for counting and locating objects in high-density imagery. To the best of our knowledge, this is the first object counting and locating method based on feature map enhancement and a Multi-Stage Refinement of the confidence map. The proposed method was evaluated on two counting datasets: tree and car. For the tree dataset, our method returned a mean absolute error (MAE) of 2.05, a root-mean-squared error (RMSE) of 2.87 and a coefficient of determination (R²) of 0.986. For the car datasets (CARPK and PUCPR+), our method was superior to state-of-the-art methods, achieving an MAE of 4.45 and 3.16, an RMSE of 6.18 and 4.39, and an R² of 0.975 and 0.999, respectively. The proposed method is suitable for dealing with high object density, returning state-of-the-art performance for counting and locating objects. Comment: 15 pages, 10 figures, 8 tables.
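    The reported metrics follow standard definitions; a minimal sketch of computing MAE, RMSE and R² from predicted versus ground-truth counts is given below. It mirrors the textbook formulas, not the authors' evaluation code.

```python
# Minimal sketch of the counting metrics reported above.
import numpy as np

def counting_metrics(pred: np.ndarray, true: np.ndarray) -> dict:
    err = pred - true
    mae = np.abs(err).mean()                       # mean absolute error
    rmse = np.sqrt((err ** 2).mean())              # root-mean-squared error
    ss_res = (err ** 2).sum()
    ss_tot = ((true - true.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot                     # coefficient of determination
    return {"MAE": float(mae), "RMSE": float(rmse), "R2": float(r2)}
```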

    EZH2 modifies sunitinib resistance in renal cell carcinoma by kinome reprogramming

    Acquired and intrinsic resistance to receptor tyrosine kinase inhibitors (RTKi) represents a major hurdle in improving the management of clear cell renal cell carcinoma (ccRCC). Recent reports suggest that drug resistance is driven by tumor adaptation via epigenetic mechanisms that activate alternative survival pathways. The histone methyltransferase EZH2 is frequently altered in many cancers, including ccRCC. To evaluate its role in ccRCC resistance to RTKi, we established and characterized a spontaneously metastatic, patient-derived xenograft (PDX) model that is intrinsically resistant to the RTKi sunitinib but not to the VEGF therapeutic antibody bevacizumab. Sunitinib maintained its anti-angiogenic and anti-metastatic activity but lost its direct anti-tumor effects due to kinome reprogramming, which resulted in suppression of pro-apoptotic and cell cycle regulatory target genes. Modulating EZH2 expression or activity suppressed phosphorylation of certain RTKs, restoring the anti-tumor effects of sunitinib in models of acquired or intrinsically resistant ccRCC. Overall, our results highlight EZH2 as a rational target for therapeutic intervention in sunitinib-resistant ccRCC as well as a predictive marker for RTKi response in this disease. This research was funded by Roswell Park Cancer Institute's Cancer Center Support Grant from the National Cancer Institute, NIH P30CA016056 (RP), and a generous donation by Richard and Deidre Turner (RP). This investigation was conducted in part in a facility constructed with support from Research Facilities Improvement Program Grant Number C06 RR020128-01 from the National Center for Research Resources, National Institutes of Health.