
    Personalized Pancreatic Tumor Growth Prediction via Group Learning

    Tumor growth prediction, a highly challenging task, has long been viewed as a mathematical modeling problem, where the tumor growth pattern is personalized based on imaging and clinical data of a target patient. Though mathematical models yield promising results, their prediction accuracy may be limited by the absence of population trend data and personalized clinical characteristics. In this paper, we propose a statistical group learning approach to predict the tumor growth pattern that incorporates both the population trend and personalized data, in order to discover high-level features from multimodal imaging data. A deep convolutional neural network approach is developed to model the voxel-wise spatio-temporal tumor progression. The deep features are combined with the time intervals and the clinical factors to feed a feature selection process. Our predictive model is pretrained on a group data set and personalized on the target patient data to estimate the future spatio-temporal progression of the patient's tumor. Multimodal imaging data at multiple time points are used in the learning, personalization, and inference stages. Our method achieves a Dice similarity coefficient (DSC) of 86.8% ± 3.6% and a relative volume difference (RVD) of 7.9% ± 5.4% on a pancreatic tumor data set, outperforming the DSC of 84.4% ± 4.0% and RVD of 13.9% ± 9.8% obtained by a previous state-of-the-art model-based method.
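    The reported DSC and RVD are standard volumetric agreement metrics for segmentation and growth prediction. As a minimal sketch only (the abstract does not give the paper's exact definitions or sign conventions, so the implementations below are common-usage assumptions), they can be computed from binary tumor masks as follows:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def relative_volume_difference(pred: np.ndarray, truth: np.ndarray) -> float:
    """Absolute relative volume difference, as a fraction of the true volume."""
    pred_vol = pred.astype(bool).sum()
    truth_vol = truth.astype(bool).sum()
    return abs(pred_vol - truth_vol) / truth_vol

# Illustrative example: compare a predicted future tumor mask with the follow-up mask.
pred = np.random.rand(64, 64, 64) > 0.5
truth = np.random.rand(64, 64, 64) > 0.5
print(f"Dice: {dice_coefficient(pred, truth):.3f}, "
      f"RVD: {relative_volume_difference(pred, truth):.3f}")
```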

    Prospects for Theranostics in Neurosurgical Imaging: Empowering Confocal Laser Endomicroscopy Diagnostics via Deep Learning

    Confocal laser endomicroscopy (CLE) is an advanced optical fluorescence imaging technology that has the potential to increase intraoperative precision, extend resection, and tailor surgery for malignant invasive brain tumors because of its subcellular-dimension resolution. Despite its promising diagnostic potential, interpreting the gray-tone fluorescence images can be difficult for untrained users. In this review, we provide a detailed description of the bioinformatics analysis methodology for CLE images that begins to assist the neurosurgeon and pathologist in rapidly connecting on-the-fly intraoperative imaging, pathology, and surgical observation into a conclusionary system within the concept of theranostics. We present an overview of deep learning models for automatic detection of diagnostic CLE images and discuss the effect of various training regimes and ensemble modeling on the power of deep learning predictive models. Two major approaches reviewed in this paper include models that can automatically classify CLE images into diagnostic/nondiagnostic, glioma/nonglioma, or tumor/injury/normal categories, and models that can localize histological features on CLE images using weakly supervised methods. We also briefly review advances in the deep learning approaches used for CLE image analysis in other organs. Significant advances in the speed and precision of automated diagnostic frame selection would augment the diagnostic potential of CLE, improve operative workflow, and ease integration into brain tumor surgery. Such technology and bioinformatics analytics lend themselves to improved precision, personalization, and theranostics in brain tumor treatment. Comment: See the final version published in Frontiers in Oncology here: https://www.frontiersin.org/articles/10.3389/fonc.2018.00240/ful
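    The first class of models described above classifies CLE frames into categories such as diagnostic versus nondiagnostic. As a rough sketch only (the backbone, class labels, and hyperparameters below are illustrative assumptions, not the setups used in the reviewed studies), fine-tuning an ImageNet-pretrained CNN for binary diagnostic-frame selection could look like this:

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical setup: replace the final layer of a pretrained ResNet-18 with a
# 2-class head (diagnostic vs. nondiagnostic). Backbone choice is an assumption.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One supervised update on a batch of CLE frames shaped (N, 3, 224, 224)."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch to show the call signature.
loss = train_step(torch.randn(4, 3, 224, 224), torch.tensor([0, 1, 1, 0]))
```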

    The role of Intangible Assets in the Relationship between HRM and Innovation: A Theoretical and Empirical Exploration

    This paper provides, as far as we know, a first attempt to explore the role of intellectual capital (IC) and knowledge management (KM) in an integrative way in the relationship between human resource (HR) practices and two types of innovation (radical and incremental). More specifically, the study investigates two sub-components of IC: human capital and organizational social capital. At the same time, four KM channels are discussed: knowledge creation, acquisition, transfer, and responsiveness. The research is part of a larger project financed by the Ministry of Economic Affairs and the province of Overijssel in the Netherlands. The project studies ‘competencies for innovation’ and is conducted in collaboration with innovative companies in the eastern part of the Netherlands. An exploratory survey design with qualitative and quantitative data is used to investigate the topic in six companies from the industrial and service sectors in the region of Twente, the Netherlands. Most of the respondents were HR directors. The findings showed that some parts of the IC and KM configurations were related to different types of innovation. To make the picture even more complicated, HR practices were sometimes perceived interchangeably with IC and KM by HR directors. Overall, the whole picture of these relationships remains unclear and opens the floor for further research.

    Applying Deep Learning To Airbnb Search

    The application to search ranking is one of the biggest machine learning success stories at Airbnb. Many of the initial gains were driven by a gradient boosted decision tree model. The gains, however, plateaued over time. This paper discusses the work done in applying neural networks in an attempt to break out of that plateau. We present our perspective not with the intention of pushing the frontier of new modeling techniques. Instead, ours is a story of the elements we found useful in applying neural networks to a real-life product. Deep learning was steep learning for us. To other teams embarking on similar journeys, we hope an account of our struggles and triumphs will provide some useful pointers. Bon voyage! Comment: 8 pages

    Personalized automatic sleep staging with single-night data: a pilot study with Kullback-Leibler divergence regularization.

    OBJECTIVE: Brain waves vary between people. This work aims to improve automatic sleep staging for longitudinal sleep monitoring by personalizing algorithms based on individual characteristics extracted from sleep data recorded during the first night. APPROACH: Because the data from a single night are very limited, making model training difficult, we propose a Kullback-Leibler (KL) divergence regularized transfer learning approach to address this problem. We employ the pretrained SeqSleepNet (i.e. the subject-independent model) as a starting point and finetune it with the single-night personalization data to derive the personalized model. This is done by adding the KL divergence between the output of the subject-independent model and that of the personalized model to the loss function during finetuning. In effect, KL-divergence regularization prevents the personalized model from overfitting to the single-night data and straying too far away from the subject-independent model. MAIN RESULTS: Experimental results on the Sleep-EDF Expanded database, consisting of 75 subjects, show that sleep staging personalization with single-night data is possible with the help of the proposed KL-divergence regularization. On average, we achieve a personalized sleep staging accuracy of 79.6%, a Cohen's kappa of 0.706, a macro F1-score of 73.0%, a sensitivity of 71.8%, and a specificity of 94.2%. SIGNIFICANCE: We find that the approach is robust against overfitting and that it improves accuracy by 4.5 percentage points over the baseline method without personalization and by 2.2 percentage points over personalization without regularization.
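    The described regularization adds a KL term between the outputs of the frozen subject-independent model and the personalized model to the fine-tuning objective. A minimal PyTorch sketch of such a loss, assuming a softmax over five sleep stages; the KL direction and the weighting factor lam are assumptions the abstract does not specify:

```python
import torch
import torch.nn.functional as F

def kl_regularized_loss(personal_logits: torch.Tensor,
                        pretrained_logits: torch.Tensor,
                        labels: torch.Tensor,
                        lam: float = 0.1) -> torch.Tensor:
    """Cross-entropy on the single-night labels plus a KL term that keeps the
    personalized model close to the frozen subject-independent model."""
    ce = F.cross_entropy(personal_logits, labels)
    # KL( P_pretrained || P_personal ), averaged over the batch.
    kl = F.kl_div(F.log_softmax(personal_logits, dim=-1),
                  F.softmax(pretrained_logits, dim=-1),
                  reduction="batchmean")
    return ce + lam * kl

# Usage: logits from both models on the same personalization batch (8 epochs, 5 stages).
personal_logits = torch.randn(8, 5, requires_grad=True)
pretrained_logits = torch.randn(8, 5)  # from the frozen subject-independent model
labels = torch.randint(0, 5, (8,))
loss = kl_regularized_loss(personal_logits, pretrained_logits, labels)
loss.backward()
```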

    From unsupervised to semi-supervised adversarial domain adaptation in EEG-based sleep staging.

    OBJECTIVE: The recent breakthrough of wearable sleep monitoring devices has resulted in large amounts of sleep data. However, as limited labels are available, interpreting these data requires automated sleep stage classification methods with a small need for labeled training data. Transfer learning and domain adaptation offer possible solutions by enabling models to learn on a source dataset and adapt to a target dataset. APPROACH: In this paper, we investigate adversarial domain adaptation applied to real use cases with wearable sleep datasets acquired from diseased patient populations. Different practical aspects of the adversarial domain adaptation framework are examined, including the added value of (pseudo-)labels from the target dataset and the influence of domain mismatch between the source and target data. The method is also implemented for personalization to specific patients. MAIN RESULTS: The results show that adversarial domain adaptation is effective in the application of sleep staging on wearable data. When compared to a model applied on a target dataset without any adaptation, the domain adaptation method in its simplest form achieves relative gains of 7%-27% in accuracy. The performance on the target domain is further boosted by adding pseudo-labels and real target domain labels when available, and by choosing an appropriate source dataset. Furthermore, unsupervised adversarial domain adaptation can also personalize a model, improving the performance by 1%-2% compared to a non-personal model. SIGNIFICANCE: In conclusion, adversarial domain adaptation provides a flexible framework for semi-supervised and unsupervised transfer learning. This is particularly useful in sleep staging and other wearable EEG applications.
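    The abstract does not spell out the adversarial architecture, so the following is only a generic sketch of one common instantiation: DANN-style adversarial domain adaptation with a gradient-reversal layer. Feature dimensions, the number of sleep stages, and the loss weighting lam are illustrative assumptions:

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity forward, negated (scaled) gradient backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

feature_extractor = nn.Sequential(nn.Linear(128, 64), nn.ReLU())
stage_classifier = nn.Linear(64, 5)    # 5 sleep stages, trained on source labels only
domain_classifier = nn.Linear(64, 2)   # source vs. target recordings

def adversarial_step(x_src, y_src, x_tgt, lam=0.1):
    """One DANN-style loss: staging loss on source + domain confusion on both domains."""
    ce = nn.CrossEntropyLoss()
    f_src = feature_extractor(x_src)
    f_tgt = feature_extractor(x_tgt)
    stage_loss = ce(stage_classifier(f_src), y_src)
    feats = torch.cat([f_src, f_tgt])
    domains = torch.cat([torch.zeros(len(x_src), dtype=torch.long),
                         torch.ones(len(x_tgt), dtype=torch.long)])
    domain_loss = ce(domain_classifier(GradReverse.apply(feats, lam)), domains)
    return stage_loss + domain_loss

# Dummy batches of 128-dimensional epoch features from each domain.
loss = adversarial_step(torch.randn(8, 128), torch.randint(0, 5, (8,)), torch.randn(8, 128))
loss.backward()
```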

    Inside Success: Strategies of 25 Effective Small High Schools in NYC

    For decades, New York City's high school graduation rates hovered at or below 50 percent. In an attempt to turn around these disappointing results, the NYC Department of Education enacted a series of large-scale reforms, including opening hundreds of new "small schools of choice" (SSCs). Recent research by MDRC has shown that these schools have had large and sustained positive effects on students' graduation rates and other outcomes. How have they done it? What decisions -- made by the educators who created, supported, and operated these schools -- have been critical to their success? What challenges do these schools face as they try to maintain that success over time? The Research Alliance set out to answer these questions, conducting in-depth interviews with teachers and principals in 25 of the most highly effective SSCs. Educators reported three features as essential to their success:
    - Personalization, which was widely seen as the most important success factor. This includes structures that foster strong relationships with students and their families, systems for monitoring student progress beyond just grades and test scores, and working to address students' social and emotional needs as well as academic ones.
    - High expectations -- for students and for educators -- and instructional programs that are aligned with these ambitious goals.
    - Dedicated and flexible teachers, who were willing to take on multiple roles, sometimes outside their areas of expertise.
    The findings, presented in this report, paint a picture of how these features were developed in practice. The report also describes challenges these schools face and outlines lessons for other schools and districts that can be drawn from the SSCs' experience. These include the need to avoid teacher burnout, improve the fit between schools and external partners, and expand current notions of accountability.