
    A unified evaluation of iterative projection algorithms for phase retrieval

    Iterative projection algorithms are successfully being used as a substitute for lenses to recombine, numerically rather than optically, light scattered by illuminated objects. Images obtained computationally allow aberration-free, diffraction-limited imaging and the possibility of using radiation for which no lenses exist. The challenge of this imaging technique is transferred from the lenses to the algorithms. We evaluate these new computational "instruments" developed for the phase retrieval problem, and discuss acceleration strategies. Comment: 12 pages, 9 figures, revte
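
    As an illustration of the kind of algorithm being evaluated, the sketch below implements the classic error-reduction scheme, which alternates a Fourier-magnitude projection with a real-space support projection. The test object, support region, and iteration count are illustrative assumptions rather than the benchmark setup of the paper, and error reduction is only one of several projection algorithms such a study would compare.

    # A minimal sketch of one iterative projection algorithm (error reduction)
    # for phase retrieval: alternate between a Fourier-magnitude projection
    # and a real-space support projection.  Object, support, and iteration
    # count are illustrative choices, not the paper's benchmark setup.
    import numpy as np

    rng = np.random.default_rng(0)

    # Build a small test object confined to a known support region.
    n = 64
    support = np.zeros((n, n), dtype=bool)
    support[24:40, 24:40] = True
    obj = np.where(support, rng.random((n, n)), 0.0)

    # "Measured" data: Fourier magnitudes only (the phases are lost).
    magnitudes = np.abs(np.fft.fft2(obj))

    def project_fourier(x, mags):
        """Replace the Fourier modulus with the measured one, keep the phase."""
        X = np.fft.fft2(x)
        return np.fft.ifft2(mags * np.exp(1j * np.angle(X))).real

    def project_support(x, supp):
        """Zero everything outside the support and clip negative values."""
        return np.where(supp, np.clip(x, 0.0, None), 0.0)

    # Error reduction: compose the two projections, starting from a random guess.
    x = rng.random((n, n))
    for _ in range(500):
        x = project_support(project_fourier(x, magnitudes), support)

    err = np.linalg.norm(np.abs(np.fft.fft2(x)) - magnitudes) / np.linalg.norm(magnitudes)
    print(f"relative Fourier-magnitude error after 500 iterations: {err:.3e}")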

    Using Machine Learning and Natural Language Processing to Review and Classify the Medical Literature on Cancer Susceptibility Genes

    PURPOSE: The medical literature relevant to germline genetics is growing exponentially. Clinicians need tools to monitor and prioritize this literature so that they can understand the clinical implications of pathogenic genetic variants. We developed and evaluated two machine learning models to classify abstracts as relevant to the penetrance (risk of cancer for germline mutation carriers) or prevalence of germline genetic mutations.
    METHODS: We conducted literature searches in PubMed and retrieved paper titles and abstracts to create an annotated dataset for training and evaluating the two machine learning classification models. Our first model is a support vector machine (SVM), which learns a linear decision rule based on a bag-of-ngrams representation of each title and abstract. Our second model is a convolutional neural network (CNN), which learns a complex nonlinear decision rule based on the raw title and abstract. We evaluated the performance of the two models on the classification of papers as relevant to penetrance or prevalence.
    RESULTS: For penetrance classification, we annotated 3740 paper titles and abstracts and used 60% for training the model, 20% for tuning the model, and 20% for evaluating the model. The SVM model achieves 89.53% accuracy (percentage of papers that were correctly classified), while the CNN model achieves 88.95% accuracy. For prevalence classification, we annotated 3753 paper titles and abstracts. The SVM model achieves 89.14% accuracy, while the CNN model achieves 89.13% accuracy.
    CONCLUSION: Our models achieve high accuracy in classifying abstracts as relevant to penetrance or prevalence. By facilitating literature review, this tool could help clinicians and researchers keep abreast of the burgeoning knowledge of gene-cancer associations and keep the knowledge bases for clinical decision support tools up to date.
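
    The bag-of-ngrams SVM described in the METHODS section can be sketched in a few lines with scikit-learn. The toy abstracts, labels, and the unigram-plus-bigram feature choice below are placeholder assumptions; only the 60/20/20 train/tune/test split mirrors the abstract.

    # Sketch of the SVM-with-bag-of-ngrams approach, assuming scikit-learn.
    # The texts and labels here are placeholders, not the annotated corpus.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.model_selection import train_test_split
    from sklearn.svm import LinearSVC
    from sklearn.metrics import accuracy_score

    # Placeholder data: (title + abstract) strings with binary relevance labels.
    texts = [
        "BRCA1 mutation carriers and lifetime breast cancer risk",
        "Prevalence of germline TP53 variants in a population cohort",
        "A randomized trial of a new chemotherapy regimen",
        "Penetrance estimates for PALB2 pathogenic variants",
    ] * 50
    labels = [1, 1, 0, 1] * 50   # 1 = relevant to penetrance/prevalence

    # 60% train, 20% tune, 20% test, mirroring the split in the abstract.
    X_train, X_rest, y_train, y_rest = train_test_split(
        texts, labels, train_size=0.6, random_state=0, stratify=labels)
    X_tune, X_test, y_tune, y_test = train_test_split(
        X_rest, y_rest, test_size=0.5, random_state=0, stratify=y_rest)

    # Bag-of-ngrams features (unigrams + bigrams) feeding a linear SVM.
    vectorizer = CountVectorizer(ngram_range=(1, 2), lowercase=True)
    clf = LinearSVC(C=1.0)
    clf.fit(vectorizer.fit_transform(X_train), y_train)

    # The tuning split would normally drive hyperparameter choices (e.g. C);
    # here we simply report held-out accuracy on both splits.
    print("tune accuracy:", accuracy_score(y_tune, clf.predict(vectorizer.transform(X_tune))))
    print("test accuracy:", accuracy_score(y_test, clf.predict(vectorizer.transform(X_test))))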

    Recent developments in New Testament textual criticism

    This is a preprint version of an article published in Early Christianity 2.2 (2011). The article provides an overview of recent developments in New Testament textual criticism. The four sections cover editions, manuscripts, citational evidence, and methodology. Particular attention is paid to the Editio Critica Maior, the development of electronic resources, newly discovered manuscripts, and the Coherence-Based Genealogical Method.

    Exploiting Iterative Flattening Search to Solve Job Shop Scheduling Problems with Setup Times

    This paper presents a heuristic algorithm for solving a job shop scheduling problem with sequence-dependent setup times (SDST-JSSP). The strategy, known as Iterative Flattening Search (IFS), iteratively applies two steps: (1) a relaxation step, in which a subset of scheduling decisions is randomly retracted from the current solution; and (2) a solving step, in which a new solution is incrementally recomputed from this partial schedule. The algorithm relies on a core constraint-based search procedure, which generates consistent orderings of activities that require the same resource by incrementally imposing precedence constraints on a temporally feasible solution. Key to the effectiveness of the search procedure is a conflict sampling method biased toward selection of the most critical conflicts. The efficacy of the overall heuristic optimization algorithm is demonstrated empirically on a set of well-known SDST-JSSP benchmarks.
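
    The relax-and-rebuild loop at the heart of IFS can be sketched as follows. This toy version uses a permutation-with-repetition encoding and a greedy re-insertion step in place of the paper's constraint-based search procedure and conflict sampling, so it illustrates the overall scheme rather than the authors' exact algorithm; the instance data and parameters are made up for the example.

    # Illustrative sketch of the Iterative Flattening Search scheme on a toy
    # job shop with sequence-dependent setups.  Encoding and rebuild step are
    # simplifications, not the paper's constraint-based procedure.
    import random

    # Toy instance: JOBS[j] is a list of (machine, processing_time) operations.
    # All jobs here have the same number of operations.
    JOBS = [
        [(0, 3), (1, 2), (2, 2)],
        [(1, 2), (0, 4), (2, 3)],
        [(2, 3), (1, 3), (0, 2)],
    ]
    N_MACHINES = 3
    # SETUP[m][prev_job][next_job]: sequence-dependent setup time on machine m.
    SETUP = [[[1 if a != b else 0 for b in range(len(JOBS))]
              for a in range(len(JOBS))] for _ in range(N_MACHINES)]

    def decode(seq):
        """Dispatch operations in the order given by the job sequence and
        return the makespan; seq contains each job id once per operation."""
        next_op = [0] * len(JOBS)          # next operation index of each job
        job_ready = [0] * len(JOBS)        # completion time of a job's last op
        mach_ready = [0] * N_MACHINES      # time each machine becomes free
        mach_last = [None] * N_MACHINES    # last job processed on each machine
        for j in seq:
            m, p = JOBS[j][next_op[j]]
            setup = SETUP[m][mach_last[m]][j] if mach_last[m] is not None else 0
            start = max(job_ready[j], mach_ready[m] + setup)
            end = start + p
            job_ready[j], mach_ready[m], mach_last[m] = end, end, j
            next_op[j] += 1
        return max(job_ready)

    def rebuild(partial, removed):
        """Greedily re-insert the removed tokens, one at a time, at the
        position that yields the smallest makespan (the 'solving step')."""
        seq = list(partial)
        for j in removed:
            best_pos, best_val = 0, float("inf")
            for pos in range(len(seq) + 1):
                val = decode(seq[:pos] + [j] + seq[pos:])
                if val < best_val:
                    best_pos, best_val = pos, val
            seq.insert(best_pos, j)
        return seq

    def iterative_flattening(iterations=200, relax_fraction=0.3, seed=0):
        rng = random.Random(seed)
        # Initial solution: jobs interleaved in round-robin order.
        seq = [j for ops in zip(*[[j] * len(JOBS[j]) for j in range(len(JOBS))])
               for j in ops]
        best_seq, best_val = seq, decode(seq)
        for _ in range(iterations):
            # Relaxation step: randomly retract a subset of ordering decisions.
            k = max(1, int(relax_fraction * len(best_seq)))
            drop = set(rng.sample(range(len(best_seq)), k))
            partial = [j for i, j in enumerate(best_seq) if i not in drop]
            removed = [best_seq[i] for i in sorted(drop)]
            rng.shuffle(removed)
            # Solving step: rebuild a complete schedule from the partial one.
            cand = rebuild(partial, removed)
            val = decode(cand)
            if val < best_val:             # keep only improving schedules
                best_seq, best_val = cand, val
        return best_seq, best_val

    if __name__ == "__main__":
        _, makespan = iterative_flattening()
        print("best makespan found:", makespan)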