
    Locally Learning Biomedical Data Using Diffusion Frames

    Diffusion geometry techniques are useful for classifying patterns and visualizing high-dimensional datasets. Building upon ideas from diffusion geometry, we outline the mathematical foundations for learning a function on high-dimensional biomedical data in a local fashion from training data. Our approach is based on a localized summation kernel, and we verify the computational performance of the scheme by means of exact approximation rates. After these theoretical results, we apply our scheme to learn early disease stages in standard and new biomedical datasets.
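    The abstract does not spell out the construction, but the core idea of a localized summation kernel can be sketched generically: the prediction at a point is a sum of kernel-weighted training labels, so only nearby samples contribute. The minimal C++ sketch below uses a plain Gaussian weight and toy data; the kernel choice, the bandwidth h, and all names are illustrative assumptions, not the authors' diffusion-frame construction.

        #include <cmath>
        #include <cstdio>
        #include <vector>

        // Localized summation: the prediction at x is a kernel-weighted average
        // of training labels, so distant samples contribute almost nothing.
        double gaussian_weight(const std::vector<double>& x,
                               const std::vector<double>& xi, double h) {
            double d2 = 0.0;
            for (std::size_t k = 0; k < x.size(); ++k) {
                const double diff = x[k] - xi[k];
                d2 += diff * diff;
            }
            return std::exp(-d2 / (2.0 * h * h));
        }

        double predict(const std::vector<std::vector<double>>& X,
                       const std::vector<double>& y,
                       const std::vector<double>& x, double h) {
            double num = 0.0, den = 0.0;
            for (std::size_t i = 0; i < X.size(); ++i) {
                const double w = gaussian_weight(x, X[i], h);
                num += w * y[i];
                den += w;
            }
            return den > 0.0 ? num / den : 0.0;  // locally learned value at x
        }

        int main() {
            // Toy stand-in for labeled biomedical data: two clusters, labels 0 and 1.
            const std::vector<std::vector<double>> X = {
                {0.0, 0.1}, {0.2, 0.0}, {1.0, 0.9}, {0.9, 1.1}};
            const std::vector<double> y = {0.0, 0.0, 1.0, 1.0};
            std::printf("f(0.1, 0.1) ~ %.3f\n", predict(X, y, {0.1, 0.1}, 0.3));
            std::printf("f(1.0, 1.0) ~ %.3f\n", predict(X, y, {1.0, 1.0}, 0.3));
            return 0;
        }

    The bandwidth h controls the locality of the estimate; approximation rates of the kind the abstract mentions quantify how such local estimates converge as the training set grows.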

    ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze these data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, ROOT offers packages for complex data modeling and fitting, as well as multivariate classification based on machine learning techniques. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which takes care of optimally distributing the work over the available resources in a transparent way.
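    As a concrete illustration of the workflow described above (histogramming, fitting, and persistent storage in a ROOT file), here is a minimal macro sketch. The macro name fitdemo and the output file name fitdemo.root are placeholders; TH1F, FillRandom, Fit, and TFile are standard ROOT classes and methods.

        // fitdemo.C - minimal ROOT macro: fill a histogram, fit it, save it.
        #include "TFile.h"
        #include "TH1F.h"

        void fitdemo() {
            // One-dimensional histogram: 100 bins over [-5, 5].
            TH1F h("h", "Gaussian sample;x;entries", 100, -5, 5);
            h.FillRandom("gaus", 10000);  // 10k entries drawn from a Gaussian
            h.Fit("gaus");                // fit with ROOT's built-in Gaussian model
            // Persist the result in ROOT's machine-independent binary format.
            TFile f("fitdemo.root", "RECREATE");
            h.Write();
            f.Close();
        }

    Run it interpreted with "root -l fitdemo.C" while prototyping, or append a plus sign ("root -l fitdemo.C+") to have ROOT compile it on the fly, matching the interpreted-then-compiled workflow the abstract describes.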

    X-ray ptychography on low-dimensional hard-condensed matter materials

    Tailoring structural, chemical, and electronic (dis-)order in heterogeneous media is one of the transformative opportunities for enabling new functionalities and science in energy and quantum materials. This endeavor requires elemental, chemical, and magnetic sensitivity at the nano/atomic scale in two- and three-dimensional space. Soft and hard X-ray radiation provided by synchrotron facilities has emerged as a standard characterization probe owing to its inherent element specificity and high intensity. One of the most promising methods in terms of sensitivity and spatial resolution is coherent diffraction imaging, namely X-ray ptychography, which is envisioned to challenge the dominance of electron imaging techniques, with their atomic resolution, in the age of diffraction-limited light sources. In this review, we discuss current research examples of far-field diffraction-based X-ray ptychography on two-dimensional and three-dimensional semiconductors, ferroelectrics, and ferromagnets, and the method's promising future as a mainstream tool for materials science.
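    The abstract reviews applications rather than the reconstruction algorithm itself. As background, the sketch below is a toy 1D version of the kind of iterative phase retrieval ptychography relies on (an ePIE-style object update with a known probe, after Maiden and Rodenburg); real experiments are 2D, use FFTs, and refine the probe as well. All sizes, scan positions, and the test object are illustrative assumptions.

        // Toy 1D ptychography: recover a complex object from overlapping
        // far-field intensity measurements via an ePIE-style update.
        #include <cmath>
        #include <complex>
        #include <cstdio>
        #include <vector>

        using cvec = std::vector<std::complex<double>>;
        const double PI = 3.14159265358979323846;

        cvec dft(const cvec& a, int sign) {  // naive DFT; sign=-1 forward, +1 inverse
            const int N = (int)a.size();
            cvec out(N);
            for (int k = 0; k < N; ++k) {
                std::complex<double> s = 0.0;
                for (int n = 0; n < N; ++n)
                    s += a[n] * std::polar(1.0, sign * 2.0 * PI * k * n / N);
                out[k] = (sign > 0) ? s / (double)N : s;
            }
            return out;
        }

        int main() {
            const int N = 32, W = 8;             // object length, probe width
            cvec object(N), probe(N, 0.0), guess(N, 1.0);
            for (int n = 0; n < N; ++n)          // "true" object: smooth phase ramp
                object[n] = std::polar(1.0, 0.5 * std::sin(2.0 * PI * n / N));
            for (int n = 0; n < W; ++n) probe[n] = 1.0;  // top-hat probe window

            // Overlapping scan positions covering the whole object.
            std::vector<int> positions = {0, 4, 8, 12, 16, 20, 24};
            // Simulated measurements: far-field amplitudes |DFT(probe * object)|.
            std::vector<std::vector<double>> amps;
            for (int p : positions) {
                cvec exitw(N);
                for (int n = 0; n < N; ++n) exitw[n] = probe[n] * object[(n + p) % N];
                cvec F = dft(exitw, -1);
                std::vector<double> a(N);
                for (int k = 0; k < N; ++k) a[k] = std::abs(F[k]);
                amps.push_back(a);
            }

            // ePIE-style iterations: enforce measured amplitudes in Fourier space,
            // then correct the object estimate where the probe illuminated it.
            for (int it = 0; it < 200; ++it) {
                for (std::size_t j = 0; j < positions.size(); ++j) {
                    const int p = positions[j];
                    cvec psi(N);
                    for (int n = 0; n < N; ++n) psi[n] = probe[n] * guess[(n + p) % N];
                    cvec F = dft(psi, -1);
                    for (int k = 0; k < N; ++k)   // keep phase, impose measured amplitude
                        F[k] = amps[j][k] * F[k] / (std::abs(F[k]) + 1e-12);
                    cvec corrected = dft(F, +1);
                    for (int n = 0; n < W; ++n)   // probe amplitude is 1 in the window
                        guess[(n + p) % N] += std::conj(probe[n]) * (corrected[n] - psi[n]);
                }
            }
            // Global phase is unrecoverable, so report an amplitude residual,
            // which should shrink toward zero as the iterations converge.
            double err = 0.0;
            for (int n = 0; n < N; ++n) err += std::abs(std::abs(guess[n]) - 1.0);
            std::printf("mean amplitude error: %.4f\n", err / N);
            return 0;
        }

    The overlap between neighboring scan positions is what makes the inversion well posed; it is the same redundancy that lets experimental ptychography reach resolutions beyond the focusing optics.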