
    Modeling the Probabilistic Distribution of Unlabeled Data for One-shot Medical Image Segmentation

    Existing image segmentation networks mainly leverage large-scale labeled datasets to attain high accuracy. However, labeling medical images is very expensive since it requires sophisticated expert knowledge. Thus, it is desirable to reach high segmentation performance with only a few labeled samples. In this paper, we develop a data augmentation method for one-shot brain magnetic resonance imaging (MRI) segmentation that exploits only one labeled MRI image (called the atlas) and a few unlabeled images. In particular, we propose to learn the probability distributions of deformations (in both shape and intensity) of different unlabeled MRI images with respect to the atlas via 3D variational autoencoders (VAEs). In this manner, our method can exploit the learned distributions of image deformations to generate new, authentic brain MRI images, and the number of generated samples is sufficient to train a deep segmentation network. Furthermore, we introduce a new standard segmentation benchmark to evaluate the generalization performance of a segmentation network through a cross-dataset setting (collected from different sources). Extensive experiments demonstrate that our method outperforms the state-of-the-art one-shot medical segmentation methods. Our code has been released at https://github.com/dyh127/Modeling-the-Probabilistic-Distribution-of-Unlabeled-Data. Comment: AAAI 202
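
    The core augmentation step the abstract describes is learning a VAE over atlas-to-image deformations and then sampling new deformations to synthesize training images. Below is a deliberately small 2D sketch of that idea in PyTorch (the paper uses 3D volumes); the layer sizes, the warp helper, and the stand-in displacement field are illustrative assumptions, not the released implementation.

        # Toy 2D sketch: learn a VAE over displacement fields and sample new
        # ones to warp the atlas into synthetic training images. Illustrative
        # sizes only; the paper works on 3D MRI volumes.
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class DeformationVAE(nn.Module):
            def __init__(self, size=32, latent=16):
                super().__init__()
                self.size = size
                flat = 2 * size * size            # dense (dx, dy) field
                self.enc = nn.Sequential(nn.Flatten(),
                                         nn.Linear(flat, 128), nn.ReLU())
                self.mu = nn.Linear(128, latent)
                self.logvar = nn.Linear(128, latent)
                self.dec = nn.Sequential(nn.Linear(latent, 128), nn.ReLU(),
                                         nn.Linear(128, flat))

            def forward(self, field):
                h = self.enc(field)
                mu, logvar = self.mu(h), self.logvar(h)
                z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
                recon = self.dec(z).view(-1, 2, self.size, self.size)
                return recon, mu, logvar

        def warp(image, field):
            # Resample `image` along an identity grid shifted by `field`.
            n, _, h, w = image.shape
            ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                                    torch.linspace(-1, 1, w), indexing="ij")
            base = torch.stack((xs, ys), dim=-1).expand(n, h, w, 2)
            return F.grid_sample(image, base + field.permute(0, 2, 3, 1),
                                 align_corners=True)

        vae = DeformationVAE()
        atlas = torch.rand(1, 1, 32, 32)          # the one labeled image
        field = 0.05 * torch.randn(1, 2, 32, 32)  # stand-in registered field
        recon, mu, logvar = vae(field)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        loss = F.mse_loss(recon, field) + kl      # standard VAE objective
        synthetic = warp(atlas, recon)            # new training image

    Because a sampled field can warp the atlas and its label map with the same grid (nearest-neighbor sampling for labels), every synthetic image comes with an aligned segmentation, which is what makes a single labeled atlas sufficient.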

    BIOMECHANICAL MEASUREMENT AND EVALUATION FOR WUSHU COMPULSORY PROGRAM

    INTRODUCTION: The Wushu compulsory program is a new form of competition. It differs from the Wushu optional program in that all athletes must execute the same compulsory routine. Judges then assess each athlete's level of expertise through the scores awarded for the performance. The comparability and objectivity of grading a compulsory program are better than for an optional program, so it is feasible to analyze the technique of the Wushu compulsory program with a biomechanical measuring method. The compulsory exercise described as "the turn about for flying kick on the right" was measured and analyzed quantitatively in this paper. With the calculations from this study, a new evaluation factor was developed, advancing competitive Wushu.

    Highly-Accurate Electricity Load Estimation via Knowledge Aggregation

    Mid-term and long-term electric energy demand prediction is essential for planning and operating the smart grid, especially in countries where the power system operates in a deregulated environment. Traditional forecasting models fail to incorporate external knowledge, while modern data-driven models neglect interpretability; moreover, the load series is influenced by many complex factors, making its highly unstable, nonlinear behavior difficult to handle. To address the forecasting problem, we propose a more accurate district-level load prediction model based on domain knowledge and the idea of decomposition and ensemble. Its main idea is three-fold: 1) given the non-stationary character of the load time series, with its obvious cyclicality and periodicity, we decompose it into series with actual economic meaning and then carry out load analysis and forecasting; 2) Kernel Principal Component Analysis (KPCA) is applied to extract the principal components of the weather and calendar-rule feature sets, reducing the data's dimensionality; 3) exploiting the advantages of various models under this domain knowledge, we propose a hybrid model (XASXG) based on the Autoregressive Integrated Moving Average model (ARIMA), support vector regression (SVR), and extreme gradient boosting (XGBoost). With these designs, the model accurately forecasts electricity demand despite its highly unstable character. We compared our method with nine benchmark methods, including classical statistical models and state-of-the-art machine learning models, on real monthly electricity demand series from four Chinese cities. The empirical study shows that the proposed hybrid model is superior to all competitors in both accuracy and prediction bias.
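
    A minimal sketch of the decompose-and-ensemble recipe: split the monthly load into trend, seasonal, and residual parts, compress exogenous features with KPCA, and fit one model per part. The synthetic data, the one-step-ahead alignment, and the assignment of ARIMA, SVR, and XGBoost to particular components are assumptions for illustration, not the paper's exact XASXG pipeline.

        # Decompose-and-ensemble sketch on a synthetic monthly load series.
        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.svm import SVR
        from statsmodels.tsa.arima.model import ARIMA
        from statsmodels.tsa.seasonal import seasonal_decompose
        from xgboost import XGBRegressor

        rng = np.random.default_rng(0)
        t = np.arange(120)                           # ten years, monthly
        load = (100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12)
                + rng.normal(0, 2, 120))
        weather = rng.normal(size=(120, 8))          # stand-in exogenous features

        parts = seasonal_decompose(load, period=12, extrapolate_trend="freq")
        feats = KernelPCA(n_components=3, kernel="rbf").fit_transform(weather)

        # Trend -> ARIMA; seasonal -> SVR; residual -> XGBoost (illustrative).
        trend_fc = ARIMA(parts.trend, order=(1, 1, 1)).fit().forecast(1)[0]
        svr = SVR().fit(feats[:-1], parts.seasonal[1:])
        xgb = XGBRegressor(n_estimators=50).fit(feats[:-1], parts.resid[1:])
        forecast = (trend_fc + svr.predict(feats[-1:])[0]
                    + xgb.predict(feats[-1:])[0])
        print(f"next-month load forecast: {forecast:.1f}")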

    Quantum imaginary time evolution and quantum annealing meet topological sector optimization

    Optimization problems are the core challenge in many fields of science and engineering, yet general and effective methods for finding optimal solutions are scarce. Quantum computing has been envisioned to help solve such problems; for example, the quantum annealing (QA) method based on adiabatic evolution has been extensively explored and successfully implemented on quantum simulators such as D-Wave's annealers and some Rydberg arrays. In this work, we investigate the topological sector optimization (TSO) problem, which attracts particular interest in the quantum many-body physics community. We reveal that the topology induced by frustration in the spin model is an intrinsic obstruction for QA and other traditional methods in approaching the ground state. We demonstrate that the optimization difficulties of the TSO problem are not restricted to gaplessness but also stem from its topological nature, which previous analyses of optimization problems have often ignored. To solve TSO problems, we utilize quantum imaginary time evolution (QITE), with a possible realization on quantum computers, which exploits quantum superposition to explore the full Hilbert space and can thus address optimization problems of a topological nature. We report the performance of different quantum optimization algorithms on TSO problems and demonstrate that their capabilities are distinct, even when accounting for the quantum computational resources required by practical QITE implementations.
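
    For intuition, imaginary time evolution filters an initial superposition toward the ground state: repeatedly applying exp(-H dτ) and renormalizing suppresses each excited component exponentially in its energy. The toy below shows this numerically on a 4-site frustrated Ising ring, an assumed stand-in for the TSO instances studied in the paper, not one of its benchmarks.

        # Numerical QITE toy: project onto the ground state of a frustrated
        # Ising ring by repeated application of exp(-H * dtau).
        import numpy as np
        from scipy.linalg import expm

        Z = np.diag([1.0, -1.0])
        I2 = np.eye(2)

        def site_op(site, n, o):
            # Place single-site operator o on `site` of an n-spin chain.
            out = np.ones((1, 1))
            for k in range(n):
                out = np.kron(out, o if k == site else I2)
            return out

        n = 4
        J = [1, 1, 1, -1]   # one flipped bond frustrates the ring
        H = sum(J[i] * site_op(i, n, Z) @ site_op((i + 1) % n, n, Z)
                for i in range(n))

        psi = np.ones(2 ** n) / np.sqrt(2 ** n)   # uniform superposition
        U = expm(-0.1 * H)                        # one step, dtau = 0.1
        for _ in range(200):
            psi = U @ psi
            psi /= np.linalg.norm(psi)            # keep the state normalized

        print("QITE energy: ", psi @ H @ psi)
        print("exact ground:", np.linalg.eigvalsh(H).min())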

    Txilm: Lossy Block Compression with Salted Short Hashing

    Current blockchains are restricted by low throughput. Aiming at this problem, we propose Txilm, a protocol that compresses the representation of transactions in each block to save network bandwidth. In this protocol, a block carries short hashes of TXIDs instead of complete transactions. Combined with transactions sorted by TXID, Txilm achieves an 80-fold reduction in data size compared with the original blockchains. We also evaluate the probability of hash collisions and provide methods for resolving such collisions. Finally, we design strategies to protect Txilm against potential attacks. Comment: 5 pages and 2 figures
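
    The mechanism sketched below follows the abstract under stated assumptions: each block ships a short, salted hash of every TXID, and the receiver matches these against its mempool, flagging ambiguous (colliding) entries for retransmission. The 5-byte digest width, 8-byte salt, and SHA-256 are illustrative choices, not parameters taken from the paper.

        # Salted short-hash sketch: ship digest prefixes instead of TXIDs.
        import hashlib
        import os

        def short_hash(txid: bytes, salt: bytes, nbytes: int = 5) -> bytes:
            # 5-byte salted digest instead of the full 32-byte TXID.
            return hashlib.sha256(salt + txid).digest()[:nbytes]

        salt = os.urandom(8)                             # fresh per-block salt
        mempool = [os.urandom(32) for _ in range(2000)]  # stand-in TXIDs

        block = [short_hash(t, salt) for t in mempool]   # compressed payload

        # Receiver side: map short hashes back to known transactions and
        # flag colliding entries, which need the full TXID to disambiguate.
        lookup = {}
        for t in mempool:
            lookup.setdefault(short_hash(t, salt), []).append(t)
        collisions = [h for h, ts in lookup.items() if len(ts) > 1]
        print(f"{32 * len(mempool)} bytes of TXIDs -> {5 * len(block)} bytes; "
              f"{len(collisions)} collisions")

    The per-block salt is the interesting design choice here: without it, an attacker could precompute transactions whose TXIDs collide under the short hash and stall block reconstruction.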