
    A Comparative Analysis on Volatility and Scalability Properties of Blockchain Compression Protocols

    The increasing popularity of trading digital assets can lead to significant delays in Blockchain networks when processing transactions. When transaction fees become miners' primary revenue, an imbalance in rewards may lead miners to adopt deviant mining strategies. Scaling the block capacity is one potential approach to alleviating the problem. To address this issue, this paper reviews and evaluates six state-of-the-art compression protocols for Blockchains. Specifically, we designed a Monte Carlo simulation of two of the six protocols to observe their compression performance under larger block capacities. Furthermore, extensive simulation experiments were conducted to observe mining behaviour as the block capacity increases. Experimental results reveal an interesting trade-off between volatility and scalability: when throughput rises above a critical point, volatility worsens and Blockchain security is threatened. In the experiments, we further analyzed the relationship between the volatility and scalability properties with respect to the distribution of transaction values. Based on the analysis, we propose a recommended maximum block size for each protocol. Finally, we discuss further improvements to the compression protocols.
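    The abstract refers to a Monte Carlo simulation of compression protocols under growing block capacities but gives no implementation details. The sketch below is a minimal illustration of that general approach, not the paper's actual model: the transaction-size and fee distributions, the per-protocol compression ratio, and the function names are assumptions introduced for the example.

```python
import random
import statistics

def simulate_protocol(compression_ratio, block_capacity_mb, n_blocks=100, seed=42):
    """Monte Carlo sketch: fill blocks with randomly sized transactions and report
    throughput (tx per block) and the relative volatility of per-block fee revenue.
    All distributions and the compression_ratio are assumed, not taken from the paper."""
    rng = random.Random(seed)
    throughputs, block_fees = [], []
    for _ in range(n_blocks):
        used_mb, tx_count, fees = 0.0, 0, 0.0
        while True:
            # ~500-byte transactions (log-normal, assumed), shrunk by the protocol
            tx_size_mb = compression_ratio * rng.lognormvariate(6.0, 0.5) / 1e6
            if used_mb + tx_size_mb > block_capacity_mb:
                break
            used_mb += tx_size_mb
            tx_count += 1
            fees += rng.lognormvariate(0.0, 1.0)  # assumed heavy-tailed fee distribution
        throughputs.append(tx_count)
        block_fees.append(fees)
    return {
        "mean_tx_per_block": statistics.mean(throughputs),
        "fee_volatility": statistics.stdev(block_fees) / statistics.mean(block_fees),
    }

# Compare an uncompressed baseline against a hypothetical protocol that shrinks
# transactions to 25% of their size, across increasing block capacities.
for capacity_mb in (1, 2, 4):
    print(capacity_mb,
          simulate_protocol(1.0, capacity_mb),
          simulate_protocol(0.25, capacity_mb))
```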

    Multi-objective Optimization for Incremental Decision Tree Learning

    Decision tree learning can be roughly classified into two categories: static and incremental induction. Static tree induction applies a greedy search in the splitting test to obtain a globally optimal model. Incremental tree induction constructs a decision model by analyzing data in short segments; during each segment a locally optimal tree structure is formed. Very Fast Decision Tree [4] is a typical incremental tree induction method that applies the Hoeffding bound in its node-splitting test, but it does not work well on noisy data. In this paper, we propose a new incremental tree induction model called the incrementally Optimized Very Fast Decision Tree (iOVFDT), which uses a multi-objective incremental optimization method and integrates four classifiers at the leaf level. The proposed model is tested on a large volume of data streams contaminated with noise. Under such noisy data, we investigate how iOVFDT, an incremental induction method that works with local optima, compares to C4.5, which loads the whole dataset to build a globally optimal decision tree. Our experimental results show that iOVFDT achieves similar, though slightly lower, accuracy, while its decision tree size and induction time are much smaller than those of C4.5.
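    The abstract notes that VFDT bases its node-splitting test on the Hoeffding bound. As background, the snippet below sketches that standard test only; it does not implement iOVFDT's multi-objective optimization, which the abstract does not specify. The function names and the numbers in the example are illustrative assumptions.

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """epsilon = sqrt(R^2 * ln(1/delta) / (2n)): with probability 1 - delta, the
    mean observed over n samples of a variable with range R is within epsilon
    of its true mean."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain: float, second_best_gain: float,
                 n_examples: int, n_classes: int, delta: float = 1e-7) -> bool:
    """VFDT-style test: split on the best attribute once the gap between the best
    and second-best information gain exceeds the Hoeffding bound, so the choice
    is likely the one a batch learner would also make."""
    gain_range = math.log2(n_classes)  # information gain ranges over [0, log2(c)]
    epsilon = hoeffding_bound(gain_range, delta, n_examples)
    return (best_gain - second_best_gain) > epsilon

# Illustrative check: after 800 examples at a leaf of a two-class stream,
# a gain gap of 0.12 narrowly exceeds the bound (~0.10), so the leaf splits.
print(should_split(best_gain=0.45, second_best_gain=0.33, n_examples=800, n_classes=2))
```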