10 research outputs found

    Attestation of Improved SimBlock Node Churn Simulation

    Node churn, the constant joining and leaving of nodes in a network, can impact the performance of a blockchain network. The difficulty of performing research on an actual blockchain network, particularly a live decentralized global network like Bitcoin, poses challenges that good simulators can overcome. While various tools, such as NS-3 and OMNeT++, are useful for simulating network behavior, SimBlock is specifically designed to simulate the complex Bitcoin blockchain network. However, the current implementation of SimBlock has limitations when replicating actual node churn activity. In this study, the SimBlock simulator was improved to simulate node churn more accurately by removing churning nodes, dropping their connections, and adding instrumentation for validation. The methodology involved modeling Bitcoin node churn behavior based on previous studies and using the enhanced SimBlock simulator to simulate node churn. Empirical studies were then conducted to determine the suitability and limitations of the node churn simulation. The improved SimBlock could produce results similar to observed indicators in a 100-node network, but it still had limitations in replicating node churn behavior accurately. It was discovered that SimBlock limits all nodes to operating as mining nodes and that mining is simulated in a way that does not depict churn accurately at all times but only at specific intervals or under certain conditions. Despite these limitations, the study’s improvements to SimBlock and the identification of its limitations can be useful for future research on node churn in blockchain networks and the development of more effective simulation tools.
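    The kind of churn model the abstract describes can be sketched minimally as follows. This is not the paper's implementation: the per-step leave/join probabilities and the 100-node size are illustrative assumptions, standing in for the churn rates the study derived from prior measurements of Bitcoin.

```python
import random

# Hypothetical sketch of a node-churn model: each simulated node
# independently leaves or (re)joins the network per time step with
# fixed probabilities. Probabilities and network size are illustrative.
def simulate_churn(n_nodes=100, steps=1000, p_leave=0.01, p_join=0.02, seed=42):
    rng = random.Random(seed)
    online = [True] * n_nodes
    online_counts = []
    for _ in range(steps):
        for i in range(n_nodes):
            if online[i]:
                if rng.random() < p_leave:
                    online[i] = False  # churning node removed, connections dropped
            elif rng.random() < p_join:
                online[i] = True       # node rejoins the network
        online_counts.append(sum(online))
    return online_counts

counts = simulate_churn()
print(min(counts), max(counts))  # observed range of online nodes
```

    Logging the online-node count per step mirrors the "additional instrumentation for validation" idea: the simulated population can then be compared against observed churn indicators.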

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, also assesses the instrument’s effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence-per-cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
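    The core calibration idea can be sketched in a few lines: fit OD against known microsphere counts from a serial dilution within the instrument's linear range, then convert a sample OD reading to an estimated count. This is only an illustration of the approach, not the paper's protocol; all numbers below are made up.

```python
# Illustrative sketch: least-squares fit of OD against known particle
# counts from a serial dilution, then OD-to-count conversion.
def fit_calibration(particle_counts, od_readings):
    # Slope through the origin: OD ≈ k * count (assumes linear range)
    num = sum(c * od for c, od in zip(particle_counts, od_readings))
    den = sum(c * c for c in particle_counts)
    return num / den

def od_to_count(od, k):
    return od / k

# Two-fold serial dilution of a microsphere stock (hypothetical values)
counts = [4e8, 2e8, 1e8, 5e7]
ods = [0.80, 0.40, 0.20, 0.10]
k = fit_calibration(counts, ods)
print(od_to_count(0.35, k))  # → 1.75e8 (estimated count for OD 0.35)
```

    Deviations of the dilution points from the fitted line double as a quality-control check and reveal where the instrument leaves its effective linear range.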

    Improved difficulty adjustment for proof-of-work based blockchains with genetic algorithm

    No full text
    Blocks and transactions are produced and confirmed, respectively, in blockchains by participating nodes through a consensus algorithm, one of which is Proof-of-Work (PoW). The principle of PoW is to compute a complex mathematical problem, and the complexity of this computation is governed by the difficulty, adjusted on a periodic basis to control the rate of new block creation. The difficulty is determined by the network hash rate, and if the hash rate grows or declines exponentially, the block creation interval cannot be maintained. A genetic algorithm (GA) is proposed as an additional mechanism on top of the existing difficulty adjustment algorithm to suppress its slow reaction to hash-rate fluctuations caused by miners leaving or entering the network, such as during an attack or other sudden events, by optimizing the blockchain parameters. This research aims to achieve a more dynamic difficulty adjustment mechanism that is able to meet the network objectives. Reparameterization of the blockchain was performed on forks of the actual Bitcoin blockchain network with virtual machines. Mining was also performed in order to measure the effects of reparameterization. The study used SimBlock, a blockchain network simulator, as a base. A difficulty adjustment algorithm and the capacity to increase or decrease the hash rate dynamically were then added to the simulator, and the efficacy of the implementation was investigated. With the new implementations, SimBlock is able to simulate an actual Bitcoin blockchain network more accurately than before in terms of obtained block time, especially in the case of a fluctuating hash rate. Furthermore, SimBlock is now capable of reproducing the occurrence of a fluctuating hash rate in an actual blockchain network.
    Non-dominated sorting genetic algorithm II (NSGA-II) was then implemented in SimBlock, and simulations of different scenarios (default without GA, fixed block interval with GA, fixed difficulty adjustment interval with GA, and variable block and difficulty adjustment intervals with GA) were carried out. All the scenarios with GA were able to respond more quickly to a sudden increase or decrease in hash rate, as evidenced by a lower standard deviation of average block time and difficulty compared to the default blockchain network without GA. Based on observation, if the block interval decreases, the stale block rate increases accordingly, as a low block interval raises the likelihood of several miners mining a valid block simultaneously. Moreover, an increasing stale block rate results in a higher median block propagation time, as more bandwidth is required to propagate stale blocks. Thus, by finding a middle ground, the scenario of a fixed difficulty adjustment interval with GA achieved the best overall performance, with the lowest standard deviation of average block time and difficulty in addition to a moderate median block propagation time and stale block rate.
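    The slow reaction the abstract targets comes from Bitcoin's periodic retarget, which only corrects difficulty at adjustment boundaries. A minimal sketch of that baseline mechanism (not the paper's GA-augmented simulator; the mid-run hash-rate doubling is an illustrative scenario):

```python
# Bitcoin-style periodic retarget: difficulty is rescaled every `interval`
# blocks by the ratio of expected to actual elapsed time, clamped to a
# factor of 4 as in Bitcoin. A sudden hash-rate change is only corrected
# at the next boundary — the slow reaction a GA mechanism aims to suppress.
def retarget(difficulty, actual_timespan, target_timespan):
    ratio = target_timespan / actual_timespan
    ratio = max(0.25, min(4.0, ratio))  # Bitcoin clamps the adjustment
    return difficulty * ratio

def simulate(blocks=6048, interval=2016, target_block_time=600.0,
             difficulty=1.0, hash_rate=1.0):
    times, elapsed = [], 0.0
    for b in range(1, blocks + 1):
        if b == 2016:          # illustrative sudden hash-rate doubling
            hash_rate = 2.0
        block_time = target_block_time * difficulty / hash_rate
        times.append(block_time)
        elapsed += block_time
        if b % interval == 0:  # adjustment boundary
            difficulty = retarget(difficulty, elapsed, interval * target_block_time)
            elapsed = 0.0
    return times

times = simulate()
print(times[2015], times[-1])  # right after the jump vs. after retargeting
```

    Blocks arrive twice as fast for most of an adjustment window after the jump; the block time only returns to the 600 s target once the retarget catches up, which is the sluggishness the GA scenarios respond to more quickly.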

    Genetic-Algorithm-Inspired Difficulty Adjustment for Proof-of-Work Blockchains

    No full text
    In blockchains, the principle of proof-of-work (PoW) is used to compute a complex mathematical problem. The computational complexity is governed by the difficulty, adjusted periodically to control the rate at which new blocks are created. The network hash rate determines the difficulty, a symmetric relationship: as the hash rate increases, the difficulty also increases. If the hash rate grows or declines exponentially, the block creation interval cannot be maintained. A genetic algorithm (GA) is proposed as an additional mechanism to the existing difficulty adjustment algorithm for optimizing the blockchain parameters. The study was conducted with four scenarios, including a default scenario that simulates a regular blockchain. All scenarios with the GA achieved a lower standard deviation of the average block time and difficulty compared to the default blockchain network without GA. The scenario of a fixed difficulty adjustment interval with GA reduced the standard deviation of the average block time by 80.1%, from 497.1 to 98.9, and achieved a moderate median block propagation time of 6.81 s and a stale block rate of 6.67%.
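    The symmetry between hash rate and difficulty follows from the expected-time relation for Bitcoin-style PoW, where finding a block takes roughly difficulty × 2³² hashes: holding block time constant requires difficulty to scale linearly with hash rate. A small sketch (the 400 EH/s figure is an illustrative value, not from the paper):

```python
# For Bitcoin-style PoW, expected block time is approximately
#   t = difficulty * 2**32 / hash_rate   (hash_rate in hashes/second),
# so keeping t fixed means difficulty must rise in step with hash rate.
def expected_block_time(difficulty, hash_rate):
    return difficulty * 2**32 / hash_rate

def difficulty_for_target(hash_rate, target=600.0):
    return target * hash_rate / 2**32

h = 400e18                        # illustrative network hash rate, 400 EH/s
d = difficulty_for_target(h)
print(expected_block_time(d, h))  # → 600.0 by construction
```

    Doubling the hash rate halves the expected block time until difficulty is readjusted, which is why exponential hash-rate swings break the block creation interval.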

    On the trade-offs of Proof-of-Work algorithms in blockchains

    No full text
    There are many different protocols that regulate the way in which each node on a blockchain network reaches consensus that a newly created block is valid. One of these protocols, Proof-of-Work (PoW), gained popularity when it was implemented in a blockchain-based cryptocurrency known as Bitcoin. However, there are inherent deficiencies in its current implementation. This paper discusses these deficiencies, as well as the parameters that directly and indirectly affect its efficacy and performance, so that possible enhancements to the protocol can be investigated.

    Proof-of-Work Difficulty Readjustment with Genetic Algorithm

    No full text
    Blockchain is a decentralized, distributed, and public digital ledger technology. It can be visualized as a gradually increasing list of “blocks” containing data that are linked together using cryptographic hashes. Each transaction is verified by several participating nodes that compute a complex mathematical problem. The complexity of this computation, also known as Proof-of-Work (PoW), is governed by the difficulty, set on a periodic basis. If the hash rate of the blockchain’s PoW grows or declines exponentially, the blockchain will be unable to maintain the block creation interval. The use of a genetic algorithm (GA) in addition to the existing difficulty adjustment algorithm is proposed in response, optimizing the blockchain parameters. Simulations of three scenarios, as well as the default, were performed and the results recorded. Based on the results, the blockchain with GA reaches the expected block time 74.4% faster than the blockchain without GA. Moreover, the standard deviations of the average block time and difficulty decreased by 99.4% and 99.5%, respectively, when both block and difficulty adjustment intervals were considered for optimization, compared to the default blockchain without GA.
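    The GA idea common to the abstracts above can be sketched as a small single-objective search (the papers use richer setups, including NSGA-II): evolve a difficulty multiplier so that the resulting block time, under a changed hash rate, stays near the 600 s target. All constants and the fitness model are illustrative assumptions, not the papers' parameters.

```python
import random

# Minimal single-objective GA sketch: selection, arithmetic crossover,
# and Gaussian mutation over a difficulty multiplier. With hash rate
# doubled, the optimum multiplier is 2.0 (restoring the 600 s target).
def fitness(multiplier, base_difficulty=1.0, hash_rate=2.0, target=600.0):
    block_time = target * base_difficulty * multiplier / hash_rate
    return -abs(block_time - target)  # higher is better

def evolve(pop_size=30, generations=60, seed=7):
    rng = random.Random(seed)
    pop = [rng.uniform(0.1, 4.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]      # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = (a + b) / 2               # arithmetic crossover
            child += rng.gauss(0.0, 0.05)     # Gaussian mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # should approach 2.0, matching the doubled hash rate
```

    In the papers the search instead tunes blockchain parameters such as block and difficulty adjustment intervals against simulated network behavior, but the evolutionary loop has this same shape.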