
    P2CP: A New Cloud Storage Model to Enhance Performance of Cloud Services

    This paper presents a storage model named Peer to Cloud and Peer (P2CP). Assuming that the P2CP model follows a Poisson process and Little's law, we prove that the speed and availability of P2CP are generally better than those of the pure Peer to Peer (P2P) model, the Peer to Server and Peer (P2SP) model, and the cloud model. A key feature of P2CP is that it has three data transmission tunnels: the cloud-user data transmission tunnel, the clients' data transmission tunnel, and the common data transmission tunnel. P2CP uses the cloud storage system as a common storage system. When data transmission occurs, the data nodes, the cloud user, and the non-cloud user all participate in completing the transaction.
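
    The queueing argument can be illustrated with a back-of-the-envelope Little's law calculation. The sketch below is not the paper's proof; it uses made-up arrival and service rates and the simplifying assumption that P2CP's three tunnels behave like three times the aggregate service capacity of a single tunnel.

    ```python
    # Illustrative sketch only: an M/M/1-style Little's law comparison with
    # hypothetical rates, not the paper's actual analysis of P2CP.

    def mean_time_in_system(arrival_rate: float, service_rate: float) -> float:
        """M/M/1 mean sojourn time W = 1 / (mu - lambda); requires mu > lambda."""
        if service_rate <= arrival_rate:
            raise ValueError("unstable system: service rate must exceed arrival rate")
        return 1.0 / (service_rate - arrival_rate)

    arrival_rate = 8.0         # hypothetical download requests per second
    single_tunnel_rate = 10.0  # hypothetical service rate of one transmission tunnel

    # Simplifying assumption: P2CP's cloud-user, clients', and common tunnels
    # are modeled as triple the aggregate service capacity.
    p2cp_rate = 3 * single_tunnel_rate

    w_single = mean_time_in_system(arrival_rate, single_tunnel_rate)
    w_p2cp = mean_time_in_system(arrival_rate, p2cp_rate)

    print(f"single tunnel:               W = {w_single:.3f} s")
    print(f"P2CP (3 tunnels, aggregate): W = {w_p2cp:.3f} s")
    # By Little's law, L = lambda * W, so the shorter sojourn time also means
    # fewer requests waiting in the system on average.
    ```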

    Intelligent microscope III


    Exploring the Impact of Serverless Computing on Peer To Peer Training Machine Learning

    The increasing demand for computational power in big data and machine learning has driven the development of distributed training methodologies. Among these, peer-to-peer (P2P) networks provide advantages such as enhanced scalability and fault tolerance. However, they also encounter challenges related to resource consumption, costs, and communication overhead as the number of participating peers grows. In this paper, we introduce a novel architecture that combines serverless computing with P2P networks for distributed training and present a method for efficient parallel gradient computation under resource constraints. Our findings show a significant enhancement in gradient computation time, with up to a 97.34% improvement compared to conventional P2P distributed training methods. Regarding costs, our analysis confirmed that the serverless architecture can incur higher expenses, reaching up to 5.4 times more than instance-based architectures. It is essential to consider that these higher costs come with marked improvements in computation time, particularly under resource-constrained scenarios. Despite the cost-time trade-off, the serverless approach still holds promise due to its pay-as-you-go model. By utilizing dynamic resource allocation, it enables faster training times and optimized resource utilization, making it a promising candidate for a wide range of machine learning applications.
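
    The general pattern described here, computing gradients on data shards in parallel across stateless workers and then aggregating them, can be sketched as follows. This is a minimal illustration, not the paper's architecture: a local process pool stands in for serverless function invocations, and the linear model, shard layout, and hyperparameters are hypothetical.

    ```python
    # Minimal sketch of parallel per-shard gradient computation with aggregation.
    # A real serverless deployment would dispatch each shard to a function
    # invocation; here ProcessPoolExecutor plays the role of stateless workers.

    from concurrent.futures import ProcessPoolExecutor
    import numpy as np

    def shard_gradient(args):
        """Gradient of mean squared error on one data shard for a linear model."""
        w, X, y = args
        residual = X @ w - y
        return X.T @ residual / len(y)

    def parallel_gradient(w, shards):
        """Average the per-shard gradients computed by independent workers."""
        with ProcessPoolExecutor() as pool:
            grads = list(pool.map(shard_gradient, [(w, X, y) for X, y in shards]))
        return np.mean(grads, axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w_true = np.array([2.0, -1.0])

        # Four hypothetical peers, each holding a private data shard.
        shards = []
        for _ in range(4):
            X = rng.normal(size=(256, 2))
            shards.append((X, X @ w_true + 0.01 * rng.normal(size=256)))

        w = np.zeros(2)
        for _ in range(100):  # plain gradient descent over the aggregated gradient
            w -= 0.1 * parallel_gradient(w, shards)
        print("recovered weights:", np.round(w, 3))
    ```

    In this toy setting the only design choice that carries over is the split between stateless gradient workers and a lightweight aggregation step; the cost-time trade-off reported in the abstract depends on the pricing of the serverless platform and is not modeled here.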