74 research outputs found

    Cookery: A Framework for Creating Data Processing Pipeline Using Online Services

    With the increasing amount of data, the importance of data analysis in various scientific domains has grown, and a large amount of scientific data has shifted to cloud-based storage. The cloud offers both storage and computation power. The Cookery framework is a tool developed to build scientific applications using cloud services. In this paper we present the Cookery system and show how it can be used to authenticate against and use standard online third-party services to easily create data analytics pipelines. The Cookery framework is not limited to working with standard web services; it can also integrate with the emerging AWS Lambda, which is part of a new computing paradigm collectively known as serverless computing. The combination of AWS Lambda and Cookery makes it possible for users in many scientific domains, who do not have any programming experience, to create data processing pipelines using cloud services in a short time.
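    As a rough illustration of the kind of pipeline step Cookery automates, the sketch below implements one stage as a plain AWS Lambda handler in Python that calls an online service and returns a small aggregate; the service URL and field names are hypothetical placeholders, and this is not Cookery's actual API.

```python
# Minimal sketch of one pipeline step as an AWS Lambda handler (Python),
# illustrating the kind of glue Cookery automates; the service URL and
# record fields below are hypothetical, not part of Cookery's API.
import json
import urllib.request

SERVICE_URL = "https://api.example.org/measurements"  # hypothetical third-party service

def handler(event, context):
    """Fetch records from an online service and emit a small aggregate."""
    with urllib.request.urlopen(f"{SERVICE_URL}?dataset={event['dataset']}") as resp:
        records = json.loads(resp.read())
    values = [r["value"] for r in records]   # assumes each record carries a "value" field
    return {
        "dataset": event["dataset"],
        "count": len(values),
        "mean": sum(values) / len(values) if values else None,
    }
```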

    Asynchronous Execution Platform for Edge Node Devices

    An asynchronous distributed execution platform that enables efficient and seamless task submission to a remote node from a cluster of edge node devices using a reactive framework, and provides real-time execution metrics persisted in an Elastic database. Queues are used for job submission, and different compute units deliver the infrastructure for executing the submitted jobs. The proposed system is a generic framework that can be used in any enterprise web application where execution on a remote node is required. Through this we aim to provide an enterprise-grade solution for task submission and management on a remote machine, using new, efficient technologies such as SpringBoot and RabbitMQ. There is growing demand for remote computing, and huge workloads cannot be executed on small local machines; our system can be used directly or incorporated into other solutions.
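    A minimal sketch of the queue-based job submission described above, written with Python and the pika RabbitMQ client rather than the SpringBoot stack the paper uses; the queue name and job payload are illustrative assumptions.

```python
# Sketch of queue-based job submission to RabbitMQ, mirroring the paper's
# design in Python (pika); the queue name "edge-jobs" and the payload fields
# are illustrative assumptions, not the platform's actual schema.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="edge-jobs", durable=True)   # survive broker restarts

job = {"node_id": "edge-42", "task": "resize-images", "args": {"width": 128}}
channel.basic_publish(
    exchange="",
    routing_key="edge-jobs",
    body=json.dumps(job),
    properties=pika.BasicProperties(delivery_mode=2),    # persistent message
)
connection.close()
```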

    A review on serverless architectures - function as a service (FaaS) in cloud computing

    With the emergence of cloud computing as the inevitable IT computing paradigm, the perception of the compute reference model and the way services are built have evolved into new dimensions. Serverless computing is an execution model in which the cloud service provider dynamically manages the allocation of the server's compute resources. The consumer is billed for the actual volume of resources consumed, instead of paying for pre-purchased units of compute capacity. This model evolved as a way to achieve optimal cost, minimal configuration overhead, and greater ability for applications to scale in the cloud. The potential of the serverless compute model is well recognised by the major cloud service providers, as reflected in their adoption of the serverless computing paradigm. This review paper presents a comprehensive study of serverless computing architecture and also presents an experimental study of the working principle of the serverless computing reference model adopted by AWS Lambda. The various research avenues in serverless computing are identified and presented.
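    To make the pay-per-use billing model concrete, the following back-of-the-envelope sketch estimates a monthly bill from invocation count, average duration, and memory size; the rates are assumed placeholders rather than authoritative AWS Lambda prices.

```python
# Back-of-the-envelope sketch of the pay-per-use billing model described above.
# The per-GB-second and per-request rates are illustrative assumptions, not
# authoritative AWS Lambda prices; consult the provider's pricing page.
PRICE_PER_GB_SECOND = 0.0000166667   # assumed rate, USD
PRICE_PER_REQUEST = 0.0000002        # assumed rate, USD

def monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Cost = compute (GB-seconds actually used) + per-request charges."""
    gb_seconds = invocations * (avg_duration_ms / 1000.0) * (memory_mb / 1024.0)
    return gb_seconds * PRICE_PER_GB_SECOND + invocations * PRICE_PER_REQUEST

# Example: 3 million invocations, 120 ms average duration, 256 MB memory.
print(f"${monthly_cost(3_000_000, 120, 256):.2f} per month")
```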

    No more, no less - A formal model for serverless computing

    Serverless computing, also known as Functions-as-a-Service, is a recent paradigm aimed at simplifying the programming of cloud applications. The idea is that developers design applications in terms of functions, which are then deployed on a cloud infrastructure. The infrastructure takes care of executing the functions whenever requested by remote clients, dealing automatically with distribution and scaling with respect to inbound traffic. While vendors already support a variety of programming languages for serverless computing (e.g. Go, Java, Javascript, Python), as far as we know there is no reference model yet to formally reason on this paradigm. In this paper, we propose the first formal programming model for serverless computing, which combines ideas from both the λ-calculus (for functions) and the π-calculus (for communication). To illustrate our proposal, we model a real-world serverless system. Thanks to our model, we are also able to capture and pinpoint the limitations of current vendor technologies, proposing possible amendments.
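    For intuition only, a reduction rule in the spirit of such a λ/π-style model might let a deployed function definition react to a request on its name by spawning an instance of its body; the notation below is an illustrative sketch, not the syntax or semantics of the paper's actual calculus.

```latex
% Illustrative reduction: a deployed function f with body \lambda x.e reacts to a
% request carrying value v, spawning an instance e\{v/x\}; the definition persists
% so further requests can be served. A sketch only, not the paper's calculus.
\[
  \langle f \mapsto \lambda x.\,e \rangle \;\|\; \overline{f}\langle v \rangle
  \;\;\longrightarrow\;\;
  \langle f \mapsto \lambda x.\,e \rangle \;\|\; e\{v/x\}
\]
```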

    An investigation of the impact of language runtime on the performance and cost of serverless functions

    Serverless, otherwise known as "Function-as-a-Service" (FaaS), is a compelling evolution of cloud computing that is highly scalable and event-driven. Serverless applications are composed of multiple independent functions, each of which can be implemented in a range of programming languages. This paper seeks to understand the impact of the choice of language runtime on the performance and subsequent cost of serverless function execution. It presents the design and implementation of a new serverless performance testing framework created to analyse performance and cost metrics for both AWS Lambda and Azure Functions. For optimum performance and cost management of serverless applications, Python is the clear choice on AWS Lambda. C# .NET is the top performer and most economical option for Azure Functions. NodeJS on Azure Functions and .NET Core 2 on AWS should be avoided or, at the very least, used carefully in order to avoid their potentially slow and costly start-up times.
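    In the spirit of such a testing framework (though not the authors' actual tool), a simple latency probe can time repeated synchronous invocations of a deployed function; the sketch below uses boto3 against AWS Lambda, and the function name is a hypothetical placeholder.

```python
# Minimal latency probe in the spirit of the paper's testing framework (not the
# authors' actual tool): time repeated synchronous invocations of one AWS Lambda
# function. "runtime-benchmark-python" is a hypothetical function name.
import json
import time
import boto3

client = boto3.client("lambda")

def probe(function_name, runs=20):
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        client.invoke(FunctionName=function_name, Payload=json.dumps({"n": 1000}))
        latencies.append((time.perf_counter() - start) * 1000.0)  # round-trip, ms
    return min(latencies), sum(latencies) / len(latencies)

best, mean = probe("runtime-benchmark-python")
print(f"best {best:.1f} ms, mean {mean:.1f} ms over 20 calls")
```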

    Collaborative, Distributed Simulations of Agri-Food Supply Chains. Analysis on How Linking Theory and Practice by Using Multi-agent Structures

    Simulations help to understand and predict the behaviour of complex phenomena, such as distributed socio-technical systems or how stakeholders interact in complex domains. Such domains are normally based on networked interaction, where information, product, and decision flows come into play, especially under well-known supply chain structures. Although tools exist to simulate supply chains, they do not adequately support multiple stakeholders in collaboratively creating and exploring a variety of decision-making scenarios. Hence, in order to provide a preliminary understanding of how these interactions affect stakeholders' decision-making, this research presents the study, analysis, and development of a robust platform to collaboratively build and simulate communication along supply chains. Since realistic supply chain behaviours are complex, a multi-agent approach was selected in order to represent such complexities in a standardised manner. The platform provides agent behaviours for common agent patterns, and it offers extension hotspots that allow expert users to implement more specific agent behaviour (which requires programming). As a key contribution, the technical aspects of the platform are presented, and the role of multi-level supply chain scenario simulation is discussed and analysed, especially in the context of digital supply chain transformation in the agri-food sector. Finally, we discuss lessons learned from early tests with the reference implementation of the platform.
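    The multi-agent idea can be illustrated with a toy supply chain in which demand propagates upstream from retailer to producer; the sketch below is only an illustration, and its agent roles and message fields are assumptions rather than the platform's actual agent patterns or extension hotspots.

```python
# Tiny multi-agent sketch of a supply chain "order flows upstream" interaction.
# Agent roles and message fields are assumptions for illustration only, not the
# platform's actual agent patterns or extension hotspots.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    supplier: "Agent | None" = None
    log: list = field(default_factory=list)

    def order(self, product, qty):
        self.log.append(f"{self.name} orders {qty} x {product}")
        if self.supplier:                       # propagate demand upstream
            self.supplier.order(product, qty)
        else:
            self.log.append(f"{self.name} produces {qty} x {product}")

farmer = Agent("farmer")
processor = Agent("processor", supplier=farmer)
retailer = Agent("retailer", supplier=processor)

retailer.order("apple juice", 500)
for agent in (retailer, processor, farmer):
    print("\n".join(agent.log))
```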

    ProFaaStinate: Delaying Serverless Function Calls to Optimize Platform Performance

    Function-as-a-Service (FaaS) enables developers to run serverless applications without managing operational tasks. In current FaaS platforms, both synchronous and asynchronous calls are executed immediately. In this paper, we present ProFaaStinate, which extends serverless platforms to enable delayed execution of asynchronous function calls. This allows platforms to execute calls at convenient times with higher resource availability or lower load. ProFaaStinate is able to optimize performance without requiring deep integration into the rest of the platform or a complex system model. In our evaluation, our prototype built on top of Nuclio can reduce request response latency and workflow duration while also preventing the system from being overloaded during load peaks. Using a document preparation use case, we show a 54% reduction in average request response latency. The resulting reduction in resource usage benefits both platforms and users through cost savings. Comment: Accepted for publication in Proc. of the 9th International Workshop on Serverless Computing (WoSC 23).
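    The core idea of deferring asynchronous calls until load is low can be sketched outside any real FaaS platform as a small scheduler that queues invocations and drains them when a load probe reports idle capacity; the probe, threshold, and workload below are stand-ins, not ProFaaStinate's actual integration with Nuclio.

```python
# Rough sketch of "delay asynchronous calls until the platform is idle", outside
# any real FaaS platform; the load probe (Unix-only load average), threshold, and
# workload are stand-ins for whatever signals ProFaaStinate uses inside Nuclio.
import asyncio
import os
import random

def platform_is_busy():
    """Crude load probe: 1-minute load average vs. core count (an assumption)."""
    return os.getloadavg()[0] > os.cpu_count() * 0.8

async def drain_when_idle(pending):
    """Execute deferred calls only when the load probe says the node is idle."""
    while True:
        if not platform_is_busy() and not pending.empty():
            fn, args = await pending.get()
            await fn(*args)                     # run the deferred call now
        else:
            await asyncio.sleep(0.5)

async def render_document(doc_id):
    await asyncio.sleep(random.uniform(0.1, 0.3))    # simulated work
    print(f"rendered document {doc_id}")

async def main():
    pending = asyncio.Queue()                   # deferred asynchronous invocations
    asyncio.create_task(drain_when_idle(pending))
    for i in range(5):
        await pending.put((render_document, (i,)))   # defer instead of calling now
    await asyncio.sleep(5)                      # give the drainer time to catch up

asyncio.run(main())
```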