
    Fog computing for integrating agents and Web services into an autonomous reflective middleware

    Service Oriented Architecture (SOA) has emerged as a dominant architecture for interoperability between applications, using a loosely coupled model based on the flexibility of Web Services; this has enabled a wide range of applications, notably cloud computing. Multi-Agent Systems (MAS), in turn, are widely used in industry because they provide proactive, intelligent solutions to complex problems. Intelligent environments (Smart City, Smart Classroom, Cyber Physical System, and Smart Factory, among others) benefit greatly from both architectures: MAS endows the environment with intelligence, while SOA lets users interact with cloud services that extend the capabilities of the devices deployed in the environment. Additionally, the fog computing paradigm extends cloud computing closer to the things that sense and act on the intelligent environment, making it possible to address issues such as mobility, real-time constraints, low latency, and geo-localization. In this article we present a middleware that not only allows MAS and SOA to communicate bidirectionally and transparently, but also applies the fog computing paradigm autonomously, according to the context and the system load factor. We also analyze the performance impact of incorporating fog computing into our middleware and compare it with other works.
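    The abstract describes a context-aware choice between fog and cloud execution driven by the system load factor. As a minimal illustrative sketch only (the class, function, and threshold below are hypothetical and not taken from the paper), the following Python snippet shows what such a tier-selection policy could look like.

```python
from dataclasses import dataclass


@dataclass
class Context:
    """Hypothetical runtime context observed by the middleware."""
    load_factor: float       # current fog-node load, 0.0 (idle) to 1.0 (saturated)
    latency_sensitive: bool  # does the request need a near real-time response?
    device_mobile: bool      # is the requesting device mobile / geo-localized?


def select_execution_tier(ctx: Context, load_threshold: float = 0.7) -> str:
    """Pick 'fog' or 'cloud' for a service or agent request.

    Illustrative policy only: latency-sensitive or mobile requests stay in
    the fog unless local nodes are overloaded; everything else goes to the cloud.
    """
    if ctx.latency_sensitive or ctx.device_mobile:
        return "fog" if ctx.load_factor < load_threshold else "cloud"
    return "cloud"


if __name__ == "__main__":
    print(select_execution_tier(Context(0.4, latency_sensitive=True, device_mobile=False)))   # fog
    print(select_execution_tier(Context(0.9, latency_sensitive=True, device_mobile=False)))   # cloud
    print(select_execution_tier(Context(0.2, latency_sensitive=False, device_mobile=False)))  # cloud
```

    A real middleware would of course also weigh network latency to the cloud, energy budgets, and the availability of nearby fog nodes before dispatching a request.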

    New benchmarking methodology and programming model for big data processing

    Big data processing is becoming a reality in numerous real-world applications. With the emergence of new data-intensive technologies and increasing amounts of data, new computing concepts are needed. The integration of big-data-producing technologies, such as wireless sensor networks, the Internet of Things, and cloud computing, into cyber-physical systems is reducing the time available to find appropriate solutions. This paper presents one possible solution for the coming exascale big data processing: a data flow computing concept. The performance of data flow systems processing big data should not be measured with the metrics defined for the prevailing control flow systems. A new benchmarking methodology is therefore proposed that integrates the speed, area, and power needed to execute a task. Computer rankings would look different under this methodology: data flow systems would outperform control flow systems. This claim is backed by recent results from implementations of specialized algorithms and applications on data flow systems, which show considerable speedups, space savings, and power reductions compared with implementations of the same algorithms on control flow computers. In our view, the next step in the development of data flow computing should be a move from specialized to more general algorithms and applications.
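    The abstract proposes ranking machines by a benchmark that combines speed, area, and power rather than speed alone. As a rough sketch of how such a combined figure of merit might be computed (the formula, function name, and sample numbers below are assumptions for illustration, not the paper's definition), consider:

```python
def combined_benchmark_score(exec_time_s: float,
                             chip_area_mm2: float,
                             power_w: float) -> float:
    """Illustrative combined figure of merit: higher is better.

    Treats the score as inversely proportional to the product of execution
    time, silicon area, and power drawn; an actual methodology would define
    its own weighting and normalization of the three factors.
    """
    return 1.0 / (exec_time_s * chip_area_mm2 * power_w)


# Hypothetical comparison of a data flow accelerator vs. a control flow CPU
dataflow_score = combined_benchmark_score(exec_time_s=2.0, chip_area_mm2=600.0, power_w=25.0)
cpu_score = combined_benchmark_score(exec_time_s=1.5, chip_area_mm2=450.0, power_w=120.0)
print(f"data flow score: {dataflow_score:.2e}, control flow score: {cpu_score:.2e}")
```

    Under such a metric a data flow system can rank higher even when its raw execution time is longer, because its area and power costs for the same task are lower.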