
    Features for Killer Apps from a Semantic Web Perspective

    There are certain features that distinguish killer apps from other, ordinary applications. This chapter examines those features in the context of the semantic web, in the hope that a better understanding of the characteristics of killer apps might encourage their consideration when developing semantic web applications. Killer apps are highly transformative technologies that create new e-commerce venues and widespread patterns of behaviour. Information technology in general, and the Web in particular, have benefited from killer apps to create new networks of users and increase their value. The semantic web community, on the other hand, is still awaiting a killer app that proves the superiority of its technologies. The authors hope that this chapter will help to highlight some of the common ingredients of killer apps in e-commerce, and discuss how such applications might emerge in the semantic web.

    SDN-enabled Resource Provisioning Framework for Geo-Distributed Streaming Analytics

    Geographically distributed (geo-distributed) datacenters for stream data processing typically comprise multiple edge and core datacenters connected through a Wide-Area Network (WAN), with a master node responsible for allocating tasks to worker nodes. Since WAN links significantly impact the performance of distributed task execution, the existing task assignment approach is unsuitable for distributed stream data processing with low-latency and high-throughput demands. In this paper, we propose SAFA, a resource provisioning framework using the Software-Defined Networking (SDN) concept, with an SDN controller responsible for monitoring the WAN, selecting an appropriate subset of worker nodes, and assigning tasks to the designated worker nodes. We implemented the data plane of the framework in P4 and the control plane components in Python. We tested the performance of the proposed system on Apache Spark, Apache Storm, and Apache Flink using the Yahoo! streaming benchmark on a set of custom topologies. The results of the experiments validate that the proposed approach is viable for distributed stream processing and confirm that it can improve the processing time of incoming events by at least 1.64× over current stream processing systems.
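    The abstract describes the controller's role (monitor the WAN, select a subset of workers, assign tasks) only at a high level. The following is a minimal Python sketch of that control-plane idea under stated assumptions: the `WorkerNode` model, the latency-based greedy selection in `select_workers`, and the round-robin `assign_tasks` are illustrative choices, not SAFA's actual implementation.

```python
# Illustrative sketch only: SAFA's real selection and assignment logic is not
# given in the abstract, so the data model and policies below are assumptions.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class WorkerNode:
    name: str
    wan_latency_ms: float   # WAN latency to this worker, as measured by the controller
    capacity: int           # task slots the node can still accept


def select_workers(nodes: List[WorkerNode], needed_slots: int) -> List[WorkerNode]:
    """Greedily pick the lowest-latency workers until enough slots are covered."""
    chosen: List[WorkerNode] = []
    slots = 0
    for node in sorted(nodes, key=lambda n: n.wan_latency_ms):
        if slots >= needed_slots:
            break
        chosen.append(node)
        slots += node.capacity
    return chosen


def assign_tasks(tasks: List[str], workers: List[WorkerNode]) -> Dict[str, List[str]]:
    """Distribute tasks round-robin over the selected workers."""
    plan: Dict[str, List[str]] = {w.name: [] for w in workers}
    for i, task in enumerate(tasks):
        plan[workers[i % len(workers)].name].append(task)
    return plan


if __name__ == "__main__":
    nodes = [
        WorkerNode("edge-a", wan_latency_ms=12.0, capacity=2),
        WorkerNode("edge-b", wan_latency_ms=45.0, capacity=4),
        WorkerNode("core-1", wan_latency_ms=80.0, capacity=8),
    ]
    workers = select_workers(nodes, needed_slots=4)
    print(assign_tasks(["map-0", "map-1", "reduce-0", "reduce-1"], workers))
```

    In the paper's architecture the equivalent decisions would be informed by live WAN measurements collected via the P4 data plane rather than static values as shown here.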

    A stream processing framework based on linked data for information collaborating of regional energy networks

    © 2005-2012 IEEE. Coordinating energy networks to form a city-level, multidimensional integrated energy system is becoming a new trend in the Energy Internet (EI). Collaboration in the information layer is a core issue in achieving smart integration. However, the heterogeneity of multiagent data, the volatility of components, and the real-time analysis requirements in the EI bring significant challenges. To solve these problems, in this article we propose a stream processing framework based on linked data for information collaboration among multiple energy networks. The framework provides a universal data representation based on linked data and a semantic relation discovery approach to model and semantically fuse heterogeneous data. Semantics-based information transmission contracts and channels are automatically generated to adapt to structural changes in the EI. Multimodel-based, dynamically adjusted stream processing is implemented using data semantics. A real-world case study demonstrates the adaptability, feasibility, and flexibility of the proposed framework.
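    The abstract mentions a linked-data representation and semantic relation discovery but gives no schema, so the sketch below shows one way heterogeneous energy-network readings could be expressed as RDF triples with rdflib and related by a shared property. The namespace and the class and property names (`ElectricMeter`, `HeatSource`, `locatedIn`, etc.) are hypothetical, not the framework's actual vocabulary.

```python
# Illustrative sketch: the vocabulary below is invented for the example; the
# paper's actual linked-data model is not specified in the abstract.
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

EX = Namespace("http://example.org/energy#")

g = Graph()
g.bind("ex", EX)

# A reading from an electricity sub-network and one from a heating sub-network,
# fused into the same graph through shared terms.
meter = EX["meter-42"]
g.add((meter, RDF.type, EX.ElectricMeter))
g.add((meter, EX.locatedIn, EX["district-3"]))
g.add((meter, EX.powerKW, Literal(17.5, datatype=XSD.double)))

boiler = EX["boiler-7"]
g.add((boiler, RDF.type, EX.HeatSource))
g.add((boiler, EX.locatedIn, EX["district-3"]))
g.add((boiler, EX.thermalOutputKW, Literal(230.0, datatype=XSD.double)))

# A simple stand-in for "semantic relation discovery":
# link devices from different sub-networks that share a location.
query = """
SELECT ?a ?b WHERE {
    ?a ex:locatedIn ?loc .
    ?b ex:locatedIn ?loc .
    FILTER(?a != ?b)
}
"""
for a, b in g.query(query, initNs={"ex": EX}):
    print(f"{a} and {b} serve the same district")

print(g.serialize(format="turtle"))
```

    A uniform graph like this is what would let the framework generate transmission contracts and adjust stream processing from the data's semantics rather than from per-source formats.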