222 research outputs found

    Ontology based contextualization and context constraints management in web service processes

    The flexibility and dynamism of service-based applications require shifting validation to runtime; runtime monitoring of the dynamic features attached to service-based systems is therefore becoming an important research direction and motivates this work. We propose an ontology-based contextualization together with a framework and techniques for managing context constraints in a Web service process, enabling dynamic requirements validation monitoring at process runtime. Firstly, we propose an approach to define and model the dynamic service context attached to the composition and execution of services in a service process at runtime. Secondly, context constraint management is defined in a framework with three main processes: context manipulation and reasoning, context constraint generation, and dynamic instrumentation and validation monitoring of context constraints. The dynamic requirements attached to service composition and execution are generated as context constraints. Dynamic service context modelling is investigated through an empirical analysis of application scenarios in the classical business domain and an analysis of previous models in the literature. The orientation of context aspects within a general context taxonomy is considered important. The Web Ontology Language (OWL) offers several merits for formalising dynamic service context, such as shared conceptualization, logical language support for composition and reasoning, and XML-based interoperability, and an XML-based constraint representation is compatible with Web service technologies. Analysis of complementary case-study scenarios and expert opinions gathered through a survey illustrate the validity and completeness of our context model. The proposed techniques for context manipulation, context constraint generation, instrumentation and validation monitoring are investigated through a set of experiments in an empirical evaluation, and an analytical evaluation is used to assess the algorithms. Our contributions and evaluation results provide a further step towards a highly automated dynamic requirements management system for service processes at process runtime.
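    The abstract stays at a conceptual level, so the sketch below is only a rough Python illustration of the general idea of validating context constraints against a live service context at runtime; all names (ServiceContext, ContextConstraint, ConstraintMonitor) and the example constraints are hypothetical and not taken from the thesis.

        # Hypothetical sketch: dynamic requirements represented as context constraints
        # and checked against the current service context at process runtime.
        from dataclasses import dataclass, field
        from typing import Any, Callable, Dict, List

        ServiceContext = Dict[str, Any]   # e.g. {"location": "EU", "responseTimeMs": 420}

        @dataclass
        class ContextConstraint:
            name: str
            predicate: Callable[[ServiceContext], bool]   # evaluated against the live context

        @dataclass
        class ConstraintMonitor:
            constraints: List[ContextConstraint] = field(default_factory=list)

            def validate(self, context: ServiceContext) -> List[str]:
                """Return the names of constraints violated by the current context."""
                return [c.name for c in self.constraints if not c.predicate(context)]

        monitor = ConstraintMonitor([
            ContextConstraint("max_response_time", lambda ctx: ctx.get("responseTimeMs", 0) <= 500),
            ContextConstraint("allowed_region", lambda ctx: ctx.get("location") in {"EU", "UK"}),
        ])
        print(monitor.validate({"location": "US", "responseTimeMs": 420}))   # ['allowed_region']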

    A Semantic Framework for Declarative and Procedural Knowledge

    In any scientific domain, the full set of data and programs has reached an "-ome" status, i.e. it has grown massively. The original article on the Semantic Web describes the evolution of a Web of actionable information, i.e. information derived from data through a semantic theory for interpreting the symbols. In a Semantic Web, methodologies are studied for describing, managing and analyzing both resources (domain knowledge) and applications (operational knowledge) - without any restriction on what and where they are respectively suitable and available in the Web - as well as for realizing automatic and semantic-driven workflows of Web applications elaborating Web resources. This thesis attempts to provide a synthesis among Semantic Web technologies, Ontology Research, Knowledge and Workflow Management. Such a synthesis is represented by Resourceome, a Web-based framework consisting of two components which strictly interact with each other: an ontology-based and domain-independent knowledge manager system (Resourceome KMS) - relying on a knowledge model where resource and operational knowledge are contextualized in any domain - and a semantic-driven workflow editor, manager and agent-based execution system (Resourceome WMS). The Resourceome KMS and the Resourceome WMS are exploited in order to realize semantic-driven formulations of workflows, where activities are semantically linked to any involved resource. On the whole, combining the use of domain ontologies and workflow techniques, Resourceome provides a flexible domain and operational knowledge organization, a powerful engine for semantic-driven workflow composition, and a distributed, automatic and transparent environment for workflow execution.
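    As a purely illustrative sketch of the central idea of semantically linking workflow activities to the resources they involve (the class names and example data below are hypothetical and do not come from Resourceome):

        # Hypothetical sketch: workflow activities semantically linked to resources
        # that are described by ontology concepts.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Resource:
            name: str
            concept: str              # ontology concept the resource instantiates

        @dataclass
        class Activity:
            name: str
            uses: List[Resource]      # semantic links to the involved resources

        @dataclass
        class Workflow:
            activities: List[Activity]

            def resources_by_concept(self, concept: str) -> List[str]:
                """All resources of a given ontology concept used in the workflow."""
                return [r.name for a in self.activities for r in a.uses if r.concept == concept]

        wf = Workflow([
            Activity("analyse_dataset", [Resource("StatsToolkit", "AnalysisTool"),
                                         Resource("survey.csv", "Dataset")]),
            Activity("publish_report", [Resource("ReportService", "PublishingService")]),
        ])
        print(wf.resources_by_concept("AnalysisTool"))   # ['StatsToolkit']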

    Programming Languages for Distributed Computing Systems

    When distributed systems first appeared, they were programmed in traditional sequential languages, usually with the addition of a few library procedures for sending and receiving messages. As distributed applications became more commonplace and more sophisticated, this ad hoc approach became less satisfactory. Researchers all over the world began designing new programming languages specifically for implementing distributed applications. These languages and their history, their underlying principles, their design, and their use are the subject of this paper. We begin by giving our view of what a distributed system is, illustrating with examples to avoid confusion on this important and controversial point. We then describe the three main characteristics that distinguish distributed programming languages from traditional sequential languages, namely, how they deal with parallelism, communication, and partial failures. Finally, we discuss 15 representative distributed languages to give the flavor of each. These examples include languages based on message passing, rendezvous, remote procedure call, objects, and atomic transactions, as well as functional languages, logic languages, and distributed data structure languages. The paper concludes with a comprehensive bibliography listing over 200 papers on nearly 100 distributed programming languages
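    The survey is descriptive rather than code-oriented, but the message-passing style it discusses can be sketched briefly; the Python fragment below uses threads and a queue as stand-ins for distributed processes and a channel, and deliberately ignores the naming, typing and partial-failure concerns that real distributed languages address.

        # Illustrative only: asynchronous message passing between two "processes"
        # (threads) over a channel (queue).
        import queue
        import threading

        channel: "queue.Queue[str]" = queue.Queue()

        def server() -> None:
            while True:
                msg = channel.get()        # blocking receive
                if msg == "stop":
                    break
                print(f"server received: {msg}")

        def client() -> None:
            for msg in ("hello", "world", "stop"):
                channel.put(msg)           # non-blocking send

        t = threading.Thread(target=server)
        t.start()
        client()
        t.join()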

    Integrating Learning and Business Process Management

    Recent research activities in the field of TEL have created a new awareness of intelligent learning infrastructures. To foster the usage of innovative TEL in the workplace, it must be integrated into organizational business operations and aligned with their learning requirements. As the semantic interface of the organizational ICT infrastructure, business processes represent the potential link between learning and business information systems. Today, most organizations and their supporting ICT systems have incorporated processes as central objects of control. They manage their businesses along their processes, starting with process design, moving through process execution, and ending with process control and monitoring that feeds back into improved business process design. As this process lifecycle has become the central instrument of BPM, it lends itself to being the vehicle for business-integrated learning management. This paper positions the thesis that a reciprocal relationship between business and learning processes is the prerequisite for prospective integrated workplace learning.

    Document Verification with Temporal Description Logic (Dokumentverifikation mit Temporaler Beschreibungslogik)

    The thesis proposes a new formal framework for checking the content of web documents along individual reading paths. It is vital for the readability of web documents that their content is consistent and coherent along the possible browsing paths through the document. Manually ensuring the coherence of content along the possibly huge number of different browsing paths in a web document is time-consuming and error-prone, and existing methods for document validation and verification are not sufficiently expressive and efficient. The innovative core idea of this thesis is to combine the temporal logic CTL and the description logic ALC for the representation of consistency criteria. The resulting new temporal description logic ALCCTL can - in contrast to existing specification formalisms - compactly represent coherence criteria on documents. Verification of web documents is modelled as a model checking problem for ALCCTL. The decidability and polynomial complexity of the ALCCTL model checking problem are proven, and a sound, complete and optimal model checking algorithm is presented. Case studies on real and realistic web documents demonstrate the performance and adequacy of the proposed methods. Existing methods such as symbolic model checking or XML-based document validation are outperformed in both expressiveness and speed.
    The dissertation presents a new formal framework for automatically checking content-related and structural consistency criteria of web documents. Much information is made accessible today in the form of web documents. Complex documents such as learning materials or technical documentation must satisfy a wide range of quality criteria: the information content of a document must be up to date, complete and internally consistent, and the presentation structure must serve different target groups with different information needs. Given the multitude of requirements and usage contexts of an electronic document, ensuring basic consistency properties is not trivial. This work combines model checking techniques known from hardware/software verification with methods for representing ontologies, so that both the structure of a document and its content relationships can be taken into account when checking consistency criteria. The new temporal description logic ALCCTL is proposed as a specification language for consistency criteria, and fundamental properties such as decidability, expressiveness and complexity are investigated. The adequacy and practical applicability of the approach are evaluated in case studies with e-learning documents. The results surpass known approaches such as symbolic model checking or methods for validating XML documents in performance, in the expressiveness of the checkable criteria, and in flexibility with respect to document type and format.
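    As a rough illustration of the kind of coherence criterion such a logic is meant to express (the exact ALCCTL syntax is defined in the thesis and may differ), a requirement like "on every reading path, every term that is defined is eventually illustrated by an example" could be written as a CTL-style temporal operator applied to an ALC concept inclusion:

        % Illustrative formula only; see the thesis for the precise ALCCTL syntax.
        \[
          \mathbf{AG}\bigl(\mathit{DefinedTerm} \sqsubseteq \mathbf{EF}\,\mathit{ExemplifiedTerm}\bigr)
        \]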

    Rethinking Pedagogy: Exploring the Potential of Digital Technology in Achieving Quality Education

    (First Paragraph) The Mahatma Gandhi Institute of Education for Peace and Sustainable Development (MGIEP) is UNESCO’s Category 1 education Institute in the Asia-Pacific region devoted to education for peace and sustainable development, as enshrined in SDG Target 4.7. UNESCO MGIEP promotes the use of digital learning platforms where teachers and students can co-create and share a highly interactive learning experience. With the rise of the internet, there has been a proliferation of online content and digital resources intended to support teaching and learning, albeit widely varying in quality. Digital education media and resources, if carefully designed and implemented, have a significant potential to be mobilized on a massive scale to support transformative learning for building sustainable, flourishing societies

    Robot graphic simulation testbed

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts

    Interoperability of wireless communication technologies in hybrid networks : evaluation of end-to-end interoperability issues and quality of service requirements

    Hybrid networks employing wireless communication technologies have brought closer the vision of communication "anywhere, any time, with anyone". These technologies comprise various standards, protocols, architectures, characteristics, models, devices, and modulation and coding techniques. They naturally share some common characteristics, but there are also many important differences, and new advances are emerging very rapidly, with the advent of new models, characteristics, protocols and architectures. This rapid evolution imposes many challenges and issues to be addressed; of particular importance are the interoperability issues of the following wireless technologies: Wireless Fidelity (Wi-Fi, IEEE 802.11), Worldwide Interoperability for Microwave Access (WiMAX, IEEE 802.16), Single Channel per Carrier (SCPC), Digital Video Broadcasting via Satellite (DVB-S/DVB-S2), and Digital Video Broadcasting Return Channel via Satellite (DVB-RCS). Because of these differences, the technologies do not generally interoperate easily with each other, owing to various interoperability and Quality of Service (QoS) issues.
    The aim of this study is to assess and investigate end-to-end interoperability issues and QoS requirements, such as bandwidth, delay, jitter, latency, packet loss, throughput, TCP performance, UDP performance, unicast and multicast services, and availability, on hybrid wireless communication networks employing both satellite broadband and terrestrial wireless technologies. The thesis provides an introduction to wireless communication technologies, followed by a review of previous research on hybrid networks (both satellite and terrestrial wireless technologies, particularly Wi-Fi, WiMAX, DVB-RCS and SCPC). Previous studies have discussed Wi-Fi, WiMAX, DVB-RCS, SCPC and 3G technologies and their standards, properties and characteristics, such as operating frequency, bandwidth, data rate, basic configuration, coverage, power, interference, social issues, security problems, and physical and MAC layer design and development issues. Although some of these studies provide valuable contributions to this area of research, they are limited to link-layer characteristics, TCP performance, delay, bandwidth, capacity, data rate and throughput. None of them covers all aspects of end-to-end interoperability issues and QoS requirements, such as bandwidth, delay, jitter, latency, packet loss, link performance, TCP and UDP performance, and unicast and multicast performance, at the end-to-end level on hybrid wireless networks.
    Interoperability issues are discussed in detail, and the different technologies and protocols are compared using appropriate testing tools, assessing various performance measures including bandwidth, delay, jitter, latency, packet loss, throughput and availability. The standards, protocol suites/models and architectures for Wi-Fi, WiMAX, DVB-RCS and SCPC, along with different platforms and applications, are discussed and compared. Using a robust approach, which includes a new testing methodology and a generic test plan, testing was conducted in various realistic scenarios on real networks comprising variable numbers and types of nodes, and data, traces, packets and files were captured from various live scenarios and sites. The test results were analysed in order to measure and compare the characteristics of wireless technologies, devices, protocols and applications.
    The motivation of this research is to study the end-to-end interoperability issues and QoS requirements of rapidly growing hybrid networks in a comprehensive and systematic way. Its significance lies in being based on a comprehensive and systematic investigation of issues and facts, rather than on hypothetical ideas, scenarios or simulations; this informed the design of a test methodology for empirical data gathering through real network testing, suitable for measuring single-link or end-to-end issues in hybrid networks using proven test tools. The investigation encompasses an extensive series of tests measuring delay, jitter, packet loss, bandwidth, throughput, availability, audio and video session performance, and multicast and unicast performance, together with stress testing. The testing covers the most common scenarios in hybrid networks and leads to recommendations for achieving good end-to-end interoperability and QoS. The contributions of the study include the identification of gaps in the research, a description of interoperability issues, a comparison of the most common test tools, a generic test plan, a new testing process and methodology, and analysis and network design recommendations for end-to-end interoperability issues and QoS requirements, covering the complete cycle of this research.
    It is found that UDP is more suitable than TCP for hybrid wireless networks, particularly for the demanding applications considered, since TCP presents significant problems for multimedia and live traffic with strict QoS requirements on delay, jitter, packet loss and bandwidth. The main bottleneck for satellite communication is a delay of approximately 600 to 680 ms, caused by the long distances involved (and the finite speed of light) when communicating over geostationary satellites. Delay and packet loss can be controlled using various methods, such as traffic classification, traffic prioritisation, congestion control, buffer management, delay compensators, protocol compensators, automatic repeat request techniques, flow scheduling, and bandwidth allocation.
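    The delay figure quoted above is consistent with a back-of-the-envelope propagation estimate for a geostationary link, assuming a minimum path of twice the GEO altitude of about 35,786 km per direction and ignoring slant range, on-board switching and queuing:

        % Rough propagation estimate for a geostationary satellite hop.
        \[
          t_{\text{one-way}} \approx \frac{2h}{c}
            = \frac{2 \times 35{,}786\ \text{km}}{299{,}792\ \text{km/s}}
            \approx 239\ \text{ms},
          \qquad
          t_{\text{round-trip}} \approx 478\ \text{ms}.
        \]

    Longer slant paths at low elevation angles plus gateway processing and queuing push typical end-to-end round-trip delays towards the 600 to 680 ms range reported in the thesis.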