18 research outputs found

    Towards a new approach for enterprise integration: the semantic modeling approach

    Manufacturing today has become a matter of the effective and efficient application of information technology and knowledge engineering. Manufacturing firms’ success depends to a great extent on information technology, which emphasizes the integration of the information systems used by a manufacturing enterprise. This integration is also called enterprise application integration (here the term application means information systems or software systems). The methodology for enterprise application integration, in particular enterprise application integration automation, has been studied for at least a decade; however, no satisfactory solution has been found. Enterprise application integration is becoming even more difficult due to the explosive growth of various information systems as a result of ever-increasing competition in the software market. This thesis aims to provide a novel solution to enterprise application integration. The semantic data model concept that evolved in database technology is revisited and applied to enterprise application integration. This has led to two novel ideas developed in this thesis. First, an ontology of an enterprise with five levels (following the data abstraction of generalization/specialization) is proposed and represented using the Unified Modeling Language (UML). Second, both the ontology for the enterprise functions and the ontology for the enterprise applications are modeled to allow automatic processing of information back and forth between these two domains. The approach built on these novel ideas is called the enterprise semantic model approach. The thesis presents a detailed description of the enterprise semantic model approach, including the fundamental rationale behind the enterprise semantic model, the ontology of enterprises with levels, and a systematic way towards the construction of a particular enterprise semantic model for a company. A case study is provided to illustrate how the approach works and to show its high potential for solving the existing problems within enterprise application integration.
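
    The abstract's key mechanism is a pair of ontologies (enterprise functions and enterprise applications) linked so that information can be processed automatically between the two domains. A minimal sketch of that mapping idea, with hypothetical concept names, levels, and lookup strategy that are not taken from the thesis, might look like this:

```python
# Sketch of two small ontologies (enterprise functions and enterprise
# applications) linked by explicit correspondences; all names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Concept:
    name: str
    level: int                          # depth in the generalization hierarchy
    parent: Optional["Concept"] = None  # generalization ("is-a") link

# enterprise-function ontology (generalization/specialization levels)
business_function = Concept("BusinessFunction", level=1)
order_management = Concept("OrderManagement", level=2, parent=business_function)

# enterprise-application ontology
application = Concept("Application", level=1)
erp_sales = Concept("ERP.SalesModule", level=2, parent=application)

# correspondences between the two domains enable automatic processing
function_to_application = {order_management: erp_sales}

def realising_application(func):
    """Walk up the generalization links until a mapped concept is found."""
    while func is not None:
        if func in function_to_application:
            return function_to_application[func]
        func = func.parent
    return None

print(realising_application(order_management).name)  # -> ERP.SalesModule
```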

    An Infrastructure for the Dynamic Distribution of Web Applications and Services

    This paper presents the design and implementation of an infrastructure that enables any Web application, regardless of its current state, to be stopped and uninstalled from a particular server, transferred to a new server, then installed, loaded, and resumed, with all of these events occurring "on the fly" and totally transparently to clients. Such functionality allows entire applications to move fluidly from server to server, reducing the overhead required to administer the system and increasing its performance in a number of ways: (1) dynamically replicating new instances of applications to several servers to raise throughput for scalability purposes, (2) moving applications to servers to achieve load balancing or other resource management goals, and (3) caching entire applications on servers located closer to clients. National Science Foundation (9986397).
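
    The migration sequence described above (stop, uninstall, transfer, install, load, resume) can be pictured with a small in-process simulation. The class and method names below are hypothetical; the real infrastructure operates on deployed Web applications and their runtime state, not on pickled Python objects.

```python
# Sketch of the described sequence: stop the application, capture its state,
# transfer it, then install and resume it on the target server. Names are
# hypothetical; the real infrastructure handles deployed Web applications.
import pickle
from dataclasses import dataclass, field

@dataclass
class WebApp:
    name: str
    sessions: dict = field(default_factory=dict)  # the "current state" to preserve
    running: bool = True

class Server:
    def __init__(self, host):
        self.host = host
        self.apps = {}

    def stop_and_package(self, name):
        app = self.apps.pop(name)   # stop and uninstall
        app.running = False
        return pickle.dumps(app)    # serialise state for transfer

    def install_and_resume(self, blob):
        app = pickle.loads(blob)    # install and load
        app.running = True
        self.apps[app.name] = app   # resume; clients are redirected here

source, target = Server("eu-server"), Server("us-server")
source.apps["shop"] = WebApp("shop", sessions={"user42": {"cart": ["book"]}})
blob = source.stop_and_package("shop")  # "on the fly", transparent to clients
target.install_and_resume(blob)
print(target.apps["shop"])
```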

    Project Venezia-Gondola (A Framework for P-Commerce)

    A novel project named Venezia-Gondola (Project V-G) is presented: an application platform that enables Peer-to-Peer commerce (P-Commerce) activities. A new pattern, the Inverted Model-View-Controller (IMVC), is proposed as suitable for P-Commerce. The author also explains the principles of Project V-G and a possible architecture for future development.

    Towards secure web services: Performance analysis, decision making and steganography approaches

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Web services provide a platform-neutral and programming-language-independent technology that supports interoperable machine-to-machine interaction over a network. Clients and other systems interact with Web services using a standardised XML messaging system, such as the Simple Object Access Protocol (SOAP), typically conveyed using HTTP with an XML serialisation in conjunction with other related Web standards. Nevertheless, the idea of applications from different parties communicating together raises a security threat. The challenge of Web services security is to understand and consider the risks of securing a Web-based service using the existing security techniques while simultaneously following evolving standards in order to fill the gap in Web services security. However, the performance of these security mechanisms raises concerns, due to the additional security content in SOAP messages, the higher number of message exchanges needed to establish trust, and the extra CPU time required to process these additions. As the interaction between service providers and requesters occurs via XML-based SOAP messages, securing Web services tends to make these messages longer than they would be otherwise, and consequently they require interpretation by XML parsers on both sides, which reduces the performance of Web services. The work described in this thesis can be broadly divided into three parts, the first of which is studying and comparing the performance of various security profiles applied to a Web service tested with different initial message sizes. The second part proposes a multi-criteria decision making framework to aid Web services developers and architects in selecting the best-suited security profile that satisfies the different requirements of a given application during the development process in a systematic, manageable, and effective way. The proposed framework, based on the Analytical Hierarchy Process (AHP) approach, incorporates not only the security requirements but also the performance considerations as well as the configuration constraints of these security profiles. The framework is then validated and evaluated using a scenario-driven approach to demonstrate situations where the decision making framework is used to make informed decisions to rank various security profiles in order to select the most suitable one for each scenario. Finally, the last part of this thesis develops a novel steganography method to be used for SOAP messages within Web services environments. This method is based on changing the order of XML elements according to a secret message. It has high imperceptibility; it leaves almost no trail because it uses the communication protocol as a cover medium, and it keeps the structure and size of the SOAP message intact. The method is empirically validated using a feasible scenario so as to indicate its utility and value.
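
    The steganographic part of the thesis hides a secret message in the order of XML elements while leaving the message's structure and size intact. A minimal sketch of one way such order-based embedding can work follows; the pairing scheme and bit convention are assumptions for illustration, not the thesis's algorithm.

```python
# Sketch of order-based embedding: each pair of sibling elements encodes one
# bit via its relative order. SOAP namespaces are omitted for brevity.
import xml.etree.ElementTree as ET

def embed(parent, bits):
    """Reorder child-element pairs so pair i encodes bits[i] ('0' = canonical order)."""
    children = list(parent)
    for i, bit in enumerate(bits):
        a, b = children[2 * i], children[2 * i + 1]
        canonical = a.tag <= b.tag            # requires distinct tag names per pair
        if (bit == "1") == canonical:         # flip the pair when its order is wrong
            children[2 * i], children[2 * i + 1] = b, a
    for child in list(parent):                # rebuild the parent in the new order
        parent.remove(child)
    parent.extend(children)

def extract(parent, n_bits):
    children = list(parent)
    return "".join("0" if children[2 * i].tag <= children[2 * i + 1].tag else "1"
                   for i in range(n_bits))

body = ET.fromstring("<Body><amount>10</amount><currency>EUR</currency>"
                     "<item>book</item><qty>2</qty></Body>")
embed(body, "10")
print(ET.tostring(body, encoding="unicode"))  # same elements, size intact, order changed
print(extract(body, 2))                       # -> 10
```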

    Pragmatic development of service based real-time change data capture

    This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is little data on the performance of CDC architectures in a real-time environment, yet such performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
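
    A rough sketch of the push-CDC idea behind such a Data Access Service: the service executes the OLTP transaction and, once it commits, pushes the captured change towards the warehouse, so CDC logic stays out of the application. The table names, queue transport, and function signature below are illustrative assumptions, not the thesis's TAAR interface.

```python
# Sketch: a Data Access Service executes the OLTP transaction and, after commit,
# pushes the captured change to a stream consumed by the warehouse loader.
import queue
import sqlite3
import time

change_stream = queue.Queue()  # stands in for the real transport to the warehouse

def post_transaction(db, order_id, amount):
    """'Transaction As A Resource': one posted resource corresponds to one OLTP transaction."""
    with db:  # commits on success, rolls back on error
        db.execute("INSERT INTO orders(id, amount) VALUES (?, ?)", (order_id, amount))
    # CDC logic lives in the DAS, decoupled from the application: push after commit
    change_stream.put({"table": "orders", "op": "INSERT",
                       "row": {"id": order_id, "amount": amount},
                       "committed_at": time.time()})

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders(id INTEGER PRIMARY KEY, amount REAL)")
post_transaction(db, 1, 9.99)
print(change_stream.get())  # the warehouse side receives the change with minimal latency
```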

    A Framework for Secure Management of Web Services (SMaWS) in Enterprise Application Integration

    This dissertation addresses challenges currently faced by enterprises that have embraced Web Service technology in order to reduce the cost of enterprise application integration (EAI) as well as improve the operational efficiency of their mission-critical business processes. The nature of Web Services introduces new challenges, such as dependency among applications, where a failure in one application can lead to a failure in other dependent applications. Such challenges have led to a growing need for enterprises to confront Web Service monitoring and management issues as a priority. As a solution, this dissertation proposes a SMaWS (Secure Management of Web Services) infrastructure for secure monitoring and management of Web Services. Its goals are to provide deeper visibility into Web Service runtime activities compared to current Web Service management tools; access to information about the Quality of Service (QoS) of these Web Services; and a unified monitoring environment for Web Services deployed across enterprise business units. This enables earlier detection of poor performance in each interdependent Web Service, which leads to faster diagnosis and resolution of possible performance issues and thus maximizes availability. This dissertation describes the requirements analysis for monitoring and management of Web Services across an enterprise environment. It describes the architecture and design of the SMaWS infrastructure proposed for secure monitoring and management of Web Services. The proposed SMaWS framework enables the instrumentation of existing and newly developed Web Service applications, and extracts Web Service performance statistics. It determines Web Service identity, reliability, availability, security, usage, and the license used by Web Service consumers to access a given service. This dissertation describes the SMaWS Repository and Security concepts that are proposed to address the challenge, faced by most distributed architectures, of enabling client applications to determine the location of the server (the “bootstrapping problem”) while ensuring both the integrity and confidentiality of the parties involved. Finally, this dissertation presents a prototype implementation of the SMaWS Manager Application and sample SMaWS Web Service applications. The experimental results obtained, in terms of the overhead induced by the SMaWS framework on the monitored Web Service applications, demonstrate the feasibility of the SMaWS infrastructure.
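
    The monitoring side of such a framework boils down to instrumenting each Web Service operation so that every call reports latency and success or failure to a central store. A minimal sketch of that instrumentation idea follows; the decorator, metric names, and sample operation are hypothetical, not the SMaWS interfaces.

```python
# Sketch: a decorator records per-operation latency and failures so a central
# monitor can derive QoS statistics.
import functools
import statistics
import time

latencies = {}  # operation name -> list of observed latencies in seconds
failures = {}   # operation name -> failure count

def monitored(op_name):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            except Exception:
                failures[op_name] = failures.get(op_name, 0) + 1
                raise
            finally:
                latencies.setdefault(op_name, []).append(time.perf_counter() - start)
        return inner
    return wrap

@monitored("GetQuote")
def get_quote(symbol):
    return 42.0  # stand-in for the real service logic

get_quote("IBM")
get_quote("IBM")
print("avg latency:", statistics.mean(latencies["GetQuote"]),
      "failures:", failures.get("GetQuote", 0))
```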

    Towards secure web services: performance analysis, decision making and steganography approaches

    Web services provide a platform-neutral and programming-language-independent technology that supports interoperable machine-to-machine interaction over a network. Clients and other systems interact with Web services using a standardised XML messaging system, such as the Simple Object Access Protocol (SOAP), typically conveyed using HTTP with an XML serialisation in conjunction with other related Web standards. Nevertheless, the idea of applications from different parties communicating together raises a security threat. The challenge of Web services security is to understand and consider the risks of securing a Web-based service using the existing security techniques while simultaneously following evolving standards in order to fill the gap in Web services security. However, the performance of these security mechanisms raises concerns, due to the additional security content in SOAP messages, the higher number of message exchanges needed to establish trust, and the extra CPU time required to process these additions. As the interaction between service providers and requesters occurs via XML-based SOAP messages, securing Web services tends to make these messages longer than they would be otherwise, and consequently they require interpretation by XML parsers on both sides, which reduces the performance of Web services. The work described in this thesis can be broadly divided into three parts, the first of which is studying and comparing the performance of various security profiles applied to a Web service tested with different initial message sizes. The second part proposes a multi-criteria decision making framework to aid Web services developers and architects in selecting the best-suited security profile that satisfies the different requirements of a given application during the development process in a systematic, manageable, and effective way. The proposed framework, based on the Analytical Hierarchy Process (AHP) approach, incorporates not only the security requirements but also the performance considerations as well as the configuration constraints of these security profiles. The framework is then validated and evaluated using a scenario-driven approach to demonstrate situations where the decision making framework is used to make informed decisions to rank various security profiles in order to select the most suitable one for each scenario. Finally, the last part of this thesis develops a novel steganography method to be used for SOAP messages within Web services environments. This method is based on changing the order of XML elements according to a secret message. It has high imperceptibility; it leaves almost no trail because it uses the communication protocol as a cover medium, and it keeps the structure and size of the SOAP message intact. The method is empirically validated using a feasible scenario so as to indicate its utility and value. EThOS - Electronic Theses Online Service, United Kingdom.
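
    The decision-making part of the thesis rests on the Analytical Hierarchy Process: derive criterion weights from pairwise comparisons, then rank candidate security profiles by weighted score. A minimal sketch of that step follows; the comparison values, scores, and profile names are made-up illustrations, not figures from the thesis.

```python
# Sketch of the AHP weighting step: column-normalise the pairwise-comparison
# matrix, average rows to get criterion weights, then rank profiles by weighted score.
criteria = ["security", "performance", "configuration"]
# pairwise[i][j]: importance of criterion i relative to criterion j (Saaty scale)
pairwise = [[1.0, 3.0, 5.0],
            [1 / 3, 1.0, 3.0],
            [1 / 5, 1 / 3, 1.0]]

n = len(criteria)
col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
weights = [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# each candidate security profile scored per criterion on a 0..1 scale (illustrative)
profiles = {"Profile A (authentication only)": [0.40, 0.90, 0.80],
            "Profile B (message signing)":     [0.80, 0.60, 0.50],
            "Profile C (sign and encrypt)":    [0.95, 0.30, 0.40]}

def score(name):
    return sum(w * s for w, s in zip(weights, profiles[name]))

ranking = sorted(profiles, key=score, reverse=True)
print({c: round(w, 3) for c, w in zip(criteria, weights)})
print("ranked profiles:", ranking)
```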

    Pragmatic development of service based real-time change data capture

    This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is little data on the performance of CDC architectures in a real-time environment, yet such performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources. EThOS - Electronic Theses Online Service, United Kingdom.
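
    Capture latency, as defined above, is the time between a change being committed to the OLTP database and the CDC mechanism capturing it. A minimal sketch of how that measurement can be taken for a pull-style mechanism follows; the schema and polling interval are assumptions, not the extended TPC-C benchmark.

```python
# Sketch: store each change's commit time on the OLTP side; when the CDC
# mechanism captures the change, the difference to "now" is the capture latency.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE new_order(o_id INTEGER PRIMARY KEY, committed_at REAL)")

def oltp_insert(o_id):
    with db:
        db.execute("INSERT INTO new_order(o_id, committed_at) VALUES (?, ?)",
                   (o_id, time.time()))

def pull_cdc_poll(last_seen_id):
    """Pull-style capture: periodically query for rows newer than the last seen id."""
    rows = db.execute("SELECT o_id, committed_at FROM new_order WHERE o_id > ?",
                      (last_seen_id,)).fetchall()
    captured_at = time.time()
    for o_id, committed_at in rows:
        print(f"order {o_id}: capture latency = {captured_at - committed_at:.4f}s")
    return rows

oltp_insert(1)
time.sleep(0.05)  # the polling interval dominates pull-CDC capture latency
pull_cdc_poll(last_seen_id=0)
```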

    Computer-supported virtual collaborative learning and assessment framework for distributed learning environment

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2002. Includes bibliographical references (leaves 166-168). By Wei Wang. S.M.

    Highly Interactive Web-Based Courseware

    The grand vision of educational software is that of a networked system enabling the learner to explore, discover, and construct subject matter and to communicate problems and ideas with other community members. Educational material is transformed into reusable learning objects, created collaboratively by developers, educators, and designers, preserved in a digital library, and utilized, adapted, and evolved by educators and learners. Recent advances in learning technology specified reusability and interoperability for Web-based courseware; however, greater interactivity has not yet been considered. Each interactive learning object represents an autonomous hypermedia entity: laborious to create, impossible to interlink or adapt in a graduated manner, and hard to specify. Dynamic attributes, the look and feel, and functionality are predefined. This work designs and realizes learning technology for Web-based courseware with special regard to highly interactive learning objects. The innovative aspect lies first in the multi-level, component-based technology providing a graduated structuring: components range from complex learning objects to toolkits to primitive components and scripts. Secondly, the proposed methodologies extend community support in Web-based courseware (collaboration and personalization) to the software layer: components become linkable hypermedia objects and part of the courseware repository, rated, annotated, and modified by all community members. In addition to a detailed description of the technology and of design patterns for interactive learning objects and matching Web-based courseware, the thesis clarifies the notion of interactivity in educational software by formulating combined levels of technological and symbolic interactivity, and deduces a visual scripting metaphor for transporting functionality. Further, it reviews the evolution of hypermedia and educational software to extract substantial techniques for interactive Web-based courseware. The proposed framework supports multilingual, alternative content, provides link consistency and easy maintenance, and includes state-driven online wizards, even for interactive content. The impact of high interactivity in educational software is illustrated with courseware in the Computer Graphics domain.
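
    The component model described above treats learning objects, toolkits, primitive components, and scripts as linkable repository objects that community members can rate and annotate. A small sketch of that idea, with hypothetical class and field names, is:

```python
# Sketch: components as repository objects that nest, and that community members
# can rate and annotate. Class and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Component:
    title: str
    kind: str  # e.g. "learning object", "toolkit", "primitive component", "script"
    parts: list = field(default_factory=list)        # graduated structuring (nesting)
    ratings: list = field(default_factory=list)
    annotations: list = field(default_factory=list)

    def rate(self, stars):
        self.ratings.append(stars)

    def annotate(self, note):
        self.annotations.append(note)

repository = {}  # the courseware repository: components are linkable by title

slider = Component("ParameterSlider", "primitive component")
demo = Component("BezierCurveExplorer", "learning object", parts=[slider])
repository[demo.title] = demo

demo.rate(5)
demo.annotate("Works well in the splines lecture.")
print(len(repository["BezierCurveExplorer"].parts),
      repository["BezierCurveExplorer"].ratings)
```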