38 research outputs found

    The evolution, current status, and future direction of XML

    The Extensible Markup Language (XML) is now established as a multifaceted, open-ended markup language and continues to increase in popularity. The major players that have shaped its development include the United States government, several key corporate entities, and the World Wide Web Consortium (W3C). This paper will examine these influences on XML and will address the emergence, the current status, and the future direction of this language. In addition, it will review best practices and research that have contributed to the continued development and advancement of XML.

    Using the XML Key Management Specification (and breaking X.509 rules as you go)

    Abstract. Implementing an X.509-based public-key infrastructure requires following a complex set of rules to establish whether a public-key certificate is valid. The XML Key Management Specification (XKMS) has been developed as one way in which this implementation burden can be reduced, by moving some of the complexity from clients onto a server. In this paper we give a brief overview of the XKMS standard and describe how, in addition to the above, this system also provides us with the means to sensibly break many of the rules specified for X.509-based public-key infrastructure.
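The delegation the abstract describes can be made concrete with a small sketch. The following builds a minimal XKMS ValidateRequest document, by which a client asks the server to judge a key binding instead of walking the certificate path itself. The namespace URIs follow the W3C XKMS 2.0 and XML Signature specifications; the request id and key name are placeholders for the example.

```python
# Sketch: constructing a minimal XKMS ValidateRequest, in which a client
# delegates certificate-path validation to an XKMS server.
# The key name below is a placeholder, not a real credential.
import xml.etree.ElementTree as ET

XKMS_NS = "http://www.w3.org/2002/03/xkms#"
DSIG_NS = "http://www.w3.org/2000/09/xmldsig#"

def build_validate_request(request_id: str, key_name: str) -> str:
    """Serialize an XKMS ValidateRequest asking the server to check a key."""
    ET.register_namespace("xkms", XKMS_NS)
    ET.register_namespace("ds", DSIG_NS)
    req = ET.Element(f"{{{XKMS_NS}}}ValidateRequest", {"Id": request_id})
    query = ET.SubElement(req, f"{{{XKMS_NS}}}QueryKeyBinding")
    key_info = ET.SubElement(query, f"{{{DSIG_NS}}}KeyInfo")
    ET.SubElement(key_info, f"{{{DSIG_NS}}}KeyName").text = key_name
    # The client states which key usage it wants the server to validate.
    ET.SubElement(query, f"{{{XKMS_NS}}}KeyUsage").text = f"{XKMS_NS}Signature"
    return ET.tostring(req, encoding="unicode")

xml_doc = build_validate_request("req-001", "CN=Example User")
print(xml_doc)
```

The point of the pattern is that the client's code stays this small regardless of how complex the server-side path-validation rules are.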

    Data Ingredients: smart disclosure and open government data as complementary tools to meet policy objectives. The case of energy efficiency.

    Open government data are considered a key asset for eGovernment. One could argue that governments can also influence other types of data disclosure, as potential ingredients of innovative services. To examine this assumption, we took the example of the U.S. 'Green Button' initiative – based on the disclosure of energy consumption data to each user – and analysed 36 energy-oriented digital services reusing these and other data, in order to highlight their set of inputs. We find that apps suggesting more efficient consumption behaviour to a user also benefit from average retail electricity cost/price information; that energy efficiency 'scoring' apps also need, at least, structured and updated information on building performance; and that value-added services deriving insights from consumption data frequently rely on average energy consumption information. More generally, most of the surveyed services combine consumption data, open government data, and corporate data. When setting sector-specific agendas grounded in data disclosure, public agencies should therefore consider making available (or contributing to the availability of) all three layers of information. No widely acknowledged initiatives of energy consumption data disclosure to users are being implemented in the EU. Moreover, browsing EU data portals and the websites of public agencies, we find that other key data ingredients are not supplied (or, at least, not as open data), leaving room for possible improvements in this arena.
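The three-layer combination the abstract identifies can be sketched in a few lines: per-user consumption data (the "Green Button" ingredient), an open-data average retail price, and an average consumption benchmark. All figures below are made up for the illustration; they are not from the paper.

```python
# Illustrative sketch: combining per-user consumption data with two open-data
# "ingredients" (average price, average consumption) to build a simple
# efficiency-feedback service. All numbers are hypothetical.
AVG_PRICE_PER_KWH = 0.13      # assumed average retail electricity price (USD/kWh)
AVG_MONTHLY_KWH = 900.0       # assumed average household monthly consumption

def efficiency_report(user_kwh: float) -> dict:
    """Estimate cost and compare a user's consumption to the average."""
    return {
        "estimated_cost": round(user_kwh * AVG_PRICE_PER_KWH, 2),
        "vs_average_pct": round(100.0 * user_kwh / AVG_MONTHLY_KWH, 1),
        "above_average": user_kwh > AVG_MONTHLY_KWH,
    }

print(efficiency_report(1080.0))
```

Each field depends on a different data layer, which is the abstract's point: drop any one ingredient and the corresponding feedback disappears.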

    A metamodel to annotate knowledge based engineering codes as enterprise knowledge resources

    The encoding of Knowledge Based Engineering (KBE) software applications is becoming a prominent tool for the automation of knowledge-intensive tasks carried out using Computer Aided Design (CAD) technology. However, limitations exist on the ability to manage the engineering knowledge models embedded in these executable KBE applications, referred to here as executable knowledge models (XKMs). This research proposes a metamodel to annotate encoded KBE applications. Resulting from the annotation, XKMs become explicit knowledge resources whose content can be better accessed and managed. The attachment of metadata to data sets in enterprise repositories is a necessary step to identify and index them so they can be queried, browsed and changed. The sophistication of metadata models for these data “items” ranges from simple numerical indexing to richer representations describing their context information (e.g. author, creation date), their internal structure and their content. Current engineering data repositories, such as Product Data Management (PDM) and Product Lifecycle Management (PLM) systems, offer predefined metamodels to annotate a range of engineering data items, including CAD files and special types of documents. At present there is no metadata model specifically designed to annotate KBE codes, so an undifferentiated metadata model must be used for XKMs; in that case the only information the system retains about them is context metadata. Once an instance of the metadata is attached to an XKM, it can be used as its identifier within an enterprise data repository. The proposed metamodel contains abstract entities to annotate XKMs. The resulting descriptive model for an XKM pays attention to its internal structure and its operation at different levels of granularity. The particular design of the proposed metamodel positions it at a level of abstraction between non-executable domain knowledge models and executable KBE applications.
This design choice is made to support the use of the metadata not only as an informative model but also as an executable one. Achieving this is becoming possible through the emergence of semantic modelling standards that allow data models to be described independently of the language of implementation. Using this approach, code and metadata are generated automatically using mapping rules resulting from the semantic agreement between models and specific syntax rules. The immediate application of the developed metamodel is to annotate XKMs within PLM systems. The approach should contribute not only to systematically storing instances of XKMs but also to managing the lifecycle of the engineering knowledge encoded within them. The proposed representation provides a more comprehensive way for non-experts in KBE languages to understand the code. On this basis, changes to the metamodel can be automatically traced back to the code and vice versa. During the research, evidence was gathered from the community of KBE technology users and vendors on the need to support this research effort. In the long term, the research contributes to the use of PLM systems as a platform for engineering knowledge management. (EThOS - Electronic Theses Online Service, United Kingdom)
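The annotation idea can be illustrated with a minimal metadata record that captures both context metadata and internal structure at two levels of granularity, so a repository can index an XKM by more than its filename. The field names below are assumptions made for the example, not the thesis's actual metamodel entities.

```python
# Hedged illustration of annotating an executable KBE application (XKM)
# with both context metadata and structural metadata. Field names are
# invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class RuleAnnotation:
    name: str            # e.g. a design rule encoded in the XKM
    inputs: list[str]    # parameters the rule consumes
    outputs: list[str]   # parameters the rule produces

@dataclass
class XKMAnnotation:
    author: str          # context metadata
    created: str
    language: str        # KBE implementation language
    rules: list[RuleAnnotation] = field(default_factory=list)

    def index_terms(self) -> set[str]:
        """Terms a PLM-style repository could use to query this XKM."""
        terms = {self.author, self.language}
        for r in self.rules:
            terms.update([r.name, *r.inputs, *r.outputs])
        return terms

ann = XKMAnnotation(
    author="J. Doe", created="2015-01-01", language="AML",
    rules=[RuleAnnotation("wing_sizing", ["wing_span"], ["wing_area"])])
print(sorted(ann.index_terms()))
```

With only undifferentiated context metadata (the status quo the abstract describes), a query for "wing_span" would find nothing; the structural annotation is what makes the encoded knowledge discoverable.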

    Agent-based techniques for National Infrastructure Simulation

    Thesis (S.M.) -- Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2002. Includes bibliographical references (leaves 35-37). Modern society is dependent upon its networks of infrastructure. These networks have grown in size and complexity to become interdependent, creating within them hidden vulnerabilities. The critical nature of these infrastructures has led to the establishment of the National Infrastructure Simulation and Analysis Center (NISAC) by the United States Government. The goal of NISAC is to provide the simulation capability to understand infrastructure interdependencies, detect vulnerabilities, and provide infrastructure planning and crisis response assistance. This thesis examines recent techniques for simulation and analyzes their suitability for the national infrastructure simulation problem. Variable-based and agent-based simulation models are described and compared. The bottom-up approach of the agent-based model is found to be more suitable than the top-down approach of the variable-based model. Supercomputer and distributed (grid) computing solutions are explored; both are found to be valid solutions with complementary strengths. Software architectures for implementation, such as the traditional object-oriented approach and the web service model, are examined. Solutions to meet NISAC objectives using the agent-based simulation model, implemented with web services and a combination of hardware configurations, are proposed. by Kenny Lin. S.M.
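The bottom-up character of the agent-based approach can be shown with a toy model (an illustration, not NISAC's or the thesis's actual model): each infrastructure node is an agent that fails when enough of the nodes it depends on have failed, so cascades emerge from purely local rules. The network and threshold below are invented for the example.

```python
# Toy agent-based cascade model: local per-node failure rules produce
# emergent, system-wide interdependency effects. Purely illustrative.
def simulate_cascade(dependencies: dict[str, list[str]],
                     initially_failed: set[str],
                     threshold: float = 0.5) -> set[str]:
    """Propagate failures until the network reaches a fixed point."""
    failed = set(initially_failed)
    changed = True
    while changed:
        changed = False
        for node, deps in dependencies.items():
            if node in failed or not deps:
                continue
            down = sum(1 for d in deps if d in failed)
            if down / len(deps) >= threshold:  # local agent rule
                failed.add(node)
                changed = True
    return failed

# A power outage cascades to water and telecom, and through them to finance.
grid = {
    "power": [],
    "water": ["power"],
    "telecom": ["power"],
    "finance": ["telecom", "water"],
}
print(simulate_cascade(grid, {"power"}))
```

No equation describes the whole system here; the global outcome is only discovered by running the agents, which is the essence of the bottom-up approach the thesis favours.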

    The Development of a graduate course on identity management for the Department of Networking, Security, and Systems Administration

    Digital identities are being utilized more than ever as a means to authenticate computer users in order to control access to systems, web services, and networks. To maintain these digital identities, administrators turn to Identity Management solutions to offer protection for users, business partners, and networks. This paper proposes an analysis of Identity Management in the form of a ten-week graduate-level course of study for the Networking, Security, and Systems Administration department at Rochester Institute of Technology. The course is designed for this department because of its emphasis on securing, protecting, and managing the identities of users within and across networks. Many of the security-related courses offered by the department focus primarily on security within enterprises; Identity Management, a topic that is becoming more popular within enterprises each day, would therefore complement these courses. Students who enroll in this course will be better equipped to satisfy the needs of modern enterprises when they graduate, because they will have a better understanding of how to address security issues that involve managing user identities across networks, systems, and enterprises. The course will focus on several aspects of Identity Management and its use in enterprises today: the frameworks of Identity Management, for instance the Liberty Identity Federation Framework and OASIS SAML 2.0; the Identity Management models; and some of the major Identity Management solutions in use today, such as Liberty Alliance, Microsoft Passport, and Shibboleth. The course will also provide the opportunity to gain hands-on experience with exemplar technologies used in laboratory investigations.

    A framework for promoting interoperability in a global electronic market-space

    The primary contributions to the area of electronic business integration propounded by this thesis are (in no particular order):
    - A novel examination of global Business-to-Business (B2B) interoperability in terms of a "multiplicity paradox" and of a "global electronic market-space" from a Complex Systems Science perspective.
    - A framework for an integrated, global electronic market-space, based on a hierarchical, incremental, minimalist-business-pattern approach. A Web Services SOA forms the basis of application-to-application integration within the framework. The framework is founded on a comprehensive study of existing technologies, standards and models for secure interoperability and the SOA paradigm. The Complex Systems Science concepts of "predictable structure" and "structural complexity" are used consistently throughout the progressive formulation of the framework.
    - A model for a global message handler (including a standards-based message format) which obviates the common problems implicit in standard SOAP-RPC. It is formulated around the "standardized, common, abstract application interface" critical success factor, deduced from examining existing models. The model can be used in any collaboration context.
    - An open-standards-based security model for the global message handler.
    Conceptually, the framework comprises the following:
    - An interoperable standardized message format: a standardized SOAP envelope with standardized attachments (8-bit binary MIME-serialized XOP packages).
    - An interoperable standardized message-delivery infrastructure encompassing an RPC-invoked message handler: a Web service, operating in synchronous and/or asynchronous mode, which relays attachments to service endpoints.
    - A business information processing infrastructure comprising: a standardized generic minimalist business pattern (simple buying/selling), with global pre-specifications for business processes (for example, placing an order); standardized specific atomic business activities (e.g. completing an order form); a standardized document set (including, e.g., an order form) based on standardized metadata (common nomenclature and common semantics used in XSDs, e.g. the order form); the standardized corresponding choreography for atomic activities (e.g. acknowledgement of receipt of an order form); and service endpoints (based on standardized programming interfaces and virtual methods with customized implementations).
    Theoretical Computing, PhD (Information Systems)
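The "standardized, common, abstract application interface" idea can be sketched as follows: every standardized business document travels through one generic relay operation instead of a per-operation RPC signature, so adding a new document type requires no new interface. The names and handler below are invented for the illustration; they are not the thesis's actual framework.

```python
# Hedged sketch of a generic message handler: one abstract relay interface
# routes standardized document attachments to registered service endpoints,
# avoiding per-operation SOAP-RPC signatures. Names are illustrative.
from typing import Callable

# Service endpoints register a handler per standardized document type.
ENDPOINTS: dict[str, Callable[[bytes], str]] = {}

def register(doc_type: str, handler: Callable[[bytes], str]) -> None:
    ENDPOINTS[doc_type] = handler

def relay(doc_type: str, attachment: bytes) -> str:
    """Generic message handler: route a binary attachment to its endpoint."""
    if doc_type not in ENDPOINTS:
        return "Fault: unknown document type"
    return ENDPOINTS[doc_type](attachment)

register("order-form", lambda payload: f"order accepted ({len(payload)} bytes)")
print(relay("order-form", b"<order><item>widget</item></order>"))
```

In the framework proper the attachment would be an 8-bit binary MIME-serialized XOP package carried in a standardized SOAP envelope; the sketch keeps only the routing idea.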

    Web Services Security

    Note: supplementary material is available in a separate file.

    A message-level security approach for RESTful services

    Over the past ten years, Web Services have positioned themselves as one of the leading distributed technologies. The technology, supported by major IT companies, offers specifications addressing many challenges of a distributed environment, such as strong interface and message contracts, service discovery, reliable message exchange and advanced security mechanisms. On the other hand, all these specifications have made Web Services very complex, and the industry is struggling to implement them in a standardized manner. REST-based services, also known as RESTful services, are based on pure HTTP and have risen as competitors to Web Services, mainly because of their simplicity. They are now being adopted by the majority of the big industry corporations, including Microsoft, Yahoo and Google, who have deprecated or passed on Web Services in favor of RESTful services. However, RESTful services have been criticized for lacking functionality offered by Web Services, especially message-level security. Since security is an important functionality which may tip the scale in a negative direction for REST-based services, this thesis proposes a prototype solution for message-level security for RESTful services. The solution is for the most part technical and utilizes well-known, cross-platform mechanisms composed together, while a smaller part of the solution discusses a non-technical approach to token distribution. During the development of the prototype, much of the focus was on adapting the solution to REST principles and guidelines, such as multi-format support (XML or JSON) and lightweight, human-readable messages.
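The distinction between transport-level and message-level security can be made concrete with a short sketch: the JSON body carries its own integrity tag, so it can be verified end-to-end even if TLS terminates at an intermediary. This is an illustration of the general idea using an HMAC over the body, not the thesis's actual prototype; key distribution is left out, as the abstract notes it is treated separately.

```python
# Illustrative message-level security for a RESTful JSON message: the body
# carries an HMAC-SHA256 tag that any intermediary hop cannot silently alter.
# The shared secret is a placeholder; real systems need key distribution.
import hashlib, hmac, json

SECRET = b"shared-secret-key"   # placeholder secret

def sign_message(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": tag}

def verify_message(message: dict) -> bool:
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["signature"])

msg = sign_message({"resource": "/orders/42", "action": "update"})
assert verify_message(msg)
msg["payload"]["action"] = "delete"   # tampering breaks verification
assert not verify_message(msg)
```

The same envelope could serialize to XML instead of JSON, matching the multi-format support the thesis calls for.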

    Critical Investigation of Virtual Universities: Applying the UK Structure to Saudi Arabia

    The purpose of this study was to investigate the feasibility, practicality and desirability of establishing a virtual university (VU) using new technologies in Saudi Arabia and to explore how to apply the existing VU frameworks to the Saudi Arabian education system. This is desirable in order to accommodate the rapid growth in the number of secondary school graduates, and is regarded as one of the most important challenges currently facing Saudi Universities. The study traces the origins of VUs in the UK and Europe, then examines the tools, forums and methods in use, focusing on the main service-oriented architecture and the Simple Object Access Protocol framework. Primary data were gathered by means of two sets of questionnaires, to explore the appetite for a virtual university in Saudi Arabia and to investigate the use of virtual learning in the UK. Three UK universities that strongly promote virtual learning (The Open University, the International Virtual University and Oxford University) were also researched online, providing an additional edge to the wider research on other universities. The investigation was motivated by a desire to produce a model that would widen learning opportunities for those who otherwise have no access to formal education in Saudi Arabia. The result is a virtual university model designed and developed to be a safe and secure Web-based educational system, providing online education for all, regardless of geographical position or time of day. Data were gathered mainly from secondary sources, such as journals, conference reports and books. A literature review critically assessed several technologies and protocols, and a critical comparison of Web services was conducted. Evidence from the questionnaire, the literature review and informal discussions led this researcher to pursue further the concepts of messaging technology and distributed communication, focusing on implementing JMS and a message-passing system. 
As a result, a chat application which utilises the publish-and-subscribe messaging model, together with a translator, are presented and recommended as essential elements in achieving virtualisation in higher education. The thesis proposes a third-generation virtual university utilising cloud computing, offering integrated services to learners and including different types of online learning materials, specialized virtual centres for the development of educational courses, library and administrative functions, an interactive environment and online collaboration.