A Unified Format for Language Documents
We have analyzed a substantial number of language documentation
artifacts, including language standards, language specifications,
language reference manuals, as well as internal documents of
standardization bodies. We have reverse-engineered their intended
internal structure, and compared the results. The Language Document
Format (LDF) was developed specifically to support the
documentation domain. We have also integrated LDF into an
engineering discipline for language documents including tool
support, for example, for rendering language documents, extracting
grammars and samples, and migrating existing documents into LDF. The
definition of LDF, tool support for LDF, and LDF applications are
freely available through SourceForge.
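The grammar-extraction tooling mentioned above can be illustrated with a short sketch. The rule pattern and sample document below are hypothetical stand-ins, not the actual LDF format or tools:

```python
import re

# Hypothetical sketch of extracting BNF-style grammar rules from a language
# document. The "name ::= body" rule pattern is illustrative, not LDF's own.
RULE = re.compile(r"^\s*(\w+)\s*::=\s*(.+)$")

def extract_grammar(document: str) -> dict:
    """Collect 'name ::= body' lines into a nonterminal -> productions map."""
    rules = {}
    for line in document.splitlines():
        match = RULE.match(line)
        if match:
            name, body = match.groups()
            rules.setdefault(name, []).append(body.strip())
    return rules

sample = """
A statement is one of the following:
  stmt ::= assignment
  stmt ::= "if" expr "then" stmt
  assignment ::= name "=" expr
"""
print(extract_grammar(sample))
```

A real extractor would also need to recognise the document's markup conventions (font changes, numbered productions), which is exactly the reverse-engineering the abstract describes.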
Automated Runtime Testing of Web Services
Service-oriented computing (SOC) is a relatively new paradigm for developing software applications through the composition of software units called services. With services, software is no longer owned but offered remotely, within or across organisational borders. Currently, the dominant technology for implementing services is that of Web services. Since service requestors do not usually have access to the implementation source code, from their perspective, services are offered as black boxes. However, requestors need to verify first that provided services are trustworthy and implemented correctly before they are integrated into their own business-critical systems. The verification and testing of remote, third-party services involve unique considerations, since testing must be performed in a black-box manner and at runtime.
Addressing the aforementioned concerns, the research work described in this thesis investigates the feasibility of testing Web services for functional correctness, especially at runtime. The aim is to introduce rigour and automation to the testing process, so that service requestors can verify Web services with correctness guarantees and with the aid of tools. Thus, formal methods are utilised to specify the functionality of Web services unambiguously, so that they are amenable to automated and systematic testing. The well-studied stream X-machine (SXM) formalism has been selected as suitable for modelling both the dynamic behaviour and static data of Web services, while a proven testing method associated with SXMs is used to derive test sets that can verify the correctness of the implementations.
This research concentrates on testing stateful Web services, in which the presence of state makes their behaviour more complex and more difficult to specify and test. The nature of Web service state, its effect on service behaviour, and its implications for service modelling and testing are investigated. In addition, comprehensive techniques are described for deriving a stream X-machine specification of a Web service, and for subsequently testing its implementation for equivalence to the specification. Then, a collaborative approach that makes third-party Web service verification and validation possible is proposed, in which the service provider is required to supply an SXM specification of the service functionality along with the standard WSDL description of its interface. Furthermore, techniques are proposed for service providers to include information that grounds the abstract SXM specification to the concrete Web service implementation. Having these descriptions available, it is possible to automate at runtime not only test set generation but also test case execution on Web services. A tool has been developed as part of this work, which extends an existing SXM-based testing tool (JSXM). The tool supports the tester activities, consisting of generation of abstract test cases from the SXM specification and their execution on the Web service under test using the supplied grounding information. Practical Web service examples are also used throughout the thesis to demonstrate the proposed techniques.
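The stream X-machine idea can be sketched compactly: a finite control with internal memory, where each transition is labelled by a processing function that consumes one input symbol and emits one output while possibly updating the memory. The toy session service below is a hypothetical illustration of the formalism, not the thesis's JSXM tooling:

```python
# Minimal sketch of a stream X-machine: a finite control with memory, where
# each transition is labelled by a processing function that consumes one input
# symbol and emits one output. The toy session service is hypothetical.

class StreamXMachine:
    def __init__(self, initial_state, initial_memory, transitions):
        self.state = initial_state
        self.memory = initial_memory
        self.transitions = transitions  # {(state, function name): next state}

    def run(self, inputs, functions):
        """Consume an input stream and produce the corresponding output stream."""
        outputs = []
        for symbol in inputs:
            for (state, fname), next_state in self.transitions.items():
                if state != self.state:
                    continue
                result = functions[fname](self.memory, symbol)
                if result is None:  # processing function not applicable
                    continue
                output, self.memory = result
                outputs.append(output)
                self.state = next_state
                break
        return outputs

# Hypothetical processing functions for a toy session service.
def do_login(memory, symbol):
    if symbol[0] == "login" and symbol[1] == memory["password"]:
        return "welcome", memory
    return None

def do_logout(memory, symbol):
    if symbol[0] == "logout":
        return "bye", memory
    return None

FUNCTIONS = {"do_login": do_login, "do_logout": do_logout}
TRANSITIONS = {("out", "do_login"): "in", ("in", "do_logout"): "out"}

sxm = StreamXMachine("out", {"password": "secret"}, TRANSITIONS)
print(sxm.run([("login", "secret"), ("logout", None)], FUNCTIONS))  # ['welcome', 'bye']
```

Test generation in the SXM method then amounts to deriving input sequences that exercise every transition and distinguish every state, which is what makes the formalism attractive for black-box testing.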
Software documentation
This thesis report is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Science and Engineering, 2006. Cataloged from PDF version of thesis report. Includes bibliographical references (page 92).
The main objective of my thesis is to generate a user manual that is highly comprehensible and well structured, and that acts as an effective navigator.
Documentation is primarily required for effective communication among the different members of a software development team, such as designers of finer-grained components, builders of interfacing systems, implementers, testers, performance engineers, technical managers, analysts, and quality specialists. Developing truly comprehensive documentation requires observing certain conventions; those conventions and rules have been highlighted extensively.
There are different types of documentation based on the requirements of each individual associated with the software development life cycle; design, code, user, architectural, trade study, and marketing documentation are a few to mention.
However, my focus area is user documentation. Unlike code documents, user documents are usually far divorced from the source code of the program, and instead simply describe how it is used. XML and DocBook are used for this purpose: DocBook simply provides a framework, and all presentation issues are devolved to style sheets.
Tahmina Zaman Khan, B. Computer Science and Engineering
From Relations to XML: Cleaning, Integrating and Securing Data
While relational databases are still the preferred approach for storing data, XML is emerging
as the primary standard for representing and exchanging data. Consequently, it has
been increasingly important to provide a uniform XML interface to various data sources (integration), and critical to protect sensitive and confidential information in XML data (access control). Moreover, it is preferable to first detect and repair the inconsistencies in
the data to avoid the propagation of errors to other data processing steps. In response to
these challenges, this thesis presents an integrated framework for cleaning, integrating and
securing data.
The framework contains three parts. First, the data cleaning sub-framework makes
use of a new class of constraints specially designed for improving data quality, referred
to as conditional functional dependencies (CFDs), to detect and remove inconsistencies in
relational data. Both batch and incremental techniques are developed for efficiently detecting CFD violations using SQL and for repairing them based on a cost model. The cleaned relational
data, together with other non-XML data, is then converted to XML format by using
widely deployed XML publishing facilities. Second, the data integration sub-framework
uses a novel formalism, XML integration grammars (XIGs), to integrate multi-source XML
data which is either native or published from traditional databases. XIGs automatically
support conformance to a target DTD, and allow one to build a large, complex integration
via composition of component XIGs. To efficiently materialize the integrated data, algorithms
are developed for merging XML queries in XIGs and for scheduling them. Third, to
protect sensitive information in the integrated XML data, the data security sub-framework
allows users to access the data only through authorized views. User queries posed on these
views need to be rewritten into equivalent queries on the underlying document to avoid the
prohibitive cost of materializing and maintaining a large number of views. Two algorithms
are proposed to support virtual XML views: a rewriting algorithm that characterizes the
rewritten queries as a new form of automata and an evaluation algorithm to execute the
automata-represented queries. They allow the security sub-framework to answer queries
on views in linear time.
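The flavour of CFD-based detection in the first sub-framework can be sketched as follows. This in-memory check stands in for the SQL-based detection described above, and the sample CFD ("when the country code is 44, zip determines street") and the data are hypothetical:

```python
from collections import defaultdict

# Hedged sketch of conditional functional dependency (CFD) violation
# detection over in-memory tuples; the thesis does this with generated SQL.
def cfd_violations(rows, pattern, lhs, rhs):
    """Find groups of tuples that match the pattern tableau, agree on the
    LHS attributes, but disagree on the RHS attribute (a CFD violation)."""
    groups = defaultdict(set)
    for row in rows:
        if all(row[attr] == val for attr, val in pattern.items()):
            key = tuple(row[attr] for attr in lhs)
            groups[key].add(row[rhs])
    return {key: vals for key, vals in groups.items() if len(vals) > 1}

# Hypothetical customer tuples; the CFD applies only where country is "44".
rows = [
    {"country": "44", "zip": "EH4 1DT", "street": "Mayfield"},
    {"country": "44", "zip": "EH4 1DT", "street": "Crichton"},   # conflict
    {"country": "01", "zip": "EH4 1DT", "street": "Mountain Ave"},  # not matched
]
print(cfd_violations(rows, {"country": "44"}, ["zip"], "street"))
```

Unlike a plain functional dependency, the pattern tableau restricts the constraint to a subset of the data, which is what makes CFDs suited to data-cleaning rules.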
Using both relational and XML technologies, this framework provides a uniform approach
to clean, integrate and secure data. The algorithms and techniques in the framework
have been implemented, and the experimental study verifies their effectiveness and efficiency.
Betsy - A BPEL Engine Test System
More than five years have passed since the final release of the long-desired OASIS standard of a process language for web service orchestration, the Web Services Business Process Execution Language (BPEL). The aim of this standard is to establish a universally accepted orchestration language that forms a core part of current service-oriented architectures and, because of standardisation, avoids vendor lock-in. High expectations, in academia and practice alike, have been set on it. By now, several fully conformant and highly scalable engines should have arrived in the market. The perception of many, however, is that standard conformance in current engines is far from given. It is our aim to shed light on this situation. In this study, we present the tool betsy, a BPEL Engine Test System that allows for a fully-automatic assessment of the standard conformance of a given BPEL engine. We use it to examine the five most important open source BPEL engines available today. Betsy comes with a large set of engine-independent conformance test cases for assessing BPEL standard conformance. This enables us to give a view of the state of the art in BPEL support.
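The core of such an engine test system can be sketched as a deploy-invoke-compare loop over engine-independent test cases. The engine interface and the trivial in-memory engine below are hypothetical stand-ins, not betsy's actual API:

```python
# Hedged sketch of an engine-independent conformance loop in the spirit of
# betsy. The Engine interface and EchoEngine are hypothetical stand-ins;
# a real adapter would deploy BPEL processes to an actual engine.

class EchoEngine:
    """Toy 'engine' that deploys processes as plain Python callables."""
    def __init__(self):
        self.processes = {}

    def deploy(self, name, process):
        self.processes[name] = process

    def invoke(self, name, message):
        return self.processes[name](message)

    def undeploy(self, name):
        del self.processes[name]

def run_conformance_suite(engine, test_cases):
    """Each case: (name, process, input message, output required by the spec).
    Returns a conformance verdict per test case."""
    verdicts = {}
    for name, process, message, expected in test_cases:
        engine.deploy(name, process)
        verdicts[name] = engine.invoke(name, message) == expected
        engine.undeploy(name)
    return verdicts

cases = [
    ("assign-copy", lambda msg: msg, "hello", "hello"),        # conformant
    ("if-branch", lambda msg: msg.upper(), "hello", "olleh"),  # deviates
]
print(run_conformance_suite(EchoEngine(), cases))  # {'assign-copy': True, 'if-branch': False}
```

Keeping the test cases engine-independent and pushing all engine specifics into per-engine adapters is what allows the same suite to rank several engines against the same standard.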
Managing Next Generation Networks (NGNs) based on the Service-Oriented Architecture (SOA). Design, Development and testing of a message-based Network Management platform for the integration of heterogeneous management systems.
Next Generation Networks (NGNs) aim to provide a unified network
infrastructure to offer multimedia data and telecommunication services
through IP convergence. NGNs utilize multiple broadband, QoS-enabled
transport technologies, creating a converged packet-switched network
infrastructure, where service-related functions are separated from the
transport functions. This requires significant changes in the way networks are managed, in order to handle the complexity and heterogeneity of NGNs.
This thesis proposes a Service Oriented Architecture (SOA) based
management framework that integrates heterogeneous management
systems in a loosely coupled manner. The key benefit of the proposed
management architecture is the reduction of the complexity through
service and data integration. A network management middleware layer that merges low-level management functionality with higher-level management operations is proposed to resolve the problem of heterogeneity.
A prototype was implemented using Web Services and a testbed was
developed using trouble ticket systems as the management application to
demonstrate the functionality of the proposed framework. Test results
show the correct functioning of the system. The thesis also concludes that the proposed framework fulfils the principles behind the SOA philosophy.
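The middleware idea of merging low-level functionality behind higher-level management operations can be sketched as a facade over heterogeneous backends. The backend classes and the single "list incidents" operation below are hypothetical illustrations, not the thesis's Web Services prototype:

```python
# Hedged sketch of a middleware facade exposing one high-level management
# operation over heterogeneous low-level systems. Backends are hypothetical.

class SnmpBackend:
    def get_alarms(self):
        return [{"source": "router-1", "severity": "major"}]

class LegacyTicketSystem:
    def open_tickets(self):
        return [{"source": "switch-7", "severity": "minor"}]

class ManagementMiddleware:
    """Merges per-system data into one loosely coupled management view."""
    def __init__(self, backends):
        self.backends = backends

    def list_incidents(self):
        incidents = []
        for backend in self.backends:
            # Adapt each backend's native call to the common operation.
            if hasattr(backend, "get_alarms"):
                incidents.extend(backend.get_alarms())
            elif hasattr(backend, "open_tickets"):
                incidents.extend(backend.open_tickets())
        return incidents

mw = ManagementMiddleware([SnmpBackend(), LegacyTicketSystem()])
print(mw.list_incidents())
```

In the actual framework the adaptation happens over message-based Web Services rather than in-process calls, but the integration principle is the same: callers see one operation, the middleware hides the heterogeneity.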
Service-Oriented Business Processes in the Modelling of Maintenance Functions (Palvelukeskeiset liiketoimintaprosessit käynnissäpidon toimintojen mallintamisessa)
Service-Oriented Architecture (SOA) is a paradigm for modeling the interaction of different parties in a distributed system. In SOA, a high abstraction level leads to platform-independent interoperability. Moreover, different parties are only loosely coupled to each other. As a result of these, SOA is a scalable and flexible architecture.
As industrial automation systems are typically inflexible and expensive to install or to modify, it would be beneficial to have all devices interact in the SOA manner. However, current technologies to implement a SOA are problematic from the devices' point of view. The technologies require a lot of computational resources, and they also lack support for hard real-time functions. Work has been done to overcome these challenges, but a hard real-time capable SOA in particular cannot currently be implemented.
Despite their limitations, current SOA technologies can be used for several functions of industrial plants. In this study, service-oriented solutions are created for the estimation of environmental footprints and for condition monitoring. The solutions are modeled as diagrams using a standard graphical notation after which the diagrams are converted to an executable language.
Both implementations show the efficiency of the selected modeling method. The principles of SOA enable the flexible reuse of different resources in different applications, which saves work. A standard structured data format was used in both solutions, which facilitates integration. As modern applications have built-in support for the format, a solution designer can concentrate on data contents at a high level. Compatibility problems were also encountered, but they were overcome using wrapper services; there were other integration problems with the technologies used as well. Despite the problems, graphical modeling saves time compared with textual methods of modeling communication. It was also recognized that careful design is required in distributed systems to avoid performance problems.
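The wrapper-service fix for the compatibility problems mentioned above can be sketched as a thin adapter. Both interfaces below are hypothetical, chosen only to illustrate the pattern:

```python
# Hedged sketch of a wrapper service adapting an incompatible interface to
# the structured format the rest of the system expects. Both interfaces are
# hypothetical illustrations of the adapter pattern.

class LegacySensorService:
    """Returns a temperature reading as a semicolon-separated string."""
    def read(self):
        return "temp;23.5;celsius"

class SensorWrapper:
    """Wraps the legacy service behind the structured interface used
    elsewhere in the service-oriented solution."""
    def __init__(self, legacy):
        self.legacy = legacy

    def read(self):
        name, value, unit = self.legacy.read().split(";")
        return {"measurement": name, "value": float(value), "unit": unit}

wrapped = SensorWrapper(LegacySensorService())
print(wrapped.read())  # {'measurement': 'temp', 'value': 23.5, 'unit': 'celsius'}
```

Because the wrapper exposes the same interface as every other service, the incompatible component can be composed into the process models without changing the consumers.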