    Efficient Implementation of Estelle Specifications

    Efficient implementation of communication software is of critical importance for high-speed networks. We analyze performance bottlenecks in existing implementations and propose two techniques for improvement: the first exploits parallelism not only in the actions of the FSMs but also in the runtime system of the protocol stack; the second integrates adjacent layers, leading to considerable savings in inter-layer interface handling and in the number of transitions occurring in the FSMs. Both techniques are discussed in the context of the OSI upper layers and are based on protocol specification in Estelle.
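
    To make the two techniques concrete, here is a minimal Python sketch (illustrative only: the transition tables and names such as merge_layers are invented, and the paper works with Estelle-generated code rather than Python). It shows the flavour of both ideas, merging the transitions of two adjacent layers into one combined table so that a cross-layer event fires a single transition instead of crossing an inter-layer interface, and dispatching independent FSM instances in parallel:

        # Hypothetical sketch, not the paper's implementation.
        from concurrent.futures import ThreadPoolExecutor

        # Transition tables: (state, event) -> (next state, action)
        SESSION = {("idle", "S-CONNECT.req"): ("connecting", "send T-CONNECT.req")}
        TRANSPORT = {("closed", "T-CONNECT.req"): ("opening", "send CR TPDU")}

        def merge_layers(upper, lower):
            """Integrate adjacent layers: where the upper layer's action feeds an
            event of the lower layer, chain both into one combined transition."""
            merged = {}
            for (s, e), (s2, action) in upper.items():
                ev = action.removeprefix("send ")        # Python 3.9+
                for (ls, le), (ls2, laction) in lower.items():
                    if le == ev:
                        merged[((s, ls), e)] = ((s2, ls2), laction)
            return merged

        INTEGRATED = merge_layers(SESSION, TRANSPORT)

        def run_fsm(table, state, events):
            for ev in events:
                state, action = table[(state, ev)]
                print(f"-> {state}: {action}")

        # Exploit parallelism across independent FSM instances (one per connection).
        with ThreadPoolExecutor() as pool:
            for _ in range(4):
                pool.submit(run_fsm, INTEGRATED, ("idle", "closed"), ["S-CONNECT.req"])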

    Performance evaluation of automatically generated protocol implementations with Estelle

    Formal description techniques (FDTs) enable precise system descriptions through their formal syntax and semantics and form the basis for formal verification. When systems are implemented, however, the implementation is still written by hand, even though mature tools exist for generating code automatically and directly from the formal specification. The reason lies in the reputation of these tools for producing code with extremely poor performance. Yet there are hardly any quantitative performance comparisons between manual and automatically generated implementations that could support or refute this prejudice. This paper presents such a performance comparison, based on the high-performance protocol XTP and the FDT Estelle. It provides an assessment of the current state of automatic code generation from Estelle specifications in direct comparison with well-optimized hand-coded implementations. It turns out that, in the case of the complex protocol considered, the hand-coded implementation is indeed noticeably faster. This performance advantage, however, comes at the price of a very high implementation effort and the difficulty of ensuring correctness with respect to the specification. In individual use cases it can therefore still be advantageous to generate code automatically despite the performance penalty, especially since the assessment showed that automatically generated implementations sometimes perform better than expected. Moreover, unlike the already extensively optimized hand-coded implementation, the automatically generated implementation still offers considerable untapped potential for performance improvement.
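
    To make the notion of such a quantitative comparison concrete, here is a minimal throughput harness (purely illustrative: the study measured real XTP implementations, and both send functions below are invented stand-ins):

        import time

        def measure_throughput(send, n_packets=100_000, payload=b"x" * 1024):
            """Return throughput in MB/s for a given send function."""
            start = time.perf_counter()
            for _ in range(n_packets):
                send(payload)
            elapsed = time.perf_counter() - start
            return n_packets * len(payload) / elapsed / 1e6

        def hand_coded_send(payload):
            pass                      # stand-in for the hand-optimized implementation

        def generated_send(payload):
            for _ in range(5):        # stand-in overhead of a generic FSM runtime
                pass

        hand = measure_throughput(hand_coded_send)
        gen = measure_throughput(generated_send)
        print(f"hand-coded {hand:.0f} MB/s, generated {gen:.0f} MB/s, "
              f"ratio {hand / gen:.2f}x")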

    Metabolomics: a tool for studying plant biology

    In recent years, new technologies have made it possible to monitor gene expression, protein and metabolite profiles in different tissues and developmental stages. This is an emerging field in plant science, applied to diverse plant systems in order to elucidate the regulation of growth and development. The goal of plant metabolomics is to analyze, identify and quantify all low molecular weight molecules of plant organisms. The plant metabolites are extracted and analyzed using various sensitive analytical techniques, usually mass spectrometry (MS) in combination with chromatography. In order to compare the metabolomes of different plants in a high-throughput manner, a number of biological, analytical and data-processing steps have to be performed. In the work underlying this thesis, we developed a fast and robust method for routine analysis of plant metabolite patterns using gas chromatography-mass spectrometry (GC/MS). The method was designed according to Design of Experiment (DOE) to investigate factors affecting the extraction and derivatization of the metabolites from leaves of the plant Arabidopsis thaliana. The outcome of metabolic analysis by GC/MS is a complex mixture of approximately 400 overlapping peaks. Resolving (deconvoluting) overlapping peaks is time-consuming and difficult to automate, and additional processing is needed in order to compare samples. To prevent deconvolution from becoming a major bottleneck in high-throughput analyses, we developed a new semi-automated strategy using hierarchical methods for processing GC/MS data that can be applied to all samples simultaneously. The two methods together comprise baseline correction of the unprocessed MS data files, alignment, time-window determination, alternating regression and multivariate analysis, in order to detect metabolites that differ in relative concentration between samples. The developed methodology was applied to study the effects of the plant hormone GA on the metabolome, with specific emphasis on auxin levels in Arabidopsis thaliana mutants defective in GA biosynthesis and signalling. A large series of plant samples was analysed and the resulting data were processed in less than one week with minimal labour, similar to the time required for the GC/MS analyses of the samples.
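
    As an illustration of two of the processing steps listed above, the following sketch (assumed, not the thesis pipeline; it uses numpy and a synthetic one-peak chromatogram) performs a running-minimum baseline correction and aligns a sample trace to a reference by cross-correlation:

        import numpy as np

        def baseline_correct(signal, window=101):
            """Subtract a running-minimum estimate of the baseline."""
            pad = window // 2
            padded = np.pad(signal, pad, mode="edge")
            baseline = np.array([padded[i:i + window].min()
                                 for i in range(len(signal))])
            return signal - baseline

        def align(sample, reference):
            """Shift `sample` to best match `reference` (integer lag only)."""
            corr = np.correlate(sample - sample.mean(),
                                reference - reference.mean(), "full")
            lag = corr.argmax() - (len(reference) - 1)
            return np.roll(sample, -lag)

        t = np.linspace(0, 10, 1000)
        reference = np.exp(-((t - 5.0) ** 2) / 0.01)           # one Gaussian "peak"
        sample = np.exp(-((t - 5.3) ** 2) / 0.01) + 0.1 * t    # shifted peak + drift
        aligned = align(baseline_correct(sample), reference)
        print("peak index:", reference.argmax(), "vs", aligned.argmax())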

    Easing the Transition from Inspiration to Implementation: A Rapid Prototyping Platform for Wireless Medium Access Control Protocols

    Packet broadcast networks are in widespread use in modern wireless communication systems, and medium access control is a key functionality within such technologies. A substantial research effort has been, and continues to be, invested in the study of existing protocols and the development of new and specialised ones. Academic researchers are restricted in their studies by the absence of suitable wireless MAC protocol development methods. This thesis describes an environment which allows rapid prototyping and evaluation of wireless medium access control protocols. The proposed design flow allows specification of the protocol using the Specification and Description Language (SDL) formal description technique. A tool is presented to convert the SDL protocol description into a C++ model suitable for integration into both simulation and implementation environments. Simulations at various levels of abstraction are shown to be relevant at different stages of protocol design. Environments based on the Cinderella SDL simulator and the ns-2 network simulator have been developed which allow early functional verification, along with detailed and accurate performance analysis of protocols under development. A hardware platform is presented which allows implementation of protocols with flexibility in the hardware/software trade-off. Measurement facilities are integral to the hardware framework and provide a means of accurate real-world feedback on protocol performance.
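
    To give a feel for the protocol logic such a design flow targets, here is a hypothetical sketch of a tiny CSMA-style MAC sender with binary exponential backoff (hand-written Python purely for illustration; the thesis generates C++ models from SDL, and names such as CsmaMac and channel_busy are invented):

        import random

        class CsmaMac:
            """Tiny CSMA-style sender: sense the channel, back off when busy."""
            def __init__(self, max_retries=5):
                self.max_retries = max_retries

            def send(self, frame, channel_busy):
                """`channel_busy` models carrier sensing supplied by a simulator."""
                for attempt in range(self.max_retries):
                    if not channel_busy():
                        return f"transmitted {frame!r} after {attempt} backoff(s)"
                    # Binary exponential backoff: wait 0 .. 2**(attempt+1) - 1 slots;
                    # a simulator would advance virtual time by `slots` here.
                    slots = random.randrange(2 ** (attempt + 1))
                return "dropped: retry limit exceeded"

        mac = CsmaMac()
        print(mac.send("hello", channel_busy=lambda: random.random() < 0.6))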

    A graphical representation for the formal description technique Estelle

    This dissertation concerns the specification and description of complex communicating systems using Formal Description Techniques. Specifically, we propose a standard graphical representation for the Formal Description Technique Estelle and present a prototype editor based on this representation. Together they integrate the new graphical representation with existing Estelle textual tools to create a powerful graphical design technique for Estelle. The perennial popularity of graphical techniques, combined with recent advances in computer graphics hardware and software which enable their effective application in a computing environment, provides a double impetus for the development of a graphical representation for Estelle. Most importantly, a graphical technique is more easily read and understood by humans, and can better describe the complex structure and inter-relationships of components of concurrent communicating systems. Modern graphical technology also presents a number of opportunities, separate from the specification method, such as hyperlinking, multiple windows and hiding of detail, which enrich the graphical technique. The prototype editor makes use of these opportunities to provide the protocol engineer with an advanced interface which actively supports the protocol design process to improve the quality of design. The editor also implements translations between the graphical representation and the standard Estelle textual representation, on the one hand allowing the graphical interpretation to be applied to existing textual specifications, and on the other, allowing the application of existing text-based processing tools to a graphical specification description.
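
    The textual-to-graphical direction of such a translation can be sketched in a few lines. The fragment below is hypothetical and highly simplified: the regular expressions cover only a toy subset of Estelle-like text, and Graphviz DOT output merely stands in for the editor's graphics:

        import re

        spec = """
        module Sender systemprocess; ip U : UserChannel(provider); end;
        module Receiver systemprocess; ip U : UserChannel(user); end;
        """

        modules = re.findall(r"module\s+(\w+)", spec)
        channels = re.findall(r"ip\s+\w+\s*:\s*(\w+)", spec)

        dot = ["digraph estelle {"]
        dot += [f'  "{m}" [shape=box];' for m in modules]
        dot += [f'  "{m}" -> "{c}";' for m, c in zip(modules, channels)]
        dot.append("}")
        print("\n".join(dot))        # feed to Graphviz to draw the structure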

    A new framework for global data regulation

    Under the current regulatory framework for data protection, the protection of human rights writ large and the corresponding outcomes are regulated largely independently of the data and tools that both threaten those rights and are needed to protect them. This separation between tools and the outcomes they generate risks overregulation of the data and tools themselves when they are not linked to sensitive use cases. In parallel, separation risks under-regulation if the data can be collected and processed under a less restrictive framework but used to drive an outcome that requires additional sensitivity and restrictions. A new approach is needed to support differential protections based on the genuinely high-risk use cases within each sector. Here, we propose a regulatory framework designed to apply not to specific data or tools themselves, but to the outcomes and rights that are linked to the use of these data and tools in context. This framework is designed to recognize, address, and protect a broad range of human rights, including privacy, and suggests a more flexible approach to policy making that is aligned with current engineering tools and practices. We test this framework in the context of open banking and describe how current privacy-enhancing technologies and other engineering strategies can be applied in this context and in that of contact tracing applications. This approach to data protection regulation builds more effectively on existing engineering tools and protects the wide range of human rights defined by legislation and constitutions around the globe.

    Evaluation of the Statistical Machine Translation Service for Croatian-English

    Much thought has been given to the endeavour of formalizing the translation process. As a result, various approaches to machine translation (MT) have been taken. With the exception of statistical translation, all approaches require cooperation between language experts and computer science experts, and most models use various hybrid approaches. The statistical translation approach is completely language-independent, if we disregard the fact that it requires a huge parallel corpus that needs to be split into sentences and words. This paper compares and discusses state-of-the-art statistical machine translation (SMT) models and evaluation methods. Results of the statistically based Google Translate tool for Croatian-English translations are presented and a multilevel analysis is given. Three different types of texts are manually evaluated and the results are analysed with the χ2-test.
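
    The χ2 analysis of such manual evaluation results takes only a few lines. The counts below are invented purely for illustration (the text-type labels and numbers are assumptions, not the paper's data):

        from scipy.stats import chi2_contingency

        # Rows: text type; columns: sentences judged acceptable / unacceptable.
        observed = [
            [70, 30],   # text type 1 (hypothetical counts)
            [55, 45],   # text type 2
            [40, 60],   # text type 3
        ]
        chi2, p, dof, expected = chi2_contingency(observed)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
        if p < 0.05:
            print("translation quality differs significantly across text types")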

    STGT program: Ada coding and architecture lessons learned

    STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons, plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.

    An approach to enacting business process models in support of the life cycle of integrated manufacturing systems

    The complexity of enterprise engineering processes requires the application of reference architectures as a means of guiding the achievement of an adequate level of business integration. This research aims to address important aspects of this requirement by associating the formalism of reference architectures with the various life cycle phases of integrated manufacturing systems (IMS) and enabling their use in addressing contemporary system engineering issues. In pursuit of this aim, the following research activities were carried out: (1) devising a framework which supports key phases of the IMS life cycle and (2) populating part of this framework with an initial combination of architectures which can be encapsulated into a computer-aided systems engineering environment. This has led to the creation of a workbench capable of providing support for modelling, analysis, simulation, rapid prototyping, configuration and run-time operation of an IMS, based on a consistent set of models associated with the engineering processes involved. The research effort concentrated on selecting and investigating the use of appropriate formalisms which underpin a selection of architectures and tools (i.e. CIM-OSA, Petri nets, object-oriented methods and CIM-BIOSYS), by designing, implementing, applying and testing the workbench. The main contribution of this research is to demonstrate that it is possible to retain an adequate level of formalism, via computational structures and models, which extends through the IMS life cycle from a conceptual description of the system through to the actions that the system performs when operating. The underlying methodology which supports this contribution is based on enacting models of system behaviour which encode important coordination aspects of manufacturing systems; a minimal sketch of such enactment follows below. The strategy for demonstrating the incorporation of formalism into the IMS life cycle was to enable the aggregation into a workbench of knowledge of 'what' the system is expected to achieve (i.e. the 'problems' to be addressed) and 'how' the system can achieve it (i.e. the possible 'solutions'). Within the workbench, such knowledge is represented through an amalgamation of business process modelling and object-oriented modelling approaches which, when adequately manipulated, can lead to business integration.
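
    Enacting a model of system behaviour can be illustrated with a minimal Petri net interpreter (a sketch only, far simpler than the workbench described above; the place and transition names model an invented two-step machining process):

        class PetriNet:
            def __init__(self, marking):
                self.marking = dict(marking)      # place -> token count
                self.transitions = {}             # name -> (input places, output places)

            def add_transition(self, name, inputs, outputs):
                self.transitions[name] = (inputs, outputs)

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) > 0 for p in inputs)

            def fire(self, name):
                assert self.enabled(name), f"{name} is not enabled"
                inputs, outputs = self.transitions[name]
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        net = PetriNet({"part_ready": 1, "machine_free": 1})
        net.add_transition("start_machining", ["part_ready", "machine_free"], ["machining"])
        net.add_transition("finish", ["machining"], ["part_done", "machine_free"])
        net.fire("start_machining")
        net.fire("finish")
        print(net.marking)   # coordination state after both steps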

    Domain architecture: a design framework for system development and integration

    The ever-growing complexity of software systems has revealed many shortcomings in existing software engineering practices and has raised interest in architecture-driven software development. A system's architecture provides a model of the system that suppresses implementation detail, allowing the architects to concentrate on the analysis and decisions that are most critical to structuring the system to satisfy its requirements. Recently, the interests of researchers and practitioners have shifted from individual system architectures to architectures for classes of software systems which provide more general, reusable solutions to the issues of overall system organization, interoperability, and allocation of services to system components. These generic architectures, such as product line architectures and domain architectures, promote reuse and interoperability, and create a basis for cost-effective construction of high-quality systems. Our focus in this dissertation is on domain architectures as a means of development and integration of large-scale, domain-specific business software systems. Business imperatives, including flexibility, productivity, quality, and the ability to adapt to changes, have fostered demands for flexible, coherent and enterprise-wide integrated business systems. The components of such systems, developed separately or purchased off the shelf, need to cohesively form an overall computational environment for the business. The inevitable complexity of such integrated solutions and the highly demanding process of their construction, management, and evolution support require new software engineering methodologies and tools. Domain architectures, prescribing the organization of software systems in a business domain, hold a promise to serve as a foundation on which such integrated business systems can be effectively constructed. To meet the above expectations, software architectures must be properly defined, represented, and applied, which requires suitable methodologies as well as process and tool support. Despite research efforts, however, state-of-the-art methods and tools for architecture-based system development do not yet meet the practical needs of system developers. The primary focus of this dissertation is on developing methods and tools to support domain architecture engineering and on leveraging architectures to achieve improved system development and integration in the presence of increased complexity. In particular, the thesis explores issues related to the following three aspects of software technology: system complexity and software architectures as tools to alleviate complexity; domain architectures as frameworks for the construction of large-scale, flexible, enterprise-wide software systems; and architectural models and representation techniques as a basis for “good” design. The thesis presents an architectural taxonomy to help categorize and better understand architectural efforts. Furthermore, it clarifies the purpose of domain architectures and characterizes them in detail. To support the definition and application of domain architectures we have developed a method for domain architecture engineering and representation: GARM-ASPECT. GARM, the Generic Architecture Reference Model underlying the method, is a system of modeling abstractions, relations and recommendations for building representations of reference software architectures. The model's focus on reference and domain architectures determines its main distinguishing features: multiple views of architectural elements, a separate rule system to express constraints on architecture element types, and annotations such as “libraries” of patterns and “logs” of guidelines. ASPECT is an architecture description language based on GARM. It provides a normalized vocabulary for representing the skeleton of an architecture, its structural view, and establishes a framework for capturing architectural constraints. It also allows extensions of the structural view with auxiliary information, such as behavior or quality specifications. In this respect, ASPECT provides facilities for establishing relationships among different specifications and gluing them together within an overall architectural description. This design allows flexibility and adaptability of the methodology to the specifics of a domain or a family of systems. ASPECT supports the representation of reference architectures as well as individual system architectures. The practical applicability of this method has been tested through a case study in an industrial setting. The approach to architecture engineering and representation presented in this dissertation is pragmatic and oriented towards software practitioners. GARM-ASPECT, as well as the taxonomy of architectures, are of use to architects, system planners and system engineers. Beyond these practical contributions, this thesis also creates a more solid basis for exploring the applicability of architectural abstractions, the practicality of representation approaches, and the changes required to the development process in order to achieve the benefits of an architecture-driven software technology.
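
    The core GARM idea of multiple views on architectural elements plus a separate rule system can be loosely sketched as a data structure (invented names throughout; this is not ASPECT syntax):

        from dataclasses import dataclass, field

        @dataclass
        class Element:
            name: str
            etype: str                                   # e.g. "component"
            views: dict = field(default_factory=dict)    # view name -> description

        @dataclass
        class Architecture:
            elements: list = field(default_factory=list)
            rules: list = field(default_factory=list)    # callables: Element -> bool

            def check(self):
                """Return (element, rule) pairs for every violated constraint."""
                return [(e.name, r.__name__) for e in self.elements
                        for r in self.rules if not r(e)]

        def components_need_behaviour_view(e):
            return e.etype != "component" or "behaviour" in e.views

        arch = Architecture(rules=[components_need_behaviour_view])
        arch.elements.append(Element("OrderService", "component",
                                     {"structure": "...", "behaviour": "order FSM"}))
        arch.elements.append(Element("Billing", "component", {"structure": "..."}))
        print(arch.check())   # -> [('Billing', 'components_need_behaviour_view')]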