26 research outputs found

    Metamodeling in EIA/CDIF - Meta-Metamodel and Metamodels

    This article introduces the EIA/CDIF family of standards for modeling information systems and for exchanging those models among computer-aided software tools from different vendors. It lays out the meta-metamodel and the standardized metamodels, which are fully depicted in a hierarchical layout and annotated with the unique identifiers of all the standardized modeling concepts. The article also stresses that EIA/CDIF has been used as the baseline in the creation of an international standard, the ISO/CDIF set of models, an ongoing project.
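    As a rough illustration of the layered modelling idea described above, the following Python sketch shows a meta-metamodel construct being used to declare metamodel concepts arranged in a hierarchy, each carrying a unique identifier. All class names, identifiers and concepts here are hypothetical assumptions and are not taken from the EIA/CDIF or ISO/CDIF standards.

```python
# Minimal illustrative sketch of a layered modelling hierarchy in the spirit of
# EIA/CDIF: a meta-metamodel construct is used to declare metamodel concepts,
# each carrying a unique identifier. All names and identifiers are hypothetical.
from dataclasses import dataclass, field


@dataclass
class MetaEntity:
    """Meta-metamodel construct: describes a kind of modelling concept."""
    identifier: str                        # unique identifier of the concept
    name: str
    parent: "MetaEntity | None" = None     # hierarchical (subtype) relationship
    attributes: dict[str, str] = field(default_factory=dict)

    def lineage(self) -> list[str]:
        """Return the concept names from the root of the hierarchy down to this one."""
        chain = [] if self.parent is None else self.parent.lineage()
        return chain + [self.name]


# A tiny metamodel fragment expressed with the meta-metamodel construct above.
root = MetaEntity("0001", "RootObject")
model_object = MetaEntity("0002", "ModelObject", parent=root)
data_entity = MetaEntity("0003", "DataEntity", parent=model_object,
                         attributes={"Name": "String"})

for concept in (root, model_object, data_entity):
    print(concept.identifier, " / ".join(concept.lineage()))
```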

    Precision Departure Release Capability (PDRC) Technology Description

    After takeoff, aircraft must merge into en route (Center) airspace traffic flows, which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time before allowing the flight to depart. In present-day operations, release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower, where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller then manages the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy, and the use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for the development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept, which uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. PDRC helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision, enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while the system was used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description; companion papers include the Final Report and a Concept of Operations.
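    To make the windows-versus-target-times point concrete, the following sketch contrasts a three-minute release window with a seconds-level target time against a fixed en route slot. The times and the merge tolerance are invented for illustration; this is not the TMA or PDRC scheduling logic.

```python
# Illustrative sketch (not the NASA PDRC algorithm) of why a seconds-level
# target time keeps a departure closer to its en route slot than a
# three-minute release window. All times and tolerances are invented.
from datetime import datetime, timedelta

scheduled_slot = datetime(2011, 1, 15, 14, 30, 0)   # slot time in the Center flow
slot_tolerance = timedelta(seconds=60)              # assumed merge tolerance

def meets_slot(actual_takeoff: datetime) -> bool:
    return abs(actual_takeoff - scheduled_slot) <= slot_tolerance

# Window-based coordination: the Local controller may release anywhere in a
# three-minute window, so the worst-case timing error is large.
window_start = scheduled_slot - timedelta(minutes=1)
window_release = window_start + timedelta(minutes=2, seconds=30)  # late in window
print("window release meets slot:", meets_slot(window_release))

# Target-time coordination with seconds-level precision keeps the error small.
target_release = scheduled_slot + timedelta(seconds=20)
print("target release meets slot:", meets_slot(target_release))
```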

    A generic model for representing software development methods.

    This thesis adopts the premise that the use of a method offers a valuable contribution to the software development process, yet many methods have not been adequately defined. The thesis is based on the hypothesis that it is possible to represent software development methods using a Generic Method Representation (GMR). The GMR comprises the three basic components of a method: the product model, the process model and the heuristic model. The elements and interrelationships of these models are investigated. In addition to a graphical representation, a method specification language (MSL) is derived to enhance the expressive and executable power of the GMR. A three-stage knowledge acquisition model, known as IFV (inspection, fabrication and verification), is also introduced to elicit method semantics from the available acquisition media. Moreover, the key benefits of metamodelling, such as method comparison, fragment dissection, method evaluation and selection (or customisation) of a method, are highlighted. An application of the GMR, namely a mapping to a practical metaCASE tool model, is also illustrated comprehensively to demonstrate the applicability of the approach.
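    The following Python sketch illustrates the general idea of a method representation built from product, process and heuristic components. The class and field names are illustrative assumptions and do not reproduce the GMR or MSL notation defined in the thesis.

```python
# Minimal sketch of a method representation with three component models
# (product, process, heuristic). All names are illustrative only.
from dataclasses import dataclass, field


@dataclass
class ProductModel:
    deliverables: list[str] = field(default_factory=list)   # what the method produces


@dataclass
class ProcessModel:
    steps: list[str] = field(default_factory=list)          # how the work proceeds


@dataclass
class HeuristicModel:
    guidelines: list[str] = field(default_factory=list)     # advice linking product and process


@dataclass
class MethodRepresentation:
    name: str
    product: ProductModel
    process: ProcessModel
    heuristics: HeuristicModel


fragment = MethodRepresentation(
    name="RequirementsFragment",
    product=ProductModel(["Requirements specification"]),
    process=ProcessModel(["Elicit requirements", "Review specification"]),
    heuristics=HeuristicModel(["Review the specification with the customer before sign-off"]),
)
print(fragment.name, len(fragment.process.steps), "steps")
```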

    Zeitgenaue Simulation gemischt virtuell-realer Prototypen (Timing-accurate simulation of mixed virtual-real prototypes)

    [no abstract]

    A Model-Driven Framework to Support Games Development: An Application to Serious Games

    Model Driven Engineering (MDE) is a software development approach that focuses on the creation of models to represent a domain, with the aim of automatically generating software artefacts using a set of software tools. This approach enables practitioners to produce variations of software by reusing the concepts in the domain model without worrying about the technical intricacies of software development. It can therefore help to increase productivity and makes software design easier for practitioners. Applying this approach to the games development domain presents an interesting proposition and could help to simplify the production of computer games.

    Computer games are interactive entertainment software designed and developed to engage users in goal-directed play. Many find computer gaming persuasive and engaging, and believe that applying game design and game technology in non-entertainment domains can create a positive impact. Computer games designed primarily for non-entertainment purposes are generally known as serious games. The development of games software, regardless of its intended purpose, is technically complex and requires specialist skills and knowledge. This is the major barrier for domain experts who intend to apply computer gaming to their respective domains. Much research is already underway to address this challenge; many efforts have chosen to use readily available commercial-off-the-shelf games, while others have attempted to develop serious games in-house or collaboratively with industry expertise. However, these approaches raise issues including the appropriateness of the serious game content and its activities, the reliability of the serious games developed and the financial cost involved. The MDE approach promises new hope to domain experts, especially those with little or no technical knowledge who intend to produce their own computer games. Using this approach, the technical aspects of games development can be hidden from the domain experts through the automated generation of software artefacts. This simplifies the production of computer games and could provide the necessary support to help non-technical domain experts realise their vision of serious gaming.

    This thesis investigates the development of a model-driven approach and technologies to aid non-technical domain experts in computer games production. It presents a novel model-driven games development framework designed for this purpose. A prototype based on the framework has been implemented to demonstrate the applicability of the solution, and the framework has been validated and evaluated through these prototypical implementations. A case study has been conducted to present a use-case scenario and to examine whether this approach can help non-technical domain experts produce computer games, and whether it would lower the barrier to adopting game-based learning as an alternative teaching and learning approach.

    The work in this thesis contributes to the area of software engineering in games. The contributions made in this research include (1) a blueprint for model-driven engineering for games development, (2) a reusable formalised approach to documenting computer game design and (3) a model of game software that is independent of the implementation platform.
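    As a minimal illustration of the model-driven idea outlined above, the sketch below transforms a small, platform-independent game model into a generated source artefact using a simple model-to-text template. The model fields and the generated output are assumptions made for the example and are not the framework described in the thesis.

```python
# Illustrative model-to-text transformation: a small "game model" is turned
# into a source-code artefact by a template. All fields and output are invented.
from dataclasses import dataclass


@dataclass
class GameModel:
    title: str
    player_goal: str
    levels: list[str]


def generate_artefact(model: GameModel) -> str:
    """Very small model-to-text transformation producing a code skeleton."""
    lines = [f"# Auto-generated skeleton for '{model.title}'",
             f"GOAL = {model.player_goal!r}",
             "LEVELS = ["]
    lines += [f"    {name!r}," for name in model.levels]
    lines.append("]")
    return "\n".join(lines)


quiz_game = GameModel(
    title="History Quiz",
    player_goal="Answer all questions correctly",
    levels=["Ancient", "Medieval", "Modern"],
)
print(generate_artefact(quiz_game))
```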

    Modellbasierter Hardware-in-the-Loop Test von eingebetteten elektronischen Systemen (Model-based hardware-in-the-loop testing of embedded electronic systems) [online]


    An approach to enacting business process models in support of the life cycle of integrated manufacturing systems

    The complexity of enterprise engineering processes requires the application of reference architectures as a means of guiding the achievement of an adequate level of business integration. This research aims to address important aspects of this requirement by associating the formalism of reference architectures with the various life cycle phases of integrated manufacturing systems (IMS) and enabling their use in addressing contemporary systems engineering issues. In pursuit of this aim, the following research activities were carried out: (1) devising a framework which supports key phases of the IMS life cycle and (2) populating part of this framework with an initial combination of architectures which can be encapsulated into a computer-aided systems engineering environment. This has led to the creation of a workbench capable of providing support for modelling, analysis, simulation, rapid prototyping, configuration and run-time operation of an IMS, based on a consistent set of models associated with the engineering processes involved. The research effort concentrated on selecting and investigating the use of appropriate formalisms which underpin a selection of architectures and tools (i.e. CIM-OSA, Petri nets, object-oriented methods and CIM-BIOSYS) by designing, implementing, applying and testing the workbench. The main contribution of this research is to demonstrate that it is possible to retain an adequate level of formalism, via computational structures and models, which extends through the IMS life cycle from a conceptual description of the system through to the actions that the system performs when operating. The underlying methodology which supports this contribution is based on enacting models of system behaviour which encode important coordination aspects of manufacturing systems. The strategy for demonstrating the incorporation of formalism into the IMS life cycle was to aggregate into a workbench knowledge of 'what' the system is expected to achieve (i.e. the 'problems' to be addressed) and 'how' the system can achieve it (i.e. possible 'solutions'). Within the workbench, such knowledge is represented through an amalgamation of business process modelling and object-oriented modelling approaches which, when adequately manipulated, can lead to business integration.
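    The following sketch illustrates what enacting a process model can mean in the simplest case: a model of activities with precedence constraints is executed by repeatedly releasing activities whose predecessors have completed. It is a generic illustration under assumed activity names, not the CIM-OSA/CIM-BIOSYS workbench described above.

```python
# Minimal sketch of enacting a process model: activities are released as soon
# as all of their predecessors have completed. Activity names are assumed.
process_model = {
    "design_cell":        [],
    "configure_robot":    ["design_cell"],
    "configure_conveyor": ["design_cell"],
    "integration_test":   ["configure_robot", "configure_conveyor"],
}

completed: set[str] = set()
while len(completed) < len(process_model):
    ready = [a for a, preds in process_model.items()
             if a not in completed and all(p in completed for p in preds)]
    if not ready:
        raise RuntimeError("process model contains a cycle")
    for activity in ready:
        print("enacting:", activity)   # a real workbench would drive the system here
        completed.add(activity)
```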

    A framework for the analysis and evaluation of enterprise models

    The purpose of this study is the development and validation of a comprehensive framework for the analysis and evaluation of enterprise models. The study starts with an extensive literature review of modelling concepts and an overview of the various reference disciplines concerned with enterprise modelling; this overview is more extensive than usual in order to accommodate readers from different backgrounds. The proposed framework is based on the distinction between the syntactic, semantic and pragmatic model aspects and is populated with evaluation criteria drawn from an extensive literature survey. In order to operationalize and empirically validate the framework, an exhaustive survey of enterprise models was conducted. From this survey, an XML database of more than twenty relatively large, publicly available enterprise models was constructed. A strong emphasis was placed on the interdisciplinary nature of this database, and models were drawn from ontology research, linguistics and analysis patterns as well as the traditional fields of data modelling, data warehousing and enterprise systems. The resultant database forms the test bed for the detailed framework-based analysis, and its public availability should constitute a useful contribution to the modelling research community. The bulk of the research is dedicated to implementing and validating specific analysis techniques to quantify the various model evaluation criteria of the framework. The aim for each of the analysis techniques is that it can, where possible, be automated and generalised to other modelling domains. The syntactic measures and analysis techniques originate largely from the disciplines of systems engineering, graph theory and computer science. Various metrics to measure model hierarchy, architecture and complexity are tested and discussed. It is found that many are not particularly useful or valid for enterprise models; hence some new measures are proposed to assist with model visualization, and an original "model signature" consisting of three key metrics is proposed. Perhaps the most significant contribution of the research lies in the development and validation of a significant number of semantic analysis techniques, drawing heavily on current developments in lexicography, linguistics and ontology research. Some novel and interesting techniques are proposed to measure, inter alia, domain coverage, model genericity, quality of documentation, perspicuity and model similarity. Model similarity in particular is explored in depth by means of various similarity and clustering algorithms as well as ways to visualize the similarity between models. Finally, a number of pragmatic analysis techniques are applied to the models. These include face validity, degree of use, authority of the model author, availability, cost, flexibility, adaptability, model currency, maturity and degree of support. This analysis relies mostly on searching for and ranking certain specific information details, often involving a degree of subjective interpretation, although more specific quantitative procedures are suggested for some of the criteria. To aid future researchers, a separate chapter lists some promising analysis techniques that were investigated but found to be problematic from a methodological perspective.

    More interestingly, this chapter also presents a very strong conceptual case for how the proposed framework and the analysis techniques associated with its various criteria can be applied to many other information systems research areas. The case is made on the grounds of the underlying isomorphism between the various research areas and illustrated by suggesting the application of the framework to evaluate web sites, algorithms, software applications, programming languages, system development methodologies and user interfaces.
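    As a small illustration of syntactic, graph-based model metrics of the kind mentioned above, the sketch below computes three simple measures (size, average degree and depth of the specialisation hierarchy) over a toy enterprise model. These three measures are generic examples chosen for illustration; they are not the specific "model signature" proposed in the thesis.

```python
# Illustrative syntactic metrics over an enterprise model treated as a graph of
# concepts and relationships. The model content and the metric choice are invented.
model_edges = [            # (source concept, target concept) relationship pairs
    ("Party", "Person"), ("Party", "Organisation"),
    ("Order", "OrderLine"), ("Order", "Party"), ("OrderLine", "Product"),
]
is_a_edges = [("Party", "Person"), ("Party", "Organisation")]  # specialisations only

concepts = {c for edge in model_edges for c in edge}
size = len(concepts)
avg_degree = 2 * len(model_edges) / size

def depth(concept: str) -> int:
    """Depth of the specialisation hierarchy rooted at this concept."""
    children = [child for parent, child in is_a_edges if parent == concept]
    return 1 + max((depth(c) for c in children), default=0)

hierarchy_depth = max(depth(c) for c in concepts)
print("signature:", (size, round(avg_degree, 2), hierarchy_depth))
```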