
    Generating collaborative systems for digital libraries: A model-driven approach

    This is an open access article shared under a Creative Commons Attribution 3.0 Licence (http://creativecommons.org/licenses/by/3.0/). Copyright © 2010 The Authors. The design and development of a digital library involves different stakeholders, such as information architects, librarians, and domain experts, who need to agree on a common language to describe, discuss, and negotiate the services the library has to offer. To this end, high-level, language-neutral models have to be devised. Metamodeling techniques favor the definition of domain-specific visual languages through which stakeholders can share their views and directly manipulate representations of the domain entities. This paper describes CRADLE (Cooperative-Relational Approach to Digital Library Environments), a metamodel-based framework and visual language for the definition of notions and services related to the development of digital libraries. A collection of tools allows the automatic generation of several services, defined with the CRADLE visual language, and of the graphical user interfaces that provide access to them for the end user. The effectiveness of the approach is illustrated by presenting digital libraries generated with CRADLE, and the CRADLE environment has been evaluated using the cognitive dimensions framework.

    MDA-Based Reverse Engineering


    The Application of Polynomial Response Surface and Polynomial Chaos Expansion Metamodels within an Augmented Reality Conceptual Design Environment

    The engineering design process consists of many stages. In the conceptual phase, potential designs are generated and evaluated without considering specifics. Winning concepts then advance to the detail design and high-fidelity simulation stages. At this point in the process, very accurate representations are made for each design and are then subjected to rigorous analysis. With the advancement of computer technology, these last two phases have been very well served by the software community. Engineering software such as computer-aided design (CAD), finite element analysis (FEA), and computational fluid dynamics (CFD) has become an inseparable part of the design process for many engineered products and processes. Conceptual design tools, on the other hand, have not undergone this type of advancement; much of the work is still done with little to no digital technology. Detail-oriented tools require a significant amount of time and training to use effectively. This investment is considered worthwhile when high-fidelity models are needed. However, conceptual design has no need for this level of detail. Instead, rapid concept generation and evaluation are the primary goals. Considering the lack of adequate tools to suit these needs, new software was created. This thesis discusses the development of that conceptual design application. Traditional design tools rely on a two-dimensional mouse to perform three-dimensional actions. While many designers have become familiar with this approach, it is not intuitive to an inexperienced user. In order to enhance the usability of the developed application, a new interaction method was applied. Augmented reality (AR) is a developing research area that combines virtual elements with the real world. This capability was used to create a three-dimensional interface for the engineering design application. Using specially tracked interface objects, the user's hands become the primary method of interaction.
Within this AR environment, users are able to perform many of the basic actions available within a CAD system, such as object manipulation, editing, and assembly. The same design environment also provides real-time assessment data. Calculations for center of gravity and wheel loading can be done with the click of a few buttons. Results are displayed to the user in the AR scene. In order to support the quantitative analysis tools necessary for conceptual design, additional research was done in the area of metamodeling. Metamodels are capable of providing approximations for more complex analyses. In the case of the wheel loading calculation, the approximation takes the place of a time-consuming FEA simulation. Two different metamodeling techniques were studied in this thesis: polynomial response surface (PRS) and polynomial chaos expansion (PCE). While only the wheel loading case study was included in the developed application, an additional design problem was analyzed to assess the capabilities of both methods for conceptual design. In the second study, the maximum stresses and displacements within the support frame of a bucket truck were modeled. The source data for building the approximations was generated via an FEA simulation of digital mockups, since no legacy data was available. With this information, experimental models were constructed by varying several factors, including the distribution of source and test data, the number of input trials, the inclusion of interaction effects, and the addition of third-order terms. Comparisons were also drawn between the two metamodeling techniques. For the wheel loading models, third-order models with interaction effects provided a good fit of the data (root mean square error of less than 10%) with as few as thirty input data points. With minimal source data, however, second-order models and those without interaction effects outperformed their third-order counterparts.
The PRS and PCE methods performed almost equivalently with sufficient source data. Differences began to appear in the twenty-trial case. PRS was more suited to wider distributions of data. The PCE technique better handled smaller distributions and extrapolation to larger test data. The support frame problem represented a more difficult analysis with non-linear responses. While initial third-order results from the PCE models were better than those for PRS, both had significantly higher error than in the previous case study. However, with simpler second-order models and sufficient input data (more than thirty trials), adequate approximation results were achieved. The less complex responses had error around 10%, and the error of the model predictions for the non-linear response was reduced to around 20%. These results demonstrate that useful approximations can be constructed from minimal data. Such models, despite the uncertainty involved, will be able to provide designers with helpful information at the conceptual stage of a design process.
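    A second-order PRS model of the kind examined above can be sketched in a few lines; the two input factors, the quadratic test response, and the thirty-trial sample size are invented stand-ins (the thesis fits its surfaces to FEA data, not to a synthetic function):

```python
import numpy as np

def prs_design_matrix(X):
    """Second-order polynomial basis with an interaction term:
    [1, x1, x2, x1^2, x2^2, x1*x2] for two input factors."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def fit_prs(X, y):
    """Least-squares fit of the polynomial coefficients."""
    beta, *_ = np.linalg.lstsq(prs_design_matrix(X), y, rcond=None)
    return beta

def predict_prs(beta, X):
    return prs_design_matrix(X) @ beta

# Hypothetical source data standing in for FEA results: a smooth
# response with an interaction effect, plus small noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))          # thirty input trials
y = 2 + X[:, 0] - 3 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1] \
    + 0.01 * rng.standard_normal(30)

beta = fit_prs(X, y)

# Evaluate on held-out test points against the noise-free response.
X_test = rng.uniform(-1, 1, size=(100, 2))
y_true = 2 + X_test[:, 0] - 3 * X_test[:, 1] + 0.5 * X_test[:, 0] * X_test[:, 1]
rmse = np.sqrt(np.mean((predict_prs(beta, X_test) - y_true) ** 2))
print(f"RMSE: {rmse:.4f}")
```

    A third-order variant with interaction effects, as studied in the thesis, simply extends the basis with cubic and mixed columns at the cost of more coefficients to estimate from the same source data.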

    An Active Pattern Infrastructure for Domain-Specific Languages

    Tool support for design patterns is a critically important area of computer-aided software engineering. With the proliferation of Domain-Specific Modeling Languages (DSMLs), adapting the notion of design patterns appears to be a promising direction of research. This paper introduces a new approach to DSML patterns, namely, the Active Model Pattern infrastructure. In this framework, not only is the traditional insertion of predefined partial models supported, but also interactive, localized design-time manipulation of models. Optionally, the infrastructure can be adapted to handle transactional tracing information as well as transactional undo and redo operations. Possible realizations of the framework are also discussed and compared.

    Cost-Benefit Analysis of Error Reduction for Complex Systems

    The cost-benefit tradeoff of analysis fidelity in complex systems analysis has been posed as an optimization problem, providing quantitative guidance for investment in improving analysis. A nonlinear constrained optimizer to solve this problem has been integrated into an error tradeoff environment, providing an intuitive tool for the decision maker to consider investment in fidelity at the same time as design decisions. An example demonstrates the efficacy of the improved interface in enabling fidelity-improvement investment decisions.
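    The abstract does not spell out the paper's formulation, but the general shape of such a cost-benefit optimization can be sketched as follows; the per-analysis error model, the exponential decay law, the unit costs, and the error budget are all hypothetical placeholders:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative setup (not the paper's): each analysis i has a baseline
# error contribution e0_i that shrinks with fidelity investment x_i,
# purchased at unit cost c_i.
e0 = np.array([0.10, 0.20, 0.15])   # baseline error contributions (assumed)
c = np.array([1.0, 3.0, 2.0])       # cost per unit of fidelity investment

def total_cost(x):
    return c @ x

def total_error(x):
    # assumed model: error contribution decays exponentially with investment
    return np.sum(e0 * np.exp(-x))

target = 0.10  # required system-level error budget (assumed)

# Minimize investment cost subject to meeting the error budget.
res = minimize(
    total_cost,
    x0=np.ones(3),
    constraints=[{"type": "ineq", "fun": lambda x: target - total_error(x)}],
    bounds=[(0, None)] * 3,
    method="SLSQP",
)
print(res.x, total_cost(res.x))
```

    The solver allocates more investment where error is cheap to buy down (high e0_i relative to c_i), which is the quantitative guidance such a tradeoff environment can surface alongside design decisions.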

    A Design Pattern for Executable DSML

    Model executability is now a key concern in model-driven engineering, mainly to support early validation and verification (V&V). Some approaches make it possible to weave executability into metamodels, defining executable domain-specific modeling languages (DSMLs). Model validation may then be achieved by direct interpretation of the conforming models. Other approaches address model executability by model compilation, allowing reuse of the virtual machines or V&V tools existing in the target domain. Nevertheless, systematic methods are not available to help the language designer define such an execution semantics and related support tools. For instance, simulators are mostly hand-crafted in a tool-specific manner for each DSML. In this paper, we propose to reify the elements commonly used to support execution in a DSML. We infer a design pattern (called the Executable DSML pattern) providing a general reusable solution for expressing executability concerns in DSMLs. It favors flexibility and improves reusability in the definition of semantics-based tools for DSMLs. We illustrate how this pattern can be applied to V&V and to models at runtime, and give insights on the development of generic and generative tools for model animators.
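    As a rough illustration of the core idea (reifying execution concerns separately from the static model, with a trace enabling undo), consider a toy state-machine DSML; the class names and undo mechanism here are invented for illustration and are not the pattern's actual structure:

```python
from dataclasses import dataclass, field

# Static part: a model conforming to a tiny state-machine DSML.
@dataclass(frozen=True)
class Machine:
    initial: str
    transitions: dict  # (state, event) -> next state

# Dynamic part, reified separately from the model: the execution
# state plus a trace that makes transactional undo possible.
@dataclass
class ExecutionContext:
    machine: Machine
    current: str = ""
    trace: list = field(default_factory=list)

    def __post_init__(self):
        self.current = self.machine.initial

    def step(self, event):
        """Execution semantics: one interpretation step over the model."""
        nxt = self.machine.transitions[(self.current, event)]
        self.trace.append(self.current)
        self.current = nxt

    def undo(self):
        """Roll back the last step using the recorded trace."""
        self.current = self.trace.pop()

m = Machine(initial="idle",
            transitions={("idle", "start"): "running",
                         ("running", "stop"): "idle"})
ctx = ExecutionContext(m)
ctx.step("start")
print(ctx.current)   # running
ctx.undo()
print(ctx.current)   # idle
```

    Because the model stays immutable and all dynamics live in the execution context, the same separation supports interpretation, animation, and V&V tooling without touching the DSML definition itself.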

    An Open Platform for Modeling Method Conceptualization: The OMiLAB Digital Ecosystem

    This paper motivates, describes, demonstrates in use, and evaluates the Open Models Laboratory (OMiLAB), an open digital ecosystem designed to help one conceptualize and operationalize conceptual modeling methods. The OMiLAB ecosystem, motivated by a generalized understanding of “model value”, targets research and education stakeholders who fulfill various roles in a modeling method's lifecycle. While we have many reports on novel modeling methods and tools for various domains, we lack knowledge on conceptualizing such methods via a full-fledged dedicated open ecosystem and a methodology that facilitates entry points for novices and an open innovation space for experienced stakeholders. This gap persists due to the lack of an open process and platform for 1) conducting research in the field of modeling method design, 2) developing agile modeling tools and model-driven digital products, and 3) experimenting with and disseminating such methods and related prototypes. OMiLAB incorporates the principles, practices, procedures, tools, and services required to address these issues, since it serves as the operational deployment of a conceptualization and operationalization process built on several pillars: 1) a granularly defined “modeling method” concept whose building blocks one can customize for the domain of choice, 2) an “agile modeling method engineering” framework that helps one quickly prototype modeling tools, 3) a model-aware “digital product design lab”, and 4) dissemination channels for reaching a global community. In this paper, we demonstrate and evaluate OMiLAB in research with two selected application cases for domain- and case-specific requirements. Beyond these exemplary cases, OMiLAB has proven to effectively satisfy requirements raised by almost 50 modeling methods and thus to support researchers in designing novel modeling methods, developing tools, and disseminating outcomes. We also measured OMiLAB's educational impact.

    Data-driven model based design and analysis of antenna structures

    Data-driven models, or metamodels, offer an efficient way to mimic the behaviour of computation-intensive simulators. The usage of such computationally cheap metamodels is therefore indispensable in the design of contemporary antenna structures, where computation-intensive simulations are often performed on a large scale. Although metamodels offer sufficient flexibility and speed, they often suffer from an exponential growth in the number of required training samples as the dimensionality of the problem increases. In order to alleviate this issue, a Gaussian process based approach, known as Gradient-Enhanced Kriging (GEK), is proposed in this work to achieve cost-efficient modelling of antenna structures. The GEK approach incorporates adjoint-based sensitivity data in addition to function data obtained from electromagnetic simulations. The approach is illustrated using a dielectric resonator antenna and an ultra-wideband antenna structure. The method demonstrates significant accuracy improvement with fewer training samples over the Ordinary Kriging (OK) approach, which utilises function data only. The discussed technique also compares favourably with OK in terms of computational cost.
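    A minimal one-dimensional sketch of the GEK idea, assuming a squared-exponential kernel and a zero-mean GP (simple-kriging-style, without OK's trend term); the trigonometric test function and its analytic derivative stand in for an EM simulation and its adjoint sensitivities:

```python
import numpy as np

def sq_exp(a, b, l=0.5):
    """Squared-exponential kernel between 1-D point sets a and b,
    plus its derivatives with respect to the inputs."""
    d = a[:, None] - b[None, :]
    k = np.exp(-d**2 / (2 * l**2))
    dk_da = (-d / l**2) * k                # cov(gradient at a, value at b)
    dk_db = (d / l**2) * k                 # cov(value at a, gradient at b)
    d2k = (1 / l**2 - d**2 / l**4) * k     # cov(gradient at a, gradient at b)
    return k, dk_da, dk_db, d2k

def gek_predict(x_train, f, g, x_test, l=0.5, jitter=1e-8):
    """Gradient-enhanced kriging: condition the GP jointly on
    function values f and gradients g observed at x_train."""
    k, dk_da, dk_db, d2k = sq_exp(x_train, x_train, l)
    K = np.block([[k, dk_db], [dk_da, d2k]])   # joint covariance
    y = np.concatenate([f, g])
    ks, _, dks_db, _ = sq_exp(x_test, x_train, l)
    Ks = np.hstack([ks, dks_db])               # cross-covariance
    return Ks @ np.linalg.solve(K + jitter * np.eye(len(y)), y)

def ok_predict(x_train, f, x_test, l=0.5, jitter=1e-8):
    """Baseline using function values only (stand-in for OK)."""
    k, *_ = sq_exp(x_train, x_train, l)
    ks, *_ = sq_exp(x_test, x_train, l)
    return ks @ np.linalg.solve(k + jitter * np.eye(len(f)), f)

# Toy stand-in for the simulator: f(x) = sin(3x), with analytic
# (adjoint-like) sensitivities g(x) = 3 cos(3x).
x_train = np.linspace(0.0, 1.0, 4)
f, g = np.sin(3 * x_train), 3 * np.cos(3 * x_train)
x_test = np.linspace(0.0, 1.0, 50)
truth = np.sin(3 * x_test)

err = np.max(np.abs(gek_predict(x_train, f, g, x_test) - truth))
err_ok = np.max(np.abs(ok_predict(x_train, f, x_test) - truth))
print(f"GEK max error: {err:.1e}, values-only max error: {err_ok:.1e}")
```

    With the same four training points, the gradient-augmented predictor is markedly more accurate than the values-only baseline, which is the cost-efficiency argument the abstract makes: each adjoint simulation yields sensitivities at little extra cost, reducing the number of samples needed.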