
    Teaching Construction in the Virtual University: the WINDS project

    This paper introduces some of the Information Technology solutions adopted in Web-based INtelligent Design Support (WINDS) to support education in A/E/C design.
    The WINDS project. WINDS is an EC-funded project in the 5th Framework, Information Society Technologies programme, Flexible University key action. WINDS is divided into two actions:
    · The research technology action will implement a learning environment integrating an intelligent tutoring system, a computer instruction management system and a set of co-operative supporting tools.
    · The development action will build a large knowledge base supporting Architecture and Civil Engineering design courses and will run a comprehensive Virtual School of Architecture and Engineering Design as an experiment.
    During the third year of the project, more than 400 students across Europe will attend the Virtual School. Over the next three years the WINDS project will span a total effort of about 150 man-years from 28 partners in 10 European countries. The missions of the WINDS project are:
    · Advanced methodologies in design education. WINDS breaks with conventional models of design education, i.e. classroom or distance education. WINDS implements a problem-oriented knowledge transfer methodology following Roger Schank's Goal Based Scenario (GBS) pedagogical methodology. GBS encourages the learning of both skills and cases, and fosters creative problem solving.
    · Multidisciplinary design education. Design requires creative synthesis and open-ended problem definition at the intersection of several disciplines. WINDS experiments with the integration of multidisciplinary design knowledge and expertise to produce a high standard of education.
    · Innovative representation, delivery and access to construction education. WINDS customises education to the individual by giving the learner Internet access to a wide range of on-line courses and structured learning objects through personally tailored learning strategies. WINDS promotes the 3W paradigm: learn What you need, Where you want, When you require.
    · Construction practice. The construction industry is a repository of "best practices" and knowledge from which WINDS will profit. The WINDS system exploits the ISO 10303 (STEP) and IFC standards to acquire knowledge of the construction process directly in digital format. In turn, WINDS re-engineers that knowledge into up-to-date courses and educational services, which industry can use to provide just-in-time rather than in-advance learning.
    WINDS IT solutions. The missions of the WINDS project impose many challenging requirements on both knowledge and system architecture. Many of the solutions adopted in these fields are innovative; others are evolutions of existing technologies. This paper focuses on the integration of this set of state-of-the-art technologies into an advanced and functionally sound Computer Aided Instruction system for A/E/C design. In particular the paper deals with the following aspects.
    Standard learning technology architecture. The WINDS system relies on the in-progress IEEE 1484.1 Learning Technology Standard Architecture. According to this standard the system consists of two data stores, the Knowledge Library and the Record Database, and four processes: System Coach, Delivery, Evaluation and the Learner. WINDS implements the Knowledge Library as a three-tier architecture:
    1. Learning Objects:
    · Learning Units are collections of text and multimedia data.
    · Models are represented in either IFC or STEP format.
    · Cases are sets of Learning Units and Models. Cases are noteworthy stories which describe solutions, integrate technical details, contain relevant design failures, etc.
    2. Indexes refer to the process of identifying relevant topics in design cases and learning units. The indexing process creates structures of Learning Objects for course management, profile-planning procedures and reasoning processes.
    3. Courses are taxonomies of either Learning Units or a design task and Course Units.
    Knowledge representation. WINDS demonstrates that it is possible and valuable to integrate widespread design expertise so that it can be used effectively to produce a high standard of education. To this end WINDS gathers area knowledge, design skills and expertise under common knowledge representation structures with unambiguous semantics. Cases are one of the most valuable means of representing design expertise: a Case is a set of Learning Units and Product Models, a noteworthy story which describes a solution, integrates technical details, contains relevant design failures, etc.
    Knowledge integration. Indexes are a medium among different kinds of knowledge: they implement networks for navigating and accessing disparate documents: HTML, video, images, CAD and product models (STEP or IFC). Concept indexes link learning topics to learning objects and group them into competencies. Index relationships are the basis of the WINDS reasoning processes and provide the foundation for the system's coaching functions, which proactively suggest strategies, solutions and examples and help students avoid design deadlocks.
    Knowledge distribution. To support the data stores and processes efficiently among partners in 10 countries, WINDS implements an object-oriented client/server architecture as COM objects. Behind the DCOM components lies the Dynamic Kernel, which dynamically embodies and maintains the data stores and processes. Components of the Knowledge Library can reside on several servers across the Internet. This provides for distributed transactions, e.g. a change in one Learning Object affects the Knowledge Library spread across several servers in different countries. Learning objects implemented as COM objects can wrap ownership data. A clear and unambiguous definition of ownership rights enables universities, in collaboration with telecommunication and publishing companies, to act as "education brokers". Brokerage in education and training is an innovative paradigm for providing just-in-time, personally customised, value-added learning knowledge
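    A minimal sketch of the three-tier Knowledge Library described above; the class and field names (LearningUnit, Model, Case, Index, Course) are illustrative assumptions, not the WINDS API:

```python
# Hypothetical data-structure sketch of the WINDS Knowledge Library tiers.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningUnit:
    title: str
    media: List[str] = field(default_factory=list)   # text and multimedia resources

@dataclass
class Model:
    name: str
    fmt: str   # "IFC" or "STEP", per the abstract

@dataclass
class Case:
    """A noteworthy design story: learning units plus product models."""
    story: str
    units: List[LearningUnit] = field(default_factory=list)
    models: List[Model] = field(default_factory=list)

@dataclass
class Index:
    """Links a learning topic to the learning objects that teach it."""
    topic: str
    objects: List[object] = field(default_factory=list)

@dataclass
class Course:
    """A taxonomy of learning objects for a design task."""
    name: str
    indexes: List[Index] = field(default_factory=list)

# Example: index a case under an (invented) "thermal bridges" topic
case = Case("Facade detail with a documented design failure",
            units=[LearningUnit("Thermal bridging basics")],
            models=[Model("facade_joint", "IFC")])
course = Course("Building Envelope Design",
                indexes=[Index("thermal bridges", [case])])
```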

    A framework for the simulation of structural software evolution

    This is the author's accepted manuscript; copyright 2008 ACM. As functionality is added to an aging piece of software, its original design and structure tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating empirical data in sufficient quantity and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation, since in real-world system development such factors can significantly influence evolutionary structures.
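    As a hedged illustration of the plug-in idea, the sketch below evolves a fictitious code base by composing two hypothetical effect modules; the module names, effects and coupling metric are assumptions, not the authors' framework:

```python
# Sketch: each "effect" function is one pluggable evolutionary module; the
# simulation composes them to evolve a fictitious code base step by step.
import random

class CodeBase:
    def __init__(self):
        self.modules = {0: set()}        # module id -> set of modules it calls

    def mean_coupling(self):
        return sum(len(d) for d in self.modules.values()) / len(self.modules)

def add_functionality(cb):
    """Effect 1: a new module is added, wired to one random existing module."""
    new_id = len(cb.modules)
    cb.modules[new_id] = {random.choice(list(cb.modules))}

def preferential_coupling(cb):
    """Effect 2: a human-trait bias, new calls accrete on already popular modules."""
    in_degree = {m: 0 for m in cb.modules}
    for callees in cb.modules.values():
        for c in callees:
            in_degree[c] += 1
    popular = max(in_degree, key=in_degree.get)
    caller = random.choice(list(cb.modules))
    if caller != popular:
        cb.modules[caller].add(popular)

def simulate(effects, steps=100):
    cb = CodeBase()
    for _ in range(steps):
        for effect in effects:
            effect(cb)
    return cb

cb = simulate([add_functionality, preferential_coupling])
print(f"modules={len(cb.modules)}, mean coupling={cb.mean_coupling():.2f}")
```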

    Encapsulating and representing the knowledge on the evaluation of an engineering system

    This paper proposes a cross-disciplinary methodology for a fundamental question in product development: how can the innovation patterns that arise during the evolution of an engineering system (ES) be encapsulated, so that they can later be mined with data mining analysis methods? Reverse engineering answers the question of which components a developed engineering system consists of, and how those components interact to make the working product. TRIZ answers the question of which problem-solving principles can be, or have been, employed in developing that system, in comparison to its earlier versions or to similar systems. While these two methodologies have been very popular, to the best of our knowledge there does not yet exist a methodology that reverse-engineers, encapsulates and represents the information regarding the complete product development process in abstract terms. This paper suggests such a methodology, consisting of a mathematical formalism, graph visualization, and a database representation. The proposed approach is demonstrated by analyzing the design and development process for a prototype wrist-rehabilitation robot.
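    A hedged sketch of the encapsulation idea: successive versions of a system are stored as component-interaction graphs, and each version transition is tagged with the TRIZ principles judged to apply. All data below is invented for illustration, not taken from the paper's robot case study:

```python
# Component-interaction graph per version: component -> set of neighbours.
v1 = {"motor": {"gearbox"}, "gearbox": {"wrist_cuff"}, "wrist_cuff": set()}
v2 = {"motor": {"cable_drive"}, "cable_drive": {"wrist_cuff"}, "wrist_cuff": set()}

history = [
    {"version": 1, "graph": v1},
    {"version": 2, "graph": v2,
     # TRIZ principle 28 ("mechanics substitution") used as a hypothetical tag
     "triz_principles": [28],
     "change": "gearbox replaced by cable drive"},
]

def components_changed(old, new):
    """Diff two versions' component sets, the raw material for later mining."""
    return set(old) ^ set(new)

print(components_changed(v1, v2))   # {'gearbox', 'cable_drive'}
```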

    A Taxonomy for a Constructive Approach to Software Evolution

    In many software design and evaluation techniques, either the software evolution problem is not systematically elaborated, or only the impact of evolution is considered. Thus, most of the time software is changed by editing the components of the software system, i.e. breaking down the software system. The software engineering discipline provides many mechanisms that allow evolution without breaking down the system; however, the contexts in which these mechanisms are applicable are not taken into account, and software design and evaluation techniques do not support identifying these contexts. In this paper, we provide a taxonomy of software evolution that can be used to identify the context of the evolution problem. The identified contexts are used to retrieve, from the software engineering discipline, the mechanisms that can evolve the software without breaking it down. To build such a taxonomy, we construct a model of software evolution and use this model to identify the factors that affect the selection of software evolution mechanisms. Our approach is based on solution sets; however, the contents of these sets may vary at different stages of the software life-cycle. To address this problem, we introduce perspectives: filters that select relevant elements from a solution set. We apply our taxonomy to a parser tool to show how it copes with problematic evolution problems.
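    A minimal sketch of the "perspectives filter solution sets" idea; the mechanism names and life-cycle stages are illustrative assumptions, not the paper's taxonomy:

```python
# The full solution set of evolution mechanisms, each annotated with the
# life-cycle stages (hypothetical) at which it is applicable.
SOLUTION_SET = [
    {"mechanism": "subclassing",       "stages": {"design", "implementation"}},
    {"mechanism": "plugin loading",    "stages": {"runtime"}},
    {"mechanism": "config parameters", "stages": {"deployment", "runtime"}},
]

def perspective(stage):
    """A perspective: a filter selecting the mechanisms valid at one stage."""
    return [s["mechanism"] for s in SOLUTION_SET if stage in s["stages"]]

print(perspective("runtime"))   # ['plugin loading', 'config parameters']
```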

    ACon: A learning-based approach to deal with uncertainty in contextual requirements at runtime

    Context: Runtime uncertainty, such as an unpredictable operational environment or failure of the sensors that gather environmental data, is a well-known challenge for adaptive systems. Objective: To execute requirements that depend on context correctly, the system needs up-to-date knowledge about the context relevant to such requirements. Techniques to cope with uncertainty in contextual requirements are currently underrepresented. In this paper we present ACon (Adaptation of Contextual requirements), a data-mining approach to deal with runtime uncertainty affecting contextual requirements. Method: ACon uses feedback loops to maintain up-to-date knowledge about the context in which contextual requirements are valid at runtime. Upon detecting that contextual requirements are affected by runtime uncertainty, ACon analyses and mines contextual data to (re-)operationalize context and thereby update the information about contextual requirements. Results: We evaluate ACon in an empirical study of an activity scheduling system used by a crew of 4 rowers in a wild and unpredictable environment, using a complex monitoring infrastructure. Our study focused on evaluating the data mining part of ACon and analysed the sensor data collected on board, covering 46 sensors and 90,748 measurements per sensor. Conclusion: ACon is an important step in dealing with uncertainty affecting contextual requirements at runtime while considering end-user interaction. ACon supports systems in analysing the environment to adapt contextual requirements, and complements existing requirements monitoring approaches by keeping the requirements monitoring specification up-to-date. Consequently, it avoids manual analysis that is usually costly in today's complex system environments.
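    A hedged sketch of ACon's mining step: when runtime uncertainty is detected, recent sensor data is mined to re-learn the context in which a contextual requirement holds. ACon's actual mining algorithm is not reproduced here; a decision tree stands in as one plausible choice, and the sensor features and labels are invented:

```python
# One plausible (re-)operationalisation of context from sensor history.
from sklearn.tree import DecisionTreeClassifier

# rows: [wind_speed_m_s, boat_heel_deg]; label: was the "safe to row"
# contextual requirement actually valid in that situation?
X = [[3, 2], [4, 3], [12, 15], [14, 18], [5, 4], [13, 16]]
y = [1, 1, 0, 0, 1, 0]

miner = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Updated context definition: predict requirement validity for new readings.
print(miner.predict([[6, 5], [15, 20]]))   # e.g. [1 0]
```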

    Upon accounting for the impact of isoenzyme loss, gene deletion costs anticorrelate with their evolutionary rates

    System-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism’s genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene’s fitness contribution to an organism “here and now” and the same gene’s historical importance, as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call “function-loss cost”, which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate across several thousand minimal environments. We demonstrate that the improvement gained by using function-loss cost instead of gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change in the isoenzyme assumption increases the recall of epistatic interactions predicted by the flux balance model, at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality. This work was supported by the National Science Foundation (grant CCF-1219007 to YX); the Natural Sciences and Engineering Research Council of Canada (grant RGPIN-2014-03892 to YX); the National Institutes of Health (grants 5R01GM089978 and 5R01GM103502 to DS); the Army Research Office Multidisciplinary University Research Initiative (grant W911NF-12-1-0390 to DS); the US Department of Energy (grant DE-SC0012627 to DS); and the Canada Research Chairs Program (YX). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
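    A toy sketch of the gene-loss versus function-loss distinction. Real models solve a flux balance linear program for every deletion; here a stub fitness function stands in for that computation, and the gene and reaction names are invented for illustration:

```python
# g1 and g2 are hypothetical isoenzymes catalysing the same reaction R1.
GENE_TO_RXNS = {"g1": ["R1"], "g2": ["R1"], "g3": ["R2"]}

def fitness(disabled_rxns):
    """Stub for an FBA growth computation with some reactions knocked out."""
    return 0.0 if "R2" in disabled_rxns else (0.5 if "R1" in disabled_rxns else 1.0)

def gene_loss_cost(gene):
    # Classic assumption: a reaction is disabled only if ALL of its genes are
    # lost, so an isoenzyme provides unlimited backup.
    disabled = [r for r in GENE_TO_RXNS[gene]
                if all(g == gene for g, rxns in GENE_TO_RXNS.items() if r in rxns)]
    return 1.0 - fitness(disabled)

def function_loss_cost(gene):
    # Revised assumption: isoenzymes are non-redundant, so every catalysed
    # reaction counts as impaired when the gene is lost.
    return 1.0 - fitness(GENE_TO_RXNS[gene])

for g in GENE_TO_RXNS:
    print(g, gene_loss_cost(g), function_loss_cost(g))
# Only function-loss cost assigns the isoenzymes g1 and g2 a nonzero cost.
```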

    Designing Traceability into Big Data Systems

    Providing an appropriate level of accessibility and traceability for data or process elements (so-called Items) in large volumes of data, often Cloud-resident, is an essential requirement in the Big Data era. Enterprise-wide data systems need to be designed from the outset to support usage of such Items across the spectrum of business use, rather than from any specific application view. The design philosophy advocated in this paper is to drive the design process using a so-called description-driven approach, which enriches models with meta-data and descriptions and focuses the design process on Item re-use, thereby promoting traceability. Details are given of the description-driven design of big data systems at CERN, in health informatics and in business process management. Evidence is presented that the approach leads to design simplicity and consequent ease of management, thanks to loose typing and the adoption of a unified approach to Item management and usage. Comment: 10 pages, 6 figures. In Proceedings of the 5th Annual International Conference on ICT: Big Data, Cloud and Security (ICT-BDCS 2015), Singapore, July 2015. arXiv admin note: text overlap with arXiv:1402.5764, arXiv:1402.575
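    An illustrative sketch of the description-driven idea: every Item carries a description (meta-data) alongside its payload, so traceability queries work uniformly across applications. The names below are assumptions, not the CERN or CRISTAL API:

```python
# A payload plus travelling meta-data, with a usage trace for traceability.
from datetime import datetime, timezone

class Item:
    def __init__(self, payload, description):
        self.payload = payload
        self.description = dict(description)   # meta-data travels with the Item
        self.description["created"] = datetime.now(timezone.utc).isoformat()
        self.history = []                      # trace of every use of this Item

    def use(self, activity):
        self.history.append((activity, datetime.now(timezone.utc).isoformat()))
        return self.payload

sample = Item({"run": 1234}, {"type": "detector-calibration", "owner": "team-A"})
sample.use("quality-check")
print(sample.description["type"], len(sample.history))
```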

    Pattern Reification as the Basis for Description-Driven Systems

    One of the main factors driving object-oriented software development for information systems is the requirement that systems be tolerant to change. To address this issue, this paper proposes a pattern-based, object-oriented, description-driven system (DDS) architecture as an extension of the standard UML four-layer meta-model. A DDS architecture is proposed in which aspects of both static and dynamic system behavior can be captured via descriptive models and meta-models. The proposed architecture embodies four main elements: firstly, the adoption of a multi-layered meta-modeling architecture and a reflective meta-level architecture; secondly, the identification of four data modeling relationships that can be made explicit so that they can be modified dynamically; thirdly, the identification of five design patterns which have emerged from practice and have proved essential in providing reusable building blocks for data management; and fourthly, the encoding of the structural properties of the five design patterns by means of one fundamental pattern, the Graph pattern. A practical example of this philosophy, the CRISTAL project, is used to demonstrate the use of description-driven data objects to handle system evolution. Comment: 20 pages, 10 figures
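    A minimal sketch of the claim that one Graph pattern can encode the structure of other patterns: each pattern instance is stored as typed nodes and edges, so the relationship is data that can be inspected and modified at runtime. The node and edge vocabulary here is an illustrative assumption:

```python
# A generic graph whose typed nodes/edges encode pattern structure as data.
class Graph:
    def __init__(self):
        self.nodes, self.edges = {}, []

    def add_node(self, name, kind):
        self.nodes[name] = kind

    def add_edge(self, src, dst, rel):
        self.edges.append((src, dst, rel))

# Encode a Composite-style whole/part structure purely as graph data, so the
# relationship can be changed dynamically rather than fixed in code.
g = Graph()
g.add_node("Assembly", "composite")
g.add_node("Part", "leaf")
g.add_edge("Assembly", "Part", "contains")

print([e for e in g.edges if e[2] == "contains"])
```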

    Robust multi-fidelity design of a micro re-entry unmanned space vehicle

    This article addresses the preliminary robust design of a small-scale re-entry unmanned space vehicle by means of a hybrid optimization technique. The approach developed in this article closely couples an evolutionary multi-objective algorithm with a direct transcription method for optimal control problems: the evolutionary part handles the shape parameters of the vehicle and the uncertain objective functions, while the direct transcription method generates an optimal control profile for the re-entry trajectory of each individual in the population. Uncertainties in the aerodynamic forces and in the characteristics of the thermal protection material are incorporated into the vehicle model, and a Monte Carlo sampling procedure is used to compute relevant statistical characteristics of the maximum heat flux and internal temperature. The hybrid algorithm then searches for geometries that minimize the mean value of the maximum heat flux, the mean value of the maximum internal temperature, and the weighted sum of their variances. During the optimization process, artificial neural networks are used to approximate the aerodynamic forces required by the optimal control solver. The artificial neural networks are trained and updated by means of a multi-fidelity approach: initially a low-fidelity analytical model, fitted to a waverider type of vehicle, is used to train the networks; as the evolution proceeds, a mix of analytical and high-fidelity computational fluid dynamics computations is used to update them. The data obtained from the high-fidelity model progressively become the main source of updates for the neural networks until, near the end of the optimization process, the influence of the data obtained from the analytical model is practically nullified. On the basis of preliminary results, the adopted technique is able to predict the achievable performance of the small spacecraft and the requirements in terms of thermal protection materials.
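    A toy sketch of the robust-objective evaluation step, assuming an invented surrogate in place of the paper's trained neural networks and trajectory optimiser; the input distribution and variance weighting are illustrative only:

```python
# Monte Carlo estimate of a mean-plus-weighted-variance robust objective.
import numpy as np

rng = np.random.default_rng(0)

def heat_flux_surrogate(shape_param, cd):
    """Stand-in for the neural network + optimal control trajectory solve."""
    return 100.0 * shape_param + 50.0 * cd

def robust_objective(shape_param, n_samples=1000, w=0.5):
    cd = rng.normal(1.0, 0.1, n_samples)       # uncertain drag characteristic
    q = heat_flux_surrogate(shape_param, cd)   # vectorised Monte Carlo sweep
    return q.mean() + w * q.var()              # mean + weighted variance

# A design (shape_param) with a lower score is both good and insensitive
# to the sampled uncertainty, which is what the hybrid optimiser seeks.
print(robust_objective(0.8))
```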