5,663 research outputs found

    Computational tools for low energy building design : capabilities and requirements

    Integrated building performance simulation (IBPS) is an established technology, with the ability to model the heat, mass, light, electricity and control signal flows within complex building/plant systems. The technology is used in practice to support the design of low energy solutions and, in Europe at least, such use is set to expand with the advent of the Energy Performance of Buildings Directive, which mandates a modelling approach to compliance with the legislation. This paper summarises IBPS capabilities and identifies developments that aim to further improve its integrity vis-à-vis reality.

    Automatic surrogate model type selection during the optimization of expensive black-box problems

    The use of Surrogate Based Optimization (SBO) has become commonplace for optimizing expensive black-box simulation codes. A popular SBO method is the Efficient Global Optimization (EGO) approach. However, the performance of SBO methods critically depends on the quality of the guiding surrogate. In EGO the surrogate type is usually fixed to Kriging, even though this may not be optimal for all problems. In this paper the authors propose to extend the well-known EGO method with an automatic surrogate model type selection framework that is able to dynamically select the best model type (including hybrid ensembles) depending on the data available so far. Hence, the expected improvement criterion will always be based on the best approximation available at each step of the optimization process. The approach is demonstrated on a structural optimization problem, i.e., reducing the stress on a truss-like structure. Results show that the proposed algorithm consistently finds better optima than traditional Kriging-based infill optimization.
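    The selection idea can be sketched as follows; this is a hedged illustration, not the authors' framework: at each infill iteration a small pool of candidate surrogates (here two Gaussian process kernels and a random forest stand in for "model types") is ranked by cross-validation, the winner is refit, and expected improvement on that model picks the next expensive evaluation. The Branin test function, the candidate pool and all settings are assumptions made for the example.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern, RBF
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        def branin(x):  # inexpensive stand-in for an expensive black-box simulation
            x1, x2 = x[..., 0], x[..., 1]
            return ((x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5 / np.pi * x1 - 6) ** 2
                    + 10 * (1 - 1 / (8 * np.pi)) * np.cos(x1) + 10)

        def predict_with_std(model, X):
            """Predictive mean and spread for the two surrogate families used here."""
            if isinstance(model, GaussianProcessRegressor):
                return model.predict(X, return_std=True)
            # Random forest: spread over trees as a crude uncertainty proxy.
            per_tree = np.stack([t.predict(X) for t in model.estimators_])
            return per_tree.mean(axis=0), per_tree.std(axis=0) + 1e-12

        def expected_improvement(mu, sigma, y_best):
            z = (y_best - mu) / sigma
            return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

        rng = np.random.default_rng(0)
        bounds = np.array([[-5.0, 10.0], [0.0, 15.0]])
        X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(10, 2))   # initial design
        y = branin(X)

        candidates = [  # the pool of surrogate types to choose from at each iteration
            GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True),
            GaussianProcessRegressor(kernel=RBF(), normalize_y=True),
            RandomForestRegressor(n_estimators=100, random_state=0),
        ]

        for it in range(20):
            # 1. Pick the surrogate type that best explains the data so far.
            scores = [cross_val_score(m, X, y, cv=3,
                                      scoring="neg_mean_squared_error").mean()
                      for m in candidates]
            model = candidates[int(np.argmax(scores))]
            model.fit(X, y)

            # 2. Maximise expected improvement over a random candidate pool
            #    (a simplified infill search).
            X_cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 2))
            mu, sigma = predict_with_std(model, X_cand)
            ei = expected_improvement(mu, sigma, y.min())
            x_new = X_cand[np.argmax(ei)]

            # 3. Evaluate the expensive function at the infill point and augment the data.
            X = np.vstack([X, x_new])
            y = np.append(y, branin(x_new))

        print("best objective found:", y.min())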

    Can geocomputation save urban simulation? Throw some agents into the mixture, simmer and wait ...

    There are indications that the current generation of simulation models in practical, operational uses has reached the limits of its usefulness under existing specifications. The relative stasis in operational urban modeling contrasts with simulation efforts in other disciplines, where techniques, theories, and ideas drawn from computation and complexity studies are revitalizing the ways in which we conceptualize, understand, and model real-world phenomena. Many of these concepts and methodologies are applicable to operational urban systems simulation. Indeed, in many cases, ideas from computation and complexity studies—often clustered under the collective term of geocomputation, as they apply to geography—are ideally suited to the simulation of urban dynamics. However, there exist several obstructions to their successful use in operational urban geographic simulation, particularly as regards the capacity of these methodologies to handle top-down dynamics in urban systems. This paper presents a framework for developing a hybrid model for urban geographic simulation and discusses some of the barriers to innovation in this field. The framework infuses approaches derived from geocomputation and complexity with standard techniques that have been tried and tested in operational land-use and transport simulation. Macro-scale dynamics that operate from the top-down are handled by traditional land-use and transport models, while micro-scale dynamics that work from the bottom-up are delegated to agent-based models and cellular automata. The two methodologies are fused in a modular fashion using a system of feedback mechanisms. As a proof-of-concept exercise, a micro-model of residential location has been developed with a view to hybridization. The model mixes cellular automata and multi-agent approaches and is formulated so as to interface with meso-models at a higher scale.
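    A minimal sketch of the hybrid coupling described above, under assumed names and numbers: a placeholder top-down land-use/transport step allocates zone-level household growth, a bottom-up CA/agent step places those households in cells with a neighbourhood-preference rule, and zone densities feed back into the macro step. It illustrates the modular feedback structure only, not the residential-location model developed in the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        N_ZONES, CELLS_PER_ZONE = 4, 100          # hypothetical study area
        zone_of_cell = np.repeat(np.arange(N_ZONES), CELLS_PER_ZONE)
        occupied = rng.random(N_ZONES * CELLS_PER_ZONE) < 0.2   # initial residential cells
        accessibility = rng.random(N_ZONES)        # stand-in for transport-model output

        def macro_model(accessibility, total_growth):
            """Top-down step: a placeholder land-use/transport model allocates
            zone-level household growth in proportion to accessibility."""
            share = accessibility / accessibility.sum()
            return np.round(share * total_growth).astype(int)

        def micro_model(occupied, zone_of_cell, zone_targets):
            """Bottom-up step: agents pick cells, preferring locations whose
            (1-D) neighbours are already occupied -- a CA-like rule."""
            occupied = occupied.copy()
            for z, target in enumerate(zone_targets):
                free = np.flatnonzero((zone_of_cell == z) & ~occupied)
                if free.size == 0:
                    continue
                left = occupied[np.clip(free - 1, 0, occupied.size - 1)]
                right = occupied[np.clip(free + 1, 0, occupied.size - 1)]
                score = left.astype(float) + right + 0.1
                pick = rng.choice(free, size=min(target, free.size), replace=False,
                                  p=score / score.sum())
                occupied[pick] = True
            return occupied

        for step in range(10):
            # feedback loop: macro targets constrain the micro layer; micro occupancy
            # (a density proxy) feeds back into the macro accessibility term
            targets = macro_model(accessibility, total_growth=40)
            occupied = micro_model(occupied, zone_of_cell, targets)
            density = np.array([occupied[zone_of_cell == z].mean() for z in range(N_ZONES)])
            accessibility = 0.7 * accessibility + 0.3 * (1.0 - density)  # congestion feedback

        print("final zone densities:", np.round(density, 2))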

    Space station advanced automation

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. Using AA technology to augment system management functions requires a development model consisting of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.

    State of the Art in the Optimisation of Wind Turbine Performance Using CFD

    Wind energy has received increasing attention in recent years due to its sustainability and geographically wide availability. The efficiency of wind energy utilisation depends strongly on the performance of wind turbines, which convert the kinetic energy in wind into electrical energy. In order to optimise wind turbine performance and reduce the cost of next-generation wind turbines, it is crucial to have a view of the state of the art in the key aspects of wind turbine performance optimisation using Computational Fluid Dynamics (CFD), an approach that has attracted enormous interest in the development of next-generation wind turbines in recent years. This paper presents a comprehensive review of the state-of-the-art progress in optimising wind turbine performance using CFD, covering the objective functions used to judge wind turbine performance, the CFD approaches applied in the simulation of wind turbines, and the optimisation algorithms employed. The paper is written both for researchers new to this area, by summarising the underlying theory while presenting a comprehensive review of up-to-date studies, and for experts in the field, by collecting a comprehensive list of related references in which the details of recently employed computational methods can be found.
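    As background to the objective functions mentioned above, the power coefficient and tip-speed ratio below are standard measures used to judge rotor aerodynamic performance; they are given here as general aerodynamics, not as the specific objective functions adopted in the studies reviewed.

        \[
          C_P = \frac{P}{\tfrac{1}{2}\,\rho A V_{\infty}^{3}}, \qquad
          \lambda = \frac{\omega R}{V_{\infty}}, \qquad
          C_P \le \frac{16}{27} \approx 0.593 \;(\text{Betz limit}),
        \]
        where $P$ is the extracted power, $\rho$ the air density, $A$ the rotor swept area,
        $V_{\infty}$ the free-stream wind speed, $\omega$ the rotor angular speed and $R$ the rotor radius;
        maximising $C_P$ over the operating range of the tip-speed ratio $\lambda$ is a typical optimisation objective.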

    Cellular Helmet Liner Design through Bio-inspired Structures and Topology Optimization of Compliant Mechanism Lattices

    The continuous development of sport technologies constantly demands advancements in protective headgear to reduce the risk of head injuries. This article introduces new cellular helmet liner designs developed through two approaches. The first approach is the study of energy-absorbing biological materials; the second is the study of lattices composed of force-diverting compliant mechanisms. On the one hand, bio-inspired liners are generated through the study of biological, hierarchical materials, with an emphasis on structures in nature that serve similar concussion-reducing functions as a helmet liner; inspiration is drawn from organic and skeletal structures. On the other hand, compliant mechanism lattice (CML)-based liners use topology optimization to synthesize rubber cellular unit cells with effective positive and negative Poisson's ratios. Three lattices are designed using different cellular unit cell arrangements, namely all-positive, all-negative, and alternating effective Poisson's ratios. The proposed cellular (bio-inspired and CML-based) liners are embedded between two polycarbonate shells, thereby replacing the traditional expanded polypropylene foam liner used in standard sport helmets. The cellular liners are analyzed through a series of 2D extruded ballistic impact simulations to determine the best performing liner topology and its corresponding rubber hardness. The best performing cellular design is then compared against an expanded polypropylene foam liner in a 3D simulation to appraise its protection capabilities and to verify that the 2D extruded design simulations scale to an effective 3D design.
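    The effective Poisson's ratios mentioned above can be characterised with a simple virtual test; the sketch below is a hedged illustration with made-up nodal coordinates, not the topology-optimization workflow used in the article: the ratio of transverse to axial strain of a unit cell's bounding box distinguishes conventional (positive) from auxetic (negative) behaviour.

        import numpy as np

        def effective_poissons_ratio(nodes_before, nodes_after, axis=0):
            """Estimate the effective Poisson's ratio of a cellular unit cell from
            nodal coordinates before/after a virtual uniaxial test.

            nodes_before, nodes_after : (N, 2) arrays of node positions
            axis : loading direction (0 = x, 1 = y)
            """
            other = 1 - axis
            size0 = nodes_before.max(axis=0) - nodes_before.min(axis=0)
            size1 = nodes_after.max(axis=0) - nodes_after.min(axis=0)
            eps_axial = (size1[axis] - size0[axis]) / size0[axis]
            eps_trans = (size1[other] - size0[other]) / size0[other]
            return -eps_trans / eps_axial

        # toy example: a unit square compressed 5% in x
        square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

        # conventional (positive nu): gets wider in y when squeezed in x
        widened = square * np.array([0.95, 1.02])
        # auxetic (negative nu): gets narrower in y when squeezed in x
        narrowed = square * np.array([0.95, 0.98])

        print("positive-nu cell:", effective_poissons_ratio(square, widened))   # ~ +0.4
        print("auxetic cell:    ", effective_poissons_ratio(square, narrowed))  # ~ -0.4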

    A short history off-line

    Emerging technologies for learning report - article exploring the history of ICT in education and the lessons we can learn from the past.

    Value-based global optimization

    Computational models and simulations are essential system design tools that allow for improved decision making and cost reductions during all phases of the design process. However, the most accurate models are often computationally expensive and can therefore only be used sporadically. Consequently, designers are often forced to choose between exploring many design alternatives with less accurate, inexpensive models and evaluating fewer alternatives with the most accurate models. To achieve both broad exploration of the alternatives and accurate determination of the best alternative at reasonable cost, surrogate modeling and variable accuracy modeling are widely used. A surrogate model is a mathematically tractable approximation of a more expensive model based on a limited sampling of that model, while variable accuracy modeling involves a collection of different models of the same system with different accuracies and computational costs. Compared to using only very accurate and expensive models, designers can determine the best solutions more efficiently using surrogate and variable accuracy models, because obviously poor solutions can be eliminated inexpensively using only the less expensive, less accurate models. The most accurate models are then reserved for discerning the best solution from the set of good solutions. In this thesis, a Value-Based Global Optimization (VGO) algorithm is introduced. The algorithm uses kriging-like surrogate models and a sequential sampling strategy based on Value of Information (VoI) to optimize an objective characterized by multiple analysis models with different accuracies. It builds on two primary research contributions. The first is a novel surrogate modeling method that accommodates data from any number of analysis models with different accuracies and costs. The second is the use of Value of Information (VoI) as a new metric for guiding the sequential sampling process for global optimization. In this manner, the cost of further analysis is explicitly taken into account during the optimization process. Results characterizing the algorithm show that VGO outperforms Efficient Global Optimization (EGO), a similar global optimization algorithm that is considered to be the current state of the art. It is shown that when cost is taken into account in the final utility, VGO achieves a higher utility than EGO with statistical significance. In further experiments, it is shown that VGO can be successfully applied to higher dimensional problems as well as practical engineering design examples.
    Ph.D. thesis. Committee Chair: Paredis, Chris; Committee Members: Bras, Bert; Leamy, Michael; Romero, David; Wu, C. F. Jeff.
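    The following sketch illustrates, under stated assumptions, the general cost-aware sampling idea rather than the VGO algorithm itself: two analysis models of different accuracy and cost are assumed, the cheap model screens out obviously poor candidates, a Gaussian process surrogate of the expensive model supplies expected improvement, and sampling stops once the expected gain no longer outweighs the (assumed) cost of another expensive run. The exchange rate between objective units and cost, both toy models and all constants are hypothetical; the thesis's Value-of-Information metric is more principled than this improvement-minus-cost rule.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        # Two analysis models of the same 1-D system (hypothetical example)
        def expensive_model(x):                 # accurate but costly
            return np.sin(3 * x) + 0.6 * (x - 0.5) ** 2

        def cheap_model(x):                     # biased, inexpensive approximation
            return np.sin(3 * x) + 0.2

        COST_EXPENSIVE, COST_CHEAP = 1.0, 0.001   # assumed relative evaluation costs
        COST_TO_OBJECTIVE = 0.05                  # assumed exchange rate: cost -> objective units

        rng = np.random.default_rng(2)
        X = rng.uniform(0, 2, size=(4, 1))        # small initial design of expensive runs
        y = expensive_model(X).ravel()
        spent = X.shape[0] * COST_EXPENSIVE

        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

        for it in range(30):
            gp.fit(X, y)

            # cheap screening: discard candidates the low-accuracy model already rules out
            X_cand = rng.uniform(0, 2, size=(500, 1))
            cheap = cheap_model(X_cand).ravel()
            X_cand = X_cand[cheap <= np.quantile(cheap, 0.5)]
            spent += 500 * COST_CHEAP

            # expected improvement of one more expensive run, netted against its cost
            mu, sigma = gp.predict(X_cand, return_std=True)
            sigma = sigma + 1e-12
            z = (y.min() - mu) / sigma
            ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)
            if ei.max() - COST_TO_OBJECTIVE * COST_EXPENSIVE <= 0:
                break                             # further analysis no longer pays for itself

            x_new = X_cand[[np.argmax(ei)]]
            X = np.vstack([X, x_new])
            y = np.append(y, expensive_model(x_new).ravel())
            spent += COST_EXPENSIVE

        print(f"best value {y.min():.3f} after {len(y)} expensive runs, total cost {spent:.2f}")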

    Mapping knowledge management and organizational learning in support of organizational memory

    The normative literature within the field of Knowledge Management has concentrated on techniques and methodologies for allowing knowledge to be codified and made available to individuals and groups within organizations. The literature on Organizational Learning, however, has tended to focus on aspects of knowledge that are pertinent at the macro-organizational level (i.e. the overall business). The authors attempt in this paper to address a relative void in the literature, aiming to demonstrate the interlocking factors within an enterprise information system that relate knowledge management and organizational learning, via a model that highlights the key factors within such an inter-relationship. This is achieved by extrapolating data from a manufacturing organization using a case study, with these data then modeled using a cognitive mapping technique (Fuzzy Cognitive Mapping, FCM). The empirical enquiry explores an interpretivist view of knowledge within an Information Systems Evaluation (ISE) process, through the associated classification of structural, interpretive and evaluative knowledge. This is achieved by visualizing inter-relationships within the ISE decision-making approach in the case organization. A number of decision paths within the cognitive map are then identified so that a greater understanding of ISE can be sought. The authors therefore present a model that defines a relationship between Knowledge Management (KM) and Organisational Learning (OL), and highlights factors that can lead a firm to develop itself towards a learning organization.
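    The sketch below shows the standard Fuzzy Cognitive Mapping inference rule referred to above: concept activations are repeatedly propagated through a signed weight matrix and squashed with a sigmoid until the map settles, so that "what-if" decision paths can be traced. The concepts and weights are illustrative placeholders, not the map elicited from the case organization.

        import numpy as np

        # Illustrative concepts (not the case-study map from the paper)
        concepts = ["KM practices", "IS evaluation quality", "Shared interpretation",
                    "Organizational learning", "Organizational memory"]

        # W[i, j]: causal influence of concept i on concept j, in [-1, 1]
        W = np.array([
            [0.0, 0.6, 0.5, 0.4, 0.7],
            [0.0, 0.0, 0.4, 0.5, 0.3],
            [0.0, 0.3, 0.0, 0.6, 0.4],
            [0.2, 0.0, 0.0, 0.0, 0.6],
            [0.3, 0.2, 0.0, 0.3, 0.0],
        ])

        def sigmoid(x, lam=1.0):
            return 1.0 / (1.0 + np.exp(-lam * x))

        def run_fcm(W, state, steps=50, tol=1e-4):
            """Standard FCM inference: propagate activations through the weight
            matrix until the state settles (or the step budget is exhausted)."""
            for _ in range(steps):
                new_state = sigmoid(state @ W + state)   # include a self-memory term
                if np.max(np.abs(new_state - state)) < tol:
                    break
                state = new_state
            return state

        # "What if KM practices are strongly activated?" scenario
        initial = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
        final = run_fcm(W, initial)
        for name, value in zip(concepts, final):
            print(f"{name:28s} {value:.2f}")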

    Equifinality of formal (DREAM) and informal (GLUE) Bayesian approaches in hydrologic modeling?

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with Generalized Likelihood Uncertainty Estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed Differential Evolution Adaptive Metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
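    As a hedged illustration of the informal (GLUE) side of the comparison only, the sketch below uses a toy linear-reservoir model in place of a conceptual watershed model: parameters are sampled by Monte Carlo, a Nash-Sutcliffe efficiency serves as the informal likelihood, sets above a subjective behavioural threshold are retained, and likelihood-weighted quantiles give streamflow uncertainty bounds. The model, threshold and synthetic data are assumptions; the formal DREAM scheme would require a full MCMC implementation and is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(3)

        def linear_reservoir(rain, k, s0=10.0):
            """Toy conceptual model: a single linear reservoir, Q = k * S."""
            storage, flow = s0, []
            for p in rain:
                storage += p
                q = k * storage
                storage -= q
                flow.append(q)
            return np.array(flow)

        # synthetic "observations" from a known parameter plus noise
        rain = rng.gamma(shape=0.8, scale=5.0, size=200)
        q_obs = linear_reservoir(rain, k=0.3) + rng.normal(0, 0.5, size=200)

        # GLUE: Monte Carlo sampling + informal likelihood (Nash-Sutcliffe efficiency)
        k_samples = rng.uniform(0.05, 0.8, size=5000)
        sims = np.array([linear_reservoir(rain, k) for k in k_samples])
        nse = 1 - np.sum((sims - q_obs) ** 2, axis=1) / np.sum((q_obs - q_obs.mean()) ** 2)

        behavioural = nse > 0.6                     # subjective behavioural threshold
        weights = nse[behavioural] / nse[behavioural].sum()
        sims_b = sims[behavioural]

        # likelihood-weighted 95% prediction bounds at each time step
        order = np.argsort(sims_b, axis=0)
        lower, upper = [], []
        for t in range(sims_b.shape[1]):
            sorted_q = sims_b[order[:, t], t]
            cum_w = np.cumsum(weights[order[:, t]])
            lower.append(sorted_q[np.searchsorted(cum_w, 0.025)])
            upper.append(sorted_q[np.searchsorted(cum_w, 0.975)])

        coverage = np.mean((q_obs >= np.array(lower)) & (q_obs <= np.array(upper)))
        print(f"behavioural sets: {behavioural.sum()}, 95% band coverage: {coverage:.2f}")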