
    Analysis of Residential Building Energy Code Compliance for New and Existing Buildings Based on Building Energy

    Currently, the International Energy Conservation Code (IECC) is the most widely used residential building energy code in the United States. Either the IECC or the IECC with amendments has been adopted by 33 states. The latest version of the IECC contains three compliance options: the mandatory, prescriptive, and performance paths. The performance path includes specifications for the standard reference design and the proposed design, both of which are analyzed using whole-building energy simulations. In the performance path, the simulated annual energy cost of the proposed house must be less than the annual energy cost (or source energy usage) of the standard reference house. Unfortunately, most whole-building energy simulation programs are too complicated to be used by building energy code officials or homeowners without special training. To resolve this problem, simplified simulation tools have been developed that require fewer user input parameters. Such simplified software tools have had a significant impact on the increased use of the performance-based code compliance path for residential analysis. However, many of the simplified features may not represent the energy-efficient features found in an existing residence. This may misrepresent the potential energy savings if a homeowner decides to invest in a retrofit to reduce their annual energy costs. Currently, there are building energy simulation validation methods developed by ASHRAE and RESNET, including ASHRAE Standard 140, IEA BESTEST, HVAC BESTEST, and BESTEST-EX. These tests were developed to test the algorithms of building energy performance simulation and require complex inputs and outputs to view the test results. Unfortunately, even though two different building simulation programs may each produce the necessary inputs/outputs for certification, they are rarely tested side-by-side or on actual residences. Furthermore, results from a simplified analysis of a building are rarely compared against a detailed simulation of an existing building. Therefore, there is a need to compare the results of a simplified simulation against a detailed simulation of an existing residence to determine which parameters best represent the existing house, so that more accurate code-compliant simulations can be performed on existing structures. The purpose of this study is to develop an accurate, detailed simulation model of an existing single-family residence and compare it with a simplified building energy simulation of the same residence, to determine which on-site measurements can be made to tune the simplified model so that it better represents the existing residence. Such an improved building energy simulation can be used to better represent the annual energy cost savings from retrofits to an existing building.
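
    As a rough illustration of the performance-path criterion described in this abstract, the following sketch (hypothetical function and cost figures, not code from any compliance tool) shows the comparison a code official ultimately needs: the proposed design's simulated annual energy cost against that of the standard reference design.

```python
# Hypothetical sketch of the IECC performance-path check: the proposed
# design complies when its simulated annual energy cost is less than
# that of the standard reference design. Figures are invented examples.

def performance_path_complies(proposed_annual_cost: float,
                              reference_annual_cost: float) -> bool:
    """Return True if the proposed design's simulated annual energy cost
    is below the standard reference design's simulated cost."""
    return proposed_annual_cost < reference_annual_cost

# Example with made-up numbers: a retrofit lowering the simulated annual
# cost to $1,850 against a $2,000 reference design.
print(performance_path_complies(1850.0, 2000.0))  # True
```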

    Fire Emergency Evacuation from a School Building Using an Evolutionary Virtual Reality Platform

    In the last few years, modern technologies such as numerical simulation, virtual and augmented reality, and agent-based models have proven to be effective tools for studying phenomena that cannot be reproduced experimentally because of cost, inherent hazards, or other constraints (e.g., fire or earthquake emergencies and evacuation from buildings). This paper shows how to integrate a virtual reality platform with numerical simulation tools to reproduce an evolutionary fire emergency scenario, computed in real time from the building information model and fluid dynamics software. Dedicated software was also used to simulate crowd dynamics in the virtual environment in real time during the emergency evacuation process. To demonstrate the applicability of the proposed methodology, the emergency fire evacuation process for an existing school building is presented. The results show that the proposed virtual reality-based system can be employed to reproduce fire emergency scenarios. It can be used to help decision-makers determine emergency plans and to help firefighters, as a training tool, to simulate emergency evacuation actions.
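
    The real-time coupling sketched in this abstract (fire field, crowd model, VR view) can be pictured roughly as the loop below; the class and method names are assumptions made for illustration, not the authors' software.

```python
# Minimal, self-contained sketch (assumed structure, not the authors'
# implementation) of the real-time coupling described above: a fire/smoke
# field, a crowd model reacting to it, and a VR view updated each frame.

class FireModel:
    """Stand-in for the fluid-dynamics fire simulation."""
    def step(self, dt):
        return {"smoke_density": 0.1}  # placeholder field sample

class CrowdModel:
    """Stand-in for the crowd-dynamics simulation."""
    def step(self, dt, smoke):
        # Agents would slow down or reroute as smoke density rises.
        return [{"id": 1, "position": (2.0, 5.0)}]

class VRView:
    """Stand-in for the virtual reality platform."""
    def render(self, smoke, agents):
        print(f"frame: smoke={smoke['smoke_density']:.2f}, agents={len(agents)}")

def run_scenario(fire, crowd, view, dt=0.5, t_end=2.0):
    """Advance fire and crowd models in lock-step and push each frame to VR."""
    t = 0.0
    while t < t_end:
        smoke = fire.step(dt)
        agents = crowd.step(dt, smoke)
        view.render(smoke, agents)
        t += dt

run_scenario(FireModel(), CrowdModel(), VRView())
```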

    Lessons from the evacuation of the World Trade Center, Sept 11th 2001 for the future development of computer simulations

    This paper provides an overview of the state of the art in evacuation simulations. These interactive, computer-based tools have been developed to help the owners and designers of large public buildings assess the risks that occupants might face during emergency egress. The development of the Glasgow Evacuation Simulator (GES) is used to illustrate the existing generation of tools. This system uses Monte Carlo techniques to control individual and group movements during an evacuation. The end-user can interactively open and block emergency exits at any point. It is also possible to alter the priorities that individuals associate with particular exit routes. A final benefit is that the tool can derive evacuation simulations directly from existing architects' models; this reduces the cost of simulations and creates a more prominent role for these tools in the iterative development of large-scale public buildings. Empirical studies have been used to validate the GES system as a tool to support evacuation training. The development of these tools has been informed by numerous human factors studies and by recent accident investigations. For example, the 2003 fire in the Station nightclub in Rhode Island illustrated the way in which most building occupants retrace their steps to an entrance even when there are alternative fire exits. The second half of this paper uses this introduction to criticise the existing state of the art in evacuation simulations. These criticisms are based on a detailed study of the recent findings of the 9/11 Commission (2004). Ten different lessons are identified. Some relate to the need to better understand the role of building management and security systems in controlling egress from public buildings. Others relate to the human factors involved in coordinating distributed groups of emergency personnel who may be physically exhausted by the demands of an evacuation. Arguably the most important findings centre on the need to model the ingress and egress of emergency personnel in these structures. The previous focus of nearly all existing simulation tools has been on the evacuation of building occupants rather than on the safety of first responders.
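
    The Monte Carlo exit-choice behaviour described here might look roughly like the following sketch; the exit names, priority weights, and data layout are assumptions for illustration and are not taken from the GES source.

```python
# Illustrative Monte Carlo exit-choice step in the spirit of the Glasgow
# Evacuation Simulator description above (names and weighting assumed).

import random

def choose_exit(exits, priorities):
    """Pick an exit at random, weighted by its per-route priority;
    exits marked as blocked are never chosen."""
    open_exits = [e for e in exits if not e.get("blocked", False)]
    weights = [priorities.get(e["name"], 1.0) for e in open_exits]
    return random.choices(open_exits, weights=weights, k=1)[0]

exits = [{"name": "main entrance"},
         {"name": "fire exit A", "blocked": True},
         {"name": "fire exit B"}]
# Occupants strongly prefer retracing their steps to the entrance,
# echoing the Station nightclub observation cited above.
priorities = {"main entrance": 5.0, "fire exit B": 1.0}
print(choose_exit(exits, priorities)["name"])
```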

    A Cryogenic Fluid System Simulation in Support of Integrated Systems Health Management

    Simulations serve as important tools throughout the design and operation of engineering systems. In the context of systems health management, simulations serve many uses. For one, the underlying physical models can be used by model-based health management tools to develop diagnostic and prognostic models. These simulations should incorporate both nominal and faulty behavior, with the ability to inject various faults into the system. Such simulations can therefore be used for operator training, in both nominal and faulty situations, as well as for developing and prototyping health management algorithms. In this paper, we describe a methodology for building such simulations. We discuss the design decisions and tools used to build a simulation of a cryogenic fluid test bed, and how it serves as a core technology for systems health management development and maturation.
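
    A minimal sketch of the fault-injection idea described above, assuming a toy component model (the valve, fault mode, and method names are illustrative and not taken from the paper's simulation code):

```python
# Hedged sketch of fault injection in a simulation component: a nominal
# model wrapped so that a fault mode can be switched on during a run.

class ValveModel:
    """Toy cryogenic valve: passes the commanded flow when nominal,
    passes nothing when a 'stuck_closed' fault has been injected."""
    def __init__(self):
        self.fault = None

    def inject_fault(self, mode: str):
        self.fault = mode

    def step(self, commanded_flow: float) -> float:
        if self.fault == "stuck_closed":
            return 0.0
        return commanded_flow

valve = ValveModel()
print(valve.step(1.2))            # nominal behaviour: 1.2
valve.inject_fault("stuck_closed")
print(valve.step(1.2))            # faulty behaviour: 0.0
```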

    nanoHUB.org: A Gateway to Undergraduate Simulation-Based Research in Materials Science and Related Fields

    Our future engineers and scientists will likely be required to use advanced simulations to solve many of tomorrow's challenges in nanotechnology. To prepare students to meet this need, the Network for Computational Nanotechnology (NCN) provides simulation-focused research experiences for undergraduates at an early point in their educational path, to increase the likelihood that they will ultimately complete a doctoral program. The NCN summer research program currently serves over 20 undergraduate students per year who are recruited nationwide and selected by NCN and the faculty for aptitude in their chosen field within STEM, as well as complementary skills such as coding and written communication. Under the guidance of graduate student and faculty mentors, undergraduates modify or build nanoHUB simulation tools for exploring interdisciplinary problems in materials science and engineering, and related fields. While the summer projects exist within an overarching research context, the specific tasks that NCN undergraduate students engage in range from modifying existing tools to building new tools for nanoHUB and using them to conduct original research. Simulation tool development takes place within nanoHUB, using nanoHUB's workspace, computational clusters, and additional training and educational resources. One objective of the program is for the students to publish their simulation tools on nanoHUB. These tools can be accessed and executed freely from around the world using a standard web browser, and students can remain engaged with their work beyond the summer and into their careers. In this work, we will describe the NCN model for undergraduate summer research. We believe that our model is one that can be adopted by other universities, and will discuss the potential for others to engage undergraduate students in simulation-based research using free nanoHUB resources.

    Automatic visualization and control of arbitrary numerical simulations

    Authors’ preprint version as submitted to ECCOMAS Congress 2016, Minisymposium 505 - Interactive Simulations in Computational Engineering. Abstract: Visualization of numerical simulation data has become a cornerstone for many industries and research areas today. A large amount of software support exists, but it is usually tied to specific problem domains or simulation platforms. However, numerical simulations have commonalities in the building blocks of their descriptions (e.g., dimensionality, range constraints, sample frequency). Instead of encoding these descriptions and their meaning into software architectures, we propose to base their interpretation and evaluation on a data-centric model. This approach draws much inspiration from the work of the IEEE Simulation Interoperability Standards Group, as currently applied in distributed (military) training and simulation scenarios, and seeks to extend those ideas. By using an extensible, self-describing protocol format, simulation users as well as simulation-code providers would be able to express the meaning of their data even if no access to the underlying source code is available or if new and unforeseen use cases emerge. A protocol definition will allow simulation-domain experts to describe constraints that can be used to automatically create appropriate visualizations of simulation data and control interfaces. Potentially, this will enable leveraging innovations on both the simulation and visualization sides of the problem continuum. We envision the design and development of algorithms and software tools for the automatic visualization of complex data from numerical simulations executed on a wide variety of platforms (e.g., remote HPC systems, local many-core or GPU-based systems). We also envisage using this automatically gathered information to control (or steer) the simulation while it is running, as well as providing the ability to fine-tune representational aspects of the visualizations produced.
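
    A minimal, assumed example of what such a self-describing descriptor might contain (field names invented for illustration): the producer declares dimensionality, range constraints, and sample frequency, so a generic client can pick a sensible default visualization without access to the simulation source code.

```python
# Hypothetical self-describing data descriptor in the spirit of the
# protocol discussed above; keys and values are illustrative only.

import json

descriptor = {
    "quantity": "temperature",
    "units": "K",
    "dimensionality": 3,          # 3-D scalar field
    "range": [250.0, 400.0],      # expected value bounds
    "sample_frequency_hz": 10.0,  # how often frames are emitted
}

def suggest_visualization(desc: dict) -> str:
    """Choose a default visualization from the declared metadata alone."""
    if desc["dimensionality"] == 3:
        return "volume rendering, colour map clamped to declared range"
    if desc["dimensionality"] == 2:
        return "pseudocolour image"
    return "line plot over time"

print(json.dumps(descriptor, indent=2))
print(suggest_visualization(descriptor))
```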

    APPLICATION OF CLOUD-BASED SPREADSHEETS TO ARTIFICIAL NEURAL NETWORK MODELLING

    The article substantiates the need to develop methods for computer simulation of neural networks in a spreadsheet environment, and a systematic review of the application of spreadsheets to simulating artificial neural networks is performed. The authors distinguish basic approaches to teaching network computer simulation in a spreadsheet environment: joint use of spreadsheets and dedicated neural network simulation tools; application of third-party spreadsheet add-ins; development of macros using the spreadsheets' embedded languages; use of standard spreadsheet add-ins for non-linear optimization; and creation of neural networks in the spreadsheet environment without add-ins or macros. It is shown that to acquire neural simulation competences in the spreadsheet environment, one should master models based on a historical and genetic approach. The article considers ways of building neural network models in cloud-based spreadsheets (Google Sheets). The model is based on the problem of classifying the multidimensional data provided in "The Use of Multiple Measurements in Taxonomic Problems" by R. A. Fisher. Edgar Anderson's role in collecting and preparing the data in the 1920s–1930s is discussed, as well as some peculiarities of data selection.
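
    For comparison with the spreadsheet approach, a minimal Python sketch of the same classification exercise on Fisher's Iris data (using scikit-learn rather than Google Sheets formulae) might look like the following; it illustrates the modelling task, not the workbook described in the article.

```python
# Minimal sketch of the classification task referenced above: a small
# neural network classifying Fisher's Iris data. This stands in for what
# a hand-built spreadsheet model with a few neurons would compute.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# One small hidden layer, trained with the default optimizer.
clf = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```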

    Simulation, no problem, of course we offer this service! (observations on firms who have worked to make this true)

    The paper focuses on the practical experiences of a number of professional firms striving to use simulation to deliver information of value to their clients. It exposes issues such as limitations in existing working practices and the mismatch in language and expectations between facilitators and trainees. The paper also discusses the differences observed between practices that implemented simulation incrementally and firms that wished to "jump in at the deep end". Lastly, it addresses the dilemma of how to successfully fit simulation tools into the already busy schedules and overloaded programmes of design practices.

    Business success through process based application of simulation

    Progressive design practices are increasingly cognisant of the potential of building energy simulation to assist the delivery of energy-efficient, sustainable buildings. However, the success of any building performance assessment hinges on the capabilities of the tool; the collective competences of the team formed to apply it; and, crucially, the existence of an in-house framework within which simulation can be applied with confidence (McElroy and Clarke 1999). There is also a need for the professions to set up mechanisms that facilitate dialogue with vendors in order to influence tool capabilities. On the related matter of building an in-house competency and a framework for application, the two core issues facing the professions are: i) the need to develop in-house procedures for the management of simulation; and ii) quality assurance of the related models and appraisal results.