
    Using simulations and artificial life algorithms to grow elements of construction

    'In nature, shape is cheaper than material' is a common truth for most plants and other living organisms, even though they may not recognize it. In all living forms, shape is more or less directly linked to the forces that acted upon the organism during its growth. Trees and bones concentrate their material where they need strength and stiffness, placing tissue in the required locations through a process of self-organization. We can study nature to find solutions to design problems: we pick a solution already observed in the organic world that closely resembles our design problem and use it constructively. First we examine and disassemble it, sorting out the conclusions and ideas discovered; then we perform an act of 'reverse engineering', putting it all back together in a way that suits our design needs. Very simple ideas copied from nature produce complexity and exhibit self-organization capabilities when applied at larger scale and in greater numbers. Computer algorithms of simulated artificial life help us capture them, understand them well and use them where needed. This investigation follows the question: how can we use methods seen in nature to simulate the growth of construction elements? Different ways of extracting ideas from the world of biology are presented, followed by several techniques of simulated emergence. Specific focus is put on the computational modelling of natural phenomena and on the differences between developmental and non-developmental techniques. The resulting 3D models are shown and explained.
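
    As a rough illustration of the 'deposit material where it is needed' idea discussed above, the following sketch (hypothetical code, not from the paper) grows a 2D structure on a grid: material is added next to existing material with probability proportional to a made-up demand field, so the shape self-organizes toward the load. The demand field here is static; a developmental variant would update it as the structure grows.

# A minimal sketch (not the authors' code) of growth driven by a scalar
# "demand" field: material is deposited next to existing material, with
# probability proportional to how strongly that site is "loaded".
# The demand field is a made-up example (a point load above a support).
import numpy as np

rng = np.random.default_rng(0)
H, W = 30, 40                        # grid height and width
grid = np.zeros((H, W), dtype=bool)  # True = material present
grid[H - 1, :] = True                # bottom row acts as the ground/support

# Hypothetical demand field: strongest near a "load point" at the top.
load_point = np.array([2.0, W / 2])
ys, xs = np.mgrid[0:H, 0:W]
demand = 1.0 / (1.0 + np.hypot(ys - load_point[0], xs - load_point[1]))

for _ in range(400):                 # deposit 400 cells of material
    # candidate sites: empty cells with at least one filled 4-neighbour
    nb = np.zeros_like(grid)
    nb[1:, :] |= grid[:-1, :]
    nb[:-1, :] |= grid[1:, :]
    nb[:, 1:] |= grid[:, :-1]
    nb[:, :-1] |= grid[:, 1:]
    candidates = np.argwhere(nb & ~grid)
    if len(candidates) == 0:
        break
    w = demand[candidates[:, 0], candidates[:, 1]]
    pick = candidates[rng.choice(len(candidates), p=w / w.sum())]
    grid[pick[0], pick[1]] = True

# crude ASCII rendering of the grown structure
print("\n".join("".join("#" if c else "." for c in row) for row in grid))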

    Exploring the Interplay between CAD and FreeFem++ as an Energy Decision-Making Tool for Architectural Design

    The energy modelling software tools commonly used for architectural purposes do not allow straightforward real-time integration with architectural design programs. In addition, the exterior spaces surrounding a building, including its inner courtyards, rarely receive a specific treatment distinguishing them from the general external temperature in thermal simulations. This is a clear disadvantage when it comes to streamlining the design process for whole-building energy optimization. In this context, the present study aims to demonstrate the advantages of the open-source program FreeFem++ for performing simulations in architectural environments, including microclimate tests that describe the interactions between a building's architecture and its local exterior. The great potential of this mathematical tool can be realized through its complete integration with CAD (Computer-Aided Design) software such as SketchUp or AutoCAD. To establish the suitability of FreeFem++ for these simulations, the most widely employed energy simulation tools able to consider a proposed architectural geometry in a specific environment are compared. On the basis of this analysis, FreeFem++ emerges as the program with the best features for the thermal performance simulation of these specific outdoor spaces, its main current limitation being the lack of easy interaction with architectural drawing programs. The main contribution of this research is, accordingly, the enhancement of FreeFem++ usability through a simple, intuitive method for creating building geometries and their respective meshes (pre-processing). FreeFem++ is also used as a data-analysis (post-processing) tool to help engineers and architects with building energy-efficiency tasks.
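
    For orientation only, the sketch below (plain Python, not FreeFem++ and not the coupling described in the paper) solves the kind of steady-state heat problem involved: a 2D section whose exterior boundary and inner courtyard are held at different temperatures, relaxed with Jacobi iteration on a uniform grid. FreeFem++ would instead solve such a boundary-value problem with finite elements on a mesh generated from the CAD geometry.

# Toy illustration of a steady-state thermal boundary-value problem:
# Laplace's equation on a square section, exterior walls and an inner
# "courtyard" held at fixed (hypothetical) temperatures.
import numpy as np

N = 60
T = np.full((N, N), 20.0)                      # initial guess, degC
T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 5.0  # exterior boundary: 5 degC
court = (slice(25, 35), slice(25, 35))         # hypothetical warm courtyard

for _ in range(5000):                          # Jacobi relaxation
    T[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                            T[1:-1, 2:] + T[1:-1, :-2])
    T[court] = 25.0                            # courtyard held at 25 degC

print("temperature midway between wall and courtyard:",
      round(float(T[30, 12]), 1), "degC")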

    Simulations as Part of the BIM Concept in Urban Management

    Smart city management is part of the Smart Cities concept and can be an essential element for further development in this area. The BIM concept, based on a 3D model, shared data and cooperation that benefits all parties, is expanding rapidly in civil engineering. However, the BIM concept is so broad, and offers so many possibilities, that its development will take many years. One of these possibilities is simulation, which is not yet widely used but can describe different situations and conditions very well. Buildings or cities can then be prepared for these conditions in advance, preventing crises or unnecessary costs. Simulations and the results obtained from them can help in the decision-making phase: they predict problems or situations that may occur during the life cycle and thus help prevent them. Based on the information obtained, we can objectively decide on the design solution, the material, the internal arrangement or, for example, the location of the building within its surroundings.

    Simulation of reaction-diffusion processes in three dimensions using CUDA

    Numerical solution of reaction-diffusion equations in three dimensions is one of the most challenging applied mathematical problems. Since these simulations are very time consuming, any idea or strategy aiming at the reduction of CPU time is an important topic of research. A general and robust approach is the parallelization of source code. Recently, the technological development of graphics hardware has made it possible to use desktop video cards to solve numerically intensive problems. We present a powerful parallel computing framework for solving reaction-diffusion equations numerically on Graphics Processing Units (GPUs) with CUDA. Four different reaction-diffusion problems were solved: (i) diffusion of a chemically inert compound, (ii) Turing pattern formation, (iii) phase separation in the wake of a moving diffusion front and (iv) air pollution dispersion; in addition, both the Shared method and the Moving Tiles method were tested. Our results show that the parallel implementation achieves typical speed-ups of 5-40 times compared to a single-threaded CPU implementation on a 2.8 GHz desktop computer.
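
    A minimal CPU sketch of the numerical scheme (NumPy, not the paper's CUDA code; parameters are illustrative) for a two-species reaction-diffusion system, using the Gray-Scott model as a stand-in for the Turing-pattern case:

# Explicit finite-difference time stepping for the Gray-Scott
# reaction-diffusion model on a 2D periodic grid (illustrative only).
import numpy as np

n = 128
U = np.ones((n, n)); V = np.zeros((n, n))
U[n//2-8:n//2+8, n//2-8:n//2+8] = 0.50          # perturb the centre
V[n//2-8:n//2+8, n//2-8:n//2+8] = 0.25
Du, Dv, F, k, dt = 0.16, 0.08, 0.035, 0.065, 1.0  # assumed parameter values

def lap(A):
    # five-point Laplacian with periodic boundaries
    return (np.roll(A, 1, 0) + np.roll(A, -1, 0) +
            np.roll(A, 1, 1) + np.roll(A, -1, 1) - 4.0 * A)

for _ in range(5000):
    UVV = U * V * V
    U += dt * (Du * lap(U) - UVV + F * (1.0 - U))
    V += dt * (Dv * lap(V) + UVV - (F + k) * V)

print("V range after 5000 steps:", float(V.min()), float(V.max()))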

    Controlling Ozone and Fine Particulates: Cost Benefit Analysis with Meteorological Variability

    In this paper, we develop an integrated cost-benefit analysis framework for ozone and fine particulate control, accounting for variability and uncertainty. The framework includes air quality simulation, sensitivity analysis, stochastic multi-objective air quality management, and stochastic cost-benefit analysis. This paper has two major contributions. The first is the development of stochastic source-receptor (S-R) coefficient matrices for ozone and fine particulate matter using an advanced air quality simulation model (URM-1ATM) and an efficient sensitivity algorithm (DDM-3D). The second is a demonstration of this framework for alternative ozone and PM2.5 reduction policies. Alternative objectives of the stochastic air quality management model include optimization of net social benefits and maximization of the reliability of satisfying certain air quality goals. We also examine the effect of accounting for distributional concerns.
    Keywords: ambient air, ozone, particulate matter, risk management, public policy, cost-benefit analysis, variability and uncertainty, stochastic simulation, stochastic multi-objective programming, decision-making, National Ambient Air Quality Standards
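
    The role of the stochastic S-R coefficient matrices can be sketched as follows (all numbers and distributions below are hypothetical, not the paper's data or models): concentration improvements at receptors are a linear map of emission reductions, meteorological variability is represented by sampling the coefficients, and net benefits are evaluated over the samples.

# Schematic Monte Carlo cost-benefit calculation with a sampled
# source-receptor matrix (illustrative numbers only).
import numpy as np

rng = np.random.default_rng(1)
n_draws = 10_000

# hypothetical mean S-R coefficients (ug/m3 per kton of emissions reduced)
S_mean = np.array([[0.8, 0.3, 0.1],
                   [0.2, 0.6, 0.4]])
reduction = np.array([10.0, 5.0, 8.0])   # ktons reduced at each source
control_cost = 12.0                      # total control cost, million $
benefit_per_ug = 1.5                     # million $ per ug/m3 per receptor

net_benefits = np.empty(n_draws)
for i in range(n_draws):
    # meteorological variability: lognormal perturbation of each coefficient
    S = S_mean * rng.lognormal(mean=0.0, sigma=0.3, size=S_mean.shape)
    conc_drop = S @ reduction            # ug/m3 improvement at each receptor
    net_benefits[i] = benefit_per_ug * conc_drop.sum() - control_cost

print("expected net benefit:", round(float(net_benefits.mean()), 2), "M$")
print("P(net benefit > 0)  :", round(float((net_benefits > 0).mean()), 3))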

    Parallel Implementation of Lossy Data Compression for Temporal Data Sets

    Many scientific data sets contain temporal dimensions: they store information for the same spatial locations at different time stamps. Some of the largest temporal datasets are produced by parallel computing applications such as simulations of climate change and fluid dynamics. Temporal datasets can be very large and take a long time to transfer between storage locations. Using data compression techniques, files can be transferred faster and storage space can be saved. NUMARCK is a lossy data compression algorithm for temporal data sets that learns the emerging distributions of element-wise change ratios along the temporal dimension and encodes them into an index table for a concise representation. This paper presents a parallel implementation of NUMARCK. Evaluated with six data sets obtained from climate and astrophysics simulations, parallel NUMARCK achieved scalable speedups of up to 8788 when running 12800 MPI processes on a parallel computer. We also compare its compression ratios against those of two lossy data compression algorithms, ISABELA and ZFP. The results show that NUMARCK achieves higher compression ratios than ISABELA and ZFP.
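
    The core idea described above can be sketched in a few lines (a much-simplified, single-process illustration, not the NUMARCK or parallel implementation): change ratios between consecutive time steps are quantized into a small table of bin centres, so each element is stored as a short index into that table.

# Simplified change-ratio quantization between two time steps
# (illustrative, synthetic data).
import numpy as np

rng = np.random.default_rng(2)
prev = rng.normal(loc=100.0, scale=5.0, size=100_000)    # time step t-1
curr = prev * (1.0 + rng.normal(0.0, 0.01, prev.size))   # time step t

ratio = (curr - prev) / prev                  # element-wise change ratios
n_bins = 256                                  # -> one index byte per element
edges = np.quantile(ratio, np.linspace(0.0, 1.0, n_bins + 1))
index = np.clip(np.searchsorted(edges, ratio) - 1, 0, n_bins - 1)
centres = np.array([ratio[index == b].mean() if np.any(index == b) else 0.0
                    for b in range(n_bins)])

recon = prev * (1.0 + centres[index])         # lossy reconstruction of curr
rel_err = np.abs(recon - curr) / np.abs(curr)
print("max relative error:", float(rel_err.max()))
print("storage per element: 1 index byte + a shared table of", n_bins, "centres")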