Air pollution modelling using a graphics processing unit with CUDA
The Graphics Processing Unit (GPU) is a powerful tool for parallel computing.
In the past years the performance and capabilities of GPUs have increased, and
the Compute Unified Device Architecture (CUDA) - a parallel computing
architecture - has been developed by NVIDIA to utilize this performance in
general purpose computations. Here we show for the first time a possible
application of GPUs to environmental studies, serving as a basis for decision-making
strategies. A stochastic Lagrangian particle model has been developed in
CUDA to estimate the transport and transformation of radionuclides from
a single point source during an accidental release. Our results show that the
parallel implementation achieves typical speedups on the order of
80-120 times over a single-threaded CPU implementation on a 2.33
GHz desktop computer. Only very small differences have been found between the
results obtained from GPU and CPU simulations, which are comparable with the
effect of stochastic transport phenomena in the atmosphere. The relatively high
speedup, with no additional cost to maintain this parallel architecture, could
result in wide usage of GPUs for diversified environmental applications in the
near future.
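The stochastic Lagrangian particle transport described above can be illustrated by a simple random-walk step: each particle is advected by the wind and displaced by a Gaussian perturbation representing turbulent diffusion. This is a minimal CPU sketch in Python, not the authors' CUDA code; the wind field, diffusivity, particle count and time step are all assumed values.

```python
import numpy as np

def lagrangian_step(pos, wind, D, dt, rng):
    """Advance particle positions one time step: deterministic advection
    by the wind field plus a Gaussian random displacement representing
    turbulent diffusion (variance 2*D*dt per axis)."""
    drift = wind * dt
    noise = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=pos.shape)
    return pos + drift + noise

rng = np.random.default_rng(0)
# 10,000 particles released at the origin (a single point source)
pos = np.zeros((10_000, 2))
wind = np.array([5.0, 0.0])   # assumed uniform 5 m/s wind along x
D, dt = 10.0, 1.0             # assumed eddy diffusivity (m^2/s), time step (s)
for _ in range(100):
    pos = lagrangian_step(pos, wind, D, dt, rng)
# the plume centre drifts downwind while its spread grows as sqrt(2*D*t)
```

Because each particle's update is independent of all the others, the loop over particles maps directly onto one GPU thread per particle, which is what makes speedups of the magnitude reported above possible.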
Numerical Approximation of the Transport Equation: Comparison of Five Positive Definite Algorithms
IIASA's Regional Acidification INformation and Simulation (RAINS) model will be used to develop and assess international control strategies to reduce emissions of acidifying pollutants. These strategies will involve the expenditure of large sums of money; it is important, therefore, to assess the effect of uncertainties in the model on its results. An important component of the RAINS model is its atmospheric transport component; this paper reports the results of examining several algorithms for solution of the atmospheric transport equation. It also represents a joint effort between IIASA scientists and those at the Institute of Meteorology and Water Management in Warsaw and the Central Institute for Meteorology and Geodynamics in Vienna.
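A simple member of the class of positive definite advection algorithms compared in such studies is the donor-cell (first-order upwind) scheme, which can never generate negative concentrations when the Courant number stays in [0, 1]. The sketch below is illustrative only, with an assumed grid, velocity and initial condition; it is not one of the five algorithms from the paper.

```python
import numpy as np

def upwind_advect(c, u, dx, dt):
    """One donor-cell upwind step for 1D advection dc/dt + u*dc/dx = 0
    with periodic boundaries. Each new value is a convex combination of
    old values, so the scheme is positive definite for 0 <= u*dt/dx <= 1."""
    cfl = u * dt / dx
    assert 0.0 <= cfl <= 1.0, "Courant condition violated"
    return c - cfl * (c - np.roll(c, 1))

c = np.zeros(100)
c[10:20] = 1.0                  # a block of pollutant
for _ in range(50):
    c = upwind_advect(c, u=1.0, dx=1.0, dt=0.5)
# total mass is conserved exactly and no negative values appear, at the
# cost of numerical diffusion smearing the block edges
```

The positivity comes at the price of strong numerical diffusion, which is precisely the kind of trade-off an algorithm inter-comparison of this sort quantifies.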
Development of a GPGPU accelerated tool to simulate advection-reaction-diffusion phenomena in 2D
Computational models are powerful tools for the study of environmental systems, playing a fundamental
role in several fields of research (hydrological sciences, biomathematics, atmospheric
sciences, geosciences, among others). Most of these models require high computational capacity,
especially when one considers high spatial resolution and the application to large areas.
In this context, the exponential increase in computational power brought by General Purpose
Graphics Processing Units (GPGPU) has drawn the attention of scientists and engineers to
the development of low cost and high performance parallel implementations of environmental
models. In this research, we apply GPGPU computing for the development of a model that describes
the physical processes of advection, reaction and diffusion. This presentation is held in
the form of three self-contained articles. In the first one, we present a GPGPU implementation
for the solution of the 2D groundwater flow equation in unconfined aquifers for heterogeneous
and anisotropic media. We implement a finite difference solution scheme based on the Crank-
Nicolson method and show that the GPGPU accelerated solution implemented using CUDA
C/C++ (Compute Unified Device Architecture) greatly outperforms the corresponding serial
solution implemented in C/C++. The results show that the GPGPU-accelerated implementation is
capable of delivering up to a 56-fold speedup in the solution process using an ordinary office
computer. In the second article, we study the application of a diffusive-logistic growth (DLG)
model to the problem of forest growth and regeneration. The study focuses on vegetation belonging
to preservation areas, such as riparian buffer zones. The study was developed in two
stages: (i) a methodology based on Artificial Neural Network Ensembles (ANNE) was applied
to evaluate the width of riparian buffer required to filter 90% of the residual nitrogen; (ii) the
DLG model was calibrated and validated to generate a prognostic of forest regeneration in riparian
protection bands considering the minimum widths indicated by the ANNE. The solution
was implemented in GPGPU and it was applied to simulate the forest regeneration process for
forty years on the riparian protection bands along the Ligeiro river, in Brazil. The results from
calibration and validation showed that the DLG model provides fairly accurate results for the
modelling of forest regeneration. In the third manuscript, we present a GPGPU implementation
of the solution of the advection-reaction-diffusion equation in 2D. The implementation is
designed to be general and flexible to allow the modeling of a wide range of processes, including
those with heterogeneity and anisotropy. We show that simulations performed in GPGPU
allow the use of mesh grids containing more than 20 million points, corresponding to an area of
18,000 km² at the standard 30 m Landsat image resolution.
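The Crank-Nicolson scheme used in the first article can be illustrated in one dimension. The sketch below is a plain NumPy illustration with a dense solve rather than the thesis's CUDA C/C++ implementation, and all grid parameters are assumed; a production code would use a tridiagonal (or, in 2D, ADI or iterative) solver.

```python
import numpy as np

def crank_nicolson_step(h, alpha):
    """One Crank-Nicolson step for dh/dt = D * d2h/dx2 with fixed
    (Dirichlet) boundary values; alpha = D*dt/dx**2. The scheme averages
    the implicit and explicit Laplacians, which gives second-order
    accuracy in time and unconditional stability."""
    n = len(h)
    A = np.eye(n)                       # left-hand (implicit) operator
    B = np.eye(n)                       # right-hand (explicit) operator
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = -alpha / 2
        A[i, i] = 1 + alpha
        B[i, i - 1] = B[i, i + 1] = alpha / 2
        B[i, i] = 1 - alpha
    return np.linalg.solve(A, B @ h)

h = np.zeros(51)
h[25] = 1.0                             # initial spike of head/concentration
for _ in range(200):
    h = crank_nicolson_step(h, alpha=0.5)
# the spike diffuses into a smooth, symmetric bell-shaped profile
```

In the GPU version described above, the per-row matrix assembly and the linear solve are the parallel workloads; the time loop itself remains sequential.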
Local to Global Multi-Scale Multimedia Modeling of Chemical Fate and Population Exposure
To assess environmental and human exposure to chemical emissions, two types of approaches are available: 1. intermediate- to high-resolution, substance/location-specific analyses, and 2. lower-resolution, less specific analyses aiming for broad coverage. The first category is time- and resource-intensive, which limits its utility, while the second is less accurate but allows evaluation of large numbers of substances/situations. Neither is well suited for analyzing local to global population exposure. We need a multi-scale approach of intermediate complexity that combines the advantages of both: high resolution when relevant, the ability to evaluate large numbers of substances, and a level of accuracy that is “useful” (for decision-makers).
This thesis aims to 1. develop a multi-scale, multimedia fate and transport, and multi-pathway population exposure modeling framework, 2. evaluate it using large-scale inventories of emissions and measured environmental concentrations, 3. evaluate local to global population exposure associated with large sets of point sources covering a wide variety of local contexts (e.g. up/down-wind/stream from large populations, important water bodies or agricultural resources), and 4. simulate a large national inventory of emissions and perform multi-media source apportionment.
Coupling a geographic information system and a computation engine, we develop the Pangea framework, which offers a unique ability to discretize the globe using three-dimensional multi-scale grids, to overlay Eulerian fate and transport multimedia models, and to compute multi-pathway population exposure.
We first apply this framework to predict the fate and transport of home and personal care chemicals in all of Asia. This study provides a large-scale high-resolution spatial inventory of emissions and a large data set of ~1,600 monitoring values. We compare predicted environmental concentrations (PECs) and measurements and find good agreement for the long-lived triclosan in fresh water (Pearson r=0.82), moderate agreement for shorter-lived substances, and a large discrepancy specifically for parabens in sediments. This study highlights the limitation of the present underlying gridded hydrological data set (WWDRII) when comparison with measurements at monitoring sites is required, which prompts the evaluation of a finer, catchment-based hydrological data set (HydroBASINS).
We then focus on human exposure and the evolution of the population intake fraction with the distance from the source. We simulate emissions from 126 point sources (stacks of solid waste treatment plants) in France, and compute radial distributions of population intake fractions through inhalation and ingestion. We determine that a substantial fraction of emissions may be taken in by the population farther than 100 km away from point sources (78.5% of the inhaled benzene and 54.1% of the ingested 2,3,7,8-TCDD). We demonstrate the feasibility of simulating large numbers of emission scenarios by extending the study to 10,000 point sources.
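The population intake fraction reported in such studies is the mass taken in by the exposed population per unit mass emitted. A toy inhalation-only illustration, with entirely assumed concentrations, populations and breathing rate:

```python
def intake_fraction(concentrations, populations, breathing_rate, emission_rate):
    """Inhalation intake fraction: total mass inhaled by the exposed
    population per unit mass emitted (dimensionless). concentrations in
    kg/m^3 at each receptor, populations in persons, breathing_rate in
    m^3/(person*day), emission_rate in kg/day."""
    intake = sum(c * p * breathing_rate
                 for c, p in zip(concentrations, populations))
    return intake / emission_rate

# hypothetical radial rings around a stack: concentration falls with
# distance while the population per ring grows
conc = [1e-9, 2e-10, 5e-11]        # kg/m^3 in each ring (assumed)
pop = [1e4, 1e5, 1e6]              # persons per ring (assumed)
iF = intake_fraction(conc, pop, breathing_rate=13.0, emission_rate=1.0)
```

In this hypothetical example the farthest ring contributes the majority of the total intake, echoing the finding above that a substantial share of emissions is taken in by the population far from the source.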
We finally extend the previous emitter-oriented studies with receptor-oriented analyses (source apportionment). We simulate 43 substances emitted from 4,101 point sources defined by the Australian National Pollutant Inventory for 2014-2015. We compute population exposure and severity (DALY). Formaldehyde, benzene, and styrene are the three top contributors in terms of DALYs. We demonstrate the technical feasibility of multimedia, large-scale source apportionment.
This research opens new perspectives in spatial, local to large-scale fate and exposure modeling. The flexibility of Pangea allows building project-specific model geometries and re-analyzing projects as data availability evolves. Major limitations come from the underlying first-order fate and transport models and from the limited availability of global spatial data sets.
PhD, Environmental Health Sciences, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/138610/1/wannaz_1.pd
Technology Needs Assessment of an Atmospheric Observation System for Multidisciplinary Air Quality/Meteorology Missions, Part 2
The technology advancements that will be necessary to implement the atmospheric observation systems are considered. Upper and lower atmospheric air quality and meteorological parameters necessary to support the air quality investigations were included. The technology needs were found predominantly in areas related to sensors and the measurement of air quality and meteorological parameters.
The application of an Eulerian chemical and transport model (CMAQ) at fine scale resolution to the UK
Present-day numerical air quality models are considered essential tools for predicting
future air pollutant concentrations and depositions, contributing to the development
of new effective strategies for the control and the reduction of pollutant emissions.
They simulate concentrations and depositions of pollutants on a wide range of scales
(global, national, urban scale) and they are used for identifying critical areas,
integrating measurements and achieving a deeper scientific understanding of the
physical and chemical processes involving air pollutants in the atmosphere.
The use of comprehensive air quality models started in the late 1970s, and since then
their development has advanced rapidly, hand in hand with the growth in
computational resources. Today more and more complex and computationally
expensive numerical models are available to the scientific community. One of these
tools is the Community Multi-Scale Air Quality System (CMAQ), developed in the
1990s by the US Environmental Protection Agency (EPA) and currently widely
applied across the world for air pollution studies. This work focuses on the
application of CMAQ to the United Kingdom, for estimating concentrations and
depositions of acidifying pollutants (NOX, NHX, SOX) on a national scale.
The work is divided into seven chapters, the first one describing the main issues
related to the emission and dispersion in the atmosphere of acidifying species. It also
includes a brief overview of the main international policies signed in the last thirty
years in order to reduce the problem of acidification in Europe, as well as a brief
description of some models mentioned in this thesis.
The second one describes the main features of CMAQ and addresses some issues
such as the use of a nesting process for achieving temporally and spatially resolved
boundary concentrations, and the implementation of the model on parallel machines,
essential for reducing the simulation computing time. It also describes how this study
is part of a wider context, which includes the application of CMAQ in the United
Kingdom by other users with different scientific purposes (aerosols processes, air
quality in the urban area of London, contribution of UK power stations to
concentrations and depositions etc.).
The third part of the thesis focuses on the application and evaluation over the United
Kingdom of the 5th Generation Mesoscale Model MM5, used for providing 3D
meteorological input fields to CMAQ. This study was performed assuming that an
accurate representation of depositions and concentrations of chemical species cannot
be achieved without a good estimate of the meteorological parameters involved in
most of the atmospheric processes (transport, photochemistry, aerosol processes,
cloud processes etc.).
The fourth part of the thesis describes the preliminary implementation of the Sparse
Matrix Operational Kernel Emission System (SMOKE) in the United Kingdom. The
processor provides input emissions to CMAQ. The use of SMOKE is usually avoided
in CMAQ applications outside America, and CMAQ input emission files are instead
prepared with other software, because the model requires radical changes to be
applied outside North and Central America. Some of
these changes have been made in this study such as the adaptation of the European
emission inventory EMEP and the UK National Inventory NAEI to the modelling
system for point and area sources, the introduction of new European emission
temporal profiles in substitution of the American ones and the introduction of new
geographical references for the spatial allocation of emissions.
In the fifth chapter the results of CMAQ application over the UK are discussed. The
study focuses on NOX, SO2, NH3 and NH4+. Maps of concentration are presented and
modelled data are compared to measurements from two different air quality networks
in the UK. An analysis of the performance of CMAQ over the UK is also performed.
In the sixth chapter an annual inter-comparison between CMAQ and the Lagrangian
transport model FRAME is carried out. Maps of annual wet deposition fluxes of
NHX, NOY and SOX for year 1999 are presented. The results of both models are
compared to one another and they are also compared to values from the UK official
data set CBED.
Finally, the last chapter suggests the work that has to be done in the future with
CMAQ and it summarizes the conclusions
Modelling heat transfer and respiration of occupants in indoor climate
Although the terms "Human Thermal Comfort" and "Indoor Air Quality (IAQ)" can be highly subjective, they still dictate the indoor climate design (HVAC design) of a building. To evaluate human thermal comfort and IAQ, one of three main tools is used: a) directly questioning the subjects about their thermal and air quality sensation (voting, sampling etc.), b) measuring human thermal comfort by recording physical parameters such as relative humidity, air and radiation temperature, air velocities and concentration gradients of pollutants, or c) using numerical simulations either including or excluding detailed thermo-physiological models. The first two approaches can only be applied in the post-commissioning and/or testing phases of the building, whereas numerical techniques can be employed at any stage of the building design. With the rapid development in computational hard- and software technology, the costs involved in numerical studies have fallen compared to detailed tests. Employing numerical modelling to investigate human thermal comfort and IAQ does, however, demand thorough verification and validation studies. Such studies are used to understand the limitations and applicability of numerical modelling of human thermal comfort and IAQ in indoor climates.
This PhD research is an endeavour to verify, validate and apply numerical simulation for modelling heat transfer and respiration of occupants in indoor climates. Along with the investigations concerning convective and radiative heat transfer between the occupants and their surroundings, the work focuses on detailed respiration modelling of sedentary human occupants. The objectives of the work have been to: verify the convective and radiation numerical models; validate them for buoyancy-driven flows due to human occupants in indoor climates; and apply these validated models to investigate human thermal comfort and IAQ in a real classroom for which field study data were available. On the basis of the detailed verification, validation and application studies, the findings are summarized as a set of guidelines for simulating human thermal comfort and IAQ in indoor climates.
This PhD research involves the use of detailed human body geometries and postures. Modelling radiation and investigating the effect of geometrical posture has shown that the effective radiation area varies significantly with posture. The simulation results have shown that using an effective radiation area factor of 0.725, estimated previously (Fanger, 1972) for a standing person, can lead to an underestimation of the effective radiation area by 13% for the postures considered.
Numerical modelling of convective heat transfer and respiration processes for sedentary manikins have shown that the SST turbulence model (Menter, 1994) with appropriate resolution of near wall region can simulate the local air velocity, temperature and heat transfer coefficients to a level of detail required for prediction of thermal comfort and IAQ. The present PhD work has shown that in a convection dominated environment, the detailed seated manikins give rise to an asymmetrical thermal plume as compared to the thermal plumes generated by simplified manikins or point sources.
Validated simulation results obtained during the present PhD work have shown that simplified manikins can be used without significant limitations while investigating IAQ of complete indoor spaces. The use of simplified manikins however does not seem appropriate when simulating detailed respiration effects in the immediate vicinity of seated humans because of the underestimation in the amount of re-inhaled CO2 and pollutants from the surroundings. Furthermore, the results have shown that due to the simplification in geometrical form of the nostrils, the CO2 concentration is much higher near the face region (direct jet along the nostrils) as compared to a detailed geometry (sideways jet).
Simulating the complete respiration cycle has shown that a pause between exhalation and inhalation has a significant effect on the amount of re-inhaled CO2. Previous results have shown the amount of re-inhaled CO2 to range between 10-19%. The present study has shown that, when the pause is considered, the amount of re-inhaled CO2 falls below 1%. A comparison between the simplified and detailed geometries has shown that a simplified geometry can underestimate the amount of re-inhaled CO2 by more than 37% compared to a detailed geometry.
The major contribution to knowledge delivered by this PhD work is the provision of a validated seated computational thermal manikin. This PhD work follows a structured verification and validation approach for conducting CFD simulations to predict human thermal comfort and indoor air quality. The work demonstrates the application of the validated model to a classroom case with multiple occupancy and compares the measured results with the simulation results. The comparison of CFD results with measured data supports the use of CFD and highlights the importance of modelling thermal manikins in indoor HVAC design, rather than designing the HVAC for empty spaces, as occupancy has a strong influence on the indoor air flow. This PhD work enables indoor climate researchers and building designers to employ simplified thermal manikins to correctly predict the mean flow characteristics of indoor surroundings.
The present work clearly demonstrates the limitations of the PIV measurement technique, the importance of using a detailed CFD manikin geometry when investigating the phenomenon of respiration in detail, and the effect of the thermal plume around the seated manikin. The computational thermal manikin used in this work is valid for a seated adult female geometry.
Advances in Modeling of Fluid Dynamics
This book contains twelve chapters detailing significant advances and applications in fluid dynamics modeling with focus on biomedical, bioengineering, chemical, civil and environmental engineering, aeronautics, astronautics, and automotive. We hope this book can be a useful resource to scientists and engineers who are interested in fundamentals and applications of fluid dynamics