28,457 research outputs found
Air pollution modelling using a graphics processing unit with CUDA
The Graphics Processing Unit (GPU) is a powerful tool for parallel computing.
In the past years the performance and capabilities of GPUs have increased, and
the Compute Unified Device Architecture (CUDA) - a parallel computing
architecture - has been developed by NVIDIA to utilize this performance in
general purpose computations. Here we show, for the first time, a possible
application of the GPU to environmental studies, serving as a basis for
decision-making strategies. A stochastic Lagrangian particle model has been
developed in CUDA to estimate the transport and transformation of
radionuclides from a single point source during an accidental release. Our
results show that the parallel implementation achieves typical speedups of
80-120 times over a single-threaded CPU implementation on a 2.33 GHz desktop
computer. Only very small differences have been found between the results of
the GPU and CPU simulations, comparable to the effect of stochastic transport
phenomena in the atmosphere. The relatively high speedup, with no additional
cost to maintain this parallel architecture, could lead to wide use of GPUs
for diverse environmental applications in the near future.
Comment: 5 figures
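The abstract above describes a stochastic Lagrangian particle model: each particle is advected by the mean wind, perturbed by a random turbulent displacement, and loses mass to radioactive decay. A minimal single-threaded sketch of that idea (all parameter values are illustrative, not taken from the paper) might look like:

```python
import math
import random

def simulate_plume(n_particles=10000, n_steps=100, dt=60.0,
                   wind=(5.0, 0.0), sigma=2.0, half_life=8.02 * 86400,
                   seed=42):
    """Stochastic Lagrangian transport from a point source: mean advection
    by the wind plus a Gaussian turbulent displacement each step, with
    first-order radioactive decay of each particle's mass. Hypothetical
    parameters chosen for illustration (half_life is roughly iodine-131)."""
    rng = random.Random(seed)
    decay = math.log(2.0) / half_life              # decay constant [1/s]
    particles = [[0.0, 0.0, 1.0] for _ in range(n_particles)]  # x, y, mass
    for _ in range(n_steps):
        for p in particles:
            # advection + random-walk turbulence (variance grows with dt)
            p[0] += wind[0] * dt + rng.gauss(0.0, sigma) * math.sqrt(dt)
            p[1] += wind[1] * dt + rng.gauss(0.0, sigma) * math.sqrt(dt)
            p[2] *= math.exp(-decay * dt)          # mass lost to decay
    return particles
```

Each particle's update is independent of all the others, which is why this kind of model maps naturally onto CUDA with one GPU thread per particle and explains the large speedups the abstract reports.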
On green routing and scheduling problem
The vehicle routing and scheduling problem has been studied with much
interest over the last four decades. In this paper, some of the existing
literature dealing with routing and scheduling problems that involve
environmental issues is reviewed, together with a description of the problems
that have been investigated and how they are treated using combinatorial
optimization tools.
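To make the combinatorial-optimization angle concrete, here is a minimal greedy nearest-neighbor heuristic for a single-vehicle routing problem, using total distance as a crude proxy for fuel use and emissions. This is an illustrative sketch only; the surveyed literature covers far richer green-routing models (vehicle loads, speeds, time windows, emission functions):

```python
import math

def nearest_neighbor_route(depot, customers):
    """Greedy nearest-neighbor construction heuristic: from the current
    position, always visit the closest unvisited customer, then return to
    the depot. Returns the route and its total Euclidean distance, which
    here stands in (very roughly) for fuel consumption / emissions."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    route, pos = [depot], depot
    remaining = list(customers)
    while remaining:
        nxt = min(remaining, key=lambda c: dist(pos, c))  # closest customer
        remaining.remove(nxt)
        route.append(nxt)
        pos = nxt
    route.append(depot)                                   # return trip
    total = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
    return route, total
```

Heuristics like this give quick feasible solutions; the papers reviewed typically refine them with metaheuristics or exact methods once environmental costs are added to the objective.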
From Social Simulation to Integrative System Design
As the recent financial crisis showed, there is today a strong need to gain
an "ecological perspective" on all relevant interactions in
socio-economic-techno-environmental systems. For this, we suggest setting up
a network of Centers for Integrative Systems Design, which would be able to
run all potentially relevant scenarios, identify causality chains, explore
feedback and cascading effects for a number of model variants, and determine
the reliability of their implications (given the validity of the underlying
models). They would be able to detect possible negative side effects of
policy decisions before they occur. The Centers belonging to this network of
Integrative Systems Design Centers would each focus on a particular field,
but they would be part of an attempt to eventually cover all relevant areas
of society and the economy and integrate them within a "Living Earth
Simulator". The results of all research activities of such Centers would be
turned into informative input for political Decision Arenas. For example,
Crisis Observatories (for financial instabilities, shortages of resources,
environmental change, conflict, the spreading of diseases, etc.) would be
connected with such Decision Arenas for the purpose of visualization, in
order to make complex interdependencies understandable to scientists,
decision-makers, and the general public.
Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c
NASA's supercomputing experience
A brief overview of NASA's recent experience in supercomputing is presented from two perspectives: early systems development and advanced supercomputing applications. NASA's role in supercomputing systems development is illustrated by a discussion of activities carried out by the Numerical Aerodynamical Simulation Program. Current capabilities in advanced technology applications are illustrated with examples in turbulence physics, aerodynamics, aerothermodynamics, chemistry, and structural mechanics. Capabilities in science applications are illustrated by examples in astrophysics and atmospheric modeling. Future directions and NASA's new High Performance Computing Program are briefly discussed.
Weather, climate, and hydrologic forecasting for the US Southwest: A survey
As part of a regional integrated assessment of climate vulnerability, a survey was conducted from June 1998 to May 2000 of weather, climate, and hydrologic forecasts with coverage of the US Southwest and an emphasis on the Colorado River Basin. The survey addresses the types of forecasts that were issued, the organizations that provided them, and the techniques used in their generation. It reflects discussions with key personnel from organizations involved in producing or issuing forecasts, providing data for making forecasts, or serving as a link for communicating forecasts. During the survey period, users faced a complex and constantly changing mix of forecast products available from a variety of sources. The abundance of forecasts was not matched in the provision of corresponding interpretive materials, documentation about how the forecasts were generated, or reviews of past performance. Potential existed for confusing experimental and research products with others that had undergone a thorough review process, including official products issued by the National Weather Service. Contrasts between the states of meteorologic and hydrologic forecasting were notable, especially in the former's greater operational flexibility and more rapid incorporation of new observations and research products. Greater attention should be given to forecast content and communication, including visualization, expression of probabilistic forecasts, and presentation of ancillary information. Regional climate models and the use of climate forecasts in water supply forecasting offer rapid improvements in predictive capabilities for the Southwest. Forecasts and production details should be archived, and publicly available forecasts should be accompanied by performance evaluations that are relevant to users.
Three decades of the Shuffled Complex Evolution (SCE-UA) optimization algorithm: Review and applications