A Conceptual Framework for Integration Development of GSFLOW Model: Concerns and Issues Identified and Addressed for Model Development Efficiency
In the Coupled Groundwater and Surface-Water Flow (GSFLOW) model, the three-dimensional finite-difference groundwater model (MODFLOW) simulates groundwater flow, while the Precipitation-Runoff Modeling System (PRMS) simulates the surface hydrologic processes. Developing each individual PRMS and MODFLOW model already requires substantial time and effort, and integrating the two raises additional concerns and issues due to their different simulation realms, data communication, and computational algorithms. To address these concerns and issues in GSFLOW, the present paper proposes a conceptual framework from the perspectives of Model Conceptualization, Data Linkages and Transference, Model Calibration, and Sensitivity Analysis. As a demonstration, a MODFLOW groundwater flow system was developed and coupled with a PRMS model in the Lehman Creek watershed, eastern Nevada; the integration was smooth and efficient, with the hydrogeologic features well captured and represented. The proposed conceptual integration framework, with the techniques and concerns it identifies, substantially improves GSFLOW model development efficiency and helps improve the interpretation of model results. It may also find applications in other integrated hydrologic modeling studies.
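The data-linkage concern raised above, passing infiltration from the surface model down as groundwater recharge and returning baseflow to the stream, can be illustrated with a toy one-bucket sketch. All function names, parameter values, and the simplified physics below are illustrative assumptions, not PRMS or MODFLOW code.

```python
# Hypothetical sketch of surface/groundwater data linkage in a coupled
# model. A surface module (stand-in for PRMS) partitions rainfall into
# recharge and runoff; a groundwater module (stand-in for MODFLOW)
# updates a storage head and returns baseflow to the stream.

def surface_step(rainfall, infiltration_capacity):
    """Partition rainfall into infiltration (recharge) and surface runoff."""
    recharge = min(rainfall, infiltration_capacity)
    runoff = rainfall - recharge
    return recharge, runoff

def groundwater_step(head, recharge, discharge_coeff):
    """Update groundwater head from recharge and linear baseflow discharge."""
    baseflow = discharge_coeff * head
    new_head = head + recharge - baseflow
    return new_head, baseflow

head = 5.0
for rainfall in [10.0, 0.0, 4.0]:  # one value per time step
    recharge, runoff = surface_step(rainfall, infiltration_capacity=3.0)
    head, baseflow = groundwater_step(head, recharge, discharge_coeff=0.1)
    streamflow = runoff + baseflow  # data linkage: both models feed the stream
    print(round(streamflow, 3))
```

The point of the sketch is the exchange pattern, not the physics: each time step, one output of the surface model becomes an input of the groundwater model, and vice versa.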
The LifeV library: engineering mathematics beyond the proof of concept
LifeV is a library for the finite element (FE) solution of partial differential equations in one, two, and three dimensions. It is written in C++ and designed to run on diverse parallel architectures, including cloud and high performance computing facilities. Despite its academic research nature, as a library for the development and testing of new methods, a distinguishing feature of LifeV is its use on real-world problems: it is intended to provide a tool for many engineering applications, and it has been used in computational hemodynamics (including cardiac mechanics and fluid-structure interaction problems), in porous media, and in ice sheet dynamics, for both forward and inverse problems. In this paper we give a short overview of the features of LifeV and its coding paradigms on simple problems. The main focus is on the parallel environment, which is mainly driven by domain decomposition methods and based on external libraries such as MPI, the Trilinos project, HDF5, and ParMetis.
Dedicated to the memory of Fausto Saleri. Comment: Review of the LifeV finite element library
Power quality and electromagnetic compatibility: special report, session 2
The scope of Session 2 (S2) has been defined as follows by the Session Advisory Group and the Technical Committee: Power Quality (PQ), with the more general concept of electromagnetic compatibility (EMC) and with some related safety problems in electricity distribution systems.
Special focus is put on voltage continuity (supply reliability, the problem of outages) and voltage quality (voltage level, flicker, unbalance, harmonics). This session will also look at electromagnetic compatibility (mains frequency to 150 kHz), electromagnetic interference, and electric and magnetic field issues. Also addressed in this session are electrical safety and immunity concerns (lightning issues; step, touch and transferred voltages).
The aim of this special report is to present a synthesis of the present concerns in PQ&EMC, based on all selected papers of session 2 and related papers from other sessions (152 papers in total). The report is divided into the following 4 blocks:
Block 1: Electric and Magnetic Fields, EMC, Earthing systems
Block 2: Harmonics
Block 3: Voltage Variation
Block 4: Power Quality Monitoring
Two Round Tables will be organised:
- Power quality and EMC in the Future Grid (CIGRE/CIRED WG C4.24, RT 13)
- Reliability Benchmarking - why should we do it? What should be done in the future? (RT 15)
A high resolution coupled hydrologic–hydraulic model (HiResFlood-UCI) for flash flood modeling
HiResFlood-UCI was developed by coupling the NWS's hydrologic model (HL-RDHM) with the hydraulic model (BreZo) for flash flood modeling at decameter resolutions. The coupled model uses HL-RDHM as a rainfall-runoff generator and replaces the routing scheme of HL-RDHM with the 2D hydraulic model (BreZo) in order to predict localized flood depths and velocities. A semi-automated technique for unstructured mesh generation was developed to cluster an adequate density of computational cells along river channels, such that numerical errors are negligible compared with other sources of error while the computational cost of the hydraulic model is kept to a bare minimum. HiResFlood-UCI was implemented for a watershed (ELDO2) in the DMIP2 experiment domain in Oklahoma. Using synthetic precipitation input, the model was tested for various components, including HL-RDHM parameters (a priori versus calibrated), channel and floodplain Manning n values, DEM resolution (10 m versus 30 m), and computational mesh resolution (10 m+ versus 30 m+). Simulations with calibrated versus a priori parameters of HL-RDHM show that HiResFlood-UCI produces reasonable results with the a priori parameters from the NWS. Sensitivities to hydraulic model resistance parameters, mesh resolution and DEM resolution are also identified, pointing to the importance of model calibration and validation for accurate prediction of localized flood intensities. HiResFlood-UCI performance was examined using 6 measured precipitation events as model input for calibration and validation of the streamflow at the outlet. The Nash–Sutcliffe Efficiency (NSE) obtained ranges from 0.588 to 0.905. The simulated flood extent was also validated against USGS-observed water levels at an interior point. The predicted flood stage error is 0.82 m or less, based on a comparison to measured stage.
Validation of stage and discharge predictions builds confidence in model predictions of flood extent and localized velocities, which are fundamental to reliable flash flood warning.
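The Nash–Sutcliffe Efficiency used above to score the streamflow simulations is simple to compute: one minus the ratio of the squared simulation error to the variance of the observations. A minimal sketch (the sample series are made up for illustration):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 - SSE / sum of squared deviations
    of the observations from their mean.

    NSE = 1 is a perfect fit; NSE = 0 means the model is no better than
    predicting the mean observed value; NSE < 0 means it is worse.
    """
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs

# Illustrative discharge series (not data from the paper)
obs = [1.0, 2.0, 3.0, 4.0, 5.0]
sim = [1.1, 1.9, 3.2, 3.8, 5.1]
nse = nash_sutcliffe(obs, sim)
```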
Towards a service-oriented e-infrastructure for multidisciplinary environmental research
Research e-infrastructures are considered to have generic and thematic parts. The generic part provides high-speed networks, grid (large-scale distributed computing) and database systems (digital repositories and data transfer systems) applicable to all research communities irrespective of discipline. Thematic parts are specific deployments of e-infrastructures to support diverse virtual research communities. The needs of a virtual community of multidisciplinary environmental researchers are yet to be investigated. We envisage and argue for an e-infrastructure that will enable environmental researchers to develop environmental models and software entirely out of existing components, through loose coupling of diverse digital resources based on the service-oriented architecture. We discuss four specific aspects for consideration for a future e-infrastructure: 1) provision of digital resources (data, models & tools) as web services, 2) dealing with the stateless and non-transactional nature of web services using workflow management systems, 3) enabling web service discovery, composition and orchestration through semantic registries, and 4) creating synergy with existing grid infrastructures.
Model Coupling between the Weather Research and Forecasting Model and the DPRI Large Eddy Simulator for Urban Flows on GPU-accelerated Multicore Systems
In this report we present a novel approach to model coupling for
shared-memory multicore systems hosting OpenCL-compliant accelerators, which we
call The Glasgow Model Coupling Framework (GMCF). We discuss the implementation
of a prototype of GMCF and its application to coupling the Weather Research and
Forecasting Model and an OpenCL-accelerated version of the Large Eddy Simulator
for Urban Flows (LES) developed at DPRI.
The first stage of this work concerned the OpenCL port of the LES. The methodology used for the OpenCL port is a combination of automated analysis and code generation and rule-based manual parallelization. For the evaluation, the non-OpenCL LES code was compiled using gfortran, ifort and pgfortran, in each case with auto-parallelization and auto-vectorization. The OpenCL-accelerated version of the LES achieves a 7-times speed-up on an NVIDIA GeForce GTX 480 GPGPU, compared to the fastest possible compilation of the original code running on a 12-core Intel Xeon E5-2640.
In the second stage of this work, we built the Glasgow Model Coupling Framework and successfully used it to couple an OpenMP-parallelized WRF instance with an OpenCL LES instance which runs the LES code on the GPGPU. The system requires only very minimal changes to the original code. The report discusses the rationale, aims, approach and implementation details of this work. Comment: This work was conducted during a research visit at the Disaster Prevention Research Institute of Kyoto University, supported by an EPSRC Overseas Travel Grant, EP/L026201/
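The coupling pattern described above, two model instances running concurrently on a shared-memory system and exchanging boundary data each step, can be sketched with plain threads and queues. The two "models" below are toy stand-ins, not WRF or the DPRI LES, and this is not the GMCF API.

```python
# Toy shared-memory model coupling: an "atmosphere" thread and an "LES"
# thread exchange values through two in-memory queues, one per direction.
import queue
import threading

to_les = queue.Queue()  # atmosphere -> LES (boundary forcing)
to_atm = queue.Queue()  # LES -> atmosphere (surface feedback)

def les_model(steps):
    """Stand-in for the accelerated LES: consume forcing, emit feedback."""
    for _ in range(steps):
        wind = to_les.get()        # blocks until the atmosphere sends data
        to_atm.put(0.5 * wind)     # toy surface response

def atmosphere_model(steps):
    """Stand-in for the WRF instance: send forcing, absorb feedback."""
    wind = 1.0
    for _ in range(steps):
        to_les.put(wind)
        wind += 0.1 * to_atm.get()
    return wind

t = threading.Thread(target=les_model, args=(3,))
t.start()
final_wind = atmosphere_model(3)
t.join()
```

The blocking queue operations give the same lock-step synchronization that a coupler must enforce: neither model can run ahead of the data the other has produced.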
Magnetic and structural properties of nanocrystalline PrCo
The structure and magnetic properties of nanocrystalline PrCo obtained by a high-energy milling technique, investigated by X-ray diffraction, Curie temperature determination, and magnetic property measurements, are reported. The as-milled samples were annealed in the temperature range 1023 K to 1273 K for 30 min to optimize the extrinsic properties. The Curie temperature is 349 K, and coercive fields of 55 kOe at 10 K and 12 kOe at 293 K are obtained for the samples annealed at 1023 K. A simulation of the magnetic properties in the framework of micromagnetism has been performed in order to investigate the influence of the nanoscale structure. A composite model with hard crystallites embedded in an amorphous matrix, corresponding to the as-milled material, leads to satisfactory agreement with the experimental magnetization curve [K. Younsi, V. Russier and L. Bessais, J. Appl. Phys. 107, 083916 (2010)]. The microscopic scale will also be considered from DFT-based calculations of the electronic structure of RCo_x compounds, where R = (Y, Pr) and x = 2, 3 and 5. Comment: To be published in J. Phys.: Conference Series in the JEMS 2010 special issue. To be found once published at http://iopscience.iop.org/1742-659
Benefits and limitations of data assimilation for discharge forecasting using an event-based rainfall–runoff model
Mediterranean catchments in southern France are threatened by potentially devastating fast floods, which are difficult to anticipate. In order to improve the skill of rainfall-runoff models in predicting such flash floods, hydrologists use data assimilation techniques to provide real-time updates of the model using observational data. This approach seeks to reduce the uncertainties present in different components of the hydrological model (forcing, parameters or state variables) in order to minimize the error in simulated discharges. This article presents a data assimilation procedure, the best linear unbiased estimator (BLUE), used with the goal of improving the peak discharge predictions generated by an event-based hydrological model, the Soil Conservation Service lag-and-route (SCS-LR) model. For a given prediction date, selected model inputs are corrected by assimilating discharge data observed at the basin outlet. This study is conducted on the Lez Mediterranean basin in southern France. The key objectives of this article are (i) to select the parameter(s) which allow for the most efficient and reliable correction of the simulated discharges, (ii) to demonstrate the impact of the correction of the initial condition upon simulated discharges, and (iii) to identify and understand conditions in which this technique fails to improve the forecast skill. The correction of the initial moisture deficit of the soil reservoir proves to be the most efficient control parameter for adjusting the peak discharge. Using data assimilation, this correction leads to an average 12% improvement in the flood peak magnitude forecast in 75% of cases. The investigation of the other 25% of cases highlights a number of precautions for the appropriate use of this data assimilation procedure.
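The BLUE analysis step that underlies such an assimilation procedure has a compact closed form: the prior estimate is nudged toward the observation in proportion to a gain that balances background and observation error variances. A scalar sketch follows; the variable names and the reduction to one control variable are our simplifying assumptions, not the paper's implementation.

```python
def blue_update(x_b, y, h, var_b, var_r):
    """One scalar BLUE (best linear unbiased estimator) analysis step.

    x_b   : background (prior) estimate of the control variable,
            e.g. the initial soil moisture deficit
    y     : observation, e.g. discharge at the basin outlet
    h     : linearized observation operator (control -> observation space)
    var_b : background error variance
    var_r : observation error variance
    """
    gain = var_b * h / (h * h * var_b + var_r)  # Kalman-type gain
    x_a = x_b + gain * (y - h * x_b)            # analysis estimate
    var_a = (1.0 - gain * h) * var_b            # analysis error variance
    return x_a, var_a

# Toy numbers: prior 10.0, operator 2.0 (predicted obs 20.0), observed 24.0
x_a, var_a = blue_update(10.0, 24.0, 2.0, 1.0, 1.0)
```

Note that the analysis variance is always smaller than the background variance: assimilating the observation can only reduce (never increase) the estimated uncertainty in this linear setting.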