
    Finite element modelling of the Scheldt estuary and the adjacent Belgian/Dutch coastal zone with application to the transport of fecal bacteria

    A fundamental problem in coastal modelling is the need to simultaneously consider large- and small-scale processes, especially when local dynamics or local environmental issues are of interest. The approach widely resorted to is based on a nesting strategy, in which coarse-grid large-scale models provide boundary conditions to force fine-resolution local models. This is probably the best solution for finite difference methods, which need structured grids. However, the use of structured grids leads to a marked lack of flexibility in the spatial resolution. Another solution is to take advantage of the potential of the more modern finite element methods, which allow the use of unstructured grids in which the mesh size may vary over a wide spectrum. With these methods, only one model is required to describe both the larger and the smaller scales. Such a model is used herein, namely the Second-generation Louvain-la-Neuve Ice-ocean Model (SLIM, http://www.climate.be/SLIM). For one of its first realistic applications, the Scheldt Estuary area is studied. The hydrodynamics is primarily forced by the tide, and the neatest way to take it into account is to prescribe it at the shelf break. This results in a multi-scale problem, since the domain boundary lies at the shelf break, covers about 1000 km of the North Sea and 60 km of the actual estuary, and ends with a 100 km long section of the Scheldt River up to Ghent, where the river is no more than 50 m wide. Two-dimensional elements are used to simulate the hydrodynamics from the shelf break to Antwerp (80 km upstream of the mouth), and one-dimensional elements for the riverine part between Antwerp and Ghent. As a first application, we consider the transport of faecal bacteria (Escherichia coli), an important water quality indicator. The model is described in detail and the simulation results are discussed.
    This modelling exercise falls within the framework of the interdisciplinary project TIMOTHY (http://www.climate.be/TIMOTHY), dedicated to the modelling of ecological indicators in the Scheldt area.
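    The riverine transport of a faecal bacteria tracer can be illustrated with a minimal sketch. SLIM itself is a finite element model on unstructured meshes; the upwind finite-difference scheme below is only a stand-in that shows the governing advection-decay balance dC/dt + u dC/dx = -kC. All values (velocity, decay rate, inflow concentration) are hypothetical, not taken from the study.

```python
import numpy as np

# Illustrative 1D advection-decay sketch for an E. coli tracer.
# Not SLIM: a simple upwind finite-difference scheme, with assumed values.
L = 100e3              # river reach length [m], Antwerp-to-Ghent scale
nx = 200
dx = L / nx
u = 0.5                # depth-averaged velocity [m/s], assumed constant
k = 1.0 / (24 * 3600)  # first-order decay rate [1/s] (~1/day, assumed)
dt = 0.8 * dx / u      # CFL-stable time step (Courant number 0.8)

C = np.zeros(nx)       # concentration along the reach
C_in = 1000.0          # upstream concentration [CFU/100 ml], hypothetical

for _ in range(500):
    C[1:] -= u * dt / dx * (C[1:] - C[:-1])  # upwind advection
    C[0] = C_in                              # inflow boundary condition
    C *= np.exp(-k * dt)                     # exact first-order decay
```

After enough steps the profile decays monotonically downstream, as expected for a dying tracer carried by a steady flow.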

    Timing and placing samplings to optimally calibrate a reactive transport model: exploring the potential for <i>Escherichia coli</i> in the Scheldt estuary

    For the calibration of any model, measurements are necessary. As measurements are expensive, it is of interest to determine beforehand which kind of samples will provide the maximum amount of information. Using a criterion related to the Fisher information matrix, it is possible to design a sampling scheme that enables the most precise model parameter estimates. This approach was applied to a reactive transport model (based on SLIM) of Escherichia coli in the Scheldt Estuary. As this estuary is highly influenced by the tide, careful timing of the samples with respect to the tidal cycle is expected to affect the quality of the data. Both the timing and the positioning of samples were optimised according to the proposed criterion. In the investigated case studies, the precision of the estimated parameters could be improved by up to a factor of ten, confirming the usefulness of this approach for maximizing the amount of information that can be retrieved from a fixed number of samples.
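    The idea behind Fisher-information-based sampling design can be sketched on a toy model. The snippet below uses a simple first-order decay C(t) = C0 exp(-kt) as a stand-in for the E. coli dynamics (not the paper's SLIM-based model) and picks the pair of sampling times that maximises det(FIM), the D-optimality criterion. The parameter values and candidate times are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

# D-optimal sampling sketch for a toy decay model C(t) = C0 * exp(-k*t).
# The Fisher information matrix (FIM) is built from the sensitivities of
# the model output to the parameters (C0, k); for unit-variance noise,
# FIM = J^T J where J is the sensitivity Jacobian at the sampling times.
C0, k = 100.0, 0.3                        # "true" parameters, assumed
candidates = np.linspace(0.5, 10.0, 20)   # candidate sampling times

def sensitivities(t):
    # dC/dC0 and dC/dk for the decay model, evaluated at time t
    e = np.exp(-k * t)
    return np.array([e, -C0 * t * e])

def det_fim(times):
    J = np.array([sensitivities(t) for t in times])  # one row per sample
    return np.linalg.det(J.T @ J)                    # D-optimality score

# choose the pair of times carrying the most parameter information
best = max(combinations(candidates, 2), key=det_fim)
```

Typically one early sample (most informative about C0) and one later sample (most informative about k) win, which is the intuition the paper exploits in a far richer tidal setting.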

    Pseudo-Random Streams for Distributed and Parallel Stochastic Simulations on GP-GPU

    Random number generation is a key element of stochastic simulations. It has been widely studied for sequential applications, enabling us to reliably use pseudo-random numbers in that case. Unfortunately, we cannot be so enthusiastic when dealing with parallel stochastic simulations. Many applications still neglect random stream parallelization, leading to potentially biased results. In particular, parallel execution platforms such as Graphics Processing Units (GPUs) add their own constraints to those of Pseudo-Random Number Generators (PRNGs) used in parallel. The result is a situation in which potential biases can combine with performance drops when the parallelization of random streams has not been carried out rigorously. Here, we propose criteria guiding the design of good GPU-enabled PRNGs, and we complement them with a study of techniques for parallelizing random streams correctly in the context of GPU-enabled stochastic simulations.
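    One common answer to the stream-parallelization problem discussed above is to derive statistically independent streams from a single master seed and hand one to each worker, rather than naively offsetting seeds. The sketch below illustrates this on the CPU with NumPy's SeedSequence spawning and the counter-based Philox generator (a family well suited to GPUs); it is a generic illustration of the technique, not the paper's own proposal.

```python
import numpy as np

# Rigorous random-stream parallelization sketch: spawn independent
# child seeds from one master SeedSequence and give each simulated
# "worker" its own counter-based Philox stream.
root = np.random.SeedSequence(20240601)   # master seed, arbitrary
children = root.spawn(4)                  # 4 independent child seeds
streams = [np.random.Generator(np.random.Philox(s)) for s in children]

# each worker draws from its own stream without any coordination
samples = [g.random(5) for g in streams]
```

Because the child seeds are derived by hashing, the streams do not overlap in practice, avoiding the correlation biases that naive seeding (e.g. seed, seed+1, seed+2, ...) can introduce.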

    Towards surface quantum optics with Bose-Einstein condensates in evanescent waves

    We present a surface trap which allows for studying the coherent interaction of ultracold atoms with evanescent waves. The trap combines a magnetic Ioffe trap with a repulsive evanescent dipole potential. The position of the magnetic trap can be controlled with high precision, which makes it possible to move ultracold atoms to the surface of a glass prism in a controlled way. The optical potential of the evanescent wave compensates for the strong attractive van der Waals forces and generates a potential barrier only a few hundred nanometers from the surface. The trap is tested with Rb Bose-Einstein condensates (BEC), which are stably positioned at distances below one micrometer from the surface.

    Learning-based quality control for cardiac MR images

    The effectiveness of a cardiovascular magnetic resonance (CMR) scan depends on the ability of the operator to correctly tune the acquisition parameters to the subject being scanned and on the potential occurrence of imaging artifacts, such as those caused by cardiac and respiratory motion. In clinical practice, a quality control step is performed by visual assessment of the acquired images; however, this procedure is strongly operator-dependent, cumbersome, and sometimes incompatible with the time constraints of clinical settings and large-scale studies. We propose a fast, fully automated, learning-based quality control pipeline for CMR images, specifically for short-axis image stacks. Our pipeline performs three important quality checks: 1) heart coverage estimation; 2) inter-slice motion detection; 3) image contrast estimation in the cardiac region. The pipeline uses a hybrid decision forest method—integrating both regression and structured classification models—to extract landmarks and probabilistic segmentation maps from both long- and short-axis images as a basis for the quality checks. The technique was tested on up to 3000 cases from the UK Biobank and on 100 cases from the UK Digital Heart Project, and validated against manual annotations and visual inspections performed by expert interpreters. The results show the capability of the proposed pipeline to correctly detect incomplete or corrupted scans (e.g., on UK Biobank, sensitivity and specificity of 88% and 99%, respectively, for heart coverage estimation, and 85% and 95% for motion detection), allowing their exclusion from the analyzed dataset or the triggering of a new acquisition.
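    The first quality check, heart coverage, reduces to a geometric test once landmarks are available: does the short-axis stack span the heart from base to apex? The helper below is a hypothetical sketch of that final test only; the landmark prediction itself (the paper's decision-forest step) is not implemented, and the function name, inputs, and tolerance are illustrative assumptions rather than the paper's actual interface.

```python
def covers_heart(slice_positions_mm, base_mm, apex_mm, margin_mm=5.0):
    """Return True if a short-axis stack spans the heart base-to-apex.

    slice_positions_mm: through-plane positions of the acquired slices.
    base_mm, apex_mm:   landmark positions predicted upstream (assumed
                        given here; the paper obtains them with a
                        hybrid decision forest).
    margin_mm:          tolerated gap at either end, an assumed value.
    """
    lo, hi = min(slice_positions_mm), max(slice_positions_mm)
    heart_lo, heart_hi = sorted((base_mm, apex_mm))
    return lo <= heart_lo + margin_mm and hi >= heart_hi - margin_mm
```

A stack flagged as non-covering could then be excluded from analysis or trigger a re-acquisition, mirroring the two outcomes described in the abstract.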

    The languages of peace during the French religious wars

    The desirability of peace was a common topos in sixteenth-century political rhetoric, and the duty of the king to uphold the peace for the benefit of his subjects was also a long-established tradition. However, the peculiar circumstances of the French religious wars, and the preferred royal policy of pacification, galvanized impassioned debate among both those who supported and those who opposed confessional coexistence. This article looks at the diverse ways in which peace was viewed during the religious wars through an exploration of language and context. It draws not only on the pronouncements of the crown and its officials, and of poets and jurists, but also on those of local communities and confessional groups. Opinion was not just divided along religious lines; political imperatives, philosophical positions and local conditions all came into play in the arguments deployed. The variegated languages of peace provide a social and cultural dimension for the contested nature of sixteenth-century French politics. However, they could not restore harmony to a war-torn and divided kingdom.

    Development of microstructural and morphological cortical profiles in the neonatal brain

    Interruptions to neurodevelopment during the perinatal period may have long-lasting consequences. However, to be able to investigate deviations in the foundation of proper connectivity and functional circuits, we need a measure of how this architecture evolves in the typically developing brain. To this end, in a cohort of 241 term-born infants, we used magnetic resonance imaging to estimate cortical profiles based on morphometry and microstructure over the perinatal period (37-44 weeks postmenstrual age, PMA). Using the covariance of these profiles as a measure of inter-areal network similarity (morphometric similarity networks, MSN), we clustered these networks into distinct modules. The resulting modules were consistent and symmetric, corresponded to known functional distinctions, including sensory-motor, limbic, and association regions, and mapped spatially onto known cytoarchitectonic tissue classes. Posterior regions became more morphometrically similar with increasing age, while peri-cingulate and medial temporal regions became more dissimilar. Network strength was associated with age: within-network similarity increased with age, suggesting emerging network distinction. These changes in cortical network architecture over an 8-week period are consistent with, and likely underpin, the highly dynamic processes occurring during this critical period. The resulting cortical profiles might provide a normative reference for investigating atypical early brain development.
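    The MSN construction described above follows a simple recipe: describe each region by a profile of features, take the correlation between regional profiles as the network edge, then cluster the similarity matrix into modules. The sketch below runs that recipe on synthetic data with two planted modules; the feature count, noise level, and clustering choices are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Morphometric similarity network (MSN) sketch on synthetic data:
# regions with similar feature profiles should fall into one module.
rng = np.random.default_rng(42)
n_regions, n_features = 20, 12   # e.g. thickness, curvature, FA, ...

# two planted modules, each sharing a distinct base feature pattern
base = rng.normal(size=(2, n_features))
profiles = np.vstack([
    base[0] + rng.normal(0, 0.1, (10, n_features)),
    base[1] + rng.normal(0, 0.1, (10, n_features)),
])

msn = np.corrcoef(profiles)      # region-by-region similarity matrix
dist = 1.0 - msn                 # convert similarity to a distance
cond = dist[np.triu_indices(n_regions, k=1)]   # condensed form
modules = fcluster(linkage(cond, method="average"),
                   t=2, criterion="maxclust")  # cut into 2 modules
```

With real data, the module count and the stability of the partition would of course have to be validated rather than planted, as the study does against known cytoarchitectonic classes.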