Local scour and flow characteristics around a circular cylinder undergoing vortex-induced vibration
Vortex-induced vibration (VIV) has been studied extensively, and the related findings are well documented in the published literature. However, when a cylinder is placed near an erodible sand bed, the interactions between the vibrating cylinder, the flow field and local scour become considerably more complex. The aim of this study is to provide an improved understanding of the interaction between the flow field, a freely vibrating cylinder and scour using flow visualization and a new PIV measurement technique. The results show that the amplitude and frequency of the vibrating cylinder are closely related to the depth of the scour hole. Based on qualitative observation and quantitative measurement of the flow field, the vibrating cylinder and the scour dimensions, three distinct scour stages are identified in this study. The characteristics of the turbulence intensity and the formation and transmission of the vortices in each of these three scour stages are discussed in this paper.
A review of simulation and application of agent-based model approaches
In the past, various traditional methods relied on experiments and statistical data to examine and solve social and environmental problems. However, such methods are not well suited to expressing or solving the complex dynamics of human-environment crises (such as the spread of diseases, natural disaster management and social problems). The implementation of computational modelling methods such as Agent-Based Models (ABM) has therefore become an effective approach for solving complex problems arising from the interpretation of human behaviour in human society, the environment and biological systems. This article outlines the properties of the ABM approach and its applications in criminology, flood management and the COVID-19 pandemic. In addition, it reviews the limitations that must be overcome in the further development of ABM.
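The core mechanic of an ABM, individual agents following local rules whose interactions produce system-level behaviour, can be sketched with a minimal disease-spread example. All details here (the SIR states, contact counts, probabilities and function name) are illustrative assumptions, not taken from any specific model in the reviewed literature.

```python
import random

def run_sir_abm(n_agents=200, n_steps=50, p_infect=0.3, p_recover=0.1,
                contacts_per_step=3, seed=0):
    """Minimal agent-based SIR model: each infected agent contacts a few
    random agents per step and may infect them or recover."""
    rng = random.Random(seed)
    # 0 = susceptible, 1 = infected, 2 = recovered
    states = [0] * n_agents
    states[0] = 1  # one initially infected agent
    for _ in range(n_steps):
        new_states = states[:]
        for i, s in enumerate(states):
            if s != 1:
                continue
            for _ in range(contacts_per_step):
                j = rng.randrange(n_agents)
                if states[j] == 0 and rng.random() < p_infect:
                    new_states[j] = 1  # contact becomes infected
            if rng.random() < p_recover:
                new_states[i] = 2  # infected agent recovers
        states = new_states
    return states

final = run_sir_abm()
counts = {s: final.count(s) for s in (0, 1, 2)}
```

The epidemic curve emerges from the agent-level rules rather than from a closed-form equation, which is exactly what makes ABM attractive for the complex human-environment systems discussed above.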
Archiving scientific data
We present an archiving technique for hierarchical data with key structure. Our approach is based on the notion of timestamps whereby an element appearing in multiple versions of the database is stored only once along with a compact description of versions in which it appears. The basic idea of timestamping was discovered by Driscoll et al. in the context of persistent data structures where one wishes to track the sequences of changes made to a data structure. We extend this idea to develop an archiving tool for XML data that is capable of providing meaningful change descriptions and can also efficiently support a variety of basic functions concerning the evolution of data such as retrieval of any specific version from the archive and querying the temporal history of any element. This is in contrast to diff-based approaches where such operations may require undoing a large number of changes or significant reasoning with the deltas. Surprisingly, our archiving technique does not incur any significant space overhead when contrasted with other approaches. Our experimental results support this and also show that the compacted archive file interacts well with other compression techniques. Finally, another useful property of our approach is that the resulting archive is also in XML and hence can directly leverage existing XML tools.
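The store-once-with-version-set idea can be sketched for flat key-value data. This is a toy stand-in for the paper's timestamped XML archive: the class and method names are invented, and explicit version sets replace the paper's compact timestamp descriptions.

```python
class Archive:
    """Toy key-based archive: each distinct value of an element is stored
    once, together with the set of versions in which it appears."""

    def __init__(self):
        self.version = 0
        self.store = {}  # key -> list of [value, set_of_versions]

    def commit(self, snapshot):
        """Merge a new database snapshot (dict key -> value) as the next version."""
        self.version += 1
        for key, value in snapshot.items():
            entries = self.store.setdefault(key, [])
            if entries and entries[-1][0] == value:
                entries[-1][1].add(self.version)  # unchanged: extend timestamp
            else:
                entries.append([value, {self.version}])  # changed: store once
        return self.version

    def retrieve(self, version):
        """Reconstruct the full snapshot for any specific version."""
        return {key: value
                for key, entries in self.store.items()
                for value, versions in entries
                if version in versions}

    def history(self, key):
        """Temporal history of one element: (value, versions) pairs."""
        return [(v, sorted(vers)) for v, vers in self.store.get(key, [])]

arc = Archive()
arc.commit({"x": 1, "y": 2})
arc.commit({"x": 1, "y": 3})
```

Because `"x"` is unchanged between the two versions, it is stored once with the version set `{1, 2}`, while both version retrieval and per-element history remain direct lookups, the two operations that are expensive for diff-based archives.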
Modelling colloids with Baxter's adhesive hard sphere model
The structure of the Baxter adhesive hard sphere fluid is examined using
computer simulation. The radial distribution function (which exhibits unusual
discontinuities due to the particle adhesion) and static structure factor are
calculated with high accuracy over a range of conditions and compared with the
predictions of Percus--Yevick theory. We comment on rigidity in percolating
clusters and discuss the role of the model in the context of experiments on
colloidal systems with short-range attractive forces.

Comment: 14 pages, 7 figures. (For proceedings of "Structural arrest in colloidal systems with short-range attractive forces", Messina, December 2003.)
Road Triangle Detection for Non-Road Area Elimination Using Lane Detection and Image Multiplication
The background has become a key issue in maintaining the accuracy of object detection in image processing algorithms. This paper focuses on intelligent transport systems (ITS), in which background features such as trees, road dividers and buildings interfere with the detection algorithm. It therefore presents an algorithm that removes the unwanted background outside the road-area boundaries in dynamic video footage. Because an onboard camera is used to capture the road traffic, the background moves together with the foreground, so a region of interest restricted to the road region needs to be established. The algorithm consists of three main components: lane detection, vanishing-point estimation and image multiplication. Within these components, several methods, namely the Hough transform, line intersection, image masking and image multiplication, are combined to create the background subtraction system. In the final analysis, test results under various road conditions show a good detection rate and effective background removal.
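The final masking step, multiplying the image by a binary road-triangle mask built from the vanishing point and the two lane/image intersections, can be sketched in pure Python. This is only the geometric masking stage; the Hough-transform lane detection that would supply the triangle vertices is assumed, and all names here are invented for illustration.

```python
def point_in_triangle(px, py, tri):
    """Sign test: True if (px, py) lies inside (or on) the triangle."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    d1 = (px - x2) * (y1 - y2) - (x1 - x2) * (py - y2)
    d2 = (px - x3) * (y2 - y3) - (x2 - x3) * (py - y3)
    d3 = (px - x1) * (y3 - y1) - (x3 - x1) * (py - y1)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def mask_road_triangle(image, vanishing_point, bottom_left, bottom_right):
    """Image multiplication with a binary mask: zero out every pixel
    outside the road triangle, keep pixels inside unchanged."""
    tri = (vanishing_point, bottom_left, bottom_right)
    h, w = len(image), len(image[0])
    return [[image[y][x] if point_in_triangle(x, y, tri) else 0
             for x in range(w)] for y in range(h)]

# Toy 6x6 frame: vanishing point at top centre, lane lines meeting the
# bottom image corners.
masked = mask_road_triangle([[1] * 6 for _ in range(6)],
                            (3, 0), (0, 5), (5, 5))
```

Multiplying by the 0/1 mask is what eliminates trees, dividers and buildings outside the lane boundaries before the downstream detector runs.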
Full Connectivity: Corners, edges and faces
We develop a cluster expansion for the probability of full connectivity of
high density random networks in confined geometries. In contrast to percolation
phenomena at lower densities, boundary effects, which have previously been
largely neglected, are not only relevant but dominant. We derive general
analytical formulas that show a persistence of universality in a different form
to percolation theory, and provide numerical confirmation. We also demonstrate
the simplicity of our approach in three simple but instructive examples and
discuss the practical benefits of its application to different models.

Comment: 28 pages, 8 figures.
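The quantity the cluster expansion targets, the probability that a confined high-density random network is fully connected, can be estimated directly by Monte Carlo for a random geometric graph in the unit square. The parameters and function name below are arbitrary choices for illustration, not the paper's models.

```python
import random

def prob_fully_connected(n_nodes, radius, n_trials=200, seed=1):
    """Monte Carlo estimate of the full-connectivity probability of a
    random geometric network in the unit square: nodes are uniform random
    points, linked when within `radius` of each other."""
    rng = random.Random(seed)
    hits = 0
    r2 = radius * radius
    for _ in range(n_trials):
        pts = [(rng.random(), rng.random()) for _ in range(n_nodes)]
        # Depth-first search from node 0 over the in-range graph.
        seen = {0}
        frontier = [0]
        while frontier:
            i = frontier.pop()
            xi, yi = pts[i]
            for j in range(n_nodes):
                if j not in seen:
                    xj, yj = pts[j]
                    if (xi - xj) ** 2 + (yi - yj) ** 2 <= r2:
                        seen.add(j)
                        frontier.append(j)
        if len(seen) == n_nodes:
            hits += 1
    return hits / n_trials

p_dense = prob_fully_connected(10, 1.5, n_trials=50)   # range spans the square
p_sparse = prob_fully_connected(10, 0.01, n_trials=50)  # almost surely split
```

A simulation like this is how the analytical boundary-term formulas would be confirmed numerically: isolated nodes near corners and edges dominate the failure probability at high density.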
A global assessment of the impact of climate change on water scarcity
This paper presents a global-scale assessment of the impact of climate change on water scarcity. Patterns of climate change from 21 Global Climate Models (GCMs) under four SRES scenarios are applied to a global hydrological model to estimate water resources across 1339 watersheds. The Water Crowding Index (WCI) and the Water Stress Index (WSI) are used to calculate exposure to increases and decreases in global water scarcity due to climate change. An estimated 1.6 (WCI) and 2.4 (WSI) billion people currently live within watersheds exposed to water scarcity. Using the WCI, by 2050 under the A1B scenario, 0.5 to 3.1 billion people are exposed to an increase in water scarcity due to climate change (range across 21 GCMs). This upper estimate is higher than in previous assessments because the scenarios are constructed from a wider range of GCMs. A substantial proportion of the uncertainty in the global-scale effect of climate change on water scarcity is due to uncertainty in the estimates for South Asia and East Asia. Sensitivity to the WCI and WSI thresholds that define water scarcity can be comparable to the sensitivity to the climate change pattern. More of the world will see an increase than a decrease in exposure to water scarcity due to climate change, but this is not consistent across all climate change patterns. Additionally, investigation of the effects of a set of prescribed global mean temperature change scenarios shows rapid increases in water scarcity due to climate change across many regions of the globe up to 2°C, followed by stabilisation up to 4°C.
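The two indices can be sketched for a single watershed. The threshold values used here (1000 persons per million m³/yr for the WCI, a 0.4 withdrawal-to-resource ratio for the WSI) are commonly cited conventions assumed for illustration; the paper's exact definitions and thresholds may differ.

```python
def water_crowding_index(population, renewable_water_m3_per_year):
    """WCI (Falkenmark-style): persons per million cubic metres of
    renewable water per year in the watershed."""
    return population / (renewable_water_m3_per_year / 1e6)

def water_stress_index(withdrawals_m3_per_year, renewable_water_m3_per_year):
    """WSI: ratio of water withdrawals to the renewable resource."""
    return withdrawals_m3_per_year / renewable_water_m3_per_year

def is_water_scarce(population, resource_m3, withdrawals_m3,
                    wci_threshold=1000.0, wsi_threshold=0.4):
    """Assumed illustrative thresholds: WCI > 1000 persons per million
    m3/yr, or WSI > 0.4, flags the watershed as water scarce."""
    return (water_crowding_index(population, resource_m3) > wci_threshold
            or water_stress_index(withdrawals_m3, resource_m3) > wsi_threshold)
```

Under a climate scenario, the same population with a reduced `resource_m3` raises both indices, which is how exposure to an increase in water scarcity is counted across the 1339 watersheds.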
A robust Gauss-Newton algorithm for the optimization of hydrological models: benchmarking against industry-standard algorithms
Optimization of model parameters is a ubiquitous task in hydrological and environmental modeling. Currently, the environmental modeling community tends to favor evolutionary techniques over classical Newton-type methods, in the light of the geometrically problematic features of objective functions, such as multiple optima and general nonsmoothness. The companion paper (Qin et al., 2018, https://doi.org/10.1029/2017WR022488) introduced the robust Gauss-Newton (RGN) algorithm, an enhanced version of the standard Gauss-Newton algorithm that employs several heuristics to enhance its explorative abilities and perform robustly even for problematic objective functions. This paper focuses on benchmarking the RGN algorithm against three optimization algorithms generally accepted as "best practice" in the hydrological community, namely, the Levenberg-Marquardt algorithm, the shuffled complex evolution (SCE) search (with 2 and 10 complexes), and the dynamically dimensioned search (DDS). The empirical case studies include four conceptual hydrological models and three catchments. Empirical results indicate that, on average, RGN is 2-3 times more efficient than SCE (2 complexes) by achieving comparable robustness at a lower cost, 7-9 times more efficient than SCE (10 complexes) by trading off some speed to more than compensate for a somewhat lower robustness, 5-7 times more efficient than Levenberg-Marquardt by achieving higher robustness at a moderate additional cost, and 12-26 times more efficient than DDS in terms of robustness per fixed cost. A detailed analysis of performance in terms of reliability and cost is provided. Overall, the RGN algorithm is an attractive option for the calibration of hydrological models, and we recommend further investigation of its benefits for broader types of optimization problems.

Youwei Qin, Dmitri Kavetski, George Kuczer
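For readers unfamiliar with the baseline method, a plain (undamped) Gauss-Newton iteration can be sketched for a two-parameter least-squares fit. This is the standard textbook algorithm, not the RGN heuristics, and the saturation-curve model and data below are invented for illustration rather than being one of the paper's hydrological models.

```python
import math

def gauss_newton(xs, ys, theta0, n_iter=20):
    """Plain Gauss-Newton least squares for the toy model
    y = a * (1 - exp(-b * x)), solving the 2x2 normal equations
    (J^T J) d = J^T r at each iteration."""
    a, b = theta0
    for _ in range(n_iter):
        r, J = [], []
        for x, y in zip(xs, ys):
            e = math.exp(-b * x)
            r.append(y - a * (1.0 - e))          # residual
            J.append((1.0 - e, a * x * e))       # d(model)/da, d(model)/db
        # Accumulate J^T J and J^T r, then solve the 2x2 system directly.
        s11 = sum(j[0] * j[0] for j in J)
        s12 = sum(j[0] * j[1] for j in J)
        s22 = sum(j[1] * j[1] for j in J)
        g1 = sum(j[0] * ri for j, ri in zip(J, r))
        g2 = sum(j[1] * ri for j, ri in zip(J, r))
        det = s11 * s22 - s12 * s12
        a += (s22 * g1 - s12 * g2) / det
        b += (s11 * g2 - s12 * g1) / det
    return a, b

# Noiseless synthetic data with true parameters a = 2.0, b = 0.5.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 * (1.0 - math.exp(-0.5 * x)) for x in xs]
a_fit, b_fit = gauss_newton(xs, ys, (1.0, 0.6))
```

From a poor starting point or on a multimodal objective this undamped iteration can overshoot or diverge, which is precisely the fragility that RGN's heuristics (and Levenberg-Marquardt's damping) are designed to address.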
Test of the Kolmogorov-Johnson-Mehl-Avrami picture of metastable decay in a model with microscopic dynamics
The Kolmogorov-Johnson-Mehl-Avrami (KJMA) theory for the time evolution of
the order parameter in systems undergoing first-order phase transformations has
been extended by Sekimoto to the level of two-point correlation functions.
Here, this extended KJMA theory is applied to a kinetic Ising lattice-gas
model, in which the elementary kinetic processes act on microscopic length and
time scales. The theoretical framework is used to analyze data from extensive
Monte Carlo simulations. The theory is inherently a mesoscopic continuum
picture, and in principle it requires a large separation between the
microscopic scales and the mesoscopic scales characteristic of the evolving
two-phase structure. Nevertheless, we find excellent quantitative agreement
with the simulations in a large parameter regime, extending remarkably far
towards strong fields (large supersaturations) and correspondingly small
nucleation barriers. The original KJMA theory permits direct measurement of the
order parameter in the metastable phase, and using the extension to correlation
functions one can also perform separate measurements of the nucleation rate and
the average velocity of the convoluted interface between the metastable and
stable phase regions. The values obtained for all three quantities are verified
by other theoretical and computational methods. As these quantities are often
difficult to measure directly during a process of phase transformation, data
analysis using the extended KJMA theory may provide a useful experimental
alternative.

Comment: RevTeX, 21 pages including 14 ps figures. Submitted to Phys. Rev. B. One misprint corrected in Eq. (C1).
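The classical KJMA prediction underlying the abstract, the transformed volume fraction phi(t) = 1 - exp(-K t^n), is easy to state in code. This is the original order-parameter result, not Sekimoto's two-point-correlation extension; the n = 4 rate helper assumes the textbook 3D case of constant nucleation rate and constant interface velocity.

```python
import math

def kjma_fraction(t, rate_k, exponent_n):
    """Classical KJMA (Avrami) transformed fraction:
    phi(t) = 1 - exp(-K * t**n)."""
    return 1.0 - math.exp(-rate_k * t ** exponent_n)

def kjma_rate_3d(nucleation_rate, interface_velocity):
    """Avrami coefficient for 3D growth with constant nucleation rate I
    and interface velocity v (exponent n = 4): K = (pi/3) * I * v**3."""
    return math.pi / 3.0 * nucleation_rate * interface_velocity ** 3
```

The abstract's point is that the nucleation rate and interface velocity entering K are measured separately via the correlation-function extension and then cross-checked; the fraction above is the quantity both routes must reproduce.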
- …