21 research outputs found

    A multiphase Cahn-Hilliard system with mobilities and the numerical simulation of dewetting

    Full text link
    We propose in this paper a new multiphase Cahn-Hilliard model with doubly degenerate mobilities. We prove by a formal asymptotic analysis that it approximates, with second-order accuracy, the multiphase surface diffusion flow with mobility coefficients and surface tensions. To illustrate that it lends itself well to numerical approximation, we propose a simple and effective numerical scheme together with a very compact Matlab implementation. We provide the results of various numerical experiments to show the influence of the mobility and surface tension coefficients. Thanks to its second-order accuracy and its good suitability for numerical implementation, our model is very handy for tackling notably difficult surface diffusion problems. In particular, we show that it can be used very effectively to simulate numerically the dewetting of thin liquid tubes on arbitrary solid supports without requiring nonlinear boundary conditions. Comment: 35 pages
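
    The paper's compact Matlab implementation is not reproduced in the abstract. As a rough illustration of the kind of scheme such phase-field models admit, the following is a minimal semi-implicit Fourier-spectral time step for the classical two-phase Cahn-Hilliard equation with constant mobility; it is not the paper's multiphase, doubly degenerate model, and all parameter values are illustrative.

```python
import numpy as np

# Minimal 2-D Cahn-Hilliard solver: u_t = Laplacian(W'(u) - eps^2 Laplacian(u)),
# with the double-well potential W(u) = (u^2 - 1)^2 / 4 and constant mobility.
# Semi-implicit Fourier-spectral stepping; illustrative values throughout.

N, L = 128, 2 * np.pi                      # grid points per side, domain size
eps, dt, steps = 0.05, 1e-5, 2000          # interface width, time step, iterations

k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
KX, KY = np.meshgrid(k, k, indexing="ij")
k2 = KX**2 + KY**2                         # |k|^2 on the grid
k4 = k2**2

rng = np.random.default_rng(0)
u = 0.01 * rng.standard_normal((N, N))     # small random initial condition

for _ in range(steps):
    fprime_hat = np.fft.fft2(u**3 - u)     # W'(u) transformed to Fourier space
    u_hat = np.fft.fft2(u)
    # Treat the stiff biharmonic term implicitly, the nonlinearity explicitly.
    u_hat = (u_hat - dt * k2 * fprime_hat) / (1.0 + dt * eps**2 * k4)
    u = np.real(np.fft.ifft2(u_hat))
```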

    Water Resources Decision Making Under Uncertainty

    Get PDF
    Uncertainty is in part about variability in relation to the physical characteristics of water resources systems. But uncertainty is also about ambiguity (Simonovic, 2009). Both variability and ambiguity are associated with a lack of clarity because of the behaviour of all system components, a lack of data, a lack of detail, a lack of structure to consider water resources management problems, working and framing assumptions being used to consider the problems, known and unknown sources of bias, and ignorance about how much effort it is worth expending to clarify the management situation. Climate change, addressed in this research project (CFCAS, 2008), is another important source of uncertainty that contributes to the variability in the input variables for water resources management. This report presents a set of examples that illustrate (a) probabilistic and (b) fuzzy set approaches for solving various water resources management problems. The main goal of this report is to demonstrate how information provided to water resources decision makers can be improved by using tools that incorporate risk and uncertainty. The uncertainty associated with water resources decision making problems is quantified using probabilistic and fuzzy set approaches. A set of selected examples is presented to illustrate the application of probabilistic and fuzzy simulation, optimization, and multi-objective analysis to water resources design, planning and operations. Selected examples include dike design, sewer pipe design, optimal operations of a single purpose reservoir, and planning of a multi-purpose reservoir system. The demonstrated probabilistic and fuzzy tools can be easily adapted to many other water resources decision making problems.
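
    As a toy illustration of the probabilistic side of this methodology (not a worked example from the report), the sketch below uses Monte Carlo simulation to estimate the annual exceedance probability of a candidate dike height; the Gumbel distribution of annual peak water levels and all numbers are assumptions.

```python
import numpy as np

# Toy Monte Carlo assessment of a dike height against annual peak water levels.
# The Gumbel model and every parameter below are illustrative assumptions,
# not values from the report.

rng = np.random.default_rng(42)
loc, scale = 4.0, 0.5            # assumed Gumbel location/scale of annual peaks (m)
dike_crest = 6.0                 # candidate dike crest level (m)
n_years = 1_000_000              # number of simulated years

peaks = rng.gumbel(loc, scale, size=n_years)
p_exceed = np.mean(peaks > dike_crest)        # annual exceedance probability
print(f"annual exceedance probability ~ {p_exceed:.2e}")
if p_exceed > 0:
    print(f"return period ~ {1.0 / p_exceed:.0f} years")
```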

    Development of a vehicle dynamics controller for obstacle avoidance

    Get PDF
    As roads become busier and automotive technology improves, there is considerable potential for driver assistance systems to improve the safety of road users. Longitudinal collision warning and collision avoidance systems are starting to appear on production cars to assist drivers when required to stop in an emergency. Many luxury cars are also equipped with stability augmentation systems that prevent the car from spinning out of control during aggressive lateral manoeuvres. Combining these concepts, there is a natural progression to systems that could aid in, or perform, lateral collision avoidance manoeuvres. A successful automatic lateral collision avoidance system would require convergent development of many fields of technology, from sensors and instrumentation to aid environmental awareness through to improvements in driver-vehicle interfaces so that a degree of control can be smoothly and safely transferred between the driver and the vehicle computer. A fundamental requirement of any collision avoidance system is determination of a feasible path that avoids obstacles and a means of causing the vehicle to follow that trajectory. This research focuses on feasible trajectory generation and development of an automatic obstacle avoidance controller that integrates steering and braking action. A controller is developed to cause a specially modified car (a Mercedes `S' class with steer-by-wire and brake-by-wire capability) to perform an ISO 3888-2 emergency obstacle avoidance manoeuvre. A nonlinear two-track vehicle model is developed and used to derive optimal controller parameters using a series of simulations. Feedforward and feedback control is used to track a feasible reference trajectory. The feedforward control loops use inverse models of the vehicle dynamics. The feedback control loops are implemented as linear proportional controllers, with a force allocation matrix used to apportion braking effort between redundant actuators. Two trajectory generation routines are developed: a geometric method, for steering a vehicle at its physical limits; and an optimal method, which integrates steering and braking action to make full use of available traction. The optimal trajectory is obtained using a multi-stage convex optimisation procedure. The overall controller performance is validated by simulation using a complex proprietary model of the vehicle that is reported to have been validated and calibrated against experimental data over several years of use in an industrial environment.
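
    To make the force-allocation idea concrete, here is a minimal sketch of apportioning a demanded longitudinal force and yaw moment between four wheel brakes with a least-squares pseudo-inverse; the geometry, sign convention and demands are invented for illustration, and this is not the thesis controller.

```python
import numpy as np

# Minimal force-allocation step: map a demanded longitudinal force and yaw moment
# onto four wheel brake forces using a least-squares (minimum-norm) pseudo-inverse.
# Vehicle geometry, sign convention and demand values are invented for illustration.

track = 1.6                       # track width (m), assumed
half = track / 2.0                # lateral lever arm of each wheel about the CG

# Columns: [FL, FR, RL, RR]. Row 0: contribution to total longitudinal force.
# Row 1: contribution to yaw moment (left wheels positive by this convention).
B = np.array([
    [1.0,   1.0,   1.0,   1.0],
    [half, -half,  half, -half],
])

demand = np.array([-6000.0, 800.0])   # demanded Fx (N) and yaw moment (N m)

# A real controller would add actuator limits, tyre-load weighting and rate limits.
wheel_forces = np.linalg.pinv(B) @ demand
print(dict(zip(["FL", "FR", "RL", "RR"], np.round(wheel_forces, 1))))
```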

    The investigation of noctilucent clouds and other mesospheric phenomena using ground-based instrumentation and rockets

    Get PDF
    The optical and dynamical properties of the summer phenomena known as Noctilucent Clouds (NLC) have been studied globally since the early 1960s. These clouds occur naturally only in the Earth's mesosphere, and are presently studied using remote sensing from rockets and satellites in addition to ground-based observations. Direct evidence of the topology and structure of an aerosol layer such as an NLC can be obtained using non-imaging photodiode/photometers housed on a rocket payload. The APL-designed photometers utilize the spin of the rocket payload, scanning the entire sky laterally and producing a two-dimensional image of the aerosol layer traversed during the upleg and downleg flights. The APL photodiode/photometers were flown during the MAED (Middle Atmospheric ElectroDynamics) and NLC-91 (NoctiLucent Clouds 1991) summer rocket campaigns. These multinational rocket campaigns were coordinated with ground-based and satellite observations. The resultant APL data provide a complementary data source to be compared and contrasted with other rocket experiments flown during the absence/presence of NLC and/or PMSE (Polar Mesospheric Summer Echoes). A Bomem Michelson Interferometer (MI), stationed in Sweden during the summer rocket campaign NLC-91, provided measurements of the hydroxyl (3,1) band emission from a layer positioned at ~87 km. The presented data were kindly provided by the Space Dynamics Laboratory, Utah State University, and analyzed and interpreted by the author. The data gave a measure of the upper mesospheric conditions during the presence and absence of NLC during the rocket campaign. The interpretation of the raw data gave an indication of stratospheric filtering of upward propagating waves, whose diminution could produce the upward forcing that may be involved in the NLC formation processes. Intensity and rotational temperature profiles deduced during the absence/presence of NLC gave clear evidence of small- and large-scale waves, and a possible correlation between the hydroxyl intensity and mesopause temperature. A ground-based Imaging Fabry-Perot Interferometer (IFPI), stationed at the Bear Lake Observatory (BLO), Utah (41.93°N, 111.42°W), has been operated since 1989 to study the behaviour of the mesospheric winds at mid-latitudes, deduced from measurements of the intensity and wind speeds of the 8430 Å hydroxyl (6,2) band. The results presented were obtained during the summer periods at this site. The IFPI has provided an opportunity to observe this weak infra-red emission line and allows continuous monitoring of the mesopause region throughout the year. Finally, the night-by-night characteristics of the hydroxyl intensity variations and wind speed structure from the mid-latitude IFPI (BLO) are compared with the intensity fluctuations indicated by the Bomem MI as being associated with the absence/presence of NLC at summer high latitudes.
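
    As a pointer to how rotational temperatures are typically extracted from hydroxyl band intensities, the sketch below applies the two-line Boltzmann relation; the line intensities, term energies and line-strength factors are placeholders, not measurements from the thesis.

```python
import numpy as np

# Two-line Boltzmann estimate of an OH rotational temperature.
# All line intensities, term energies and line-strength factors below are
# invented placeholders, not data from the thesis.

h_c = 1.986445857e-23    # h*c in J cm, converts term energies from cm^-1 to J
k_B = 1.380649e-23       # Boltzmann constant (J/K)

I1, I2 = 1.00, 0.24      # measured relative line intensities (arbitrary units)
S1, S2 = 1.0, 1.0        # relative line-strength factors (assumed equal here)
E1, E2 = 100.0, 300.0    # upper-state term energies (cm^-1), placeholders

# I1 / I2 = (S1 / S2) * exp(-(E1 - E2) / (k_B * T))  =>  solve for T.
T_rot = (E2 - E1) * h_c / (k_B * np.log((I1 * S2) / (I2 * S1)))
print(f"rotational temperature ~ {T_rot:.0f} K")   # ~200 K with these placeholders
```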

    A fast classifier-based approach to credit card fraud detection

    Get PDF
    This thesis aims at addressing the problem of anomaly detection in the context of credit card fraud detection with machine learning. Specifically, the goal is to apply a new approach to two-sample testing based on classifiers, recently developed for new physics searches in high-energy physics. This strategy allows one to compare batches of incoming data with a control sample of standard transactions in a statistically sound way, without prior knowledge of the type of fraudulent activity. The learning algorithm at the basis of this approach is a modern implementation of kernel methods that allows for fast online training and high flexibility. This work is the first attempt to export this method to a real-world use case outside the domain of particle physics.
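
    A minimal sketch of the generic classifier-based two-sample test idea, with synthetic data and an off-the-shelf logistic regression standing in for the kernel machine actually used in the thesis: if the classifier separates the incoming batch from the control sample better than chance, the two samples likely differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Generic classifier two-sample test on synthetic data. A logistic regression stands
# in for the kernel-based learner of the thesis; feature dimensions and the injected
# shift are arbitrary illustrative choices.

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(2000, 5))   # control sample ("standard" data)
incoming = rng.normal(0.1, 1.0, size=(2000, 5))    # incoming batch with a small shift

X = np.vstack([reference, incoming])
y = np.concatenate([np.zeros(len(reference)), np.ones(len(incoming))])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"held-out AUC = {auc:.3f}  (about 0.5 would mean the batches look alike)")
```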

    Computational stress and damage modelling for rolling contact fatigue

    Get PDF

    Discrete relativistic positioning systems

    Get PDF
    We discuss the design of a discrete, immediate, simple relativistic positioning system (rPS) which is potentially capable of self-positioning (up to isometries) and of operating without calibration or ground control assistance. The design is discussed in 1 + 1 spacetimes, in the Minkowski and Schwarzschild solutions, as well as in 2 + 1 spacetimes in Minkowski. The system works without calibration, i.e. clock synchronizations, or prior knowledge about the motion of the clocks. It is robust, i.e. it is able to detect when its working hypotheses break down (for example, if one or more clocks temporarily become not freely falling, or the gravitational field changes), and it is automatically back and operational when the assumed conditions are restored. In the Schwarzschild case, we also check that the system can best fit the gravitational mass of the source of the gravitational field. We stress that no weak field assumptions are made anywhere. In particular, the rPS we propose can work in a region close to the horizon, since it does not use approximations or PPN expansions. More generally, rPSs can be adapted as detectors for the gravitational field, and we briefly discuss their role in testing different theoretical settings for gravity. In fact, the rPS is a natural candidate for a canonical method to extract observables out of a gravitational theory, an activity also known as designing experiments to test gravity.
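
    As a toy counterpart to the emission-coordinate idea (much simpler than the paper's self-locating system), the sketch below inverts the null relations for a user between two static emitters at known positions in 1 + 1 Minkowski spacetime, with c = 1; all numbers are illustrative.

```python
# Toy 1+1 Minkowski positioning from emission (null) coordinates, with c = 1.
# Two static emitters at known positions broadcast their clock readings; a user
# between them who receives the pair (tau1, tau2) inverts the null relations to
# recover the reception event (t, x). Unlike the paper's rPS, this toy version
# assumes the emitter worldlines are known; all numbers are illustrative.

x1, x2 = 0.0, 10.0                 # emitter positions, with x1 < x < x2 assumed


def locate(tau1: float, tau2: float) -> tuple[float, float]:
    """Invert tau1 = t - (x - x1) and tau2 = t - (x2 - x) for the event (t, x)."""
    t = 0.5 * (tau1 + tau2 + (x2 - x1))
    x = 0.5 * (x1 + x2 - (tau1 - tau2))
    return t, x


# Signals broadcast at tau1 = 3 and tau2 = 1 arrive together at the user.
print(locate(3.0, 1.0))            # -> (7.0, 4.0): received at t = 7, x = 4
```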

    OpenFPM: A scalable environment for particle and particle-mesh codes on parallel computers

    Get PDF
    Scalable and efficient numerical simulations continue to gain importance, as computation is firmly established as a tool of discovery, together with theory and experiment. Meanwhile, the performance of computing hardware grows through increasingly heterogeneous hardware, enabling simulations of ever more complex models. However, efficiently implementing scalable codes on heterogeneous, distributed hardware systems becomes the bottleneck. This bottleneck can be alleviated by intermediate software layers that provide higher-level abstractions closer to the problem domain, allowing the computational scientist to focus on the simulation. Here, we present OpenFPM, an open and scalable framework that provides an abstraction layer for numerical simulations using particles and/or meshes. OpenFPM provides transparent and scalable infrastructure for shared-memory and distributed-memory implementations of particles-only and hybrid particle-mesh simulations of both discrete and continuous models, as well as non-simulation codes. This infrastructure is complemented with frequently used numerical routines, as well as interfaces to third-party libraries. This thesis presents the architecture and design of OpenFPM, details the underlying abstractions, and benchmarks the framework in applications ranging from Smoothed-Particle Hydrodynamics (SPH) to Molecular Dynamics (MD), Discrete Element Methods (DEM), Vortex Methods, stencil codes, high-dimensional Monte Carlo sampling (CMA-ES), and Reaction-Diffusion solvers, comparing it to the current state of the art and to existing software frameworks.
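
    To give a flavour of the particle-mesh operations such a framework abstracts, here is a plain-NumPy 1-D cloud-in-cell deposit of particle masses onto a periodic mesh; this is purely illustrative and is not OpenFPM's API.

```python
import numpy as np

# 1-D cloud-in-cell (CIC) particle-to-mesh deposit on a periodic grid, written in
# plain NumPy purely for illustration; this is not OpenFPM's API.

L, n_cells = 1.0, 16
dx = L / n_cells
rng = np.random.default_rng(1)
positions = rng.uniform(0.0, L, size=1000)     # particle positions in [0, L)
masses = np.full(positions.size, 1.0)          # per-particle mass

density = np.zeros(n_cells)
cell = np.floor(positions / dx).astype(int)    # index of the cell to the left
frac = positions / dx - cell                   # fractional offset within that cell

# Each particle splits its mass linearly between its two neighbouring cells.
np.add.at(density, cell % n_cells, masses * (1.0 - frac))
np.add.at(density, (cell + 1) % n_cells, masses * frac)
density /= dx                                  # accumulated mass -> density

print(density.sum() * dx, masses.sum())        # total mass is conserved
```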

    Radiation Damage of TiC and TiN during Microanalysis in the Electron Microscope

    Get PDF
    The work presented in this thesis is concerned with the preferential mass loss observed during high spatial resolution microanalysis of ceramic materials such as TiC and TiN. Electron energy loss spectroscopy (EELS) was used to investigate the preferential loss of the light elements as a function of the dose incident upon the specimen. This thesis is primarily concerned with the characterisation of the knock-on displacement damage mechanism thought to be responsible for the mass loss. Chapter 1 gives a general introduction to these knock-on damage processes, with particular emphasis on radiation damage observed in the electron microscope. To perform quantitative EELS microanalysis without the use of standard specimens it is necessary to have an accurate knowledge of the cross-sections relevant to the inelastic scattering of electrons. Chapter 2 outlines the development of the partial cross-sections required for EELS analysis and discusses other processes which may contribute to the EELS spectrum. Chapter 3 develops two radiation damage mechanisms as possible explanations for the depletion of the light elements in TiC and TiN during electron irradiation: a forward knock-on displacement model and an isotropic radiation-induced diffusion model. In chapter 4 a brief description is given of the scanning transmission electron microscope (STEM) used to carry out the high current density radiation damage experiments described in this thesis. The discussion includes a description of two specimen preparation techniques developed to provide electron transparent samples with a plentiful supply of uniform thin areas, suitable for radiation damage experiments. The reduced dose rate incident upon the specimen during radiation damage experiments results in statistically poor EEL spectra. Often, the accuracy with which quantitative data can be extracted from these "noisy" spectra is limited by the accuracy of the background stripping routines. Chapter 5 compares and contrasts three such background fitting routines, using both experimental and theoretically generated spectra, in an attempt to assess which, if any, is more reliable in the presence of noise. The experimental results are presented in chapters 6 and 7 for the TiC and TiN materials. Comparison of the rate of loss of C and N with respect to dose at various specimen thicknesses is carried out to establish which, if either, of the two radiation damage mechanisms considered in chapter 3 is applicable to the data. Other considerations such as dose rate effects, channeling effects and the loss of Ti from the sample are considered in more detail in chapter 7, and their effect on the measured rate of loss of N is established. Some high-angle ADF images are presented in chapter 6 as a possible method of following the radiation damage process and to highlight the inhomogeneous nature of the damage processes in TiC and TiN.
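
    Since the accuracy of background stripping is a recurring issue in the thesis, here is a minimal sketch of the standard power-law (A E^-r) pre-edge background fit used in EELS quantification, applied to a synthetic spectrum; window limits, edge position and all parameters are illustrative, and this is not one of the three routines compared in chapter 5.

```python
import numpy as np

# Power-law background fit I(E) = A * E**(-r), the standard pre-edge model in EELS
# quantification, fitted as a straight line in log-log space. The spectrum below is
# synthetic and every parameter (windows, edge position, counts) is illustrative.

rng = np.random.default_rng(3)
E = np.arange(200.0, 500.0, 1.0)                      # energy-loss axis (eV)
true_bg = 1e12 * E**-3.0                              # smooth background (counts)
edge = np.where(E >= 285.0, 0.04 * true_bg[E.searchsorted(285.0)], 0.0)  # crude edge
spectrum = rng.poisson(true_bg + edge).astype(float)  # noisy measured counts

# Fit A and r on a pre-edge window, then extrapolate the background under the edge.
win = (E >= 230.0) & (E < 280.0)
slope, intercept = np.polyfit(np.log(E[win]), np.log(spectrum[win]), 1)
A, r = np.exp(intercept), -slope
background = A * E**-r

signal = spectrum - background                        # background-stripped edge signal
print(f"fitted r = {r:.2f}, edge counts above 285 eV = {signal[E >= 285.0].sum():.0f}")
```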