High performance Monte Carlo computation for finance risk data analysis
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Finance risk management has been playing an increasingly important role in the finance sector, both to analyse finance data and to prevent potential crises. It has been widely recognised that Value at Risk (VaR) is an effective method for finance risk management and evaluation. This thesis conducts a comprehensive review of a number of VaR methods and discusses their strengths and limitations in depth. Among these VaR methods, Monte Carlo simulation and analysis has proven to be the most accurate VaR method in finance risk evaluation due to its strong modelling capabilities. However, one major challenge in Monte Carlo analysis is its high computational complexity of O(n²). To speed up the computation in Monte Carlo analysis, this thesis parallelises Monte Carlo using the MapReduce model, which has become a major software programming model in support of data-intensive applications. MapReduce consists of two functions, Map and Reduce. The Map function segments a large data set into small data chunks and distributes these chunks among a number of computers for processing in parallel, with a Mapper processing a data chunk on a computing node. The Reduce function collects the results generated by these Map nodes (Mappers) and generates an output. The parallel Monte Carlo is evaluated initially in a small-scale MapReduce experimental environment, and subsequently in a large-scale simulation environment. Both experimental and simulation results show that the MapReduce-based parallel Monte Carlo is significantly faster than the sequential Monte Carlo while maintaining the same level of accuracy. In data-intensive applications, moving huge volumes of data among the computing nodes can incur high communication overhead.
To address this issue, this thesis further considers data locality in the MapReduce-based parallel Monte Carlo, and evaluates the impact of data locality on computational performance.
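The Map/Reduce split described above can be sketched in a few lines of Python. This is a minimal illustration, not the thesis's implementation: each call to `map_chunk` stands in for a Mapper running on its own node, the drift and volatility parameters are invented for the example, and the Reduce step simply merges the simulated losses and reads off the VaR quantile.

```python
import random

def map_chunk(seed, n_paths, mu=0.0005, sigma=0.02):
    """Mapper: simulate one chunk of one-day portfolio returns.

    In a real MapReduce job each mapper would run on its own node;
    here we just call it locally. mu and sigma are illustrative
    daily drift/volatility values, not taken from the thesis.
    """
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) for _ in range(n_paths)]

def reduce_var(chunks, confidence=0.99):
    """Reducer: merge all simulated losses and read off the VaR quantile."""
    losses = sorted(-r for chunk in chunks for r in chunk)
    idx = int(confidence * len(losses))
    return losses[min(idx, len(losses) - 1)]

# Four "Mappers", each simulating 25,000 paths of a 100,000-path run.
chunks = [map_chunk(seed, 25_000) for seed in range(4)]
var_99 = reduce_var(chunks, confidence=0.99)
```

Because the paths are independent, the Map phase parallelises trivially and the Reduce phase only has to aggregate; this is the property the thesis exploits.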
Data-driven modelling of biological multi-scale processes
Biological processes involve a variety of spatial and temporal scales. A
holistic understanding of many biological processes therefore requires
multi-scale models which capture the relevant properties on all these scales.
In this manuscript we review mathematical modelling approaches used to describe
the individual spatial scales and how they are integrated into holistic models.
We discuss the relation between spatial and temporal scales and the implications
of this for multi-scale modelling. Based upon this overview of
state-of-the-art modelling approaches, we formulate key challenges in
mathematical and computational modelling of biological multi-scale and
multi-physics processes. In particular, we consider the availability of
analysis tools for multi-scale models and model-based multi-scale data
integration. We provide a compact review of methods for model-based data
integration and model-based hypothesis testing. Furthermore, novel approaches
and recent trends are discussed, including computation time reduction using
reduced order and surrogate models, which contribute to the solution of
inference problems. We conclude the manuscript by providing a few ideas for the
development of tailored multi-scale inference methods.
Comment: This manuscript will appear in the Journal of Coupled Systems and Multiscale Dynamics (American Scientific Publishers).
An empirical learning-based validation procedure for simulation workflow
Simulation workflow is a top-level model for the design and control of
simulation process. It connects multiple simulation components with time and
interaction restrictions to form a complete simulation system. Before the
construction and evaluation of the component models, the validation of
upper-layer simulation workflow is of the utmost importance in a simulation
system. However, methods specifically for validating simulation workflows are
very limited. Many of the existing validation techniques are domain-dependent,
with cumbersome questionnaire design and expert scoring. Therefore, this paper
presents an empirical learning-based validation procedure to implement a
semi-automated evaluation for simulation workflow. First, representative
features of general simulation workflow and their relations with validation
indices are proposed. The calculation process of workflow credibility based on
Analytic Hierarchy Process (AHP) is then introduced. In order to make full use
of the historical data and implement more efficient validation, four learning
algorithms, including back propagation neural network (BPNN), extreme learning
machine (ELM), evolving neo-fuzzy neuron (eNFN) and fast incremental Gaussian mixture
model (FIGMN), are introduced for constructing the empirical relation between
the workflow credibility and its features. A case study on a landing-process
simulation workflow is established to test the feasibility of the proposed
procedure. The experimental results also provide a useful overview of the
state-of-the-art learning algorithms for the credibility evaluation of
simulation models.
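The AHP step mentioned in this abstract can be illustrated with a small sketch: weights for the validation indices are derived from the principal eigenvector of a pairwise comparison matrix, approximated here by power iteration. The 3×3 matrix and the index names in the comments are invented for illustration and are not taken from the paper.

```python
def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of a positive pairwise
    comparison matrix M by power iteration, normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgements: index A is 3x as important as index B,
# 5x as important as index C, and B is 2x as important as C.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(M)
```

A workflow credibility score would then be a weighted sum of the per-index validation scores using `w`; the learning algorithms in the paper replace this hand-built step with a relation learned from historical data.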
State of the Art in the Optimisation of Wind Turbine Performance Using CFD
Wind energy has received increasing attention in recent years due to its sustainability and geographically wide availability. The efficiency of wind energy utilisation depends strongly on the performance of wind turbines, which convert the kinetic energy in wind into electrical energy. In order to optimise wind turbine performance and reduce the cost of next-generation wind turbines, it is crucial to have a view of the state of the art in the key aspects of wind turbine performance optimisation using Computational Fluid Dynamics (CFD), which has attracted enormous interest in the development of next-generation wind turbines in recent years. This paper presents a comprehensive review of state-of-the-art progress in optimising wind turbine performance using CFD, covering the objective functions used to judge wind turbine performance, the CFD approaches applied in the simulation of wind turbines, and the optimisation algorithms for wind turbine performance. This paper has been written both for researchers new to this research area, by summarising the underlying theory whilst presenting a comprehensive review of up-to-date studies, and for experts in the field, by collecting a comprehensive list of related references where the details of the computational methods that have been employed lately can be obtained.
Principles and Concepts of Agent-Based Modelling for Developing Geospatial Simulations
The aim of this paper is to outline fundamental concepts and principles of the Agent-Based Modelling (ABM) paradigm, with particular reference to the development of geospatial simulations. The paper begins with a brief definition of modelling, followed by a classification of model types, and a comment regarding a shift (in certain circumstances) towards modelling systems at the individual-level. In particular, automata approaches (e.g. Cellular Automata, CA, and ABM) have been particularly popular, with ABM moving to the fore. A definition of agents and agent-based models is given; identifying their advantages and disadvantages, especially in relation to geospatial modelling. The potential use of agent-based models is discussed, and how-to instructions for developing an agent-based model are provided. Types of simulation / modelling systems available for ABM are defined, supplemented with criteria to consider before choosing a particular system for a modelling endeavour. Information pertaining to a selection of simulation / modelling systems (Swarm, MASON, Repast, StarLogo, NetLogo, OBEUS, AgentSheets and AnyLogic) is provided, categorised by their licensing policy (open source, shareware / freeware and proprietary systems). The evaluation (i.e. verification, calibration, validation and analysis) of agent-based models and their output is examined, and noteworthy applications are discussed. Geographical Information Systems (GIS) are a particularly useful medium for representing model input and output of a geospatial nature. However, GIS are not well suited to dynamic modelling (e.g. ABM). In particular, problems of representing time and change within GIS are highlighted. Consequently, this paper explores the opportunity of linking (through coupling or integration / embedding) a GIS with a simulation / modelling system purposely built, and therefore better suited to supporting the requirements of ABM.
This paper concludes with a synthesis of the discussion that has preceded.
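The "how-to" flavour of the paper, defining agents, giving them behaviour and stepping them through time, can be illustrated with a minimal, hypothetical sketch. None of this code comes from the systems listed above (Swarm, NetLogo, etc.); it only shows the agent/scheduler structure they all share.

```python
import random

class Agent:
    """A minimal mobile agent taking random steps on a toroidal grid."""
    def __init__(self, x, y, rng):
        self.x, self.y, self.rng = x, y, rng

    def step(self, width, height):
        dx, dy = self.rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.x = (self.x + dx) % width   # wrap around the grid edges
        self.y = (self.y + dy) % height

def run(n_agents=10, width=20, height=20, steps=50, seed=0):
    rng = random.Random(seed)
    agents = [Agent(rng.randrange(width), rng.randrange(height), rng)
              for _ in range(n_agents)]
    for _ in range(steps):       # the scheduler: step every agent each tick
        for a in agents:
            a.step(width, height)
    return agents

agents = run()
```

In a geospatial ABM the grid would be replaced by GIS-derived space and the random walk by domain behaviour, which is exactly the coupling question the paper examines.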
Emulation of Poincaré return maps with Gaussian Kriging models
In this paper we investigate the use of Gaussian emulators as an accurate and computationally fast method to approximate return maps, a tool used to study the dynamics of differential equations. One advantage of emulators over other approximation techniques is that they encode deterministic data exactly, so known values of the return map are reproduced exactly by the emulator; another is that emulators allow us to simultaneously emulate a parameterized family of ODEs, giving a tool to assess the behavior of perturbed systems. The methods introduced here are illustrated using two well-known dynamical systems: the Rössler equations and the billiard system. We show that the method can be used to study return maps, and we discuss the further implications for full computation of differential equation outputs.
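The exact-interpolation property the abstract relies on, namely that a noise-free Gaussian-process emulator reproduces its training points exactly, can be shown with a small self-contained sketch. The squared-exponential kernel, the length-scale and the sample "return-map" values below are illustrative choices, not taken from the paper.

```python
import math

def rbf(x, y, ell=1.0):
    """Squared-exponential kernel with length-scale ell."""
    return math.exp(-(x - y) ** 2 / (2 * ell ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    A = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for k in range(c, n + 1):
                A[r][k] -= f * A[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][k] * x[k] for k in range(r + 1, n))) / A[r][r]
    return x

def emulate(xs, ys, ell=1.0):
    """Return the GP posterior mean m(x) = k(x)^T K^{-1} y (no noise term)."""
    K = [[rbf(a, b, ell) for b in xs] for a in xs]
    alpha = solve(K, ys)
    return lambda x: sum(a * rbf(x, xi, ell) for a, xi in zip(alpha, xs))

# Hypothetical known return-map values at four crossing points.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.5, 1.2, 0.9, 1.4]
f = emulate(xs, ys)
```

Because no noise term is added to the kernel matrix, `f(xi)` returns `yi` exactly (up to floating-point error), which is the deterministic-data property the paper exploits; between the data points `f` gives a cheap smooth surrogate for the expensive ODE integration.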
Contrasting the capabilities of building energy performance simulation programs
For the past 50 years, a wide variety of building energy simulation programs have been developed, enhanced and used throughout the building energy community. This paper is an overview of a report which provides an up-to-date comparison of the features and capabilities of twenty major building energy simulation programs. The comparison is based on information provided by the program developers in the following categories: general modeling features; zone loads; building envelope, daylighting and solar; infiltration, ventilation and multizone airflow; renewable energy systems; electrical systems and equipment; HVAC systems; HVAC equipment; environmental emissions; economic evaluation; climate data availability; results reporting; validation; and user interface, links to other programs, and availability.
A review of wildland fire spread modelling, 1990-present 3: Mathematical analogues and simulation models
In recent years, advances in computational power and spatial data analysis
(GIS, remote sensing, etc) have led to an increase in attempts to model the
spread and behaviour of wildland fires across the landscape. This series of
review papers endeavours to critically and comprehensively review all types of
surface fire spread models developed since 1990. This paper reviews models of a
simulation or mathematical analogue nature. Most simulation models are
implementations of existing empirical or quasi-empirical models and their
primary function is to convert these generally one dimensional models to two
dimensions and then propagate a fire perimeter across a modelled landscape.
Mathematical analogue models are those that are based on some mathematical
conceit (rather than a physical representation of fire spread) that
coincidentally simulates the spread of fire. Other papers in the series review
models of a physical or quasi-physical nature and empirical or quasi-empirical
nature. Many models are extensions or refinements of models developed before
1990. Where this is the case, these models are also discussed but much less
comprehensively.
Comment: 20 pages + 9 pages references + 1 page figures. Submitted to the International Journal of Wildland Fire.
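The perimeter-propagation idea described in the abstract, taking a one-dimensional spread model and propagating it across a two-dimensional landscape, can be caricatured with a minimal cellular-automaton sketch. This is purely illustrative: real simulators of the kind reviewed use empirical spread rates together with wind, slope and fuel maps rather than a uniform one-cell-per-step rule.

```python
def spread(ignition, steps):
    """Propagate a fire perimeter on an unbounded integer grid.

    ignition: set of burning (row, col) cells.
    Each time step, every burning cell ignites its 4-neighbours,
    i.e. a constant one-dimensional spread rate applied in 2-D.
    """
    burning = set(ignition)
    for _ in range(steps):
        new = set()
        for (r, c) in burning:
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                new.add((r + dr, c + dc))
        burning |= new
    return burning

# A single ignition point grows into a diamond-shaped burnt area.
fire = spread({(0, 0)}, steps=3)
```

After `n` steps the burnt area is every cell within Manhattan distance `n` of the ignition point; heterogeneous spread rates and wind would distort this diamond into the irregular perimeters the reviewed models aim to reproduce.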