Validation of simulated real world TCP stacks
The TCP models in ns-2 have been validated and are widely used in network research. However, they are not aimed at producing results consistent with any particular TCP implementation; rather, they are designed as a general model of TCP congestion control. The Network Simulation Cradle makes real world TCP implementations available to ns-2: Linux, FreeBSD and OpenBSD can all be simulated as easily as using the original simplified models. These simulated TCP implementations can be validated by directly comparing packet traces from simulations to traces measured on a real network. We describe the Network Simulation Cradle, present packet trace comparison results showing the high degree of accuracy possible when simulating with real TCP implementations, and briefly show how this is reflected in a simulation study of TCP throughput.
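The validation approach described above, comparing packet traces from simulation against traces measured on a real network, can be sketched as follows. This is an illustrative toy only: the tuple-based trace format, the field names, and the tolerance are assumptions for the example, not the authors' actual trace layout or comparison method.

```python
# Hypothetical sketch: compare a simulated TCP packet trace against a
# measured one. Each trace is a list of (timestamp_s, seq_number) tuples;
# this format is made up for the example.

def compare_traces(simulated, measured):
    """Return (fraction of packets whose sequence numbers match in order,
    largest timestamp difference among the matching packets)."""
    matches = 0
    max_skew = 0.0
    for (t_sim, seq_sim), (t_real, seq_real) in zip(simulated, measured):
        if seq_sim == seq_real:
            matches += 1
            max_skew = max(max_skew, abs(t_sim - t_real))
    n = min(len(simulated), len(measured))
    return (matches / n if n else 0.0), max_skew

sim = [(0.000, 1), (0.012, 1449), (0.024, 2897)]
real = [(0.001, 1), (0.013, 1449), (0.026, 2897)]
frac, skew = compare_traces(sim, real)
print(frac, round(skew, 3))  # 1.0 0.002
```

A real comparison would of course parse pcap-style traces and account for retransmissions and reordering; the point is only that trace-level agreement can be quantified packet by packet.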
MGSim - Simulation tools for multi-core processor architectures
MGSim is an open source discrete event simulator for on-chip hardware
components, developed at the University of Amsterdam. It is intended to be a
research and teaching vehicle to study the fine-grained hardware/software
interactions on many-core and hardware multithreaded processors. It includes
support for core models with different instruction sets, a configurable
multi-core interconnect, multiple configurable cache and memory models, a
dedicated I/O subsystem, and comprehensive monitoring and interaction
facilities. The default model configuration shipped with MGSim implements
Microgrids, a many-core architecture with hardware concurrency management.
MGSim is written mostly in C++ and uses object classes to represent
chip components. It is optimized for architecture models that can be described
as process networks.
Comment: 33 pages, 22 figures, 4 listings, 2 tables
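MGSim itself is written in C++; as a language-neutral illustration of the discrete-event style it describes, a minimal event-driven simulation kernel might look like the sketch below. The `Simulator` class and the toy producer component are inventions for this example, not MGSim's actual API.

```python
import heapq

# Minimal discrete-event simulation kernel, sketched for illustration only.
# Components schedule callbacks; the kernel pops them in timestamp order.

class Simulator:
    def __init__(self):
        self.now = 0
        self._queue = []  # heap of (time, seq, callback)
        self._seq = 0     # tie-breaker so the heap never compares callbacks

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, cb = heapq.heappop(self._queue)
            cb()

# Toy "component": a producer that fires every 2 cycles, 3 times in total.
log = []
sim = Simulator()

def tick(remaining):
    log.append(sim.now)
    if remaining > 1:
        sim.schedule(2, lambda: tick(remaining - 1))

sim.schedule(0, lambda: tick(3))
sim.run()
print(log)  # [0, 2, 4]
```

A production simulator like MGSim adds typed components, cycle-accurate arbitration, and monitoring on top of essentially this event loop.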
Network Simulation Cradle
This thesis proposes the use of real world network stacks instead of protocol
abstractions in a network simulator, bringing the actual code used in
computer systems inside the simulator and allowing for greater simulation
accuracy. Specifically, a framework called the Network Simulation
Cradle is created that supports the kernel source code from FreeBSD, OpenBSD
and Linux to make the network stacks from these systems available to the
popular network simulator ns-2.
Simulating with these real world network stacks reveals situations where the
results differ significantly from those of ns-2's TCP models. The simulated
network stacks can be compared directly with the same operating systems
running on actual machines, making validation simple. When packet traces
measured on a test network are compared with those produced in simulation,
the results are nearly identical, a level of accuracy previously unavailable
with traditional TCP simulation models. This dissertation presents the results
of simulations comparing ns-2 TCP models with our framework, along with
validation studies of our framework showing how closely simulation resembles
real world computers.
Using real world stacks to simulate TCP is a complementary approach to using
the existing TCP models and provides an extra level of validation. This way of
simulating TCP and other protocols opens new possibilities for the network
researcher or engineer. One example is using the framework as a protocol
development environment, which allows user-level development of protocols with
a standard set of reproducible tests, the ability to test scenarios that are
costly or impossible to build physically, and the ability to trace and debug
the protocol code without affecting results.
Using Emulation to Engineer and Understand Simulations of Biological Systems
Modeling and simulation techniques have demonstrated success in studying biological systems. As the drive to better capture biological complexity leads to more sophisticated simulators, it becomes challenging to perform statistical analyses that help translate predictions into increased understanding. These analyses may require repeated executions and extensive sampling of high-dimensional parameter spaces: analyses that may become intractable due to time and resource limitations. Significant reduction in these requirements can be obtained using surrogate models, or emulators, that can rapidly and accurately predict the output of an existing simulator. We apply emulation to evaluate and enrich understanding of a previously published agent-based simulator of lymphoid tissue organogenesis, showing that an ensemble of machine learning techniques can reproduce results obtained using a suite of statistical analyses within seconds. This performance improvement permits incorporation of previously intractable analyses, including multi-objective optimization to obtain parameter sets that yield a desired response, and Approximate Bayesian Computation to assess parametric uncertainty. To facilitate exploitation of emulation in simulation-focused studies, we extend our open source statistical package, spartan, to provide a suite of tools for emulator development, validation, and application. Overcoming resource limitations permits enriched evaluation and refinement, easing translation of simulator insights into increased biological understanding.
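The core idea of emulation, fitting a cheap surrogate to samples of an expensive simulator and then querying the surrogate instead, can be sketched as below. The "simulator", its quadratic form, and the least-squares surrogate are inventions for this toy; the abstract's actual emulators are machine learning ensembles over agent-based model outputs.

```python
# Illustrative sketch of emulation: replace a (pretend-)expensive simulator
# with a cheap fitted surrogate. Everything here is a toy stand-in.

def slow_simulator(x):
    # Stand-in for an expensive simulation run.
    return 3.0 * x * x + 0.5

# 1. Sample the simulator at a few design points.
xs = [i / 10 for i in range(11)]
ys = [slow_simulator(x) for x in xs]

# 2. Fit a surrogate y = a*x^2 + b by ordinary least squares (closed form
#    for simple linear regression in the feature z = x^2).
n = len(xs)
sz = sum(x * x for x in xs)
sz2 = sum(x ** 4 for x in xs)
sy = sum(ys)
szy = sum(x * x * y for x, y in zip(xs, ys))
a = (n * szy - sz * sy) / (n * sz2 - sz * sz)
b = (sy - a * sz) / n

# 3. The surrogate now predicts without running the simulator at all.
def emulator(x):
    return a * x * x + b

print(round(emulator(0.37), 3))  # 0.911, matching slow_simulator(0.37)
```

Because the toy data is exactly quadratic, the surrogate recovers the simulator perfectly; in practice the interesting questions are exactly the validation and uncertainty-quantification steps the abstract describes.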
Performance, Validation and Testing with the Network Simulation Cradle
Much current simulation of TCP makes use of simplified models of TCP, which is a large and complex protocol with many variations possible between implementations. We use direct execution of real world network stacks in the network simulator ns-2 to compare TCP performance between implementations and reproduce existing work. A project called the Network Simulation Cradle provides the real world network stacks, and we show how it can be used for performance evaluation and validation. There are large differences in performance between simplified TCP models and TCP implementations in some situations. Such differences are apparent in some reproduced research, with results from the Network Simulation Cradle differing greatly from those produced with the ns-2 TCP models. In other cases, using the real implementations gives very similar results, validating the original research.
Quantum Adversarial Learning in Emulation of Monte-Carlo Methods for Max-cut Approximation: QAOA is not optimal
One of the leading candidates for near-term quantum advantage is the class of
Variational Quantum Algorithms, but these algorithms suffer from classical
difficulty in optimizing the variational parameters as the number of parameters
increases. Therefore, it is important to understand the expressibility and
power of various ansätze to produce target states and distributions. To this
end, we apply notions of emulation to Variational Quantum Annealing and the
Quantum Approximate Optimization Algorithm (QAOA) to show that QAOA is
outperformed by variational annealing schedules with equivalent numbers of
parameters. Our Variational Quantum Annealing schedule is based on a novel
polynomial parameterization that can be optimized in a similar gradient-free
way as QAOA, using the same physical ingredients. In order to compare the
performance of ansätze types, we have developed statistical notions of
Monte-Carlo methods. Monte-Carlo methods are computer programs that generate
random variables that approximate a target number that is computationally hard
to calculate exactly. While the most well-known Monte-Carlo method is
Monte-Carlo integration (e.g. Diffusion Monte-Carlo or path-integral quantum
Monte-Carlo), QAOA is itself a Monte-Carlo method that finds good solutions to
NP-complete problems such as Max-cut. We apply these statistical Monte-Carlo
notions to further elucidate the theoretical framework around these quantum
algorithms.
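The abstract's working definition of a Monte-Carlo method, a program generating random variables whose aggregate approximates a target that is hard to compute exactly, is shown in its most familiar form below: estimating pi by uniform sampling of the unit square. This classic example is added for illustration and is not taken from the paper.

```python
import random

# Monte-Carlo estimation of pi: sample points uniformly in the unit square
# and count the fraction landing inside the quarter-circle of radius 1.
# That fraction approximates pi/4.

def estimate_pi(n_samples, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = sum(
        1
        for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.14; error shrinks like 1/sqrt(n)
```

The statistical error decaying as one over the square root of the sample count is the property the paper's "statistical notions of Monte-Carlo methods" build on when comparing ansätze.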
Emulation of Dynamic Process-Based Agroecosystem Models Using Long Short-Term Memory Networks
Modeling the carbon balance of agroecosystems helps monitor changes in carbon emissions and their influence on ecosystem functioning and productivity. Process-based models are widely used to model diverse agroecosystems and also enable quantification of the carbon balance in agroecosystems. However, process-based models tend to be very computationally demanding, due to their complex computations based on hypotheses and assumptions about the dynamics of the system. The computational demands complicate large scale simulations, which are needed when simulating many different parameter scenarios, for example in model calibration and sensitivity analysis.
In order to mitigate the computational burden of large scale simulations, a surrogate model utilizing neural networks is developed to emulate the behavior of the process-based land model BASGRA_N while achieving a fast execution time. The emulator captures sequential dependencies in the data using networks specifically designed for sequential learning. Additionally, it is applicable to other similar agroecosystem models. The model is evaluated by 5-fold cross validation, achieving RMSEs of 0.0290 (g C m^(-2) h^(-1)) and 0.322 (m^2 m^(-2)) for weekly mean values of hourly NPP and LAI, respectively. Each of the 5 folds gives an R^2 of >0.91 for NPP and >0.93 for LAI.
The thesis begins with basic concepts of neural networks as applied to regression tasks, covering a fundamental neural network model, its architecture, features, and general training methods. The study then continues to sequential modeling and introduces neural networks designed for processing sequentially structured data. An overall review of existing research on machine learning applications, especially in the emulation of process-based models, follows. Lastly, a novel emulator model applying neural networks is introduced for the emulation of an agroecosystem model.
This project was done in collaboration with the Carbon Cycle group of the Finnish Meteorological Institute, which required an emulator for the process-based agroecosystem model BASGRA_N to enable large scale simulations for simulator calibration purposes.
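The k-fold cross-validation and RMSE evaluation described in this abstract can be sketched as follows. The dataset is toy data and the "model" is a trivial training-set-mean predictor standing in for the thesis's LSTM emulator (which would require a deep-learning framework); only the evaluation scheme itself is the point.

```python
import math

# Sketch of k-fold cross-validation with an RMSE score, using a toy
# mean-predictor in place of a real emulator.

def rmse(pred, true):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

def k_fold_rmse(ys, k=5):
    """Split ys into k contiguous folds; for each fold, 'train' on the rest
    (here: take the mean) and score RMSE on the held-out fold."""
    scores = []
    fold = len(ys) // k
    for i in range(k):
        lo, hi = i * fold, (i + 1) * fold
        train_y = ys[:lo] + ys[hi:]
        test_y = ys[lo:hi]
        mean = sum(train_y) / len(train_y)  # the stand-in "model"
        scores.append(rmse([mean] * len(test_y), test_y))
    return scores

ys = [0.1, 0.2, 0.15, 0.3, 0.25, 0.2, 0.1, 0.35, 0.3, 0.2]
scores = k_fold_rmse(ys, k=5)
print([round(s, 3) for s in scores])
```

Replacing the mean predictor with a trained network, and the toy list with weekly NPP/LAI series, gives exactly the per-fold RMSE and R^2 numbers the abstract reports.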