Darwinian Data Structure Selection
Data structure selection and tuning is laborious but can vastly improve an
application's performance and memory footprint. Some data structures share a
common interface and enjoy multiple implementations. We call them Darwinian
Data Structures (DDS), since we can subject their implementations to survival
of the fittest. We introduce ARTEMIS, a multi-objective, cloud-based, search-based
optimisation framework that automatically finds optimal, tuned DDS modulo a test
suite, then changes an application to use that DDS. ARTEMIS achieves substantial
performance improvements for \emph{every} project in a corpus of Java projects
drawn from the DaCapo benchmark, from popular GitHub projects, and from uniformly
sampled GitHub projects. For execution time, CPU usage, and memory consumption,
ARTEMIS finds at least one solution that improves \emph{all} measures for () of
the projects. The median improvements across the best solutions are , , and
for runtime, memory, and CPU usage.
These aggregate results understate ARTEMIS's potential impact. Some of the
benchmarks it improves are libraries or utility functions. Two examples are
gson, a ubiquitous Java serialization framework, and xalan, Apache's XML
transformation tool. ARTEMIS improves gson by \%, and for
memory, runtime, and CPU; ARTEMIS improves xalan's memory consumption by
\%. \emph{Every} client of these projects will benefit from these
performance improvements.
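The survival-of-the-fittest idea behind Darwinian data structures can be sketched with a toy harness: time each implementation of a shared interface on a representative workload and keep the fastest. This is a hypothetical, single-objective Python stand-in for illustration only; ARTEMIS itself targets Java, searches multiple objectives, and tunes implementations, none of which is shown here.

```python
import time
from collections import deque

class ListQueue(list):
    """A list exposing deque's popleft() so both candidates
    share one FIFO-queue interface."""
    def popleft(self):
        return self.pop(0)

def timed_run(factory, workload):
    """Build a fresh container and time the workload on it."""
    container = factory()
    start = time.perf_counter()
    workload(container)
    return time.perf_counter() - start

def fittest_implementation(candidates, workload, trials=3):
    """Return the name of the candidate with the best (minimum)
    observed runtime; a crude stand-in for a fitness-driven search."""
    best_name, best_time = None, float("inf")
    for name, factory in candidates.items():
        elapsed = min(timed_run(factory, workload) for _ in range(trials))
        if elapsed < best_time:
            best_name, best_time = name, elapsed
    return best_name

def fifo_workload(queue, n=20_000):
    """Append n items, then drain from the front."""
    for i in range(n):
        queue.append(i)
    while queue:
        queue.popleft()

candidates = {"ListQueue": ListQueue, "deque": deque}
# For FIFO drains, deque's O(1) popleft should beat list.pop(0).
```

A real search would also weigh memory and CPU usage and explore tuning parameters, not just pick the fastest of two fixed alternatives.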
An empirical behavioral model of price formation
Although behavioral economics has demonstrated that there are many situations
where rational choice is a poor empirical model, it has so far failed to
provide quantitative models of economic problems such as price formation. We
make a step in this direction by developing empirical models that capture
behavioral regularities in trading order placement and cancellation using data
from the London Stock Exchange. For order placement we show that the
probability of placing an order at a given price is well approximated by a
Student distribution with less than two degrees of freedom, centered on the
best quoted price. This result is surprising because it implies that trading
order placement is symmetric, independent of the bid-ask spread, and the same
for buying and selling. We also develop a crude but simple cancellation model
that depends on the position of an order relative to the best price and the
imbalance between buying and selling orders in the limit order book. These
results are combined to construct a stochastic representative agent model, in
which the orders and cancellations are described in terms of conditional
probability distributions. This model is used to simulate price formation and
the results are compared to real data from the London Stock Exchange. Without
adjusting any parameters based on price data, the model produces good
predictions for the magnitude and functional form of the distribution of
returns and the bid-ask spread.
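The order-placement regularity can be illustrated with a small simulation: relative placement prices drawn from a heavy-tailed Student distribution centered on the best quote. The degrees of freedom (1.5, consistent with "less than two") and the tick-scale parameter below are illustrative assumptions, not fitted values from the paper.

```python
import math
import random

def student_t(df, rng):
    """One Student-t sample via the normal / chi-square ratio:
    T = Z / sqrt(V / df), with V ~ chi-square(df)."""
    z = rng.gauss(0.0, 1.0)
    v = rng.gammavariate(df / 2.0, 2.0)  # chi-square(df)
    return z / math.sqrt(v / df)

def place_order(best_price, rng, tick=0.01, df=1.5, scale=5.0):
    """Placement price: a symmetric, heavy-tailed offset (in ticks)
    around the best quote. df and scale are hypothetical."""
    offset_ticks = scale * student_t(df, rng)
    return best_price + tick * round(offset_ticks)

rng = random.Random(0)
prices = [place_order(100.0, rng) for _ in range(10_000)]
```

Because the distribution is symmetric and centered on the best quote, the simulated placements cluster tightly around 100.0 while the sub-2-degrees-of-freedom tails still produce occasional placements far from the quote.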
Experimental analysis of computer system dependability
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
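Importance sampling, which the survey introduces as a way to accelerate Monte Carlo simulation of rare faults, can be demonstrated on a toy reliability problem: estimating P(T > 15) for a unit-rate exponential lifetime (about 3e-7, far too rare for naive sampling at this sample size). We sample from a biased exponential that makes the event common and reweight by the likelihood ratio. The problem and parameters are illustrative, not from the survey.

```python
import math
import random

def rare_event_probability(threshold, n, rng):
    """Estimate p = P(T > threshold) for T ~ Exp(1) by drawing from
    a biased proposal Exp(rate = 1/threshold), under which the rare
    event has probability exp(-1) ~ 0.37, then reweighting each hit
    by the likelihood ratio f(t)/q(t)."""
    rate_q = 1.0 / threshold
    total = 0.0
    for _ in range(n):
        t = rng.expovariate(rate_q)        # biased (accelerated) draw
        if t > threshold:
            # f(t) = exp(-t); q(t) = rate_q * exp(-rate_q * t)
            total += math.exp(-t) / (rate_q * math.exp(-rate_q * t))
    return total / n

est = rare_event_probability(15.0, 100_000, random.Random(42))
exact = math.exp(-15.0)  # analytic answer for comparison
```

With 100,000 naive Exp(1) draws the expected number of hits is about 0.03, so the plain estimator is almost always exactly zero; the reweighted estimator lands within a few percent of the true value with the same budget.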
Characterizing and controlling program behavior using execution-time variance
Immersive applications, such as computer gaming, computer vision and video codecs, are an important emerging class of applications with QoS requirements that are difficult to characterize and control using traditional methods. This thesis proposes new techniques reliant on execution-time variance to both characterize and control program behavior. The proposed techniques are intended to be broadly applicable to a wide variety of immersive applications and are intended to be easy for programmers to apply without needing to gain specialized expertise. First, we create new QoS controllers that programmers can easily apply to their applications to achieve desired application-specific QoS objectives on any platform or application data-set, provided the programmers verify that their applications satisfy some simple domain requirements specific to immersive applications. The controllers adjust programmer-identified knobs every application frame to effect desired values for programmer-identified QoS metrics. The control techniques are novel in that they do not require the user to provide any kind of application behavior models, and are effective for immersive applications that defy the traditional requirements for feedback controller construction. Second, we create new profiling techniques that provide visibility into the behavior of a large complex application, inferring behavior relationships across application components based on the execution-time variance observed at all levels of granularity of the application functionality. Additionally for immersive applications, some of the most important QoS requirements relate to managing the execution-time variance of key application components, for example, the frame-rate.
The profiling techniques not only identify and summarize behavior directly relevant to the QoS aspects related to timing, but also indirectly reveal non-timing related properties of behavior, such as the identification of components that are sensitive to data, or those whose behavior changes based on the call-context.
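The knob-per-frame control loop can be illustrated with a minimal sketch: each frame, a quality knob is nudged toward the value that would have met a target frame time. The proportional plant model, gain, and knob bounds below are illustrative assumptions; the thesis's controllers are explicitly model-free and more general than this.

```python
def step_knob(knob, frame_time, target, gain=0.5, lo=0.1, hi=1.0):
    """One per-frame update: scale the quality knob by the relative
    error between target and observed frame time, clamped to [lo, hi].
    Assumes frame time grows roughly in proportion to the knob."""
    error_ratio = target / frame_time
    new_knob = knob * (1.0 + gain * (error_ratio - 1.0))
    return min(hi, max(lo, new_knob))

def simulate(frames=50, target=16.7):
    """Toy application: frame_time = 20 ms * knob + 5 ms overhead.
    Returns the final knob setting and final frame time."""
    knob = 1.0
    frame_time = 20.0 * knob + 5.0
    for _ in range(frames):
        knob = step_knob(knob, frame_time, target)
        frame_time = 20.0 * knob + 5.0
    return knob, frame_time
```

Starting from full quality (25 ms frames), the loop settles near the 16.7 ms target (60 FPS) within a few dozen frames; a real controller must also cope with data-dependent variance, which is exactly the behavior the thesis's techniques characterize.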
Problems related to the integration of fault tolerant aircraft electronic systems
Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.
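The kind of analysis involved in scheduling real-time periodic tasks can be illustrated with the classic Liu-Layland utilization bound for rate-monotonic scheduling on a single processor. This standard test is offered as background, not as a method from the report, which addresses the harder multiprocessor allocation problem.

```python
def rm_schedulable(tasks):
    """Liu-Layland sufficient schedulability test: n periodic tasks,
    given as (worst-case execution time, period) pairs, are
    rate-monotonic schedulable on one processor if their total
    utilization does not exceed n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2.0 ** (1.0 / n) - 1.0)
    return utilization <= bound

# (wcet ms, period ms); for n = 3 the bound is about 0.780.
light = [(1, 10), (2, 20), (3, 50)]    # U = 0.26 -> schedulable
heavy = [(5, 10), (4, 20), (10, 50)]   # U = 0.90 -> test fails
```

Note the test is only sufficient: a task set exceeding the bound (like `heavy`) may still be schedulable and requires exact response-time analysis, and none of this accounts for the fault-tolerance replication the report considers.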
QMCPACK: Advances in the development, efficiency, and application of auxiliary field and real-space variational and diffusion Quantum Monte Carlo
We review recent advances in the capabilities of the open source ab initio
Quantum Monte Carlo (QMC) package QMCPACK and the workflow tool Nexus used for
greater efficiency and reproducibility. The auxiliary field QMC (AFQMC)
implementation has been greatly expanded to include k-point symmetries,
tensor-hypercontraction, and accelerated graphical processing unit (GPU)
support. These scaling and memory reductions greatly increase the number of
orbitals that can practically be included in AFQMC calculations, increasing
accuracy. Advances in real space methods include techniques for accurate
computation of band gaps and for systematically improving the nodal surface of
ground state wavefunctions. Results of these calculations can be used to
validate application of more approximate electronic structure methods including
GW and density functional based techniques. To provide an improved foundation
for these calculations we utilize a new set of correlation-consistent effective
core potentials (pseudopotentials) that are more accurate than previous sets;
these can also be applied in quantum-chemical and other many-body applications,
not only QMC. These advances increase the efficiency, accuracy, and range of
properties that can be studied in both molecules and materials with QMC and
QMCPACK.
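The core of a variational Monte Carlo calculation can be shown on a textbook problem: a Metropolis walk samples |psi|^2 for a trial wavefunction and averages the local energy. This toy 1D harmonic-oscillator example (trial psi(x) = exp(-alpha x^2)) is purely illustrative and bears no relation to QMCPACK's actual implementation, which handles real many-body wavefunctions.

```python
import math
import random

def local_energy(x, alpha):
    """E_L(x) = (H psi)/psi for psi(x) = exp(-alpha x^2) with
    H = -(1/2) d^2/dx^2 + (1/2) x^2 (1D harmonic oscillator)."""
    return alpha - 2.0 * alpha**2 * x * x + 0.5 * x * x

def vmc_energy(alpha, steps=20_000, delta=1.0, rng=None):
    """Metropolis walk sampling |psi|^2; the variational energy is
    the average local energy along the walk."""
    rng = rng or random.Random(1)
    x, total = 0.0, 0.0
    for _ in range(steps):
        x_new = x + rng.uniform(-delta, delta)
        # acceptance ratio |psi(x_new)|^2 / |psi(x)|^2
        if rng.random() < math.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        total += local_energy(x, alpha)
    return total / steps

# alpha = 0.5 is the exact ground state here: E_L(x) is constant at
# 0.5, so the estimate has zero statistical variance.
```

This zero-variance property at the exact wavefunction is what makes systematically improving the trial (and, in diffusion Monte Carlo, its nodal surface) so effective: better wavefunctions give both lower energies and smaller statistical error bars.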