1,984 research outputs found
Evolutionary improvement of programs
Most applications of genetic programming (GP) involve the creation of an entirely new function, program or expression to solve a specific problem. In this paper, we propose a new approach that applies GP to improve existing software by optimizing its non-functional properties such as execution time, memory usage, or power consumption. In general, satisfying non-functional requirements is a difficult task, often addressed in part by optimizing compilers. However, modern compilers are not always able to produce semantically equivalent alternatives that improve non-functional properties, even when such alternatives are known to exist; this is usually due to the limited, local nature of such optimizations. In this paper, we discuss how best to combine and extend the existing evolutionary methods of GP, multiobjective optimization, and coevolution in order to improve existing software. Given the implementation of a function as input, we attempt to evolve a semantically equivalent version, in this case optimized to reduce execution time under a given probability distribution of inputs. On eight example functions, we demonstrate that our framework is able to produce non-obvious optimizations that compilers are not yet able to generate. We employ a coevolved population of test cases to encourage the preservation of the function's semantics, and we exploit the original program both by seeding the population, in order to focus the search, and as an oracle for testing purposes. As well as discussing the issues that arise when attempting to improve existing software, we employ a rigorous experimental method to provide practical insights into how these issues can be addressed.
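The abstract describes the main ingredients of the approach: seed the population with the original program, use the original as an oracle, coevolve test inputs, and score variants on semantic agreement plus execution time. The following is a minimal Python sketch of that loop; all names (ORIGINAL_SRC, compile_variant, mutate) are illustrative rather than taken from the paper's framework, and the coevolution of tests is reduced here to a simple random refresh.

    # Toy sketch of the genetic-improvement loop described above (not the paper's code).
    import random, time

    ORIGINAL_SRC = [
        "total = 0",
        "for i in range(n):",
        "    total += i * i",
        "    total += 0",          # a redundant statement the search can remove
        "result = total",
    ]

    def compile_variant(lines):
        """Turn a list of statements into a callable f(n), or None if the code is invalid."""
        src = "def f(n):\n" + "\n".join("    " + ln for ln in lines) + "\n    return result"
        try:
            ns = {}
            exec(src, ns)
            return ns["f"]
        except Exception:
            return None

    oracle = compile_variant(ORIGINAL_SRC)          # the original program acts as the oracle

    def fitness(lines, tests):
        f = compile_variant(lines)
        if f is None:
            return -1.0
        correct, elapsed = 0, 0.0
        for n in tests:
            t0 = time.perf_counter()
            try:
                ok = f(n) == oracle(n)
            except Exception:
                ok = False
            elapsed += time.perf_counter() - t0
            correct += ok
        return correct / len(tests) - 0.01 * elapsed   # semantics first, speed second

    def mutate(lines):
        child = list(lines)
        if len(child) > 2 and random.random() < 0.5:
            del child[random.randrange(len(child))]    # try deleting a statement
        else:
            i, j = random.randrange(len(child)), random.randrange(len(child))
            child[i], child[j] = child[j], child[i]    # or swapping two statements
        return child

    population = [list(ORIGINAL_SRC) for _ in range(20)]   # seed with the original program
    tests = [random.randrange(1, 200) for _ in range(10)]  # test inputs
    for gen in range(50):
        population.sort(key=lambda p: fitness(p, tests), reverse=True)
        population = population[:10] + [mutate(random.choice(population[:10])) for _ in range(10)]
        tests = [random.randrange(1, 200) for _ in range(10)]  # random refresh standing in for coevolution

    population.sort(key=lambda p: fitness(p, tests), reverse=True)
    print("\n".join(population[0]))

In this toy setting the search can, for example, discover that the redundant statement can be deleted without changing the oracle's output, which is the flavour of non-obvious optimization the paper targets.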
Do forage legumes have a role in modern dairy farming systems?
Intensification in New Zealand dairy farming systems has placed greater pressure on
clover performance and fitness and has highlighted the need to develop clover cultivars
that are better adapted to intensive grazing systems. Increased stocking rates and
increased use of nitrogen fertiliser have put enormous pressure on the contribution of
clover to modern dairy systems. Future innovations such as semi-hybrid cultivars offer
the potential to improve the competitiveness of legumes with nitrogen-fertilised forage
grasses. Similarly, advances in condensed tannin research suggest that significant
animal performance gains can be achieved in conjunction with reduced environmental
impact. In order to capture these benefits, dairy farmers will need to reassess their grazing
management to ensure that legumes can be maintained at economically useful levels.
Novel grazing management systems that optimise the benefits provided by the grass and
legume components need to be used in future dairy farming systems. Forage legumes,
and especially white clover, have an important role to play in modern dairy systems.
Searching for invariants using genetic programming and mutation testing
Invariants are concise and useful descriptions of a program's behaviour. As most programs are not annotated with invariants, previous research has attempted to generate them automatically from source code. In this paper, we propose a new approach to invariant generation using search. We reuse the trace-generation front-end of the existing tool Daikon and integrate it with genetic programming and a mutation testing tool. We demonstrate that our system can find, through search, the same invariants that Daikon produces via template instantiation, and that it also finds useful invariants that Daikon does not. We then present a method of ranking invariants, through a novel application of program mutation, so that the most interesting ones can be identified.
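The ranking idea can be illustrated without the Daikon front-end: an invariant that holds on the original program's traces but is violated by many mutants is more informative than one that survives every mutant. The Python sketch below is illustrative only (not the authors' tool); the trace format, candidate predicates and mutants are made up for the example.

    # Illustrative sketch: rank candidate invariants by how many program mutants
    # violate them, while requiring that they hold on the original program.
    # A trace is a list of variable snapshots at a program point, e.g. {"x": 3, "y": 9}.

    def holds(inv, trace):
        return all(inv(snapshot) for snapshot in trace)

    def rank_invariants(candidates, original_trace, mutant_traces):
        """Return invariants true on the original, ordered by the number of mutants they kill."""
        ranked = []
        for name, inv in candidates:
            if not holds(inv, original_trace):
                continue                      # must describe the real program
            kills = sum(1 for t in mutant_traces if not holds(inv, t))
            ranked.append((kills, name))
        return sorted(ranked, reverse=True)

    # Toy data: the program computes y = x * x.
    original_trace = [{"x": x, "y": x * x} for x in range(-5, 6)]
    mutant_traces = [
        [{"x": x, "y": x + x} for x in range(-5, 6)],      # mutant: * replaced by +
        [{"x": x, "y": x * x + 1} for x in range(-5, 6)],  # mutant: off-by-one
    ]
    candidates = [
        ("y >= 0",     lambda s: s["y"] >= 0),
        ("y == x * x", lambda s: s["y"] == s["x"] * s["x"]),
        ("x >= -5",    lambda s: s["x"] >= -5),             # true but uninteresting
    ]
    print(rank_invariants(candidates, original_trace, mutant_traces))

Here the exact invariant kills both mutants, the weaker bound kills one, and the uninteresting range fact kills none, so the ranking matches intuition.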
Ensuring Secure Non-interference of Programs by Game Semantics
Non-interference is a security property stating that no improper information leakage through direct or indirect flows occurs when a program is executed. In this paper we investigate a game-semantics-based formulation of non-interference that allows us to perform a security analysis of closed and open procedural programs. We show that this formulation is amenable to automated verification techniques. The practicality of the method is illustrated by several examples, which also emphasize its advantage over known operational methods for reasoning about open programs.
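For orientation, the standard operational statement of non-interference (not the game-semantic formulation investigated in the paper) for a program P over high- and low-security variables is, roughly,

\[
  s_1 =_L s_2 \;\wedge\; \langle P, s_1\rangle \Downarrow s_1' \;\wedge\; \langle P, s_2\rangle \Downarrow s_2'
  \;\Longrightarrow\; s_1' =_L s_2',
\]

where $s =_L t$ means the two states agree on all low-security variables: varying the high (secret) inputs must not produce any observable difference in the low (public) outputs.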
Preceding rule induction with instance reduction methods
A new prepruning technique for rule induction is presented which applies instance reduction before rule induction. An empirical evaluation records the predictive accuracy and size of rule-sets generated from 24 datasets from the UCI Machine Learning Repository. Three instance reduction algorithms (Edited Nearest Neighbour, AllKnn and DROP5) are compared. Each one is used to reduce the size of the training set prior to inducing a set of rules using Clark and Boswell's modification of CN2. A hybrid instance reduction algorithm (combining AllKnn and DROP5) is also tested. For most of the datasets, pruning the training set using ENN, AllKnn or the hybrid significantly reduces the number of rules generated by CN2 without adversely affecting the predictive performance. The hybrid achieves the highest average predictive accuracy.
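A sketch of the pipeline shape described above, under some substitutions: Edited Nearest Neighbours and AllKNN are available in the imbalanced-learn package (which by default edits only the majority classes), DROP5 has no off-the-shelf implementation here, and a decision tree stands in for Clark and Boswell's CN2, which scikit-learn does not provide. The dataset is just a convenient built-in example, not one of the 24 UCI datasets.

    # Sketch: prune the training set with an instance-reduction method, then induce a model.
    from imblearn.under_sampling import EditedNearestNeighbours, AllKNN
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for name, reducer in [("none", None), ("ENN", EditedNearestNeighbours()), ("AllKNN", AllKNN())]:
        if reducer is None:
            X_red, y_red = X_tr, y_tr                    # baseline: no instance reduction
        else:
            X_red, y_red = reducer.fit_resample(X_tr, y_tr)
        clf = DecisionTreeClassifier(random_state=0).fit(X_red, y_red)
        print(name, len(X_red), "training instances, accuracy", round(clf.score(X_te, y_te), 3))

The point mirrored from the paper is that the reduced training sets are noticeably smaller, which tends to yield smaller rule sets (here, smaller trees) without necessarily hurting test accuracy.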
Influence of temperature gradients on tunnel junction thermometry below 1 K: cooling and electron-phonon coupling
We have studied thermal gradients in thin Cu and AlMn wires, both
experimentally and theoretically. In the experiments, the wires were Joule
heated non-uniformly at sub-Kelvin temperatures, and the resulting temperature
gradients were measured using normal metal-insulator-superconducting tunnel
junctions. The data clearly show that even in reasonably well-conducting thin
wires with a short non-heated portion, significant temperature differences can
form. In most cases, the measurements agree well with a model which includes
electron-phonon interaction and electronic thermal conductivity given by the
Wiedemann-Franz law.
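The model alluded to is, in its usual one-dimensional form (a sketch of the standard hot-electron diffusion equation, not necessarily the exact equations or parameter values used in the paper),

\[
  \frac{d}{dx}\!\left( \kappa(T_e)\,\frac{dT_e}{dx} \right) + p_J(x)
  = \Sigma \left( T_e^{5} - T_{\mathrm{ph}}^{5} \right),
  \qquad
  \kappa(T_e) = \frac{L_0\, T_e}{\rho},
\]

with all terms per unit volume: $T_e(x)$ is the local electron temperature, $T_{\mathrm{ph}}$ the phonon (bath) temperature, $p_J$ the Joule power density, $\Sigma$ the electron-phonon coupling constant (the $T^5$ form is the one commonly used for metals such as Cu), $L_0$ the Lorenz number and $\rho$ the residual resistivity, so that the thermal conductivity follows from the Wiedemann-Franz law.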
Generic meta-modelling with concepts, templates and mixin layers
The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-16145-2_2
Proceedings of the 13th International Conference, MODELS 2010, Oslo, Norway, October 3-8, 2010.
Meta-modelling is a key technique in Model Driven Engineering, where it is used for language engineering and domain modelling. However, mainstream approaches like the OMG’s Meta-Object Facility provide little support for abstraction, modularity, reusability and extendibility of (meta-)models, behaviours and transformations.
In order to alleviate this weakness, we bring three elements of generic programming into meta-modelling: concepts, templates and mixin layers. Concepts permit an additional typing for models, enabling the definition of behaviours and transformations independently of meta-models, making specifications reusable. Templates use concepts to express requirements on their generic parameters, and are applicable to models and meta-models. Finally, we define functional layers by means of meta-model mixins which can extend other meta-models.
As a proof of concept we also report on MetaDepth, a multi-level meta-modelling framework that implements these ideas.
Work sponsored by the Spanish Ministry of Science, project TIN2008-02081 and mobility grants JC2009-00015 and PR2009-0019, and by the R&D programme of the Community of Madrid, project S2009/TIC-165.
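The flavour of "concepts" as requirements on generic parameters can be conveyed outside MetaDepth (whose concrete syntax is not reproduced here) with a small structural check in plain Python: a concept names the classes and features a generic operation needs, a binding maps those names onto a concrete meta-model, and the binding is accepted only if the structure matches. This is an analogy sketch, not the framework's API.

    # Illustrative analogy in plain Python (not MetaDepth syntax).
    CONCEPT = {"NamedElement": {"name"}}          # requirement on the generic parameter

    def bind(concept, metamodel, binding):
        """Check that, under the binding, the meta-model offers what the concept requires."""
        for c_class, c_feats in concept.items():
            m_class = binding[c_class]
            if m_class not in metamodel or not c_feats <= metamodel[m_class]:
                raise TypeError(f"{m_class} does not satisfy concept class {c_class}")
        return binding

    def rename(objects, suffix):
        """Generic behaviour written once against the concept, reusable across meta-models."""
        return [{**o, "name": o["name"] + suffix} for o in objects]

    state_machine_mm = {"State": {"name", "isInitial"}, "Transition": {"name"}}
    bind(CONCEPT, state_machine_mm, {"NamedElement": "State"})   # binding succeeds
    print(rename([{"name": "Idle", "isInitial": True}], "_copy"))

The design point mirrored here is that the behaviour (rename) never mentions a concrete meta-model; any meta-model for which a valid binding exists can reuse it.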
Phase Transitions of Hard Disks in External Periodic Potentials: A Monte Carlo Study
The nature of freezing and melting transitions for a system of hard disks in
a spatially periodic external potential is studied using extensive Monte Carlo
simulations. A detailed finite-size scaling analysis of various thermodynamic
quantities, such as the order parameter and its cumulants, is used to map out
the phase diagram of the system for various values of the density and the amplitude
of the external potential. We find clear indication of a re-entrant liquid
phase over a significant region of the parameter space. Our simulations
therefore show that the system of hard disks behaves in a fashion similar to
charge stabilized colloids which are known to undergo an initial freezing,
followed by a re-melting transition as the amplitude of the imposed, modulating
field produced by crossed laser beams is steadily increased. Detailed analysis
of our data shows several features consistent with a recent dislocation
unbinding theory of laser-induced melting.
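A single Metropolis sweep for hard disks in a one-dimensionally periodic external potential can be sketched as below. The potential form, parameters and system size are purely illustrative, not those of the study, and a production code would start from a non-overlapping lattice configuration and measure the order parameter and its cumulants over many sweeps.

    # Toy sketch: Metropolis moves for hard disks of diameter sigma in
    # V(x) = -V0 * cos(2*pi*x/d), with periodic boundaries in a box of side L.
    import math, random

    L, N, sigma, V0, d, beta = 10.0, 40, 1.0, 0.5, 1.0, 1.0
    positions = [(random.uniform(0, L), random.uniform(0, L)) for _ in range(N)]
    # (A random start may contain overlaps; a production code would start from a lattice.)

    def pbc(dx):                       # minimum-image convention
        return dx - L * round(dx / L)

    def overlaps(i, x, y):
        for j, (xj, yj) in enumerate(positions):
            if j != i and pbc(x - xj) ** 2 + pbc(y - yj) ** 2 < sigma ** 2:
                return True
        return False

    def potential(x):
        return -V0 * math.cos(2.0 * math.pi * x / d)

    def sweep(max_disp=0.1):
        accepted = 0
        for i in range(N):
            x, y = positions[i]
            xn = (x + random.uniform(-max_disp, max_disp)) % L
            yn = (y + random.uniform(-max_disp, max_disp)) % L
            if overlaps(i, xn, yn):
                continue                                   # hard-core rejection
            dE = potential(xn) - potential(x)
            if dE <= 0 or random.random() < math.exp(-beta * dE):
                positions[i] = (xn, yn)                    # Metropolis acceptance
                accepted += 1
        return accepted / N

    print("acceptance ratio:", sweep())

Sweeping the potential amplitude V0 at fixed density in such a simulation is the knob that, in the study, reveals freezing followed by re-entrant melting.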
Spectroscopic Factors in 40Ca and 208Pb from (e,e'p): Fully Relativistic Analysis
We present results for spectroscopic factors of the outermost shells in
40Ca and 208Pb, which have been derived from the comparison between
the available quasielastic (e,e'p) data from NIKHEF-K and the corresponding
calculated cross-sections obtained within a fully relativistic formalism. We
include exactly the effect of Coulomb distortion on the electron wave functions
and discuss its role in the extraction of the spectroscopic factors from
experiment. Without any adjustable parameters, we find spectroscopic factors of
about 70%, consistent with theoretical predictions. We compare our results
with previous relativistic and nonrelativistic analyses of (e,e'p) data. In
addition to Coulomb distortion effects we discuss different choices of the
nucleon current operator and also analyze the effects due to the relativistic
treatment of the outgoing-distorted and bound nucleon wave functions.
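In this kind of analysis the spectroscopic factor is, schematically, the overall scale factor relating the measured and calculated cross sections (a statement of the usual definition in standard notation, not the paper's precise prescription):

\[
  S_\alpha \;\approx\; \frac{\sigma^{\mathrm{exp}}_\alpha(e,e'p)}{\sigma^{\mathrm{calc}}_\alpha(e,e'p)},
\]

where $\sigma^{\mathrm{calc}}_\alpha$ is computed assuming full occupancy of the shell $\alpha$; a value of about 70% therefore means that the measured knockout strength is roughly 70% of the independent-particle limit.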
Green functions for generalized point interactions in 1D: A scattering approach
Recently, general point interactions in one dimension have been used to model
a large number of different phenomena in quantum mechanics. Such potentials,
however, require some sort of regularization to lead to meaningful results.
The usual ways to do so rely on technicalities which may hide important
physical aspects of the problem. In this work we present a new method to
calculate the exact Green functions for general point interactions in 1D. Our
approach differs from previous ones because it is based only on physical
quantities, namely, the reflection and transmission scattering coefficients,
from which the exact Green function is constructed.
Renormalization or particular mathematical prescriptions are not invoked. The
simple formulation of the method makes it easy to extend to more general
contexts, such as lattices of general point interactions: on a line, on
a half-line, under periodic boundary conditions, and confined in a box.
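For reference, the scattering-based construction alluded to can be written in the standard form found in the literature (notation and sign conventions vary, and this is not necessarily the paper's own notation). With $E = \hbar^2 k^2 / 2m$, the free 1D Green function is

\[
  G_0(x, x'; k) = \frac{m}{i\hbar^2 k}\, e^{ik|x - x'|},
\]

and for a point interaction at the origin with transmission amplitude $t(k)$ and left-reflection amplitude $r^{-}(k)$ one has, for points on opposite sides and on the same (left) side of the scatterer respectively,

\[
  G(x, x'; k) = \frac{m}{i\hbar^2 k}\, t(k)\, e^{ik(x - x')} \quad (x' < 0 < x),
  \qquad
  G(x, x'; k) = \frac{m}{i\hbar^2 k} \left[ e^{ik|x - x'|} + r^{-}(k)\, e^{-ik(x + x')} \right] \quad (x, x' < 0),
\]

so the Green function is built entirely from the physically measurable scattering amplitudes, with no regularization step.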
