FORTEST: Formal methods and testing
Formal methods have traditionally been used for the specification and development of software. However, there are potential benefits for the testing stage as well. The panel session associated with this paper explores the usefulness or otherwise of formal methods in various contexts for improving software testing. A number of different possibilities for the use of formal methods are explored and questions raised. The contributors are all members of the UK FORTEST Network on formal methods and testing. Although the authors generally believe that formal methods are useful in aiding the testing process, this paper is intended to provoke discussion. Dissenters are encouraged to put their views to the panel or individually to the authors.
AVMf: An Open-Source Framework and Implementation of the Alternating Variable Method
The Alternating Variable Method (AVM) has been shown to be a fast and effective local search technique for search-based software engineering. Recent improvements to the AVM have generalized the representations it can optimize and have provably reduced its running time. However, until now, there has been no general, publicly available implementation of the AVM incorporating all of these developments. We introduce AVMf, an object-oriented Java framework that provides such an implementation. AVMf is available from http://avmframework.org for configuration and use in a wide variety of projects.
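For readers unfamiliar with the technique, the following is a minimal sketch of a basic AVM loop (exploratory moves on one variable at a time, accelerated by doubling pattern moves). It is illustrative only and does not reproduce the AVMf API; the class and method names (AvmSketch, avmMinimise, objective) are hypothetical.

```java
import java.util.Arrays;
import java.util.function.ToDoubleFunction;

// Minimal sketch of a basic Alternating Variable Method (illustrative only;
// not the AVMf API). It minimises an objective over a vector of integers by
// cycling through the variables, probing each with small "exploratory" moves
// and accelerating in an improving direction with "pattern" moves.
public class AvmSketch {

    public static int[] avmMinimise(int[] start, ToDoubleFunction<int[]> objective, int maxEvals) {
        int[] current = start.clone();
        double best = objective.applyAsDouble(current);
        int evals = 1;
        boolean improved = true;

        while (improved && evals < maxEvals) {
            improved = false;
            for (int i = 0; i < current.length && evals < maxEvals; i++) {
                // Exploratory moves: try +1 and -1 on variable i.
                for (int dir : new int[] {1, -1}) {
                    int[] candidate = current.clone();
                    candidate[i] += dir;
                    double value = objective.applyAsDouble(candidate);
                    evals++;
                    if (value < best) {
                        // Pattern moves: keep doubling the step while it improves.
                        int step = dir;
                        while (value < best && evals < maxEvals) {
                            best = value;
                            current = candidate;
                            step *= 2;
                            candidate = current.clone();
                            candidate[i] += step;
                            value = objective.applyAsDouble(candidate);
                            evals++;
                        }
                        improved = true;
                        break; // stop probing this variable; the outer loop revisits it
                    }
                }
            }
        }
        return current;
    }

    public static void main(String[] args) {
        // Toy objective: Manhattan distance from the point (3, -7).
        ToDoubleFunction<int[]> objective =
                v -> Math.abs(v[0] - 3) + Math.abs(v[1] + 7);
        int[] result = avmMinimise(new int[] {0, 0}, objective, 1000);
        System.out.println(Arrays.toString(result)); // expected to reach [3, -7]
    }
}
```

AVMf generalizes this basic loop to the richer representations and provably faster variants referred to in the abstract.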
Using formal methods to support testing
Formal methods and testing are two important approaches that assist in the development of high-quality software. While traditionally these approaches have been seen as rivals, in recent years a new consensus has developed in which they are seen as complementary. This article reviews the state of the art regarding ways in which the presence of a formal specification can be used to assist testing.
Conditional citizens? Welfare rights and responsibilities in the late 1990s
In Britain the relationship between welfare rights and responsibilities has undergone change. A new welfare 'consensus' that emphasizes a citizenship centred on notions of duty rather than rights has been built. This has allowed the state to reduce its role as a provider of welfare and also defend a position in which the welfare rights of some citizens are increasingly conditional on those individuals meeting compulsory responsibilities or duties. This concentration on individual responsibility/duty has undermined the welfare rights of some of the poorest members of society. Three levels of debate are considered within the article: academic, political and 'grassroots'. The latter is included in an attempt to allow some 'bottom up' views into what is largely a debate dominated by social scientists and politicians.
Photon angular distribution and nuclear-state alignment in nuclear excitation by electron capture
The alignment of nuclear states resonantly formed in nuclear excitation by electron capture (NEEC) is studied by means of a density matrix technique. The vibrational excitations of the nucleus are described by a collective model and the electrons are treated in a relativistic framework. Formulas for the angular distribution of photons emitted in the nuclear relaxation are derived. We present numerical results for alignment parameters and photon angular distributions for a number of heavy elements in the case of E2 nuclear transitions. Our results are intended to help future experimental attempts to discern NEEC from radiative recombination, which is the dominant competing process.
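For orientation only, photons emitted from an aligned nuclear state typically follow the standard angular-correlation form sketched below. This is a generic textbook expression, not the specific formulas derived in the paper; the coefficients A_2 and A_4 stand in for the alignment-dependent quantities the authors compute.

```latex
% Generic form for radiation emitted from an aligned state (illustrative only):
% theta is measured with respect to the alignment axis, P_k are Legendre
% polynomials, and the coefficients A_2, A_4 depend on the alignment produced
% by NEEC and on the multipolarity (here E2) of the nuclear transition.
W(\theta) \;\propto\; 1 + A_2\, P_2(\cos\theta) + A_4\, P_4(\cos\theta)
```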
Belief revision and uncertain reasoning
When a new piece of information contradicts a currently held belief, one has to modify the set of beliefs in order to restore its consistency. In the case where it is necessary to give up a belief, some beliefs are less likely to be abandoned than others. The concept of epistemic entrenchment is used by some AI approaches to explain this fact based on formal properties of the belief set (e.g., transitivity). Two experiments were designed to test the hypothesis that, contrary to such views, (i) belief is naturally represented by degrees rather than in an all-or-nothing manner, (ii) entrenchment is primarily a matter of content and not only a matter of form, and (iii) consequently, prior degree of belief is a powerful factor of change. The two experiments used Elio and Pelletier's (1997) paradigm, in which participants were presented with full simple deductive arguments whose conclusion was denied, following which they were asked to decide which premise to revise.
The time of the Roma in times of crisis: Where has European neoliberal capitalism failed?
This paper argues that the economic and financial crisis that has ensnared Europe since the late 2000s has been instrumental in reshaping employment and social relations in a way that is detrimental to the majority of the European people. It argues that the crisis has worsened the socio-economic position of most Roma people, immigrants and other vulnerable groups. This development is approached here as an outcome of the widening structural inequalities that underpin the crisis within an increasingly neoliberalised Europe. Through recent policy developments and public discourses from a number of European countries, I show how rising inequalities nurture racialised social tensions. My account draws on classic and contemporary theoretical propositions that have been propounded about the nature of capitalism, its contemporary re-articulation and its ramifications for the future of Europe.
Sparse Exploratory Factor Analysis
Sparse principal component analysis has been a very active research area over the last decade. It produces component loadings with many zero entries, which facilitates their interpretation and helps avoid redundant variables. Classic factor analysis is another popular dimension-reduction technique which shares similar interpretation problems and could greatly benefit from sparse solutions. Unfortunately, there are very few works considering sparse versions of classic factor analysis. Our goal is to contribute further in this direction. We revisit the most popular procedures for exploratory factor analysis, maximum likelihood and least squares. Sparse factor loadings are obtained for them by, first, adopting a special reparameterization and, second, introducing additional ℓ1-norm penalties into the standard factor analysis problems. As a result, we propose sparse versions of the major factor analysis procedures. We illustrate the developed algorithms on well-known psychometric problems. Our sparse solutions are critically compared to those obtained by other existing methods.
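As a rough sketch of the kind of problem being solved (assuming a least-squares discrepancy and leaving aside the paper's particular reparameterization), a sparse factor analysis objective can be written as below, where R is the sample correlation matrix, Λ the loading matrix, Ψ the diagonal matrix of unique variances and τ a tuning parameter controlling sparsity.

```latex
% Hedged sketch of a penalized least-squares factor analysis problem
% (illustrative; not the paper's exact formulation):
\min_{\Lambda,\,\Psi}\;
  \bigl\lVert R - \Lambda\Lambda^{\top} - \Psi \bigr\rVert_F^{2}
  \;+\; \tau \sum_{i,j} \lvert \lambda_{ij} \rvert
```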
Storage selection functions: A coherent framework for quantifying how catchments store and release water and solutes
We discuss a recent theoretical approach combining catchment-scale flow and transport processes into a unified framework. The approach is designed to characterize the hydrochemistry of hydrologic systems and to meet the challenges posed by empirical evidence. StorAge Selection (SAS) functions are defined to represent the way catchment storage supplies the outflows with water of different ages, thus regulating the chemical composition of outfluxes. Biogeochemical processes are also reflected in the evolving residence time distribution and thus in age selection. Here we make the case for the routine use of SAS functions and look forward to areas where further research is needed.
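As a pointer to what a SAS function does (one common way of writing the definition; the article's own notation may differ), the age distribution of an outflow is obtained by weighting the age distribution of water held in storage.

```latex
% Hedged sketch of the SAS idea (notation may differ from the article):
% p_Q(T,t) is the age distribution of outflow Q at time t, p_S(T,t) the age
% distribution of water in storage, P_S(T,t) its cumulative form, and
% omega_Q the StorAge Selection function describing how strongly the outflow
% draws on stored water of different ages.
p_Q(T,t) \;=\; \omega_Q\!\bigl(P_S(T,t),\, t\bigr)\, p_S(T,t)
```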