FDDetector: A Tool for Deduplicating Features in Software Product Lines
Duplication is one of the model defects that affect software product lines during their evolution. Many approaches have been proposed to deal with duplication at the code level, while duplication in features has not received much attention in the literature. With the aim of reducing maintenance cost and improving product quality at an early stage of a product line, we have proposed in previous work a tool support based on a conceptual framework. The main objective of this tool, called FDDetector, is to detect and correct duplication in product line models. In this paper, we recall the motivation behind creating a solution for feature deduplication and present the progress made in the design and implementation of FDDetector.
Structured Review of Code Clone Literature
This report presents the results of a structured review of code clone literature. The aim of the review is to assemble a conceptual model of clone-related concepts that helps us reason about clones. This conceptual model unifies clone concepts from a wide range of literature, so that findings about clones can be compared with each other.
The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments
The GERDA and Majorana experiments will search for neutrinoless double-beta
decay of germanium-76 using isotopically enriched high-purity germanium
detectors. Although the experiments differ in conceptual design, they share
many aspects in common, and in particular will employ similar data analysis
techniques. The collaborations are jointly developing a C++ software library,
MGDO, which contains a set of data objects and interfaces to encapsulate, store
and manage physical quantities of interest, such as waveforms and high-purity
germanium detector geometries. These data objects define a common format for
persistent data, whether it is generated by Monte Carlo simulations or an
experimental apparatus, to reduce code duplication and to ease the exchange of
information between detector systems. MGDO also includes general-purpose
analysis tools that can be used for the processing of measured or simulated
digital signals. The MGDO design is based on the Object-Oriented programming
paradigm and is very flexible, allowing for easy extension and customization of
the components. The tools provided by the MGDO libraries are used by both GERDA
and Majorana.
Comment: 4 pages, 1 figure, proceedings for TAUP201
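The abstract describes data objects that give simulation and experiment a common persistent format. As a minimal sketch of that idea, the class below encapsulates a digitized waveform behind a small interface; all names here are invented for illustration and are not the actual MGDO API.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative sketch only: class and method names are invented to mirror
// the described design, not taken from the real MGDO library.
class MGWaveform {
public:
    void SetSamplingPeriodNs(double dt) { fPeriodNs = dt; }
    double GetSamplingPeriodNs() const { return fPeriodNs; }
    void SetData(std::vector<double> samples) { fSamples = std::move(samples); }
    std::size_t GetLength() const { return fSamples.size(); }
    double At(std::size_t i) const { return fSamples.at(i); }

private:
    double fPeriodNs = 10.0;       // digitizer sampling period (assumed default)
    std::vector<double> fSamples;  // digitized signal samples
};

// Because the format is shared, a Monte Carlo generator and a DAQ reader can
// both produce the same object and feed the same downstream analysis code.
MGWaveform MakeSimulatedPulse() {
    MGWaveform wf;
    wf.SetData({0.0, 0.5, 1.0, 0.5});  // toy pulse shape
    return wf;
}
```

The point of the shared object is that analysis tools operate on `MGWaveform` alone, never caring whether the samples came from simulation or from a detector.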
Concurrent Lexicalized Dependency Parsing: The ParseTalk Model
A grammar model for concurrent, object-oriented natural language parsing is
introduced. Complete lexical distribution of grammatical knowledge is achieved
building upon the head-oriented notions of valency and dependency, while
inheritance mechanisms are used to capture lexical generalizations. The
underlying concurrent computation model relies upon the actor paradigm. We
consider message passing protocols for establishing dependency relations and
ambiguity handling.
Comment: 90kB, 7 pages, Postscript
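The abstract's core idea, that grammatical knowledge lives entirely in lexical items which negotiate dependency relations by exchanging messages, can be sketched sequentially. The structure below is a toy stand-in: every name and the attachment protocol are invented for illustration and are not ParseTalk's actual design, and a real actor system would deliver these requests concurrently.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Toy sketch of lexicalized dependency parsing via message passing: each
// word is an "actor" whose valency frame decides whether it may govern a
// candidate dependent.
struct WordActor {
    std::string form;
    std::string category;                 // e.g. "Det", "N", "V"
    std::vector<std::string> valency;     // categories this head may still govern
    std::vector<const WordActor*> dependents;

    // Message handler: a candidate dependent asks to attach to this head.
    bool OnAttachRequest(const WordActor& dep) {
        for (std::size_t i = 0; i < valency.size(); ++i) {
            if (valency[i] == dep.category) {
                valency.erase(valency.begin() + i);  // slot is now filled
                dependents.push_back(&dep);
                return true;                          // accept the message
            }
        }
        return false;  // reject: the sender queries the next candidate head
    }
};
```

A rejected request models the protocol aspect the abstract mentions: the sending word simply addresses another candidate head, so ambiguity is handled by the message flow rather than by a central parse table.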
The Problem of Confirmation in the Everett Interpretation
I argue that the Oxford school Everett interpretation is internally
incoherent, because we cannot claim that in an Everettian universe the kinds of
reasoning we have used to arrive at our beliefs about quantum mechanics would
lead us to form true beliefs. I show that in an Everettian context, the
experimental evidence that we have available could not provide empirical
confirmation for quantum mechanics, and moreover that we would not even be able
to establish reference to the theoretical entities of quantum mechanics. I then
consider a range of existing Everettian approaches to the probability problem
and show that they do not succeed in overcoming this incoherence.
The ModelCC Model-Driven Parser Generator
Syntax-directed translation tools require the specification of a language by
means of a formal grammar. This grammar must conform to the specific
requirements of the parser generator to be used. This grammar is then annotated
with semantic actions for the resulting system to perform its desired function.
In this paper, we introduce ModelCC, a model-based parser generator that
decouples language specification from language processing, avoiding some of the
problems caused by grammar-driven parser generators. ModelCC receives a
conceptual model as input, along with constraints that annotate it. It is then
able to create a parser for the desired textual syntax and the generated parser
fully automates the instantiation of the language conceptual model. ModelCC
also includes a reference resolution mechanism, so it can instantiate
abstract syntax graphs rather than mere abstract syntax trees.
Comment: In Proceedings PROLE 2014, arXiv:1501.0169
The essence of component-based design and coordination
Is there a characteristic of coordination languages that makes them
qualitatively different from general programming languages and deserves special
academic attention? This report proposes a nuanced answer in three parts. The
first part highlights that coordination languages are the means by which
composite software applications can be specified using components that are only
available separately, or later in time, via standard interfacing mechanisms.
The second part highlights that most currently used languages provide
mechanisms to use externally provided components, and thus exhibit some
elements of coordination. However, not all do, and the availability of an
external interface thus forms an objective and qualitative criterion that
distinguishes coordination. The third part argues that despite the qualitative
difference, the segregation of academic attention away from general language
design and implementation has non-obvious cost trade-offs.
Comment: 8 pages, 2 figures, 3 tables