Big Data Refinement
"Big data" has become a major area of research and associated funding, as well as a focus of utopian thinking. In the still growing research community, one of the favourite optimistic analogies for data processing is that of the oil refinery, extracting the essence out of the raw data. Pessimists look for their imagery to the other end of the petrol cycle, and talk about the "data exhausts" of our society.
Obviously, the refinement community knows how to do "refining". This paper explores the extent to which notions of refinement and data in the formal methods community relate to the core concepts in "big data". In particular, can the data refinement paradigm be used to explain aspects of big data processing?
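The kind of data refinement the abstract invokes can be illustrated with a toy example. The sketch below is not from the paper; all names are invented, and Python stands in for a formal refinement notation. It refines an abstract set-based store into a sorted-list implementation, linked by an abstraction function, and checks the simulation relation dynamically rather than proving it.

```python
import bisect

def abs_of(xs):
    """Abstraction function: maps a concrete sorted list to the abstract set."""
    return set(xs)

class AbstractStore:
    """Abstract model: a mathematical set of items."""
    def __init__(self):
        self.items = set()
    def add(self, x):
        self.items.add(x)
    def contains(self, x):
        return x in self.items

class ConcreteStore:
    """Concrete refinement: a duplicate-free sorted list."""
    def __init__(self):
        self.items = []
    def add(self, x):
        i = bisect.bisect_left(self.items, x)
        if i == len(self.items) or self.items[i] != x:
            self.items.insert(i, x)
    def contains(self, x):
        i = bisect.bisect_left(self.items, x)
        return i < len(self.items) and self.items[i] == x

def simulates(ops):
    """Run the same operations on both stores and check the
    refinement invariant abs_of(concrete) == abstract throughout."""
    a, c = AbstractStore(), ConcreteStore()
    for op, arg in ops:
        if op == "add":
            a.add(arg)
            c.add(arg)
        assert abs_of(c.items) == a.items
        assert a.contains(arg) == c.contains(arg)
    return True
```

In a formal development the simulation would be proved once and for all; the runtime check here merely makes the refinement relation concrete.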
Modernity and morality in Habermas's discourse ethics
Discourse ethics is originally conceived as a programme of philosophical justification of morality. This depends on the formal derivation of the moral principle (U) from non-moral principles. The moral theory is supposed to fall out of a pragmatic theory of meaning. The original programme plays a central role in Habermas's social theory: the moral theory, if true, provides good evidence for the more general theory of modernization. But neither Habermas nor his followers have succeeded in providing a formal derivation. This essay shows how and why Habermas's proposed derivation is impossible. As if aware of the lacuna, Habermas has recently suggested that (U) can be derived by 'abduction' rather than deduction. The proposal draws heavily on modernization theory; hence the only justification for (U) now available to him rests on premises drawn from that theory. The original programme of the justification of morality has thus given way to the weaker programme of the philosophical elucidation of morality. Further, since Habermas's moral theory is no longer justified independently of modernization theory, but at least partly by it, the moral theory cannot without circularity provide evidence for the modernization theory.
On Generating Combilex Pronunciations via Morphological Analysis
Combilex is a high-quality lexicon that has been developed specifically for speech technology purposes and recently released by CSTR. Combilex benefits from many advanced features. This paper explores one of these: the ability to generate fully-specified transcriptions for morphologically derived words automatically. This functionality was originally implemented to encode the pronunciations of derived words in terms of their constituent morphemes, thus accelerating lexicon development and ensuring a high level of consistency. In this paper, we propose that this method of modelling pronunciations can be exploited further by combining it with a morphological parser, thus yielding a method to generate full transcriptions for unknown derived words. Not only could this accelerate adding new derived words to Combilex, but it could also serve as an alternative to conventional letter-to-sound rules. This paper presents preliminary work indicating this is a promising direction.
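The pipeline the abstract describes, parsing a derived word into morphemes and concatenating their transcriptions, can be sketched as follows. This is a hypothetical illustration only: the morpheme lexicon, the phone strings, and the greedy affix-stripping parser are all invented for the example and are not Combilex data or the paper's actual parser.

```python
# Invented morpheme-level mini-lexicon (phones loosely ARPAbet-style).
MORPHEME_PRON = {
    "un-":   ["ah", "n"],
    "happy": ["hh", "ae", "p", "iy"],
    "hope":  ["hh", "ow", "p"],
    "-less": ["l", "ah", "s"],
}

def parse_morphemes(word):
    """Toy morphological parser: greedy prefix/suffix stripping."""
    morphs = []
    if word.startswith("un"):
        morphs.append("un-")
        word = word[2:]
    stem, suffix = word, None
    if word.endswith("less"):
        stem, suffix = word[:-4], "-less"
    morphs.append(stem)
    if suffix:
        morphs.append(suffix)
    return morphs

def pronounce(word):
    """Concatenate morpheme transcriptions for an unseen derived word."""
    return [p for m in parse_morphemes(word) for p in MORPHEME_PRON[m]]
```

A real system would also need morphophonological adjustment at morpheme boundaries (stress shift, assimilation), which plain concatenation ignores.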
Logic-Based Analogical Reasoning and Learning
Analogy-making is at the core of human intelligence and creativity, with applications to such diverse tasks as commonsense reasoning, learning, language acquisition, and storytelling. This paper contributes to the foundations of artificial general intelligence by developing an abstract algebraic framework for logic-based analogical reasoning and learning in the setting of logic programming. The main idea is to define analogy in terms of modularity and to derive abstract forms of concrete programs from a 'known' source domain which can then be instantiated in an 'unknown' target domain to obtain analogous programs. To this end, we introduce algebraic operations for syntactic program composition and concatenation and illustrate, by giving numerous examples, that programs have nice decompositions. Moreover, we show how composition gives rise to a qualitative notion of syntactic program similarity. We then argue that reasoning and learning by analogy is the task of solving analogical proportions between logic programs. Interestingly, our work suggests a close relationship between modularity, generalization, and analogy which we believe should be explored further. In a broader sense, this paper is a first step towards an algebraic and mainly syntactic theory of logic-based analogical reasoning and learning in knowledge representation and reasoning systems, with potential applications to fundamental AI problems like commonsense reasoning and computational learning and creativity.
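The abstract-then-instantiate idea can be sketched in miniature. The code below is not the paper's formalism: it treats a logic program naively as a list of clause strings, "abstracts" it by replacing source-domain predicate symbols with placeholders, and re-instantiates the template with target-domain symbols; all names are invented for illustration.

```python
def abstract(program, source_symbols):
    """Replace source-domain symbols by placeholders X0, X1, ..."""
    mapping = {s: f"X{i}" for i, s in enumerate(source_symbols)}
    out = []
    for clause in program:
        for s, v in mapping.items():
            clause = clause.replace(s, v)
        out.append(clause)
    return out, mapping

def instantiate(template, target_symbols):
    """Fill the placeholders with target-domain symbols."""
    out = []
    for clause in template:
        for i, t in enumerate(target_symbols):
            clause = clause.replace(f"X{i}", t)
        out.append(clause)
    return out

# 'Known' source domain: list concatenation in Prolog syntax.
source = ["app([],Y,Y).", "app([H|T],Y,[H|R]) :- app(T,Y,R)."]
template, _ = abstract(source, ["app"])
# Structurally analogous program instantiated in an 'unknown' target domain.
target = instantiate(template, ["plus"])
```

The paper's actual framework works at the level of algebraic program composition, not string substitution; the sketch only conveys the source-template-target shape of the idea.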
Transformational derivation of programs using the Focus system
A program derivation support system called Focus is being constructed. It will formally derive programs using the paradigm of program transformation. The following issues are discussed: (1) the integration of validation and program derivation activities in the Focus system; (2) its tree-based user interface; (3) the control of search spaces in program derivation; and (4) the structure and organization of program derivation records. The inference procedures of the system are based on the integration of functional and logic programming principles. This brings about a synthesis of paradigms that were heretofore considered far apart, such as logical and executable specifications and constructive and transformational approaches to program derivation. A great emphasis has been placed, in the design of Focus, on achieving small search spaces during program derivation. The program manipulation operations such as expansion, simplification and rewriting were designed with this objective. The role of operations that are expensive in search spaces, such as folding, has been reduced. Program derivations are documented in Focus in such a way that high-level descriptions of derivations are expressed using only program-level information. All the meta-level information, together with dependencies between derivations of program components, is automatically recorded by the system at a lower level of description for its own use in replay.
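Two of the transformation operations the abstract names, expansion (unfolding a call by its definition) and simplification, can be sketched on a toy expression language. This is not the Focus system's machinery; the tuple representation and rules are invented for illustration.

```python
# Definitions available for unfolding: double(x) = x + x.
DEFS = {
    "double": lambda x: ("add", x, x),
}

def unfold(expr):
    """Expansion: replace a defined function call by its right-hand side."""
    if isinstance(expr, tuple) and expr[0] in DEFS:
        return DEFS[expr[0]](*[unfold(e) for e in expr[1:]])
    if isinstance(expr, tuple):
        return tuple([expr[0]] + [unfold(e) for e in expr[1:]])
    return expr

def simplify(expr):
    """Simplification: constant-fold additions."""
    if isinstance(expr, tuple) and expr[0] == "add":
        a, b = simplify(expr[1]), simplify(expr[2])
        if isinstance(a, int) and isinstance(b, int):
            return a + b
        return ("add", a, b)
    return expr
```

Both steps only ever rewrite the term at hand, which is one reason such operations keep derivation search spaces small, unlike folding, which must search for a matching definition instance.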
Tools for producing formal specifications: a view of current architectures and future directions
During the last decade, one important contribution towards requirements engineering has been the advent of formal specification languages. They offer a well-defined notation that can improve consistency and avoid ambiguity in specifications.
However, the process of obtaining formal specifications that are consistent with the requirements is itself a difficult activity. Hence various researchers are developing systems that aid the transition from informal to formal specifications.
The kinds of problems tackled and the contributions made by these proposed systems are very diverse. This paper brings these studies together to provide a vision for future architectures that aim to aid the transition from informal to formal specifications. The new architecture, which is based on the strengths of existing studies, tackles a number of key issues in requirements engineering such as identifying ambiguities, incompleteness, and reusability.
The paper concludes with a discussion of the research problems that need to be addressed in order to realise the proposed architecture.
Hydrodynamic Nambu Brackets derived by Geometric Constraints
A geometric approach to derive the Nambu brackets for ideal two-dimensional
(2D) hydrodynamics is suggested. The derivation is based on two-forms with
vanishing integrals in a periodic domain, and with resulting dynamics
constrained by an orthogonality condition. As a result, 2D hydrodynamics with
vorticity as dynamic variable emerges as a generic model, with conservation
laws which can be interpreted as enstrophy and energy functionals. Generalized
forms like surface quasi-geostrophy and fractional Poisson equations for the
stream-function are also included as results from the derivation. The formalism
is extended to a hydrodynamic system coupled to a second degree of freedom,
with the Rayleigh-B\'{e}nard convection as an example. This system is
reformulated in terms of constitutive conservation laws with two additive
brackets which represent individual processes: a first representing inviscid 2D
hydrodynamics, and a second representing the coupling between hydrodynamics and
thermodynamics. The results can be used for the formulation of conservative
numerical algorithms that can be employed, for example, for the study of fronts
and singularities.Comment: 12 page
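For orientation, the Nambu-bracket form of ideal 2D vorticity dynamics that this line of work builds on can be written, up to sign and normalization conventions, as

```latex
\frac{dF}{dt} = \{F, Z, H\}, \qquad
\{F, Z, H\} = -\int \frac{\delta F}{\delta \zeta}\,
  J\!\left(\frac{\delta Z}{\delta \zeta},
           \frac{\delta H}{\delta \zeta}\right) \mathrm{d}A,
\qquad
J(a,b) = \partial_x a\,\partial_y b - \partial_y a\,\partial_x b,
```

where $\zeta = \nabla^2 \psi$ is the vorticity, $Z = \tfrac{1}{2}\int \zeta^2\,\mathrm{d}A$ the enstrophy, and $H$ the kinetic energy; choosing $F$ as the vorticity at a point recovers the Euler equation $\partial_t \zeta = -J(\psi, \zeta)$. The bracket is totally antisymmetric in its three arguments, which is what makes both $Z$ and $H$ conserved by construction.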