Compositional Model Repositories via Dynamic Constraint Satisfaction with Order-of-Magnitude Preferences
The predominant knowledge-based approach to automated model construction,
compositional modelling, employs a set of models of particular functional
components. Its inference mechanism takes a scenario describing the constituent
interacting components of a system and translates it into a useful mathematical
model. This paper presents a novel compositional modelling approach aimed at
building model repositories. It furthers the field in two respects. Firstly, it
expands the application domain of compositional modelling to systems that
cannot be easily described in terms of interacting functional components, such as
ecological systems. Secondly, it enables the incorporation of user preferences
into the model selection process. These features are achieved by casting the
compositional modelling problem as an activity-based dynamic preference
constraint satisfaction problem, where the dynamic constraints describe the
restrictions imposed over the composition of partial models and the preferences
correspond to those of the user of the automated modeller. In addition, the
preference levels are represented through the use of symbolic values that
differ in orders of magnitude.
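The order-of-magnitude preference scheme described above can be illustrated with a toy sketch (all names, domains and preference levels below are invented for illustration, not taken from the paper): preferences at a higher symbolic level outweigh any number of preferences at lower levels, so candidate scores compare lexicographically as per-level count vectors.

```python
from itertools import product

# Symbolic preference levels, highest magnitude first (assumed labels).
LEVELS = ["high", "medium", "low"]

def score(assignment, preferences):
    """Count how many preferences each level satisfies; compare lexicographically,
    so one 'high' preference outweighs any number of 'low' ones."""
    counts = {lvl: 0 for lvl in LEVELS}
    for pred, lvl in preferences:
        if pred(assignment):
            counts[lvl] += 1
    return tuple(counts[lvl] for lvl in LEVELS)

def solve(variables, domains, constraints, preferences):
    """Brute-force preference CSP: keep assignments meeting all constraints,
    rank them by order-of-magnitude score, return the best."""
    best, best_score = None, None
    for values in product(*(domains[v] for v in variables)):
        a = dict(zip(variables, values))
        if all(c(a) for c in constraints):
            s = score(a, preferences)
            if best_score is None or s > best_score:
                best, best_score = a, s
    return best

# Toy ecological scenario: choose partial models for two processes.
variables = ["growth", "predation"]
domains = {"growth": ["logistic", "exponential"],
           "predation": ["lotka_volterra", "none"]}
# Constraint on model composition: a predation model needs logistic growth.
constraints = [lambda a: a["predation"] == "none" or a["growth"] == "logistic"]
preferences = [
    (lambda a: a["growth"] == "logistic", "high"),
    (lambda a: a["predation"] == "lotka_volterra", "medium"),
    (lambda a: a["predation"] == "none", "low"),
    (lambda a: a["growth"] == "exponential", "low"),
]
print(solve(variables, domains, constraints, preferences))
```

Note that the single "medium" preference for a predation model beats the two "low" preferences combined, which is exactly the order-of-magnitude behaviour.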
First results from 2+1 dynamical quark flavors on an anisotropic lattice: light-hadron spectroscopy and setting the strange-quark mass
We present the first light-hadron spectroscopy on a set of
dynamical, anisotropic lattices. A convenient set of coordinates that
parameterize the two-dimensional plane of light and strange-quark masses is
introduced. These coordinates are used to extrapolate data obtained at the
simulated values of the quark masses to the physical light and strange-quark
point. A measurement of the Sommer scale on these ensembles is made, and the
performance of the hybrid Monte Carlo algorithm used for generating the
ensembles is estimated.
Comment: 24 pages. Hadron Spectrum Collaboration.
Generalized Adaptive Fuzzy Rule Interpolation
As a substantial extension to fuzzy rule interpolation, which works on the basis of two neighbouring rules flanking an observation, adaptive fuzzy rule interpolation is able to restore system consistency when contradictory results are reached during interpolation. The approach first identifies the exhaustive sets of candidates, with each candidate consisting of a set of interpolation procedures which may jointly be responsible for the system inconsistency. Then, individual candidates are modified such that all contradictions are removed and interpolation consistency is restored. The method was developed on the assumption that contradictions can only result from the underlying interpolation mechanism, and that the identified candidates are indistinguishable in terms of their likelihood of being the real culprit. However, this assumption may not hold in real-world situations. This paper therefore further develops the adaptive method by treating observations, rules and interpolation procedures all as diagnosable and modifiable system components. Also, given the common practice in fuzzy systems of associating observations and rules with certainty degrees, the identified candidates are ranked by examining the certainty degrees of their components and their derivatives. Candidate modification is then carried out on the basis of this ranking. This work significantly improves the efficacy of the existing adaptive system by exploiting more information during both the diagnosis and modification processes.
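The core operation being adapted here, interpolation between two flanking rules, can be sketched as follows (a simplified KH-style scheme over triangular fuzzy sets; the rules and numbers are invented for illustration, not taken from the paper):

```python
# Minimal sketch of fuzzy rule interpolation between two flanking rules.
# A triangular fuzzy set is a tuple (a, b, c) of its left foot, peak, right foot.

def rep(t):
    """Representative value of a triangular set: its centroid x-coordinate."""
    a, b, c = t
    return (a + b + c) / 3.0

def interpolate(rule1, rule2, obs):
    """Given rules A1 => B1 and A2 => B2 and an observation A* flanked by A1
    and A2, place the conclusion B* at the same relative position between
    B1 and B2 as A* occupies between A1 and A2."""
    (a1, b1), (a2, b2) = rule1, rule2
    lam = (rep(obs) - rep(a1)) / (rep(a2) - rep(a1))
    return tuple((1 - lam) * p + lam * q for p, q in zip(b1, b2))

# Toy rules on a single input/output dimension.
rule1 = ((0, 1, 2), (0, 1, 2))    # A1 -> B1
rule2 = ((8, 9, 10), (8, 9, 10))  # A2 -> B2
obs = (4, 5, 6)                   # observation lying between A1 and A2
print(interpolate(rule1, rule2, obs))
```

When interpolated conclusions from different rule pairs contradict one another, the adaptive method diagnoses which such interpolation steps (and, in this paper, which observations and rules) to modify.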
Co-evolution of capabilities and preferences in the adoption of new technologies
The objective of this paper is to propose a multidisciplinary approach for the analysis of demand and innovation. It combines insights from studies on technology diffusion, evolutionary economics and cognitive psychology to argue that consumption and demand are learning processes driven by trial-and-error, rather than by ex-ante maximization. The paper presents a heuristic synthesis to incorporate learning processes in the determination of consumption preferences and capabilities. The case of banking service innovation in the UK is presented as an illustrative example of the outlined dynamics.Demand, Innovation, Technology Adoption, Learning
Focusing ATMS Problem-Solving: A Formal Approach
The Assumption-based Truth Maintenance System (ATMS) is a general and powerful problem-solving tool in AI. Unfortunately, its generality usually entails a high computational cost. In this paper, we study how a general notion of cost function can be incorporated into the design of an algorithm for focusing the ATMS, called BF-ATMS. The BF-ATMS algorithm explores a search space of size polynomial in the number of assumptions, even for problems which are proven to have exponential-size labels. Experimental results indicate significant speedups over the standard ATMS for such problems. In addition to its improved efficiency, the BF-ATMS algorithm retains the multiple-context capability of an ATMS, and the important properties of consistency, minimality and soundness, as well as the property of bounded completeness. The usefulness of the new algorithm is demonstrated by its application to the task of consistency-based diagnosis, where dramatic efficiency improvements, with respect to the standard solution technique, are obtained.
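The focusing idea can be sketched in miniature (this is an illustrative toy, not de Kleer's full label-update algorithm or the paper's BF-ATMS): a node's label is the set of minimal, consistent assumption environments supporting it, and a cost bound prunes expensive environments during label combination.

```python
from itertools import product

def minimal(envs):
    """Keep only environments with no proper subset also present."""
    return {e for e in envs if not any(f < e for f in envs)}

def combine(labels, nogoods, cost, bound):
    """Label of a node justified by antecedents with the given labels:
    union one supporting environment per antecedent, drop environments
    that contain a known nogood or exceed the cost bound, then minimise."""
    out = set()
    for choice in product(*labels):
        env = frozenset().union(*choice)
        if any(ng <= env for ng in nogoods):
            continue  # inconsistent: subsumed by a nogood
        if cost(env) > bound:
            continue  # focusing: prune environments beyond the cost bound
        out.add(env)
    return minimal(out)

A, B, C = "A", "B", "C"
labels = [{frozenset({A}), frozenset({B})},  # label of antecedent 1
          {frozenset({C})}]                  # label of antecedent 2
nogoods = {frozenset({B, C})}                # B and C are jointly inconsistent
label = combine(labels, nogoods, cost=len, bound=2)
print(label)
```

Here the cost function is simply environment size; the paper's contribution is precisely that a general cost notion can drive this pruning while preserving consistency, minimality, soundness and bounded completeness.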
Reducing fuzzy answer set programming to model finding in fuzzy logics
In recent years, answer set programming (ASP) has been extended to deal with multivalued predicates. The resulting formalisms allow for the modeling of continuous problems as elegantly as ASP allows for the modeling of discrete problems, by combining the stable model semantics underlying ASP with fuzzy logics. However, contrary to the case of classical ASP where many efficient solvers have been constructed, to date there is no efficient fuzzy ASP solver. A well-known technique for classical ASP consists of translating an ASP program P to a propositional theory whose models exactly correspond to the answer sets of P. In this paper, we show how this idea can be extended to fuzzy ASP, paving the way to implement efficient fuzzy ASP solvers that can take advantage of existing fuzzy logic reasoners.
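The target side of such a reduction, model finding in a fuzzy logic, can be sketched with a brute-force search over a discretised truth space under Łukasiewicz semantics (the predicates and rules below are toy assumptions, not the paper's translation):

```python
from itertools import product

def t_norm(a, b):
    """Lukasiewicz conjunction: max(0, a + b - 1)."""
    return max(0.0, a + b - 1.0)

def implies(a, b):
    """Lukasiewicz implication (residuum): min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

def is_model(interp, rules):
    """An interpretation satisfies the theory if every rule holds to degree 1."""
    return all(implies(body(interp), head(interp)) >= 1.0 for body, head in rules)

# Toy fuzzy theory over predicates p, q:
#   0.6 -> p          (p is true to degree at least 0.6)
#   p & p -> q        (q is at least the Lukasiewicz square of p)
rules = [
    (lambda i: 0.6, lambda i: i["p"]),
    (lambda i: t_norm(i["p"], i["p"]), lambda i: i["q"]),
]

grid = [k / 10 for k in range(11)]  # truth degrees 0.0, 0.1, ..., 1.0
models = [dict(zip(("p", "q"), vals))
          for vals in product(grid, repeat=2)
          if is_model(dict(zip(("p", "q"), vals)), rules)]
print(min(models, key=lambda m: (m["p"], m["q"])))  # least model on the grid
```

A real reduction would of course hand the translated theory to an off-the-shelf fuzzy logic reasoner rather than enumerate a grid; the point is only that fuzzy model finding is the target problem.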
ATMS-Based architecture for stylistics-aware text generation
This thesis is concerned with the effect of surface stylistic constraints (SSC) on syntactic
and lexical choice within a unified generation architecture. Despite the fact that these
issues have been investigated by researchers in the field, little work has been done with
regard to system architectures that allow surface form constraints to influence earlier
linguistic or even semantic decisions made throughout the NLG process. By SSC we
mean those stylistic requirements that are known beforehand but cannot be tested
until after the utterance or — in some lucky cases — until a proper linearised part
of it has been generated. These include collocational constraints, text size limits, and
poetic aspects such as rhyme and metre to name a few.
This thesis introduces a new NLG architecture that can be sensitive to surface stylistic
requirements. It brings together a well-founded linguistic theory that has been used
in many successful NLG systems (Systemic Functional Linguistics, SFL) and an existing
AI search mechanism (the Assumption-based Truth Maintenance System, ATMS)
which caches important search information and avoids work duplication.
To this end, the thesis explores the logical relation between the grammar formalism and
the search technique. It designs, based on that logical connection, an algorithm for the
automatic translation of systemic grammar networks to ATMS dependency networks.
The generator then uses the translated networks to generate natural language texts
with a high paraphrasing power as a direct result of its ability to pursue multiple paths
simultaneously. The thesis approaches the crucial notion of choice differently to previous
systems using SFL. It relaxes the choice process in that choosers are not obliged to
deterministically choose a single alternative, allowing SSC to influence the final lexical
and syntactic decisions. The thesis also develops a situation-action framework for the
specification of stylistic requirements independently of the micro-semantic input. The
user or application can state what surface requirements they wish to impose and the
ATMS-based generator then attempts to satisfy these constraints.
Finally, a prototype ATMS-based generation system embodying the ideas presented in
this thesis is implemented and evaluated. We examine the system's stylistic sensitivity
by testing it on three different sets of stylistic requirements, namely: collocational,
size, and poetic constraints.
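The multiple-path idea at the heart of this architecture can be sketched very roughly (the grammar fragment and size limit below are invented for illustration): instead of choosers committing to one alternative per system, all alternatives are kept open, every consistent combination is realised, and surface stylistic constraints, which are only testable on finished (or partially linearised) text, select among the candidates.

```python
from itertools import product

# Toy "systemic" choice points, each offering alternative realisations.
systems = {
    "mood": ["please open the door", "open the door"],
    "politeness_tag": ["", " if you would"],
}

def realisations(systems):
    """Pursue every combination of choices, as an ATMS pursues multiple
    environments simultaneously, yielding one candidate text per path."""
    for choice in product(*systems.values()):
        yield "".join(choice)

def satisfies_ssc(text, max_len):
    """A surface stylistic constraint (here a size limit) that can only be
    checked after the surface form exists."""
    return len(text) <= max_len

candidates = [t for t in realisations(systems) if satisfies_ssc(t, 20)]
print(candidates)
```

In the thesis, the bookkeeping over these paths is done by the ATMS dependency network translated from the systemic grammar, so shared sub-derivations are cached rather than recomputed per path.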
Analytical methodology for ATM control panel design
This thesis presents a methodology for control panel design and layout along with a case study of an automated teller machine (ATM). A predictive model of human endurance and fatigue is developed from anthropometric, biomechanical and kinematics research. The layout problem is formulated to assign controls to locations so as to minimize the fatigue imposed on an operator performing a known set of tasks. A family of optimal and near-optimal layouts is found using conventional algorithms. The final hardware design refinements are suggested by human factors concerns. Ergonomic guidelines are also proposed for software aspects of the design. The methods and guidelines can provide hardware and software designers with useful insights into some human-machine interface considerations.
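The assignment formulation can be sketched in miniature (the controls, locations, frequencies and fatigue costs below are invented toy numbers, not the thesis's predictive fatigue model): assign each control to a location so that the total fatigue of a known task mix is minimised.

```python
from itertools import permutations

def total_fatigue(assignment, use_freq, fatigue):
    """Total cost: (how often a control is used) x (fatigue of its location),
    summed over all controls."""
    return sum(use_freq[c] * fatigue[loc] for c, loc in assignment.items())

def best_layout(controls, locations, use_freq, fatigue):
    """Exhaustive search over one-to-one assignments; fine for small panels.
    Larger panels would call for an assignment-problem solver instead."""
    best, best_cost = None, float("inf")
    for perm in permutations(locations):
        a = dict(zip(controls, perm))
        cost = total_fatigue(a, use_freq, fatigue)
        if cost < best_cost:
            best, best_cost = a, cost
    return best, best_cost

controls = ["keypad", "card_slot", "receipt"]
locations = ["centre", "upper", "lower"]
use_freq = {"keypad": 10, "card_slot": 3, "receipt": 1}  # uses per transaction
fatigue = {"centre": 1.0, "upper": 2.5, "lower": 2.0}    # cost of reaching each spot
layout, cost = best_layout(controls, locations, use_freq, fatigue)
print(layout, cost)
```

As expected, the most frequently used control lands in the least fatiguing location; the thesis additionally keeps near-optimal layouts so that human-factors concerns can refine the final choice.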