On Making Good Games - Using Player Virtue Ethics and Gameplay Design Patterns to Identify Generally Desirable Gameplay Features
This paper uses a framework of player virtues to perform a
theoretical exploration of what is required to make a game
good. The choice of player virtues is based upon the view
that games can be seen as implements, that implements are
good if they support an intended use, and that the intended use
of games is to support people in being good players. A collection of gameplay design patterns, identified through
their relation to the virtues, is presented to provide specific starting points for considering design options for this type of good game. 24 patterns supporting the virtues are identified, including RISK/REWARD, DYNAMIC ALLIANCES, GAME MASTERS, and PLAYER DECIDED RESULTS, as are 7 patterns countering three or more virtues, including ANALYSIS
PARALYSIS, EARLY ELIMINATION, and GRINDING. The paper concludes by identifying limitations of the approach as well as by showing how it can be applied using other views of what features are preferable in games.
Combining Provenance Management and Schema Evolution
The combination of provenance management and schema evolution using the CHASE algorithm is the focus of our research in the area of research data management. The aim is to combine the construction of a CHASE inverse mapping, which calculates the minimal part of the original database (the minimal sub-database), with a CHASE-based schema mapping for schema evolution.
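The "minimal sub-database" idea above can be made concrete with a small provenance sketch: if we remember, for each query result, which source tuples support it (why-provenance), then only that minimal part of the original database needs to be preserved. The schema and query below are invented for illustration; this is not the paper's CHASE-inverse construction.

```python
# Sketch of the "minimal sub-database" idea: track which source tuples
# a query result depends on (why-provenance), so only that minimal
# part of the original database must be kept.  Schema invented.
def project_with_provenance(rows, col):
    """Project a column, remembering which rows support each value."""
    prov = {}
    for row in rows:
        prov.setdefault(row[col], set()).add(row)
    return prov

measurements = [("s1", 20.5), ("s1", 21.0), ("s2", 19.8)]
prov = project_with_provenance(measurements, 0)

# The minimal sub-database supporting result value "s2" is one tuple:
print(prov["s2"])  # {('s2', 19.8)}
```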
Composition with Target Constraints
It is known that the composition of schema mappings, each specified by
source-to-target tgds (st-tgds), can be specified by a second-order tgd (SO
tgd). We consider the question of what happens when target constraints are
allowed. Specifically, we consider the question of specifying the composition
of standard schema mappings (those specified by st-tgds, target egds, and a
weakly acyclic set of target tgds). We show that SO tgds, even with the
assistance of arbitrary source constraints and target constraints, cannot
specify in general the composition of two standard schema mappings. Therefore,
we introduce source-to-target second-order dependencies (st-SO dependencies),
which are similar to SO tgds, but allow equations in the conclusion. We show
that st-SO dependencies (along with target egds and target tgds) are sufficient
to express the composition of every finite sequence of standard schema
mappings, and further, every st-SO dependency specifies such a composition. In
addition to this expressive power, we show that st-SO dependencies enjoy other
desirable properties. In particular, they have a polynomial-time chase that
generates a universal solution. This universal solution can be used to find the
certain answers to unions of conjunctive queries in polynomial time. It is easy
to show that the composition of an arbitrary number of standard schema mappings
is equivalent to the composition of only two standard schema mappings. We show
that, surprisingly, the analogous result also holds for schema mappings
specified by just st-tgds (no target constraints). This is proven by showing
that every SO tgd is equivalent to an unnested SO tgd (one where there is no
nesting of function symbols). Similarly, we prove unnesting results for st-SO
dependencies, with the same types of consequences.
Comment: This paper is an extended version of: M. Arenas, R. Fagin, and A.
Nash. Composition with Target Constraints. In 13th International Conference
on Database Theory (ICDT), pages 129-142, 201
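The polynomial-time chase mentioned above can be illustrated on a single st-tgd. A minimal sketch, with invented relation names (Emp, Mgr) rather than anything from the paper: every Emp fact fires the tgd Emp(e) → ∃m Mgr(e, m), and the chase introduces a labeled null for each unknown manager, yielding a universal solution.

```python
# Toy chase for one source-to-target tgd, showing how a universal
# solution with labeled nulls arises.  Relation names and the mapping
# (Emp, Mgr) are hypothetical examples, not taken from the paper.
import itertools

_null_ids = itertools.count(1)

def fresh_null():
    # Labeled nulls stand for required but unknown values.
    return f"N{next(_null_ids)}"

def chase_emp_to_mgr(source_emps):
    """Chase the st-tgd  Emp(e) -> exists m. Mgr(e, m):
    every employee must have some manager in the target."""
    target = []
    for e in source_emps:
        target.append(("Mgr", e, fresh_null()))
    return target

solution = chase_emp_to_mgr(["alice", "bob"])
print(solution)  # e.g. [('Mgr', 'alice', 'N1'), ('Mgr', 'bob', 'N2')]
```

The nulls are what make the solution universal: any concrete solution can be obtained by substituting values for them.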
Querying Geometric Figures Using a Controlled Language, Ontological Graphs and Dependency Lattices
Dynamic geometry systems (DGS) have become basic tools in many areas of
geometry, for example in education. Geometry Automated Theorem Provers
(GATP) are an active area of research and are considered basic tools
in future enhanced educational software as well as in a next generation of
mechanized mathematics assistants. Recently emerged Web repositories of
geometric knowledge, like TGTP and Intergeo, are an attempt to make the already
vast body of geometric knowledge widely available. Considering the large
amount of geometric information already available, we face the need for a query
mechanism for descriptions of geometric constructions.
In this paper we discuss two approaches for describing geometric figures
(declarative and procedural), and present algorithms for querying geometric
figures in declaratively and procedurally described corpora, by using a DGS or
a dedicated controlled natural language for queries.
Comment: 14 pages, 5 figures, accepted at CICM 201
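The procedural approach above can be sketched very simply: represent a construction as a sequence of construction steps, and let a query match when all of its steps occur in the construction. The step vocabulary below is invented for illustration and is not the controlled language from the paper.

```python
# Minimal sketch of querying procedurally described geometric figures:
# a figure is a list of construction steps, and a query matches when
# every one of its steps occurs in the figure.  Step names invented.
def matches(construction, query):
    """True iff every step of the query occurs in the construction."""
    return set(query) <= set(construction)

corpus = {
    "fig1": [("point", "A"), ("point", "B"), ("segment", "A", "B"),
             ("midpoint", "M", "A", "B")],
    "fig2": [("point", "A"), ("point", "B"), ("line", "A", "B")],
}

query = [("segment", "A", "B"), ("midpoint", "M", "A", "B")]
hits = [name for name, steps in corpus.items() if matches(steps, query)]
print(hits)  # ['fig1']
```

A real system would also have to match modulo renaming of points; this sketch matches steps literally.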
Exchange-Repairs: Managing Inconsistency in Data Exchange
In a data exchange setting with target constraints, it is often the case that
a given source instance has no solutions. In such cases, the semantics of
target queries trivialize. The aim of this paper is to introduce and explore a
new framework that gives meaningful semantics in such cases by using the notion
of exchange-repairs. Informally, an exchange-repair of a source instance is
another source instance that differs minimally from the first, but has a
solution. Exchange-repairs give rise to a natural notion of exchange-repair
certain answers (XR-certain answers) for target queries. We show that for
schema mappings specified by source-to-target GAV dependencies and target
equality-generating dependencies (egds), the XR-certain answers of a target
conjunctive query can be rewritten as the consistent answers (in the sense of
standard database repairs) of a union of conjunctive queries over the source
schema with respect to a set of egds over the source schema, making it possible
to use a consistent query-answering system to compute XR-certain answers in
data exchange. We then examine the general case of schema mappings specified by
source-to-target GLAV constraints, a weakly acyclic set of target tgds and a
set of target egds. The main result asserts that, for such settings, the
XR-certain answers of conjunctive queries can be rewritten as the certain
answers of a union of conjunctive queries with respect to the stable models of
a disjunctive logic program over a suitable expansion of the source schema.
Comment: 29 pages, 13 figures, submitted to the Journal on Data Semantic
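The exchange-repair notion above can be illustrated on a tiny example. Suppose a (hypothetical) target egd forces each employee to have a single department; then a source instance assigning two departments to the same employee has no solution, and a repair is a subinstance, differing as little as possible, that does. Schema and data below are invented; the sketch uses cardinality-maximal subinstances as a stand-in for minimal difference.

```python
# Toy exchange-repairs: find the largest subinstances of the source
# that admit a solution under the (hypothetical) egd
#   Emp(e, d1) & Emp(e, d2) -> d1 = d2.
from itertools import combinations

def has_solution(source):
    depts = {}
    for e, d in source:
        if depts.setdefault(e, d) != d:
            return False  # egd violated: no solution
    return True

def exchange_repairs(source):
    """All cardinality-maximal subinstances admitting a solution."""
    for k in range(len(source), -1, -1):
        reps = [set(sub) for sub in combinations(source, k)
                if has_solution(sub)]
        if reps:
            return reps
    return []

source = [("ann", "sales"), ("ann", "hr"), ("bob", "it")]
for rep in exchange_repairs(source):
    print(sorted(rep))
```

Here there are two repairs (keep Ann in sales, or keep Ann in HR), and the XR-certain answers are those true in every repair, e.g. that Bob is in IT.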
Sacrificing Accuracy for Reduced Computation: Cascaded Inference Based on Softmax Confidence
We study the tradeoff between computational effort and accuracy in a cascade
of deep neural networks. During inference, early termination in the cascade is
controlled by confidence levels derived directly from the softmax outputs of
intermediate classifiers. The advantage of early termination is that
classification is performed using less computation, thus adjusting the
computational effort to the complexity of the input. Moreover, dynamic
modification of confidence thresholds allows one to trade accuracy for
computational effort without requiring retraining. Basing early termination
on softmax classifier outputs is justified by experimentation that demonstrates
an almost linear relation between confidence levels in intermediate classifiers
and accuracy. Our experimentation with architectures based on ResNet obtained
the following results. (i) A speedup of 1.5 that sacrifices 1.4% accuracy with
respect to the CIFAR-10 test set. (ii) A speedup of 1.19 that sacrifices 0.7%
accuracy with respect to the CIFAR-100 test set. (iii) A speedup of 2.16 that
sacrifices 1.4% accuracy with respect to the SVHN test set.
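The early-termination rule above is easy to sketch: run the cheap classifiers first and invoke later, more expensive stages only when the top softmax probability falls below a threshold. The two-stage "models" below are stand-ins, not the ResNet-based architectures from the paper.

```python
# Minimal sketch of cascaded inference with softmax-confidence early
# exit.  Each stage maps an input to logits; we stop as soon as the
# maximum softmax probability reaches the threshold.
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cascaded_predict(stages, x, threshold=0.9):
    """Run stages in order; exit once confidence >= threshold."""
    probs = None
    for i, stage in enumerate(stages):
        probs = softmax(stage(x))
        if max(probs) >= threshold:
            return probs.index(max(probs)), i  # (class, exit stage)
    return probs.index(max(probs)), len(stages) - 1

# Stand-in stages: the cheap one is confident only on "easy" inputs.
cheap  = lambda x: [4.0, 0.0] if x == "easy" else [0.2, 0.1]
costly = lambda x: [0.0, 3.0]

print(cascaded_predict([cheap, costly], "easy"))  # (0, 0): early exit
print(cascaded_predict([cheap, costly], "hard"))  # (1, 1): full cascade
```

Raising the threshold shifts the trade-off toward accuracy (more inputs reach the expensive stage); lowering it saves computation, with no retraining needed.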
Probabilistic Algorithmic Knowledge
The framework of algorithmic knowledge assumes that agents use deterministic
knowledge algorithms to compute the facts they explicitly know. We extend the
framework to allow for randomized knowledge algorithms. We then characterize
the information provided by a randomized knowledge algorithm when its answers
have some probability of being incorrect. We formalize this information in
terms of evidence; a randomized knowledge algorithm returning ``Yes'' to a
query about a fact \phi provides evidence for \phi being true. Finally, we
discuss the extent to which this evidence can be used as a basis for decisions.
Comment: 26 pages. A preliminary version appeared in Proc. 9th Conference on
Theoretical Aspects of Rationality and Knowledge (TARK'03)
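The evidential reading of a "Yes" answer can be sketched with Bayes' rule: if the randomized algorithm's error rates are known, a "Yes" shifts the odds on phi by a likelihood ratio. The error rates below are invented for illustration, not taken from the paper.

```python
# Sketch of how a "Yes" from a randomized knowledge algorithm provides
# evidence for phi: given the algorithm's (assumed, known) error
# rates, Bayes' rule updates the probability of phi.
def posterior_phi(prior, p_yes_given_phi, p_yes_given_not_phi):
    """Probability of phi after observing a 'Yes' answer."""
    num = p_yes_given_phi * prior
    den = num + p_yes_given_not_phi * (1 - prior)
    return num / den

# Algorithm says "Yes" 90% of the time when phi holds, 20% otherwise:
# observing "Yes" raises a 0.5 prior to about 0.82.
print(posterior_phi(0.5, 0.9, 0.2))
```

The likelihood ratio 0.9/0.2 is exactly the "weight of evidence" the answer carries, independent of the prior.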
Hypergraph Acyclicity and Propositional Model Counting
We show that the propositional model counting problem #SAT for CNF-formulas
with hypergraphs that allow a disjoint branches decomposition can be solved in
polynomial time. We show that this class of hypergraphs is incomparable to
hypergraphs of bounded incidence cliquewidth, which previously formed the largest
class of hypergraphs for which #SAT was known to be solvable in polynomial time.
Furthermore, we present a polynomial time algorithm that computes a disjoint
branches decomposition of a given hypergraph if it exists and rejects
otherwise. Finally, we show that some slight extensions of the class of
hypergraphs with disjoint branches decompositions lead to intractable #SAT,
leaving open how to generalize the counting result of this paper.
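For concreteness, #SAT asks for the number of satisfying assignments of a CNF formula. A brute-force counter, exponential in the number of variables, makes the problem precise; the paper's contribution is a polynomial-time algorithm for the special case of formulas whose hypergraphs admit a disjoint branches decomposition.

```python
# Naive #SAT: count satisfying assignments of a CNF formula given in
# DIMACS-style convention (literal v means variable v true, -v false).
# Exponential baseline for illustration only.
from itertools import product

def count_models(num_vars, clauses):
    count = 0
    for bits in product([False, True], repeat=num_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c)
               for c in clauses):
            count += 1
    return count

# (x1 or x2) and (not x1 or x3): 4 of the 8 assignments satisfy it.
print(count_models(3, [[1, 2], [-1, 3]]))  # 4
```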
A logic for reasoning about knowledge of unawareness
In the most popular logics combining knowledge and awareness, it is not possible to express statements about knowledge of unawareness, such as “Ann knows that Bill is aware of something Ann is not aware of”, without using a stronger statement such as “Ann knows that Bill is aware of p and Ann is not aware of p” for some particular p. Halpern and Rêgo (2006, 2009b; revisited in Halpern and Rêgo (2009a, 2013)) introduced a logic in which such statements about knowledge of unawareness can be expressed. The logic extends the traditional framework with quantification over formulae, and is thus very expressive. As a consequence, it is not decidable. In this paper we introduce a decidable logic which can be used to reason about certain types of unawareness. Our logic extends the traditional framework with an operator expressing full awareness, i.e., the fact that an agent is aware of everything, and another operator expressing relative awareness, i.e., the fact that one agent is aware of everything another agent is aware of. The logic is less expressive than Halpern and Rêgo's logic. It is, however, expressive enough to express all of the motivating examples in Halpern and Rêgo (2006, 2009b). In addition to proving that the logic is decidable and that its satisfiability problem is PSPACE-complete, we present an axiomatisation which we show is sound and complete.