Generating Functions For Kernels of Digraphs (Enumeration & Asymptotics for Nim Games)
In this article, we study directed graphs (digraphs) with a coloring
constraint due to Von Neumann and related to Nim-type games. This is equivalent
to the notion of kernels of digraphs, which appears in numerous fields of
research such as game theory, complexity theory, artificial intelligence
(default logic, argumentation in multi-agent systems), 0-1 laws in monadic
second order logic, combinatorics (perfect graphs)... Kernels of digraphs lead
to numerous difficult questions (in the sense of NP-completeness,
#P-completeness). However, we show here that it is possible to use a generating
function approach to obtain new information: we use techniques of symbolic and
analytic combinatorics (generating functions and their singularities) in order
to get exact and asymptotic results, e.g. for the existence of a kernel in a
circuit or in a unicircuit digraph. This is a first step toward a
generatingfunctionology treatment of kernels, using, e.g., an approach "à la
Wright". Our method could be applied to more general "local coloring
constraints" in decomposable combinatorial structures.
Comment: Presented (as a poster) at the conference Formal Power Series and
Algebraic Combinatorics (Vancouver, 2004), electronic proceedings
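A kernel of a digraph is an independent set K of vertices such that every vertex outside K has an arc into K (the losing positions of the associated Nim-type game). As a sketch for intuition only — the paper's approach is analytic, not enumerative — a brute-force enumeration makes the definition and the circuit example concrete:

```python
from itertools import combinations

def kernels(vertices, edges):
    """Enumerate all kernels of a digraph by brute force.

    A kernel is an independent set K (no arc between members)
    such that every vertex outside K has a successor inside K.
    Exponential in |V|; for intuition on tiny examples only.
    """
    succ = {v: set() for v in vertices}
    for u, v in edges:
        succ[u].add(v)
    found = []
    for r in range(len(vertices) + 1):
        for cand in combinations(vertices, r):
            K = set(cand)
            independent = all(not (succ[u] & K) for u in K)
            absorbing = all(succ[u] & K for u in vertices if u not in K)
            if independent and absorbing:
                found.append(K)
    return found

# A directed 3-cycle (odd circuit) has no kernel; a 4-cycle has two.
print(kernels([0, 1, 2], [(0, 1), (1, 2), (2, 0)]))             # []
print(kernels([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)]))  # [{0, 2}, {1, 3}]
```

The output illustrates the classical parity fact behind the circuit results the abstract mentions: odd circuits admit no kernel, even circuits admit exactly two.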
Semantic networks
A semantic network is a graph of the structure of meaning. This article introduces semantic network systems and their importance in Artificial Intelligence, followed by I. the early background; II. a summary of the basic ideas and issues including link types, frame systems, case relations, link valence, abstraction, inheritance hierarchies and logic extensions; and III. a survey of “world-structuring” systems including ontologies, causal link models, continuous models, relevance, formal dictionaries, semantic primitives and intersecting inference hierarchies. Speed and practical implementation are briefly discussed. The conclusion argues for a synthesis of relational graph theory, graph-grammar theory and order theory based on semantic primitives and multiple intersecting inference hierarchies.
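Inheritance over "isa" links, one of the basic ideas the survey lists, can be sketched in a few lines (the node and attribute names are illustrative, not from the article):

```python
def inherited(network, node, attribute):
    """Look up `attribute` on `node`, walking 'isa' links upward.
    A local value overrides an inherited one, which is how
    semantic networks handle exceptions."""
    while node is not None:
        props = network.get(node, {})
        if attribute in props:
            return props[attribute]
        node = props.get("isa")  # climb the inheritance hierarchy
    return None

# Toy network; keys are nodes, values are their local properties.
net = {
    "animal":  {"alive": True},
    "bird":    {"isa": "animal", "can_fly": True},
    "penguin": {"isa": "bird", "can_fly": False},  # exception overrides 'bird'
}
print(inherited(net, "penguin", "alive"))    # True  (inherited from 'animal')
print(inherited(net, "penguin", "can_fly"))  # False (local override)
```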
What matters to older people with assisted living needs? A phenomenological analysis of the use and non-use of telehealth and telecare
Telehealth and telecare research has been dominated by efficacy trials. The field lacks a sophisticated theorisation of [a] what matters to older people with assisted living needs; [b] how illness affects people's capacity to use technologies; and [c] the materiality of assistive technologies. We sought to develop a phenomenologically and socio-materially informed theoretical model of assistive technology use. Forty people aged 60–98 (recruited via NHS, social care and third sector) were visited at home several times in 2011–13. Using ethnographic methods, we built a detailed picture of participants' lives, illness experiences and use (or non-use) of technologies. Data were analysed phenomenologically, drawing on the work of Heidegger, and contextualised using a structuration approach with reference to Bourdieu's notions of habitus and field. We found that participants' needs were diverse and unique. Each had multiple, mutually reinforcing impairments (e.g. tremor and visual loss and stiff hands) that were steadily worsening, culturally framed and bound up with the prospect of decline and death. They managed these conditions subjectively and experientially, appropriating or adapting technologies so as to enhance their capacity to sense and act on their world. Installed assistive technologies met few participants' needs; some devices had been abandoned and a few deliberately disabled. Successful technology arrangements were often characterised by “bricolage” (pragmatic customisation, combining new with legacy devices) by the participant or someone who knew and cared about them. With few exceptions, the current generation of so-called “assisted living technologies” does not assist people to live with illness. To overcome this irony, technology providers need to move beyond the goal of representing technology users informationally (e.g.
as biometric data) to providing flexible components from which individuals and their carers can “think with things” to improve the situated, lived experience of multi-morbidity. A radical revision of assistive technology design policy may be needed.
Expressive Stream Reasoning with Laser
An increasing number of use cases require a timely extraction of non-trivial
knowledge from semantically annotated data streams, especially on the Web and
for the Internet of Things (IoT). Often, this extraction requires expressive
reasoning, which is challenging to compute on large streams. We propose Laser,
a new reasoner that supports a pragmatic, non-trivial fragment of the logic
LARS which extends Answer Set Programming (ASP) for streams. At its core, Laser
implements a novel evaluation procedure which annotates formulae to avoid the
re-computation of duplicates at multiple time points. This procedure, combined
with a judicious implementation of the LARS operators, is responsible for
significantly better runtimes than those of other state-of-the-art systems
like C-SPARQL and CQELS, or an implementation of LARS which runs on the ASP
solver Clingo. This enables the application of expressive logic-based reasoning
to large streams and opens the door to a wider range of stream reasoning use
cases.
Comment: 19 pages, 5 figures. Extended version of accepted paper at ISWC 201
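The annotation idea can be illustrated with a toy version of a LARS-style window operator: to decide whether an atom occurred "somewhere in the last n time points", remember when it was last seen instead of rescanning the window at every time point. This is a sketch of the general caching idea only, not Laser's actual evaluation procedure:

```python
def diamond_window(stream, atom, n):
    """For each time point t, report whether `atom` occurred in the
    last n time points (a 'somewhere in the window' query).

    Rather than rescanning the window at every t, we annotate the
    atom with the last time it held -- avoiding re-computation of
    the same result at consecutive time points.
    """
    last_seen = None
    out = []
    for t, atoms in enumerate(stream):
        if atom in atoms:
            last_seen = t
        out.append(last_seen is not None and t - last_seen < n)
    return out

# Stream: one set of atoms per time point.
s = [{"a"}, set(), set(), {"b"}, set()]
print(diamond_window(s, "a", 3))  # [True, True, True, False, False]
```

Each query here is answered in constant time per time point; the per-formula bookkeeping grows with the number of tracked atoms, not with the window length.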
A General Framework for Representing, Reasoning and Querying with Annotated Semantic Web Data
We describe a generic framework for representing and reasoning with annotated
Semantic Web data, a task becoming more important with the recent increased
amount of inconsistent and unreliable metadata on the Web. We formalise the
annotated language, the corresponding deductive system and address the query
answering problem. Previous contributions on specific RDF annotation domains
are encompassed by our unified reasoning formalism as we show by instantiating
it on (i) temporal, (ii) fuzzy, and (iii) provenance annotations. Moreover, we
provide a generic method for combining multiple annotation domains, allowing
one to represent, e.g., temporally-annotated fuzzy RDF. Furthermore, we address
the development of a query language -- AnQL -- that is inspired by SPARQL,
including several features of SPARQL 1.1 (subqueries, aggregates, assignment,
solution modifiers) along with the formal definitions of their semantics.
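As a sketch of how a fuzzy annotation domain behaves under inference — min combines the premises of one derivation, max chooses among alternative derivations — consider type inference along rdfs:subClassOf chains. The predicate names, quadruple encoding, and acyclicity assumption are ours, not the paper's formalism:

```python
def sub_degree(triples, sub, cls):
    """Fuzzy degree to which `sub` is a (transitive) subclass of
    `cls`.  Reflexive case is 1.0; assumes an acyclic hierarchy."""
    if sub == cls:
        return 1.0
    return max((min(d, sub_degree(triples, o, cls))
                for (s, p, o, d) in triples
                if s == sub and p == "subClassOf"), default=0.0)

def degree(triples, entity, cls):
    """Fuzzy degree to which `entity` has type `cls`:
    min along one derivation chain, max over alternatives."""
    return max((min(d, sub_degree(triples, o, cls))
                for (s, p, o, d) in triples
                if s == entity and p == "type"), default=0.0)

# Quadruples (subject, predicate, object, fuzzy degree); toy data.
data = [
    ("tweety", "type", "Penguin", 0.9),
    ("Penguin", "subClassOf", "Bird", 1.0),
    ("Bird", "subClassOf", "FlyingThing", 0.7),
]
print(degree(data, "tweety", "FlyingThing"))  # 0.7 = min(0.9, 1.0, 0.7)
```

Swapping (min, max) for, say, interval intersection and union would give a temporal domain instead — the point of the paper's generic framework is that the reasoning procedure stays the same while the annotation operations vary.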
The Grail theorem prover: Type theory for syntax and semantics
As the name suggests, type-logical grammars are a grammar formalism based on
logic and type theory. From the perspective of grammar design, type-logical
grammars develop the syntactic and semantic aspects of linguistic phenomena
hand-in-hand, letting the desired semantics of an expression inform the
syntactic type and vice versa. Prototypical examples of the successful
application of type-logical grammars to the syntax-semantics interface include
coordination, quantifier scope and extraction.
This chapter describes the Grail theorem prover, a series of tools for
designing and testing grammars in various modern type-logical grammars. All
tools described in this chapter are freely available.
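A toy AB categorial fragment gives the flavour of the category-driven parsing that type-logical grammars generalise; it falls far short of the modern calculi Grail actually supports, and the category encoding below is ours:

```python
def reduce_pair(a, b):
    """Forward and backward application of AB categorial grammar:
    X/Y . Y => X   and   Y . Y\\X => X.
    Categories: atoms are strings; ('/', X, Y) is X/Y (seeks Y on
    the right), ('\\', Y, X) is Y\\X (seeks Y on the left)."""
    if isinstance(a, tuple) and a[0] == '/' and a[2] == b:
        return a[1]
    if isinstance(b, tuple) and b[0] == '\\' and b[1] == a:
        return b[2]
    return None

def parses(cats, goal):
    """Can the category sequence be reduced to `goal`?
    Naive backtracking over adjacent reductions; toy-sized only."""
    if cats == [goal]:
        return True
    for i in range(len(cats) - 1):
        r = reduce_pair(cats[i], cats[i + 1])
        if r is not None and parses(cats[:i] + [r] + cats[i + 2:], goal):
            return True
    return False

NP, S = 'np', 's'
IV = ('\\', NP, S)   # intransitive verb: np\s
TV = ('/', IV, NP)   # transitive verb: (np\s)/np
print(parses([NP, TV, NP], S))  # True  ("Alice sees Bob")
print(parses([NP, NP, TV], S))  # False (ill-formed order)
```

In a full type-logical grammar each reduction step would also build a lambda term, which is how the syntactic derivation and the semantics develop hand-in-hand, as the abstract describes.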