A Boxology of Design Patterns for Hybrid Learning and Reasoning Systems
We propose a set of compositional design patterns to describe a large variety
of systems that combine statistical techniques from machine learning with
symbolic techniques from knowledge representation. As in other areas of
computer science (knowledge engineering, software engineering, ontology
engineering, process mining and others), such design patterns help to
systematize the literature, clarify which combinations of techniques serve
which purposes, and encourage re-use of software components. We have validated
our set of compositional design patterns against a large body of recent
literature.
Comment: 12 pages, 55 references
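One of the simplest such compositional patterns can be sketched in Python: a learned statistical classifier proposes an answer, and a symbolic rule base overrides it when background knowledge applies. The function names and the toy rule base below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of one hybrid learning/reasoning pattern:
# a "reasoning box" (symbolic rules) post-processes a "learning box"
# (statistical classifier). All names and data here are invented.

RULES = {
    # symbolic background knowledge: penguins are birds that cannot fly
    ("penguin", "can_fly"): False,
    ("sparrow", "can_fly"): True,
}

def predict(animal):
    """Stand-in for a trained statistical classifier (hypothetical scores)."""
    return {"penguin": 0.7, "sparrow": 0.9}.get(animal, 0.5)

def hybrid_can_fly(animal):
    """ML prediction, overridden by symbolic knowledge when available."""
    if (animal, "can_fly") in RULES:
        return RULES[(animal, "can_fly")]   # the reasoning box wins
    return predict(animal) >= 0.5           # fall back to the learning box

print(hybrid_can_fly("penguin"))  # False: the rule overrides the 0.7 score
```

Other patterns in the paper's taxonomy wire the boxes differently (e.g. symbolic knowledge constraining training rather than filtering output); this sketch shows only the simplest post-hoc composition.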
Global semantic typing for inductive and coinductive computing
Inductive and coinductive types are commonly construed as ontological
(Church-style) types, denoting canonical data-sets such as natural numbers,
lists, and streams. For various purposes, notably the study of programs in the
context of global semantics, it is preferable to think of types as semantical
properties (Curry-style). Intrinsic theories were introduced in the late 1990s
to provide a purely logical framework for reasoning about programs and their
semantic types. We extend them here to data given by any combination of
inductive and coinductive definitions. This approach is of interest because it
fits tightly with syntactic, semantic, and proof theoretic fundamentals of
formal logic, with potential applications in implicit computational complexity
as well as extraction of programs from proofs. We prove a Canonicity Theorem,
showing that the global definition of program typing, via the usual (Tarskian)
semantics of first-order logic, agrees with their operational semantics in the
intended model. Finally, we show that every intrinsic theory is interpretable
in a conservative extension of first-order arithmetic. This means that
quantification over infinite data objects does not lead, on its own, to
proof-theoretic strength beyond that of Peano Arithmetic. Intrinsic theories
are perfectly amenable to formulas-as-types Curry-Howard morphisms, and were
used to characterize major computational complexity classes. Their extensions
described here have similar potential, which has already been applied.
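The inductive/coinductive distinction can be illustrated, loosely, with Python generators: a generator is a coinductive object, defined by how it unfolds rather than by a finite construction, and it is observed only through finite (inductive) prefixes. This is an informal analogy, not the paper's formal machinery.

```python
from itertools import islice

def nats():
    """A coinductive stream: specified by how it unfolds, never fully built."""
    n = 0
    while True:
        yield n
        n += 1

def take(n, stream):
    """An inductive observation: extract a finite prefix of the stream."""
    return list(islice(stream, n))

print(take(5, nats()))  # [0, 1, 2, 3, 4]
```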
Frequentist statistics as a theory of inductive inference
After some general remarks about the interrelation between philosophical and
statistical thinking, the discussion centres largely on significance tests.
These are defined as the calculation of p-values rather than as formal
procedures for ``acceptance'' and ``rejection.'' A number of types of null
hypothesis are described and a principle for evidential interpretation set out
governing the implications of p-values in the specific circumstances of each
application, as contrasted with a long-run interpretation. A variety of more
complicated situations are discussed in which modification of the simple
p-value may be essential.
Comment: Published at http://dx.doi.org/10.1214/074921706000000400 in the IMS
Lecture Notes--Monograph Series
(http://www.imstat.org/publications/lecnotes.htm) by the Institute of
Mathematical Statistics (http://www.imstat.org)
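As a concrete instance of a significance test understood as a p-value calculation rather than an accept/reject procedure, the following stdlib-only sketch computes a one-sided binomial p-value under a simple null hypothesis; the coin-tossing scenario and function name are illustrative, not from the paper.

```python
from math import comb

def binom_p_value(k, n, p0=0.5):
    """One-sided p-value: P(X >= k) for X ~ Binomial(n, p0) under H0."""
    return sum(comb(n, i) * p0**i * (1 - p0)**(n - i)
               for i in range(k, n + 1))

# Observing 9 heads in 10 tosses of a putatively fair coin:
p = binom_p_value(9, 10)
print(round(p, 4))  # 0.0107, i.e. 11/1024
```

The evidential reading advocated in the article interprets this number in the specific circumstances of the application, rather than as a long-run rejection rate.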
Implicit complexity for coinductive data: a characterization of corecurrence
We propose a framework for reasoning about programs that manipulate
coinductive data as well as inductive data. Our approach is based on using
equational programs, which support a seamless combination of computation and
reasoning, and using productivity (fairness) as the fundamental assertion,
rather than bi-simulation. The latter is expressible in terms of the former. As
an application to this framework, we give an implicit characterization of
corecurrence: a function is definable using corecurrence iff its productivity
is provable using coinduction for formulas in which data-predicates do not
occur negatively. This is an analog, albeit in weaker form, of a
characterization of recurrence (i.e. primitive recursion) in [Leivant, Unipolar
induction, TCS 318, 2004].
Comment: In Proceedings DICE 2011, arXiv:1201.034
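Corecurrence can be sketched informally as the scheme that unfolds a stream from a seed by repeatedly applying a step function; each element is produced in finite time, which is the productivity property the abstract takes as fundamental. The Python encoding below is an illustrative analogy, not the paper's equational formalism.

```python
from itertools import islice

def corec(h, step, seed):
    """Corecurrence scheme: the stream whose n-th element is h(step^n(seed))."""
    x = seed
    while True:
        yield h(x)      # productivity: every element arrives in finite time
        x = step(x)

# The even numbers, defined corecursively from seed 0:
evens = corec(lambda n: n, lambda n: n + 2, 0)
print(list(islice(evens, 5)))  # [0, 2, 4, 6, 8]
```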
Coinductive Formal Reasoning in Exact Real Arithmetic
In this article we present a method for formally proving the correctness of
the lazy algorithms for computing homographic and quadratic transformations --
of which field operations are special cases-- on a representation of real
numbers by coinductive streams. The algorithms work on coinductive streams of
M\"{o}bius maps and form the basis of the Edalat--Potts exact real arithmetic.
We use the machinery of the Coq proof assistant for the coinductive types to
present the formalisation. The formalised algorithms are only partially
productive, i.e., they do not output provably infinite streams for all possible
inputs. We show how to deal with this partiality in the presence of syntactic
restrictions posed by the constructive type theory of Coq. Furthermore we show
that the type theoretic techniques that we develop are compatible with the
semantics of the algorithms as continuous maps on real numbers. The resulting
Coq formalisation is available for public download.
Comment: 40 pages
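A Möbius (homographic) map x -> (a*x + b)/(c*x + d) can be represented by the integer matrix ((a, b), (c, d)), and composition of maps is then matrix multiplication; this is the algebra underlying the streams of Möbius maps in the Edalat--Potts approach. The sketch below works over exact rationals rather than coinductive streams, so it illustrates only the map algebra, not the lazy algorithms themselves.

```python
from fractions import Fraction

def mobius(m, x):
    """Apply the Möbius map represented by matrix ((a, b), (c, d)) to x."""
    (a, b), (c, d) = m
    return (a * x + b) / (c * x + d)   # exact when x is a Fraction

def compose(m, n):
    """2x2 matrix product = composition of the corresponding Möbius maps."""
    (a, b), (c, d) = m
    (e, f), (g, h) = n
    return ((a * e + b * g, a * f + b * h),
            (c * e + d * g, c * f + d * h))

m = ((1, 1), (0, 2))   # x -> (x + 1) / 2
n = ((2, 0), (0, 1))   # x -> 2x
# Composition of maps agrees with matrix multiplication:
assert mobius(compose(m, n), Fraction(3)) == mobius(m, mobius(n, Fraction(3)))
```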
Unraveling the influence of domain knowledge during simulation-based inquiry learning
This study investigated whether the mere knowledge of the meaning of variables can facilitate inquiry learning processes and outcomes. Fifty-seven college freshmen were randomly allocated to one of three inquiry tasks. The concrete task had familiar variables from which hypotheses about their underlying relations could be inferred. The intermediate task used familiar variables that did not invoke underlying relations, whereas the abstract task contained unfamiliar variables that did not allow for inference of hypotheses about relations. Results showed that concrete participants performed more successfully and efficiently than intermediate participants, who in turn were as successful and efficient as abstract participants. From these findings it was concluded that students learning by inquiry benefit little from knowledge of the meaning of variables per se. Some additional understanding of the way these variables are interrelated seems required to enhance inquiry learning processes and outcomes.
Representing and coding the knowledge embedded in texts of Health Science Web published articles
Despite the fact that electronic publishing is a common activity among scholars, electronic journals are still based on the print model and do not take full advantage of the facilities offered by the Semantic Web environment. This is a report of the results of a research project investigating the possibilities of publishing electronic journal articles both as text for human reading and in a machine-readable format that records the new knowledge contained in the article. This knowledge is identified with elements of scientific methodology, such as the problem, methodology, hypothesis, results, and conclusions. A model integrating all these elements is proposed, which makes explicit and records the knowledge embedded in the text of scientific articles as an ontology. Knowledge thus represented enables its processing by intelligent software agents. The proposed model aims to take advantage of these facilities, enabling semantic retrieval and validation of the knowledge contained in articles. To validate and enhance the model, a set of electronic journal articles was analyzed.
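The methodological elements named in the abstract (problem, methodology, hypothesis, results, conclusions) could be recorded, in a minimal sketch, as a typed record that emits Semantic Web-style triples for software agents to process. The class name, the `ex:` predicate prefix, and the triple layout below are hypothetical illustrations, not the article's actual ontology.

```python
from dataclasses import dataclass

@dataclass
class ArticleKnowledge:
    """Hypothetical record of the methodological elements of one article."""
    problem: str
    methodology: str
    hypothesis: str
    results: str
    conclusions: str

    def to_triples(self, article_uri):
        """Emit (subject, predicate, object) triples for semantic retrieval."""
        return [(article_uri, f"ex:{field}", getattr(self, field))
                for field in ("problem", "methodology", "hypothesis",
                              "results", "conclusions")]
```

A real implementation would publish these as RDF against a published vocabulary; the point of the sketch is only that each methodological element becomes an individually queryable statement rather than undifferentiated text.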