    An MPEG-7 scheme for semantic content modelling and filtering of digital video

    Part 5 of the MPEG-7 standard specifies Multimedia Description Schemes (MDS); that is, the format multimedia content models should conform to in order to ensure interoperability across multiple platforms and applications. However, the standard does not specify how the content or the associated model may be filtered. This paper proposes an MPEG-7 scheme which can be deployed for digital video content modelling and filtering. The proposed scheme, COSMOS-7, produces rich and multi-faceted semantic content models and supports a content-based filtering approach that analyses only the content relating directly to the preferred content requirements of the user. We present details of the scheme, the front-end systems used for content modelling and filtering, and our experiences with a number of users.
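    To make the filtering idea concrete, here is a minimal sketch in Python of a toy semantic content model and a filter that inspects only the facets named in the user's preferences. The facet names, the dictionary layout, and the matches function are hypothetical illustrations of the general approach, not the actual COSMOS-7 or MPEG-7 MDS format.

    # Toy sketch (hypothetical structure, not the COSMOS-7 / MPEG-7 MDS format):
    # a semantic content model for one video segment, kept as plain facet sets.
    video_model = {
        "objects": {"referee", "ball", "goalkeeper"},
        "events": {"free kick", "goal"},
        "locations": {"penalty area"},
    }

    # The user's preferred content requirements name only the facets they care about.
    user_preferences = {
        "events": {"goal"},
    }

    def matches(model, preferences):
        """Return True if every preferred facet shares at least one value with the
        model; facets the user did not mention are never analysed."""
        return all(model.get(facet, set()) & wanted
                   for facet, wanted in preferences.items())

    print(matches(video_model, user_preferences))  # True: the segment contains a goal event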

    A puzzle about rates of change

    Most of our best scientific descriptions of the world employ rates of change of some continuous quantity with respect to some other continuous quantity. For instance, in classical physics we arrive at a particle’s velocity by taking the time-derivative of its position, and we arrive at a particle’s acceleration by taking the time-derivative of its velocity. Because rates of change are defined in terms of other continuous quantities, most think that facts about some rate of change obtain in virtue of facts about those other continuous quantities. For example, on this view facts about a particle’s velocity at a time obtain in virtue of facts about how that particle’s position is changing at that time. In this paper we raise a puzzle for this orthodox reductionist account of rate of change quantities and evaluate some possible replies. We don’t decisively come down in favour of one reply over the others, though we say some things to support taking our puzzle to cast doubt on the standard view that spacetime is continuous
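    For concreteness, these are the standard definitions the abstract appeals to, written for a single spatial coordinate x(t); this is a textbook rendering, not notation taken from the paper:

    \[
      v(t) = \frac{dx}{dt} = \lim_{h \to 0} \frac{x(t+h) - x(t)}{h},
      \qquad
      a(t) = \frac{dv}{dt} = \lim_{h \to 0} \frac{v(t+h) - v(t)}{h}.
    \]

    On the reductionist view at issue, the facts recorded on the right-hand sides, namely the pattern of positions (or velocities) at neighbouring times, are what make the velocity and acceleration facts obtain.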

    Kind Instantiation and Kind Change - A Problem for Four-Category Ontology

    In Lowe’s Four-Category Ontology, instantiation is a basic formal ontological relation between particulars (objects, modes) and their kinds (kinds, attributes). Therefore, instantiation must be considered a metaphysically necessary relation, which also rules out the metaphysical possibility of kind change. Nevertheless, according to Lowe, objects obtain their identity conditions at a more general level than that of specific natural kinds, which allows for kind change. There also seem to be actual examples of kind change. The advocate of Four-Category Ontology is obliged to resolve the tension between these mutually incompatible claims. In this article, we argue that the only viable option for the advocate of Four-Category Ontology is to bite the bullet and stick to the necessity of each most specific natural kind to the object instantiating it. As a major drawback, the four-category ontologist does not have any credible means to allow for kind change or for the determination of identity conditions at a more general level.

    The Proscriptive Principle and Logics of Analytic Implication

    The analogy between inference and mereological containment goes at least back to Aristotle, whose discussion in the Prior Analytics motivates the validity of the syllogism by way of talk of parts and wholes. On this picture, the application of syllogistic is merely the analysis of concepts, a term that presupposes—through the root ἀνά + λύω—a mereological background. In the 1930s, such considerations led William T. Parry to attempt to codify this notion of logical containment in his system of analytic implication AI. Parry’s original system AI was later expanded to the system PAI. The hallmark of Parry’s systems—and of what may be thought of as containment logics or Parry systems in general—is a strong relevance property called the ‘Proscriptive Principle’ (PP), described by Parry as the thesis that no formula with analytic implication as main relation holds universally if it has a free variable occurring in the consequent but not the antecedent. This type of proscription is on its face justified, as the presence of a novel parameter in the consequent corresponds to the introduction of new subject matter. The plausibility of the thesis that the content of a statement is related to its subject matter thus appears also to support the validity of the formal principle. Primarily due to the perception that Parry’s formal systems were intended to accurately model Kant’s notion of an analytic judgment, Parry’s deductive systems—and the suitability of the Proscriptive Principle in general—were met with severe criticism. While Anderson and Belnap argued that Parry’s criterion failed to account for a number of prima facie analytic judgments, others—such as Sylvan and Brady—argued that the utility of the criterion was impeded by its reliance on a ‘syntactical’ device. But these arguments are restricted to Parry’s work qua exegesis of Kant and fail to take into account the breadth of applications in which the Proscriptive Principle emerges. It is the goal of the present work to explore themes related to deductive systems satisfying one form of the Proscriptive Principle or other, with a special emphasis placed on the rehabilitation of their study to some degree. The structure of the dissertation is as follows.
    In Chapter 2, we identify and develop the relationship between Parry-type deductive systems and the field of ‘logics of nonsense.’ Of particular importance is Dmitri Bochvar’s ‘internal’ nonsense logic Σ0, and we observe that two ⊢-Parry subsystems of Σ0 (Harry Deutsch’s Sfde and Frederick Johnson’s RC) can be considered to be the products of particular ‘strategies’ of eliminating problematic inferences from Bochvar’s system.
    The material of Chapter 3 considers Kit Fine’s program of state space semantics in the context of Parry logics. Recently, Fine—who had already provided the first intuitive semantics for Parry’s PAI—has offered a formal model of truthmaking (and falsemaking) that provides one of the first natural semantics for Richard B. Angell’s logic of analytic containment AC, itself a ⊢-Parry system. After discussing the relationship between state space semantics and nonsense, we observe that Fabrice Correia’s weaker framework—introduced as a semantics for a containment logic weaker than AC—tacitly endorses an implausible feature of allowing hypernonsensical statements. By modelling Correia’s containment logic within the stronger setting of Fine’s semantics, we are able to retain Correia’s intuitions about factual equivalence without such a commitment. As a further application, we observe that Fine’s setting can resolve some ambiguities in Greg Restall’s own truthmaker semantics.
    In Chapter 4, we consider interpretations of disjunction that accord with the characteristic failure of Addition, in which the evaluation of a disjunction A ∨ B requires not only the truth of one disjunct, but also that both disjuncts satisfy some further property. In the setting of computation, such an analysis requires the existence of some procedure tasked with ensuring the satisfaction of this property by both disjuncts. This observation leads to a computational analysis of the relationship between Parry logics and logics of nonsense in which the semantic category of ‘nonsense’ is associated with catastrophic faults in computer programs. In this spirit, we examine semantics for several ⊢-Parry logics in terms of the successful execution of certain types of programs and the consequences of extending this analysis to dynamic logic and constructive logic.
    Chapter 5 considers these faults in the particular case in which Nuel Belnap’s ‘artificial reasoner’ is unable to retrieve the value assigned to a variable. This leads not only to a natural interpretation of Graham Priest’s semantics for the ⊢-Parry system S⋆fde but also to a novel, many-valued semantics for Angell’s AC, the completeness of which is proven by establishing a correspondence with Correia’s semantics for AC. These many-valued semantics have the additional benefit of allowing us to apply the material in Chapter 2 to the case of AC to define intensional extensions of AC in the spirit of Parry’s PAI.
    One particular instance of the type of disjunction central to Chapter 4 is Melvin Fitting’s cut-down disjunction. Chapter 6 examines cut-down operations in more detail and provides bilattice and trilattice semantics for the ⊢-Parry systems Sfde and AC in the style of Ofer Arieli and Arnon Avron’s logical bilattices. The elegant connection between these systems and logical multilattices supports the fundamentality and naturalness of these logics and, additionally, allows us to extend the epistemic interpretation of bilattices in the tradition of artificial intelligence to these systems.
    Finally, the correspondence between the present many-valued semantics for AC and those of Correia is revisited in Chapter 7. The technique that plays an essential role in Chapter 4 is used to characterize a wide class of first-degree calculi intermediate between AC and classical logic in Correia’s setting. This correspondence allows the correction of an incorrect characterization of classical logic given by Correia and leads to the question of how to characterize hybrid systems extending Angell’s AC∗. Lastly, we consider whether this correspondence aids in providing an interpretation for Correia’s first semantics for AC.
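    For reference, the Proscriptive Principle quoted above is usually rendered as a variable-inclusion condition on provable implications; in the rendering below, var(C) stands for the set of propositional variables occurring in C (the notation is assumed here, not taken from the dissertation):

    \[
      \vdash A \to B \quad \Longrightarrow \quad \mathrm{var}(B) \subseteq \mathrm{var}(A),
    \]

    that is, an analytic implication holds universally only if every variable of the consequent already occurs in the antecedent.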

    On future in commands

    [Extract] In many languages of the world, the status of 'future' is different from that of present and of past. Past events can be conceived as known through observation, inference, assumption, or report. Statements about the future may involve speculation, prediction, guesses, and so on. In some languages future refers to the location of an event in time, and can be considered a grammatical tense, on a par with past (and also present). In others, future time can be expressed through a plethora of modalities (including intentional and potential forms) and irrealis (see Dixon 2012: 22-8). The expression of future tense and of future time may interact with other categories in grammar, along the lines of dependencies between grammatical systems as outlined in Aikhenvald and Dixon (1998)

    Identity and Aboutness

    This paper develops a theory of propositional identity which distinguishes necessarily equivalent propositions that differ in subject-matter. Rather than forming a Boolean lattice as in extensional and intensional semantic theories, the space of propositions forms a non-interlaced bilattice. After motivating a departure from tradition by way of a number of plausible principles for subject-matter, I will provide a Finean state semantics for a novel theory of propositions, presenting arguments against the convexity and nonvacuity constraints which Fine (2016, 2017a,b) introduces. I will then move to compare the resulting logic of propositional identity (PI) with Correia’s (2016) logic of generalised identity (GI), as well as the first degree fragment of Angell’s (1989) logic of analytic containment (AC). The paper concludes by extending PI to include axioms and rules for a subject-matter operator, providing a much broader theory of subject-matter than the principles with which I will begin
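    As a gloss on the lattice-theoretic claim: in the sense standard in the bilattice literature, a bilattice is a set carrying two lattice orders, a truth order \(\leq_t\) (with meet \(\wedge\) and join \(\vee\)) and an information order \(\leq_i\) (with meet \(\otimes\) and join \(\oplus\)), and it is interlaced when each order's operations are monotone with respect to the other order, i.e.

    \[
      x \leq_t y \;\Rightarrow\; x \otimes z \leq_t y \otimes z \ \text{ and } \ x \oplus z \leq_t y \oplus z,
      \qquad
      x \leq_i y \;\Rightarrow\; x \wedge z \leq_i y \wedge z \ \text{ and } \ x \vee z \leq_i y \vee z.
    \]

    The abstract's claim is that the space of propositions, once necessarily equivalent propositions with different subject-matters are distinguished, still carries both orders but fails this interlacing condition; the formulation here is the textbook one, not notation taken from the paper.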

    Lazy Model Expansion: Interleaving Grounding with Search

    Finding satisfying assignments for the variables involved in a set of constraints can be cast as a (bounded) model generation problem: search for (bounded) models of a theory in some logic. The state-of-the-art approach for bounded model generation for rich knowledge representation languages, like ASP, FO(.) and Zinc, is ground-and-solve: reduce the theory to a ground or propositional one and apply a search algorithm to the resulting theory. An important bottleneck is the blowup of the size of the theory caused by the reduction phase. Lazily grounding the theory during search is a way to overcome this bottleneck. We present a theoretical framework and an implementation in the context of the FO(.) knowledge representation language. Instead of grounding all parts of a theory, justifications are derived for some parts of it. Given a partial assignment for the grounded part of the theory and valid justifications for the formulas of the non-grounded part, the justifications provide a recipe to construct a complete assignment that satisfies the non-grounded part. When a justification for a particular formula becomes invalid during search, a new one is derived; if that fails, the formula is split into a part to be grounded and a part that can be justified. The theoretical framework captures existing approaches for tackling the grounding bottleneck, such as lazy clause generation and grounding-on-the-fly, and presents a generalization of the 2-watched literal scheme. We present an algorithm for lazy model expansion and integrate it into a model generator for FO(ID), a language extending first-order logic with inductive definitions. The algorithm is implemented as part of the state-of-the-art FO(ID) Knowledge-Base System IDP. Experimental results illustrate the power and generality of the approach.
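    The grounding blowup and the role of justifications can be illustrated with a small Python sketch; the constraint, the domain size, and the data structures below are invented for illustration and are not the IDP / FO(ID) lazy model expansion algorithm itself.

    # Illustrative sketch only (not the IDP / FO(ID) algorithm).
    # Hypothetical constraint: forall x, y in D: edge(x, y) -> reach(x, y).
    from itertools import product

    domain = range(50)                         # |D| = 50
    true_edges = {(0, 1), (1, 2), (2, 3)}      # toy partial interpretation of edge/2

    def eager_grounding():
        """Ground-and-solve style: emit every ground instance up front, |D|^2 clauses."""
        return [(("edge", x, y), ("reach", x, y)) for x, y in product(domain, repeat=2)]

    def lazy_grounding(assignment):
        """Ground an instance only when its justification is lost, i.e. when edge(x, y)
        is true in the partial assignment, so the implication is no longer satisfied
        trivially by a false antecedent."""
        return [(("edge", x, y), ("reach", x, y))
                for x, y in product(domain, repeat=2)
                if assignment.get(("edge", x, y), False)]

    partial_assignment = {("edge", x, y): True for (x, y) in true_edges}
    print(len(eager_grounding()))                  # 2500 ground clauses
    print(len(lazy_grounding(partial_assignment))) # 3 ground clauses actually needed so far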