Farewell to Suppression-Freedom
Val Plumwood and Richard Sylvan argued, from their joint paper The Semantics of First Degree Entailment (Routley and Routley in Noûs 6(4):335–359, 1972, https://doi.org/10.2307/2214309) onward, that the variable sharing property is a mere consequence of a good entailment relation; indeed, they viewed it as merely a negative test of adequacy for such a relation, the property itself being a rather philosophically barren concept. Such a relation is instead to be analyzed as a sufficiency relation free of any form of premise suppression. Suppression of premises therefore gained center stage. Despite this, however, no serious attempt was ever made at analyzing the concept. This paper shows that their suggestions for how to understand it, whether as the Anti-Suppression Principle or as the Joint Force Principle, turn out to yield properties strictly weaker than variable sharing. A suggestion is given for how to understand some of their use of the notion of suppression which is clearly not in line with these two principles, and their arguments to the effect that the Anderson and Belnap logics T, E and R are suppressive are shown to be both technically and philosophically wanting. Suppression-freedom, it is argued, cannot do the job Plumwood and Sylvan intended it to do.
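The variable sharing property discussed above is purely syntactic and easy to test mechanically: a conditional A → B passes iff its antecedent and consequent share at least one propositional variable. A minimal sketch, assuming an illustrative tuple-based formula representation that is not drawn from the paper:

```python
# Minimal sketch: checking the variable sharing property for a formula
# A -> B. Formulas are nested tuples such as ('imp', A, B), ('or', A, B),
# ('not', A), with atoms as strings -- an illustrative representation.

def variables(formula):
    """Collect the propositional variables occurring in a formula."""
    if isinstance(formula, str):          # an atom such as 'p'
        return {formula}
    _, *subformulas = formula             # drop the connective tag
    return set().union(*(variables(f) for f in subformulas))

def shares_variable(formula):
    """True iff antecedent and consequent share a variable --
    the variable sharing property for the main conditional."""
    connective, antecedent, consequent = formula
    assert connective == 'imp'
    return bool(variables(antecedent) & variables(consequent))

# p -> (p or q) shares 'p'; p -> q shares nothing.
print(shares_variable(('imp', 'p', ('or', 'p', 'q'))))   # True
print(shares_variable(('imp', 'p', 'q')))                # False
```

The simplicity of this check is one way to read the paper's point: the property is a cheap negative filter, not an analysis of entailment.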
Computers and relevant logic : a project in computing matrix model structures for propositional logics
I present and discuss four classes of algorithm designed as solutions to the problem of generating matrix representations of model structures for some non-classical propositional logics. I then go on to survey the output from implementations of these algorithms and finally exhibit some logical investigations suggested by that output.

All four algorithms traverse a search tree depth-first. In the case of the first and fourth methods the tree is fixed by imposing a lexicographic order on possible matrices, while the second and third create their search tree dynamically as the job progresses. The first algorithm is a simple "backtrack" with some pruning of the tree in response to refutations of possible matrices. The fourth, the most time-efficient we have, maximises the amount of pruning while keeping the same basic form. The second, which uses a large number of special properties of the logics in question, and so requires some logical and algebraic knowledge on the part of the programmer, finds the matrices at the tips of branches only, while the third, due to P.A. Pritchard, is far easier to program and tests a matrix at every node of the search tree.
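The basic shape of the first ("simple backtrack") method can be sketched in a few lines: fill the entries of a candidate matrix in lexicographic order and keep the matrices that survive the tests. The tiny two-valued setting below and its two test conditions (A → A designated, modus ponens preserving designation) are illustrative assumptions chosen for brevity, not the systems studied in the thesis, and pruning of partial matrices is omitted:

```python
# Illustrative sketch of depth-first enumeration of implication matrices
# in lexicographic order. SIZE, DESIGNATED, and the two test conditions
# are assumptions for the sake of a small runnable example.

from itertools import product

SIZE = 2                 # truth values 0, 1
DESIGNATED = {1}         # 1 is the sole designated value

def validates(imp):
    """Check a complete candidate matrix against the test conditions."""
    for a in range(SIZE):
        if imp[a][a] not in DESIGNATED:          # axiom A -> A
            return False
    for a, b in product(range(SIZE), repeat=2):  # rule modus ponens
        if a in DESIGNATED and imp[a][b] in DESIGNATED and b not in DESIGNATED:
            return False
    return True

def search():
    """Depth-first traversal: each node chooses the next matrix entry."""
    models = []
    def extend(entries):
        if len(entries) == SIZE * SIZE:          # a tip of the tree
            imp = [entries[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]
            if validates(imp):
                models.append(imp)
            return
        for value in range(SIZE):                # lexicographic order
            extend(entries + [value])
    extend([])
    return models

print(len(search()))   # 2 matrices survive in this toy setting
```

The real algorithms differ in where the tests fire: refuting partially filled matrices (methods one and four) prunes whole subtrees rather than rejecting individual tips.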
The logics with which I am concerned are in the "relevant" group first seriously investigated by A.R. Anderson and N.D. Belnap (see their Entailment: The Logic of Relevance and Necessity, 1975). The most surprising observation in my preliminary survey of the numbers of matrices validating such systems is that the typical models are not much like the models normally taken as canonical for the logics. In particular, the proportion of inconsistent models (validating some cases of the scheme 'A & ~A') is much higher than might have been expected.
Among the logical investigations already suggested by the quasi-empirical data now available in the form of matrices are some work on the system R-W, including my theorem, proved in chapter 2.3, that with the law of excluded middle it suffices to trivialise naive set theory, and the little-noticed subject of Ackermann constants (sentential constants) in these logics. The formula which collapses naive set theory in R-W plus A v ~A is the most damaging set-theoretic antinomy known. The theorem that there are at least 3088 Ackermann constants in the logic R (chapter 2.4) could not reasonably have been proved without the aid of a computer.
My major conclusion is that this work on applications of computers in logical research has reached a point where we are able not only to relieve logicians of some drudgery, but to suggest theorems and insights of new and possibly important kinds.
The Proscriptive Principle and Logics of Analytic Implication
The analogy between inference and mereological containment goes back at least to Aristotle, whose discussion in the Prior Analytics motivates the validity of the syllogism by way of talk of parts and wholes. On this picture, the application of syllogistic is merely the analysis of concepts, a term that presupposes—through the root ἀνά + λύω—a mereological background.
In the 1930s, such considerations led William T. Parry to attempt to codify this notion of logical containment in his system of analytic implication AI. Parry’s original system AI was later expanded to the system PAI. The hallmark of Parry’s systems—and of what may be thought of as containment logics or Parry systems in general—is a strong relevance property called the ‘Proscriptive Principle’ (PP) described by Parry as the thesis that: No formula with analytic implication as main relation holds universally if it has a free variable occurring in the consequent but not the antecedent.
This type of proscription is on its face justified, as the presence of a novel parameter in the consequent corresponds to the introduction of new subject matter. The plausibility of the thesis that the content of a statement is related to its subject matter thus appears also to support the validity of the formal principle.
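The Proscriptive Principle is likewise a mechanically checkable syntactic condition: every variable of the consequent must already occur in the antecedent. A minimal sketch, assuming an illustrative tuple-based formula representation not taken from Parry's work:

```python
# Minimal sketch: checking the Proscriptive Principle (PP) for a formula
# with analytic implication as main connective. Formulas are nested
# tuples, e.g. ('imp', A, B), with atoms as strings -- an assumption
# made for illustration.

def variables(formula):
    """Collect the propositional variables occurring in a formula."""
    if isinstance(formula, str):      # an atom such as 'p'
        return {formula}
    _, *subformulas = formula
    return set().union(*(variables(f) for f in subformulas))

def satisfies_pp(formula):
    """True iff no variable of the consequent is foreign to the
    antecedent -- Parry's Proscriptive Principle for this formula."""
    connective, antecedent, consequent = formula
    assert connective == 'imp'
    return variables(consequent) <= variables(antecedent)

# (p and q) -> p introduces no new subject matter;
# p -> (p or q) introduces the novel parameter q.
print(satisfies_pp(('imp', ('and', 'p', 'q'), 'p')))   # True
print(satisfies_pp(('imp', 'p', ('or', 'p', 'q'))))    # False
```

Note the contrast with mere variable sharing: PP demands containment of the consequent's variables, not just overlap, which is why Addition fails in Parry systems.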
Primarily due to the perception that Parry’s formal systems were intended to accurately model Kant’s notion of an analytic judgment, Parry’s deductive systems—and the suitability of the Proscriptive Principle in general—were met with severe criticism. While Anderson and Belnap argued that Parry’s criterion failed to account for a number of prima facie analytic judgments, others—such as Sylvan and Brady—argued that the utility of the criterion was impeded by its reliance on a ‘syntactical’ device.
But these arguments are restricted to Parry's work qua exegesis of Kant and fail to take into account the breadth of applications in which the Proscriptive Principle emerges. It is the goal of the present work to explore themes related to deductive systems satisfying one form of the Proscriptive Principle or another, with a special emphasis on rehabilitating their study. The structure of the dissertation is as follows: In Chapter 2, we identify and develop the relationship between Parry-type deductive systems and the field of 'logics of nonsense.' Of particular importance is Dmitri Bochvar's 'internal' nonsense logic Σ0, and we observe that two ⊢-Parry subsystems of Σ0 (Harry Deutsch's Sfde and Frederick Johnson's RC) can be considered to be the products of particular 'strategies' of eliminating problematic inferences from Bochvar's system. The material of Chapter 3 considers Kit Fine's program of state space semantics in the context of Parry logics. Recently, Fine—who had already provided the first intuitive semantics for Parry's PAI—has offered a formal model of truthmaking (and falsemaking) that provides one of the first natural semantics for Richard B. Angell's logic of analytic containment AC, itself a ⊢-Parry system. After discussing the relationship between state space semantics and nonsense, we observe that Fabrice Correia's weaker framework—introduced as a semantics for a containment logic weaker than AC—tacitly endorses an implausible feature of allowing hypernonsensical statements. By modelling Correia's containment logic within the stronger setting of Fine's semantics, we are able to retain Correia's intuitions about factual equivalence without such a commitment. As a further application, we observe that Fine's setting can resolve some ambiguities in Greg Restall's own truthmaker semantics.
In Chapter 4, we consider interpretations of disjunction that accord with the characteristic failure of Addition in which the evaluation of a disjunction A ∨ B requires not only the truth of one disjunct, but also that both disjuncts satisfy some further property. In the setting of computation, such an analysis requires the existence of some procedure tasked with ensuring the satisfaction of this property by both disjuncts. This observation leads to a computational analysis of the relationship between Parry logics and logics of nonsense in which the semantic category of ‘nonsense’ is associated with catastrophic faults in computer programs. In this spirit, we examine semantics for several ⊢-Parry logics in terms of the successful execution of certain types of programs and the consequences of extending this analysis to dynamic logic and constructive logic. Chapter 5 considers these faults in the particular case in which Nuel Belnap’s ‘artificial reasoner’ is unable to retrieve the value assigned to a variable. This leads not only to a natural interpretation of Graham Priest’s semantics for the ⊢-Parry system S⋆fde but also a novel, many-valued semantics for Angell’s AC, completeness of which is proven by establishing a correspondence with Correia’s semantics for AC. These many-valued semantics have the additional benefit of allowing us to apply the material in Chapter 2 to the case of AC to define intensional extensions of AC in the spirit of Parry’s PAI. One particular instance of the type of disjunction central to Chapter 4 is Melvin Fitting’s cut-down disjunction. Chapter 6 examines cut-down operations in more detail and provides bilattice and trilattice semantics for the ⊢-Parry systems Sfde and AC in the style of Ofer Arieli and Arnon Avron’s logical bilattices. 
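The failure of Addition described above can be made concrete with a three-valued table in which a 'nonsense' value swamps disjunction, in the style of Bochvar's internal ("weak") connectives; the value names below are illustrative, and the sketch is only meant to show the shape of the phenomenon, not any one of the systems studied here:

```python
# Sketch of a disjunction for which the truth of one disjunct is not
# enough: both disjuncts must also be meaningful. Reading the third
# value N as 'nonsense' (a catastrophic fault in a program), any
# nonsensical disjunct spoils the whole disjunction, so Addition
# (from A infer A or B) fails when B is nonsense.

T, F, N = 'T', 'F', 'N'

def disj(a, b):
    """Weak-Kleene-style disjunction: nonsense is infectious."""
    if N in (a, b):
        return N
    return T if T in (a, b) else F

# A true disjunct no longer guarantees a true disjunction:
print(disj(T, F))   # T
print(disj(T, N))   # N -- the counterexample to Addition
```

On the computational reading sketched in the chapter, the extra condition on both disjuncts corresponds to a procedure that must successfully evaluate each disjunct, and N marks the case where that evaluation crashes.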
The elegant connection between these systems and logical multilattices supports the fundamentality and naturalness of these logics and, additionally, allows us to extend the epistemic interpretation of bilattices in the tradition of artificial intelligence to these systems. The correspondence between the present many-valued semantics for AC and those of Correia is revisited in Chapter 7. The technique that plays an essential role in Chapter 4 is used to characterize a wide class of first-degree calculi intermediate between AC and classical logic in Correia's setting. This correspondence allows the correction of a mistaken characterization of classical logic given by Correia and leads to the question of how to characterize hybrid systems extending Angell's AC∗. Finally, we consider whether this correspondence aids in providing an interpretation of Correia's first semantics for AC.
Tartu Ülikooli toimetised (Transactions of the University of Tartu). Tööd semiootika alalt (Works on Semiotics). 1964–1992. ISSN 0259-4668
http://www.ester.ee/record=b1331700*es
The Music Sound
A guide for music: compositions, events, forms, genres, groups, history, industry, instruments, language, live music, musicians, songs, musicology, techniques, terminology, theory, music video.
Music is a human activity which involves structured and audible sounds and which is used for artistic or aesthetic, entertainment, or ceremonial purposes.
The traditional or classical European aspects of music often listed are those elements given primacy in European-influenced classical music: melody, harmony, rhythm, tone color/timbre, and form. A more comprehensive list is given by stating the aspects of sound: pitch, timbre, loudness, and duration.
Common terms used to discuss particular pieces include melody, which is a succession of notes heard as some sort of unit; chord, which is a simultaneity of notes heard as some sort of unit; chord progression, which is a succession of chords (simultaneity succession); harmony, which is the relationship between two or more pitches; counterpoint, which is the simultaneity and organization of different melodies; and rhythm, which is the organization of the durational aspects of music.