SPIRE Map-Making Test Report
The photometer section of SPIRE is one of the key instruments on board
Herschel. Its legacy depends very much on how well the scan-map observations
it carried out during the Herschel mission can be converted to high-quality
maps. In order to make a comprehensive assessment of the current status
of SPIRE map-making, as well as to provide guidance for future development of
the SPIRE scan-map data reduction pipeline, we carried out a test campaign on
SPIRE map-making. In this report, we present the results of the tests in this
campaign.
Comment: This document has an executive summary, 6 chapters, and 102 pages.
More information can be found at:
https://nhscsci.ipac.caltech.edu/sc/index.php/Spire/SPIREMap-MakingTest201
Truth-Makers
During the realist revival in the early years of this century, philosophers of various persuasions were concerned to investigate the ontology of truth. That is, whether or not they viewed truth as a correspondence, they were interested in the extent to which one needed to assume the existence of entities serving some role in accounting for the truth of sentences. Certain of these entities, such as the Sätze an sich of Bolzano, the Gedanken of Frege, or the propositions of Russell and Moore, were conceived as the bearers of the properties of truth and falsehood. Some thinkers however, such as Russell, Wittgenstein in the Tractatus, and Husserl in the Logische Untersuchungen, argued that instead of, or in addition to, truth-bearers, one must assume the existence of certain entities in virtue of which sentences and/or propositions are true. Various names were used for these entities, notably 'fact', 'Sachverhalt', and 'state of affairs'. (1) In order not to prejudge the suitability of these words we shall initially employ a more neutral terminology, calling any entities which are candidates for this role truth-makers
Frege's logicism
In this paper, I provide an interpretation of Frege's logicist project, drawing a connection between it and his idiosyncratic view of truth
Thisnesses, Propositions, and Truth
Presentists, who believe that only present objects exist, should accept a thisness ontology, since it can do considerable work in defence of presentism. In this paper, I propose a version of presentism that involves thisnesses of past and present entities, and I argue that this view solves important problems facing standard versions of presentism
Deflating Deflationary Truthmaking
In this paper we confront a challenge to truthmaker theory that is analogous to the objections raised by deflationists against substantive theories of truth. Several critics of truthmaker theory espouse a 'deflationary' attitude about truthmaking, though it has not been clearly presented as such. Our goal is to articulate and then object to the underlying rationale behind deflationary truthmaking. We begin by developing the analogy between deflationary truth and deflationary truthmaking, and then show how the latter can be found in the work of Dodd, Hornsby, Schnieder, Williamson, and others. These philosophers believe that the ambitions of truthmaker theory are easily satisfied, without recourse to ambitious ontological investigation; hence the analogy with deflationary truth. We argue that the deflationists' agenda fails: there is no coherent deflationary theory of truthmaking. Truthmaking, once deflated, fails to address the questions at the heart of truthmaking investigation. Truthmaking cannot be had on the cheap
Idealization and Many Aims
In this paper, I first outline the view developed in my recent book on the role of idealization in scientific understanding. I discuss how this view leads to the recognition of a number of kinds of variability among scientific representations, including variability introduced by the many different aims of scientific projects. I then argue that the role of idealization in securing understanding distances understanding from truth, but that this understanding nonetheless gives rise to scientific knowledge. This discussion will clarify how my view relates to three other recent books on understanding by Henk de Regt, Catherine Elgin, and Kareem Khalifa
The Truth about Sherlock Holmes
According to possibilism, or non-actualism, fictional characters are possible individuals. Possibilist accounts of fiction not only assign the intuitively correct truth-conditions to sentences in a fiction, but also have the potential to provide powerful explanatory models for a wide range of phenomena associated with fiction (though these two aspects of possibilism are, I argue, crucially distinct). Apart from the classic defense by David Lewis, the idea of modeling fiction in terms of possible worlds has been widely criticized. In this article, I provide a defense of a possibilist account against some lines of criticism. To do so, I assume that names for fictional characters are directly referential and a possible-worlds model that accommodates transworld identity. Against this background, I argue, it is possible to construct an elegant model of fictional discourse using familiar models of information exchange in ordinary discourse, and I sketch how this model can be used to i) make a natural distinction between fictional and counterfactual discourse, ii) account for creativity, and iii) sustain a natural definition of truth-in-fiction that avoids certain familiar objections to possibilism. Though I set aside questions about the metaphysical commitments of a possible-worlds interpretation here, there is accordingly reason to think that the battle over possibilist treatments of fiction will have to be fought over metaphysical foundations rather than technical shortcomings
Presentism without Truth-Makers
We construct a presentist semantics on which there are no truth-makers for past and future tensed statements. The semantics is not an expressivist or projectivist one, and is not susceptible to the semantical difficulties that confront such theories. We discuss how the approach handles some standard concerns with presentism
Misunderstanding Models in Environmental and Public Health Regulation
Computational models are fundamental to environmental regulation, yet their capabilities tend to be misunderstood by policymakers. Rather than rely on models to illuminate dynamic and uncertain relationships in natural settings, policymakers too often use models as 'answer machines.' This fundamental misperception that models can generate decisive facts leads to a perverse negative feedback loop that begins with policymaking itself and radiates into the science of modeling and into regulatory deliberations, where participants can exploit the misunderstanding in strategic ways. This paper documents the pervasive misperception of models as truth machines in U.S. regulation and the multi-layered problems that result from this misunderstanding. The paper concludes with a series of proposals for making better use of models in environmental policy analysis.
The Kay Bailey Hutchison Center for Energy, Law, and Business