
    What is truth?

    I defend the correspondence theory of truth, according to which a statement’s truth consists in a relation of correspondence with extralinguistic fact. There are well-known objections to this view, which I consider and rebut, and also important rival accounts, principal among which are so-called deflationist theories and epistemic theories. Epistemic theories relate the concept of truth to our state of knowledge, but fail, I argue, to respect the crucial distinction between a criterion of truth and the meaning of truth: the view that one cannot do semantics, or metaphysics, without addressing epistemic issues is rejected by this work. Against epistemic theories, I illustrate how truth is independent of epistemic considerations. Deflationism is the more popular of the rival accounts and has gained considerable momentum over the past two decades. It is therefore dealt with in greater detail by this work. Deflationist theories exploit the paradigmatic ‘“Snow is white” is true iff snow is white’ biconditional to argue for an insubstantialist account, according to which truth is conservative with respect to non-semantical facts. On this view, truth’s raison d'ĂȘtre is merely to perform the useful expressive function of generalising over possibly infinite sets of assertions. Against deflationist theories, I claim that the work done by Jeffrey Ketland and Stewart Shapiro conclusively demonstrates how truth is informationally additive over non-semantic facts, while deflationism itself is also an excessively impoverishing theory, inadequate to the tasks it purports to accomplish. This work also defends the thesis that Alfred Tarski’s well-known theory of truth is an authentic correspondence theory. To say this is to say that the clauses of a Tarskian truth-definition can be interpreted in terms of a relation of correspondence that holds between true sentences and the states of affairs they describe. I provide a precise account of what the correspondence in question consists in, claiming that true sentences are homomorphic images of facts, i.e. a true sentence represents, in a form-preserving manner, the facts that make it true. This gives precise expression to Wittgenstein’s thesis that true sentences picture the world.
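
    As a point of orientation only, the following LaTeX fragment sketches the paradigmatic biconditional and one compositional clause of a Tarski-style truth definition referred to above; it is an illustrative reconstruction, not a formulation taken from the thesis itself.

        % Illustrative sketch (assumes amsmath and amssymb): the disquotational
        % T-schema and one compositional clause of a Tarski-style truth definition.
        \begin{align*}
          &\text{T-schema:}             && \ulcorner\varphi\urcorner \text{ is true} \iff \varphi \\
          &\text{Instance:}             && \text{``Snow is white'' is true} \iff \text{snow is white} \\
          &\text{Compositional clause:} && \ulcorner\varphi \land \psi\urcorner \text{ is true} \iff
              \ulcorner\varphi\urcorner \text{ is true and } \ulcorner\psi\urcorner \text{ is true}
        \end{align*}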

    Truth, objectivity and subjectivity in accounting

    The central thesis defended here is that we can have truth and objectivity in accounting. We do not contend that this potential is presently realized: on the contrary, we argue that certain contradictions immanent to capitalism give rise in late modernity to crisis tendencies in financial accounting as a way of knowing - epistemological crisis. We do contend that accounting's tendencies to epistemological crises can, at least in theory, be overcome. We begin to defend this view by considering accounting as an essentially descriptive activity. The account given by the philosopher Donald Davidson of the very possibility of knowledge is used to justify the view that intersubjectivity is all the foundation we need, or can have, for objectivity, and to defend our claim that we can have accounting knowledge, that is, true accounts/descriptions of an objective and intersubjectively accessible public world. The defence here is against those theorists, including those inspired by certain strands of the phenomenological, (post)structuralist and hermeneutic traditions, who would deny the possibility of any such objectivity in accounting. Using an analysis of the history and debate surrounding the issue of accounting for deferred tax in the United Kingdom (UK), we endeavour to locate accounting in terms of the dichotomy the philosopher Bernard Williams draws between science and ethics. We find that the descriptive and normative are inextricably entangled in accounting concepts in much the same way as they are entangled in thick ethical concepts such as 'chastity' or 'courage'. We recognise that the descriptive aspect of accounting cannot be neatly distinguished from the normative and dealt with separately. Furthermore, following Williams, we argue that difficulties associated with the objective validation of the normative dimension of thick accounting concepts render knowledge held under them vulnerable to destruction by reflection.

    Movement Analytics: Current Status, Application to Manufacturing, and Future Prospects from an AI Perspective

    Data-driven decision making is becoming an integral part of manufacturing companies. Data is collected and commonly used to improve efficiency and to produce high-quality items for customers. IoT-based and other forms of object tracking are an emerging tool for collecting movement data of objects/entities (e.g. human workers, moving vehicles, trolleys) over space and time. Movement data can provide valuable insights, such as process bottlenecks, resource utilization and effective working time, that can be used for decision making and improving efficiency. Turning movement data into valuable information for industrial management and decision making requires analysis methods. We refer to this process as movement analytics. The purpose of this document is to review the current state of work for movement analytics both in manufacturing and more broadly. We survey relevant work from both a theoretical perspective and an application perspective. From the theoretical perspective, we put an emphasis on useful methods from two research areas: machine learning, and logic-based knowledge representation. We also review their combinations in view of movement analytics, and we discuss promising areas for future development and application. Furthermore, we touch on constraint optimization. From an application perspective, we review applications of these methods to movement analytics in a general sense and across various industries. We also describe currently available commercial off-the-shelf products for tracking in manufacturing, and we give an overview of the main concepts of digital twins and their applications.
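
    To make the notion of movement analytics more concrete, here is a minimal, self-contained Python sketch of one such analysis: estimating how long each tracked entity dwells in each zone of a facility from timestamped position records. The record layout (entity_id, timestamp, zone) and the example values are hypothetical and are not drawn from the survey or from any particular tracking product.

        # Illustrative sketch only: per-zone dwell times from movement records.
        from collections import defaultdict
        from datetime import datetime

        def dwell_times(records):
            """records: iterable of (entity_id, iso_timestamp, zone) tuples."""
            per_entity = defaultdict(list)
            for entity_id, ts, zone in records:
                per_entity[entity_id].append((datetime.fromisoformat(ts), zone))

            totals = defaultdict(float)  # (entity_id, zone) -> seconds spent
            for entity_id, track in per_entity.items():
                track.sort()  # order each entity's observations by time
                for (t0, zone), (t1, _) in zip(track, track[1:]):
                    # Attribute the interval between consecutive observations
                    # to the zone observed at the start of the interval.
                    totals[(entity_id, zone)] += (t1 - t0).total_seconds()
            return dict(totals)

        example = [
            ("worker_1", "2024-01-01T08:00:00", "assembly"),
            ("worker_1", "2024-01-01T08:20:00", "packing"),
            ("worker_1", "2024-01-01T08:50:00", "assembly"),
        ]
        print(dwell_times(example))
        # {('worker_1', 'assembly'): 1200.0, ('worker_1', 'packing'): 1800.0}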

    Organising knowledge in the age of the semantic web: a study of the commensurability of ontologies

    This study is directed towards the problem of conceptual translation across different data management systems and formats, with a particular focus on those used in the emerging world of the Semantic Web. Increasingly, organisations have sought to connect information sources and services within and beyond their enterprise boundaries, building upon existing Internet facilities to offer improved research, planning, reporting and management capabilities. The Semantic Web is an ambitious response to this growing demand, offering a standards-based platform for sharing, linking and reasoning with information. The imagined result, a globalised knowledge network formed out of mutually referring data structures termed "ontologies", would make possible new kinds of queries, inferences and amalgamations of information. Such a network, though, is premised upon large numbers of manually drawn links between these ontologies. In practice, establishing these links is a complex translation task requiring considerable time and expertise; invariably, as ontologies and other structured information sources are published, many useful connections are neglected. To combat this, in recent years substantial research has been invested into "ontology matching" - the exploration of algorithmic approaches for automatically translating or aligning ontologies. These approaches, which exploit the explicit semantic properties of individual concepts, have registered impressive precision and recall results against human-engineered translations. However, they are unable to make use of background cultural information about the overall systems in which those concepts are housed - how those systems are used, for what purpose they were designed, what methodological or theoretical principles underlined their construction, and so on. The present study investigates whether paying attention to these sociological dimensions of electronic knowledge systems could supplement algorithmic approaches in some circumstances. Specifically, it asks whether a holistic notion of commensurability can be useful when aligning or translating between such systems. The first half of the study introduces the problem, surveys the literature, and outlines the general approach. It then proposes both a theoretical foundation and a practical framework for assessing commensurability of ontologies and other knowledge systems. Chapter 1 outlines the Semantic Web, ontologies and the problem of conceptual translation, and poses the key research questions. Conceptual translation can be treated as, by turns, a social, philosophical, linguistic or technological problem; Chapter 2 surveys a correspondingly wide range of literature and approaches. The methods employed by the study are described in Chapter 3. Chapter 4 critically examines theories of conceptual schemes and commensurability, while Chapter 5 describes the framework itself, comprising a series of specific dimensions, a broad methodological approach, and a means for generating both qualitative and quantitative assessments. The second half of the study then explores the notion of commensurability through several empirical frames. Chapters 6 to 8 apply the framework to a series of case studies. Chapter 6 presents a brief history of knowledge systems, and compares two of these systems - relational databases and Semantic Web ontologies. Chapter 7, in turn, compares several "upper-level" ontologies - reusable schematisations of abstract concepts like Time and Space. Chapter 8 reviews a recent, widely publicised controversy over the standardisation of document formats. This analysis in particular shows how the opaque, dry world of technical specifications can reveal the complex network of social dynamics, interests and beliefs which coordinate and motivate them. Collectively, these studies demonstrate that the framework is useful in making evident the assumptions which motivate the design of different knowledge systems and, further, in assessing the commensurability of those systems. Chapter 9 then presents a further empirical study; here, the framework is implemented as a software system and pilot-tested among a small cohort of researchers. Finally, Chapter 10 summarises the argumentative trajectory of the study as a whole - that, broadly, an elaborated notion of commensurability can tease out important and salient features of translation inscrutable to purely algorithmic methods - and suggests some possibilities for further work.
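
    For contrast with the sociological framework described above, the following minimal Python sketch illustrates the purely algorithmic, label-similarity style of ontology matching that the study discusses. The concept lists and the similarity threshold are hypothetical examples, not part of the study.

        # Illustrative sketch only: toy label-based ontology matching by
        # lexical similarity of concept labels.
        from difflib import SequenceMatcher

        def align(ontology_a, ontology_b, threshold=0.8):
            """Return candidate concept pairs whose labels are lexically similar."""
            matches = []
            for concept_a in ontology_a:
                for concept_b in ontology_b:
                    score = SequenceMatcher(None, concept_a.lower(), concept_b.lower()).ratio()
                    if score >= threshold:
                        matches.append((concept_a, concept_b, round(score, 2)))
            return matches

        print(align(["Person", "Organisation", "Publication"],
                    ["person", "organization", "document"]))
        # [('Person', 'person', 1.0), ('Organisation', 'organization', 0.92)]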

    Subject-Oriented Business Process Management

    Information Systems Applications (incl. Internet); Business Information Systems; Computer Appl. in Administrative Data Processing; Management of Computing and Information Systems

    THE ROLE OF UNDERLYING MECHANISMS IN ACHIEVING CONSISTENT HYBRID COMBINATIONS OF COMPETITIVE ADVANTAGES

    This thesis takes a step beyond the current discussion on hybrid competitive strategies (HS) by identifying the underlying mechanisms and common elements of successful hybrid strategies. Reviewing empirical and theoretical literature revealed a significant gap in this respect. Therefore, the activity-based view of strategy is introduced to the discussion on HS. In a first step, four consistent and sustainable HS concepts are developed, providing the basis for deriving specific HS models. A second step identifies commonalities among these HS types and theoretically derives a synthesized, common HS model. Thirdly, the critical realist stance was selected for answering this thesis’ research questions, which address consistent HS concepts, implementations, common activities achieving external and internal fit, as well as common capabilities and resources supporting these activities. In a case study approach, semi-structured, open-ended interviews combining appreciative and laddering methods are conducted with twelve interviewees from five firms. The separate analysis of ladder elements and ladders allowed distinguishing constitutional from relational elements. Based on this, fourth, an empirically revised research construct is substantiated. This research finds that HS firms apply intended, consistent, but mixed strategy concepts based on generating high customer benefits through combining competitive weapons of differentiation and price or total customer cost. Moreover, HS concepts centre on three strategic building blocks: customer centricity, fulfilment of customer needs and employee orientation. Additionally, the research indicates that firms apply activities primarily for achieving fit. While all firms combine both fit types, no activities are directed to both simultaneously. Activities deploy capabilities and resources, in general, on two adaptive and two absorptive mechanisms. Several practical implications derive from this thesis. First, firms can apply the synthesized model as a kind of ‘blueprint’ providing orientation for how to combine competitive advantages. Second, policy makers can apply the outcomes as principles steering firms or industries to ‘higher’ levels of performance. Last, firm managers can adapt their own as well as their firm’s behaviour accordingly.

    Commercial Law Intersections

    Commercial law is not a single, monolithic entity. It has grown into a dense thicket of subject-specific branches that govern a broad range of transactions and corporate actions. When one of these dealings or activities falls concurrently within the purview of two or more of these commercial law branches—such as corporate law, intellectual property law, secured transactions law, conduct and prudential regulation—an overlap materializes. We refer to this legal phenomenon as a commercial law intersection (CLI). CLIs are ubiquitous. Notable examples include traditional commercial transactions, such as bank loans secured by shares, supply chain financing, or patent cross-licensing agreements, as well as nascent FinTech arrangements, such as blockchain-based initial coin offerings and other dealings in digital tokens. CLIs present a multi-faceted challenge. The unharmonious convergence of commercial law branches generates failures in coordination that both increase transaction costs and distort incentives for market participants. Crucially, in the most severe cases, this affliction deters business actors from entering into the affected transactions altogether. The cries of scholars, judges, and practitioners lamenting these issues have grown ever louder; yet methodical, comprehensive solutions remain elusive. This Article endeavors to fill this void. First, it provides a comprehensive analysis of CLIs and the dynamics that give rise to coordination failures. Drawing from systems theory and jurisprudence, it then identifies the deficiencies of the most common approaches used to reconcile tensions between commercial law branches, before advancing the concepts of “legal coherence” and “unity of purpose” as the key to addressing such shortcomings. Finally, leveraging these insights, it formulates a normative blueprint, comprising a two-step method which aims to assist lawmakers, regulators, and courts in untangling the Gordian knot created by CLI coordination failures.

    The mathematicization of nature.

    This thesis defends the Quine-Putnam indispensability argument for mathematical realism and introduces a new indispensability argument for a substantial conception of truth. Chapters 1 and 2 formulate the main components of the Quine-Putnam argument, namely that virtually all scientific laws quantify over mathematical entities and thus logically presuppose the existence thereof. Chapter 2 contains a detailed discussion of the logical structure of some scientific theories that incorporate or apply mathematics. Chapter 3 then reconstructs the central assumptions of Quine's argument, concluding (provocatively) that "science entails platonism". Chapter 4 contains a brief discussion of some major theories of truth, including deflationary views (redundancy, disquotation). Chapter 5 introduces a new argument against such deflationary views, based on certain logical properties of truth theories. Chapter 6 contains a further discussion of mathematical truth, in particular of non-standard conceptions of mathematical truth such as "if-thenism" and "hermeneuticism". Chapter 7 introduces the programmes of reconstrual and reconstruction proposed by recent nominalism. Chapter 8 discusses modal nominalism, concluding that modalism is implausible as an interpretation of mathematics (if taken seriously, it suffers from exactly those epistemological problems allegedly suffered by realism). Chapter 9 discusses Field's deflationism, whose central motivating idea is that mathematics is (pace Quine and Putnam) dispensable in applications. This turns on a conservativeness claim which, as Shapiro pointed out in 1983, must be incorrect (using Gödel's Theorems). I conclude in Chapter 10 that nominalistic views of mathematics and deflationist views of truth are both inadequate to the overall explanatory needs of science.
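
    For orientation, the conservativeness claim at issue is standardly stated as follows; this LaTeX formulation is given for illustration and is not quoted from the thesis. A truth theory added to a base theory S is conservative over S just in case it proves no new theorems in the truth-free language of S.

        % Illustrative statement of conservativeness (assumes amsmath).
        % S is the base theory, Tr(S) its extension with a truth theory,
        % and L_S the truth-free language of S.
        \begin{equation*}
          \mathrm{Tr}(S)\ \text{is conservative over}\ S
          \iff
          \forall \varphi \in L_S \,\bigl(\mathrm{Tr}(S) \vdash \varphi \;\Rightarrow\; S \vdash \varphi\bigr)
        \end{equation*}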

    The Role of Inversion in the Genesis, Development and the Structure of Scientific Knowledge

    The main thrust of the argument of this thesis is to show the possibility of articulating a method of construction or of synthesis--as against the most common method of analysis or division--which has always been (so we shall argue) a necessary component of scientific theorization. This method will be shown to be based on a fundamental synthetic logical relation of thought, which we shall call inversion--to be understood as a species of logical opposition, and as one of the basic monadic logical operators. This thesis can be viewed as a response to Larry Laudan's challenge, which is based on the claim that "the case has yet to be made that the rules governing the techniques whereby theories are invented (if any such rules there be) are the sorts of things that philosophers should claim any interest in or competence at." The challenge itself would be to show that the logic of discovery (if at all formulatable) performs the epistemological role of the justification of scientific theories. We propose to meet this challenge head on: a) by suggesting precisely how such a logic would be formulated; b) by demonstrating its epistemological relevance (in the context of justification); and c) by showing that a) and b) can be carried out without sacrificing the fallibilist view of scientific knowledge. OBJECTIVES: We have set three successive objectives: one general, one specific, and one sub-specific, each one related to the other in that very order. (A) The general objective is to indicate the clear possibility of renovating the traditional analytico-synthetic epistemology. By realizing this objective, we attempt to widen the scope of scientific reason or rationality, which for some time now has perniciously been dominated by pure analytic reason alone. In order to achieve this end, we need to show specifically that there exists the possibility of articulating a synthetic (constructive) logic/reason, which has been considered by most mainstream thinkers either as not articulatable, or simply non-existent. (B) The second (specific) task is to respond to the challenge of Larry Laudan by demonstrating the possibility of an epistemologically significant generativism. In this context we will argue that this generativism, which is our suggested alternative, and the simplified structuralist and semantic view of scientific theories mutually reinforce each other to form a single coherent foundation for the renovated analytico-synthetic methodological framework. (C) The third (sub-specific) objective, accordingly, is to show the possibility of articulating a synthetic logic that could guide us in understanding the process of theorization. This is realized by proposing the foundations for developing a logic of inversion, which represents the pattern of synthetic reason in the process of constructing scientific definitions.

    Neurath Reconsidered: New Sources and Perspectives

