98,550 research outputs found
Predicting the Truth: Overcoming Problems with Popper's Verisimilitude Through Model Selection Theory
The purpose of this research is to investigate the possibility of using aspects of model selection theory to overcome both a logical problem and an epistemic problem that prevent progress towards the truth from being measured while maintaining a realist approach to science. Karl Popper began such an investigation into the problem of progress in 1963 with the idea of verisimilitude, but his attempts failed to meet his own criteria for a metric of progress, namely the logical and epistemic problems. Although philosophers have attempted to fix Popper's verisimilitude, none has yet seemed to overcome both criteria. My research analyzes the similarities between Predictive Accuracy (PA) and Akaike's Information Criterion (AIC), both parts of model selection theory, and Popper's criteria for progress. I find that, in ideal data situations, PA and AIC seem to satisfy both criteria; in non-ideal data situations, however, issues appear. These issues present an interesting dilemma for scientific progress if it turns out that our theories are in non-ideal data situations, yet PA and AIC seem to be better overall indicators of scientific progress towards the truth than other attempts at overcoming the problems of Popper's verisimilitude.
Circling the Truth: Model Selection Criteria as a Metric of Verisimilitude in Theory Selection
The purpose of this research is to investigate the possibility of using aspects of model selection theory to overcome both a logical problem and an epistemic problem that prevent progress towards the truth from being measured while maintaining a realist approach to science. Karl Popper began such an investigation into the problem of progress in 1963 with an idea of verisimilitude, but his attempts failed to meet his own criteria for a metric of progress, namely the logical and epistemic problems. Although philosophers have attempted to fix Popper's verisimilitude, none has yet seemed to overcome both criteria. My research analyzes the similarities between Predictive Accuracy (PA) and Akaike's Information Criterion (AIC), parts of model selection theory, and Popper's criteria for progress. I find that, in ideal data situations, PA and AIC seem to satisfy both criteria; in non-ideal data situations, however, issues appear. These issues present an interesting dilemma for scientific progress if it turns out that our theories are in non-ideal data situations, yet PA and AIC seem to be better overall indicators of scientific progress towards the truth than other attempts at overcoming the problems of Popper's verisimilitude.
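The trade-off that AIC formalizes can be sketched in a few lines; the data, candidate models, and noise level below are invented purely for illustration, assuming Gaussian residuals and the standard definition AIC = 2k − 2 ln L̂, where k counts fitted parameters and L̂ is the maximized likelihood. Lower AIC estimates better expected predictive accuracy.

```python
# Illustrative sketch (not from the paper): comparing polynomial models
# by AIC under a Gaussian-error assumption. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 0.5 + rng.normal(0, 0.1, x.size)  # the "truth" is linear

def aic_for_degree(deg):
    # Least-squares polynomial fit of the given degree.
    coeffs = np.polyfit(x, y, deg)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, deg + 1
    # Gaussian log-likelihood evaluated at the MLE of the error variance.
    sigma2 = np.mean(resid ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * k - 2 * loglik

scores = {deg: aic_for_degree(deg) for deg in (1, 2, 5)}
best = min(scores, key=scores.get)  # lower AIC: better expected predictive accuracy
print(scores, "best degree:", best)
```

The penalty term 2k is what distinguishes AIC from mere goodness of fit: a higher-degree polynomial always reduces the residuals, but it only wins if the likelihood gain outweighs the parameter penalty.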
Physical complexity and cognitive evolution
Our intuition tells us that there is a general trend in the evolution of nature, a trend towards greater complexity. However, there are several definitions of complexity, and hence it is difficult to argue for or against the validity of this intuition. Christoph Adami has recently introduced a novel measure called physical complexity that assigns low complexity to both ordered and random systems and high complexity to those in between. Physical complexity measures the amount of information that an organism stores in its genome about the environment in which it evolves. The theory of physical complexity predicts that evolution increases the amount of 'knowledge' an organism accumulates about its niche. It might be fruitful to generalize Adami's concept of complexity to evolution as a whole, including human evolution. Physical complexity fits nicely into the philosophical framework of cognitive biology, which considers biological evolution a progressing process of knowledge accumulation (a gradual increase in epistemic complexity). According to this paradigm, evolution is a cognitive 'ratchet' that pushes organisms unidirectionally towards higher complexity. A dynamic environment continually creates problems to be solved. To survive in the environment means to solve these problems, and the solution is embodied knowledge. Cognitive biology (as well as the theory of physical complexity) uses the concepts of information and entropy and views evolution from both the information-theoretical and the thermodynamical perspective. Concerning humans as conscious beings, it seems necessary to postulate the emergence of a new kind of knowledge: self-aware and self-referential knowledge. The appearance of self-reflection in evolution indicates that the human brain has reached a new qualitative level of epistemic complexity.
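The idea that a genome "stores information about its environment" can be made concrete with a toy estimate in the spirit of Adami's measure; this is a minimal sketch, not his actual procedure, and the population of genomes below is invented. Sites that are conserved across an adapted population (low per-site entropy) count as information about the niche, so complexity is approximated as sequence length minus summed per-site Shannon entropy.

```python
# Toy sketch (assumed simplification of Adami's physical complexity):
# C ~= L - sum_i H_i over aligned, fixed-length genomes in a population.
from collections import Counter
from math import log2

def per_site_entropy(column):
    # Shannon entropy of the symbol distribution at one genome site.
    n = len(column)
    return -sum((c / n) * log2(c / n) for c in Counter(column).values())

def physical_complexity(genomes):
    length = len(genomes[0])
    # Conserved sites contribute ~1 "unit" each; fully random sites ~0.
    return length - sum(per_site_entropy(col) for col in zip(*genomes))

population = ["ACGTAC", "ACGTTC", "ACGAAC", "ACGTAC"]  # invented data
print(round(physical_complexity(population), 3))  # -> 4.377
```

Four of the six sites above are perfectly conserved and two vary, so the estimate lands between "all noise" (0) and "all signal" (6), matching the intuition that physical complexity is low for both fully ordered and fully random systems.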
Epistemic and Social Scripts in Computer-Supported Collaborative Learning
Collaborative learning in computer-supported learning environments typically means that learners work on tasks together, discussing their individual perspectives via text-based media or videoconferencing, and consequently acquire knowledge. Collaborative learning, however, is often sub-optimal with respect to how learners work on the concepts that are supposed to be learned and how learners interact with each other. Therefore, instructional support needs to be implemented in computer-supported collaborative learning environments. One possibility for improving collaborative learning environments is to conceptualize scripts that structure the epistemic activities and social interactions of learners. In this contribution, two studies are reported that investigated the effects of epistemic and social scripts in a text-based computer-supported learning environment and in a videoconferencing learning environment in order to foster the individual acquisition of knowledge. In each study the factors "epistemic script" and "social script" were independently varied in a 2×2-factorial design. 182 university students of Educational Science at LMU München participated in the two studies. Results of both studies show that social scripts can be substantially beneficial with respect to the individual acquisition of knowledge, whereas epistemic scripts apparently do not lead to the expected effects.
Epistemic and social scripts in computer-supported collaborative learning
Collaborative learning in computer-supported learning environments typically means that learners work on tasks together, discussing their individual perspectives via text-based media or videoconferencing, and consequently acquire knowledge. Collaborative learning, however, is often sub-optimal with respect to how learners work on the concepts that are supposed to be learned and how learners interact with each other. One possibility to improve collaborative learning environments is to conceptualize epistemic scripts, which specify how learners work on a given task, and social scripts, which structure how learners interact with each other. In this contribution, two studies are reported that investigated the effects of epistemic and social scripts in a text-based computer-supported learning environment and in a videoconferencing learning environment in order to foster the individual acquisition of knowledge. In each study the factors ‘epistemic script’ and ‘social script’ were independently varied in a 2×2-factorial design. 182 university students of Educational Science participated in these two studies. Results of both studies show that social scripts can be substantially beneficial with respect to the individual acquisition of knowledge, whereas epistemic scripts apparently do not lead to the expected effects.
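How main effects and the interaction are read off a 2×2-factorial design can be shown with a small sketch; the cell means below are invented for illustration (they merely echo the reported pattern that the social script helps while the epistemic script does not) and are not the studies' data.

```python
# Hypothetical cell means (knowledge-test scores) for a 2x2 design
# crossing epistemic script (E0/E1) with social script (S0/S1).
m = {("E0", "S0"): 10.0, ("E0", "S1"): 14.0,
     ("E1", "S0"): 10.5, ("E1", "S1"): 14.5}

# Main effect of each factor: difference between the averages of its levels.
social_effect = (m[("E0", "S1")] + m[("E1", "S1")]) / 2 \
              - (m[("E0", "S0")] + m[("E1", "S0")]) / 2
epistemic_effect = (m[("E1", "S0")] + m[("E1", "S1")]) / 2 \
                 - (m[("E0", "S0")] + m[("E0", "S1")]) / 2
# Interaction: does the social script's benefit depend on the epistemic one?
interaction = (m[("E1", "S1")] - m[("E1", "S0")]) \
            - (m[("E0", "S1")] - m[("E0", "S0")])

print(social_effect, epistemic_effect, interaction)  # prints "4.0 0.5 0.0"
```

Because the factors are varied independently, a 2×2 design lets one study estimate both script effects and their interaction from the same participants, which is why both reported studies use it.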
Nonlocal quantum information transfer without superluminal signalling and communication
It is a frequent assumption that, via superluminal information transfers, superluminal signals capable of enabling communication are necessarily exchanged in any quantum theory that posits hidden superluminal influences. However, does the presence of hidden superluminal influences automatically imply superluminal signalling and communication? The non-signalling theorem mediates the apparent conflict between quantum mechanics and the theory of special relativity. However, as a 'no-go' theorem there exist two opposing interpretations of the non-signalling constraint: foundational and operational. Concerning Bell's theorem, we argue that Bell employed both interpretations at different times. Bell finally pursued an explicitly operational position on non-signalling, which is often associated with ontological quantum theory, e.g., de Broglie-Bohm theory. We refer to this position as "effective non-signalling". By contrast, associated with orthodox quantum mechanics is the foundational position referred to here as "axiomatic non-signalling". In search of a decisive communication-theoretic criterion for differentiating between "axiomatic" and "effective" non-signalling, we employ the operational framework offered by Shannon's mathematical theory of communication. We find that an effective non-signalling theorem represents two sub-theorems, which we call the (1) non-transfer-control (NTC) theorem and (2) non-signification-control (NSC) theorem. Employing the NTC and NSC theorems, we report that effective, instead of axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal communication. An effective non-signalling theorem allows for nonlocal quantum information transfer yet, at the same time, effectively denies superluminal signalling and communication.
Comment: 21 pages, 5 figures; the article is published with open access in Foundations of Physics (2016).
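The non-signalling constraint at the center of this abstract can be checked numerically in the textbook singlet-state case; this sketch is standard quantum mechanics, not code from the paper. The joint outcome probabilities depend on both measurement angles (the nonlocal correlations), yet Alice's marginal statistics are independent of Bob's setting, so Bob cannot encode a message in his choice of measurement.

```python
# Illustrative check of the no-signalling condition for singlet-state
# correlations (standard textbook formula, not the paper's own model).
import numpy as np

def p_joint(a, b, alpha, beta):
    # Singlet state, outcomes a, b in {+1, -1}, settings alpha, beta:
    # P(a, b | alpha, beta) = (1 - a*b*cos(alpha - beta)) / 4
    return (1 - a * b * np.cos(alpha - beta)) / 4

def alice_marginal(a, alpha, beta):
    # Marginalize over Bob's outcome.
    return sum(p_joint(a, b, alpha, beta) for b in (+1, -1))

alpha = 0.3
for beta in (0.0, 1.0, 2.5):           # Bob varies his setting freely
    print(alice_marginal(+1, alpha, beta))  # ~0.5 regardless of beta
```

The correlations themselves do vary with alpha − beta, which is the sense in which "nonlocal quantum information transfer" can coexist with an effective denial of superluminal signalling: nothing controllable by Bob shows up on Alice's side alone.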
On Some Arguments for Epistemic Value Pluralism
Epistemic Value Monism is the view that there is only one kind of thing of basic, final epistemic value. Perhaps the most plausible version of Epistemic Value Monism is Truth Value Monism, the view that only true beliefs are of basic, final epistemic value. Several authors—notably Jonathan Kvanvig and Michael DePaul—have criticized Truth Value Monism by appealing to the epistemic value of things other than knowledge. Such arguments, if successful, would establish that Epistemic Value Pluralism is true and Epistemic Value Monism is false. This paper critically examines those arguments, finding them wanting. However, I develop a successful argument for Epistemic Value Pluralism, one which turns on general reflection on the nature of value.
Theories of understanding others: the need for a new account and the guiding role of the person model theory
What would be an adequate theory of social understanding? In the last decade, the philosophical debate has focused on Theory Theory, Simulation Theory and Interaction Theory as the three possible candidates. In the following, we look carefully at each of these and describe its main advantages and disadvantages. Based on this critical analysis, we formulate the need for a new account of social understanding. We propose the Person Model Theory as an independent new account with greater explanatory power than the existing theories.