Automation of play: theorizing self-playing games and post-human ludic agents
This article offers a critical reflection on the automation of play and its significance for theoretical inquiries into digital games and play. Automation has become an ever more noticeable phenomenon in the domain of video games, expressed by self-playing game worlds, self-acting characters, and non-human agents traversing multiplayer spaces. On the following pages, the author explores various instances of automated non-human play and proposes a post-human theoretical lens, which may help to create a new framework for the understanding of video games, renegotiate the current theories of interaction prevalent in game studies, and rethink the relationship between human players and digital games.
Reason, causation and compatibility with the phenomena
'Reason, Causation and Compatibility with the Phenomena' strives to give answers to the philosophical problem of the interplay between realism, explanation and experience. This book is a compilation of essays that recollect significant conceptions of rival terms such as determinism and freedom, reason and appearance, power and knowledge. This title discusses the progress made in epistemology and natural philosophy, especially the steps that led from the ancient theory of atomism to the modern quantum theory, and from mathematization to analytic philosophy. Moreover, it provides possible gateways from modern deadlocks of theory either through approaches to consciousness or through historical critique of intellectual authorities.
This work will be of interest to researchers and students in colleges and universities, especially in the departments of philosophy, history of science, philosophy of science, philosophy of physics and quantum mechanics, and history of ideas and culture. Greek and Latin Literature students and instructors may also find this book a fascinating and valuable point of reference.
Introducing Preservice STEM Teachers to Computer Science: A Narrative of Theoretically Oriented Design
This paper narrates the process of designing a curricular unit that serves to introduce preservice science, technology, engineering, and mathematics (STEM) teachers to computer science (CS) education. Unlike most literature that focuses on results and findings, this paper explains how a justice-centered approach to CS education informed decisions about the theoretical underpinnings of curricular design choices. Situated in issues related to the gentrification of Austin, Texas, the described curricular unit explores how the increased use of CS and growth of the technology sector are having a direct impact on the historically marginalized residents of East Austin. Connected by a theme that maps are both a form of data visualization and political artifact, the described curricular unit uses CS as a tool to: critique the macro-ethics of politics and society; provide a CS learning environment that can be responsive to the multiple social identities of students; and connect CS to larger struggles for justice and liberation.
From Models to Simulations
This book analyses the impact computerization has had on contemporary science and explains the origins, technical nature and epistemological consequences of the current decisive interplay between technology and science: an intertwining of formalism, computation, data acquisition, data and visualization and how these factors have led to the spread of simulation models since the 1950s.
Using historical, comparative and interpretative case studies from a range of disciplines, with a particular emphasis on the case of plant studies, the author shows how and why computers, data treatment devices and programming languages have occasioned a gradual but irresistible and massive shift from mathematical models to computer simulations.
Robot life: simulation and participation in the study of evolution and social behavior.
This paper explores the case of using robots to simulate evolution, in particular the case of Hamilton's Law. The use of robots raises several questions that this paper seeks to address. The first concerns the role of the robots in biological research: do they simulate something (life, evolution, sociality) or do they participate in something? The second question concerns the physicality of the robots: what difference does embodiment make to the role of the robot in these experiments? Thirdly, how do life, embodiment and social behavior relate in contemporary biology, and why is it possible for robots to illuminate this relation? These questions are provoked by a strange similarity that has not been noted before: between the problem of simulation in philosophy of science and Deleuze's reading of Plato on the relationship of ideas, copies and simulacra.
Coin.AI: A Proof-of-Useful-Work Scheme for Blockchain-based Distributed Deep Learning
One decade ago, Bitcoin was introduced, becoming the first cryptocurrency and establishing the concept of "blockchain" as a distributed ledger. As of today, there are many different implementations of cryptocurrencies working over a blockchain, with different approaches and philosophies. However, many of them share one common feature: they require proof-of-work to support the generation of blocks (mining) and, eventually, the generation of money. This proof-of-work scheme often consists in the resolution of a cryptographic problem, most commonly breaking a hash value, which can only be achieved through brute force. The main drawback of proof-of-work is that it requires extremely large amounts of energy that have no useful outcome beyond supporting the currency. In this paper, we present a theoretical proposal that introduces a proof-of-useful-work scheme to support a cryptocurrency running over a blockchain, which we name Coin.AI. In this system, the mining scheme requires training deep learning models, and a block is only mined when the performance of such a model exceeds a threshold. The distributed system allows nodes to verify the models delivered by miners easily (certainly much more efficiently than the mining process itself), determining when a block is to be generated. Additionally, this paper presents a proof-of-storage scheme for rewarding users who provide storage for the deep learning models, as well as a theoretical discussion of how the mechanics of the system could be articulated with the ultimate goal of democratizing access to artificial intelligence.
Comment: 17 pages, 5 figures
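The contrast the abstract draws can be illustrated with a minimal sketch: classic proof-of-work brute-forces a nonce until a hash meets a difficulty target, while a Coin.AI-style proof-of-useful-work accepts a block once a submitted model's verified performance crosses a threshold. This is not the paper's actual protocol; the function names, the 8-byte nonce encoding, and the accuracy/threshold values are illustrative assumptions.

```python
import hashlib

def mine_pow(block_data: bytes, difficulty: int) -> int:
    """Classic proof-of-work: brute-force a nonce until the SHA-256
    digest of (block_data + nonce) starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def accept_pouw_block(model_accuracy: float, threshold: float) -> bool:
    """Proof-of-useful-work acceptance in the spirit of Coin.AI: a block
    is mined only when the (cheaply verifiable) performance of the
    submitted deep learning model exceeds the agreed threshold."""
    return model_accuracy > threshold

if __name__ == "__main__":
    nonce = mine_pow(b"block-42", difficulty=3)
    digest = hashlib.sha256(b"block-42" + nonce.to_bytes(8, "big")).hexdigest()
    print(digest[:3])                                # "000"
    print(accept_pouw_block(0.93, threshold=0.90))   # True
```

The sketch also shows why verification is cheap in both schemes: checking one hash (or one model's measured accuracy) is far less costly than the search or training that produced it.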
Troping the Enemy: Metaphor, Culture, and the Big Data Black Boxes of National Security
This article considers how cultural understanding is being brought into the work of the Intelligence Advanced Research Projects Activity (IARPA), through an analysis of its Metaphor program. It examines the type of social science underwriting this program, unpacks implications of the agency's conception of metaphor for understanding so-called cultures of interest, and compares IARPA's to competing accounts of how metaphor works to create cultural meaning. The article highlights some risks posed by key deficits in the Intelligence Community's (IC) approach to culture, which relies on the cognitive linguistic theories of George Lakoff and colleagues. It also explores the problem of the opacity of these risks for analysts, even as such predictive cultural analytics are becoming a part of intelligence forecasting. This article examines the problem of information secrecy in two ways, by unpacking the opacity of "black box," algorithm-based social science of culture for end users with little appreciation of their potential biases, and by evaluating the IC's nontransparent approach to foreign cultures, as it underwrites national security assessments.