Towards a schizogenealogy of heretical materialism: between Bruno and Spinoza, Nietzsche, Deleuze and other philosophical recluses
The central problematic of this thesis is the formation of a philosophy of creative
matter, a philosophical materialism, deriving from the work of Gilles Deleuze and Félix Guattari, and based substantially upon an examination of the consequences
of their engagement with the philosophical tradition. I have supplemented the
writers used by Deleuze and Guattari with the resources of Giordano Bruno's
philosophy, as well as numerous examples and arguments from the natural
sciences. Bruno is particularly important here, in that in his work and life,
materialism is most tightly bound up with monism. Philosophical materialist
monism can be crystallised as a sustained meditation upon one problem: that of
the overcoming of dualism; and in this sense to speak of materialism is to speak of
the problem of hylomorphism. The hylomorphic model, formalised by Aristotle, and
operative in both philosophy and science, implies both a transcendent form that
organises matter, and a dead matter, passively moulded by the imposition of that
form. These ontological and epistemological assumptions have clear political and
theological ramifications, contributing to an abstract diagram of State power. The
critique of this model calls for a philosophy of active, self-organising matter: a
necessarily heretical, materialist thought, constitutionally opposed to all
transcendent powers.
I In this chapter I produce a performative diagram of Deleuze and Guattari's
understanding of the heterogenetic nature of the concept by examining those of
drive, assemblage, and multiplicity. The case used here is the linked complex of
problems associated with death and entropy. These issues are posed throughout as
means of indicating Deleuze and Guattari's challenge to dominant modes of
philosophising.
II Here I offer an elaboration of Deleuze and Guattari's relationship with
cybernetics, through an outline of the work of Gilbert Simondon. The principal
concepts developed here are individuation and becoming. This is followed by
extensive critiques of hylomorphism and autopoiesis. The categories of minor (or
nomad) and major (or State) science are introduced, along with the related concepts
of following and reproducing.
III This chapter explores the oppositions between consistency and
organisation, and between immanence and transcendence. Here I read two of Deleuze and
Guattari's key concepts, intensity and incorporeal transformation, in terms of
Spinoza and Schelling respectively. Symbiosis and morphogenesis are examined as
examples of the minor sciences introduced in the previous chapter. The minor then
poses the questions of invention and pragmatics in philosophy.
IV This chapter is devoted to a critique of Manuel De Landa's reading of
Deleuze and Guattari that aims to demonstrate, against his claims, the centrality
of Marx to their philosophy. The chapter also elaborates upon the concepts of
geophilosophy, the machinic phylum, and machinic surplus value.
V This chapter offers a set of elaborations upon the nature of the materialism
produced by bringing the thought of Giordano Bruno into contact with that of
Deleuze, thereby transforming both. Inverted vitalism is posed as a key marker of
Deleuze's genealogy. I show the identity of metaphysics and politics, and its role in
an account of materialist heresy.
VI The final chapter consists of a critique of Kant's claim to being 'Copernican',
and Copernicus' claim to being revolutionary. It demonstrates the extent of Bruno's
cosmological revolution. I use Nietzsche's 'perfect nihilist' to further the ideas of
invention and heresy advanced earlier, to end with a demonstration of philosophy's
ever-present becomings-hybrid, as opposed to dominant ideas of its being in a
permanent state of mourning.
Thermodynamic Computing
The hardware and software foundations laid in the first half of the 20th
Century enabled the computing technologies that have transformed the world, but
these foundations are now under siege. The current computing paradigm, which
underpins much of the standard of living we now enjoy, faces fundamental
limitations that are evident from several perspectives. In
terms of hardware, devices have become so small that we are struggling to
eliminate the effects of thermodynamic fluctuations, which are unavoidable at
the nanometer scale. In terms of software, our ability to imagine and program
effective computational abstractions and implementations is clearly challenged
in complex domains. In terms of systems, currently five percent of the power
generated in the US is used to run computing systems - this astonishing figure
is neither ecologically sustainable nor economically scalable. Economically,
the cost of building next-generation semiconductor fabrication plants has
soared past $10 billion. All of these difficulties - device scaling, software
complexity, adaptability, energy consumption, and fabrication economics -
indicate that the current computing paradigm has matured and that continued
improvements along this path will be limited. If technological progress is to
continue and corresponding social and economic benefits are to continue to
accrue, computing must become much more capable, energy efficient, and
affordable. We propose that progress in computing can continue under a unified,
physically grounded computational paradigm centered on thermodynamics. Herein
we propose a research agenda to extend these thermodynamic foundations into
complex, non-equilibrium, self-organizing systems and apply them holistically
to future computing systems that will harness nature's innate computational
capacity. We call this type of computing "Thermodynamic Computing", or TC.
Comment: A Computing Community Consortium (CCC) workshop report, 36 pages
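As a rough, back-of-the-envelope illustration of the device-scaling point above (not taken from the report itself): the Landauer limit, kT ln 2, sets the minimum energy needed to erase one bit, and the shrinking margin between practical switching energies and that thermal floor is what makes fluctuations unavoidable at the nanometer scale. The switching energy below is an assumed order-of-magnitude figure, not a measured value.

    # Compare the Landauer limit (minimum energy to erase one bit)
    # against an assumed present-day CMOS switching energy.
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # room temperature, K

    landauer = k_B * T * math.log(2)  # ~2.9e-21 J per bit
    print(f"Landauer limit at {T:.0f} K: {landauer:.3e} J per bit")

    switching = 1e-16   # J per switching event (assumed, illustrative)
    print(f"Margin above the thermal floor: {switching / landauer:.1e}x")

As devices shrink this margin toward unity, fluctuation effects can no longer be engineered away, which is exactly the regime a thermodynamic computing paradigm would embrace rather than fight.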
Thermodynamic AI and the fluctuation frontier
Many Artificial Intelligence (AI) algorithms are inspired by physics and
employ stochastic fluctuations. We connect these physics-inspired AI algorithms
by unifying them under a single mathematical framework that we call
Thermodynamic AI. Seemingly disparate algorithmic classes can be described by
this framework, for example, (1) Generative diffusion models, (2) Bayesian
neural networks, (3) Monte Carlo sampling and (4) Simulated annealing. Such
Thermodynamic AI algorithms are currently run on digital hardware, ultimately
limiting their scalability and overall potential. Stochastic fluctuations
naturally occur in physical thermodynamic systems, and such fluctuations can be
viewed as a computational resource. Hence, we propose a novel computing
paradigm, where software and hardware become inseparable. Our algorithmic
unification allows us to identify a single full-stack paradigm, involving
Thermodynamic AI hardware, that could accelerate such algorithms. We contrast
Thermodynamic AI hardware with quantum computing where noise is a roadblock
rather than a resource. Thermodynamic AI hardware can be viewed as a novel form
of computing, since it uses a novel fundamental building block. We identify
stochastic bits (s-bits) and stochastic modes (s-modes) as the respective
building blocks for discrete and continuous Thermodynamic AI hardware. In
addition to these stochastic units, Thermodynamic AI hardware employs a
Maxwell's demon device that guides the system to produce non-trivial states. We
provide a few simple physical architectures for building these devices and we
develop a formalism for programming the hardware via gate sequences. We hope to
stimulate discussion around this new computing paradigm. Beyond acceleration,
we believe it will impact the design of both hardware and algorithms, while
also deepening our understanding of the connection between physics and
intelligence.
Comment: 47 pages, 18 figures, added relevant reference
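As a concrete illustration of fluctuations serving as a computational resource, the sketch below implements plain textbook simulated annealing, one of the four algorithm classes named above. It is a generic illustration, not the paper's s-bit/s-mode hardware formalism, and the toy energy landscape is invented for the example.

    # Simulated annealing on a toy non-convex landscape: injected noise
    # (the "fluctuations") lets the search escape local minima.
    import math
    import random

    def energy(x):
        # Toy landscape with many local minima.
        return x * x + 10.0 * math.sin(3.0 * x)

    random.seed(0)
    x = 5.0   # initial state
    T = 2.0   # "temperature" setting the fluctuation strength
    for step in range(5000):
        candidate = x + random.gauss(0.0, 0.5)  # stochastic proposal
        dE = energy(candidate) - energy(x)
        # Metropolis rule: downhill moves are always accepted; uphill
        # moves are accepted with probability exp(-dE/T).
        if dE < 0 or random.random() < math.exp(-dE / T):
            x = candidate
        T *= 0.999  # cool toward a nearly deterministic search

    print(f"final x = {x:.3f}, energy = {energy(x):.3f}")

Run on digital hardware, the fluctuations must be generated in software; the paper's point is that physical thermodynamic hardware supplies them natively.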
Judgement, Responsibility and the Life-World: Perth Workshop 2011 Conference Proceedings
The workshop was part of the ARC-funded project Judgement, Responsibility and the Life-world.
Future paradigms for precision oncology.
Research has exposed cancer to be a heterogeneous disease with a high degree of inter-tumoral and intra-tumoral variability. Individual tumors have unique profiles, and these molecular signatures make the use of traditional histology-based treatments problematic. The conventional diagnostic categories, while necessary for care, thwart the use of molecular information for treatment, as molecular characteristics cross tissue types. This is compounded by the struggle to keep abreast of the scientific advances made in all fields of science, and by the enormous challenge to organize, cross-reference, and apply molecular data for patient benefit. In order to supplement the site-specific, histology-driven diagnosis with genomic, proteomic, and metabolomic information, a paradigm shift in the diagnosis and treatment of patients is required. While most physicians are open and keen to use the emerging data for therapy, even those versed in molecular therapeutics are overwhelmed with the amount of available data. It is not surprising that even though the Human Genome Project was completed thirteen years ago, our patients have not benefited from the information. Physicians cannot, and should not, be asked to process the gigabytes of genomic and proteomic information on their own in order to provide patients with safe therapies. The following consensus summary identifies the need for practice changes, proposes potential solutions to the present crisis of informational overload, suggests ways of providing physicians with the tools necessary for interpreting patient-specific molecular profiles, and facilitates the implementation of quantitative precision medicine. It also provides two case studies where this approach has been used.
CO2-H2O Fugacity Modeling Using Neural Network
Duan and Sun (2003) have designed a theoretical model for carbon dioxide (CO2) solubility in
pure water. This model is valid for solutions from 273 to 573 K and from 0 to 2000 bar; on
the other hand, all the parameters presented in the model can be directly calculated without any
iteration, except the fugacity coefficient of CO2 (φ_CO2), which is a function of temperature (T) and
pressure (P). In order to calculate φ_CO2, 15 coefficients must be fitted into the equation. Since
the P-T diagram of CO2 is divided into 6 regions, different sets of these coefficients need to be
applied for different regions. Hence, there is a need to design a single model to calculate φ_CO2 for
the whole P-T diagram, which will be done in this project.
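A minimal sketch of the project's idea, assuming a generic feed-forward network trained to map (T, P) directly to φ_CO2 over the full validity range. The target function below is a hypothetical placeholder so the example runs; in the actual project the training targets would come from the region-wise Duan and Sun (2003) correlation.

    # Single neural network for phi_CO2(T, P) across all P-T regions.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Sample the model's stated validity range: 273-573 K, 0-2000 bar.
    T = rng.uniform(273.0, 573.0, 2000)
    P = rng.uniform(1.0, 2000.0, 2000)
    X = np.column_stack([T, P])

    def phi_placeholder(T, P):
        # Hypothetical smooth stand-in for the true coefficient.
        return np.exp(-P / (8.314 * T)) + 0.1 * np.log1p(P) / np.log1p(T)

    y = phi_placeholder(T, P)

    scaler = StandardScaler().fit(X)
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                       random_state=0).fit(scaler.transform(X), y)

    query = scaler.transform([[373.0, 500.0]])  # 373 K, 500 bar
    print("predicted phi_CO2:", net.predict(query)[0])

One network can replace the six region-specific coefficient sets because the hidden layers can approximate the piecewise behaviour, provided the training data covers all regions.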
The Past, Present, and Future of Artificial Life
For millennia people have wondered what makes the living different from the non-living. Beginning in the mid-1980s, artificial life has studied living systems using a synthetic approach: build life in order to understand it better, be it by means of software, hardware, or wetware. This review provides a summary of the advances that led to the development of artificial life, its current research topics, and open problems and opportunities. We classify artificial life research into fourteen themes: origins of life, autonomy, self-organization, adaptation (including evolution, development, and learning), ecology, artificial societies, behavior, computational biology, artificial chemistries, information, living technology, art, and philosophy. Being interdisciplinary, artificial life seems to be losing its boundaries and merging with other fields.