Computations and Computers in the Sciences of Mind and Brain
Computationalism says that brains are computing mechanisms, that is, mechanisms that perform computations. At present, there is no consensus on how to formulate computationalism precisely or adjudicate the dispute between computationalism and its foes, or between different versions of computationalism. An important reason for the current impasse is the lack of a satisfactory philosophical account of computing mechanisms. The main goal of this dissertation is to offer such an account. I also believe that the history of computationalism sheds light on the current debate. By tracing different versions of computationalism to their common historical origin, we can see how the current divisions originated and understand their motivation. Reconstructing debates over computationalism in the context of their own intellectual history can contribute to philosophical progress on the relation between brains and computing mechanisms and help determine how brains and computing mechanisms are alike, and how they differ. Accordingly, my dissertation is divided into a historical part, which traces the early history of computationalism up to 1946, and a philosophical part, which offers an account of computing mechanisms. The two main ideas developed in this dissertation are that (1) computational states are to be identified functionally, not semantically, and (2) computing mechanisms are to be studied by functional analysis. The resulting account, which I call the functional account of computing mechanisms, can be used to identify computing mechanisms and the functions they compute. I use the functional account to taxonomize computing mechanisms based on their different computing power, and I use this taxonomy to taxonomize different versions of computationalism based on the functional properties that they ascribe to brains.
By doing so, I begin to tease out empirically testable statements about the functional organization of the brain that different versions of computationalism are committed to. I submit that when computationalism is reformulated in the more explicit and precise way I propose, the disputes about computationalism can be adjudicated on the grounds of empirical evidence from neuroscience.
Norbert Wiener and the growth of negative feedback in scientific explanation : with a proposed research program of "cybernetic analysis"
Negative feedback has become ubiquitous in science both as a technique and as a conceptual tool. As a technique, negative feedback has a long history; devices based on its use were made in antiquity. It has only been during the last century, however, that rigorous quantitative methods have become associated with the applications of negative feedback. These methods originated in communications engineering and during the World War II period spread rapidly to other areas of science, where further applications were soon made. During this process of dissemination negative feedback was transformed into a powerful conceptual tool, of general application, having to do with the organization of behavior.
The central figure responsible for both the dissemination and transformation of negative feedback was the American mathematician Norbert Wiener, who, as a child prodigy, had developed graduate-level proficiency in science, mathematics and philosophy before he was twenty. Wiener's multidisciplinary background and interests were critically important in allowing him to interact with professionals in many different fields and thereby to disseminate the feedback ideas.
Wiener and two colleagues were the authors of the 1943 paper, "Behavior, Purpose and Teleology," which stimulated a number of interdisciplinary meetings. These meetings were important in spreading the feedback concepts to the different disciplines. Participating in these meetings were, among others, Gregory Bateson, Wolfgang Köhler, Margaret Mead, Warren S. McCulloch, F. S. C. Northrop, John von Neumann and Wiener. The successful assimilation of feedback by the various disciplines, in spite of the problems associated with modern discipline specialization, provides a lesson in how these problems may be overcome. In the case of feedback, the climate for its assimilation was made considerably more receptive by concurrent developments in computer science and neurophysiology, which mutually reinforced the robotic view.
The role of negative feedback in scientific research and the significance of this role have not yet been fully identified. Such an identification must be made in order to evaluate the historical events which led to the assimilation of negative feedback. I attempt to define the role of negative feedback in scientific research in terms of a program called "cybernetic analysis." This program develops the behavioral and functional roles of negative feedback in terms of "adaptive goal-directed behavior"; such behavior occurs when a system can maintain a certain state or tend toward a certain state even while being disturbed by external influences. This behavior is exhibited both by organisms and by mechanical devices controlled by negative feedback.
Until now the idea that systems could be directed toward an end has been unacceptable because goal-directedness has been associated with the outdated notions of teleology and final cause. The ability of negative feedback to account for goal-directedness mechanistically not only challenges the view that organisms alone can exhibit such behavior, but also stands to revise the scientific view of goal-directedness in general. With the new legitimacy of both adaptive and non-adaptive goal-directedness, the path is opened for more effective analysis of scientific problems.
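The adaptive goal-directed behavior described above can be illustrated with a minimal sketch (the function, gain, and set-point values here are hypothetical illustrations, not drawn from the thesis): a proportional negative-feedback loop that keeps a system near a goal state even while a constant external disturbance pushes it away.

```python
# Minimal negative-feedback loop: the controller feeds back a correction
# proportional to the error (set point minus current state), so the system
# tends toward its goal state despite a constant external disturbance.

def simulate(set_point=20.0, disturbance=-1.0, gain=0.8, steps=200):
    """Return the state trajectory of a proportionally controlled system."""
    state = 0.0
    trajectory = []
    for _ in range(steps):
        error = set_point - state          # deviation from the goal state
        correction = gain * error          # negative feedback opposes the error
        state += correction + disturbance  # disturbance keeps pushing the state away
        trajectory.append(state)
    return trajectory

with_feedback = simulate()
without_feedback = simulate(gain=0.0)
# With feedback the state settles near the goal (at set_point + disturbance/gain,
# i.e. 18.75, since pure proportional control leaves a small steady-state error);
# with the feedback loop cut (gain = 0) the state drifts away without bound.
```

The contrast between the two runs is the point: the disturbance is identical, and only the feedback loop accounts for the system maintaining its goal state.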
Despite the great value of Wiener's Cybernetics in focusing attention on the many new robotic developments of the World War II period, it tended to obscure many of the critical points made in the earlier (1943) paper with regard to the role of negative feedback in scientific explanation. The term "cybernetics" came to be a great source of confusion because of Wiener's initial presentation, a presentation which mirrored many of the earlier events in the interdisciplinary meetings which led to the writing of the work. It is suggested here that the term "cybernetic analysis" be used to designate that type of problem analysis which utilizes the hypothesis of a negative feedback mechanism to account for adaptive goal-directed behavior. The use of the term "cybernetics" in this manner will not only succinctly identify one of the great unnamed developments in science, but give the word renewed meaning in terms of the literal roots from which Wiener first derived it.
Memories along the longitudinal axis of a rodent hippocampus: acquisition and consolidation of variants of a spatial task
The mammalian hippocampus is a structure of the brain believed to be essential in learning and memory processes. A current controversy concerns whether it is involved in one unique memory process or is responsible for several related but dissociable functions. And irrespective of the function(s), there is controversy concerning its role in processes of memory consolidation. This thesis, divided in two parts, addresses these two issues.

Part I: Published studies have suggested that hippocampal involvement in spatial memory acquisition is restricted to the septal (or dorsal) part of the structure, a result that supports the idea that the septal and temporal (ventral) parts of the structure have different functions. In the first part of this thesis I explore further the possibility of functional dissociations along the septotemporal axis of the hippocampus, including the importance of commissural projections. Partial lesions are made to the septal or temporal parts of the rat hippocampus, on one side or both. The behavioural assay involves acquisition of a spatial task (variants of reference memory in the watermaze). Although the original septal versus temporal dissociation is replicated, variations of the task protocol (number of days and trials per day of training) reveal that the temporal hippocampus can also support spatial memory. Learning can be attained with as little as 30% of the hippocampus spared. The results support the idea that the hippocampus is responsible for a unique process to which the projections to the septal and temporal parts, as well as the commissural associations, contribute differently. This contribution could be dependent on the training protocol.

Part II: It is well established that damage to the hippocampus, across different species, can result in graded retrograde amnesia. This has been taken by some to imply a role in the consolidation, as well as the acquisition, of memories. The second part of the thesis describes a series of collaborative experiments in which the involvement of the hippocampus in acquisition, consolidation and retrieval of spatial memories is explored. Using an AMPA receptor antagonist, the septal part of the hippocampus is temporarily inactivated during acquisition, retrieval or the memory retention interval of a watermaze reference memory task. The results reveal the hippocampus is involved in all three memory processes when animals are tested 16 days after the end of acquisition. However, it is believed that once a memory has been consolidated, its retrieval can occur independently of the hippocampus. Animal experiments suggesting this involved lesions of the septal hippocampus only. In work reported in this thesis, lesions to the septal or the whole hippocampus are made at different times (1 day or 6 weeks) after acquisition. Using a novel memory testing protocol, the temporal 30% of the hippocampus was found to be sufficient for the retrieval of this memory in a time-independent manner. Animals given lesions to the whole of the structure could not be reminded of what they had learnt earlier at either interval. The results suggest that the whole hippocampus is necessary for the consolidation of memories acquired with an intact hippocampus and that at least part of the hippocampus is necessary for retrieval of memories. The results obtained in part two of the thesis could be dependent on the training and testing protocol as well as on the navigational aspects of the task.
The External Tape Hypothesis: a Turing machine based approach to cognitive computation
The symbol processing or "classical cognitivist" approach to mental computation suggests that the cognitive architecture operates rather like a digital computer. The components of the architecture are input, output and central systems. The input and output systems communicate with both the internal and external environments of the cognizer and transmit codes to and from the rule governed, central processing system which operates on structured representational expressions in the internal environment. The connectionist approach, by contrast, suggests that the cognitive architecture should be thought of as a network of interconnected neuron-like processing elements (nodes) which operates rather like a brain. Connectionism distinguishes input, output and central or "hidden" layers of nodes. Connectionists claim that internal processing consists not of the rule governed manipulation of structured symbolic expressions, but of the excitation and inhibition of activity and the alteration of connection strengths via message passing within and between layers of nodes in the network. A central claim of the thesis is that neither symbol processing nor connectionism provides an adequate characterization of the role of the external environment in cognitive computation. An alternative approach, called the External Tape Hypothesis (ETH), is developed which claims, on the basis of Turing's analysis of routine computation, that the Turing machine model can be used as the basis for a theory which includes the environment as an essential part of the cognitive architecture. The environment is thought of as the tape, and the brain as the control of a Turing machine. Finite state automata, Turing machines,
and universal Turing machines are described, including details of Turing's original universal machine construction. A short account of relevant aspects of the history of digital computation is followed by a critique of the symbol processing approach as it is construed by influential proponents such as Allen Newell and Zenon Pylyshyn among others. The External Tape Hypothesis is then developed as an alternative theoretical basis. In the final chapter, the ETH is combined with the notion of a self-describing Turing machine to provide the basis for an account of thinking and the development of internal representations.
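The ETH framing can be made concrete with a minimal sketch of a Turing machine in which the tape is held outside the finite control, loosely mirroring the picture of environment-as-tape and brain-as-control. The particular machine and its transition table are illustrative inventions, not taken from the thesis.

```python
# A minimal Turing machine: the control is a finite transition table; the
# tape is an external, mutable store the control reads and writes through
# a head. This machine inverts a binary string and halts at the blank.

BLANK = "_"

# transitions: (state, read symbol) -> (write symbol, head move, next state)
TRANSITIONS = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", BLANK): (BLANK, 0, "halt"),
}

def run(tape, state="flip", head=0):
    """Drive the control over an external tape (a list of symbols) until it halts."""
    while state != "halt":
        if head >= len(tape):
            tape.append(BLANK)            # extend the tape on demand
        write, move, state = TRANSITIONS[(state, tape[head])]
        tape[head] = write
        head += move
    return tape

tape = list("0110")
run(tape)
# tape is now ['1', '0', '0', '1', '_']
```

The design point the sketch illustrates is the one the ETH emphasizes: all persistent structure lives in the tape, outside the control, which holds only a fixed, finite rule table.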
Design Methods Movement, 1944-1967
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Architecture, 2008. Includes bibliographical references (p. 259-282).

In the mythic construct of the West, nature, for a considerable era, has served as a seminal broker in basal underpinning discourse. This is despite nature's commutative, convertible and contradictory disclosures. As the antithesis of socio-culture, nature has been the arena of the given, of necessity and compulsion, and a zone of constraint. As "Nature" it has worked as the precipitate of humanity and ministered as the model for human activity. To violate the norms of nature, to be unnatural, has been considered unhealthy, amoral and illegal.

Following the Second World War, constructs of nature, socio-culture and norms were altered in design education and practice. Postwar, an emerging discourse of computer-related technologies contributed to reconfiguring representations of architecture, engineering, product and urban planning in the US and UK. The collective driving these changes became known as the Design Methods movement. Together with trajectories of thought in psychology and psychiatry, discourses materializing from such fields as cybernetics, operations research, information theory and computers altered design processes and education.

This dissertation ranges from examining the politics of funding surrounding an urban planning research center in Cambridge, Massachusetts to elucidating conferences concerning architecture, engineering, urban planning and product design in the UK. Taking from media theorist Friedrich Kittler that technologically possible manipulations condition what can become a discourse, this dissertation is structured around two threads. One thread concerns how computer-related technologies configured a re-conceptualization of nature and socio-culture in design practice and education. A second thread examines how psychology and psychoanalytic concerns were reworked for design through a lens of computer-related technologies. A line between the natural and the normative is questioned concerning concepts of abnormality and deviation.

by Alise Upitis, Ph.D.
Assembling life : models, the cell, and the reformations of biological science, 1920-1960
Signal separation of musical instruments: simulation-based methods for musical signal decomposition and transcription
This thesis presents techniques for the modelling of musical signals, with particular regard to monophonic and polyphonic pitch estimation. Musical signals are modelled as a set of notes, each comprising a set of harmonically related sinusoids. A hierarchical model is presented that is very general and applicable to any signal that can be decomposed as the sum of basis functions. Parameter estimation is posed within a Bayesian framework, allowing for the incorporation of prior information about model parameters. The resulting posterior distribution is of variable dimension and so reversible jump MCMC simulation techniques are employed for the parameter estimation task. The extension of the model to time-varying signals with high posterior correlations between model parameters is described. The parameters and hyperparameters of several frames of data are estimated jointly to achieve a more robust detection. A general model for the description of time-varying homogeneous and heterogeneous multiple component signals is developed, and then applied to the analysis of musical signals. The importance of high-level musical and perceptual psychological knowledge in the formulation of the model is highlighted, and attention is drawn to the limitation of pure signal processing techniques for dealing with musical signals. Gestalt psychological grouping principles motivate the hierarchical signal model, and component identifiability is considered in terms of perceptual streaming, where each component establishes its own context. A major emphasis of this thesis is the practical application of MCMC techniques, which are generally deemed to be too slow for many applications. Through the design of efficient transition kernels highly optimised for harmonic models, and by careful choice of assumptions and approximations, implementations approaching the order of real time are viable.

Engineering and Physical Sciences Research Council
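As a toy illustration of the note model described above (a note as a sum of harmonically related sinusoids), the following sketch synthesizes one such note and recovers its fundamental frequency by a simple autocorrelation search. This is an invented stand-in for the thesis's Bayesian/MCMC machinery, not a reproduction of it; all parameter values are arbitrary.

```python
import math

def synthesize(f0, fs, n_samples, amplitudes):
    """A note as a sum of harmonically related sinusoids: a_k * sin(2*pi*k*f0*t)."""
    return [
        sum(a * math.sin(2 * math.pi * (k + 1) * f0 * n / fs)
            for k, a in enumerate(amplitudes))
        for n in range(n_samples)
    ]

def estimate_f0(x, fs, lag_min=25, lag_max=400):
    """Crude monophonic pitch estimate: the lag maximizing the autocorrelation."""
    def autocorr(lag):
        return sum(x[n] * x[n + lag] for n in range(len(x) - lag))
    best_lag = max(range(lag_min, lag_max + 1), key=autocorr)
    return fs / best_lag

fs = 8000
note = synthesize(f0=200.0, fs=fs, n_samples=2000,
                  amplitudes=[1.0, 0.5, 0.33, 0.25])
# The period is fs/f0 = 40 samples, so the autocorrelation peaks at lag 40
# and the estimate recovers 8000/40 = 200.0 Hz.
print(estimate_f0(note, fs))
```

Because all partials share the fundamental's period, the autocorrelation peak falls at that common period; this is the kind of harmonic structure the thesis's models exploit, though its estimation machinery is far more sophisticated.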
Lost in the archive: vision, artefact and loss in the evolution of hypertext
How does one write the history of a technical machine? Can we say that technical machines have their own genealogies, their own evolutionary dynamic? The technical artefact constitutes a series of objects, a lineage or a line. At a cursory level, we can see this in the fact that technical machines come in generations - they adapt and adopt characteristics over time, one suppressing the other as it becomes obsolete. It is argued that technics has its own evolutionary dynamic, and that this dynamic stems neither from biology nor from human societies. Yet 'it is impossible to deny the role of human thought in the creation of technical artefacts' (Guattari 1995, p. 37). Stones do not automatically rise up into a wall - humans 'invent' technical objects. This, then, raises the question of technical memory. Is it humans that remember previous generations of machines and transfer their characteristics to new machines? If so, how and where do they remember them? It is suggested that humans learn techniques from technical artefacts, and transfer these between machines. This theory of technical evolution is then used to understand the genealogy of hypertext. The historical differentiations of hypertext in different technical systems are traced. Hypertext is defined as both a technical artefact and also a set of techniques: both are a part of this third milieu, technics. The difference between technical artefact and technical vision is highlighted, and it is suggested that technique and vision change when they are externalised as material artefact. The primary technique traced is association, the organisational principle behind the hypertext systems explored in the manuscript. In conclusion, invention is shown to be an act of exhumation, the transfer and retroactivation of techniques from the past.
This thesis presents an argument for a new model of technical evolution, a model which claims that technics constitutes its own dynamic, and that this dynamic exceeds human evolution. It traces the genealogy of hypertext as a set of techniques and as a series of material artefacts. To create this genealogy I draw on interviews conducted with Douglas Engelbart, Ted Nelson and Andries van Dam, as well as a wide variety of primary and secondary resources.
UNSTABLE TERRITORIES OF REPRESENTATION: Architectural Experience and the Behaviour of Forms, Spaces and the Collective Dynamic Environment
This thesis applies an interdisciplinary cybernetic and phenomenological analysis to contemporary theories of representation and interpretation of architecture, resulting in a speculative theoretical model of architectural experience as a behavioural system.
The methodological model adopted for this research defines the main structure of the thesis where the narrative and the contributing parts of its complexity emerge. The narrative is presented through objectives and hypotheses that shift and slide between architectural representation and its experience based on three key internal components in architecture: the architectural forms and spaces, the active observers that interact with their environment, and finally, the responsive environment. Three interrelated research questions are considered. The first seeks to define the influence of the theoretical instability between complex life processes, emerging technologies and active perception upon architecture. The second questions the way in which the architectural experience is generated. The third asks: Does architecture behave? And if so, is it possible to define its behavioural characteristics related to its representation, experience and the medium of communication in-between?
The thesis begins by exploring the effect of developments in digitally interactive, biological, and hybrid technologies on representation in architecture. An account of architectural examples considers the shift in the meaning of representation in architecture from the actual and literal to the more conceptual and experimental, from the individual human body and its relations to the multifaceted ecosystem of collective and connected cultures. The writings of Kester Rattenbury, Neil Leach, and Peter Cook among others contribute to the transformation of the ordinary perceptual experience of architecture, the development of experimental practices in architectural theory, and the dynamism of our perception.
The thesis goes on to suggest that instability in architectural representation does not only depend on the internal components of the architectural system but also on the principles and processes of complex systems, as well as on changes in active perception and our consciousness, which act as external influences on the system. Established theoretical endeavours in the biology of D’Arcy Thompson, Alan Turing, and John Holland, and in the philosophies of Merleau-Ponty, Richard Gregory, and Deleuze and Guattari, are discussed in this context. Pre-programmed and computational models, illustrative and generative, are presented throughout the thesis.
In the final stage of the development of the thesis, architecture is analysed as a system. This is not an unprecedented notion; however, defining the main elements and components of this system and their interactions, and thereafter identifying that the system behaves and defining its behavioural characteristics, adds to the knowledge in the field of theoretical and experimental architecture. This thesis considers the behavioural characteristics of architecture to be derived from the hypothetical links and unstable thresholds of its non-dualistic notions of materiality and immateriality, reality and virtuality, and finally, intentionality and interpretation.