Data Flow Program Graphs
Data flow languages form a subclass of the languages based primarily upon function application (i.e., applicative languages). By data flow language we mean any applicative language based entirely upon the notion of data flowing from one functional entity to another, or any language that directly supports such flow. This flow concept gives data flow languages the advantage of allowing program definitions to be represented exclusively by graphs. Graphical representations and their applications are the subject of this article.
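The idea that a program simply *is* a graph can be made concrete with a small sketch: nodes fire as soon as all of their input tokens have arrived, and results flow along edges. This is an illustrative toy, not the article's notation; all names are my own.

```python
# Minimal data flow interpreter: a node fires when all input tokens arrive.

class Node:
    def __init__(self, op, arity):
        self.op = op            # function applied when the node fires
        self.arity = arity      # number of input tokens required
        self.inputs = {}        # port index -> token value
        self.outputs = []       # edges: list of (target_node, target_port)

    def send(self, port, value, ready):
        self.inputs[port] = value
        if len(self.inputs) == self.arity:
            ready.append(self)  # all operands present: node is ready to fire

def run(entry_tokens, sink):
    """Push initial tokens into the graph and run until `sink` fires."""
    ready = []
    for node, port, value in entry_tokens:
        node.send(port, value, ready)
    while ready:
        node = ready.pop()
        result = node.op(*(node.inputs[i] for i in range(node.arity)))
        node.inputs.clear()
        if node is sink:
            return result
        for target, port in node.outputs:
            target.send(port, result, ready)

# Graph for (a + b) * (a - b):
add = Node(lambda x, y: x + y, 2)
sub = Node(lambda x, y: x - y, 2)
mul = Node(lambda x, y: x * y, 2)
add.outputs = [(mul, 0)]
sub.outputs = [(mul, 1)]
print(run([(add, 0, 5), (add, 1, 3), (sub, 0, 5), (sub, 1, 3)], mul))  # 16
```

Note that no statement ordering is imposed: `add` and `sub` may fire in either order, which is exactly the freedom the graph representation exposes.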
Embodiment and Grammatical Structure: An Approach to the Relation of Experience, Assertion and Truth
In this thesis I address a concern in both existential phenomenology and embodied cognition, namely, the question of how "higher" cognitive abilities such as language and judgements of truth relate to embodied experience. I suggest that although our words are grounded in experience, what makes this grounding and our higher abilities possible is grammatical structure.
The opening chapter contrasts the "situated" approach of embodied cognition and existential phenomenology with Cartesian methodological solipsism. The latter produces a series of dualisms, including that of language and meaning, whereas the former dissolves such dualisms. The second chapter adapts Merleau-Ponty's arguments against the perceptual constancy hypothesis in order to undermine the dualism of grammar and meaning. This raises the question of what grammar is, which is addressed in the third chapter. I acknowledge the force of Chomsky's observation that language is structure dependent and briefly introduce a minimal grammatical operation which might be the "spark which lit the intellectual forest fire" (Clark: 2001, 151).
Grammatical relations are argued to make possible the grounding of our symbols in chapters 4 and 5, which attempt to ground the categories of determiner and aspect in spatial deixis and embodied motor processes respectively. Chapter 6 ties the previous three together, arguing that we may understand a given lexeme as an object or as an event by subsuming it within a determiner phrase or aspectualising it, respectively. I suggest that such modification of a word's meaning is possible because determiners and aspect schematise the lexeme, i.e. determine its temporal structure. Chapter 7 uses this account to take up Heidegger's claim that the relation between being and truth be cast in terms of temporality (2006, H349), though it falls short of providing a complete account of the "origin of truth". Chapter 8 concludes and notes further avenues of research.
The cybernetics of concepts: An integrated system of postulates to explain their nature, origins, use, malfunction and maintenance within a natural neural-molecular medium in the brain
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.

Behaviourists and Logical Positivists commendably set out to purge prejudiced arguments from science, but where it is obvious that there remains some sort of "ghost" in their rational "machine", it is self-defeating simply to ignore its existence. Freud, Piaget, and the ethologists have made some progress in grasping this nettle -- moving towards a material explanation of the "other-worldly" properties of the individual -- but their models of the individual remain nebulously structured in their basic elements. Consequently such theories remain disturbingly controversial, and circumscribed in their applicability.
The present work accordingly sets out to bridge this gap by postulating plausible functions for existing micro-structure which could account both for observed behavioural phenomena, and for many of the existing vaguer theoretical constructs. Part A develops such an explanation for Piagetian constructs, while Part B fills in some of the technical details concerning quantitative problems of signal generation, transmission, and selective reception.

Part C applies these notions to other non-Piagetian descriptions and interpretations of psychological phenomena, thereby offering an integration and reconciliation of various schools of theory. (Major areas considered include Ashby's "homeostat" approach, biological self-organization, sleep-modes and dreaming, Freudian theories of neuroses, and various theories concerning psychosis.) The basic theory itself is meanwhile developed in much greater detail.

A recurring theme throughout the work is the notion that knowledge-acquisition by any independent system depends not only on "external" interaction with the "real" world, but also on an active seeking for internal consistency within the resulting "internal" model. This concept is crucial to the study in two ways: (i) the operation of the brain-systems being considered, and (ii) as a guide to the methodology of the present study itself -- in an area where experimental data is uncomfortably sparse, and likely to remain so.
Topics in Programming Languages, a Philosophical Analysis through the case of Prolog
Programming languages seldom find proper anchorage in philosophy of logic, language and science. What is more, philosophy of language seems restricted to natural languages and linguistics, and even philosophy of logic is rarely framed in terms of programming languages. The logic programming paradigm and Prolog are thus the most adequate paradigm and programming language with which to work on this subject, combining natural language processing and linguistics, logic programming, and constraint methodology on both algorithms and procedures, within an overall philosophizing declarative status. Beyond this, the dimension of the Fifth Generation Computer Systems project related to strong AI, wherein Prolog took a major role, and its historical frame in the crucial dialectic between procedural and declarative paradigms, and between structuralist and empiricist biases, serves, in exemplary form, to treat philosophy of logic, language and science in the contemporary age as well.
In recounting Prolog's philosophical, mechanical and algorithmic harbingers, the opportunity is open to various routes. We herein shall exemplify some:
- the mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage, and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), offers the backbone of computer architecture; in parallel, the work of Turing, Church, Gödel, Kleene, von Neumann, Shannon, and others on computability, thoroughly studied in detail, permits us to interpret the evolving realm of programming languages. The line from the lambda calculus to the Algol family, the declarative and procedural split with the C language and Prolog, and the ensuing branching and explosion of programming languages and their further delimitation, are thereupon inspected so as to relate them to the proper syntax, semantics and philosophical élan of logic programming and Prolog.
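The declarative/procedural split mentioned above can be illustrated with a toy rendering in Python (not Prolog itself): a knowledge base of ground facts and one rule, queried by checking which bindings satisfy the rule's body. The family names are the usual textbook example, not taken from the thesis.

```python
# Declarative flavour: state *what* holds, let the query machinery find *how*.

facts = {
    ("parent", "tom", "bob"),
    ("parent", "bob", "ann"),
}

def grandparent(x, z):
    # Prolog rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    # Here we search every fact's third argument as a candidate binding for Y.
    return any(("parent", x, y) in facts and ("parent", y, z) in facts
               for (_, _, y) in facts)

print(grandparent("tom", "ann"))   # True
print(grandparent("bob", "tom"))   # False
```

In real Prolog the rule is a single declarative clause and unification does the binding; the Python version makes the hidden procedural search explicit, which is precisely the contrast at issue.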
With Language in Mind
Does language have a role to play in conceptual development, and if so, what is that role? Understanding the contents of another person's mind parallels the development in early childhood of mental state language. Does the conceptual understanding get reflected in and drive the language development, or does the language allow the representation of propositional attitudes like belief? The paper reviews the evidence and sets up the terms of the debate, focusing on the syntax for mental states. It also asks whether syntax development could serve as a scaffold for other concepts that are described by propositions rather than labels. Finally, it reviews experimentation on the syntax of embedded clauses, where subtle phenomena are acquired for which it is impossible to imagine nonverbal counterparts: here, language is human thinking.
Deep Semantic Learning Machine: Initial design and experiments
Dissertation presented as the partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics.

Computer vision is an interdisciplinary scientific field that allows the digital world to interact with the real world. It is one of the fastest-growing and most important areas of data science, and its applications are nearly endless, given the variety of tasks that advances in the field make solvable: image analysis, object detection, image transformation, and image generation, among others. With so many applications, it is vital to provide models with the best possible performance. Although many years have passed since backpropagation was invented, it is still the most commonly used approach to training neural networks. Satisfactory performance can be achieved this way, but is it the best that can be attained? A fixed topology that must be defined before any training begins seems a significant limitation, since the performance of a network depends heavily on its topology. As no studies precisely guide scientists in selecting a proper network structure, the ability to adjust the topology to the problem appears highly promising. Initial work on the evolution of neural networks by heuristic search has yielded encouragingly good results on various reinforcement learning tasks. This thesis presents initial experiments applying a similar approach to image classification. A new model, the Deep Semantic Learning Machine, is introduced, with a new mutation method designed specifically for computer vision problems. The Deep Semantic Learning Machine allows a topology to evolve from a small network and adjust to a given problem. The initial results are promising, especially on the training dataset.
However, in this thesis the Deep Semantic Learning Machine was developed only as a proof of concept, and further improvements to the approach can be made.
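The core idea of letting a topology grow from a small network can be sketched in a few lines. This is my own illustrative toy, not the thesis's actual mutation operator: the new hidden unit receives random incoming weights but a zero outgoing weight, so the mutation enlarges the topology without changing the network's current outputs.

```python
import random

def forward(net, x):
    """Tiny ReLU MLP; `net` is [hidden_weights, output_weights] as nested lists."""
    hidden, out = net
    h = [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in hidden]
    return [sum(w * hi for w, hi in zip(row, h)) for row in out]

def grow(net, rng):
    """Topology mutation: add one hidden unit, output-preserving."""
    hidden, out = net
    n_in = len(hidden[0])
    hidden.append([rng.uniform(-1.0, 1.0) for _ in range(n_in)])  # random fan-in
    for row in out:
        row.append(0.0)   # zero fan-out: the network's function is unchanged

# 2 inputs -> 1 hidden unit -> 1 output
net = [[[0.5, -0.2]], [[1.0]]]
x = [1.0, 2.0]
before = forward(net, x)
grow(net, random.Random(0))
after = forward(net, x)
print(before == after)   # True: the grown network still computes the same map
```

Later training (or further mutation) can then move the new unit's outgoing weight away from zero, which is what makes such incremental growth searchable.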
Natural Language Syntax Complies with the Free-Energy Principle
Natural language syntax yields an unbounded array of hierarchically structured expressions. We claim that these are used in the service of active inference in accord with the free-energy principle (FEP). While conceptual advances alongside modelling and simulation work have attempted to connect speech segmentation and linguistic communication with the FEP, we extend this program to the underlying computations responsible for generating syntactic objects. We argue that recently proposed principles of economy in language design, such as "minimal search" criteria from theoretical syntax, adhere to the FEP. This affords a greater degree of explanatory power to the FEP with respect to higher language functions, and offers linguistics a grounding in first principles with respect to computability. We show how both tree-geometric depth and a Kolmogorov complexity estimate (recruiting a Lempel-Ziv compression algorithm) can be used to accurately predict legal operations on syntactic workspaces, directly in line with formulations of variational free energy minimization. This is used to motivate a general principle of language design that we term Turing-Chomsky Compression (TCC). We use TCC to align the concerns of linguists with the normative account of self-organization furnished by the FEP, by marshalling evidence from theoretical linguistics and psycholinguistics to ground core principles of efficient syntactic computation within active inference.
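A compression-based complexity estimate of the kind the abstract mentions can be sketched very simply: the compressed length of a string is a computable proxy for its Kolmogorov complexity. The exact estimator the paper uses is not specified here; zlib's DEFLATE (an LZ77 derivative) is an illustrative stand-in.

```python
import zlib

def lz_complexity(s: str) -> int:
    """Compressed length in bytes as a rough Kolmogorov-complexity proxy."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# Repetitive (low-complexity) strings compress far better than irregular ones,
# so the estimate orders structured material below unstructured material:
repetitive = "the child saw the dog " * 10
print(lz_complexity(repetitive) < len(repetitive))   # True: large saving
```

On this view, preferring syntactic derivations whose workspaces have shorter compressed descriptions is one concrete way to read "minimizing complexity" in the variational sense.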
Distributed Differential Privacy and Applications
Recent growth in the size and scope of databases has resulted in more research into making productive use of this data. Unfortunately, a significant stumbling block which remains is protecting the privacy of the individuals that populate these datasets. As people spend more time connected to the Internet, and conduct more of their daily lives online, privacy becomes a more important consideration, just as the data becomes more useful for researchers, companies, and individuals. As a result, plenty of important information remains locked down and unavailable to honest researchers today, due to fears that data leakages will harm individuals.

Recent research in differential privacy opens a promising pathway to guarantee individual privacy while simultaneously making use of the data to answer useful queries. Differential privacy is a theory that provides provable information theoretic guarantees on what any answer may reveal about any single individual in the database. This approach has resulted in a flurry of recent research, presenting novel algorithms that can compute a rich class of computations in this setting.

In this dissertation, we focus on some real world challenges that arise when trying to provide differential privacy guarantees in the real world. We design and build runtimes that achieve the mathematical differential privacy guarantee in the face of three real world challenges: securing the runtimes against adversaries, enabling readers to verify that the answers are accurate, and dealing with data distributed across multiple domains.
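The mathematical guarantee the abstract refers to is most easily seen in the textbook Laplace mechanism, which achieves epsilon-differential privacy for a numeric query by adding noise scaled to the query's sensitivity. This is standard material, not the dissertation's specific runtimes.

```python
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon, rng):
    """Return the query answer perturbed with Laplace(sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5                                # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))  # inverse-CDF sampling
    return true_answer + noise

# A counting query has sensitivity 1: adding or removing one person changes
# the true count by at most 1, so noise with scale 1/epsilon suffices.
rng = random.Random(42)
true_count = 1042
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5, rng=rng)
```

Smaller epsilon means larger noise and a stronger privacy guarantee; the engineering challenges the dissertation tackles (hardened runtimes, verifiable answers, distributed data) all sit on top of this basic primitive.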
Proceedings of the 1994 Monterey Workshop, Increasing the Practical Impact of Formal Methods for Computer-Aided Software Development: Evolution Control for Large Software Systems; Techniques for Integrating Software Development Environments
Office of Naval Research, Advanced Research Projects Agency, Air Force Office of Scientific Research, Army Research Office, Naval Postgraduate School, National Science Foundation