Towards Understanding the Origin of Genetic Languages
Molecular biology is a nanotechnology that works--it has worked for billions
of years and in an amazing variety of circumstances. At its core is a system
for acquiring, processing and communicating information that is universal, from
viruses and bacteria to human beings. Advances in genetics and experience in
designing computers have taken us to a stage where we can understand the
optimisation principles at the root of this system, from the availability of
basic building blocks to the execution of tasks. The languages of DNA and
proteins are argued to be the optimal solutions to the information processing
tasks they carry out. The analysis also suggests simpler predecessors to these
languages, and provides fascinating clues about their origin. Obviously, a
comprehensive unraveling of the puzzle of life would have a lot to say about
what we may design or convert ourselves into.

Comment: (v1) 33 pages, contributed chapter to "Quantum Aspects of Life", edited by D. Abbott, P. Davies and A. Pati, (v2) published version with some editing
Customization principles of an aeronautics SLM environment and an illustration on aeronautics use cases: the doors management system and the flight control system
Having efficient means to build an appropriate system engineering framework is a differentiating factor for the competitiveness of the European aeronautics industry. In this paper we present an approach and a set of concepts enabling the construction and customization of a system engineering environment. These customization principles, established in the frame of the CESAR project, are driven by a multi-view model-based system development process and by a system data description. The flexibility and the benefits of this approach are demonstrated on two industrial cases: the doors management system and the flight control system.
Emergence of Zipf's Law in the Evolution of Communication
Zipf's law seems to be ubiquitous in human languages and appears to be a
universal property of complex communicating systems. Following the early
proposal made by Zipf concerning the presence of a tension between the efforts
of speaker and hearer in a communication system, we introduce evolution by
means of a variational approach to the problem based on Kullback's Minimum
Discrimination of Information Principle. Therefore, using a formalism fully
embedded in the framework of information theory, we demonstrate that Zipf's law
is the only expected outcome of an evolving, communicative system under a
rigorous definition of the communicative tension described by Zipf.

Comment: 7 pages, 2 figures
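The rank-frequency relationship the abstract refers to is easy to compute empirically. The sketch below (our own illustration, not the paper's variational derivation) builds a rank-frequency table for a toy word list; under Zipf's law, frequency times rank stays roughly constant across ranks.

```python
# Illustrative sketch: compute a rank-frequency distribution and the
# quantity rank * frequency, which Zipf's law predicts to be roughly
# constant. The toy corpus is too small to exhibit the law; the paper
# derives it analytically from Kullback's Minimum Discrimination of
# Information Principle.
from collections import Counter

text = (
    "the quick brown fox jumps over the lazy dog "
    "the dog barks and the fox runs the fox hides"
).split()

counts = Counter(text)
ranked = sorted(counts.values(), reverse=True)

for rank, freq in enumerate(ranked, start=1):
    print(rank, freq, rank * freq)
```

On a real corpus one would plot log-frequency against log-rank and look for a straight line of slope close to -1.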
Carbon--The First Frontier of Information Processing
Information is often encoded as an aperiodic chain of building blocks. Modern
digital computers use bits as the building blocks, but in general the choice of
building blocks depends on the nature of the information to be encoded. What
are the optimal building blocks to encode structural information? This can be
analysed by substituting the operations of addition and multiplication of
conventional arithmetic with translation and rotation. It is argued that at the
molecular level, the best component for encoding discretised structural
information is carbon. Living organisms discovered this billions of years ago,
and used carbon as the back-bone for constructing proteins that function
according to their structure. Structural analysis of polypeptide chains shows
that an efficient and versatile structural language of 20 building blocks is
needed to implement all the tasks carried out by proteins. Properties of amino
acids indicate that the present triplet genetic code was preceded by a more
primitive one, coding for 10 amino acids using two nucleotide bases.

Comment: (v1) 9 pages, revtex. (v2) 10 pages. Several arguments expanded to make the article self-contained and to increase clarity. Applications pointed out. (v3) 11 pages. Published version. Well-known properties of proteins shifted to an appendix. Reformatted according to journal style
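The substitution of addition and multiplication by translation and rotation can be made concrete with a small sketch (our own illustration, not taken from the paper): a discrete "structural word" over an alphabet of turn angles is realized as a chain in the plane by composing a rotation with a unit translation at each step, so different symbol sequences encode different shapes from identical building blocks.

```python
# Illustrative sketch: encode discretised structural information as a
# sequence of turn angles and realise it as a 2-D chain by composing
# rotations and translations. Function and variable names are our own.
import math

def build_chain(turns_deg, step=1.0):
    """Return the chain's vertices for a list of successive turn angles."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for turn in turns_deg:
        heading += math.radians(turn)   # rotation: update orientation
        x += step * math.cos(heading)   # translation along new heading
        y += step * math.sin(heading)
        points.append((x, y))
    return points

# Two different "words" over a small angle alphabet yield two
# different structures from the same building blocks.
zigzag = build_chain([60, -60, 60, -60])
arc = build_chain([30, 30, 30, 30])
print(zigzag[-1], arc[-1])
```

The analogy to polypeptides is loose but deliberate: a protein backbone likewise realizes a one-dimensional symbol sequence as a three-dimensional structure through local geometry.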
Recovering Grammar Relationships for the Java Language Specification
Grammar convergence is a method that helps discovering relationships between
different grammars of the same language or different language versions. The key
element of the method is the operational, transformation-based representation
of those relationships. Given input grammars for convergence, they are
transformed until they are structurally equal. The transformations are composed
from primitive operators; properties of these operators and the composed chains
provide quantitative and qualitative insight into the relationships between the
grammars at hand. We describe a refined method for grammar convergence, and we
use it in a major study, where we recover the relationships between all the
grammars that occur in the different versions of the Java Language
Specification (JLS). The relationships are represented as grammar
transformation chains that capture all accidental or intended differences
between the JLS grammars. This method is mechanized and driven by nominal and
structural differences between pairs of grammars that are subject to
asymmetric, binary convergence steps. We present the underlying operator suite
for grammar transformation in detail, and we illustrate the suite with many
examples of transformations on the JLS grammars. We also describe the
extraction effort, which was needed to make the JLS grammars amenable to
automated processing. We include substantial metadata about the convergence
process for the JLS so that the effort becomes reproducible and transparent.
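The core idea of convergence, transforming grammars with primitive operators until they are structurally equal, can be sketched in a few lines. The representation and the single "rename" operator below are our own drastic simplification; the paper's operator suite is far richer.

```python
# Hypothetical sketch of grammar convergence: grammars as mappings from
# nonterminals to lists of alternatives (each a list of symbols), one
# primitive transformation operator, and a structural-equality check.

def rename(grammar, old, new):
    """Primitive operator: rename a nonterminal throughout the grammar."""
    def fix(symbol):
        return new if symbol == old else symbol
    return {
        fix(lhs): [[fix(s) for s in alt] for alt in alts]
        for lhs, alts in grammar.items()
    }

# Two grammars for the same tiny expression language, differing only
# in nonterminal naming -- a "nominal" difference in the paper's terms.
g1 = {"Expr": [["Term"], ["Expr", "+", "Term"]], "Term": [["id"]]}
g2 = {"E": [["T"], ["E", "+", "T"]], "T": [["id"]]}

# A chain of two rename steps converges g2 onto g1.
g2_converged = rename(rename(g2, "E", "Expr"), "T", "Term")
print(g1 == g2_converged)
```

In the actual study the transformation chains also capture structural differences, and their length and composition give the quantitative insight into how far two JLS grammars are apart.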
Programming as a mathematical activity
Programming in undergraduate mathematics is an opportunity to develop various mathematical skills. This paper outlines some topics covered in a second-year optional module, ‘Programming with Mathematical Applications’, that develop mathematical thinking and involve mathematical activities, showing that practical programming can be taught to mathematicians as a mathematical skill.
Think the image, don't make it! On algorithmic thinking, art education, and re-coding
In conceptual art, the idea is not only the starting point and motivation for the material work; it is often considered the work itself. In algorithmic art, thinking the process of generating the image as one instance of an entire class of images becomes the decisive kernel of the creative work. This is so because the generative algorithm is the innovative component of the artist's work. We demonstrate this by critically looking at attempts to re-construct works of early computer art by the re-coding movement. Thinking images is not the same as thinking of images. For thinking images is the act of preparing precise descriptions that control the machinic materialization of images. This kind of activity is a case of algorithmic thinking which, in turn, has become an important general aspect of current society. Art education may play an important role in establishing concrete connections between open artistic and more confined technological ways of thinking when thinking progresses algorithmically.
The Noetic Prism
Definitions of ‘knowledge’ and its relationships with ‘data’ and ‘information’ are varied, inconsistent and often contradictory. In particular, the traditional hierarchy of data-information-knowledge and its various revisions do not stand up to close scrutiny. We suggest that the problem lies in a flawed analysis that sees data, information and knowledge as separable concepts that are transformed into one another through processing. We propose instead that all the materials of computation can be described collectively as ‘noetica’, and that the terms data, information and knowledge can be reconceptualised as late-binding, purpose-determined aspects of the same body of material. Changes in the complexity of noetica occur through value-adding under three different principles: increase in aggregation (granularity), increase in set relatedness (shape), and increase in contextualisation through the formation of networks (scope). We present a new model in which granularity, shape and scope are seen as the three vertices of a triangular prism, and show that all value-adding through computation can be seen as movement within the prism space. We show how the conceptual framework of the noetic prism provides a new and comprehensive analysis of the foundations of computing and information systems, and how it can provide a fresh analysis of many of the common problems in the management of intellectual resources.