Primordial Evolution in the Finitary Process Soup
A general and basic model of primordial evolution, a soup of reacting
finitary and discrete processes, is employed to identify and analyze
fundamental mechanisms that generate and maintain complex structures in
prebiotic systems. The processes (machines as defined in
computational mechanics) and their interaction networks both provide
well-defined notions of structure. This enables us to quantitatively demonstrate
hierarchical self-organization in the soup in terms of complexity. We found
that replicating processes evolve the strategy of successively building higher
levels of organization by autocatalysis. Moreover, this is facilitated by local
components that have low structural complexity, but high generality. In effect,
the finitary process soup spontaneously evolves a selection pressure that
favors such components. In light of the finitary process soup's generality,
these results suggest a fundamental law of hierarchical systems: global
complexity requires local simplicity.
Comment: 7 pages, 10 figures;
http://cse.ucdavis.edu/~cmg/compmech/pubs/pefps.ht
Different numerical approaches for the analysis of a single screw expander
Positive displacement machines (e.g. scroll, twin screw, reciprocating, etc.) have proven suitable as expanders for organic Rankine cycle (ORC) applications, especially in the medium to low power range. However, in order to increase their performance, detailed simulation models are required to optimize the design and reduce the internal losses. In recent years, computational fluid dynamics (CFD) has been applied to the design and analysis of positive displacement machines (both compressors and expanders), with numerous challenges due to the dynamics of the expansion (or compression) process and the deforming working chambers. The majority of the studies reported in the literature focus on scroll, twin screw and reciprocating machines. Furthermore, the limitations of applying such methodologies directly to complex multi-rotor machines have been highlighted in the literature. In this paper, a single screw expander (SSE) is used as a benchmark to evaluate the applicability of different grid generation methodologies (dynamic remeshing and the Chimera overlapping-grid strategy) in terms of the computational resources required, the accuracy of the results and their limitations. Although low-order models have been applied to single screw machines, CFD analyses are still lacking due to the particular complexity of the machine's geometry and working principle. The calculations have been performed with air to reduce the complexity of the problem. The main results are twofold: (i) the assessment of a numerical strategy with respect to the most critical parameters of a dynamic mesh-based simulation, and (ii) a comparison of the pressure field and internal flow features obtained using the different numerical approaches.
Approximations of Algorithmic and Structural Complexity Validate Cognitive-behavioural Experimental Results
We apply methods for estimating the algorithmic complexity of sequences to
behavioural sequences of three landmark studies of animal behavior each of
increasing sophistication, including foraging communication by ants, flight
patterns of fruit flies, and tactical deception and competition strategies in
rodents. In each case, we demonstrate that approximations of Logical Depth and
Kolmogorov-Chaitin complexity capture and validate previously reported results,
in contrast to other measures such as Shannon Entropy, compression, or ad hoc measures.
Our method is practically useful when dealing with short sequences, such as
those often encountered in cognitive-behavioural research. Our analysis
supports and reveals non-random behavior (LD and K complexity) in flies even in
the absence of external stimuli, and confirms the "stochastic" behaviour of
transgenic rats when faced with a competitor they cannot defeat by counter-prediction. The
method constitutes a formal approach for testing hypotheses about the
mechanisms underlying animal behaviour.
Comment: 28 pages, 7 figures and 2 tables
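The contrast the abstract draws can be illustrated on toy sequences: a Shannon-entropy estimate cannot distinguish a strictly periodic binary sequence from an irregular one with the same symbol frequencies, while a compression-based complexity proxy can. This is a minimal sketch using zlib as the compressor; the paper itself uses Coding Theorem and Block Decomposition style approximations, not general-purpose compression.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(seq):
    """Empirical Shannon entropy of a sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_complexity(seq):
    """Crude upper bound on Kolmogorov complexity: the compressed length
    in bytes of the sequence's string encoding (zlib at maximum effort)."""
    return len(zlib.compress("".join(map(str, seq)).encode(), 9))

periodic = [0, 1] * 64                        # highly structured
rng = random.Random(0)                        # fixed seed for reproducibility
irregular = [rng.randint(0, 1) for _ in range(128)]

# Both sequences use 0 and 1 about equally often, so their entropies are
# close, but the periodic one compresses far better:
print(shannon_entropy(periodic), shannon_entropy(irregular))
print(compression_complexity(periodic), compression_complexity(irregular))
```

On short behavioural sequences the zlib proxy suffers from fixed header overhead, which is one reason the paper argues for algorithmic-probability-based estimators instead.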
Bounding Rationality by Discounting Time
Consider a game where Alice generates an integer and Bob wins if he can
factor that integer. Traditional game theory tells us that Bob will always win
this game even though in practice Alice will win given our usual assumptions
about the hardness of factoring.
We define a new notion of bounded rationality, where the payoffs of players
are discounted by the computation time they take to produce their actions. We
use this notion to give a direct correspondence between the existence of
equilibria where Alice has a winning strategy and the hardness of factoring.
Namely, under a natural assumption on the discount rates, there is an
equilibrium where Alice has a winning strategy iff there is a linear-time
samplable distribution with respect to which Factoring is hard on average.
We also give general results for discounted games over countable action
spaces, including showing that any game with bounded and computable payoffs has
an equilibrium in our model, even if each player is allowed a countable number
of actions. It follows, for example, that the Largest Integer game has an
equilibrium in our model though it has no Nash equilibria or epsilon-Nash
equilibria.
Comment: To appear in Proceedings of The First Symposium on Innovations in
Computer Science
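The discounting idea lends itself to a small numerical sketch. Below, a player's payoff is scaled geometrically by the number of computation steps taken, and Bob factors by trial division; the geometric discount form, the discount rate, and the step counts are illustrative assumptions, not the paper's exact model.

```python
def discounted_payoff(raw_payoff, steps, discount=0.99):
    """Payoff scaled down geometrically by computation time.
    Illustrative assumption; the paper's discounting scheme may differ."""
    return raw_payoff * (discount ** steps)

def trial_division_steps(n):
    """Count the trial divisions needed to find a factor of n
    (worst case about sqrt(n) steps)."""
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return steps
        d += 1
    return steps  # n is prime

# Easy instance: Bob factors 15 quickly, so his win is barely discounted.
easy = discounted_payoff(1.0, trial_division_steps(15))

# Hard instance (a semiprime with two large prime factors, chosen here
# purely for illustration): the win is discounted to essentially nothing,
# so Bob would prefer a fast action with a modest payoff.
hard = discounted_payoff(1.0, trial_division_steps(10007 * 10009))

print(easy, hard)
```

This captures the abstract's point: with time-discounted payoffs, "Bob always wins" stops being an equilibrium prediction once the computation needed to win is expensive relative to the discount rate.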
A Periodicity Metric for Assessing Maintenance Strategies
Organised by: Cranfield University
The maintenance policy in manufacturing systems is devised to reset the machines' functionality
in an economical fashion in order to keep the products' quality within acceptable levels. Therefore,
there is a need for a metric to evaluate and quantify function resetting due to the adopted
maintenance policy. A novel metric for measuring the functional periodicity has been developed
using complexity theory. It is based on the rate and extent of function resetting. It can be used
as an important criterion for comparing the different maintenance policy alternatives. An industrial
example is used to illustrate the application of the new metric.
Mori Seiki – The Machine Tool Company; BAE Systems; S4T – Support Service Solutions: Strategy and Transition
Comparison between the two definitions of AI
Two different definitions of the Artificial Intelligence concept have been
proposed in papers [1] and [2]. The first definition is informal. It says that
any program that is cleverer than a human being is acknowledged as Artificial
Intelligence. The second definition is formal because it avoids reference to
the concept of human being. The readers of papers [1] and [2] might be left
with the impression that both definitions are equivalent and the definition in
[2] is simply a formal version of that in [1]. This paper will compare both
definitions of Artificial Intelligence and, hopefully, will bring a better
understanding of the concept.
Comment: added four new sections
Scheduling reentrant jobs on parallel machines with a remote server
This paper explores a specific combinatorial problem relating to re-entrant jobs on parallel primary machines, with a remote server machine. A middle operation is required by each job on the server before it returns to its primary processing machine. The problem is inspired by the logistics of a semi-automated microbiology laboratory. The testing programme in the laboratory corresponds roughly to a hybrid flowshop, whose bottleneck stage is the subject of study. We demonstrate the NP-hard nature of the problem and establish various structural properties. A heuristic is developed and tested on randomly generated benchmark data. Results indicate solutions reliably within 1.5% of optimum. We also provide a greedy 2-approximation algorithm. Tests on real-life data from the microbiology laboratory indicate a 20% saving relative to current practice, more than can currently be achieved by staffing the primary machines with three people instead of two.
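The machine-server-machine structure the abstract describes can be made concrete with a minimal greedy dispatcher: each job runs its first operation on the earliest-free primary machine, queues for the single remote server, then returns to the same primary machine. The processing times and the dispatch rule here are hypothetical illustrations of the problem structure, not the heuristic or the 2-approximation from the paper.

```python
def greedy_schedule(jobs, n_machines=2):
    """Greedy sketch of the re-entrant problem with a remote server.
    jobs: list of (p1, s, p2) processing times, where p1 and p2 run on
    the job's primary machine and s runs on the single server.
    Jobs are dispatched in list order; returns the resulting makespan."""
    machine_free = [0] * n_machines   # next free time of each primary machine
    server_free = 0                   # next free time of the remote server
    makespan = 0
    for p1, s, p2 in jobs:
        # first operation: earliest-free primary machine
        m = min(range(n_machines), key=lambda i: machine_free[i])
        t1_end = machine_free[m] + p1
        machine_free[m] = t1_end
        # middle operation: wait for both the job and the server
        s_end = max(t1_end, server_free) + s
        server_free = s_end
        # final operation: back on the same primary machine
        t2_end = max(s_end, machine_free[m]) + p2
        machine_free[m] = t2_end
        makespan = max(makespan, t2_end)
    return makespan

print(greedy_schedule([(2, 1, 2)]))           # one job, no contention
print(greedy_schedule([(2, 1, 2), (2, 1, 2)]))  # server becomes the bottleneck
```

Even this toy version exhibits the paper's key feature: with two primary machines, identical jobs collide at the server, so the bottleneck stage, not the primary machines, governs the makespan.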