On finitely ambiguous Büchi automata
Unambiguous Büchi automata, i.e. Büchi automata allowing only one
accepting run per word, are a useful restriction of Büchi automata that is
well-suited for probabilistic model-checking. In this paper we propose a more
permissive variant, namely finitely ambiguous Büchi automata, a
generalisation where each word has at most k accepting runs, for some fixed
k. We adapt existing notions and results concerning finite and bounded
ambiguity of finite automata to the setting of ω-languages and present a
translation from arbitrary nondeterministic Büchi automata with n states to
finitely ambiguous automata, with explicit bounds in terms of n on both the
number of states and the number of accepting runs per word.
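For finite words, the degree of ambiguity of a nondeterministic automaton can be computed by dynamic programming over run counts. A minimal sketch, assuming a toy dict-based NFA encoding (all names and the example automaton are illustrative, not from the paper):

```python
from collections import defaultdict

def count_accepting_runs(word, initial, accepting, delta):
    """Count accepting runs of an NFA on a finite word by dynamic programming.

    delta: dict mapping (state, symbol) -> set of successor states.
    Returns the number of distinct accepting runs, i.e. the word's ambiguity.
    """
    # runs[q] = number of distinct runs reaching state q after the prefix so far
    runs = defaultdict(int)
    for q in initial:
        runs[q] = 1
    for symbol in word:
        nxt = defaultdict(int)
        for q, n in runs.items():
            for r in delta.get((q, symbol), ()):
                nxt[r] += n
        runs = nxt
    return sum(n for q, n in runs.items() if q in accepting)

# A 2-state NFA over {a}: state 0 loops on 'a' and may also move to the
# accepting state 1, which loops on 'a'. Each position at which a run
# switches gives a distinct accepting run, so a^k has ambiguity k.
delta = {(0, 'a'): {0, 1}, (1, 'a'): {1}}
print(count_accepting_runs('aaa', {0}, {1}, delta))  # 3
```

The same counting idea underlies finite-ambiguity arguments: the automaton above is finitely ambiguous on each word but not boundedly ambiguous, since the count grows with the word length.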
Migratory winter bag-net fishery in coastal waters of the Hooghly estuary
The migratory winter bag-net fishery is a typical feature of the coastal waters of the Hooghly
estuary: about 4,000 men with about 800 bag-nets migrated from different estuarine areas and
established fishing camps on different islands during 1984-85 and 1985-86. The three-and-a-half-month
seasonal fishery accounted for an average estimated fish yield of 17,872 t, forming about 71% of the
total fish yield from the estuary, as against 29% to 33% about 15 years ago. An average catch per
unit of effort of 152 kg was about 18 to 36 times that obtained in the upper and middle stretches and
about 3 times that obtained 15 years ago in the lower coastal waters. Harpodon nehereus, Trichiurus
spp., Pama pama, Setipinna spp. and different species of prawns dominated the catches. The bulk
of the catches is sun-dried and exported to marketing centres. The reasons for the tremendous
increase in the winter migratory bag-net catches are discussed.
Random Costs in Combinatorial Optimization
The random cost problem is the problem of finding the minimum in an
exponentially long list of random numbers. By definition, this problem cannot
be solved faster than by exhaustive search. It is shown that a classical
NP-hard optimization problem, number partitioning, is essentially equivalent to
the random cost problem. This explains the poor performance of heuristic
approaches to the number partitioning problem and allows us to calculate the
probability distributions of the optimum and sub-optimum costs.
Comment: 4 pages, RevTeX, 2 figures (eps), submitted to PR
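The equivalence can be made concrete by brute force: the 2^(n-1) partition costs of a random instance form exactly the kind of exponentially long list the abstract describes, and by the random-cost picture they behave statistically like independent random numbers. A minimal sketch (function name and instance size are illustrative):

```python
import itertools
import random

def partition_costs(numbers):
    """Enumerate all 2^(n-1) two-way partitions of `numbers` and return
    the sorted list of costs |sum(A) - sum(B)|."""
    n = len(numbers)
    costs = []
    # Fix the first element's side to avoid counting each partition twice.
    for signs in itertools.product((1, -1), repeat=n - 1):
        cost = abs(numbers[0] + sum(s * x for s, x in zip(signs, numbers[1:])))
        costs.append(cost)
    return sorted(costs)

random.seed(0)
nums = [random.randrange(1, 2 ** 20) for _ in range(16)]
costs = partition_costs(nums)
print(costs[:3])  # the optimum and the two best sub-optimal costs
```

With n numbers of 20-bit size, the low end of this sorted list is what heuristics compete against; the random-cost equivalence says no heuristic can systematically beat sampling this list.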
Analysis of the intraspinal calcium dynamics and its implications on the plasticity of spiking neurons
The influx of calcium ions into the dendritic spines through the
N-methyl-D-aspartate (NMDA) channels is believed to be the primary trigger for
various forms of synaptic plasticity. In this paper, the authors calculate
analytically the mean values of the calcium transients elicited by a spiking
neuron undergoing a simple model of ionic currents and back-propagating action
potentials. The relative variability of these transients, due to the stochastic
nature of synaptic transmission, is further considered using a simple Markov
model of NMDA receptors. One finds that both the mean value and the variability
depend on the timing between pre- and postsynaptic action potentials. These
results could have implications for the expected form of the synaptic-plasticity
curve and can form a basis for a unified theory of spike-timing-dependent and
rate-based plasticity.
Comment: 14 pages, 10 figures. A few changes in section IV and addition of a new figure.
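The paper's model is not reproduced here, but the qualitative point, a mean transient amplitude that depends on pre/post spike timing while the relative variability is set by stochastic release, can be sketched with a toy Monte-Carlo in which release is a Bernoulli event and the NMDA coincidence window decays exponentially. All parameters and names below are assumptions of this sketch, not the authors' values:

```python
import math
import random

def calcium_transient_stats(dt_ms, p_release=0.5, trials=10000, seed=1):
    """Toy Monte-Carlo of spike-timing-dependent calcium transients.

    NMDA-mediated influx requires both glutamate binding (stochastic
    presynaptic release, probability p_release) and depolarisation from
    the back-propagating action potential; their overlap is modelled as
    decaying with the pre/post interval dt_ms (tau is an assumed constant).
    Returns (mean, coefficient of variation) of the peak transient.
    """
    rng = random.Random(seed)
    tau = 50.0  # assumed NMDA coincidence-window decay constant, ms
    amplitudes = []
    for _ in range(trials):
        released = rng.random() < p_release
        amp = math.exp(-abs(dt_ms) / tau) if released else 0.0
        amplitudes.append(amp)
    mean = sum(amplitudes) / trials
    var = sum((a - mean) ** 2 for a in amplitudes) / trials
    cv = math.sqrt(var) / mean if mean > 0 else float('inf')
    return mean, cv

for dt in (10.0, 50.0):
    mean, cv = calcium_transient_stats(dt)
    print(f"dt = {dt:5.1f} ms: mean = {mean:.3f}, CV = {cv:.2f}")
```

In this caricature the mean amplitude falls off with the spike interval, while the coefficient of variation is pinned near sqrt((1-p)/p) by release stochasticity alone, mirroring the separation between mean and variability discussed in the abstract.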
Bounds on the Complexity of Halfspace Intersections when the Bounded Faces have Small Dimension
We study the combinatorial complexity of D-dimensional polyhedra defined as
the intersection of n halfspaces, with the property that the highest dimension
of any bounded face is much smaller than D. We show that, if d is the maximum
dimension of a bounded face, then the number of vertices of the polyhedron is
O(n^d) and the total number of bounded faces of the polyhedron is O(n^(d^2)). For
inputs in general position the number of bounded faces is O(n^d). For any fixed
d, we show how to compute the set of all vertices, how to determine the maximum
dimension of a bounded face of the polyhedron, and how to compute the set of
bounded faces in polynomial time, by solving a polynomial number of linear
programs.
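The vertex bound can be made concrete in the plane (D = 2): every vertex lies on the intersection of D boundary hyperplanes, so intersecting each pair of boundary lines and keeping the feasible points enumerates all vertices from a brute-force candidate set of size O(n^D). A small sketch using exact rational arithmetic (representation and names are illustrative):

```python
from fractions import Fraction
from itertools import combinations

def polygon_vertices(halfspaces):
    """Enumerate the vertices of {x : a.x <= b} in the plane by brute force:
    intersect each pair of boundary lines and keep the feasible points.
    halfspaces: list of (a1, a2, b) meaning a1*x + a2*y <= b."""
    vertices = set()
    for (a1, a2, b), (c1, c2, d) in combinations(halfspaces, 2):
        det = a1 * c2 - a2 * c1
        if det == 0:
            continue  # parallel boundary lines: no candidate vertex
        # Cramer's rule for the 2x2 system of boundary equations
        x = Fraction(b * c2 - a2 * d, det)
        y = Fraction(a1 * d - b * c1, det)
        if all(p * x + q * y <= r for p, q, r in halfspaces):
            vertices.add((x, y))
    return vertices

# Unit square: x >= 0, y >= 0, x <= 1, y <= 1
square = [(-1, 0, 0), (0, -1, 0), (1, 0, 1), (0, 1, 1)]
print(len(polygon_vertices(square)))  # 4
```

The polynomial-time algorithms in the paper replace the feasibility check with linear programs; this sketch only illustrates where the candidate vertices come from.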
Knowledge-based energy functions for computational studies of proteins
This chapter discusses theoretical framework and methods for developing
knowledge-based potential functions essential for protein structure prediction,
protein-protein interaction, and protein sequence design. We discuss in some
detail the Miyazawa-Jernigan contact statistical potential,
distance-dependent statistical potentials, as well as geometric statistical
potentials. We also describe a geometric model for developing both linear and
non-linear potential functions by optimization. Applications of knowledge-based
potential functions in protein-decoy discrimination, in protein-protein
interactions, and in protein design are then described. Several issues of
knowledge-based potential functions are finally discussed.
Comment: 57 pages, 6 figures. To be published in a book by Springer.
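At scoring time, a contact statistical potential of the Miyazawa-Jernigan type reduces to summing pairwise contact energies over residue pairs that fall within a distance cutoff. A minimal sketch with a placeholder energy table (the values below are not MJ's, and the cutoff is an assumption):

```python
import math

def contact_energy(coords, sequence, energy_table, cutoff=6.5):
    """Score a structure with a Miyazawa-Jernigan-style contact potential.

    coords: one (x, y, z) point per residue (e.g. side-chain centers);
    energy_table: dict mapping frozenset of residue types -> contact energy.
    Residues closer than `cutoff` angstroms, and at least two apart in
    sequence, count as a contact.
    """
    total = 0.0
    n = len(sequence)
    for i in range(n):
        for j in range(i + 2, n):  # skip trivially adjacent residues
            if math.dist(coords[i], coords[j]) < cutoff:
                total += energy_table[frozenset((sequence[i], sequence[j]))]
    return total

# Toy 4-residue chain over a hydrophobic/polar (H/P) alphabet with a
# placeholder energy table favouring H-H contacts.
table = {frozenset('H'): -1.0, frozenset('P'): 0.0, frozenset('HP'): 0.3}
coords = [(0, 0, 0), (4, 0, 0), (8, 0, 0), (4, 4, 0)]
print(contact_energy(coords, 'HHPH', table))
```

In a real knowledge-based potential the table entries are derived from contact statistics of known structures via an inverse-Boltzmann argument; here they are stand-ins to show the scoring loop.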
Analysis of the Karmarkar-Karp Differencing Algorithm
The Karmarkar-Karp differencing algorithm is the best known polynomial time
heuristic for the number partitioning problem, fundamental in both theoretical
computer science and statistical physics. We analyze the performance of the
differencing algorithm on random instances by mapping it to a nonlinear rate
equation. Our analysis reveals strong finite size effects that explain why the
precise asymptotics of the differencing solution is hard to establish by
simulations. The asymptotic series emerging from the rate equation satisfies
all known bounds on the Karmarkar-Karp algorithm and projects a scaling of the
form n^(-c ln n) for a constant c. Our calculations reveal subtle relations
between the algorithm and Fibonacci-like sequences, and we establish an
explicit identity to that effect.
Comment: 9 pages, 8 figures; minor changes.
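The differencing heuristic itself is short: repeatedly commit the two largest numbers to opposite subsets and replace them by their difference; the single number left at the end is the achieved partition difference. A standard sketch using a heap:

```python
import heapq

def karmarkar_karp(numbers):
    """Karmarkar-Karp differencing heuristic for number partitioning.

    Repeatedly pops the two largest numbers, commits them to opposite
    subsets, and pushes their difference back; the last remaining number
    is the difference achieved by the implied partition."""
    heap = [-x for x in numbers]  # max-heap via negation
    heapq.heapify(heap)
    while len(heap) > 1:
        a = -heapq.heappop(heap)
        b = -heapq.heappop(heap)
        heapq.heappush(heap, -(a - b))
    return -heap[0]

print(karmarkar_karp([8, 7, 6, 5, 4]))  # 2
```

Each step takes O(log n) heap operations, so the whole heuristic runs in O(n log n), which is what makes its n^(-c ln n)-type solution quality remarkable compared with the exponential cost of exhaustive search.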
Encoding temporal regularities and information copying in hippocampal circuits
Discriminating, extracting and encoding temporal regularities is a critical requirement in the brain, relevant to sensory-motor processing and learning. However, the cellular mechanisms responsible remain enigmatic; for example, whether such abilities require specific, elaborately organized neural networks or arise from more fundamental, inherent properties of neurons. Here, using multi-electrode array technology, and focusing on interval learning, we demonstrate that sparse reconstituted rat hippocampal neural circuits are intrinsically capable of encoding and storing sub-second-order time intervals on a timescale of over an hour, represented as changes in the spatio-temporal architecture of firing relationships among populations of neurons. This learning is accompanied by increases in mutual information and transfer entropy, formal measures related to information storage and flow. Moreover, temporal relationships derived from previously trained circuits can act as templates for copying intervals into untrained networks, suggesting the possibility of circuit-to-circuit information transfer. Our findings illustrate that dynamic encoding and stable copying of temporal relationships are fundamental properties of simple in vitro networks, with general significance for understanding elemental principles of information processing, storage and replication.
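For discretised spike trains, the mutual information mentioned above is commonly computed as the plug-in estimate over joint symbol counts. A minimal sketch, assuming binarised spike-count bins (names and example data are illustrative):

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate of the mutual information (in bits) between two
    discrete sequences, e.g. binarised spike-count bins of two neurons."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n  # joint probability of the symbol pair (a, b)
        # p_ab / (p_a * p_b), with marginals px[a]/n and py[b]/n
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

# Two perfectly coupled binary spike trains share 1 bit per bin;
# a constant train shares none.
a = [0, 1, 0, 1, 1, 0, 1, 0]
print(mutual_information(a, a))        # 1.0
print(mutual_information(a, [0] * 8))  # 0.0
```

Transfer entropy extends the same counting idea to time-lagged conditional distributions, which is what lets it speak to directed information flow rather than shared information alone.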
Homeostatic Scaling of Excitability in Recurrent Neural Networks
Neurons adjust their intrinsic excitability when experiencing a persistent change in synaptic drive. This process can prevent neural activity from moving into either a quiescent state or a saturated state in the face of ongoing plasticity, and is thought to promote stability of the network in which neurons reside. However, most neurons are embedded in recurrent networks, which require a delicate balance between excitation and inhibition to maintain network stability. This balance could be disrupted when neurons independently adjust their intrinsic excitability. Here, we study the functioning of activity-dependent homeostatic scaling of intrinsic excitability (HSE) in a recurrent neural network. Using both simulations of a recurrent network consisting of excitatory and inhibitory neurons that implement HSE, and a mean-field description of adapting excitatory and inhibitory populations, we show that the stability of such adapting networks critically depends on the relationship between the adaptation time scales of both neuron populations. In a stable adapting network, HSE can keep all neurons functioning within their dynamic range, while the network is undergoing several (patho)physiologically relevant types of plasticity, such as persistent changes in external drive, changes in connection strengths, or the loss of inhibitory cells from the network. However, HSE cannot prevent the unstable network dynamics that result when, due to such plasticity, recurrent excitation in the network becomes too strong compared to feedback inhibition. This suggests that keeping a neural network in a stable and functional state requires the coordination of distinct homeostatic mechanisms that operate not only by adjusting neural excitability, but also by controlling network connectivity.
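The mean-field picture can be sketched as two adapting rate populations whose excitability thresholds relax toward a target rate on separate timescales, with stability depending on the interplay of those timescales. Everything below (the rate function, weights, timescales, target) is an illustrative toy, not the paper's model:

```python
import math

def simulate_hse(tau_e, tau_i, steps=20000, dt=0.1, target=5.0):
    """Mean-field toy of homeostatic scaling of excitability (HSE):
    excitatory and inhibitory populations adapt their firing thresholds
    toward a target rate on timescales tau_e and tau_i.
    Returns the final (rE, rI) rates."""
    def f(x):  # sigmoidal rate function, saturating at 50 Hz
        return 50.0 / (1.0 + math.exp(-0.2 * x))

    w_ee, w_ei, w_ie, w_ii = 1.0, 1.5, 1.2, 0.5  # assumed coupling weights
    r_e = r_i = 1.0
    th_e = th_i = 0.0
    drive = 2.0  # constant external input
    for _ in range(steps):
        r_e += dt * (-r_e + f(w_ee * r_e - w_ei * r_i + drive - th_e))
        r_i += dt * (-r_i + f(w_ie * r_e - w_ii * r_i + drive - th_i))
        th_e += dt / tau_e * (r_e - target)  # slow homeostatic adaptation
        th_i += dt / tau_i * (r_i - target)
    return r_e, r_i

r_e, r_i = simulate_hse(tau_e=100.0, tau_i=50.0)
print(f"rE = {r_e:.2f} Hz, rI = {r_i:.2f} Hz")  # should settle near the target
```

With this parameter choice the slow threshold dynamics are stable and both populations settle at the target rate; exploring other ratios of tau_e to tau_i in the sketch is a way to see how the adaptation timescales enter the stability question the abstract raises.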