
    Effects of Water Stress on Seed Production in Ruzi Grass (Brachiaria ruziziensis Germain and Everard)

    Water stress applied at different stages of reproductive development affected seed yield in Ruzi grass in different ways. Under mild water stress, the earlier in reproductive development the stress was applied (before ear emergence), the faster the plants recovered and the less the ultimate damage to inflorescence structure and seed set, compared with water stress occurring at the later stages, after the inflorescences had emerged. In contrast, severe water stress before ear emergence severely damaged both inflorescence numbers and seed quality. Permanent damage to the reproductive structures resulted in deformed inflorescences. Moreover, basal vegetative tillers were stunted and were capable of only limited regrowth after re-watering.

    The Little-Hopfield model on a Random Graph

    We study the Hopfield model on a random graph in scaling regimes where the average number of connections per neuron is a finite number and where the spin dynamics is governed by a synchronous execution of the microscopic update rule (Little-Hopfield model). We solve this model within replica symmetry and, using bifurcation analysis, we prove that the spin-glass/paramagnetic and the retrieval/paramagnetic transition lines of our phase diagram are identical to those of sequential dynamics. The first-order retrieval/spin-glass transition line follows by direct evaluation of our observables using population dynamics. Within the accuracy of numerical precision, and for sufficiently small values of the connectivity parameter, we find that this line coincides with the corresponding sequential one. Comparison with simulation experiments shows excellent agreement. Comment: 14 pages, 4 figures
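
    The synchronous update rule that defines the Little-Hopfield model, restricted to a sparse random graph with a finite mean number c of connections per neuron, can be illustrated with a minimal Python sketch. It is not taken from the paper; the system size, pattern number, connectivity value and the 1/c normalisation of the Hebbian couplings are all illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        N, P, c = 2000, 3, 10                         # neurons, stored patterns, mean connections per neuron

        # random +-1 patterns and a sparse symmetric random graph with about c links per neuron
        xi = rng.choice([-1, 1], size=(P, N))
        A = np.triu(rng.random((N, N)) < c / N, 1).astype(float)
        A = A + A.T                                   # symmetric adjacency, no self-couplings

        # Hebbian couplings restricted to the existing links
        J = A * (xi.T @ xi) / c

        # start close to pattern 0 and iterate the synchronous (parallel) update rule
        S = np.where(rng.random(N) < 0.85, xi[0], -xi[0])
        for t in range(20):
            S = np.sign(J @ S + 1e-12)                # every spin updated simultaneously
            print(t, round(float(xi[0] @ S) / N, 3))  # overlap with the retrieved pattern

    The parallel update of all spins from the same configuration is what distinguishes the Little-Hopfield (synchronous) dynamics from the sequential dynamics with which the abstract compares it.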

    Phase transitions in optimal unsupervised learning

    We determine the optimal performance of learning the orientation of the symmetry axis of a set of P = alpha N points that are uniformly distributed in all directions but one on the N-dimensional sphere. The components along the symmetry-breaking direction, given by a unit vector B, are sampled from a mixture of two Gaussians of variable separation and width. The typical optimal performance is measured through the overlap Ropt = B·J*, where J* is the optimal guess of the symmetry-breaking direction. Within this general scenario, the learning curves Ropt(alpha) may present first-order transitions if the clusters are narrow enough. Close to these transitions, high-performance states can be obtained through the minimization of the corresponding optimal potential, although these solutions are metastable, and therefore not learnable, within the usual Bayesian scenario. Comment: 9 pages, 8 figures, submitted to PRE. This new version of the paper contains one new section, Bayesian versus optimal solutions, where we explain in detail the results supporting our claim that Bayesian learning may not be optimal. Figure 4 of the first submission was difficult to understand; we have replaced it with two new figures (Figs. 4 and 5 in this new version) containing more detail.
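
    The data model described above can be made concrete with a short Python sketch: P = alpha*N points whose component along a hidden unit vector B is drawn from a mixture of two Gaussians, with everything else isotropic. The separation and width values, the Gaussian stand-in for the spherical constraint, and the naive PCA-style estimate of the direction are illustrative assumptions; the paper's optimal estimator J* is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        N, alpha = 200, 5.0
        P = int(alpha * N)
        rho, sigma = 1.0, 0.5                         # illustrative cluster separation and width

        B = rng.normal(size=N)
        B /= np.linalg.norm(B)                        # hidden symmetry-breaking direction

        # component along B: mixture of two Gaussians at +-rho; all other directions isotropic
        signs = rng.choice([-1.0, 1.0], size=P)
        along = rng.normal(loc=signs * rho, scale=sigma)
        perp = rng.normal(size=(P, N))
        perp -= np.outer(perp @ B, B)                 # remove any component along B
        X = np.outer(along, B) + perp

        # naive estimate of the direction: leading eigenvector of the sample covariance
        eigvals, eigvecs = np.linalg.eigh(X.T @ X / P)
        J = eigvecs[:, -1]
        R = abs(B @ J)                                # overlap with the true direction
        print("overlap R =", round(float(R), 3))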

    Double-Blind Controlled Assessment of the Effect of Intra-Articular Hydrocortisone and Urokinase in Rheumatoid Arthritis

    The merits of intra-articular urokinase in the treatment of rheumatoid arthritis are discussed. The results of a double-blind controlled study of its use in the knee joints following intra-articular hydrocortisone are presented.

    A Hebbian approach to complex network generation

    Through a redefinition of patterns in a Hopfield-like model, we introduce and develop an approach to model discrete systems made up of many interacting components with inner degrees of freedom. Our approach clarifies the intrinsic connection between the kind of interactions among components and the emergent topology describing the system itself; it also allows us to address effectively the statistical mechanics on the resulting networks. Indeed, a wide class of analytically treatable, weighted random graphs with a tunable level of correlation can be recovered and controlled. We focus in particular on the case of imitative couplings among components endowed with similar patterns (i.e. attributes), which, as we show, naturally and without any a priori assumption gives rise to small-world effects. We also solve the thermodynamics (at the replica-symmetric level) by extending the double stochastic stability technique: the free energy, self-consistency relations and a fluctuation analysis for a picture of criticality are obtained.
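
    As a rough illustration of the general idea, and not of the paper's exact construction, the following Python sketch assigns each component a sparse binary attribute pattern and defines Hebbian-like link weights from pattern overlaps, so that components with similar attributes are coupled more strongly; the sizes and the attribute density are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(2)
        N, L, a = 500, 20, 0.2                        # components, attributes, attribute density

        # xi[i, mu] = 1 if component i carries attribute mu (sparse binary patterns)
        xi = (rng.random((N, L)) < a).astype(int)

        # Hebbian-like weights: components sharing more attributes are linked more strongly
        W = xi @ xi.T
        np.fill_diagonal(W, 0)

        degree = (W > 0).sum(axis=1)
        print("mean degree:", float(degree.mean()),
              "mean positive weight:", round(float(W[W > 0].mean()), 2))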

    Statistical Mechanics of Soft Margin Classifiers

    We study the typical learning properties of the recently introduced Soft Margin Classifiers (SMCs), learning realizable and unrealizable tasks, with the tools of Statistical Mechanics. We derive analytically the behaviour of the learning curves in the regime of very large training sets. We obtain exponential and power laws for the decay of the generalization error towards the asymptotic value, depending on the task and on general characteristics of the distribution of stabilities of the patterns to be learned. The optimal learning curves of the SMCs, which give the minimal generalization error, are obtained by tuning the coefficient controlling the trade-off between the error and the regularization terms in the cost function. If the task is realizable by the SMC, the optimal performance is better than that of a hard-margin Support Vector Machine and is very close to that of a Bayesian classifier. Comment: 26 pages, 12 figures, submitted to Physical Review
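
    To make the cost function concrete, here is a Python sketch of a generic soft-margin (hinge-loss) classifier trained by plain subgradient descent; the teacher-generated data, the label noise, the regularization strength lam (which stands in, inversely, for the trade-off coefficient mentioned above) and the angle-based estimate of the generalization error are illustrative assumptions rather than the paper's setup.

        import numpy as np

        rng = np.random.default_rng(3)
        N, P = 50, 400                                # input dimension, number of examples

        # teacher-generated data with some label noise (an unrealizable task)
        T = rng.normal(size=N)
        T /= np.linalg.norm(T)
        X = rng.normal(size=(P, N))
        y = np.sign(X @ T + 0.3 * rng.normal(size=P))

        # soft-margin cost per example: (lam/2)*|w|^2 + mean_i max(0, 1 - y_i w.x_i);
        # lam plays the (inverse) role of the error/regularization trade-off coefficient
        lam, lr = 0.1, 0.1
        w = np.zeros(N)
        for epoch in range(500):
            margins = y * (X @ w)
            violators = (margins < 1).astype(float)
            grad = lam * w - (violators * y) @ X / P
            w -= lr * grad

        train_err = float(np.mean(np.sign(X @ w) != y))
        R = w @ T / np.linalg.norm(w)
        gen_err = float(np.arccos(np.clip(R, -1, 1)) / np.pi)   # angle-based estimate vs. the teacher
        print("training error:", round(train_err, 3),
              "estimated generalization error:", round(gen_err, 3))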

    Finite Connectivity Attractor Neural Networks

    We study a family of diluted attractor neural networks with a finite average number of (symmetric) connections per neuron. As in finite-connectivity spin glasses, their equilibrium properties are described by order-parameter functions, for which we derive an integral equation in the replica-symmetric (RS) approximation. A bifurcation analysis of this equation reveals the locations of the paramagnetic-to-recall and paramagnetic-to-spin-glass transition lines in the phase diagram. The line separating the retrieval phase from the spin-glass phase is calculated at zero temperature. All phase transitions are found to be continuous. Comment: 17 pages, 4 figures
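
    The replica-symmetric order-parameter function of a finite-connectivity model is usually evaluated numerically by population dynamics. The Python sketch below shows that scheme for a generic diluted model with +-J bonds on a Poissonian graph, used here only as a stand-in; the Hebbian bond statistics of the attractor networks studied in the paper are more involved, and all parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)
        M, c, beta, J0 = 10000, 3.0, 2.0, 1.0         # population size, mean connectivity, inverse T, bond scale

        # population dynamics for the RS order-parameter function (distribution of cavity fields)
        h = rng.normal(size=M)
        for sweep in range(50):
            new = np.zeros(M)
            for i in range(M):
                k = rng.poisson(c)                    # number of incoming links, Poissonian with mean c
                if k == 0:
                    continue                          # isolated site: zero cavity field
                J = J0 * rng.choice([-1.0, 1.0], size=k)      # +-J bonds (stand-in disorder)
                hs = h[rng.integers(0, M, size=k)]            # fields drawn at random from the population
                new[i] = np.sum(np.arctanh(np.tanh(beta * J) * np.tanh(beta * hs))) / beta
            h = new

        # a crude scalar summary of the order-parameter function
        print("mean |cavity field| =", round(float(np.mean(np.abs(h))), 3))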

    Replicated Transfer Matrix Analysis of Ising Spin Models on `Small World' Lattices

    We calculate equilibrium solutions for Ising spin models on `small world' lattices, which are constructed by superimposing random, sparse Poissonian graphs with finite average connectivity c onto a one-dimensional ring. The nearest-neighbour bonds along the ring are ferromagnetic, whereas those corresponding to the Poissonian graph are allowed to be random. Our models thus generally contain quenched connectivity and bond disorder. Within the replica formalism, calculating the disorder-averaged free energy requires the diagonalization of replicated transfer matrices. In addition to developing the general replica-symmetric theory, we derive phase diagrams and calculate effective field distributions for two specific cases: that of uniform sparse long-range bonds (i.e. `small world' magnets), and that of (+J/-J) random sparse long-range bonds (i.e. `small world' spin glasses). Comment: 22 pages, LaTeX, IOP macros, eps figures
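
    The lattice construction is easy to make explicit: a ferromagnetic ring plus a sparse Poissonian graph of average connectivity c whose long-range bonds are either uniform or random +-J. The Python sketch below builds such couplings and runs a few Metropolis sweeps purely to show how they enter the dynamics; the system size, temperature and bond values are illustrative, and the paper itself works with the replicated transfer matrix rather than simulation.

        import numpy as np

        rng = np.random.default_rng(5)
        N, c, J_ring, beta = 1000, 2.0, 1.0, 1.0      # sites, mean long-range connectivity, ring bond, inverse T
        spin_glass = True                             # True: random +-J long-range bonds, False: uniform +J

        # sparse Poissonian long-range graph superimposed on a ferromagnetic ring
        mask = np.triu(rng.random((N, N)) < c / N, 1).astype(float)
        signs = rng.choice([-1.0, 1.0], size=(N, N)) if spin_glass else np.ones((N, N))
        J = mask * signs
        J = J + J.T
        for i in range(N):                            # nearest-neighbour ferromagnetic ring bonds
            J[i, (i + 1) % N] += J_ring
            J[(i + 1) % N, i] += J_ring

        # a few Metropolis sweeps, only to show how the couplings enter the dynamics
        S = rng.choice([-1.0, 1.0], size=N)
        for sweep in range(50):
            for i in rng.permutation(N):
                dE = 2.0 * S[i] * (J[i] @ S)
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    S[i] = -S[i]
        print("magnetisation:", round(float(abs(S.mean())), 3))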

    Retarded Learning: Rigorous Results from Statistical Mechanics

    We study learning of probability distributions characterized by an unknown symmetry direction. Based on an entropic performance measure and the variational method of statistical mechanics, we develop exact upper and lower bounds on the scaled critical number of examples below which learning of the direction is impossible. The asymptotic tightness of the bounds suggests an asymptotically optimal method for learning nonsmooth distributions. Comment: 8 pages, 1 figure

    The signal-to-noise analysis of the Little-Hopfield model revisited

    Using the generating functional analysis, an exact recursion relation is derived for the time evolution of the effective local field of the fully connected Little-Hopfield model. It is shown that, by leaving out the feedback correlations arising from earlier times in this effective dynamics, one precisely recovers the recursion relations usually employed in the signal-to-noise approach. The consequences of this approximation, as well as the physics behind it, are discussed. In particular, it is pointed out why the effects are hard to notice, especially for model parameters corresponding to retrieval. Numerical simulations confirm these findings. The signal-to-noise analysis is then extended to include all correlations, making it a full theory for the dynamics at the level of the generating functional analysis. The results are applied to the frequently employed extremely diluted (a)symmetric architectures and to sequence-processing networks. Comment: 26 pages, 3 figures
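
    The recursion relations "usually employed in the signal-to-noise approach" can be illustrated, in their crudest form, by treating the crosstalk with the alpha*N non-retrieved patterns as Gaussian noise of fixed variance alpha and dropping all feedback correlations from earlier times, which gives the overlap map m_{t+1} = erf(m_t / sqrt(2*alpha)). The Python sketch below simply iterates this map; it is a simplified stand-in, not the exact generating-functional recursion derived in the paper, and the initial overlap and alpha values are arbitrary.

        import math

        # crudest signal-to-noise map for the overlap m_t at storage ratio alpha:
        # Gaussian crosstalk of fixed variance alpha, no feedback correlations
        def overlap_trajectory(alpha, m0=0.3, steps=20):
            m, traj = m0, [m0]
            for _ in range(steps):
                m = math.erf(m / math.sqrt(2 * alpha))
                traj.append(m)
            return traj

        for alpha in (0.05, 0.1, 0.2):
            print("alpha =", alpha, " final overlap =", round(overlap_trajectory(alpha)[-1], 3))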