741 research outputs found

    An Introduction to Declarative Programming in CLIPS and PROLOG

    We provide a brief introduction to CLIPS—a declarative/logic programming language for implementing expert systems—and PROLOG—a declarative/logic programming language based on first-order predicate calculus. Unlike imperative languages, in which the programmer specifies how to compute a solution to a problem, in a declarative language the programmer specifies what they want to find, and the system uses a search strategy built into the language. We also briefly discuss applications of CLIPS and PROLOG.
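    The declarative idea the abstract describes can be sketched in a few lines of Python: state facts and a rule, and let a search procedure (playing the role of PROLOG's built-in resolution) find the answers. The family facts below are invented purely for illustration and do not come from the abstract.

    ```python
    # Facts, as a PROLOG program would state them:
    #   parent(tom, bob).  parent(bob, ann).
    facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

    def is_parent(x, y):
        return ("parent", x, y) in facts

    def grandparents():
        # Rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
        # We only state *what* a grandparent is; the exhaustive search below
        # stands in for the language's built-in search strategy.
        names = {n for (_, a, b) in facts for n in (a, b)}
        return {(x, z) for x in names for y in names for z in names
                if is_parent(x, y) and is_parent(y, z)}

    print(grandparents())  # → {('tom', 'ann')}
    ```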

    The Little-Hopfield model on a Random Graph

    We study the Hopfield model on a random graph in scaling regimes where the average number of connections per neuron is a finite number and where the spin dynamics is governed by a synchronous execution of the microscopic update rule (Little-Hopfield model). We solve this model within replica symmetry and, using bifurcation analysis, we prove that the spin-glass/paramagnetic and the retrieval/paramagnetic transition lines of our phase diagram are identical to those of sequential dynamics. The first-order retrieval/spin-glass transition line follows by direct evaluation of our observables using population dynamics. Within the accuracy of numerical precision and for sufficiently small values of the connectivity parameter we find that this line coincides with the corresponding sequential one. Comparison with simulation experiments shows excellent agreement. Comment: 14 pages, 4 figures
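    The synchronous (Little-Hopfield) update rule on a diluted random graph can be sketched as follows: a minimal toy simulation with Hebbian couplings for a single stored pattern, not the paper's replica analysis. All parameters (N, mean connectivity c, number of sweeps) are illustrative choices.

    ```python
    import random

    random.seed(0)
    N, c = 50, 4                                       # neurons, mean connectivity
    xi = [random.choice([-1, 1]) for _ in range(N)]    # one stored pattern

    # Sparse symmetric Hebbian couplings: each pair present with probability c/N,
    # so the average number of connections per neuron stays finite as N grows.
    J = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(i + 1, N):
            if random.random() < c / N:
                J[i][j] = J[j][i] = xi[i] * xi[j]

    def sync_step(s):
        # Little-Hopfield dynamics: all spins updated in parallel
        # from the same previous state.
        fields = [sum(J[i][j] * s[j] for j in range(N)) for i in range(N)]
        return [1 if h >= 0 else -1 for h in fields]

    s = [random.choice([-1, 1]) for _ in range(N)]
    for _ in range(10):
        s = sync_step(s)

    overlap = sum(a * b for a, b in zip(xi, s)) / N    # retrieval quality
    ```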

    Effects of Water Stress on Seed Production in Ruzi Grass (Brachiaria ruziziensis Germain and Everard)

    Water stress at different stages of reproductive development influenced seed yield in Ruzi grass differently. Under mild water stress, the earlier in reproductive development the stress was applied (before ear emergence), the faster the plants recovered and the less the ultimate damage to inflorescence structure and seed set, compared with the situation where water stress occurred during the later stages, after inflorescences had emerged. Conversely, severe water stress before ear emergence severely damaged both inflorescence numbers and seed quality. Permanent damage to the reproductive structures resulted in deformed inflorescences. Moreover, basal vegetative tillers were stunted and were capable of only limited regrowth after re-watering.

    Dynamics of on-line Hebbian learning with structurally unrealizable restricted training sets

    We present an exact solution for the dynamics of on-line Hebbian learning in neural networks with restricted and unrealizable training sets. In contrast to other studies on learning with restricted training sets, unrealizability is here caused by structural mismatch rather than data noise: the teacher machine is a perceptron with a reversed-wedge transfer function, while the student machine is a perceptron with a sigmoidal transfer function. We calculate the glassy dynamics of the macroscopic performance measures, training error and generalization error, and the (non-Gaussian) student field distribution. Our results, which find excellent confirmation in numerical simulations, provide a new benchmark test for general formalisms with which to study unrealizable learning processes with restricted training sets. Comment: 7 pages including 3 figures, using the IOP latex2e preprint class file
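    The structural mismatch described above can be sketched numerically: a teacher with a reversed-wedge output labels Gaussian examples, and a student vector follows the plain on-line Hebb rule. The wedge width a, system size N, and number of examples below are illustrative choices, not the paper's values, and the sketch ignores the student's sigmoidal output (which does not enter the Hebbian update itself).

    ```python
    import math
    import random

    random.seed(1)
    N, a, steps = 100, 0.5, 2000
    B = [random.gauss(0, 1) for _ in range(N)]   # teacher couplings
    J = [0.0] * N                                # student couplings

    def reversed_wedge(h):
        # sign(h (h - a)(h + a)): the usual sign output, flipped for |h| < a
        return 1.0 if h * (h - a) * (h + a) > 0 else -1.0

    for _ in range(steps):
        x = [random.gauss(0, 1) for _ in range(N)]
        h_B = sum(b * xi for b, xi in zip(B, x)) / math.sqrt(N)
        t = reversed_wedge(h_B)
        # On-line Hebb rule: add each teacher-labelled example to the student.
        J = [j + t * xi / math.sqrt(N) for j, xi in zip(J, x)]

    # Normalized teacher-student overlap, the usual performance measure.
    R = sum(j * b for j, b in zip(J, B)) / math.sqrt(
        sum(j * j for j in J) * sum(b * b for b in B))
    ```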

    Slowly evolving geometry in recurrent neural networks I: extreme dilution regime

    We study extremely diluted spin models of neural networks in which the connectivity evolves in time, although adiabatically slowly compared to the neurons, according to stochastic equations which on average aim to reduce frustration. The (fast) neurons and (slow) connectivity variables equilibrate separately, but at different temperatures. Our model is exactly solvable in equilibrium. We obtain phase diagrams upon making the condensed ansatz (i.e. recall of one pattern). These show that, as the connectivity temperature is lowered, the volume of the retrieval phase diverges and the fraction of mis-aligned spins is reduced. Still, one always retains a region in the retrieval phase where recall states other than the one corresponding to the `condensed' pattern are locally stable, so the associative memory character of our model is preserved. Comment: 18 pages, 6 figures

    Statistical Mechanics of Support Vector Networks

    Using methods of statistical physics, we investigate the generalization performance of support vector machines (SVMs), which have recently been introduced as a general alternative to neural networks. For nonlinear classification rules, the generalization error saturates on a plateau when the number of examples is too small to properly estimate the coefficients of the nonlinear part. When trained on simple rules, we find that SVMs overfit only weakly. The performance of SVMs is strongly enhanced when the distribution of the inputs has a gap in feature space. Comment: REVTeX, 4 pages, 2 figures, accepted by Phys. Rev. Lett. (typos corrected)
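    For readers who want a concrete reference point, the optimization underlying a linear SVM can be sketched as stochastic subgradient descent on the hinge loss. This is only a toy reminder of the learning machine being analyzed, not the paper's statistical-mechanics treatment; the separable rule, learning rate, and regularization strength are invented for illustration.

    ```python
    import random

    random.seed(2)
    teacher = [1.0, -1.0]                 # hypothetical linearly separable rule
    data = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(200)]
    labels = [1 if x[0] * teacher[0] + x[1] * teacher[1] > 0 else -1
              for x in data]

    w, eta, lam = [0.0, 0.0], 0.1, 0.01   # weights, step size, regularization
    for _ in range(50):                   # epochs
        for x, y in zip(data, labels):
            if y * (w[0] * x[0] + w[1] * x[1]) < 1:   # margin violated:
                # subgradient of hinge loss plus L2 penalty
                w = [wi + eta * (y * xi - lam * wi) for wi, xi in zip(w, x)]
            else:                                     # only shrink (regularize)
                w = [wi * (1 - eta * lam) for wi in w]

    errors = sum(((w[0] * x[0] + w[1] * x[1]) > 0) != (y > 0)
                 for x, y in zip(data, labels))
    train_error = errors / len(data)
    ```

    On separable data like this, the hinge term drives the margin toward 1 while the shrinkage term keeps the weight norm small, the same trade-off a max-margin formulation expresses as a constrained optimization.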

    Phase transitions in optimal unsupervised learning

    We determine the optimal performance of learning the orientation of the symmetry axis of a set of P = alpha N points that are uniformly distributed in all directions but one on the N-dimensional sphere. The components along the symmetry-breaking direction, given by a unit vector B, are sampled from a mixture of two Gaussians of variable separation and width. The typical optimal performance is measured through the overlap Ropt = B.J*, where J* is the optimal guess of the symmetry-breaking direction. Within this general scenario, the learning curves Ropt(alpha) may present first-order transitions if the clusters are narrow enough. Close to these transitions, high-performance states can be obtained through the minimization of the corresponding optimal potential, although these solutions are metastable, and therefore not learnable, within the usual Bayesian scenario. Comment: 9 pages, 8 figures, submitted to PRE. This new version of the paper contains one new section, Bayesian versus optimal solutions, where we explain in detail the results supporting our claim that Bayesian learning may not be optimal. Figure 4 of the first submission was difficult to understand; we replaced it by two new figures (Figs. 4 and 5 in this new version) containing more detail

    On the center of mass of Ising vectors

    We show that the center of mass of Ising vectors that obey some simple constraints is again an Ising vector. Comment: 8 pages, 3 figures, LaTeX; Claims in connection with disordered systems have been withdrawn; More detailed description of the simulations; Inset added to figure
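    A loose numerical illustration of the statement: take a few Ising (±1) vectors and form their component-wise center of mass. Under the hypothetical constraint used below (a majority of the vectors agree on every site, so no component averages to zero), binarizing the center by sign recovers an Ising vector; the paper's precise constraints differ.

    ```python
    # Three hypothetical Ising vectors with a strict majority sign per component.
    vectors = [
        [ 1, -1,  1,  1, -1],
        [ 1, -1, -1,  1, -1],
        [ 1, -1,  1,  1,  1],
    ]

    # Component-wise center of mass (arithmetic mean of the vectors).
    center = [sum(col) / len(vectors) for col in zip(*vectors)]

    # Every component is bounded away from zero, so its sign is well defined
    # and the binarized center is itself an Ising vector.
    ising_center = [1 if c > 0 else -1 for c in center]
    print(ising_center)  # → [1, -1, 1, 1, -1]
    ```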

    Generalizing with perceptrons in case of structured phase- and pattern-spaces

    We investigate the influence of different kinds of structure on the learning behaviour of a perceptron performing a classification task defined by a teacher rule. The underlying pattern distribution is permitted to have spatial correlations. The prior distribution for the teacher coupling vectors itself is assumed to be nonuniform. Thus classification tasks of quite different difficulty are included. As learning algorithms we discuss Hebbian learning, Gibbs learning, and Bayesian learning with different priors, using methods from statistics and the replica formalism. We find that the Hebb rule is quite sensitive to the structure of the actual learning problem, failing asymptotically in most cases. By contrast, the behaviour of the more sophisticated Gibbs and Bayes learning methods is influenced by the spatial correlations only in an intermediate regime of α, where α specifies the size of the training set. Concerning the Bayesian case we show how enhanced prior knowledge improves the performance. Comment: LaTeX, 32 pages with eps-figs, accepted by J Phys
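    The Hebb rule analyzed above can be sketched on the unstructured baseline of this setting: i.i.d. Gaussian patterns labelled by a teacher perceptron, with no spatial correlations or nonuniform prior. The sizes N and P (so alpha = P/N) are illustrative, not taken from the paper.

    ```python
    import math
    import random

    random.seed(3)
    N, P = 50, 500                                   # alpha = P / N = 10
    B = [random.gauss(0, 1) for _ in range(N)]       # teacher couplings

    J = [0.0] * N
    for _ in range(P):
        x = [random.gauss(0, 1) for _ in range(N)]
        y = 1 if sum(b * xi for b, xi in zip(B, x)) > 0 else -1
        # Hebb rule: simply accumulate each labelled pattern y * x.
        J = [j + y * xi for j, xi in zip(J, x)]

    # Normalized overlap of the student with the teacher direction;
    # it grows with alpha, the quantity the learning curves track.
    R = sum(j * b for j, b in zip(J, B)) / math.sqrt(
        sum(j * j for j in J) * sum(b * b for b in B))
    ```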

    Stemofoline ethyl acetate solvate

    Crystals of the title compound, C22H29NO5·C4H8O2 {systematic name: (2R,3R,5R,5aS,6R,8aR,9S)-(5Z)-5-[3-butyltetrahydro-6-methyl-2,5-methano-4,3,8a-[1]propanyl[3]ylidenefuro[3,2-f][1,4]oxazepin-7(5H)-ylidene]-4-methoxy-3-methylfuran-2(5H)-one ethyl acetate solvate}, were isolated from the root extracts of Stemona aphylla (Stemonaceae). The structure closely resembles those of previously reported stemofoline derivatives. Intermolecular contacts are observed between some C-bonded H atoms and nearby O atoms, perhaps indicating weak interactions which could influence the packing of species within the unit cell.