
    A Probabilistic Linear Genetic Programming with Stochastic Context-Free Grammar for solving Symbolic Regression problems

    Traditional Linear Genetic Programming (LGP) algorithms rely only on the selection mechanism to guide the search. Genetic operators combine or mutate random portions of the individuals, without knowing whether the result will lead to a fitter individual. Probabilistic Model Building Genetic Programming (PMB-GP) methods were proposed to overcome this issue through a probability model that captures the structure of fit individuals and uses it to sample new individuals. This work proposes the use of LGP with a Stochastic Context-Free Grammar (SCFG) whose probability distribution is updated according to selected individuals. We propose a method for adapting the grammar to the linear representation of LGP. Tests performed with the proposed probabilistic method, and with two hybrid approaches, on several symbolic regression benchmark problems show that the results are statistically better than those obtained by traditional LGP.
    Comment: Genetic and Evolutionary Computation Conference (GECCO) 2017, Berlin, Germany
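The core idea of an SCFG with adaptive rule probabilities can be sketched as follows (a minimal illustration, not the paper's method; the grammar, the reinforcement update, and the learning rate `lr` are all hypothetical):

```python
import random

# Toy stochastic context-free grammar: each nonterminal maps to a list
# of (right-hand side, probability) rules.
GRAMMAR = {
    "expr": [(["expr", "+", "expr"], 0.25), (["expr", "*", "expr"], 0.25),
             (["x"], 0.25), (["1"], 0.25)],
}

def sample(symbol, used, depth=0):
    """Expand a symbol, recording which (nonterminal, rule index) pairs
    were used so that selection can later reinforce them."""
    if symbol not in GRAMMAR:
        return [symbol]
    rules = GRAMMAR[symbol]
    if depth > 3:
        # Force a terminal rule at max depth to keep expressions finite.
        idx = random.choice([i for i, (rhs, _) in enumerate(rules)
                             if all(s not in GRAMMAR for s in rhs)])
    else:
        r, acc = random.random(), 0.0
        for idx, (_, p) in enumerate(rules):
            acc += p
            if r <= acc:
                break
    used.append((symbol, idx))
    out = []
    for s in rules[idx][0]:
        out += sample(s, used, depth + 1)
    return out

def reinforce(used, lr=0.1):
    """Shift probability mass toward rules used by a selected (fit)
    individual, keeping each rule set normalized."""
    for symbol, idx in used:
        rules = GRAMMAR[symbol]
        probs = [(1 - lr) * p for _, p in rules]
        probs[idx] += lr
        GRAMMAR[symbol] = [(rhs, p) for (rhs, _), p in zip(rules, probs)]
```

Sampling new individuals from the updated grammar then biases the search toward structures found in previously selected individuals, which is the PMB-GP idea the abstract describes.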

    Soil nitrogen affects phosphorus recycling: foliar resorption and plant–soil feedbacks in a northern hardwood forest

    Previous studies have attempted to link foliar resorption of nitrogen and phosphorus to their respective availabilities in soil, with mixed results. Based on resource optimization theory, we hypothesized that the foliar resorption of one element could be driven by the availability of another element. We tested various measures of soil N and P as predictors of N and P resorption in six tree species in 18 plots across six stands at the Bartlett Experimental Forest, New Hampshire, USA. Phosphorus resorption efficiency (P < 0.01) and proficiency (P = 0.01) increased with soil N content to 30 cm depth, suggesting that trees conserve P based on the availability of soil N. Phosphorus resorption also increased with soil P content, which is difficult to explain based on single-element limitation, but follows from the correlation between soil N and soil P. The expected single-element relationships were evident only in the O horizon: P resorption was high where resin-available P was low in the Oe (P < 0.01 for efficiency, P < 0.001 for proficiency) and N resorption was high where potential N mineralization in the Oa was low (P < 0.01 for efficiency and P = 0.11 for proficiency). Since leaf litter is a principal source of N and P to the O horizon, low nutrient availability there could be a result rather than a cause of high resorption. The striking effect of soil N content on foliar P resorption is the first evidence of multiple-element control on nutrient resorption to be reported from an unmanipulated ecosystem.
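For reference, resorption efficiency and proficiency as used in this literature can be computed as follows (the concentrations are hypothetical, chosen only to illustrate the definitions):

```python
def resorption_efficiency(green_conc, litter_conc):
    """Percentage of a foliar nutrient withdrawn before leaf fall:
    (green - senesced) / green * 100 (standard definition)."""
    return 100.0 * (green_conc - litter_conc) / green_conc

# Hypothetical P concentrations in mg per g leaf dry mass:
p_green, p_litter = 1.2, 0.4
efficiency = resorption_efficiency(p_green, p_litter)   # ~66.7 %
# "Proficiency" is the senesced-leaf concentration itself (here
# 0.4 mg/g); lower litter P means more proficient resorption.
```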

    Recovery from disturbance requires resynchronization of ecosystem nutrient cycles

    Nitrogen (N) and phosphorus (P) are tightly cycled in most terrestrial ecosystems, with plant uptake more than 10 times higher than the rate of supply from deposition and weathering. This near-total dependence on recycled nutrients and the stoichiometric constraints on resource use by plants and microbes mean that the two cycles have to be synchronized such that the N:P ratios of plant uptake, litterfall, and net mineralization are nearly the same. Disturbance can disrupt this synchronization if there is a disproportionate loss of one nutrient relative to the other. We model the resynchronization of N and P cycles following harvest of a northern hardwood forest. In our simulations, nutrient loss in the harvest is small relative to postharvest losses. The low N:P ratio of harvest residue results in a preferential release of P and retention of N. The P release is in excess of plant requirements, and P is lost from the active ecosystem cycle through secondary mineral formation and leaching early in succession. Because external P inputs are small, the resynchronization of the N and P cycles later in succession is achieved by a commensurate loss of N. Through succession, the ecosystem undergoes alternating periods of N limitation, then P limitation, and eventually co-limitation as the two cycles resynchronize. However, our simulations indicate that the overall rate and extent of recovery are limited by P unless a mechanism exists either to prevent the P loss early in succession (e.g., P sequestration not stoichiometrically constrained by N) or to increase the P supply to the ecosystem later in succession (e.g., biologically enhanced weathering). Our model provides a heuristic perspective from which to assess the resynchronization among tightly cycled nutrients and the effect of that resynchronization on recovery of ecosystems from disturbance.
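The preferential release of P from low-N:P harvest residue can be illustrated with a toy mass balance (all numbers are hypothetical and are not taken from the paper's model):

```python
# Decomposing residue with an N:P ratio below plant demand releases
# more P than regrowth can take up; the excess is vulnerable to loss.
residue_n, residue_p = 100.0, 10.0   # kg/ha in harvest residue (N:P = 10)
uptake_np_ratio = 15.0               # N:P mass ratio of plant uptake

# Assume full mineralization of the residue; plant P uptake is then
# capped by the N supply and the fixed uptake stoichiometry.
p_taken_up = residue_n / uptake_np_ratio
p_excess = residue_p - p_taken_up    # P released in excess of demand
```

Any positive `p_excess` is available for leaching or secondary mineral formation, which is the early-succession P loss the abstract describes.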

    Absolute value measurement of ion-scale turbulence by two-dimensional phase contrast imaging in Large Helical Device

    Absolute value measurements of turbulence amplitude in magnetically confined high-temperature plasmas can effectively explain turbulence-driven transport characteristics and their role in plasma confinement. Two-dimensional phase contrast imaging (2D-PCI) is a technique to evaluate the space-time spectrum of ion-scale electron density fluctuations. However, absolute value measurement of the turbulence amplitude had not been conducted owing to the nonlinearity of the detector. In this study, the absolute measurement method proposed in a previous study is applied to turbulence measurements in the Large Helical Device. As a result, the localized turbulence amplitude at $n_e = 1.5\times 10^{19}\,\mathrm{m}^{-3}$ is approximately $3.5\times 10^{15}\,\mathrm{m}^{-3}$, which is 0.02% of the electron density. In addition, the evaluated poloidal wavenumber spectrum is almost consistent, within error, with the spectrum calculated using a nonlinear gyrokinetic simulation. To the best of our knowledge, this result is the first to quantitatively evaluate turbulence amplitudes measured by 2D-PCI and compare them with simulations.
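The quoted density fraction can be checked with one line of arithmetic:

```python
n_e = 1.5e19   # background electron density, m^-3
dn = 3.5e15    # absolute turbulence amplitude, m^-3
fraction = 100.0 * dn / n_e   # ~0.023 %, matching the quoted ~0.02 %
```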

    Correct quantum chemistry in a minimal basis from effective Hamiltonians

    We describe how to create ab initio effective Hamiltonians that qualitatively describe correct chemistry even when used with a minimal basis. The Hamiltonians are obtained by folding correlation down from a large parent basis into a small, or minimal, target basis, using the machinery of canonical transformations. We demonstrate that these effective Hamiltonians correctly capture a wide range of excited states in water, nitrogen, and ethylene, and describe ground- and excited-state bond breaking in nitrogen and the chromium dimer, all in small or minimal basis sets.

    Semantic distillation: a method for clustering objects by their contextual specificity

    Techniques for data mining, latent semantic analysis, contextual search of databases, etc. have long been developed by computer scientists working on information retrieval (IR). Experimental scientists, from all disciplines, having to analyse large collections of raw experimental data (astronomical, physical, biological, etc.) have developed powerful methods for their statistical analysis and for clustering, categorising, and classifying objects. Finally, physicists have developed a theory of quantum measurement, unifying the logical, algebraic, and probabilistic aspects of queries into a single formalism. The purpose of this paper is twofold: first, to show that when formulated at an abstract level, problems from IR, from statistical data analysis, and from physical measurement theories are very similar and hence can profitably be cross-fertilised; and, second, to propose a novel method of fuzzy hierarchical clustering, termed \textit{semantic distillation}, strongly inspired by the theory of quantum measurement, which we developed to analyse raw data coming from various types of experiments on DNA arrays. We illustrate the method by analysing DNA array experiments and clustering the genes of the array according to their specificity.
    Comment: Accepted for publication in Studies in Computational Intelligence, Springer-Verlag
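As background, the graded assignments that distinguish fuzzy clustering from hard clustering can be sketched with standard fuzzy c-means membership weights (a generic illustration, not the paper's semantic-distillation formalism):

```python
def fuzzy_memberships(point, centers, m=2.0):
    """Fuzzy c-means style weights: membership of `point` in each
    cluster, inversely related to distance; fuzzifier m > 1 controls
    how soft the assignment is."""
    d = [max(1e-12, abs(point - c)) for c in centers]
    inv = [dist ** (-2.0 / (m - 1.0)) for dist in d]
    total = sum(inv)
    return [v / total for v in inv]

u = fuzzy_memberships(0.2, centers=[0.0, 1.0])
# u sums to 1, and the nearer center (0.0) receives the larger weight
```

In the same spirit, a gene on a DNA array would receive a graded membership in every cluster rather than a single hard label.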

    Analysis of Bidirectional Associative Memory using SCSNA and Statistical Neurodynamics

    Bidirectional associative memory (BAM) is a kind of artificial neural network used to memorize and retrieve heterogeneous pattern pairs. Many efforts have been made to improve BAM from the viewpoint of computer applications, but few theoretical studies have been done. We investigated the theoretical characteristics of BAM using a framework of statistical-mechanical analysis. To investigate the equilibrium state of BAM, we applied self-consistent signal-to-noise analysis (SCSNA) and obtained macroscopic parameter equations and the relative capacity. Moreover, to investigate not only the equilibrium state but also the retrieval process of reaching the equilibrium state, we applied statistical neurodynamics to the update rule of BAM and obtained evolution equations for the macroscopic parameters. These evolution equations are consistent with the results of SCSNA in the equilibrium state.
    Comment: 13 pages, 4 figures
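For readers unfamiliar with the model, a minimal Hebbian BAM can be sketched as follows (an illustrative textbook construction, not the statistical-mechanical analysis of the paper):

```python
# Hebbian BAM on bipolar (+1/-1) patterns: W[i][j] = sum_mu x_mu[i]*y_mu[j].
# Retrieval alternates y <- sgn(W^T x) and x <- sgn(W y) until stable.
def train(pairs):
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for x, y in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] += x[i] * y[j]
    return W

def sgn(v, old):
    # Keep the previous state when the local field is exactly zero.
    return 1 if v > 0 else (-1 if v < 0 else old)

def recall(W, x, steps=5):
    n, m = len(W), len(W[0])
    y = [1] * m
    for _ in range(steps):
        y = [sgn(sum(W[i][j] * x[i] for i in range(n)), y[j]) for j in range(m)]
        x = [sgn(sum(W[i][j] * y[j] for j in range(m)), x[i]) for i in range(n)]
    return x, y

pairs = [([1, -1, 1, -1], [1, 1, -1]), ([-1, -1, 1, 1], [-1, 1, 1])]
W = train(pairs)
x_out, y_out = recall(W, [1, -1, 1, -1])   # retrieves the paired y
```

With correlated patterns or too many stored pairs, this retrieval degrades, which is exactly the capacity question the abstract's SCSNA analysis addresses.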