
    Prospects for Measuring Δg from Jets at HERA with Polarized Protons

    The measurement of the polarized gluon distribution function Δg(x) from photon-gluon fusion processes in electron-proton deep inelastic scattering producing two jets has been investigated. The study is based on the MEPJET simulation program. The size of the expected spin asymmetry and the corresponding statistical uncertainties for a possible measurement with polarized electron and proton beams at HERA have been estimated. The results show that the asymmetry can reach a few percent.
    Comment: 8 pages (Latex) plus 3 figures enclosed as a uuencoded postscript file. The complete paper, including figures, is also available via anonymous ftp at ftp://www-ttp.physik.uni-karlsruhe.de/ , or via www at http://www-ttp.physik.uni-karlsruhe.de/cgi-bin/preprints
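
    As a rough, self-contained illustration of how the statistical uncertainty of such a double-spin asymmetry scales with event counts and beam polarizations (this is not the MEPJET analysis; the luminosity, cross-section, and polarization values below are placeholder assumptions), consider the standard 1/(P_e P_p sqrt(N)) estimate:

```python
import math

def asymmetry_stat_uncertainty(n_events, pol_e, pol_p):
    """Statistical uncertainty of a measured double-spin asymmetry.

    The raw counting asymmetry carries an error of roughly 1/sqrt(N);
    dividing by the electron and proton beam polarizations undoes the
    dilution of the physics asymmetry in the measured one.
    """
    return 1.0 / (pol_e * pol_p * math.sqrt(n_events))

# Placeholder numbers, not values taken from the paper:
luminosity_pb = 200.0    # integrated luminosity in pb^-1 (assumed)
dijet_xsec_pb = 50.0     # photon-gluon-fusion dijet cross section in pb (assumed)
pol_e, pol_p = 0.7, 0.7  # assumed beam polarizations

n_events = luminosity_pb * dijet_xsec_pb
print(f"expected dijet events: {n_events:.0f}")
print(f"delta A (statistical): {asymmetry_stat_uncertainty(n_events, pol_e, pol_p):.4f}")
```

    Comparing this uncertainty with an asymmetry of a few percent shows how much luminosity and polarization a statistically significant measurement would require.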

    Evolutionary Theories and the Simple Simon Syndrome


    Data complexity measured by principal graphs

    How can one measure the complexity of a finite set of vectors embedded in a multidimensional space? This is a non-trivial question which can be approached in many different ways. Here we suggest a set of data complexity measures using universal approximators, principal cubic complexes. Principal cubic complexes generalise the notion of principal manifolds for datasets with non-trivial topologies. The type of a principal cubic complex is determined by its dimension and a grammar of elementary graph transformations; the simplest grammar produces principal trees. We introduce three natural types of data complexity: 1) geometric (deviation of the data's approximator from some "idealized" configuration, such as deviation from harmonicity); 2) structural (how many elements of a principal graph are needed to approximate the data); and 3) construction complexity (how many applications of elementary graph transformations are needed to construct the principal object starting from the simplest one). We compute these measures for several simulated and real-life data distributions and show them in "accuracy-complexity" plots, which help to optimize the accuracy/complexity ratio. We discuss various issues connected with measuring data complexity. Software for computing data complexity measures from principal cubic complexes is provided as well.
    Comment: Computers and Mathematics with Applications, in press
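
    A minimal sketch of the accuracy-complexity trade-off described above, using k-means centroids as a crude stand-in for the nodes of a principal graph (the principal cubic complexes and graph grammars of the paper are not reproduced here; the spiral dataset and node counts are illustrative assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Illustrative dataset: a noisy 2D spiral standing in for real data.
t = rng.uniform(0.5, 3.0 * np.pi, 500)
data = np.c_[t * np.cos(t), t * np.sin(t)] + rng.normal(scale=0.3, size=(500, 2))

node_counts = [2, 4, 8, 16, 32, 64]   # "structural complexity": number of nodes
errors = []                            # "accuracy": mean squared approximation error
for k in node_counts:
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(data)
    errors.append(km.inertia_ / len(data))

plt.plot(node_counts, errors, "o-")
plt.xlabel("number of nodes (structural complexity)")
plt.ylabel("mean squared error (accuracy)")
plt.title("accuracy-complexity plot (illustrative)")
plt.show()
```

    Plotting approximation error against the number of nodes mimics the "accuracy-complexity" plots used to pick a reasonable trade-off between fit quality and model size.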

    Artificial Neural Network Pruning to Extract Knowledge

    Artificial Neural Networks (NN) are widely used for solving complex problems, from medical diagnostics to face recognition. Despite notable successes, the main disadvantages of NN are also well known: the risk of overfitting, the lack of explainability (the inability to extract algorithms from a trained NN), and the high consumption of computing resources. Choosing an appropriate NN structure for each problem can help overcome these difficulties: a network that is too small cannot be trained successfully, while one that is too large gives unexplainable results and is prone to overfitting. Reducing the precision of NN parameters simplifies the implementation of the network, saves computing resources, and makes the NN's skills more transparent. This paper lists the basic NN simplification problems and the controlled pruning procedures that solve them. All the described pruning procedures can be implemented in one framework. In particular, the developed procedures find the optimal structure of the NN for each task, measure the influence of each input signal and each NN parameter, and provide a detailed verbal description of the algorithms and skills of the NN. The described methods are illustrated by a simple example: the generation of explicit algorithms for predicting the results of the US presidential election.
    Comment: IJCNN 202
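
    A minimal sketch of the general idea of controlled pruning, zeroing the least influential weights of a tiny model as long as accuracy stays within a tolerance (this is a generic magnitude-based illustration, not the framework of the paper; the synthetic data and the 1% tolerance are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first three of ten inputs are informative (an assumption).
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] - 0.8 * X[:, 2] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def accuracy(w, b):
    return np.mean((sigmoid(X @ w + b) > 0.5) == y)

# Train a one-neuron "network" (logistic regression) by gradient descent.
w, b = np.zeros(10), 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

base_acc = accuracy(w, b)
print("accuracy before pruning:", base_acc)

# Controlled pruning: zero the smallest-magnitude weights while the
# accuracy loss stays inside the assumed 1% tolerance.
w_pruned = w.copy()
for idx in np.argsort(np.abs(w)):  # least influential weights first
    trial = w_pruned.copy()
    trial[idx] = 0.0
    if accuracy(trial, b) >= base_acc - 0.01:
        w_pruned = trial

print("accuracy after pruning :", accuracy(w_pruned, b))
print("inputs kept            :", np.nonzero(w_pruned)[0])
```

    The inputs that survive pruning form a compact description of what the model actually uses, which is the natural starting point for writing down an explicit rule.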