
The influence of oppositely classified examples on the generalization complexity of Boolean functions

By Martin Anthony and Leonardo Franco
Topics: QA76 Computer software
Publisher: IEEE
Year: 2006
DOI identifier: 10.1109/TNN.2006.872352
OAI identifier: oai:eprints.lse.ac.uk:14969
Provided by: LSE Research Online
