PAC learning with generalized samples and an application to stochastic geometry
S.R. Kulkarni ... [et al.] Includes bibliographical references (p. 16-17). Caption title. Research supported by the National Science Foundation (ECS-8552419), by the U.S. Army Research Office (DAAL01-86-K-0171), and by the Dept. of the Navy under an Air Force contract (F19628-90-C-0002).
Probably Approximately Correct Nash Equilibrium Learning
We consider a multi-agent noncooperative game in which the agents' objective
functions are affected by uncertainty. Following a data-driven paradigm, we
represent uncertainty by means of scenarios and seek a robust Nash equilibrium
solution. We treat the Nash equilibrium computation problem within the realm of
probably approximately correct (PAC) learning. Building upon recent
developments in scenario-based optimization, we accompany the computed Nash
equilibrium with a priori and a posteriori probabilistic robustness
certificates, providing confidence that the computed equilibrium remains
unaffected (in probabilistic terms) when a new uncertainty realization is
encountered. For a wide class of games, we also show that the so-called
compression set - a key concept in scenario-based optimization - can be
obtained directly as a byproduct of the proposed solution methodology.
Finally, we illustrate how to overcome differentiability issues, arising due to
the introduction of scenarios, and compute a Nash equilibrium solution in a
decentralized manner. We demonstrate the efficacy of the proposed approach on
an electric vehicle charging control problem.
Comment: Preprint submitted to IEEE Transactions on Automatic Control
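The scenario-based idea summarized above can be sketched in a few lines. The following toy example is not the paper's algorithm: the two-player quadratic game, its cost function, the scenario distribution, and all parameters are invented for illustration. Each agent best-responds against the worst case over sampled uncertainty scenarios, and a fresh batch of samples gives a purely empirical a posteriori robustness check.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-player game: agent i picks x[i] in [0, 1] and pays a
# quadratic cost coupled to the opponent's action and an uncertain
# parameter delta (the uncertainty realization).
def cost(i, x, delta):
    return (x[i] - 0.5 * x[1 - i] - delta) ** 2 + 0.1 * x[i] ** 2

# Data-driven paradigm: represent uncertainty by N sampled scenarios.
scenarios = rng.uniform(-0.2, 0.2, size=20)

GRID = np.linspace(0.0, 1.0, 201)

def robust_best_response(i, x):
    # Minimize the worst case of agent i's cost over all scenarios.
    best_xi, best_val = x[i], float("inf")
    for xi in GRID:
        trial = list(x)
        trial[i] = xi
        val = max(cost(i, trial, d) for d in scenarios)
        if val < best_val:
            best_xi, best_val = xi, val
    return best_xi

# Decentralized flavour: iterate best responses until the action
# profile stops changing (a robust equilibrium on the grid).
x = [0.5, 0.5]
for _ in range(100):
    x_new = [robust_best_response(i, x) for i in range(2)]
    if x_new == x:
        break
    x = x_new

# Empirical a posteriori check: how often does a fresh uncertainty
# realization push agent 0's cost above the certified worst case?
worst_case = max(cost(0, x, d) for d in scenarios)
fresh = rng.uniform(-0.2, 0.2, size=1000)
violation_rate = float(np.mean([cost(0, x, d) > worst_case for d in fresh]))
```

In the scenario approach proper, an a priori certificate bounds the violation probability in terms of the number of scenarios and the size of the compression set; the sketch above only estimates it empirically on held-out samples.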
Changes from Classical Statistics to Modern Statistics and Data Science
A coordinate system is a foundation for every quantitative science, for
engineering, and for medicine. Classical physics and statistics are based on
the Cartesian coordinate system, and classical probability and
hypothesis-testing theory can only be applied to Euclidean data. However,
modern real-world data arise from natural language processing, mathematical
formulas, social networks, transportation and sensor networks, computer
vision, automation, and biomedical measurements. The Euclidean assumption is
not appropriate for non-Euclidean data. This perspective addresses the urgent
need to overcome these fundamental limitations and encourages extending
classical probability theory and hypothesis testing, diffusion models, and
stochastic differential equations from Euclidean space to non-Euclidean
space. Artificial intelligence methods such as natural language processing,
computer vision, graph neural networks, manifold regression and inference
theory, manifold learning, compositional diffusion models for automatically
composing concepts, and tools for demystifying machine learning systems have
developed rapidly. Differential manifold theory is also a mathematical
foundation of deep learning and data science. We urgently need to shift the
paradigm for data analysis from classical Euclidean data analysis to both
Euclidean and non-Euclidean data analysis, and to develop ever more
innovative methods for describing, estimating, and inferring the
non-Euclidean geometries of modern real-world datasets. A general framework
for the integrated analysis of both Euclidean and non-Euclidean data,
together with composite AI, decision intelligence, and edge AI, provides
powerful innovative ideas and strategies for fundamentally advancing AI. We
expect to marry statistics with AI, develop a unified theory of modern
statistics, and drive the next generation of AI and data science.
Comment: 37 pages
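A minimal sketch of why Euclidean statistics can mislead on non-Euclidean data, using made-up numbers: averaging directions, i.e. points on the circle S^1, one of the simplest manifolds. The naive arithmetic mean of the angles lands on the opposite side of the circle from the data, while the intrinsic (circular) mean does not.

```python
import numpy as np

# Hypothetical directional data clustered around 0 degrees,
# straddling the 360/0 wraparound.
angles_deg = np.array([350.0, 355.0, 5.0, 10.0])

# Naive Euclidean mean of the raw numbers: 180 degrees, i.e. the
# direction opposite to where the data actually concentrate.
euclidean_mean = angles_deg.mean()

# Intrinsic mean on the circle: embed the angles as points on S^1,
# average the embedded points, and project back via arctan2.
theta = np.deg2rad(angles_deg)
circular_mean = np.rad2deg(
    np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
) % 360.0  # close to 0 degrees, matching the data
```

The Fréchet mean generalizes this construction to other Riemannian manifolds by minimizing the sum of squared geodesic distances to the sample points.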