New Anisotropic Gauss-Bonnet Black Holes in Five Dimensions at the Critical Point
We obtain new vacuum static black hole solutions with anisotropic horizons in
Einstein-Gauss-Bonnet gravity with a negative cosmological constant in five
dimensions. The translational invariance along one direction on the
3-dimensional horizon cross section is broken. The Gauss-Bonnet coupling
{\alpha} is at the critical point where there is a single AdS vacuum. These
solutions do not take the form of a warped product, i.e. they lack a common
warping factor, and the metric contains two arbitrary functions: h(r) of the
radial coordinate r and H(y) of the horizon coordinate y, reflecting a
degeneracy in the metric. The nontrivial horizon and the degeneracy may be
closely related
to the critical value of {\alpha}. We describe the procedure for obtaining the
solutions and some of their properties, and also prove a uniqueness theorem for
the case in which there is a common warping factor for the remaining two
directions.
Comment: 8 pages, no figures. arXiv admin note: text overlap with
arXiv:2105.0848
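To make the contrast with a warped-product metric concrete, the two shapes can be compared schematically. This is an illustrative sketch only, under generic assumptions: the functions f, a and the normalization of the quadratic are not from the paper, and the actual solution may differ.

```latex
% Warped-product form: a single common factor a(r) multiplies the
% whole 3-dimensional horizon cross-section
ds^2 = -f(r)\,dt^2 + \frac{dr^2}{f(r)}
       + a(r)^2\left(dy^2 + dx_1^2 + dx_2^2\right)

% A non-warped-product shape of the kind described above: the y
% direction carries its own profile through h(r) and H(y), breaking
% translational invariance along y
ds^2 = -f(r)\,dt^2 + \frac{dr^2}{f(r)}
       + h(r)^2 H(y)^2\,dy^2 + r^2\left(dx_1^2 + dx_2^2\right)

% Criticality: in a schematic normalization, the effective AdS scale
% solves a quadratic
\tilde\alpha\,\Lambda_{\rm eff}^2 + \Lambda_{\rm eff} - \Lambda = 0 ,
% whose two branches merge into a single vacuum when the discriminant
% vanishes:
1 + 4\tilde\alpha\,\Lambda = 0
\quad\Longrightarrow\quad
\Lambda_{\rm eff} = -\frac{1}{2\tilde\alpha}
```

The point of the sketch is only the structural distinction: a common warping factor can be absorbed into one function of r, while the second form cannot.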
High-Dimensional Bayesian Optimization via Semi-Supervised Learning with Optimized Unlabeled Data Sampling
We introduce a novel semi-supervised learning approach, named Teacher-Student
Bayesian Optimization, which for the first time integrates the teacher-student
paradigm into BO to minimize expensive labeled-data queries. The framework
incorporates a teacher model, an unlabeled data sampler, and a
student model. The student is trained on unlabeled data locations generated by
the sampler, with pseudo labels predicted by the teacher. The interplay between
these three components implements a unique selective regularization to the
teacher in the form of student feedback. This scheme enables the teacher to
predict high-quality pseudo labels, enhancing the generalization of the GP
surrogate model in the search space. To fully exploit the framework, we
propose two optimized unlabeled data samplers to construct effective student
feedback that well aligns with the objective of Bayesian optimization.
Furthermore, we quantify and leverage the uncertainty of the teacher-student
model for the provision of reliable feedback to the teacher in the presence of
risky pseudo-label predictions. The method demonstrates significantly
improved sample efficiency in several global optimization tasks under tight
labeled-data budgets.
Comment: 15 pages
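The teacher-student pseudo-labeling loop described above can be sketched in a minimal form. This is an illustration of the general idea, not the paper's algorithm: the GP implementation, the variance threshold, and the toy objective are all assumptions.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # squared-exponential kernel between two 1-D point sets
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_fit_predict(X, y, Xq, noise=1e-6, ls=0.5):
    # exact GP regression: posterior mean and variance at query points Xq
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    Ks = rbf(Xq, X, ls)
    mean = Ks @ np.linalg.solve(K, y)
    # diag of Kqq - Ks K^{-1} Ks^T, with k(x, x) = 1 for this kernel
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 0.0)

rng = np.random.default_rng(0)
f = lambda x: np.sin(6 * x)          # hypothetical expensive objective

# a few expensive labeled queries: the teacher's training data
X_lab = np.array([0.1, 0.45, 0.9])
y_lab = f(X_lab)

# unlabeled sampler: cheap candidate locations, never evaluated on f
X_unl = rng.uniform(0.0, 1.0, size=15)

# teacher (GP) predicts pseudo-labels with uncertainty at unlabeled points
mu, var = gp_fit_predict(X_lab, y_lab, X_unl)

# uncertainty filtering: keep only confident pseudo-labels,
# discarding risky predictions before they reach the student
keep = var < 0.05
X_aug = np.concatenate([X_lab, X_unl[keep]])
y_aug = np.concatenate([y_lab, mu[keep]])

# student trained on labeled data plus confident pseudo-labeled data
grid = np.linspace(0.0, 1.0, 101)
student_mu, _ = gp_fit_predict(X_aug, y_aug, grid, noise=1e-3)
```

In a full BO loop the student's feedback would in turn regularize the teacher; here the sketch stops after one pseudo-labeling round to keep the data flow visible.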
Building Program Vector Representations for Deep Learning
Deep learning has made significant breakthroughs in various fields of
artificial intelligence. Advantages of deep learning include the ability to
capture highly complicated features, weak involvement of human engineering,
etc. However, it is still virtually impossible to use deep learning to analyze
programs, since deep architectures cannot be trained effectively with pure
backpropagation. In this pioneering paper, we propose the "coding criterion" to
build program vector representations, which are the premise of deep learning
for program analysis. Our representation-learning approach makes deep learning
feasible in this new field. We evaluate the learned vector representations both
qualitatively and quantitatively, and conclude, based on the experiments, that
the coding criterion is successful in building program representations. To
evaluate whether deep learning is beneficial for program
analysis, we feed the representations to deep neural networks, and achieve
higher accuracy in the program classification task than "shallow" methods, such
as logistic regression and the support vector machine. This result confirms the
feasibility of using deep learning to analyze programs and gives preliminary
evidence of its success in this new field. We believe deep learning will become
an outstanding technique for program analysis in the near future.
Comment: This paper was submitted to ICSE'1
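The pipeline in this abstract, mapping programs to vectors and feeding them to a classifier, can be illustrated with a deliberately simplified stand-in: a bag-of-AST-node-types vector in place of the learned "coding criterion" embeddings, and a hand-rolled logistic regression as the "shallow" baseline. The function names and the toy dataset are assumptions, not the paper's code.

```python
import ast
import numpy as np

# node types used as vector dimensions (a stand-in for learned embeddings)
NODE_TYPES = ["For", "While", "If", "BinOp", "Call",
              "Assign", "Compare", "Return"]

def program_vector(src):
    # count AST node types in the program, then L2-normalize
    counts = dict.fromkeys(NODE_TYPES, 0)
    for node in ast.walk(ast.parse(src)):
        name = type(node).__name__
        if name in counts:
            counts[name] += 1
    v = np.array([counts[t] for t in NODE_TYPES], dtype=float)
    n = np.linalg.norm(v)
    return v / n if n else v

# toy two-class task: loop-heavy programs vs straight-line arithmetic
loopy = ["for i in range(3):\n    s = i",
         "while x > 0:\n    x = x - 1"]
flat = ["y = 1 + 2 * 3",
        "z = (4 - 1) * 5"]
X = np.stack([program_vector(s) for s in loopy + flat])
y = np.array([1, 1, 0, 0])

# logistic regression by gradient descent: the "shallow" baseline
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid probabilities
    g = p - y                                 # logistic-loss gradient
    w -= 0.5 * X.T @ g / len(y)
    b -= 0.5 * g.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
```

In the paper's setting the vectors come from representation learning and the classifier is a deep network; the sketch only shows the shape of the comparison, program vectors in, class labels out.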