207,210 research outputs found
Modern applications of machine learning in quantum sciences
In these Lecture Notes, we provide a comprehensive introduction to the most
recent advances in the application of machine learning methods in quantum
sciences. We cover the use of deep learning and kernel methods in supervised,
unsupervised, and reinforcement learning algorithms for phase classification,
representation of many-body quantum states, quantum feedback control, and
quantum circuits optimization. Moreover, we introduce and discuss more
specialized topics such as differentiable programming, generative models,
statistical approach to machine learning, and quantum machine learning.
Comment: 268 pages, 87 figures. Comments and feedback are very welcome.
Figures and tex files are available at
https://github.com/Shmoo137/Lecture-Note
Introduction to Machine Learning - Supplementary notes
Artelt A. Introduction to Machine Learning - Supplementary notes; 2019.
These supplementary notes are roughly connected to the lecture "Introduction
to Machine Learning". The aim of these notes is to help students gain a deep
understanding of the topics discussed in the lecture and exercises. However,
we do not claim or guarantee that all topics from the lecture and exercises
are covered in these notes (since there are minor changes each year).
Therefore, the existence of these notes is no excuse for not attending the
lecture and exercises. Furthermore, many topics are covered in far more
detail than is necessary for just passing the exam. However, we think that
these more in-depth explanations might be interesting and useful to curious
students who want to "dive deeper" into the material.
Machine Learning and Quantum Devices
These brief lecture notes cover the basics of neural networks and deep learning as well as their applications in the quantum domain, for physicists without prior knowledge. In the first part, we describe training using back-propagation, image classification, convolutional networks and autoencoders. The second part is about advanced techniques like reinforcement learning (for discovering control strategies), recurrent neural networks (for analyzing time traces), and Boltzmann machines (for learning probability distributions). In the third lecture, we discuss recent applications to quantum physics, with an emphasis on quantum information processing machines. Finally, the fourth lecture is devoted to the promise of using quantum effects to accelerate machine learning.
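The training-by-back-propagation basics mentioned in the first part can be sketched in a few lines of numpy. This is an illustrative toy (a two-layer network learning XOR), not an example taken from the notes themselves; the architecture, learning rate, and loss are our own choices.

```python
import numpy as np

# Toy back-propagation: a two-layer network trained on XOR (assumed setup).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer (8 units)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer
lr = 0.5

losses = []
for _ in range(2000):
    # Forward pass: tanh hidden layer, sigmoid output.
    h = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: gradients of the mean-squared error, chain rule.
    d_out = (out - y) * out * (1 - out) * (2 / len(X))
    dW2 = h.T @ d_out; db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ d_h; db1 = d_h.sum(0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The same forward/backward pattern underlies the convolutional networks and autoencoders the notes describe; only the layer structure changes.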
The Shallow and the Deep: A biased introduction to neural networks and old school machine learning
The Shallow and the Deep is a collection of lecture notes that offers an accessible introduction to neural networks and machine learning in general. However, it was clear from the beginning that these notes would not be able to cover this rapidly changing and growing field in its entirety. The focus lies on classical machine learning techniques, with a bias towards classification and regression. Other learning paradigms and many recent developments in, for instance, Deep Learning are not addressed or only briefly touched upon. Biehl argues that having a solid knowledge of the foundations of the field is essential, especially for anyone who wants to explore the world of machine learning with an ambition that goes beyond the application of some software package to some data set. Therefore, The Shallow and the Deep places emphasis on fundamental concepts and theoretical background. This also involves delving into the history and pre-history of neural networks, where the foundations for most of the recent developments were laid. These notes aim to demystify machine learning and neural networks without losing the appreciation for their impressive power and versatility.
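A minimal example of the "old school" regression techniques such notes emphasize is ordinary least squares, solved in closed form rather than by iterative training. The data below are synthetic and the setup is our own illustration, not drawn from the notes.

```python
import numpy as np

# Classical regression: ordinary least squares on synthetic data
# generated from the line y = 3x + 0.5 plus small Gaussian noise.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (50, 1))
y = 3.0 * X[:, 0] + 0.5 + rng.normal(0, 0.01, 50)

Xb = np.hstack([X, np.ones((50, 1))])        # append a bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)   # solves min ||Xb w - y||^2
slope, intercept = w                          # recovers ~3.0 and ~0.5
```

No gradient descent is needed here: the least-squares problem has a closed-form solution, which is precisely the kind of foundational fact the notes argue should precede deep learning.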
Comparing Deep and Machine Learning Approaches in Bioinformatics: A miRNA-Target Prediction Case Study
Giansanti, V., Castelli, M., Beretta, S., & Merelli, I. (2019). Comparing Deep and Machine Learning Approaches in Bioinformatics: A miRNA-Target Prediction Case Study. In V. V. Krzhizhanovskaya, M. H. Lees, P. M. A. Sloot, J. J. Dongarra, J. M. F. Rodrigues, P. J. S. Cardoso, J. Monteiro, ... R. Lam (Eds.), Computational Science – ICCS 2019: 19th International Conference, Faro, Portugal, June 12–14, 2019, Proceedings, Part III (pp. 31-44). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11538 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-22744-9_3
MicroRNAs (miRNAs) are small non-coding RNAs with a key role in post-transcriptional gene expression regulation, thanks to their ability to bind the target mRNA through the complementary base pairing mechanism. Given their role, it is important to identify their targets and, to this purpose, different tools have been proposed to solve this problem. However, their results can be very different, so the community is now moving toward the deployment of integration tools, which should be able to perform better than the single ones. As Machine and Deep Learning algorithms are now highly popular, we developed different classifiers from both areas to verify their ability to recognize possible miRNA-mRNA interactions and evaluated their performance, showing the potential and the limits that those algorithms have in this field. Here, we apply two deep learning classifiers and three different machine learning models to two different miRNA-mRNA datasets of predictions from 3 different tools: TargetScan, miRanda, and RNAhybrid. Although an experimental validation of the results is needed to better confirm the predictions, deep learning techniques achieved the best performance when the evaluation scores are taken into account.
High-Dimensional Non-Convex Landscapes and Gradient Descent Dynamics
In these lecture notes we present different methods and concepts developed in
statistical physics to analyze gradient descent dynamics in high-dimensional
non-convex landscapes. Our aim is to show how approaches developed in physics,
mainly statistical physics of disordered systems, can be used to tackle open
questions on high-dimensional dynamics in Machine Learning.
Comment: Lectures given by G. Biroli at the 2022 Les Houches Summer School
"Statistical Physics and Machine Learning".
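The basin-dependence that makes non-convex landscapes interesting already appears in one dimension. As an illustrative caricature (our own toy, not an example from the lectures), gradient descent on the double well f(x) = (x² − 1)² ends in whichever of the two minima x = ±1 the initialization selects.

```python
import numpy as np

# Gradient descent on the double-well landscape f(x) = (x^2 - 1)^2,
# a one-dimensional caricature of a non-convex loss surface.
def grad(x):
    return 4 * x * (x ** 2 - 1)   # f'(x)

def descend(x0, lr=0.05, steps=200):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)         # plain gradient-descent update
    return x

# Two initializations on opposite sides of the barrier at x = 0
# converge to the two different minima at x = -1 and x = +1.
minima = [descend(x0) for x0 in (-0.5, 0.5)]
```

In high dimensions the landscapes studied in the notes have exponentially many such basins and saddles, which is exactly where the statistical-physics tools become necessary.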
A Revised Publication Model for ECML PKDD
ECML PKDD is the main European conference on machine learning and data
mining. Since its foundation it implemented the publication model common in
computer science: there was one conference deadline; conference submissions
were reviewed by a program committee; papers were accepted with a low
acceptance rate. Proceedings were published in several Springer Lecture Notes
in Artificial Intelligence (LNAI) volumes, while selected papers were invited to special
issues of the Machine Learning and Data Mining and Knowledge Discovery
journals. In recent years, this model has however come under stress. Problems
include: reviews are of highly variable quality; the purpose of bringing the
community together is lost; reviewing workloads are high; the information
content of conferences and journals decreases; there is confusion among
scientists in interdisciplinary contexts. In this paper, we present a new
publication model, which will be adopted for the ECML PKDD 2013 conference, and
aims to solve some of the problems of the traditional model. The key feature of
this model is the creation of a journal track, which is open to submissions all
year long and allows for revision cycles.
Comment: 13 pages
Content Based Image Retrieval by Convolutional Neural Networks
Hamreras S., Benítez-Rochel R., Boucheham B., Molina-Cabello M.A., López-Rubio E. (2019) Content Based Image Retrieval by Convolutional Neural Networks. In: Ferrández Vicente J., Álvarez-Sánchez J., de la Paz López F., Toledo Moreo J., Adeli H. (eds) From Bioinspired Systems and Biomedical Applications to Machine Learning. IWINAC 2019. Lecture Notes in Computer Science, vol 11487. Springer.
In this paper, we present a Convolutional Neural Network (CNN) for feature extraction in Content-Based Image Retrieval (CBIR). The proposed CNN aims at reducing the semantic gap between low-level and high-level features, thus improving retrieval results. Our CNN is the result of a transfer learning technique using the pretrained AlexNet network. It learns how to extract representative features from a learning database and then uses this knowledge in query feature extraction. Experiments performed on the Wang (Corel 1K) database show a significant improvement in terms of precision over classic state-of-the-art approaches.
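Once CNN features have been extracted, the retrieval step of a CBIR pipeline reduces to ranking database images by similarity to the query vector. The sketch below illustrates that ranking logic only: the 4096-dimensional vectors are random stand-ins (with one planted near-duplicate of the query), not real AlexNet features, and cosine similarity is one common choice rather than necessarily the paper's.

```python
import numpy as np

# Hypothetical precomputed feature vectors for a 100-image database
# (in a real CBIR system these would come from a CNN such as AlexNet).
rng = np.random.default_rng(2)
db = rng.normal(size=(100, 4096))
query = db[42] + rng.normal(0, 0.01, 4096)   # query close to image 42

def retrieve(query, db, k=5):
    # Rank database images by cosine similarity to the query vector.
    q = query / np.linalg.norm(query)
    d = db / np.linalg.norm(db, axis=1, keepdims=True)
    sims = d @ q
    return np.argsort(-sims)[:k]             # indices of the top-k matches

top = retrieve(query, db)                    # top[0] is the near-duplicate, 42
```

Because the ranking operates on feature vectors rather than pixels, improving the features (e.g. by the paper's transfer learning) directly improves retrieval precision without changing this step.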