Bayesian Optimization Approach for Analog Circuit Synthesis Using Neural Network
Bayesian optimization with a Gaussian process as the surrogate model has been successfully applied to analog circuit synthesis. In the traditional Gaussian process regression model, the kernel functions are defined explicitly. The computational complexity of training is O(N^3), and the computational complexity of prediction is O(N^2), where N is the number of training data. The Gaussian process model can also be derived from a weight-space view, where the original data are mapped to a feature space, and the kernel function is defined as the inner product of nonlinear features. In this paper, we propose a Bayesian optimization approach for analog circuit synthesis using a neural network. We use a deep neural network to extract good feature representations, and then define a Gaussian process using the extracted features. A model-averaging method is applied to improve the quality of the uncertainty prediction. Compared to a Gaussian process model with explicitly defined kernel functions, the neural-network-based Gaussian process model can automatically learn a kernel function from data, which makes it possible to provide more accurate predictions and thus accelerate the follow-up optimization procedure. Also, the neural-network-based model has O(N) training time and constant prediction time. The efficiency of the proposed method has been verified on two real-world analog circuits.
Development of a Novel Media-independent Communication Theology for Accessing Local & Web-based Data: Case Study with Robotic Subsystems
Realizing media independence in today's communication systems remains largely an open problem. Information retrieval, mostly through the Internet, is becoming the most demanding feature of technological progress, and this web-based data access should ideally be in user-selective form. While blind-folded access of data through the World Wide Web is quite streamlined, the counterpart facet, namely seamless access to an information database pertaining to a specific end-device, e.g. robotic systems, is still in a formative stage. This paradigm of access, as well as systematic query-based retrieval of data related to the physical end-device, is crucial in designing Internet-based network control of that device in real time. Moreover, this control of the end-device is directly linked to the characteristics of three coupled metrics, namely 'multiple databases', 'multiple servers' and 'multiple inputs' (to each server). This triad, viz. database-input-server (DIS), plays a significant role in the overall performance of the system, the background details of which are still very sketchy in the global research community. This work addresses the technical issues associated with this theology, with specific reference to the formalism of a customized DIS considering real-time delay analysis. The present paper delineates the developmental paradigms of novel multi-input multi-output communication semantics for retrieving web-based information from physical devices, namely two representative robotic sub-systems, in a coherent and homogeneous mode. The developed protocol can be entrusted for use in real time in a completely user-friendly manner.
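The delay-analysis idea behind the DIS triad can be sketched as a toy simulation. Everything here is an assumption for illustration, not the paper's protocol: `query_server`, the database contents, and the sleep-based latency model are made up, standing in for real HTTP/TCP queries against per-device databases:

```python
import time
import random
import statistics

# toy stand-ins for two of the coupled metrics: multiple databases and
# multiple servers (a third loop over inputs per server is analogous)
DATABASES = {"arm_status": {"joint1": 42.0}, "vehicle_pose": {"x": 1.5}}

def query_server(server_id, db_name, key):
    # a real deployment would issue a network request; here we simulate a
    # per-server service time to expose the delay-analysis idea
    t0 = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005) * server_id)  # simulated latency
    value = DATABASES[db_name][key]
    return value, time.perf_counter() - t0

# real-time delay analysis: sample each server several times and report
# the mean round-trip delay, the quantity a DIS design must bound
delays = {}
for server_id in (1, 2, 3):
    samples = [query_server(server_id, "arm_status", "joint1")[1]
               for _ in range(5)]
    delays[server_id] = statistics.mean(samples)
```

A per-server delay profile like `delays` is what lets a designer pick which server should handle which input stream in real time.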
Applied Harmonic Analysis and Sparse Approximation
Efficiently analyzing functions, in particular multivariate functions, is a key problem in applied mathematics. The area of applied harmonic analysis has a significant impact on this problem by providing methodologies both for theoretical questions and for a wide range of applications in technology and science, such as image processing. Approximation theory, in particular the branch of the theory of sparse approximations, is closely intertwined with this area, with many recent exciting developments at the intersection of the two. Research topics typically also involve related areas such as convex optimization, probability theory, and Banach space geometry. The workshop was the continuation of a first event in 2012 and intended to bring together world-leading experts in these areas, to report on recent developments, and to foster new developments and collaborations.
A Review of Formal Methods applied to Machine Learning
We review state-of-the-art formal methods applied to the emerging field of
the verification of machine learning systems. Formal methods can provide
rigorous correctness guarantees on hardware and software systems. Thanks to the
availability of mature tools, their use is well established in industry, in
particular for checking safety-critical applications as they undergo a
stringent certification process. As machine learning becomes more popular,
machine-learned components are now considered for inclusion in critical
systems. This raises the question of their safety and their verification. Yet,
established formal methods are limited to classical, i.e. non-machine-learned,
software. Applying formal methods to verify systems that include machine
learning has only been considered recently and poses novel challenges in
soundness, precision, and scalability.
We first recall established formal methods and their current use in an
exemplar safety-critical field, avionic software, with a focus on abstract
interpretation based techniques as they provide a high level of scalability.
This provides a gold standard and sets high expectations for machine learning
verification. We then provide a comprehensive and detailed review of the formal
methods developed so far for machine learning, highlighting their strengths and
limitations. The large majority of them verify trained neural networks and
employ either SMT, optimization, or abstract interpretation techniques. We also
discuss methods for support vector machines and decision tree ensembles, as
well as methods targeting training and data preparation, which are critical but
often neglected aspects of machine learning. Finally, we offer perspectives for
future research directions towards the formal verification of machine learning
systems.
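Among the techniques the review covers, abstract interpretation with the interval domain is the easiest to sketch: propagate an input box through a small ReLU network using sound interval transformers, yielding a certified output range for all inputs in the box. The network weights below are illustrative, not taken from the survey:

```python
import numpy as np

# a tiny fixed ReLU network (illustrative weights, not from the survey)
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
b1 = np.array([0.1, -0.2])
W2 = np.array([[1.0, 1.0]])
b2 = np.array([0.0])

def interval_affine(lo, hi, W, b):
    # sound interval transformer for x -> W @ x + b: an affine map sends
    # a box with center c and radius r to center W@c+b, radius |W|@r
    center, radius = (lo + hi) / 2, (hi - lo) / 2
    c = W @ center + b
    r = np.abs(W) @ radius
    return c - r, c + r

def interval_relu(lo, hi):
    # ReLU is monotone, so applying it to both bounds is sound and exact
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def propagate(lo, hi):
    lo, hi = interval_affine(lo, hi, W1, b1)
    lo, hi = interval_relu(lo, hi)
    return interval_affine(lo, hi, W2, b2)

# certify an output range valid for every input in the box [-0.1, 0.1]^2
lo, hi = propagate(np.array([-0.1, -0.1]), np.array([0.1, 0.1]))
```

The certified range `[lo, hi]` over-approximates the true reachable set; this loss of precision (notably at ReLU nodes spanning zero) is exactly the precision/scalability trade-off the review discusses.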
Quantum optimal control in quantum technologies. Strategic report on current status, visions and goals for research in Europe
Quantum optimal control, a toolbox for devising and implementing the shapes
of external fields that accomplish given tasks in the operation of a quantum
device in the best way possible, has evolved into one of the cornerstones for
enabling quantum technologies. The last few years have seen a rapid evolution
and expansion of the field. We review here recent progress in our understanding
of the controllability of open quantum systems and in the development and
application of quantum control techniques to quantum technologies. We also
address key challenges and sketch a roadmap for future developments.
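A minimal sketch of the kind of task quantum optimal control addresses, under assumed parameters (a single qubit with a sigma_z drift, a piecewise-constant sigma_x control, and all numbers illustrative): find pulse amplitudes that flip the qubit by gradient ascent on the transfer fidelity. GRAPE-style methods compute this gradient analytically; the finite-difference version below only illustrates the idea:

```python
import numpy as np

# Pauli matrices for a single qubit
I2 = np.eye(2, dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def step(u, dt, delta=1.0):
    # exact propagator for H = 0.5*delta*sigma_z (drift) + u*sigma_x (control)
    H = 0.5 * delta * sz + u * sx
    r = np.sqrt(0.25 * delta**2 + u**2)
    if r == 0:
        return I2
    return np.cos(r * dt) * I2 - 1j * np.sin(r * dt) * (H / r)

def fidelity(us, dt=0.1):
    # population transferred from |0> to |1> by the piecewise-constant pulse
    U = I2
    for u in us:
        U = step(u, dt) @ U
    return abs(U[1, 0]) ** 2

# crude gradient ascent on the pulse amplitudes via finite differences
us = 0.5 * np.ones(20)
eps, lr = 1e-6, 2.0
for _ in range(400):
    base = fidelity(us)
    grad = np.zeros_like(us)
    for k in range(len(us)):
        up = us.copy()
        up[k] += eps
        grad[k] = (fidelity(up) - base) / eps
    us = us + lr * grad
```

The optimized amplitudes `us` are the "shape of the external field" in the abstract's sense; real applications replace the toy two-level model with the device Hamiltonian and add constraints such as bounded amplitude and bandwidth.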
Applied Harmonic Analysis and Data Processing
Massive data sets have their own architecture. Each data source has an inherent structure, which we should attempt to detect in order to utilize it for applications such as denoising, clustering, anomaly detection, knowledge extraction, or classification. Harmonic analysis revolves around creating new structures for decomposition, rearrangement and reconstruction of operators and functions; in other words, inventing and exploring new architectures for information and inference. Two previous very successful workshops on applied harmonic analysis and sparse approximation took place in 2012 and in 2015. This workshop was an evolution and continuation of these workshops and intended to bring together world-leading experts in applied harmonic analysis, data analysis, optimization, statistics, and machine learning to report on recent developments, and to foster new developments and collaborations.