PRISM: a tool for automatic verification of probabilistic systems
Probabilistic model checking is an automatic formal verification technique for analysing quantitative properties of systems which exhibit stochastic behaviour. PRISM is a probabilistic model checking tool which has already been successfully deployed in a wide range of application domains, from real-time communication protocols to biological signalling pathways. The tool has recently undergone a significant amount of development. Major additions include facilities to manually explore models, Monte-Carlo discrete-event simulation techniques for approximate model analysis (including support for distributed simulation) and the ability to compute cost- and reward-based measures, e.g. "the expected energy consumption of the system before the first failure occurs". This paper presents an overview of all the main features of PRISM. More information can be found on the website: www.cs.bham.ac.uk/~dxp/prism
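To make the idea concrete, here is a minimal sketch (not PRISM itself, and with an invented toy model) of the core computation in probabilistic model checking of a discrete-time Markov chain: the probability of eventually reaching a failure state, obtained by value iteration over the chain's transition probabilities.

```python
# Minimal illustrative sketch, NOT PRISM: computing "the probability of
# eventually reaching the fail state" in a tiny discrete-time Markov
# chain by value iteration. The chain and its numbers are invented.

def reach_probability(transitions, target, n_states, iters=10000, tol=1e-12):
    """p[s] = probability of eventually reaching `target` from state s."""
    p = [1.0 if s == target else 0.0 for s in range(n_states)]
    for _ in range(iters):
        new_p = list(p)
        for s in range(n_states):
            if s == target:
                continue
            # Bellman-style update: weighted average over successors.
            new_p[s] = sum(prob * p[t] for (t, prob) in transitions.get(s, []))
        if max(abs(a - b) for a, b in zip(p, new_p)) < tol:
            p = new_p
            break
        p = new_p
    return p

# States: 0 = ok, 1 = degraded, 2 = fail (absorbing), 3 = recovered (absorbing)
T = {
    0: [(0, 0.8), (1, 0.2)],
    1: [(2, 0.5), (3, 0.5)],
}
p = reach_probability(T, target=2, n_states=4)  # p[0] converges to 0.5
```

A real tool additionally parses a modelling language, supports CTMCs and MDPs, and evaluates temporal-logic properties; this sketch only shows the underlying fixed-point computation.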
Generative Models For Deep Learning with Very Scarce Data
The goal of this paper is to deal with a data scarcity scenario where deep
learning techniques tend to fail. We compare the use of two well-established
techniques, Restricted Boltzmann Machines and Variational Auto-encoders, as
generative models to enlarge the training set in a classification framework.
Essentially, we rely on Markov Chain Monte Carlo (MCMC) algorithms for
generating new samples. We show that this methodology improves generalization
compared to other state-of-the-art techniques, e.g. semi-supervised learning
with ladder networks. Furthermore, we show that RBMs are better than VAEs at
generating new samples for training a classifier with good generalization
capabilities.
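The MCMC step at the heart of this approach can be sketched as block Gibbs sampling in a Bernoulli-Bernoulli RBM. The weights below are random for illustration only; in the paper's setting they would come from an RBM trained on the scarce dataset, and the chain's final visible vector would be a new synthetic training sample.

```python
# Illustrative sketch (not the paper's code): block Gibbs sampling in a
# tiny Bernoulli-Bernoulli RBM, the MCMC step used to draw synthetic
# samples. Weights are random here; a real pipeline uses a trained RBM.
import math
import random

random.seed(0)
N_VISIBLE, N_HIDDEN = 6, 3
W = [[random.gauss(0, 0.1) for _ in range(N_HIDDEN)] for _ in range(N_VISIBLE)]
b_v = [0.0] * N_VISIBLE   # visible biases
b_h = [0.0] * N_HIDDEN    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sample_hidden(v):
    """Sample h ~ P(h | v); hidden units are conditionally independent."""
    return [1 if random.random() < sigmoid(
                b_h[j] + sum(v[i] * W[i][j] for i in range(N_VISIBLE)))
            else 0 for j in range(N_HIDDEN)]

def sample_visible(h):
    """Sample v ~ P(v | h); visible units are conditionally independent."""
    return [1 if random.random() < sigmoid(
                b_v[i] + sum(h[j] * W[i][j] for j in range(N_HIDDEN)))
            else 0 for i in range(N_VISIBLE)]

def gibbs_chain(v0, steps=50):
    """Run the Markov chain v -> h -> v; the final v is a new sample."""
    v = v0
    for _ in range(steps):
        v = sample_visible(sample_hidden(v))
    return v

new_sample = gibbs_chain([1, 0, 1, 0, 1, 0])
```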
Designing sustainable medical devices
Stakeholders in the medical device manufacturing industry are becoming more concerned about the environmental impact of their products and processes. Consumers are also becoming more aware of the negative impact that manufacturers can have on the environment. Government initiatives continue to raise environmental awareness through the development of new policy and legislation, encouraging industry to become more accountable for the environmental impact of its products and operations. The ISO 14001 standard, Environmental Management Systems-Requirements with Guidance for Use, sets guidelines that enable businesses to recognize the environmental effects of their products and processes. Departments can use the standard to set targets that lower environmental impact and to identify areas of high environmental concern when designing, purchasing, and marketing products. Research in these areas will be used to develop an environmental scoring tool to aid the design of future sustainable medical devices.
Sampled Weighted Min-Hashing for Large-Scale Topic Mining
We present Sampled Weighted Min-Hashing (SWMH), a randomized approach to
automatically mine topics from large-scale corpora. SWMH generates multiple
random partitions of the corpus vocabulary based on term co-occurrence and
agglomerates highly overlapping inter-partition cells to produce the mined
topics. While other approaches define a topic as a probabilistic distribution
over a vocabulary, SWMH topics are ordered subsets of such vocabulary.
Interestingly, the topics mined by SWMH underlie themes from the corpus at
different levels of granularity. We extensively evaluate the meaningfulness of
the mined topics both qualitatively and quantitatively on the NIPS (1.7 K
documents), 20 Newsgroups (20 K), Reuters (800 K) and Wikipedia (4 M) corpora.
Additionally, we compare the quality of SWMH with Online LDA topics for
document representation in classification.
Comment: 10 pages, Proceedings of the Mexican Conference on Pattern
Recognition 201
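The primitive underlying SWMH can be illustrated with plain MinHash (this is not the sampled *weighted* variant from the paper, and the toy sets below are invented): items hash to the same signature cell with probability equal to the Jaccard similarity of their sets, which is what lets co-occurring terms land in the same partition cells.

```python
# Sketch of the min-hashing primitive behind SWMH (plain MinHash over
# integer id sets, not the paper's sampled weighted variant). Two sets
# collide per hash function with probability equal to their Jaccard
# similarity, so agreement across many hashes estimates that similarity.
import random

random.seed(42)
N_HASHES = 200
PRIME = 2**61 - 1
# One (a, b) pair per hash function: h(x) = (a*x + b) mod PRIME
params = [(random.randrange(1, PRIME), random.randrange(PRIME))
          for _ in range(N_HASHES)]

def signature(ids):
    """MinHash signature of a set of integer ids."""
    return [min((a * t + b) % PRIME for t in ids) for a, b in params]

def estimated_jaccard(sig1, sig2):
    """Fraction of hash functions on which the two signatures agree."""
    return sum(x == y for x, y in zip(sig1, sig2)) / len(sig1)

docs_with_term_a = {1, 2, 3, 4}   # toy: documents containing term a
docs_with_term_b = {2, 3, 4, 5}   # true Jaccard = 3/5 = 0.6
est = estimated_jaccard(signature(docs_with_term_a),
                        signature(docs_with_term_b))
```

With 200 hash functions the estimate concentrates around the true Jaccard value of 0.6; SWMH builds on this to partition a vocabulary and agglomerate overlapping cells into topics.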
Maximum Entropy Linear Manifold for Learning Discriminative Low-dimensional Representation
Representation learning is currently a central topic in modern machine
learning, largely due to the great success of deep learning methods. In
particular, a low-dimensional representation that discriminates between
classes can not only enhance the classification procedure and make it faster,
but also, unlike high-dimensional embeddings, be used efficiently for visual
exploratory data analysis.
In this paper we propose Maximum Entropy Linear Manifold (MELM), a
multidimensional generalization of Multithreshold Entropy Linear Classifier
model which is able to find a low-dimensional linear data projection maximizing
discriminativeness of projected classes. As a result we obtain a linear
embedding which can be used for classification, class aware dimensionality
reduction and data visualization. MELM provides highly discriminative 2D
projections of the data which can be used as a method for constructing robust
classifiers.
We provide both an empirical evaluation and some interesting theoretical
properties of our objective function, such as scale and affine transformation
invariance, connections with PCA, and a bound on the expected balanced
accuracy error.
Comment: submitted to ECMLPKDD 201
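The general idea of a class-aware linear projection can be sketched with a far simpler surrogate than MELM's entropy objective: projecting onto the difference of class means. This toy example (invented data, not the paper's method or objective) only illustrates what "a low-dimensional linear projection that separates the classes" means.

```python
# Toy sketch of class-aware linear dimensionality reduction. This is a
# difference-of-means projection, much simpler than MELM's maximum
# entropy objective, but it illustrates the same goal: a linear map to
# low dimension along which the classes separate.

def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def project(x, w):
    """1-D linear embedding: the dot product of x with direction w."""
    return sum(a * b for a, b in zip(x, w))

# Two invented 3-D classes, separated along the first coordinate.
class_a = [[0.0, 1.0, 2.0], [0.2, 0.9, 2.1], [-0.1, 1.1, 1.9]]
class_b = [[3.0, 1.0, 2.0], [3.2, 1.1, 1.8], [2.9, 0.9, 2.2]]

m_a, m_b = mean(class_a), mean(class_b)
w = [b - a for a, b in zip(m_a, m_b)]   # projection direction

proj_a = [project(x, w) for x in class_a]
proj_b = [project(x, w) for x in class_b]
# In this 1-D embedding the two classes are linearly separable:
separable = max(proj_a) < min(proj_b)
```

MELM replaces this heuristic direction with one that maximizes an entropy-based measure of class discrimination, and generalizes it to multidimensional projections usable for visualization.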
The Monoceros very-high-energy gamma-ray source
The H.E.S.S. telescope array has observed the complex Monoceros Loop
SNR/Rosette Nebula region, which contains unidentified high-energy EGRET
sources and potential very-high-energy (VHE) gamma-ray sources. We announce
the discovery of a new point-like VHE gamma-ray source, HESS J0632+057. It is
located close to the rim of the Monoceros SNR and has no clear counterpart at
other wavelengths. Data from the NANTEN telescope have been used to
investigate hadronic interactions with nearby molecular clouds; we found no
evidence for a clear association. The VHE gamma-ray emission is possibly
associated with the lower-energy gamma-ray source 3EG J0634+0521, the weak
X-ray source 1RXS J063258.3+054857 and the Be star MWC 148.
Comment: 4 pages, 4 figures, Contribution to the 30th ICRC, Merida Mexico,
July 200