Custody to community: supporting young people to cope with release: a practitioner's guide
Young people and resettlement: participatory approaches: a practitioner’s guide
Ethnicity, faith and culture in resettlement: a practitioner’s guide
Efficacy of DWT denoising in the removal of power line interference and the effect on morphological distortion of underlying atrial fibrillatory waves in AF-ECG.
Estimation of atrial fibrillatory frequency by spectral subtraction of wavelet denoised ECG in patients with atrial fibrillation
Variational Deep Semantic Hashing for Text Documents
As the amount of textual data has been rapidly increasing over the past
decade, efficient similarity search methods have become a crucial component of
large-scale information retrieval systems. A popular strategy is to represent
original data samples by compact binary codes through hashing. A range of
machine learning methods has been applied, but these often lack the
expressiveness and flexibility needed to learn effective representations.
Recent advances in deep learning across a wide range of applications have
demonstrated its capability to learn robust and powerful feature
representations for complex data. In particular, deep generative models
naturally combine the expressiveness of probabilistic generative models with
the high capacity of deep neural networks, making them well suited to text
modeling. However, little work has leveraged this recent progress in deep
learning for text hashing.
In this paper, we propose a series of novel deep document generative models
for text hashing. The first proposed model is unsupervised while the second one
is supervised by utilizing document labels/tags for hashing. The third model
further considers document-specific factors that affect the generation of
words. The probabilistic generative formulation of the proposed models provides
a principled framework for model extension, uncertainty estimation, simulation,
and interpretability. Based on variational inference and reparameterization,
the proposed models can be interpreted as encoder-decoder deep neural networks
and thus they are capable of learning complex nonlinear distributed
representations of the original documents. We conduct a comprehensive set of
experiments on four public testbeds. The experimental results demonstrate
the effectiveness of the proposed supervised learning models for text
hashing.

Comment: 11 pages, 4 figures
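The reparameterization step and the final binarization of the latent codes can be illustrated with a minimal sketch. The names below (`reparameterize`, `binarize`) and the toy latent statistics are illustrative assumptions, not the authors' code; in a real model, `mu` and `log_var` would be produced by an encoder network over the document text:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps with eps ~ N(0, I), keeping the
    sampling step differentiable with respect to mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def binarize(z):
    """Turn continuous latent codes into compact binary hash codes by
    thresholding each bit at its median across the corpus."""
    thresholds = np.median(z, axis=0)
    return (z > thresholds).astype(np.uint8)

# Toy latent statistics for 4 "documents" with 8-bit codes
# (stand-ins for an encoder network's outputs).
mu = rng.standard_normal((4, 8))
log_var = np.full((4, 8), -2.0)

z = reparameterize(mu, log_var, rng)
codes = binarize(z)
print(codes.shape)  # (4, 8)
```

Median thresholding is one common choice because it balances each bit across the corpus, which tends to make the resulting binary codes more discriminative for similarity search.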
The Relativistic Hopfield network: rigorous results
The relativistic Hopfield model constitutes a generalization of the standard
Hopfield model that is derived by the formal analogy between the
statistical-mechanic framework embedding neural networks and the Lagrangian
mechanics describing a fictitious single-particle motion in the space of the
tuneable parameters of the network itself. In this analogy the cost-function of
the Hopfield model plays the role of the standard kinetic-energy term, and
its related Mattis overlap (naturally bounded by one) plays the role of the
velocity. The Hamiltonian of the relativistic model, once Taylor-expanded,
results in a P-spin series with alternating signs: the attractive
contributions enhance the information-storage capabilities of the network,
while the repulsive contributions allow for an easier unlearning of spurious
states, conferring greater robustness on the system as a whole. Here we do
not explore further the information-processing capabilities of this
generalized Hopfield network; rather, we focus on its
statistical-mechanical foundation. In particular, relying on
Guerra's interpolation techniques, we prove the existence of the infinite
volume limit for the model free-energy and we give its explicit expression in
terms of the Mattis overlaps. By extremizing the free energy over the latter we
get the generalized self-consistent equations for these overlaps, as well as a
picture of criticality that is further corroborated by a fluctuation analysis.
These findings are in full agreement with the available previous results.

Comment: 11 pages, 1 figure
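The alternating-sign P-spin structure mentioned above can be made explicit with a short expansion. As a sketch, assuming the relativistic kinetic form used in this line of work (the abstract does not state it), with N spins and Mattis overlap m:

```latex
H_{\mathrm{rel}}(m) \;=\; -N\,\sqrt{1+m^{2}}
\;=\; -N\left(1 + \frac{m^{2}}{2} - \frac{m^{4}}{8} + \frac{m^{6}}{16} - \cdots\right)
```

The quadratic term reproduces the standard (attractive) Hopfield cost function, while the quartic term enters with the opposite sign and acts repulsively, and so on with alternating signs, which is the origin of the enhanced storage and easier unlearning of spurious states described in the abstract.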
The role of the family in resettlement
This practitioner’s guide unpicks different interpretations of ‘family’. It explores the family’s unique position to fulfil key characteristics that research has shown are associated with effective resettlement support. It highlights recommendations and considerations that can be adopted into the practices of those working with young people and their families. As well as outlining various ways that families can help with personal and structural support, the guide also provides tips for successfully engaging with family members and sets out ways of overcoming the challenges to unlocking this important resource.
TransNets: Learning to Transform for Recommendation
Recently, deep learning methods have been shown to improve the performance of
recommender systems over traditional methods, especially when review text is
available. For example, a recent model, DeepCoNN, uses neural nets to learn one
latent representation for the text of all reviews written by a target user, and
a second latent representation for the text of all reviews for a target item,
and then combines these latent representations to obtain state-of-the-art
performance on recommendation tasks. We show that (unsurprisingly) much of the
predictive value of review text comes from reviews of the target user for the
target item. We then introduce a way in which this information can be used in
recommendation, even when the target user's review for the target item is not
available. Our model, called TransNets, extends the DeepCoNN model by
introducing an additional latent layer representing the target user-target item
pair. We then regularize this layer, at training time, to be similar to another
latent representation of the target user's review of the target item. We show
that TransNets and extensions of it improve substantially over the previous
state of the art.

Comment: Accepted for publication in the 11th ACM Conference on Recommender
Systems (RecSys 2017)
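The transform-and-regularize idea at the core of TransNets can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: the single `tanh` layer, the dimensions, and the squared-error penalty are simplifying stand-ins for the CNN text processors and training procedure described by the authors:

```python
import numpy as np

rng = np.random.default_rng(1)

def transform(z_pair, W, b):
    """Hypothetical transform layer: map the (user, item) pair
    representation toward the space of review representations."""
    return np.tanh(z_pair @ W + b)

def transnet_reg_loss(z_pair, z_review, W, b):
    """Squared-error penalty encouraging the transformed pair
    representation to match the latent code of the target user's
    review of the target item (available only at training time)."""
    z_hat = transform(z_pair, W, b)
    return float(np.mean((z_hat - z_review) ** 2))

d = 16
z_pair = rng.standard_normal(d)    # built from the user's and item's other reviews
z_review = rng.standard_normal(d)  # built from the held-out target review
W = rng.standard_normal((d, d)) * 0.1
b = np.zeros(d)

loss = transnet_reg_loss(z_pair, z_review, W, b)
```

At test time the target review does not exist, so only `transform(z_pair, W, b)` is used: the regularizer has taught the transform to approximate the information that review would have carried.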