Deep Canonically Correlated LSTMs
We examine Deep Canonically Correlated LSTMs as a way to learn nonlinear
transformations of variable-length sequences and embed them into a correlated,
fixed-dimensional space. We use LSTMs to transform multi-view time-series data
non-linearly while learning temporal relationships within the data. We then
perform correlation analysis on the outputs of these neural networks to find a
correlated subspace through which we get our final representation via
projection. This work follows from previous work done on Deep Canonical
Correlation (DCCA), in which deep feed-forward neural networks were used to
learn nonlinear transformations of data while maximizing correlation.

Comment: 8 pages, 3 figures; accepted as the undergraduate honors thesis of
Neil Mallinar by The Johns Hopkins University.
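The correlation-analysis step that this abstract describes can be sketched with plain linear CCA applied to fixed-dimensional embeddings. The sketch below uses random stand-ins for the LSTM outputs; the function name `cca_correlations` and the regularization constant are illustrative assumptions, not the thesis's actual code.

```python
import numpy as np

def cca_correlations(X, Y, reg=1e-4):
    """Linear CCA: canonical correlations between two views X, Y (n_samples x dim)."""
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # Regularized covariance matrices
    Sxx = Xc.T @ Xc / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):
        # Symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(w ** -0.5) @ V.T

    # Singular values of the whitened cross-covariance are the canonical correlations
    T = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(T, compute_uv=False)

# Toy demo: two views driven by a shared 3-dimensional latent signal,
# standing in for the fixed-dimensional embeddings an LSTM would produce.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 3))
X = z @ rng.normal(size=(3, 5)) + 0.1 * rng.normal(size=(500, 5))
Y = z @ rng.normal(size=(3, 4)) + 0.1 * rng.normal(size=(500, 4))
corrs = cca_correlations(X, Y)
print(corrs)  # leading correlations are high for the shared latent dimensions
```

In DCCA-style training, a differentiable version of this objective is maximized with respect to the network parameters; here it is shown only as the final projection analysis.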
Long-Term Effect Estimation with Surrogate Representation
There are many scenarios where short- and long-term causal effects of an
intervention are different. For example, low-quality ads may increase
short-term ad clicks but decrease long-term revenue through reduced future clicks.
This work therefore studies the problem of long-term effect estimation, where
the outcome of primary interest, or primary outcome, takes months or even years to
accumulate. The observational study of long-term effect presents unique
challenges. First, the confounding bias causes large estimation error and
variance, which can further accumulate towards the prediction of primary
outcomes. Second, short-term outcomes are often directly used as the proxy of
the primary outcome, i.e., the surrogate. Nevertheless, this method entails the
strong surrogacy assumption that is often impractical. To tackle these
challenges, we propose to build connections between long-term causal inference
and sequential models in machine learning. This enables us to learn surrogate
representations that account for the temporal unconfoundedness and circumvent
the stringent surrogacy assumption by conditioning on the inferred time-varying
confounders. Experimental results show that the proposed framework outperforms
the state-of-the-art.

Comment: 9 pages, 7 figures.
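Why the strong surrogacy assumption matters can be illustrated with a toy simulation in which the treatment affects the primary outcome through a path that bypasses the short-term surrogate; a naive surrogate-index estimate is then biased. All coefficients and variable names below are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
t = rng.integers(0, 2, size=n)               # randomized treatment assignment
s = 1.0 * t + rng.normal(size=n)             # short-term surrogate (e.g. clicks)
# Primary outcome: t affects y directly, not only through s,
# so the strong surrogacy assumption is violated by construction.
y = 2.0 * s - 1.5 * t + rng.normal(size=n)

# Naive surrogate approach: scale the short-term effect by a fitted s -> y slope
slope = np.polyfit(s, y, 1)[0]
short_term_effect = s[t == 1].mean() - s[t == 0].mean()
naive_long_term = slope * short_term_effect

# Direct estimate of the true long-term effect (available here only by simulation)
true_long_term = y[t == 1].mean() - y[t == 0].mean()
print(naive_long_term, true_long_term)  # the naive estimate overshoots
```

The gap between the two estimates is exactly the kind of bias the proposed surrogate representations, conditioned on inferred time-varying confounders, are meant to avoid.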
To Recurse or not to Recurse, a Low Dose CT Study
Restoring high-quality CT images from low dose CT counterparts is an
ill-posed, nonlinear problem for which Deep Learning approaches have provided
solutions superior to those of classical model-based approaches. In this
article, a framework is presented wherein a Recurrent Neural Network (RNN) is
utilized to remove the streaking artefacts from low projection number CT
imaging. The results indicate similar image restoration performance for the RNN
compared to the feedforward network in low-noise cases, while at high noise
levels the RNN returns better results. The computational costs are also
compared between RNN and feedforward networks.

Comment: Sections II.A to II.D are taken from sections II.A to II.D of
arXiv:1904.03908, an unpublished article by the same author.
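The recurrent idea in this abstract, applying the same learned refinement step repeatedly rather than a single feedforward pass, can be sketched on a 1-D toy signal. This is not the paper's RNN architecture; it is only a tied-weight iterative-refinement analogy, with a fixed neighbor-averaging step standing in for a learned update.

```python
import numpy as np

def smooth_step(x, alpha=0.5):
    """One refinement step: move each sample toward the mean of its neighbors."""
    neighbors = 0.5 * (np.roll(x, 1) + np.roll(x, -1))
    return (1 - alpha) * x + alpha * neighbors

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0, 2 * np.pi, 256))
noisy = clean + 0.3 * rng.normal(size=256)   # stand-in for streak-like artefacts

# "Recurrent" restoration: the same tied-weight step applied K times,
# analogous to unrolling an RNN that shares parameters across iterations.
x = noisy.copy()
for _ in range(10):
    x = smooth_step(x)

err_before = np.mean((noisy - clean) ** 2)
err_after = np.mean((x - clean) ** 2)
print(err_before, err_after)  # error drops after recurrent refinement
```

A feedforward counterpart would instead use a separate (independently parameterized) step per stage; the recurrence trades that capacity for a smaller parameter count, which is one axis of the RNN-versus-feedforward comparison the article makes.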