
    Recurrent Spatial Transformer Networks

    We integrate the recently proposed spatial transformer network (SPN) [Jaderberg et al. 2015] into a recurrent neural network (RNN) to form an RNN-SPN model. We use the RNN-SPN to classify digits in cluttered MNIST sequences. The proposed model achieves a single-digit error of 1.5%, compared to 2.9% for a convolutional network and 2.0% for a convolutional network with SPN layers. The SPN outputs a zoomed, rotated and skewed version of the input image. We investigate different down-sampling factors (ratio of pixels in input and output) for the SPN and show that the RNN-SPN model is able to down-sample the input images without deteriorating performance. The down-sampling in RNN-SPN can be thought of as adaptive down-sampling that minimizes the information loss in the regions of interest. We attribute the superior performance of the RNN-SPN to the fact that it can attend to a sequence of regions of interest.
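The core operation of the spatial transformer described above — producing a zoomed, rotated and skewed view of the input at a chosen down-sampling factor — can be sketched as an affine sampling grid followed by bilinear interpolation. This is an illustrative NumPy reimplementation, not the authors' code; the function names and shapes are our own.

```python
import numpy as np

def affine_grid(theta, out_h, out_w):
    """Map normalized output coordinates through a 2x3 affine matrix theta."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, out_h),
                         np.linspace(-1, 1, out_w), indexing="ij")
    coords = np.stack([xs, ys, np.ones_like(xs)], axis=-1)  # (H, W, 3)
    return coords @ theta.T  # (H, W, 2) sampling locations in [-1, 1]

def bilinear_sample(img, grid):
    """Bilinearly sample a 2-D image at normalized grid locations."""
    h, w = img.shape
    x = (grid[..., 0] + 1) * (w - 1) / 2
    y = (grid[..., 1] + 1) * (h - 1) / 2
    x0 = np.clip(np.floor(x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, h - 2)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0]
            + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0]
            + dx * dy * img[y0 + 1, x0 + 1])

# Identity transform with a 2x down-sampling factor: the output grid
# covers the full input but has half the resolution per axis.
img = np.arange(64, dtype=float).reshape(8, 8)
theta = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
out = bilinear_sample(img, affine_grid(theta, 4, 4))
```

A zoom (attention to a region of interest) would use a scale below 1 in `theta`; rotation and skew fill the off-diagonal entries.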

    Provenance analysis for instagram photos

    As a feasible device fingerprint, sensor pattern noise (SPN) has been proven effective in the provenance analysis of digital images. However, with the rise of social media, millions of images are uploaded to and shared through social media sites every day. An image downloaded from a social network may have gone through a series of unknown image manipulations. Consequently, the trustworthiness of SPN has been challenged in the provenance analysis of images downloaded from social media platforms. In this paper, we investigate the effects of the predefined Instagram image filters on SPN-based image provenance analysis. We identify two groups of filters that affect the SPN in quite different ways, with Group I consisting of the filters that severely attenuate the SPN and Group II consisting of the filters that well preserve the SPN in the images. We further propose a CNN-based classifier to perform filter-oriented image categorization, aiming to exclude the images manipulated by the filters in Group I and thus improve the reliability of SPN-based provenance analysis. The results on about 20,000 images and 18 filters are very promising, with an accuracy higher than 96% in differentiating the filters in Group I and Group II.
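The SPN-based provenance test the abstract builds on boils down to comparing a noise residual extracted from a query image against a camera's reference fingerprint via normalized correlation. A minimal sketch on simulated data, using a box-filter denoiser as a crude stand-in for the wavelet denoising used in practice:

```python
import numpy as np

def noise_residual(img, k=3):
    """Crude SPN estimate: image minus a local-mean denoised version.
    (Real systems use wavelet denoising; the box filter is a stand-in.)"""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    local_mean = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
                     for i in range(k) for j in range(k)) / (k * k)
    return img - local_mean

def ncc(a, b):
    """Normalized cross-correlation between two residuals."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
fingerprint = rng.normal(size=(32, 32))       # simulated camera SPN
img_same = rng.normal(size=(32, 32)) * 5 + fingerprint  # from this camera
img_other = rng.normal(size=(32, 32)) * 5               # from another source

score_same = ncc(noise_residual(img_same), noise_residual(fingerprint))
score_other = ncc(noise_residual(img_other), noise_residual(fingerprint))
```

A Group I filter, in this framing, is one whose manipulation pushes `score_same` down toward the unrelated-camera score, making the test unreliable.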

    Asymptotic Derivation and Numerical Investigation of Time-Dependent Simplified Pn Equations

    The steady-state simplified Pn (SPn) approximations to the linear Boltzmann equation have been proven to be asymptotically higher-order corrections to the diffusion equation in certain physical systems. In this paper, we present an asymptotic analysis of the time-dependent simplified Pn equations up to n = 3. Additionally, SPn equations of arbitrary order are derived in an ad hoc way. The resulting SPn equations are hyperbolic and differ from those investigated in previous work by some of the authors. In two space dimensions, numerical calculations for the Pn and SPn equations are performed. We simulate neutron distributions of a moving rod and present results for a benchmark problem known as the checkerboard problem. The SPn equations are demonstrated to yield significantly more accurate results than diffusion approximations. In addition, for sufficiently low values of n, they are shown to be more efficient than Pn models of comparable cost.
    Comment: 32 pages, 7 figures
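For orientation, the lowest-order member of the hierarchy, SP1, coincides with the time-dependent diffusion approximation that the abstract compares against. In standard notation (scalar flux φ, particle speed v, total and absorption cross sections σ_t and σ_a, source q) it reads; note this is the textbook form, not necessarily the exact variant derived in the paper:

```latex
\frac{1}{v}\frac{\partial \phi}{\partial t}
  - \nabla \cdot \frac{1}{3\sigma_t} \nabla \phi
  + \sigma_a \phi = q
```

The higher-order SPn systems add coupled moment equations on top of this, which is where the accuracy gain over diffusion comes from.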

    Rotation-invariant binary representation of sensor pattern noise for source-oriented image and video clustering

    Most existing source-oriented image and video clustering algorithms based on sensor pattern noise (SPN) rely on pairwise similarities, whose calculation usually dominates the overall computational time. The heavy computational burden is mainly incurred by the high dimensionality of SPN, which typically goes up to millions for delivering plausible clustering performance. This problem can be further aggravated by the uncertainty of the orientation of images or videos, because the spatial correspondence between data with uncertain orientations needs to be re-established in a brute-force search manner. In this work, we propose a rotation-invariant binary representation of SPN to address the issue of rotation and reduce the computational cost of calculating the pairwise similarities. Results on two public multimedia forensics databases have shown that the proposed approach is effective in overcoming the rotation issue and speeding up the calculation of pairwise SPN similarities for source-oriented image and video clustering.
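The cost argument above — million-dimensional floating-point correlations dominating clustering time — is what one-bit SPN representations attack: after binarization, pairwise similarity reduces to counting bit agreements, which is far cheaper than float correlation. A toy sketch on simulated residuals (the paper's rotation-invariance machinery is not reproduced here):

```python
import numpy as np

def binarize_spn(residual):
    """One-bit quantization of an SPN residual: keep only the sign."""
    return residual >= 0  # boolean array

def hamming_similarity(b1, b2):
    """Fraction of matching bits; a cheap proxy for correlation."""
    return float(np.mean(b1 == b2))

rng = np.random.default_rng(1)
spn = rng.normal(size=10_000)
noisy_copy = spn + 0.5 * rng.normal(size=10_000)  # same camera, extra noise
unrelated = rng.normal(size=10_000)               # different camera

s_same = hamming_similarity(binarize_spn(spn), binarize_spn(noisy_copy))
s_diff = hamming_similarity(binarize_spn(spn), binarize_spn(unrelated))
```

Same-camera pairs keep most sign bits in agreement, while unrelated pairs sit near 50% agreement, so a threshold separates the two cases; in practice the boolean arrays would additionally be packed into machine words so the comparison runs as XOR + popcount.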

    An eIF4E-binding protein regulates katanin protein levels in C. elegans embryos.

    In Caenorhabditis elegans, the MEI-1-katanin microtubule-severing complex is required for meiosis, but must be down-regulated during the transition to embryogenesis to prevent defects in mitosis. A cullin-dependent degradation pathway for MEI-1 protein has been well documented. In this paper, we report that translational repression may also play a role in MEI-1 down-regulation. Reduction of spn-2 function results in spindle orientation defects due to ectopic MEI-1 expression during embryonic mitosis. MEL-26, which is both required for MEI-1 degradation and is itself a target of the cullin degradation pathway, is present at normal levels in spn-2 mutant embryos, suggesting that the degradation pathway is functional. Cloning of spn-2 reveals that it encodes an eIF4E-binding protein that localizes to the cytoplasm and to ribonucleoprotein particles called P granules. SPN-2 binds to the RNA-binding protein OMA-1, which in turn binds to the mei-1 3′ untranslated region. Thus, our results suggest that SPN-2 functions as an eIF4E-binding protein to negatively regulate translation of mei-1.

    On the Relationship between Sum-Product Networks and Bayesian Networks

    In this paper, we establish some theoretical connections between Sum-Product Networks (SPNs) and Bayesian Networks (BNs). We prove that every SPN can be converted into a BN in linear time and space in terms of the network size. The key insight is to use Algebraic Decision Diagrams (ADDs) to compactly represent the local conditional probability distributions at each node in the resulting BN by exploiting context-specific independence (CSI). The generated BN has a simple directed bipartite graphical structure. We show that by applying the Variable Elimination algorithm (VE) to the generated BN with ADD representations, we can recover the original SPN, where the SPN can be viewed as a history record or caching of the VE inference process. To help state the proof clearly, we introduce the notion of "normal" SPNs and present a theoretical analysis of the consistency and decomposability properties. We conclude the paper with some discussion of the implications of the proof and establish a connection between the depth of an SPN and a lower bound on the tree-width of its corresponding BN.
    Comment: Full version of the same paper to appear at ICML-201
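To make the objects concrete: an SPN computes a probability by alternating weighted sums (mixtures over the same scope) and products (over disjoint scopes). A hand-built two-variable example, illustrative only and not taken from the paper:

```python
# A tiny SPN over binary variables X1, X2:
#   root = 0.6 * (P1(X1) * P1(X2)) + 0.4 * (P2(X1) * P2(X2))
# Product children have disjoint scopes (decomposability) and sum
# children share scope (completeness), so the network represents a
# valid mixture distribution and evaluates in one bottom-up pass.

def bernoulli_leaf(p, x):
    """Leaf node: a Bernoulli distribution over one binary variable."""
    return p if x == 1 else 1.0 - p

def spn_value(x1, x2):
    comp1 = bernoulli_leaf(0.9, x1) * bernoulli_leaf(0.2, x2)  # product node
    comp2 = bernoulli_leaf(0.1, x1) * bernoulli_leaf(0.7, x2)  # product node
    return 0.6 * comp1 + 0.4 * comp2                           # sum node

# A valid SPN's values over all complete assignments sum to 1.
total = sum(spn_value(a, b) for a in (0, 1) for b in (0, 1))
```

The conversion result quoted above says this same distribution can be encoded as a bipartite BN whose VE trace reproduces exactly this bottom-up evaluation.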

    Bayesian Learning of Sum-Product Networks

    Sum-product networks (SPNs) are flexible density estimators and have received significant attention due to their attractive inference properties. While parameter learning in SPNs is well developed, structure learning leaves something to be desired: even though there is a plethora of SPN structure learners, most of them are somewhat ad hoc and based on intuition rather than a clear learning principle. In this paper, we introduce a well-principled Bayesian framework for SPN structure learning. First, we decompose the problem into i) laying out a computational graph, and ii) learning the so-called scope function over the graph. The first is rather unproblematic and akin to neural network architecture validation. The second represents the effective structure of the SPN and needs to respect the usual structural constraints of SPNs, i.e. completeness and decomposability. While representing and learning the scope function is somewhat involved in general, in this paper we propose a natural parametrisation for an important and widely used special case of SPNs. These structural parameters are incorporated into a Bayesian model, such that simultaneous structure and parameter learning is cast into monolithic Bayesian posterior inference. In various experiments, our Bayesian SPNs often improve test likelihoods over greedy SPN learners. Further, since the Bayesian framework protects against overfitting, we can evaluate hyper-parameters directly on the Bayesian model score, waiving the need for a separate validation set, which is especially beneficial in low-data regimes. Bayesian SPNs can be applied to heterogeneous domains and can easily be extended to nonparametric formulations. Moreover, our Bayesian approach is the first which consistently and robustly learns SPN structures under missing data.
    Comment: NeurIPS 2019; see conference page for supplement
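The two structural constraints that the scope function mentioned above must respect can be verified recursively: completeness requires the children of a sum node to have identical scopes, and decomposability requires the children of a product node to have pairwise-disjoint scopes. A sketch using a hypothetical tuple encoding of nodes (not the paper's representation):

```python
def scopes_valid(node):
    """node = ("leaf", var) | ("sum", [children]) | ("prod", [children]).
    Returns (is_valid, scope) for the subtree rooted at node."""
    kind = node[0]
    if kind == "leaf":
        return True, {node[1]}
    ok_children, child_scopes = True, []
    for child in node[1]:
        ok, scope = scopes_valid(child)
        ok_children = ok_children and ok
        child_scopes.append(scope)
    union = set().union(*child_scopes)
    if kind == "sum":   # completeness: all children share one scope
        ok = ok_children and all(s == child_scopes[0] for s in child_scopes)
    else:               # decomposability: children have disjoint scopes
        ok = ok_children and len(union) == sum(len(s) for s in child_scopes)
    return ok, union

good = ("sum", [("prod", [("leaf", "X1"), ("leaf", "X2")]),
                ("prod", [("leaf", "X1"), ("leaf", "X2")])])
bad = ("prod", [("leaf", "X1"), ("leaf", "X1")])  # overlapping scopes
```

In the Bayesian framework described above, such a check would act as the hard constraint the learned scope-function parameters have to satisfy.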