
    Learning invariances with stationary subspace analysis

    Recently, a novel subspace decomposition method, termed 'Stationary Subspace Analysis' (SSA), has been proposed by Bünau et al. [10]. SSA aims to find a linear projection to a lower-dimensional subspace such that the distribution of the projected data does not change over successive epochs or sub-datasets. We show that by modifying the loss function and the optimization procedure we can obtain an algorithm that is both faster and more accurate. We discuss the problem of indeterminacies and provide a lower bound on the number of epochs needed. Finally, we show, in an experiment with simulated image patches, that SSA can be used favourably in invariance learning.
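
    To illustrate the core idea described in the abstract, the sketch below implements a simple SSA-style estimator: after whitening the pooled data, it searches for a rotation whose first d rows define a projection under which each epoch's mean and covariance stay close to the pooled statistics (zero mean, identity covariance). This is not the authors' modified algorithm; the KL-based objective, the matrix-exponential parameterization of the rotation, and the toy data layout are illustrative assumptions.

```python
# Minimal SSA-style sketch (illustrative, not the paper's implementation).
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

def whiten(X):
    """Center and whiten pooled data to zero mean and identity covariance."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T
    return (X - mu) @ W

def ssa_objective(params, epochs, D, d):
    """Sum over epochs of KL( N(mu_e, Sigma_e) || N(0, I) ) in the projected space."""
    A = params.reshape(D, D)
    R = expm(A - A.T)          # orthogonal rotation from a skew-symmetric generator
    P = R[:d]                  # candidate projection onto the stationary subspace
    total = 0.0
    for X in epochs:
        Y = X @ P.T
        mu = Y.mean(axis=0)
        Sigma = np.atleast_2d(np.cov(Y, rowvar=False))
        _, logdet = np.linalg.slogdet(Sigma)
        total += 0.5 * (np.trace(Sigma) + mu @ mu - d - logdet)
    return total

def fit_ssa(X, n_epochs, d):
    """Split whitened data into contiguous epochs and optimize the rotation."""
    Xw = whiten(X)
    D = Xw.shape[1]
    epochs = np.array_split(Xw, n_epochs)
    res = minimize(ssa_objective, np.zeros(D * D), args=(epochs, D, d), method="BFGS")
    A = res.x.reshape(D, D)
    return expm(A - A.T)[:d]   # rows span the estimated stationary subspace

# Toy example: one stationary source mixed with a source whose variance drifts.
rng = np.random.default_rng(0)
n, n_epochs = 4000, 8
s_stat = rng.normal(size=n)
s_nonstat = rng.normal(size=n) * np.repeat(np.linspace(0.5, 3.0, n_epochs), n // n_epochs)
X = np.column_stack([s_stat, s_nonstat]) @ rng.normal(size=(2, 2))
P = fit_ssa(X, n_epochs=n_epochs, d=1)
print("estimated stationary projection:", P)
```

    In this simplified form, the number of epochs matters in exactly the sense the abstract hints at: with too few epochs the stationary subspace is not uniquely determined, since several projections can make the per-epoch statistics agree by chance.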