    The Power of Linear Recurrent Neural Networks

    Recurrent neural networks are a powerful means of coping with time series. We show how a type of linearly activated recurrent neural network, which we call predictive neural networks, can approximate any time-dependent function f(t) given by a number of function values. The approximation can effectively be learned by simply solving a linear equation system; no backpropagation or similar methods are needed. Furthermore, the network size can be reduced by keeping only the most relevant components. Thus, in contrast to other approaches, ours learns not only the network weights but also the network architecture. The networks have interesting properties: in the long run their states settle onto elliptical trajectories, which allows the prediction of further values and yields compact representations of functions. We demonstrate this in several experiments, among them multiple superimposed oscillators (MSO), robotic soccer, and stock-price prediction. Predictive neural networks outperform the previous state of the art on the MSO task with a minimal number of units.

    Comment: 22 pages, 14 figures and tables, revised implementation
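    The training procedure the abstract describes, solving a linear equation system instead of running backpropagation, can be sketched roughly as follows. This is a minimal toy, assuming an echo-state-style setup with a random linearly activated reservoir and a least-squares readout; all names, sizes, and the test signal are illustrative, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy setup: N linearly activated recurrent units, T samples.
    N, T = 50, 200
    W = rng.standard_normal((N, N))
    # Scale the random recurrent matrix so its spectral radius is 1,
    # keeping the linear dynamics from exploding or dying out too fast.
    W *= 1.0 / np.max(np.abs(np.linalg.eigvals(W)))

    # Target: an MSO-style signal, i.e. a sum of two sinusoids.
    t = np.arange(T)
    f = np.sin(0.2 * t) + np.sin(0.311 * t)

    # Run the linear recurrence x_{k+1} = W x_k from a random start state
    # and collect the state trajectory as rows of X.
    X = np.empty((T, N))
    x = rng.standard_normal(N)
    for k in range(T):
        X[k] = x
        x = W @ x

    # "Learning" is a single linear least-squares solve for the output
    # weights -- no backpropagation involved.
    w_out, *_ = np.linalg.lstsq(X, f, rcond=None)

    print(np.linalg.norm(X @ w_out - f))  # training-fit residual
    ```

    Predicting further values then amounts to iterating the recurrence past k = T and applying the learned readout; the paper's architecture-reduction step would additionally discard the least relevant state components.
    
    
    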

    Opportunity vs Reality: International students and the American college experience by Nick Davini

    Crystina Friese awarded a Research Experience and Apprenticeship Program (REAP) fellowship for summer 2017

    Alexandra Martin Joins Anthropology as an Affiliate Postdoctoral Research Associate

    Meghan Howey is interviewed by Michigan Radio (NPR)
