
    BackPropagation Through Cyclic Structures

    Recursive neural networks are a powerful tool for processing structured data. According to the recursive learning paradigm, the information to be processed consists of directed positional acyclic graphs (DPAGs); indeed, recursive networks are fed following the partial order defined by the links of the graph. Unfortunately, the hypothesis of processing DPAGs is sometimes too restrictive, since the nature of some real-world problems is intrinsically disordered and cyclic. In this paper, a methodology is proposed which allows us to map any cyclic directed graph into a “recursive-equivalent” tree. Therefore, the computational power of recursive networks is definitively established, also clarifying the underlying limitations of the model. The subgraph-isomorphism detection problem was used for testing the approach, showing very promising results.
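    A minimal sketch of the kind of graph-to-tree unfolding the abstract describes: a cyclic directed graph is expanded from a chosen root into a tree, cutting any edge that would close a cycle on the current root-to-node path. The cut rule and the `unfold` helper are illustrative assumptions, not the paper's exact construction.

    ```python
    def unfold(graph, node, on_path=None):
        """Unfold a directed graph (dict: node -> list of successors) from
        `node` into a nested tuple (node, [children]). Edges that would
        revisit a node already on the current path are cut, so the result
        is always a finite tree even when the graph is cyclic.

        NOTE: the cycle-cutting rule is an assumption for illustration; the
        paper's "recursive-equivalent" tree construction may differ.
        """
        if on_path is None:
            on_path = set()
        on_path = on_path | {node}
        children = [unfold(graph, succ, on_path)
                    for succ in graph.get(node, [])
                    if succ not in on_path]
        return (node, children)

    # Example: a 3-node cycle a -> b -> c -> a, plus an extra edge a -> c.
    g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    tree = unfold(g, "a")
    # tree == ("a", [("b", [("c", [])]), ("c", [])])
    ```

    Once unfolded, the resulting tree can be processed by a standard tree-structured recursive network following its (now well-defined) partial order.
    
    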