BackPropagation Through Cyclic Structures

Abstract

Recursive neural networks are a powerful tool for processing structured data. In the recursive learning paradigm, the information to be processed consists of directed positional acyclic graphs (DPAGs); in fact, recursive networks are fed following the partial order defined by the links of the graph. Unfortunately, the restriction to DPAGs is sometimes too severe, since the nature of some real-world problems is intrinsically disordered and cyclic. In this paper, a methodology is proposed which allows us to map any cyclic directed graph into a "recursive-equivalent" tree. As a consequence, the computational power of recursive networks is precisely characterized, which also clarifies the underlying limitations of the model. The approach was tested on the subgraph-isomorphism detection problem, showing very promising results.
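
As an illustration of the general idea, the following Python sketch unfolds a cyclic directed graph into a tree by expanding each node's successors until a node reappears on the current root-to-node path. The function name, data structures, and this particular cycle-breaking rule are illustrative assumptions, not the exact "recursive-equivalent" construction defined in the paper.

```python
from collections import namedtuple

# A directed graph as an adjacency list: node label -> list of successor labels.
# A tree node keeps the original graph label plus its unfolded children.
TreeNode = namedtuple("TreeNode", ["label", "children"])

def unfold(graph, root, path=()):
    """Unfold a (possibly cyclic) directed graph into a tree rooted at `root`.

    A successor is expanded recursively unless it already occurs on the
    current root-to-node path; in that case recursion stops and the node
    is kept as a leaf, so every cycle is broken exactly once.
    (Hypothetical helper, for illustration only.)
    """
    children = []
    for succ in graph.get(root, []):
        if succ in path:
            # Back edge detected: keep its target as a leaf instead of recursing.
            children.append(TreeNode(succ, []))
        else:
            children.append(unfold(graph, succ, path + (root,)))
    return TreeNode(root, children)

if __name__ == "__main__":
    # A small cyclic graph: a -> b, b -> c, b -> d, c -> a.
    g = {"a": ["b"], "b": ["c", "d"], "c": ["a"], "d": []}
    print(unfold(g, "a"))
```

The resulting tree is acyclic by construction, so it can be processed by a recursive network following the usual partial order from the leaves to the root.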
