Recursive neural network

Source: Wikipedia, the free encyclopedia.

A recursive neural network is a kind of deep neural network created by applying the same set of weights recursively over a structured input, producing a structured prediction over variable-size input structures, or a scalar prediction on it, by traversing the given structure in topological order. Such networks were first introduced to learn distributed representations of structure, such as logical terms.[1]
Models and general frameworks have been developed in further works since the 1990s.[2][3]

Architectures

Basic

A simple recursive neural network architecture

In the simplest architecture, nodes are combined into parents using a weight matrix that is shared across the whole network, and a non-linearity such as tanh. If c1 and c2 are n-dimensional vector representations of nodes, their parent is also an n-dimensional vector, calculated as

p1,2 = tanh(W [c1; c2])

where W is a learned n × 2n weight matrix and [c1; c2] denotes the concatenation of the two child vectors.
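The shared-weight composition can be sketched in a few lines of NumPy; the dimension, weight matrix, and leaf vectors below are illustrative placeholders, not learned values:

```python
import numpy as np

# Minimal sketch of the basic recursive composition, assuming n = 4
# and randomly initialized weights (in practice W is learned).
rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, 2 * n)) * 0.1  # shared n x 2n weight matrix

def compose(c1, c2):
    """Combine two n-dimensional child vectors into a parent vector."""
    return np.tanh(W @ np.concatenate([c1, c2]))

# Build a representation for the tree ((a b) c) bottom-up.
a, b, c = (rng.standard_normal(n) for _ in range(3))
p_ab = compose(a, b)      # parent of leaves a and b
root = compose(p_ab, c)   # root combines (a b) with c
```

The same compose function is applied at every internal node, so W is shared across the whole structure regardless of tree size.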

This architecture, with a few improvements, has been used for successfully parsing natural scenes, syntactic parsing of natural language sentences,[4] and recursive autoencoding and generative modeling of 3D shape structures in the form of cuboid abstractions.[5]

Recursive cascade correlation (RecCC)

RecCC is a constructive neural network approach to deal with tree domains[2] with pioneering applications to chemistry[6] and extension to directed acyclic graphs.[7]

Unsupervised RNN

A framework for unsupervised RNN has been introduced in 2004.[8][9]

Tensor

Recursive neural tensor networks use a single tensor-based composition function for all nodes in the tree.[10]
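A hypothetical sketch of such a tensor-based composition, in which each output dimension k adds a bilinear term c^T V[k] c to the usual linear term W c (the weights here are random placeholders, not trained RNTN parameters):

```python
import numpy as np

# Illustrative tensor composition: a third-order tensor V lets the two
# children interact multiplicatively, unlike the purely linear W c term.
rng = np.random.default_rng(1)
n = 3
W = rng.standard_normal((n, 2 * n)) * 0.1
V = rng.standard_normal((n, 2 * n, 2 * n)) * 0.1  # third-order tensor

def compose_tensor(c1, c2):
    c = np.concatenate([c1, c2])                 # stacked children, shape (2n,)
    bilinear = np.einsum('i,kij,j->k', c, V, c)  # c^T V[k] c for each k
    return np.tanh(bilinear + W @ c)

p = compose_tensor(rng.standard_normal(n), rng.standard_normal(n))
```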

Training

Stochastic gradient descent

Typically, stochastic gradient descent (SGD) is used to train the network. The gradient is computed using backpropagation through structure (BPTS), a variant of backpropagation through time used for recurrent neural networks.
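The idea behind backpropagation through structure can be illustrated on a tiny two-node tree, assuming the basic shared-weight tanh composition and a toy quadratic loss: the gradient with respect to the shared W is the sum of per-node contributions, with error signals propagated from parent to child. This is only an illustrative sketch, checked here against a numerical gradient:

```python
import numpy as np

# BPTS sketch for the tree ((a b) c): gradients of a scalar loss with
# respect to the shared W accumulate over every internal node.
rng = np.random.default_rng(2)
n = 2
W = rng.standard_normal((n, 2 * n)) * 0.1
a, b, c = (rng.standard_normal(n) for _ in range(3))

def forward(W):
    p1 = np.tanh(W @ np.concatenate([a, b]))   # first composition
    p2 = np.tanh(W @ np.concatenate([p1, c]))  # root composition
    return 0.5 * np.sum(p2 ** 2), p1, p2       # toy scalar loss

def bpts_grad(W):
    loss, p1, p2 = forward(W)
    c2 = np.concatenate([p1, c])
    delta2 = p2 * (1 - p2 ** 2)                # error at the root's pre-activation
    grad = np.outer(delta2, c2)                # root node's contribution to dL/dW
    # Propagate the error into the child p1 and add that node's contribution.
    delta1 = (W[:, :n].T @ delta2) * (1 - p1 ** 2)
    grad += np.outer(delta1, np.concatenate([a, b]))
    return grad

# Sanity check against a central-difference gradient on one entry of W.
g = bpts_grad(W)
eps = 1e-5
Wp = W.copy(); Wp[0, 0] += eps
Wm = W.copy(); Wm[0, 0] -= eps
num = (forward(Wp)[0] - forward(Wm)[0]) / (2 * eps)
```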

Properties

The universal approximation capability of RNNs over trees has been proved in the literature.[11][12]

Related models

Recurrent neural networks

Recurrent neural networks are recursive artificial neural networks with a particular structure: that of a linear chain. Whereas recursive neural networks operate on any hierarchical structure, combining child representations into parent representations, recurrent neural networks operate on the linear progression of time, combining the previous time step and a hidden representation into the representation for the current time step.
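This relationship can be made concrete: the recurrent update is the same kind of shared-weight composition applied along a chain, folding each new input into the running hidden state instead of combining two tree children (a sketch with placeholder random weights):

```python
import numpy as np

# A recurrent network as the linear-chain special case of recursive
# composition: at each time step, combine the previous hidden state h
# with the current input x using one shared weight matrix.
rng = np.random.default_rng(3)
n = 4
W = rng.standard_normal((n, 2 * n)) * 0.1  # same shared-weight shape as before

def step(h, x):
    return np.tanh(W @ np.concatenate([h, x]))

h = np.zeros(n)
for x in rng.standard_normal((5, n)):  # five time steps
    h = step(h, x)
```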

Tree Echo State Networks

An efficient approach to implement recursive neural networks is given by the Tree Echo State Network[13] within the reservoir computing paradigm.
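A hypothetical sketch of the reservoir-computing idea behind a Tree Echo State Network: the recursive composition weights stay fixed and random (a reservoir), and only a linear readout on the resulting state would be trained. The contractive scaling used below is an illustrative heuristic, not the published model's exact construction:

```python
import numpy as np

# Fixed random reservoir composition; only W_out would be trained.
rng = np.random.default_rng(4)
n = 8
W = rng.standard_normal((n, 2 * n))
# Heuristic stability scaling: shrink the child-facing block's spectral radius.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W[:, :n])))

def reservoir(c1, c2):
    return np.tanh(W @ np.concatenate([c1, c2]))

leaves = rng.standard_normal((3, n))
root = reservoir(reservoir(leaves[0], leaves[1]), leaves[2])
W_out = rng.standard_normal((1, n))  # linear readout; the only trained part
y = W_out @ root
```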

Extension to graphs

Extensions to graphs include graph neural network (GNN),[14] Neural Network for Graphs (NN4G),[15] and more recently convolutional neural networks for graphs.

References

  1. S2CID 6536466.
  2. ^ .
  3. .
  4. ^ Socher, Richard; Lin, Cliff; Ng, Andrew Y.; Manning, Christopher D. "Parsing Natural Scenes and Natural Language with Recursive Neural Networks" (PDF). The 28th International Conference on Machine Learning (ICML 2011).
  5. S2CID 20432407.
  6. .
  7. .
  8. .
  9. .
  10. ^ Socher, Richard; Perelygin, Alex; Y. Wu, Jean; Chuang, Jason; D. Manning, Christopher; Y. Ng, Andrew; Potts, Christopher. "Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank" (PDF). EMNLP 2013.
  11. .
  12. .
  13. .
  14. .
  15. .