Abstract

While neural networks are very successfully applied to the processing of fixed-length vectors and variable-length sequences, the current state of the art does not allow the efficient processing of structured objects of arbitrary shape (such as logical terms, trees, or graphs). We present a connectionist architecture together with a novel supervised learning scheme that is capable of solving inductive inference tasks on complex symbolic structures of arbitrary size. The most general structures that can be handled are labeled directed acyclic graphs. The major difference between our approach and others is that the structure representations are tuned exclusively for the intended inference task. Our method is applied to tasks consisting of the classification of logical terms, ranging from the detection of a certain subterm to the satisfaction of a specific unification pattern. Compared to previously known approaches, we obtained superior results in this domain.
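
The following is a minimal sketch (not the authors' code) of the recursive "folding" idea behind backpropagation through structure: each node of a labeled tree is encoded by passing its label together with the encodings of its children through one shared layer, and a classifier reads the root representation. Training would backpropagate errors through this same unfolded recursion; only the forward pass is shown, and all names (Node, encode, W_fold, ...), the term alphabet, and the sizes are illustrative assumptions.

import numpy as np

LABELS = ["f", "g", "a", "b"]      # assumed term alphabet
DIM = 8                            # assumed representation size
MAX_ARITY = 2                      # assumed maximum arity (binary trees)

rng = np.random.default_rng(0)
W_fold = rng.normal(0, 0.1, (DIM, len(LABELS) + MAX_ARITY * DIM))  # shared folding weights
w_out = rng.normal(0, 0.1, DIM)                                    # classifier weights

class Node:
    def __init__(self, label, children=()):
        self.label, self.children = label, list(children)

def one_hot(label):
    v = np.zeros(len(LABELS))
    v[LABELS.index(label)] = 1.0
    return v

def encode(node):
    # Bottom-up: children are encoded first; missing children are zero vectors.
    kids = [encode(c) for c in node.children]
    kids += [np.zeros(DIM)] * (MAX_ARITY - len(kids))
    x = np.concatenate([one_hot(node.label)] + kids)
    return np.tanh(W_fold @ x)

def classify(term):
    # Probability that the term belongs to the target class
    # (e.g. "contains a given subterm").
    return 1.0 / (1.0 + np.exp(-w_out @ encode(term)))

# Example: encode and classify the term f(g(a, b), a)
term = Node("f", [Node("g", [Node("a"), Node("b")]), Node("a")])
print(classify(term))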

Keywords

Unification, Computer science, Connectionism, Inference, Task (project management), Backpropagation, Theoretical computer science, Artificial intelligence, Range (aeronautics), Directed acyclic graph, Artificial neural network, Inductive bias, Domain (mathematical analysis), Algorithm, Multi-task learning, Mathematics

Related Publications

Neural GPUs Learn Algorithms

Abstract: Learning an algorithm from examples is a fundamental problem that has been widely studied. Recently it has been addressed using neural networks, in particular by Neura...

2016 arXiv (Cornell University) 63 citations

Publication Info

Year: 1996
Type: article
Volume: 1
Pages: 347-352
Citations: 591
Access: Closed

Citation Metrics

591 citations (OpenAlex)

Cite This

Christoph Goller, Andreas Küchler (1996). Learning task-dependent distributed representations by backpropagation through structure. Proceedings of International Conference on Neural Networks (ICNN'96), 1, 347-352. https://doi.org/10.1109/icnn.1996.548916

Identifiers

DOI: 10.1109/icnn.1996.548916