Encoding Sequential Information in Vector Space Models of Semantics: Comparing Holographic Reduced Representation and Random Permutation

Recchia, Gabriel and Jones, Michael and Sahlgren, Magnus and Kanerva, Pentti (2010) Encoding Sequential Information in Vector Space Models of Semantics: Comparing Holographic Reduced Representation and Random Permutation. In: Proceedings of the 32nd Annual Cognitive Science Society, 11-14 Aug 2010, Portland, Oregon, USA.


Abstract

Encoding information about the order in which words typically appear has been shown to improve the performance of high-dimensional semantic space models. This requires an encoding operation capable of binding together vectors in an order-sensitive way, and efficient enough to scale to large text corpora. Although both circular convolution and random permutations have been enlisted for this purpose in semantic models, these operations have never been systematically compared. In Experiment 1 we compare their storage capacity and probability of correct retrieval; in Experiments 2 and 3 we compare their performance on semantic tasks when integrated into existing models. We conclude that random permutations are a scalable alternative to circular convolution with several desirable properties.
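The two encoding operations compared in the paper can be illustrated in a few lines. Below is a minimal NumPy sketch (not from the paper itself; dimensionality and vector distributions are illustrative assumptions): circular convolution binds two vectors into one via the FFT, while random permutation encodes order by reordering a vector's coordinates with a fixed permutation before superposition.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1024  # dimensionality; an illustrative choice, not the paper's setting

def circular_convolution(a, b):
    # Bind two vectors by circular convolution, computed efficiently via FFT:
    # c[k] = sum_j a[j] * b[(k - j) mod d]
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def permute(v, perm):
    # Bind by reordering a vector's coordinates with a fixed random permutation.
    return v[perm]

# Random index vectors, normalized so expected length is ~1.
a = rng.standard_normal(d) / np.sqrt(d)
b = rng.standard_normal(d) / np.sqrt(d)
perm = rng.permutation(d)

# Order-sensitive pair encodings: conv(a, b) != conv(b, a) in general,
# and a + permute(b) distinguishes "b after a" from "a after b".
bound_conv = circular_convolution(a, b)
bound_perm = a + permute(b, perm)
```

Both operations preserve dimensionality, which is what lets them scale to large corpora: a sequence of any length is stored in a single d-dimensional vector. Permutation is additionally invertible exactly (apply the inverse permutation), whereas retrieval from a convolution binding uses an approximate inverse (correlation).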

Item Type: Conference or Workshop Item (Paper)
ID Code: 3999
Deposited By: Magnus Sahlgren
Deposited On: 24 Aug 2010 13:51
Last Modified: 03 Jan 2011 10:12
