Recursive Binding for Similarity-Preserving Hypervector Representations of Sequences
Rachkovskij, Dmitri A.: International Research and Training Center for Information Technologies and Systems, Kiev, Ukraine; Luleå University of Technology, Luleå, Sweden.
Kleyko, Denis: University of California at Berkeley, Redwood Center for Theoretical Neuroscience, Berkeley, United States; Research Institutes of Sweden, Intelligent Systems Lab, Kista, Sweden. ORCID iD: 0000-0002-6032-6155
2022 (English). In: 2022 International Joint Conference on Neural Networks (IJCNN): Proceedings, IEEE, 2022. Conference paper, published paper (refereed).
Abstract [en]

Hyperdimensional computing (HDC), also known as vector symbolic architectures (VSA), is a computing framework used within artificial intelligence and cognitive computing that operates with distributed vector representations of large fixed dimensionality. A critical step in designing HDC/VSA solutions is obtaining such representations from the input data. Here, we focus on sequences, a widespread data type, and propose a transformation to distributed representations that both preserves the similarity of identical sequence elements at nearby positions and is equivariant with respect to sequence shift. These properties are enabled by forming representations of sequence positions using recursive binding and superposition operations. The proposed transformation was experimentally investigated with symbolic strings used for modeling human perception of word similarity. The results obtained are on a par with those of more sophisticated approaches from the literature. The proposed transformation was designed for the HDC/VSA model known as Fourier Holographic Reduced Representations, but it can be adapted to some other HDC/VSA models.
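
The abstract names the main ingredients of the approach: FHRR hypervectors, binding, superposition, and recursively bound position codes. The Python sketch below is a minimal illustration of those ingredients under stated assumptions, not a reconstruction of the paper's method: the function names (random_phasor, bind, encode_sequence), the narrow phase spread of the position seed, and the toy example strings are all illustrative choices.

import numpy as np

def random_phasor(dim, rng):
    # FHRR hypervector: unit-modulus complex entries with i.i.d. random phases.
    return np.exp(1j * rng.uniform(-np.pi, np.pi, dim))

def bind(a, b):
    # FHRR binding: element-wise complex multiplication (phases add up).
    return a * b

def similarity(a, b):
    # Cosine-like similarity: real part of the inner product, normalized.
    return np.real(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

def encode_sequence(seq, codebook, pos_seed):
    # Bind each symbol hypervector to a position code obtained by recursively
    # binding pos_seed with itself, then superpose (sum) the bound pairs.
    dim = len(pos_seed)
    pos = np.ones(dim, dtype=complex)        # position 0: identity under binding
    acc = np.zeros(dim, dtype=complex)
    for symbol in seq:
        acc += bind(codebook[symbol], pos)   # superposition of symbol-position pairs
        pos = bind(pos, pos_seed)            # recursive binding: advance one position
    # Binding the result with pos_seed advances every position code by one step,
    # which is the shift-equivariance property of this construction.
    return acc

rng = np.random.default_rng(0)
dim = 4096
codebook = {ch: random_phasor(dim, rng) for ch in "abcdefghijklmnopqrstuvwxyz"}
# Narrow phase spread in the position seed keeps nearby positions similar;
# this particular choice is illustrative, not taken from the paper.
pos_seed = np.exp(1j * rng.uniform(-np.pi / 16, np.pi / 16, dim))

hv_a = encode_sequence("binding", codebook, pos_seed)
hv_b = encode_sequence("bindings", codebook, pos_seed)  # shares a long prefix
hv_c = encode_sequence("zzzzzzz", codebook, pos_seed)   # unrelated string
print(similarity(hv_a, hv_b), similarity(hv_a, hv_c))

With a full-range random phase seed, position codes at different offsets would be nearly orthogonal; narrowing the phase spread is one simple way to let identical symbols at nearby positions contribute similar terms, which is the kind of similarity preservation the paper targets.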

Place, publisher, year, edition, pages
IEEE, 2022.
Series
Proceedings of the International Joint Conference on Neural Networks, ISSN 2161-4393, E-ISSN 2161-4407
Keywords [en]
hyperdimensional computing, vector symbolic architectures, distributed representation, hypervector, data structures, sequence representation, similarity preserving transformation, shift equivariance, recursive binding
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-116435
DOI: 10.1109/IJCNN55064.2022.9892462
ISI: 000867070904096
Scopus ID: 2-s2.0-8513751938
OAI: oai:DiVA.org:oru-116435
DiVA id: diva2:1902076
Conference
The International Joint Conference on Neural Networks, (IJCNN 2022), Padua, Italy, July 18-23, 2022
Funder
EU, Horizon 2020, 839179; Swedish Foundation for Strategic Research, UKR22-0024
Note

The work of DK was supported by the European Union's Horizon 2020 Programme under the MSCA Individual Fellowship Grant (839179). DK was also supported in part by AFOSR FA9550-19-1-0241 and Intel's THWAI program. The work of DAR was supported in part by the National Academy of Sciences of Ukraine (grant 0121U000016), the Ministry of Education and Science of Ukraine (grant nos. 0121U000228 and 0122U000818), and the Swedish Foundation for Strategic Research (SSF, grant no. UKR22-0024).

Available from: 2024-10-01. Created: 2024-10-01. Last updated: 2024-10-04. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Kleyko, Denis
