Örebro University Publications
On Design Choices in Similarity-Preserving Sparse Randomized Embeddings
Örebro University, School of Science and Technology. Research Institutes of Sweden, Kista, Sweden. ORCID iD: 0000-0002-6032-6155
Luleå University of Technology, Luleå, Sweden; RTC for IT and Systems, Kiev, Ukraine.
2024 (English). In: 2024 International Joint Conference on Neural Networks (IJCNN), IEEE, 2024. Conference paper, published paper (refereed).
Abstract [en]

Expand & Sparsify is a principle observed in anatomically similar neural circuits found in the mushroom body (insects) and the cerebellum (mammals). Sensory data are projected randomly to a much higher dimensionality (expand part), where only a few of the most strongly excited neurons are activated (sparsify part). This principle has been leveraged to design the FlyHash algorithm, which forms similarity-preserving sparse embeddings that have been found useful for tasks such as novelty detection, pattern recognition, and similarity search. Despite its simplicity, FlyHash involves a number of design choices, such as the preprocessing of the input data, the choice of the sparsifying activation function, and the formation of the random projection matrix. In this paper, we explore the effect of these choices on the performance of similarity search with FlyHash embeddings. We find that the right combination of design choices can lead to a drastic difference in search performance.
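For readers unfamiliar with the method under study, the expand & sparsify idea behind FlyHash can be sketched in a few lines. The sketch below is illustrative only: the sparse binary projection matrix, the top-k winner-take-all sparsification, and all parameter values are common choices for such embeddings, not the specific design choices evaluated in the paper.

```python
import numpy as np

def flyhash(x, proj, k):
    """Similarity-preserving sparse embedding via expand & sparsify.

    x:    input vector of dimension d
    proj: binary random projection matrix of shape (m, d), with m >> d
    k:    number of winners kept active (winner-take-all sparsification)
    Returns a binary vector of dimension m with exactly k ones.
    """
    y = proj @ x                     # expand: random projection to m dimensions
    out = np.zeros_like(y)
    out[np.argsort(y)[-k:]] = 1.0    # sparsify: keep only the k largest activations
    return out

rng = np.random.default_rng(0)
d, m, k = 50, 2000, 40
# Each expanded neuron samples a small random subset of input dimensions
# (sparse binary projection, ~10% nonzeros per row).
proj = (rng.random((m, d)) < 0.1).astype(float)
x = rng.random(d)
h = flyhash(x, proj, k)
```

Because nearby inputs tend to excite the same expanded neurons most strongly, the overlap between two such binary embeddings approximately preserves the similarity of the original inputs, which is what makes them usable for similarity search.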

Place, publisher, year, edition, pages
IEEE, 2024.
Series
Proceedings of the International Joint Conference on Neural Networks, ISSN 2161-4407, E-ISSN 2161-4393
Keywords [en]
random projection, Winner-Take-All, sparse representations, hyperdimensional computing, expand & sparsify
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:oru:diva-117700
DOI: 10.1109/IJCNN60899.2024.10651277
ISI: 001392668204066
Scopus ID: 2-s2.0-85205027306
ISBN: 9798350359312 (electronic)
ISBN: 9798350359329 (print)
OAI: oai:DiVA.org:oru-117700
DiVA, id: diva2:1919489
Conference
The International Joint Conference on Neural Networks (IJCNN 2024), Yokohama, Japan, June 30 - July 5, 2024
Funder
EU, Horizon 2020, 839179
Swedish Foundation for Strategic Research, UKR22-0024; UKR24-0014
Note

The work of DK was supported by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie Individual Fellowship Grant Agreement 839179. The work of DAR was supported in part by the Swedish Foundation for Strategic Research (SSF, grant nos. UKR22-0024 & UKR24-0014) and the Swedish Research Council Scholars at Risk Sweden (VR SAR, grant no. GU 2022/1963).

Available from: 2024-12-09. Created: 2024-12-09. Last updated: 2025-03-17. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text · Scopus

Authority records

Kleyko, Denis
