Generalized learning vector quantization for classification in randomized neural networks and hyperdimensional computing
Department of Computer Science, Rice University, Houston, USA.
UC Berkeley, Berkeley, USA; Research Institutes of Sweden, Kista, Sweden. ORCID iD: 0000-0002-6032-6155
Berkeley Wireless Research Center, UC Berkeley, Berkeley, USA.
Redwood Center for Theoretical Neuroscience, UC Berkeley, Berkeley, USA.
2021 (English). In: 2021 International Joint Conference on Neural Networks (IJCNN), IEEE, 2021, p. 1-9. Conference paper, published paper (refereed).
Abstract [en]

Machine learning algorithms deployed on edge devices must meet strict resource and efficiency constraints. Random Vector Functional Link (RVFL) networks are favored for such applications due to their simple design and training efficiency. We propose a modified RVFL network that avoids computationally expensive matrix operations during training, thus expanding the network's range of potential applications. Our modification replaces the least-squares classifier with the Generalized Learning Vector Quantization (GLVQ) classifier, which only employs simple vector and distance calculations. The GLVQ classifier can also be considered an improvement upon certain classification algorithms popularly used in the area of Hyperdimensional Computing. The proposed approach achieved state-of-the-art accuracy on a collection of datasets from the UCI Machine Learning Repository, higher than that of previously proposed RVFL networks. We further demonstrate that our approach remains highly accurate even when the number of training iterations is severely limited, on average using only 21% of the computational cost of the least-squares classifier.
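The record contains no code; purely to illustrate the mechanism described in the abstract, the sketch below combines an RVFL-style random feature expansion with a simple GLVQ classifier using one prototype per class. All specifics here (function names, sigmoid activation, prototype initialization at class means, learning rate, number of hidden units, omission of the monotone squashing of the relative distance) are assumptions made for the example, not the authors' implementation.

import numpy as np

def rvfl_glvq_sketch(X, y, n_hidden=100, lr=0.01, epochs=20, seed=0):
    """Illustrative sketch (not the paper's code): RVFL random feature
    expansion followed by a GLVQ classifier with one prototype per class."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    classes = np.unique(y)

    # RVFL part: fixed random hidden layer, concatenated with the raw inputs.
    W = rng.uniform(-1.0, 1.0, size=(d, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)

    def expand(X_in):
        H = 1.0 / (1.0 + np.exp(-(X_in @ W + b)))   # sigmoid activations
        return np.hstack([X_in, H])                 # direct input links

    F = expand(X)

    # GLVQ part: one prototype per class, initialized at the class means.
    protos = np.vstack([F[y == c].mean(axis=0) for c in classes])

    for _ in range(epochs):
        for f, label in zip(F, y):
            dist = np.sum((protos - f) ** 2, axis=1)      # squared distances
            same = np.where(classes == label)[0]
            other = np.where(classes != label)[0]
            i_plus = same[np.argmin(dist[same])]          # closest correct prototype
            i_minus = other[np.argmin(dist[other])]       # closest wrong prototype
            d_plus, d_minus = dist[i_plus], dist[i_minus]
            denom = (d_plus + d_minus) ** 2
            # GLVQ update (squashing of the relative distance omitted for brevity):
            # pull the correct prototype toward the sample, push the wrong one away.
            protos[i_plus] += lr * (d_minus / denom) * (f - protos[i_plus])
            protos[i_minus] -= lr * (d_plus / denom) * (f - protos[i_minus])

    def predict(X_new):
        F_new = expand(X_new)
        d_all = ((F_new[:, None, :] - protos[None, :, :]) ** 2).sum(axis=2)
        return classes[np.argmin(d_all, axis=1)]

    return predict

# Hypothetical usage: predict = rvfl_glvq_sketch(X_train, y_train); y_hat = predict(X_test)

The point of the substitution, as the abstract states, is that the least-squares readout requires forming and inverting a matrix over the expanded features, whereas prototype updates of this kind need only vector subtractions and distance comparisons, which suits resource-constrained edge devices.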

Place, publisher, year, edition, pages
IEEE, 2021. p. 1-9
Series
IEEE International Joint Conference on Neural Networks (IJCNN)
Keywords [en]
learning vector quantization, randomly connected neural networks, hyperdimensional computing, random vector functional link networks
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-116027
DOI: 10.1109/IJCNN52387.2021.9533316
ISI: 000722581700029
Scopus ID: 2-s2.0-85108610763
OAI: oai:DiVA.org:oru-116027
DiVA, id: diva2:1898913
Conference
International Joint Conference on Neural Networks (IJCNN 2021), Virtual, Shenzhen, July 18-22, 2021
Funder
EU, Horizon 2020, 839179
Note

The work of DK was supported by the European Union's Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie Individual Fellowship Grant Agreement 839179. The work of BAO, JMR, and DK was supported in part by DARPA's VIP (Super-HD Project) and AIE (HyDDENN Project) programs.

Available from: 2024-09-18. Created: 2024-09-18. Last updated: 2024-09-18. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Kleyko, Denis
