NTR Webinar: Vector Symbolic Architectures – AI With Thousand Bit Random Numbers?

NTR organizes and hosts scientific webinars on neural networks and invites speakers from all over the world to present their recent work at the webinars.

On October 5, Evgeny Osipov, Luleå University of Technology, Luleå, Sweden, led a technical Zoom webinar on Vector Symbolic Architectures – AI With Thousand Bit Random Numbers?

About the webinar:

Vector Symbolic Architectures (VSAs), also known as hyperdimensional computing, are a bio-inspired family of connectionist computational models for representing concepts (letters, phonemes, features) and their meanings using principles of distributed data representation.
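To make the idea of distributed representation concrete, here is a minimal NumPy sketch of one common VSA flavor: binary hypervectors with XOR as the binding operation and an element-wise majority vote as bundling. The symbol names (`color`, `red`, etc.) are hypothetical, chosen only for illustration; actual VSAs come in several algebraic variants not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

# Random binary hypervectors for a small illustrative vocabulary.
names = ["color", "red", "shape", "circle", "size", "small"]
vocab = {n: rng.integers(0, 2, D, dtype=np.uint8) for n in names}

bind = np.bitwise_xor  # binding: element-wise XOR (it is its own inverse)

def bundle(vectors):
    # Bundling: element-wise majority vote (use an odd count to avoid ties).
    return (np.sum(vectors, axis=0) * 2 > len(vectors)).astype(np.uint8)

def hamming(a, b):
    # Normalized Hamming distance: fraction of differing bits.
    return np.count_nonzero(a != b) / D

# Encode the record {color: red, shape: circle, size: small}
# as a single hypervector of the same dimensionality.
record = bundle([bind(vocab["color"], vocab["red"]),
                 bind(vocab["shape"], vocab["circle"]),
                 bind(vocab["size"], vocab["small"])])

# Query: which filler is bound to "color"? Unbinding (XOR again) yields a
# noisy copy of the answer; clean it up by nearest-neighbor search.
query = bind(record, vocab["color"])
answer = min(vocab, key=lambda n: hamming(query, vocab[n]))
print(answer)  # red
```

Note that the whole key–value record lives in one fixed-width vector, and retrieval degrades gracefully with noise; this holographic property is what the talk's "thousand bit random numbers" refers to.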

The term hyperdimensional computing is rooted in the observation that key aspects of human memory, perception, and cognition can be explained by the mathematical properties of high-dimensional spaces (the concentration of measure phenomenon).
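The concentration of measure effect is easy to observe directly: for randomly drawn binary hypervectors, pairwise normalized Hamming distances concentrate tightly around 0.5, so independently generated vectors are almost surely quasi-orthogonal. A small NumPy check (dimensionality and sample count chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000  # hypervector dimensionality
N = 100     # number of random hypervectors to compare

vectors = rng.integers(0, 2, size=(N, D), dtype=np.uint8)

# All pairwise normalized Hamming distances.
dists = [np.count_nonzero(vectors[i] != vectors[j]) / D
         for i in range(N) for j in range(i + 1, N)]

# Even the extremes stay very close to 0.5.
print(min(dists), max(dists))
```

With D = 10,000 the standard deviation of the distance is about 0.005, so all 4,950 pairs land within a few percent of 0.5; it is this reliable quasi-orthogonality that lets VSAs treat fresh random vectors as distinct symbols.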

It has been advocated that VSAs have the potential to bridge the gap between the symbolic and artificial neural network (ANN) paradigms, which makes them especially appealing for building novel AI systems on unconventional (neuromorphic) hardware.

In this talk, I gave an overview of the main principles of Vector Symbolic Architectures, as well as their most notable applications.

Webinar presentation.  

Moderator and contact: NTR CEO Nick Mikhailovsky: nickm@ntrlab.com.
