The Orion's Arm Universe Project Forums
Artificial SNN olfactory demonstrates one shot learning - Printable Version

+--- Thread: Artificial SNN olfactory demonstrates one shot learning (/showthread.php?tid=4617)



Artificial SNN olfactory demonstrates one shot learning - Gabrielle of Gaussia - 04-21-2020

One of the notable limitations of ANNs over the last decade has been their dependence on extremely large quantities of training data to get good results on a task. One reason for this is that they rely on brute-force training methods like gradient descent to optimize the network in parameter space. Another is that traditional ANNs can't really learn on the job: with the exception of RNNs, each layer of the network only outputs once during a given task, so there's not much room for learning based on interaction between activity and synaptic weights. There's another type of ANN, the Spiking Neural Network (SNN), that doesn't work this way. Unlike traditional ANNs, SNNs are asynchronous; each neuron is constantly firing spikes and receiving input spikes from other neurons, which means you can have interaction between input spikes and synaptic weights. The main problem with SNNs is that they are much more computationally intensive to run on traditional hardware than the currently fashionable ANN types, so SNN research hasn't really been done at the same scale yet. Recently, though, Intel developed neuromorphic ASICs (Application-Specific Integrated Circuits) code-named Loihi that run SNNs extremely efficiently, and a new SNN research community has grown up around the chips. They've been developing new learning algorithms for SNNs, including ones that take advantage of the interaction between activity and synaptic weights. Using one such learning algorithm, researchers were recently able to create an artificial olfactory system that could actively learn new smells after one or a few exposures. In my opinion that's far more impressive than things like AlphaGo, because this sort of one-shot learning is often cited as one of the essential qualities of AGI, since it's something we're able to do without much effort.
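To make the "interaction between activity and synaptic weights" idea concrete, here's a toy sketch of a leaky integrate-and-fire neuron whose input weight strengthens when input spikes coincide with the neuron firing, in the general spirit of spike-timing-dependent plasticity. This is my own illustration with made-up parameters, not the paper's model or Intel's Loihi API.

```python
# Toy leaky integrate-and-fire (LIF) neuron with a simple activity-driven
# weight update. Illustrative only -- not the paper's algorithm.

def run_lif(input_spikes, w=0.5, tau=0.9, threshold=1.0, lr=0.05):
    """Simulate one LIF neuron over a binary input spike train.

    The weight w is potentiated whenever an input spike helps push
    the membrane potential over threshold, so the learning signal
    comes from the interplay of spikes and weights, not from a
    separate offline gradient-descent pass.
    """
    v = 0.0                          # membrane potential
    output_spikes = []
    for t, s in enumerate(input_spikes):
        v = tau * v + w * s          # leak, then integrate weighted input
        if v >= threshold:           # neuron fires a spike
            output_spikes.append(t)
            v = 0.0                  # reset after the spike
            if s:                    # input spike coincided with firing:
                w += lr * (1 - w)    # potentiate, bounded below 1
    return output_spikes, w

# A regular input train: the weight grows as input and output correlate.
spikes, w_final = run_lif([1, 1, 1, 1, 1, 1, 1, 1])
print(spikes, round(w_final, 3))
```

The point of the sketch is just that plasticity happens *during* operation, inside the spike loop, rather than in a separate training phase.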
Anyway, this sort of thing makes me much more hopeful that SNNs will be able to demonstrate other qualities of AGI, like unsupervised learning, the ability to learn entirely new skills without direction, and the ability to combine multiple disparate skills for a new task that hasn't been encountered before. The model ran on a single Loihi chip, which can only simulate about 130k spiking neurons at a time, but Intel recently developed a rack-mounted system, Pohoiki Springs, which can simulate SNNs with 100 million neurons on a power budget of only 300 W, so I expect much more impressive research in the next 1-2 years. I really wouldn't be surprised if technology similar to this allows for the creation of AGI in roughly the same timeframe as it occurs in OA.

TL;DR: Researchers working on unorthodox learning algorithms for Spiking Neural Networks have demonstrated one-shot learning, an often-cited quality of AGI, in recent research such as this artificial olfactory SNN.

https://www.nature.com/articles/s42256-020-0159-4


RE: Artificial SNN olfactory demonstrates one shot learning - Gabrielle of Gaussia - 04-21-2020

I'm reading through part of the study, and another thing of note is that while it's difficult to train ANNs on new skills or new types of inputs even with transfer learning, it's relatively easy with SNNs. The artificial olfactory system in the paper would use up some of a type of inhibitory cell when it learned to distinguish a new smell, specializing those cells to that smell and making them less plastic; after training on one set of smells, the SNN could learn a new set by generating a fresh batch of inhibitory cells through simulated neurogenesis. Honestly, I wouldn't be surprised if SNNs become as popular as other ANN types this decade as more and more practical applications of their advantages are found.
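The "use up cells, then regrow them" scheme can be sketched very roughly like this. All names and numbers here are hypothetical, my reading of the post rather than the paper's actual code; the idea is just that specialized units are frozen so old odors aren't overwritten, and new plastic units are spawned on demand.

```python
# Toy sketch of learning odors by consuming plastic inhibitory units
# and regrowing them via simulated neurogenesis. Hypothetical names,
# not the paper's implementation.

class OdorMemory:
    def __init__(self, pool_size=2):
        self.free = pool_size      # plastic inhibitory units still available
        self.specialized = {}      # odor label -> count of frozen units

    def learn(self, odor, units_needed=1):
        # Simulated neurogenesis: when the plastic pool is exhausted,
        # grow fresh units instead of overwriting frozen ones, so
        # previously learned odors are untouched (no interference).
        if self.free < units_needed:
            self.free += units_needed
        self.free -= units_needed
        # Specializing units to an odor freezes them (less plastic).
        self.specialized[odor] = self.specialized.get(odor, 0) + units_needed

mem = OdorMemory(pool_size=2)
for odor in ["banana", "coffee", "smoke"]:  # third odor exceeds the pool
    mem.learn(odor)
print(mem.specialized, mem.free)
```

Contrast this with retraining a conventional ANN, where updating shared weights for a new class tends to degrade the old ones (catastrophic forgetting).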