Neurogenesis Deep Learning

(Submitted on 12 Dec 2016)

Abstract: Neural machine learning methods, such as deep neural networks (DNN), have achieved remarkable success in a number of complex data processing tasks. These methods have arguably had their strongest impact on tasks such as image and audio processing, data processing domains in which humans have long held clear advantages over conventional algorithms. In contrast to biological neural systems, which are capable of learning continuously, deep artificial networks have only a limited ability to incorporate new information into an already trained network. As a result, methods for continuous learning are potentially highly impactful in enabling the application of deep networks to dynamic data sets. Here, inspired by the process of adult neurogenesis in the hippocampus, we explore the potential for adding new neurons to deep layers of artificial neural networks in order to facilitate their acquisition of novel information while preserving previously trained data representations. Our results on the MNIST handwritten digit dataset and the NIST SD 19 dataset, which includes lower and upper case letters and digits, demonstrate that neurogenesis is well suited for addressing the stability-plasticity dilemma that has long challenged adaptive machine learning algorithms.
Comments: Submitted to IJCNN 2017
Subjects: Neural and Evolutionary Computing (cs.NE); Learning (cs.LG); Machine Learning (stat.ML)
Report number: SAND2016-12514 R
Cite as: arXiv:1612.03770 [cs.NE]
 (or arXiv:1612.03770v1 [cs.NE] for this version)
From: Timothy Draelos [view email]
[v1] Mon, 12 Dec 2016 16:25:23 GMT (1502kb)
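The core mechanism the abstract describes, growing a trained layer with new neurons while leaving its existing weights untouched, can be sketched in a few lines of NumPy. This is a minimal illustration of the general idea, not the paper's actual method: the layer sizes, initialization scale, and the `add_neurons` helper are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a trained dense layer: 4 inputs -> 3 hidden units.
W = rng.normal(scale=0.1, size=(3, 4))  # "trained" weights
b = np.zeros(3)

def add_neurons(W, b, n_new, rng):
    """Grow a layer by n_new neurons, keeping the trained weights intact.

    New neurons get freshly initialized incoming weights; the original
    rows of W are copied unchanged, so previously learned representations
    are preserved.
    """
    W_new = rng.normal(scale=0.1, size=(n_new, W.shape[1]))
    return np.vstack([W, W_new]), np.concatenate([b, np.zeros(n_new)])

W2, b2 = add_neurons(W, b, 2, rng)

assert W2.shape == (5, 4)          # layer grew from 3 to 5 neurons
assert np.allclose(W2[:3], W)      # old weights are untouched
```

In a continual-learning setting, the idea is that only the new rows (and the downstream weights they feed) would be trained on the new data, with the old weights frozen or regularized so that earlier representations remain stable.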
