Journal of Applied Sciences


Growing Sparse Neural Networks

FENG Chao, LI Ning, LI Shao-yuan

  1. Institute of Automation, Shanghai Jiaotong University, Shanghai 200240
  • Received: 2007-09-14  Online: 2008-03-31  Published: 2008-03-31

Abstract: When sparse neural networks are applied in practice, it is difficult to determine a proper connection rate and number of hidden nodes in advance. Drawing on characteristics of biological neural networks, this paper proposes learning algorithms that change the network's connectivity and number of hidden nodes during learning, so an accurate connection rate need not be fixed beforehand. Imitating the cortex's development from thin to thick, the algorithms adjust the network topology according to the current learning result, gradually adding connections and hidden nodes until a satisfactory sparse neural network is obtained. Because sparse networks reduce coupling among the inputs, fewer connections suffice to meet the fitting requirement, and the required accuracy is reached with a structurally simpler network. Simulation examples verify the effectiveness of the new algorithms.
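The growing scheme the abstract describes can be sketched in code. This is a minimal illustrative sketch, not the paper's algorithm: the class name `GrowingSparseNet`, the toy regression target, the growth schedule (one hidden node per stage plus randomly enabling some masked connections), and all learning rates and thresholds are assumptions made for the example.

```python
import math
import random

random.seed(0)

def make_data(n=64):
    """Toy 2-input regression set: y = sin(x1) + 0.5 * x2 (illustrative only)."""
    data = []
    for _ in range(n):
        x1, x2 = random.uniform(-2, 2), random.uniform(-2, 2)
        data.append(((x1, x2), math.sin(x1) + 0.5 * x2))
    return data

class GrowingSparseNet:
    """One hidden tanh layer whose input weights are masked to be sparse.
    grow() adds a hidden node and enables a few more connections."""

    def __init__(self, n_in, n_hidden=1, rate=0.5):
        self.n_in = n_in
        self.w = [[random.gauss(0, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
        self.mask = [[1 if random.random() < rate else 0 for _ in range(n_in)]
                     for _ in range(n_hidden)]
        self.b = [0.0] * n_hidden
        self.v = [random.gauss(0, 0.5) for _ in range(n_hidden)]
        self.c = 0.0

    def forward(self, x):
        h = [math.tanh(sum(w * m * xi for w, m, xi in zip(wr, mr, x)) + b)
             for wr, mr, b in zip(self.w, self.mask, self.b)]
        return sum(v * hj for v, hj in zip(self.v, h)) + self.c, h

    def train_epoch(self, data, lr=0.05):
        sse = 0.0
        for x, y in data:
            yhat, h = self.forward(x)
            e = yhat - y
            sse += e * e
            # SGD on squared error; masked-out weights never receive updates,
            # so the network stays sparse.
            for j, hj in enumerate(h):
                g = e * self.v[j] * (1.0 - hj * hj)   # tanh'(z) = 1 - tanh(z)^2
                self.v[j] -= lr * e * hj
                self.b[j] -= lr * g
                for i in range(self.n_in):
                    if self.mask[j][i]:
                        self.w[j][i] -= lr * g * x[i]
            self.c -= lr * e
        return sse / len(data)

    def grow(self):
        # New node starts with zero output weight, so growing never worsens the fit.
        self.w.append([random.gauss(0, 0.5) for _ in range(self.n_in)])
        self.mask.append([1 if random.random() < 0.5 else 0 for _ in range(self.n_in)])
        self.b.append(0.0)
        self.v.append(0.0)
        # Also switch on a few previously masked connections ("thin to thick").
        for row in self.mask:
            for i in range(self.n_in):
                if row[i] == 0 and random.random() < 0.2:
                    row[i] = 1

def fit(net, data, tol=0.05, max_grows=8, epochs_per_stage=200):
    """Train; while the fitting accuracy is still unsatisfactory, grow the topology."""
    mse = net.train_epoch(data)
    for _ in range(max_grows):
        for _ in range(epochs_per_stage):
            mse = net.train_epoch(data)
        if mse < tol:
            break
        net.grow()
    return mse
```

The key design point mirrored from the abstract is that structure changes are driven by the current learning result: the network only grows when training error remains above the tolerance, and each growth step both adds a hidden node and raises the connection rate slightly.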

Key words: sparse neural network, generalization, learning algorithm, connectivity