Acta Scientiarum Naturalium Universitatis Sunyatseni ›› 2016, Vol. 55 ›› Issue (4): 1-10.


WASD neural network activated by bipolar sigmoid functions together with subsequent iterations

ZHANG Yunong1,2,3, XIAO Zhengli1,2,3, DING Sitong1,2,3, MAO Mingzhi1, LIU Jinrong1

    1. School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, China;
    2. Key Laboratory of Autonomous Systems and Networked Control, Ministry of Education, South China University of Technology, Guangzhou 510640, China;
    3. SYSU-CMU Shunde International Joint Research Institute, Foshan 528300, China
  • Received: 2015-07-07   Online: 2016-07-25   Published: 2016-07-25
  • Corresponding author: ZHANG Yunong (born 1973), male; research interests: artificial neural networks, redundant robots; Email: zhynong@mail.sysu.edu.cn

Abstract:

Combining the Levenberg-Marquardt algorithm with the weights-direct-determination method for neural-network training, a weights-and-structure-determination (WASD) algorithm with subsequent iterations is proposed for neural networks activated by bipolar sigmoid functions. Implemented with the Neural Network Toolbox of MATLAB, the proposed algorithm remedies weaknesses common to traditional neural networks, such as long training time, difficulty in determining the network structure, and limited learning and generalization performance, while offering good feasibility and operability. Taking the data fitting of nonlinear functions as an example, numerical experiments and comparison results confirm that the WASD algorithm determines the optimal number of hidden neurons and the optimal weights effectively, and that the resultant WASD neural network achieves better learning and generalization performance.
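The abstract describes the algorithm only at a high level. As a rough illustration of how its two named ingredients interact, the following minimal Python sketch grows the hidden layer one neuron at a time, redetermines the output weights directly (weights-direct-determination via the pseudoinverse), and keeps the structure with the lowest validation error. Everything here is an illustrative assumption: the fixed per-neuron input shifts, the function names, and the validation split are not from the paper, and the paper's subsequent Levenberg-Marquardt iterations (e.g., MATLAB's trainlm), which refine the weights afterwards, are omitted.

    import numpy as np

    def bipolar_sigmoid(x):
        # Bipolar sigmoid activation: maps R onto (-1, 1); equals tanh(x/2).
        return 2.0 / (1.0 + np.exp(-x)) - 1.0

    def hidden_outputs(x, shifts):
        # Hidden-layer output matrix for 1-D inputs x (shape (m,)) with one
        # fixed input shift per hidden neuron (a simplifying assumption).
        return bipolar_sigmoid(x[:, None] - shifts[None, :])

    def wdd_weights(H, y):
        # Weights-direct-determination: least-squares-optimal output weights
        # computed in one step via the Moore-Penrose pseudoinverse.
        return np.linalg.pinv(H) @ y

    def wasd(x_tr, y_tr, x_va, y_va, max_hidden=60):
        # Structure determination: grow the hidden layer one neuron at a
        # time, redetermine the output weights directly, and keep the
        # structure with the smallest validation error.
        shifts = np.linspace(x_tr.min(), x_tr.max(), max_hidden)
        best_err, best_n, best_w = np.inf, 0, None
        for n in range(1, max_hidden + 1):
            w = wdd_weights(hidden_outputs(x_tr, shifts[:n]), y_tr)
            err = np.mean((hidden_outputs(x_va, shifts[:n]) @ w - y_va) ** 2)
            if err < best_err:
                best_err, best_n, best_w = err, n, w
        return best_n, shifts[:best_n], best_w, best_err

    # Example: fit a nonlinear target function (hypothetical choice).
    x = np.linspace(-4.0, 4.0, 400)
    y = np.exp(np.sin(x))
    n, shifts, w, err = wasd(x[::2], y[::2], x[1::2], y[1::2])
    print(f"optimal hidden neurons: {n}, validation MSE: {err:.3e}")

Starting from the directly determined weights, a few Levenberg-Marquardt (or other gradient-based) refinement steps would then play the role of the paper's subsequent iterations.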

Key words: neural networks, weights-and-structure-determination (WASD) algorithm, subsequent iterations, bipolar sigmoid activation functions, numerical experiments
