A comparison of different neural network training functions (trainFcn)


5. traincgb: the Powell-Beale algorithm: it checks the orthogonality between successive gradients to decide whether the weight/bias update direction should be reset to the negative gradient direction.
6. trainscg: the scaled conjugate gradient algorithm: it combines the model-trust-region approach with the conjugate gradient method, reducing the time spent on the line search used to set the update direction.
Generally speaking, traingd and traingdm are ordinary training functions, while traingda, traingdx, trainrp, traincgf, traincgb, trainscg, trainbfg and so on are fast training functions. The overall impression is that the training times differ considerably, and there are accuracy differences as well (a rough timing sketch is given below).
(The information above comes from the web; I have forgotten the original source.)
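To make the training-time/accuracy comparison concrete, here is a minimal MATLAB sketch of my own (not from either of the sources quoted here): it trains the same small feedforward network with an ordinary function (traingd) and a fast one (trainlm) on the toolbox's simplefit_dataset and prints the elapsed time and mean squared error of each.

    % Minimal comparison sketch (my own example; assumes the Neural Network Toolbox is installed).
    [x, t] = simplefit_dataset;               % small regression dataset shipped with the toolbox

    net1 = feedforwardnet(10, 'traingd');     % ordinary gradient descent
    net2 = feedforwardnet(10, 'trainlm');     % Levenberg-Marquardt: fast, but needs more memory

    tic; net1 = train(net1, x, t); time_gd = toc;
    tic; net2 = train(net2, x, t); time_lm = toc;

    fprintf('traingd: %6.2f s   mse = %g\n', time_gd, perform(net1, t, net1(x)));
    fprintf('trainlm: %6.2f s   mse = %g\n', time_lm, perform(net2, t, net2(x)));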
(The information below comes from the MATLAB help documentation; my translation is attached afterwards.)
nntrain
Neural Network Toolbox Training Functions.
To change a neural network's training algorithm set the net.trainFcn
property to the name of the corresponding function.  For example, to use
the scaled conjugate gradient backprop training algorithm:
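(The example snippet itself was lost when the help text was pasted; as far as I recall, the standard assignment looks like this, with net, x and t as placeholder variable names.)

    net.trainFcn = 'trainscg';        % pick the training function by name
    [net, tr] = train(net, x, t);     % then train as usual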
Backpropagation training functions that use Jacobian derivatives
These algorithms can be faster but require more memory than gradient
backpropagation.  They are also not supported on GPU hardware.
trainlm  - Levenberg-Marquardt backpropagation.
trainbr  - Bayesian Regulation backpropagation.
Backpropagation training functions that use gradient derivatives
These algorithms may not be as fast as Jacobian backpropagation.
They are supported on GPU hardware with the Parallel Computing Toolbox.
trainbfg - BFGS quasi-Newton backpropagation.
traincgb - Conjugate gradient backpropagation with Powell-Beale restarts.
traincgf - Conjugate gradient backpropagation with Fletcher-Reeves updates.
traincgp - Conjugate gradient backpropagation with Polak-Ribiere updates.
traingd  - Gradient descent backpropagation.
traingda - Gradient descent with adaptive lr backpropagation.
traingdm - Gradient descent with momentum.
traingdx - Gradient descent w/momentum & adaptive lr backpropagation.
trainoss - One step secant backpropagation.
trainrp  - RPROP backpropagation.
trainscg - Scaled conjugate gradient backpropagation.
Supervised weight/bias training functions
trainb  - Batch training with weight & bias learning rules.
trainc  - Cyclical order weight/bias training.
trainr  - Random order weight/bias training.
trains  - Sequential order weight/bias training.
Unsupervised weight/bias training functions
trainbu  - Unsupervised batch training with weight & bias learning rules.
trainru  - Unsupervised random order weight/bias training.
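The help text above says the gradient-derivative functions run on GPU hardware through the Parallel Computing Toolbox. As far as I know this is enabled with the 'useGPU' option of train; a minimal sketch, assuming a supported GPU and reusing x and t from the earlier example:

    % GPU training sketch (requires the Parallel Computing Toolbox and a supported GPU).
    net = feedforwardnet(10, 'trainscg');           % a gradient-based, GPU-capable training function
    [net, tr] = train(net, x, t, 'useGPU', 'yes');  % 'useGPU','yes' moves the calculations to the GPU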
Translation:
Neural Network Toolbox training functions.
Setting the trainFcn property to the name of the desired function changes the network's training algorithm. For example, the snippet shown above sets the training function to scaled conjugate gradient backpropagation.
Backpropagation training functions that use Jacobian derivatives:
These algorithms can be faster, but they need more memory than gradient backpropagation. They are also not supported on GPU hardware.
trainlm  - Levenberg-Marquardt backpropagation.
trainbr  - Bayesian Regulation backpropagation.
Backpropagation training functions that use gradient derivatives:
These algorithms may not be as fast as the Jacobian-based ones. They are supported on GPU hardware, with the help of the Parallel Computing Toolbox.
trainbfg - BFGS quasi-Newton backpropagation.
traincgb - Conjugate gradient backpropagation with Powell-Beale restarts.
traincgf - Conjugate gradient backpropagation with Fletcher-Reeves updates.
traincgp - Conjugate gradient backpropagation with Polak-Ribiere updates.
traingd  - Gradient descent backpropagation.
traingda - Gradient descent with adaptive lr backpropagation.
traingdm - Gradient descent with momentum.
traingdx - Gradient descent w/momentum & adaptive lr backpropagation.
trainoss - One step secant backpropagation.
trainrp  - RPROP backpropagation.
trainscg - Scaled conjugate gradient backpropagation.
Supervised weight/bias training functions
trainb  - Batch training with weight & bias learning rules.
trainc  - Cyclical order weight/bias training.
trainr  - Random order weight/bias training.
trains  - Sequential order weight/bias training.
Unsupervised weight/bias training functions
trainbu  - Unsupervised batch training with weight & bias learning rules.
trainru  - Unsupervised random order weight/bias training.
