Allen's Blog

Python Code of Mini-Batch Gradient Descent

December 2, 2018, 07:30

Overview

Mini-batch gradient descent is an optimization method for linear regression that sits between batch gradient descent (BGD) and stochastic gradient descent (SGD): it splits the data into several small batches of roughly equal size, then applies the absolute trick or the square trick to each batch in turn to update the regression coefficients.

It runs faster than BGD but slower than SGD, and its updates are less accurate than BGD's but more accurate than SGD's.
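Before the full code, here is a minimal sketch of the batching itself on toy data; the names X, y, and n_batches are illustrative and not from the original post:

import numpy as np

np.random.seed(42)
X = np.random.rand(100, 2)    # 100 toy points, 2 predictor features
y = np.random.rand(100)

n_batches = 4
order = np.random.permutation(len(X))        # shuffle the point indices
batches = np.array_split(order, n_batches)   # 4 index arrays of ~25 points each
for batch in batches:
    X_batch, y_batch = X[batch], y[batch]    # each batch feeds one update step

np.array_split is used because it tolerates a point count that does not divide evenly, which is why the batches have "roughly" the same size.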

Code

import numpy as np
import matplotlib.pyplot as plt

np.random.seed(42)

def MSEStep(X, y, W, b, learn_rate=0.001):
    """
    This function implements the gradient descent step for squared error as a
    performance metric.

    Parameters
    X : array of predictor features
    y : array of outcome values
    W : predictor feature coefficients
    b : regression function intercept
    learn_rate : learning rate

    Returns
    W_new : predictor feature coefficients following gradient descent step
    b_new : intercept following gradient descent step
    """
    # compute the predictions and errors for this batch
    # (the square trick: shift W and b in proportion to the error)
    y_pred = np.matmul(X, W) + b
    error = y - y_pred

    # take one gradient descent step on the squared-error metric
    W_new = W + learn_rate * np.matmul(error, X)
    b_new = b + learn_rate * error.sum()
    return W_new, b_new

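A driver that splits the data into batches and calls MSEStep on each could look like the sketch below; it builds on the imports above, and the name miniBatchGD and its default parameters are assumptions, not taken from the post:

def miniBatchGD(X, y, batch_size=20, learn_rate=0.005, num_iter=25):
    # hypothetical driver: repeatedly shuffle the data, walk through it
    # one mini-batch at a time, and let MSEStep update W and b
    n_points = X.shape[0]
    W = np.zeros(X.shape[1])   # start with zero coefficients
    b = 0.0                    # and a zero intercept
    for _ in range(num_iter):
        order = np.random.permutation(n_points)
        for start in range(0, n_points, batch_size):
            batch = order[start:start + batch_size]
            W, b = MSEStep(X[batch], y[batch], W, b, learn_rate)
    return W, b

With the toy X and y from the earlier sketch, W, b = miniBatchGD(X, y) would return the fitted coefficients and intercept.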