2.2 Understanding Mini-batch

Batch gradient descent: with batch gradient descent, every iteration processes the entire training set, so you expect the cost to decrease on every single iteration. If you plot the cost J against the iteration number, it should fall monotonically; if it ever goes up, something is wrong (for example, the learning rate is too large).
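A minimal sketch of the contrast, assuming a least-squares linear-regression cost; all names (`batch_gd`, `minibatch_gd`, the data shapes, learning rate, and batch size) are illustrative, not from the lecture. Batch gradient descent averages the gradient over the full set, so its cost curve decreases every step; mini-batch gradient descent updates on small shuffled chunks, so its cost curve is noisy but trends downward.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))          # toy training set
true_w = rng.normal(size=5)
y = X @ true_w + 0.01 * rng.normal(size=1000)

def cost(w):
    r = X @ w - y
    return float(r @ r) / (2 * len(y))

def batch_gd(epochs=50, lr=0.1):
    """One gradient step per pass over the ENTIRE training set."""
    w = np.zeros(5)
    history = []
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient over all examples
        w -= lr * grad
        history.append(cost(w))             # decreases every iteration
    return w, history

def minibatch_gd(epochs=50, lr=0.1, batch_size=64):
    """Many gradient steps per epoch, one per shuffled mini-batch."""
    w = np.zeros(5)
    history = []
    for _ in range(epochs):
        idx = rng.permutation(len(y))       # reshuffle each epoch
        for start in range(0, len(y), batch_size):
            b = idx[start:start + batch_size]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
            history.append(cost(w))         # noisy, but trends downward
    return w, history

w_b, hist_b = batch_gd()
w_m, hist_m = minibatch_gd()
```

Plotting `hist_b` gives a smooth, strictly decreasing curve, while `hist_m` oscillates around a downward trend, which is exactly the behavior the lecture describes.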