The Data Science Lab

Dr. James McCaffrey of Microsoft Research explains neural network training using a bio-inspired optimization technique called differential evolution optimization (DEO), an alternative to standard stochastic gradient descent (SGD). Training a neural network is the process of finding good values for the network's weights and biases.
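To make the DEO idea concrete, here is a minimal Python sketch of differential evolution searching for the weights and biases of a tiny neural network. Everything in it is an illustrative assumption rather than the article's demo: the 2-4-1 network shape, the toy regression data, and the hyperparameters (pop_size, F, CR, max_gen) are hypothetical choices, and the article's actual implementation may differ.

```python
# Hypothetical sketch: differential evolution optimization (DEO) used to
# train a tiny 2-4-1 neural network. Network shape, data, and hyperparameters
# are assumptions for illustration, not taken from the article.
import numpy as np

rng = np.random.default_rng(0)

def forward(wts, X):
    # Unpack a flat 17-element vector into 2-4-1 weights and biases.
    w1 = wts[0:8].reshape(2, 4)    # input-to-hidden weights
    b1 = wts[8:12]                 # hidden biases
    w2 = wts[12:16].reshape(4, 1)  # hidden-to-output weights
    b2 = wts[16:17]                # output bias
    h = np.tanh(X @ w1 + b1)
    return h @ w2 + b2

def loss(wts, X, y):
    # Mean squared error of the network's predictions.
    return np.mean((forward(wts, X).ravel() - y) ** 2)

# Toy regression data: y = x0 - x1 plus a little noise.
X = rng.normal(size=(64, 2))
y = X[:, 0] - X[:, 1] + 0.05 * rng.normal(size=64)

dim, pop_size, F, CR, max_gen = 17, 20, 0.5, 0.9, 200
pop = rng.normal(scale=0.5, size=(pop_size, dim))  # candidate weight vectors
errs = np.array([loss(p, X, y) for p in pop])

for gen in range(max_gen):
    for i in range(pop_size):
        # Pick three distinct candidates, none equal to i.
        a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                             size=3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])  # differential mutation
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True          # keep at least one mutant gene
        trial = np.where(cross, mutant, pop[i])  # binomial crossover
        e = loss(trial, X, y)
        if e < errs[i]:                          # greedy selection
            pop[i], errs[i] = trial, e

best = pop[np.argmin(errs)]
print(f"best MSE after {max_gen} generations: {errs.min():.4f}")
```

Note the key difference from SGD: no gradients are computed anywhere. Each generation mutates, crosses over, and greedily selects whole candidate weight vectors, which is why DEO works even when the loss function is not differentiable.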