Differential Evolution Optimization

added by DotNetKicks
9/7/2021 5:39:28 PM


In this Data Science Lab article, Dr. James McCaffrey of Microsoft Research explains neural network training using a bio-inspired technique called differential evolution optimization (DEO), presented as an alternative to standard stochastic gradient descent (SGD). Training a neural network is the process of finding good values for the network's weights and biases.
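The classic DE scheme the article's topic is based on (often written DE/rand/1/bin) maintains a population of candidate weight vectors and improves them by mutation, crossover, and greedy selection. The sketch below is a minimal illustration of that general algorithm, not the article's actual implementation; the function names, the sphere-function stand-in for a network loss, and the parameter defaults (population size, mutation factor `F`, crossover rate `CR`) are all assumptions chosen for readability.

```python
import random

def differential_evolution(loss, dim, bounds=(-5.0, 5.0), pop_size=20,
                           F=0.5, CR=0.9, generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch: evolve a population of candidate
    vectors toward lower loss. Parameter defaults are illustrative."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    costs = [loss(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct population members, none equal to i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # mutation: donor vector = a + F * (b - c)
            donor = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     for k in range(dim)]
            # binomial crossover: mix donor and current candidate,
            # forcing at least one donor component (index jrand)
            jrand = rng.randrange(dim)
            trial = [donor[k] if (rng.random() < CR or k == jrand)
                     else pop[i][k] for k in range(dim)]
            # greedy selection: keep whichever candidate scores better
            t_cost = loss(trial)
            if t_cost <= costs[i]:
                pop[i], costs[i] = trial, t_cost
    best = min(range(pop_size), key=lambda j: costs[j])
    return pop[best], costs[best]

# demo: minimize a sphere function standing in for a network loss
sphere = lambda w: sum(x * x for x in w)
best_w, best_cost = differential_evolution(sphere, dim=4)
```

For neural network training, `loss` would evaluate the network's error with the candidate vector unpacked into weights and biases; no gradients are required, which is the main appeal of DEO over SGD.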

