Neural Networks - Cost Function

L = total number of layers in the network
K = number of output units (i.e., the number of classes)
\(s_l\) = number of units (not counting bias unit) in layer l
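With these definitions, the regularized cost function that the notes below refer to (from the linked supplement) is:

\[
J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \Big[ y_k^{(i)} \log\big((h_\Theta(x^{(i)}))_k\big) + \big(1 - y_k^{(i)}\big) \log\big(1 - (h_\Theta(x^{(i)}))_k\big) \Big] + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \big(\Theta_{j,i}^{(l)}\big)^2
\]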

Notes:
  • The double sum simply adds up the logistic regression cost computed for each unit in the output layer, over all m training examples
  • The triple sum simply adds up the squares of all the individual non-bias Θs in the entire network (the regularization term)
  • The i in the triple sum does not refer to training example i; it indexes the units in layer l
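The two sums described above can be sketched in NumPy. This is a minimal illustration, not course-provided code; the function name `nn_cost`, sigmoid activations, and one-hot labels `Y` are assumptions:

```python
import numpy as np

def nn_cost(thetas, X, Y, lam):
    """Regularized neural-network cost (hypothetical helper).

    thetas: list of weight matrices, Theta^(l) with shape (s_{l+1}, s_l + 1)
    X: (m, n) inputs; Y: (m, K) one-hot labels; lam: regularization strength.
    """
    m = X.shape[0]
    # Forward propagation with sigmoid activations (assumed architecture)
    a = X
    for Theta in thetas:
        a = np.hstack([np.ones((a.shape[0], 1)), a])   # prepend bias unit
        a = 1.0 / (1.0 + np.exp(-(a @ Theta.T)))       # sigmoid
    h = a  # (m, K) output-layer activations

    # Double sum: logistic cost over every output unit of every example
    cost = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m

    # Triple sum: squares of all non-bias Thetas (column 0 is the bias weight)
    reg = (lam / (2 * m)) * sum(np.sum(Theta[:, 1:] ** 2) for Theta in thetas)
    return cost + reg
```

Note that `Theta[:, 1:]` skips the bias column, matching the triple sum's inner index starting at 1.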

Resources:
https://www.coursera.org/learn/machine-learning/supplement/afqGa/cost-function
