Cross Entropy – Is the Cost Function for a Neural Network Convex?

convex, neural-networks

My teacher proved that the second derivative of cross-entropy is always positive, and concluded that the cost function of a neural network that uses cross-entropy is convex. Is this true? I'm quite confused, because I've always learned that the cost function of an ANN is non-convex. Can anyone confirm this? Thanks a lot!
http://z0rch.com/2014/06/05/cross-entropy-cost-function

Best Answer

The cross-entropy of an exponential family is always convex in the family's natural parameters. So, for a multilayer neural network with inputs $x$, weights $w$, output $y$, and loss function $L$, the loss is convex as a function of the output: its Hessian

$$\nabla^2_y L$$

is positive semidefinite.
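As a concrete binary instance of the calculation your teacher likely showed, take a single output $y \in (0, 1)$ with target $t \in \{0, 1\}$ and

$$L(y) = -t \log y - (1-t) \log(1-y),$$

so that

$$\frac{\partial^2 L}{\partial y^2} = \frac{t}{y^2} + \frac{1-t}{(1-y)^2} \ge 0,$$

which is exactly convexity of $L$ in the output $y$.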

However, the Hessian with respect to the weights,

$$\nabla^2_w L$$

is not going to be positive semidefinite for the parameters of the hidden layers, for the reasons described by iamonaboat. Intuitively, the convex loss is composed with the network's non-linear layers, and that composition need not be convex in $w$: for example, permuting the units of a hidden layer produces a different $w$ with exactly the same loss, so the loss surface has multiple separated global minima and cannot be convex.
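To see both claims numerically, here is a minimal sketch, assuming JAX is available; the tiny two-layer network, the input, and the one-hot target below are made up for illustration and are not part of the original answer.

```python
import jax
import jax.numpy as jnp

def cross_entropy(logits, target):
    # Softmax cross-entropy for a single example with a one-hot target.
    return -jnp.sum(target * jax.nn.log_softmax(logits))

x = jnp.array([1.0, -2.0, 0.5])      # a made-up input
target = jnp.array([0.0, 1.0])       # a made-up one-hot label

# Convex in the output: the Hessian of L w.r.t. the logits is
# diag(p) - p p^T, which is positive semidefinite.
logits = jnp.array([0.3, -0.7])
H_y = jax.hessian(cross_entropy)(logits, target)
print(jnp.linalg.eigvalsh(H_y))      # eigenvalues >= 0 (up to numerical noise)

# Not convex in the weights: flatten a two-layer net's parameters into one
# vector and look at the Hessian of the loss with respect to it.
def loss_wrt_weights(w):
    W1 = w[:6].reshape(2, 3)         # hidden layer with 2 tanh units
    W2 = w[6:].reshape(2, 2)         # output layer producing 2 logits
    h = jnp.tanh(W1 @ x)
    return cross_entropy(W2 @ h, target)

w0 = jax.random.normal(jax.random.PRNGKey(0), (10,))
H_w = jax.hessian(loss_wrt_weights)(w0)
print(jnp.linalg.eigvalsh(H_w))      # typically contains negative eigenvalues
```

If the second print shows any negative eigenvalue, the loss is not convex in $w$ at that point, which is what happens at typical random initializations.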