What is the difference between negative log likelihood and cross entropy (in neural networks)?
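
For concreteness, here is a minimal sketch of what I mean, assuming PyTorch (the framework is my assumption; the question is about neural networks in general). It computes both quantities on the same hypothetical batch: cross-entropy taken directly on the raw logits, and negative log likelihood taken on log-probabilities.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical batch: 4 examples, 3 classes (sizes are illustrative only).
logits = torch.randn(4, 3)            # raw network outputs
targets = torch.tensor([0, 2, 1, 2])  # ground-truth class indices

# Cross-entropy applied to the logits (log-softmax is applied internally).
ce = F.cross_entropy(logits, targets)

# Negative log likelihood applied to explicit log-probabilities.
log_probs = F.log_softmax(logits, dim=1)
nll = F.nll_loss(log_probs, targets)

print(ce.item(), nll.item())
```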
