Derivation of Back Propagation with Cross Entropy

To understand the back propagation algorithm, we first need to understand some basic concepts: partial derivatives, the chain rule, the cross-entropy loss, the sigmoid function, and the softmax function.
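For reference, these building blocks can be stated compactly as follows. The notation is assumed for this sketch (z for a pre-activation, t for a one-hot target vector), since the original figures did not carry it over:

```latex
% Sigmoid activation and its derivative
\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad \sigma'(z) = \sigma(z)\big(1 - \sigma(z)\big)

% Softmax over output pre-activations z_1, \dots, z_n
y_i = \frac{e^{z_i}}{\sum_{j} e^{z_j}}

% Cross-entropy loss against a one-hot target t
E = -\sum_{i} t_i \ln y_i
```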

[Figure: a two-layered neural network with sigmoid activations]

Assuming we have already forward-passed the inputs to obtain the outputs Y at the last layer, we need to calculate the loss E and propagate it back to all the preceding layers by adjusting the weights associated with each layer.
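Concretely, for the two-layer network in the figure, one consistent way to write the forward pass is the sketch below; the symbols x, h, z, W^{(1)}, and W^{(2)} are notation assumed here rather than taken from the original figure:

```latex
h_k = \sigma\Big(\sum_{l} w^{(1)}_{lk}\, x_l\Big)    % hidden layer (sigmoid)
z_i = \sum_{k} w^{(2)}_{ki}\, h_k                    % output pre-activations
y_i = \frac{e^{z_i}}{\sum_{j} e^{z_j}}, \qquad E = -\sum_{i} t_i \ln y_i
```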

Knowing the cross-entropy loss E and the softmax activations y_i, we can calculate the change in loss with respect to any weight feeding the output layer using the chain rule of partial derivatives. Better still, we can find the weight gradients for the whole layer at once using the matrix notation shown below.
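The step that makes this tractable is that softmax combined with cross-entropy yields a very simple gradient at the output pre-activations. A sketch of that chain-rule calculation, under the notation assumed above:

```latex
\frac{\partial E}{\partial z_i}
  = \sum_{j} \frac{\partial E}{\partial y_j}\,\frac{\partial y_j}{\partial z_i}
  = -\sum_{j} \frac{t_j}{y_j}\, y_j\big(\delta_{ji} - y_i\big)
  = y_i \sum_{j} t_j - t_i
  = y_i - t_i

% Gradient of a weight w^{(2)}_{ki} connecting hidden unit k to output unit i
\frac{\partial E}{\partial w^{(2)}_{ki}}
  = \frac{\partial E}{\partial z_i}\,\frac{\partial z_i}{\partial w^{(2)}_{ki}}
  = (y_i - t_i)\, h_k
```

Since t is one-hot, its entries sum to 1, which is what collapses the expression in the third step.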

[Equation: weight gradient of a weight connecting unit l in the input layer to unit k in the first hidden layer, using the sigmoid activation]
[Equation: matrix notation of the weight gradients connecting the input layer and the first hidden layer]
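The equation images behind these captions did not survive extraction, so what follows is a plausible reconstruction under the same assumed notation, not the author's original rendering. For a weight w^{(1)}_{lk} connecting input unit l to hidden unit k, and for whole layers in matrix form:

```latex
% Per-weight form: backpropagate (y - t) through the output weights,
% then through the sigmoid derivative h_k (1 - h_k)
\frac{\partial E}{\partial w^{(1)}_{lk}}
  = \Big(\sum_{i} (y_i - t_i)\, w^{(2)}_{ki}\Big)\, h_k (1 - h_k)\, x_l

% Matrix form for both layers (\odot is element-wise multiplication)
\nabla_{W^{(2)}} E = h\,(y - t)^{\top}, \qquad
\nabla_{W^{(1)}} E = x\,\Big(\big(W^{(2)}(y - t)\big) \odot h \odot (1 - h)\Big)^{\top}
```

As a sanity check, here is a minimal NumPy sketch of the same gradients; all names are hypothetical, chosen to mirror the notation above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Forward pass: x (L,), W1 (L, K), W2 (K, I) -> hidden h, softmax output y."""
    h = sigmoid(W1.T @ x)        # hidden activations, shape (K,)
    z = W2.T @ h                 # output pre-activations, shape (I,)
    e = np.exp(z - z.max())      # numerically stabilised softmax
    return h, e / e.sum()

def gradients(x, t, W1, W2):
    """Cross-entropy gradients w.r.t. W1 and W2, matching the matrix form above."""
    h, y = forward(x, W1, W2)
    delta_out = y - t                            # dE/dz from the derivation
    dW2 = np.outer(h, delta_out)                 # h (y - t)^T
    delta_hid = (W2 @ delta_out) * h * (1 - h)   # backprop through the sigmoid
    dW1 = np.outer(x, delta_hid)                 # x delta_hid^T
    return dW1, dW2
```

Comparing these against finite-difference gradients of E = -sum(t * log(y)) is a quick way to confirm the algebra.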
