Lecture 29 (Last lecture)

Muhammad Umer Sani: MSDS19063

Muhammad Talha Hashmi: MSDS19088

Mukarram Ahmad: MSDS19090

Agenda: Backpropagation.

Let us take an example:

  • w is the weight vector.
  • x_n are the inputs.

Backpropagation is built out of dot products.

A perceptron is the unit you get by taking a dot product and passing it through an activation:

σ(x) = 1 / (1 + e^(−x))   ⇒   output = σ(x^T w)
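As a minimal sketch in Python (the weight and input values below are hypothetical, chosen only to make the snippet runnable):

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def perceptron(x, w):
    # Perceptron output: sigmoid of the dot product x^T w
    return sigmoid(np.dot(x, w))

# Hypothetical example values
x = np.array([1.0, 0.5, -0.3])   # inputs x_n
w = np.array([0.2, -0.4, 0.1])   # weight vector w
print(perceptron(x, w))          # a value in (0, 1)
```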

We can do this for any computable function.

Question: What does "computable" mean?

Answer: A function is computable if its output can be produced by some procedure.

Question: What is an example of a computable function?

Answer: The perceptron above is an example.

If the domain and range are finite, then the function is computable (for instance, by a lookup table); otherwise it may be incomputable.

Even a lengthy program can be represented as a neural network. That is the tricky part.

All neural networks are heuristic-based. In the example above the weights are unknown, so we have to learn them.

We will approach the learning problem through optimization, i.e., minimization and maximization.

We now turn the neural network formulation above into a learning objective, the squared error:

(f' − f)^2

where f' is the network's prediction and f is the target.

Question: How can we add more dimensions?

Answer: We will discuss that later.

Question: Are a neuron and a perceptron the same thing?

Answer: No, they are different things.

As an example:

Here, each node is a perceptron.

The unit that produces the output in the example above is a neuron.

Question: Is there a mathematical rule for how many neurons to add before the network overfits?

Answer: No, there is no such mathematical form.

So the loss function is

L = (f' − f)^2, which we want to minimize.

We look for a point where the gradient is zero by repeatedly stepping against the gradient; this procedure is called gradient descent.
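A hedged sketch of gradient descent on this loss for a single sigmoid neuron; the data, initialization, and learning rate are illustrative assumptions, not values from the lecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical training pair and random initial parameters
x = np.array([1.0, 0.5])   # input
y = 0.8                    # target output f
w = np.random.randn(2)     # random initial weights
b = 0.0                    # bias
lr = 0.5                   # learning rate (assumed)

for step in range(1000):
    z = np.dot(w, x) + b          # pre-activation
    a = sigmoid(z)                # prediction f'
    L = (a - y) ** 2              # squared-error loss
    # Chain rule: dL/dz = 2(a - y) * sigma'(z), with sigma'(z) = a(1 - a)
    dL_dz = 2 * (a - y) * a * (1 - a)
    w -= lr * dL_dz * x           # dL/dw = dL/dz * x
    b -= lr * dL_dz               # dL/db = dL/dz

print(f"final loss: {L:.6f}")
```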

So we take the dot product of the weights and the inputs.

Since we have the inputs and the target outputs, the weights are initialized randomly and then learned.

L(w, b) = (a_1^(3) − y_1)^2

This is the loss function for a fixed input, where a_1^(3) is the network's output activation (neuron 1 of layer 3) and y_1 is the target.

Let us find the gradients of this function:

∂L/∂w

∂L/∂b

We define δ_j^(l) = ∂L/∂z_j^(l) = (∂L/∂a_j^(l)) · (∂a_j^(l)/∂z_j^(l)).

We can compute these δ terms recursively, layer by layer, as spelled out below.
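Since the notes only gesture at the recursion, here is its standard textbook form for reference (notation matching the definition above; this is the usual statement, not something copied from the board):

```latex
% Output layer L: the recursion starts from the loss itself
\delta_j^{(L)} = \frac{\partial L}{\partial a_j^{(L)}}\,\sigma'\!\left(z_j^{(L)}\right)

% Hidden layers: each delta is a weighted sum of the deltas one layer ahead
\delta_j^{(l)} = \Big(\sum_k w_{kj}^{(l+1)}\,\delta_k^{(l+1)}\Big)\,\sigma'\!\left(z_j^{(l)}\right)

% The gradients we want follow directly from the deltas
\frac{\partial L}{\partial w_{jk}^{(l)}} = a_k^{(l-1)}\,\delta_j^{(l)},
\qquad
\frac{\partial L}{\partial b_j^{(l)}} = \delta_j^{(l)}
```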

For the sigmoid activation, σ′(x) = σ(x)(1 − σ(x)).
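A quick sketch verifying this identity numerically (the test point and step size are arbitrary choices):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = 0.7          # arbitrary test point
h = 1e-6         # finite-difference step

analytic = sigmoid(x) * (1 - sigmoid(x))          # sigma'(x) from the identity
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)  # central difference
print(analytic, numeric)   # the two values closely agree
```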

This expression applies to one fixed layer at a time, not to all layers at once; δ is the error term.

Question: We have the predicted value; do we also have the actual value?

Answer: No, we do not (targets are only available at the output, not for the intermediate layers).

Question: Why are we calling δ the error?

Answer: By convention; δ measures how sensitive the loss is to each neuron's pre-activation, so we refer to it as that neuron's error.

Looking at the same example again:

Taking the derivative with respect to a_k^(l) (which does not affect b), we can propagate the error backwards and obtain ∂L/∂w and ∂L/∂b.

This was our goal.
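Putting the whole derivation together, here is a minimal end-to-end sketch of backpropagation for a tiny two-layer sigmoid network; the layer sizes, training pair, and learning rate are all illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Tiny network: 2 inputs -> 3 hidden -> 1 output (sizes assumed)
W1, b1 = rng.standard_normal((3, 2)), np.zeros(3)
W2, b2 = rng.standard_normal((1, 3)), np.zeros(1)

x = np.array([1.0, 0.5])   # input (hypothetical)
y = np.array([0.8])        # target (hypothetical)
lr = 0.5                   # learning rate (assumed)

for step in range(2000):
    # Forward pass: z^(l) = W^(l) a^(l-1) + b^(l), a^(l) = sigma(z^(l))
    z1 = W1 @ x + b1;  a1 = sigmoid(z1)
    z2 = W2 @ a1 + b2; a2 = sigmoid(z2)

    # Output-layer error: delta^(2) = dL/da^(2) * sigma'(z^(2))
    delta2 = 2 * (a2 - y) * a2 * (1 - a2)
    # Hidden-layer error, computed recursively: delta^(1) = (W2^T delta^(2)) * sigma'(z^(1))
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)

    # Gradients: dL/dW^(l) = delta^(l) a^(l-1)^T, dL/db^(l) = delta^(l)
    W2 -= lr * np.outer(delta2, a1); b2 -= lr * delta2
    W1 -= lr * np.outer(delta1, x);  b1 -= lr * delta1

print("loss:", ((a2 - y) ** 2).item())
```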

At the end, two quizzes were conducted and the final syllabus was discussed: all of the semester's content will be included in the exam. The instructor will provide a formula sheet, but everyone should know the major formulas as well.

Best of luck for your exam!
