# Neural Networks: Chapter 7 - Loss & Activation functions

### Recap

In the previous chapter, we learnt about different Neural Network architectures and answered three simple questions: **Which, How & What.** In case you missed it, do read our previous article.

### Introduction

We understood how a **neural network works** and its intricacies; the question now is: how do we know if the model is learning what it needs to learn to make accurate predictions?

In this article, we will look at two important components which help make a Neural network so powerful -

- Loss function
- Activation function

### What is a Loss function?

A loss function is quite simple: it evaluates how well an algorithm models a dataset. If the predictions are off, the loss function outputs a higher number; if they are good, it outputs a lower number. As you tweak pieces of the algorithm to improve the model, the loss function helps you gauge whether the model is getting better.

Loss functions are also known as cost functions.
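The idea above can be sketched with mean squared error (MSE), one common loss function; the target and prediction values below are invented purely for illustration.

```python
def mse(y_true, y_pred):
    """Mean squared error: average of the squared differences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

targets = [1.0, 0.0, 1.0, 1.0]

good_preds = [0.9, 0.1, 0.8, 0.9]  # close to the targets -> small loss
bad_preds = [0.1, 0.9, 0.2, 0.3]   # far from the targets -> large loss

print(mse(targets, good_preds))  # prints a small number (0.0175)
print(mse(targets, bad_preds))   # prints a much larger number (0.6875)
```

As the predictions drift away from the targets, the loss grows, which is exactly the signal used to improve the model.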

### Types of Loss functions -
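As a hedged sketch, here are three commonly used loss functions; which one fits depends on the task (mean absolute error and mean squared error for regression, binary cross-entropy for binary classification). Any sample values in the comments are invented for illustration.

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error - regression; less sensitive to outliers."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean squared error - regression; penalises large errors heavily."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, y_pred):
    """Binary cross-entropy - binary classification; y_pred holds
    probabilities strictly between 0 and 1."""
    return -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for t, p in zip(y_true, y_pred)) / len(y_true)

# e.g. mae([1.0, 2.0], [1.0, 3.0]) and mse([1.0, 2.0], [1.0, 3.0])
# both give 0.5, but MSE grows much faster as errors get bigger.
```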

### What is an Activation function?

A neuron first computes the weighted sum of its inputs and a bias; the activation function is then applied to this sum to decide whether the neuron should be activated or not.

Activation functions are also known as transfer functions, as they help transform the output of one layer into the input of the next.

For each neuron, the sum of the products of the various weights and inputs, plus the bias, is computed first; the activation function then transforms this sum into the final output value for the current hidden layer, which becomes the input for the next layer.

In simpler terms, an activation function is like a gatekeeper between layers: it decides which signals (i.e. the learning of the model, shaped by the weights and biases) should propagate further to other layers. Significant values are allowed to pass, and the rest are dropped.
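The gatekeeper idea can be sketched with a single neuron, assuming ReLU as the gate; the inputs, weights, and bias below are invented for illustration.

```python
def relu(z):
    """Lets positive values through and drops negative ones to zero."""
    return max(0.0, z)

def neuron(inputs, weights, bias):
    # First: the weighted sum of inputs plus the bias.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Then: the activation acts as the gate for what propagates onward.
    return relu(z)

print(neuron([1.0, 2.0], [0.5, 0.25], 0.1))    # positive sum, passes through
print(neuron([1.0, 2.0], [-0.5, -0.25], 0.1))  # negative sum, dropped to 0.0
```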

### Types of activation functions -
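As a hedged sketch, here are three widely used activation functions; the choice between them depends on the layer and the task.

```python
import math

def sigmoid(z):
    """Squashes any input into (0, 1); common for binary outputs."""
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    """Squashes any input into (-1, 1); a zero-centred alternative."""
    return math.tanh(z)

def relu(z):
    """Rectified Linear Unit: zero for negatives, identity otherwise;
    the usual default for hidden layers."""
    return max(0.0, z)

print(sigmoid(0.0))  # 0.5
print(tanh(0.0))     # 0.0
print(relu(-2.0))    # 0.0
```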

### Conclusion -

We saw how loss and activation functions play an important role in making neural nets cool!

In the upcoming posts, we will deep dive into various kinds of neural network architectures and how to select appropriate loss and activation functions based on the problem you are trying to solve.

#### References -

- Practical AI on GCP by Micheal Lanham