Constructing a Sigmoid Perceptron in Python

In this article, our objective is to build a sigmoid neuron in Python and visualize its training with the help of a sample dataset.

Intuition of Model

Activation Function

Let’s first understand the basics of the sigmoid model before we construct it. As the name suggests, the model revolves around the sigmoid formula, which can be represented as:

S(X) = 1 / (1 + e^-(W·X + B))

Our sigmoid formula consists of the following parameters:

• X: the features of the dataset
• W: the weight vector corresponding to X
• B: the bias

The sigmoid curve’s output always lies between 0 and 1, which makes it well suited for basic regression/classification problems.
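To make that property concrete, here is a minimal sketch of the sigmoid function (the function name and sample inputs are illustrative, not from the article's code):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real-valued input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(-10.0), sigmoid(0.0), sigmoid(10.0))  # near 0, exactly 0.5, near 1
```

Large negative inputs map close to 0, large positive inputs close to 1, and 0 maps to exactly 0.5.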

Loss Function

Since we will be dealing with real-valued targets in this visualization, we will use Mean Squared Error (MSE) as our loss function.
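As a quick sketch, MSE is just the mean of the squared differences between targets and predictions (the helper name and sample values below are made up for illustration):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of the squared differences between targets and predictions
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2)

print(mse([0.0, 1.0, 1.0], [0.1, 0.8, 0.9]))  # approximately 0.02
```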

Code

Let us start by importing the libraries we need. Here is some in-depth documentation for animation: Animation, Simple animation tutorial.

Now let’s list the components that our SigmoidNeuron class will comprise:

• function to calculate w·x + b
• function to apply sigmoid function
• function to predict output for a provided X dataframe
• function to return gradient values for “w” and “b”
• function to fit for the provided dataset

Some important points regarding the class:

• Our objective is to visualize the training, so the model’s state must be recorded after every input. Hence the “fit” function does not itself iterate through the dataset.
• We can initialize “w” and “b” to any random values. Here we initialize them to a specific float because it gives us the most unfit starting scenario for the model.
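Under those constraints, a minimal sketch of such a class could look like the following. The method names, initial values, and learning rate are assumptions for illustration, not the article's exact code:

```python
import numpy as np

class SigmoidNeuron:
    """Sketch of the class described above; names and values are illustrative."""

    def __init__(self, w_init=(-2.0, -2.0), b_init=-2.0):
        # Fixed floats chosen so the model starts from a deliberately poor fit
        # (the exact values here are assumed, not taken from the article)
        self.w = np.array(w_init, dtype=float)
        self.b = float(b_init)

    def perceptron(self, x):
        # w·x + b
        return np.dot(self.w, x) + self.b

    def sigmoid(self, z):
        return 1.0 / (1.0 + np.exp(-z))

    def predict(self, X):
        # One prediction per row of the provided dataset
        return np.array([self.sigmoid(self.perceptron(x)) for x in X])

    def grad_w(self, x, y):
        # dL/dw for squared-error loss through the sigmoid (chain rule)
        y_pred = self.sigmoid(self.perceptron(x))
        return (y_pred - y) * y_pred * (1 - y_pred) * x

    def grad_b(self, x, y):
        # dL/db: same chain rule without the trailing x factor
        y_pred = self.sigmoid(self.perceptron(x))
        return (y_pred - y) * y_pred * (1 - y_pred)

    def fit(self, x, y, learning_rate=1.0):
        # One gradient step for ONE example: the caller loops over the
        # dataset so every intermediate state can be recorded and plotted
        x = np.asarray(x, dtype=float)
        self.w -= learning_rate * self.grad_w(x, y)
        self.b -= learning_rate * self.grad_b(x, y)
```

Because `fit` performs a single update, the plotting loop outside the class controls exactly when each snapshot of the model is taken.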

Our next steps are to create a sample dataset, write a function that produces contour plots, and loop over the dataset to feed the model one example at a time.

Some resources: Mycmap, meshgrid, subplots, contourf.
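Putting those steps together, here is a hedged sketch of the dataset-plus-plotting loop. The dataset values, grid bounds, colormap, and learning rate below are invented for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")          # render off-screen so the loop runs headless
import matplotlib.pyplot as plt

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: two features per point, real-valued targets in (0, 1).
# These numbers are made up for this sketch.
X = np.array([[0.5, 2.5], [1.0, 4.0], [2.0, 2.0], [3.0, 1.0], [4.0, 3.5]])
Y = np.array([0.2, 0.9, 0.4, 0.3, 0.8])

w, b = np.array([-2.0, -2.0]), -2.0             # deliberately unfit start
xx, yy = np.meshgrid(np.linspace(0, 5, 60), np.linspace(0, 5, 60))

frames = []
for i in range(120):                            # 120 updates -> 120 plots
    x, y = X[i % len(X)], Y[i % len(X)]
    y_pred = sigmoid(np.dot(w, x) + b)
    grad = (y_pred - y) * y_pred * (1 - y_pred)  # squared-error chain rule
    w, b = w - grad * x, b - grad                # learning rate of 1 (assumed)

    zz = sigmoid(w[0] * xx + w[1] * yy + b)      # model output over the grid
    fig, ax = plt.subplots()
    ax.contourf(xx, yy, zz, cmap="viridis", alpha=0.7)
    ax.scatter(X[:, 0], X[:, 1], c=Y, cmap="viridis", edgecolors="k")
    ax.set_title(f"Update {i + 1} of 120")
    frames.append(fig)
    plt.close(fig)
```

Each figure in `frames` can then be saved with `fig.savefig(...)` or handed to the animation step.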

Finally, we obtain 120 plots that depict the training.

The last plot

Our last step is to create an animation for the training.