Perceptrons

A single perceptron can implement the AND, OR, and NOT logic gates.
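As a sketch of this idea, here are the three gates built from one perceptron each. The weights and biases are hand-picked for illustration (they are not given in the notes); any values that put the right points on the right side of the threshold would work.

```python
# A single perceptron: output 1 if the weighted sum plus bias is positive, else 0.
def perceptron_gate(x, weights, bias):
    s = sum(xi * wi for xi, wi in zip(x, weights)) + bias
    return 1 if s > 0 else 0

def AND(a, b):
    # fires only when both inputs are 1 (1 + 1 - 1.5 > 0)
    return perceptron_gate([a, b], [1, 1], -1.5)

def OR(a, b):
    # fires when at least one input is 1 (1 - 0.5 > 0)
    return perceptron_gate([a, b], [1, 1], -0.5)

def NOT(a):
    # negative weight flips the input (0.5 - a > 0 only when a = 0)
    return perceptron_gate([a], [-1], 0.5)
```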

However, no single perceptron can implement the XOR gate.

The solution is a network of perceptrons: networked elements are required.

Three perceptrons are enough to implement the XOR gate.
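One common wiring of three perceptrons for XOR (an assumption for illustration; the notes do not specify which three gates) is an OR unit and a NAND unit feeding an AND unit:

```python
# One perceptron unit: output 1 if the weighted sum plus bias is positive.
def unit(x, weights, bias):
    return 1 if sum(xi * wi for xi, wi in zip(x, weights)) + bias > 0 else 0

def XOR(a, b):
    or_out   = unit([a, b], [1, 1], -0.5)          # OR of the inputs
    nand_out = unit([a, b], [-1, -1], 1.5)         # NAND of the inputs
    return unit([or_out, nand_out], [1, 1], -1.5)  # AND of the two outputs
```

XOR is true exactly when at least one input is 1 (OR) but not both (NAND), which is why this combination works.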

When the individual outputs of a layer of perceptrons do not need to be visualised, we call that layer a Hidden Layer.

Once you begin networking perceptrons, you can compute any Boolean function.

This arrangement, with perceptrons organised in layers, is called a multi-layer perceptron.

Linear Classifier

A perceptron operates on real-valued vectors.

The weights define a boundary: every input on one side is classified as 0, and every input on the other side as 1.

A perceptron therefore acts as a linear classifier: its boundary line splits the input space into a 0 region and a 1 region.

With many perceptrons, each contributing its own linear classifier, you can enclose a shape: their boundary lines combine into a more complex boundary.

The enclosed region is where all the perceptrons output a 1.
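A minimal sketch of this idea, with three hand-picked linear classifiers (my choices, not from the notes) whose "1" sides overlap in a triangular region. A point is inside the region only if every classifier fires:

```python
# One linear classifier: output 1 on the positive side of the line w1*x + w2*y + b = 0.
def classifier(x, y, w1, w2, b):
    return 1 if w1 * x + w2 * y + b > 0 else 0

def inside_region(x, y):
    # three illustrative boundaries enclosing the triangle x > 0, y > 0, x + y < 1
    c1 = classifier(x, y, 1, 0, 0)     # right of the line x = 0
    c2 = classifier(x, y, 0, 1, 0)     # above the line y = 0
    c3 = classifier(x, y, -1, -1, 1)   # below the line x + y = 1
    return 1 if c1 and c2 and c3 else 0  # all perceptrons must output 1
```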

Decision Boundaries

Finding and fitting a decision boundary to the data is one of the main objectives of Machine Learning.

Convergence, Coefficients

For each misclassification, we adjust the coefficients to move the boundary in the direction of the misclassified point. If it is still not classified correctly, we adjust the coefficients again. This iterative process is called Gradient Descent.
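The update described above can be sketched as the classic perceptron learning rule: nudge each coefficient toward correcting the misclassified point, and repeat until the boundary fits. The learning rate, epoch count, and AND-gate training data below are illustrative choices, not from the notes.

```python
# Train a perceptron's coefficients (weights and bias) on labelled 2-D points.
def train(data, lr=0.1, epochs=20):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            error = target - pred          # nonzero only on a misclassification
            w[0] += lr * error * x[0]      # move the boundary toward the point
            w[1] += lr * error * x[1]
            b    += lr * error
    return w, b

# illustrative example: learn the AND gate
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
```

Because the AND data is linearly separable, the updates eventually stop: every point lands on the correct side of the boundary and the coefficients converge.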

TensorFlow Playground

Build a network whose output isolates the region we wish to classify.

Individual perceptrons capture linear boundaries

# inputs, weights, and firing threshold for a single perceptron
x_input = [0.1, 0.5, 0.2]
w_weights = [0.4, 0.3, 0.6]
threshold = 0.5

def step_function(weighted_sum):
    # activation: fire (1) only if the weighted sum clears the threshold
    if weighted_sum > threshold:
        return 1
    else:
        return 0

def perceptron():
    weighted_sum = 0
    for x, w in zip(x_input, w_weights):
        weighted_sum += x * w
        print(weighted_sum)  # show the running sum after each input
    return step_function(weighted_sum)

output = perceptron()
print("Output:", str(output))
0.04000000000000001
0.19
0.31
Output: 0

Each iteration increases the weighted sum; at the end we reach 0.31, which is smaller than the threshold of 0.5, so the output is 0.

The perceptron is behaving like a Linear Classifier.