Perceptron Learning
Problem 1: Basic Perceptron Training
Given the following training dataset for a binary classification problem:
x₁    x₂    Target (t)
 1     1        1
 1    -1       -1
-1     1       -1
-1    -1       -1
Initial weights: w₁ = 0.5, w₂ = -0.5, bias w₀ = 0
Learning rate: η = 0.1
Tasks:
a) Apply the perceptron learning algorithm for 2 complete epochs. Show all calculations step by step.
b) What is the final weight vector after 2 epochs?
c) Test the final perceptron on all training samples and determine whether it has converged.
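A short script can check the hand calculations. This is a sketch under stated assumptions: the problem does not give an activation function or update rule, so it assumes the bipolar step f(net) = 1 if net ≥ 0 else -1 and the standard perceptron update w ← w + η(t - y)x, with the bias treated as a weight on a constant input of 1.

```python
# Perceptron training for Problem 1 (assumed bipolar step activation
# and update rule w <- w + eta*(t - y)*x; bias acts on a constant input 1).

def step(net):
    # round() guards against tiny float error at the decision boundary
    return 1 if round(net, 10) >= 0 else -1

samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
w0, w1, w2 = 0.0, 0.5, -0.5   # bias, w1, w2 from the problem statement
eta = 0.1

for epoch in range(2):
    for (x1, x2), t in samples:
        net = w0 + w1 * x1 + w2 * x2
        err = t - step(net)
        # the update changes the weights only on misclassified samples
        w0 += eta * err
        w1 += eta * err * x1
        w2 += eta * err * x2
    print(f"after epoch {epoch + 1}: w0={w0:.1f}, w1={w1:.1f}, w2={w2:.1f}")

# c) one extra pass with no updates: converged iff every sample is correct
converged = all(step(w0 + w1 * x1 + w2 * x2) == t for (x1, x2), t in samples)
print("converged:", converged)
```

The convergence check at the end answers part c) directly: a perceptron has converged exactly when a full pass over the training set produces no misclassification.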
Problem 2: Perceptron with Threshold Function
Consider a perceptron with the following parameters:
● Input: x = [2, -1, 3]
● Initial weights: w = [0.2, -0.3, 0.1]
● Bias: w₀ = -0.4
● Activation function: f(net) = 1 if net ≥ 0, else -1
● Target output: t = 1
● Learning rate: η = 0.2
Tasks:
a) Calculate the net input to the perceptron.
b) Determine the actual output of the perceptron.
c) Calculate the error.
d) Update all weights using the perceptron learning rule. Show complete calculations.
e) Verify your updated weights by recalculating the output with the same input.
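Parts a) through e) can be checked with a few lines, assuming the perceptron rule w ← w + η(t - y)x with the bias updated as w₀ ← w₀ + η(t - y):

```python
# Problem 2 worked in code (assumed update rule w <- w + eta*(t - y)*x).
x = [2, -1, 3]
w = [0.2, -0.3, 0.1]
w0 = -0.4
t, eta = 1, 0.2

net = w0 + sum(wi * xi for wi, xi in zip(w, x))   # a) net input
y = 1 if net >= 0 else -1                          # b) actual output
err = t - y                                        # c) error
w0 += eta * err                                    # d) weight updates
w = [wi + eta * err * xi for wi, xi in zip(w, x)]

print(f"net={net:.1f}, y={y}, error={err}")        # net=0.6, y=1, error=0
print("updated weights:", w0, w)                   # unchanged, since error is 0

# e) recompute the output with the updated weights
y_after = 1 if w0 + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1
print("output after update:", y_after)
```

Because the sample is already classified correctly (net = 0.6 ≥ 0, so y = t = 1), the error is zero and the update in d) leaves every weight unchanged.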
Problem 4: Perceptron Convergence Analysis
Given the XOR problem dataset:
x₁   x₂   Target
0    0      0
0    1      1
1    0      1
1    1      0
Initial weights: w₁ = 0.5, w₂ = 0.5, w₀ = 0
Learning rate: η = 0.1
Activation: f(net) = 1 if net > 0, else 0
Tasks:
a) Apply the perceptron algorithm to the 4 training samples (one epoch). Show all weight updates. (4 marks)
b) Explain why this problem cannot be solved by a single perceptron. Support your answer with geometric reasoning. (2 marks)
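The one-epoch trace in part a) can be reproduced with the activation given in the statement, assuming the usual update rule w ← w + η(t - y)x:

```python
# Problem 4 a): one epoch over the XOR data with the stated binary step
# activation (1 if net > 0 else 0); update rule w <- w + eta*(t - y)*x assumed.
samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w0, w1, w2 = 0.0, 0.5, 0.5
eta = 0.1

for (x1, x2), t in samples:
    net = w0 + w1 * x1 + w2 * x2
    y = 1 if net > 0 else 0
    err = t - y
    w0 += eta * err
    w1 += eta * err * x1
    w2 += eta * err * x2
    print(f"x=({x1},{x2}) t={t} net={net:.1f} y={y} "
          f"-> w0={w0:.1f} w1={w1:.1f} w2={w2:.1f}")
```

In this epoch only (1, 1) is misclassified. Running further epochs just cycles the weights, which previews part b): the four XOR points are not linearly separable, so no single weight vector can ever classify all of them correctly.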
Problem 5: Perceptron Decision Boundary
A trained perceptron has the following parameters:
● Weights: w₁ = 0.6, w₂ = -0.8
● Bias: w₀ = 0.2
Tasks:
a) Write the equation of the decision boundary.
b) Find the x₂-intercept and x₁-intercept of the decision boundary.
c) Classify the following points: (1, 0), (-1, 1), (0.5, -0.5). Show calculations.
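All three parts follow from one assumption: the boundary is the set of points where the net input w₀ + w₁x₁ + w₂x₂ is zero, and points are classified by its sign. A sketch:

```python
# Problem 5: decision boundary 0.2 + 0.6*x1 - 0.8*x2 = 0 (assuming the
# rule y = +1 if w0 + w1*x1 + w2*x2 >= 0, else -1).
w0, w1, w2 = 0.2, 0.6, -0.8

# b) intercepts of the boundary line
x1_intercept = -w0 / w1   # set x2 = 0: x1 = -1/3
x2_intercept = -w0 / w2   # set x1 = 0: x2 = 0.25
print(f"x1-intercept: {x1_intercept:.3f}, x2-intercept: {x2_intercept:.2f}")

# c) classify each point by the sign of its net input
results = []
for x1, x2 in [(1, 0), (-1, 1), (0.5, -0.5)]:
    net = w0 + w1 * x1 + w2 * x2
    cls = 1 if net >= 0 else -1
    results.append(cls)
    print(f"({x1}, {x2}): net = {net:+.1f} -> class {cls:+d}")
```

The printed nets (0.8, -1.2, 0.9) match the hand calculation: the first and third points fall on the positive side of the line, the second on the negative side.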
Problem 6: Learning Rate Effect
Consider the training sample: x = [1, -2], target t = 1
Current perceptron weights: w = [0.3, -0.1], bias w₀ = 0.2
Tasks:
a) Calculate the output and error for this sample.
b) Update the weights using learning rate η = 0.1.
c) Update the weights using learning rate η = 1.0 (starting from original weights).
d) Compare the two results and explain the effect of the learning rate on weight updates.
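One subtlety worth checking in code: with a threshold activation (as in Problem 2), this sample gives net = 0.7, so y = t = 1 and the perceptron rule changes nothing for either η. The learning-rate contrast only appears if the error is taken against the raw net input (the delta rule), so the sketch below shows both, under those assumptions:

```python
# Problem 6: effect of eta under two error definitions (assumptions noted
# above; neither rule is specified in the problem statement).
x = [1, -2]
w = [0.3, -0.1]
w0, t = 0.2, 1

net = w0 + sum(wi * xi for wi, xi in zip(w, x))   # 0.2 + 0.3 + 0.2 = 0.7
y = 1 if net >= 0 else -1                          # y = 1, so t - y = 0

for eta in (0.1, 1.0):
    # perceptron rule: error t - y = 0, so weights are unchanged
    w_perc = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
    # delta rule: error t - net = 0.3 scales directly with eta
    w_delta = [wi + eta * (t - net) * xi for wi, xi in zip(w, x)]
    print(f"eta={eta}: perceptron {w_perc}, delta {w_delta}")
```

Under the delta rule the step from η = 0.1 to η = 1.0 multiplies every weight change by 10, which is the effect part d) asks about: a larger learning rate takes bigger, faster, but potentially overshooting steps.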
Problem 7: A McCulloch-Pitts neuron needs to implement the logical AND function with
three inputs.
(a) Design the McCulloch-Pitts neuron by determining appropriate weights and threshold.
Show that your design correctly implements the AND function for all possible input
combinations.
(b) Explain why a single McCulloch-Pitts neuron cannot implement the XOR function. What
fundamental limitation does this reveal?
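For part a), the standard design uses unit weights and a threshold equal to the number of inputs, so the neuron fires only when every input is 1. A quick exhaustive check over all 8 input combinations:

```python
# Problem 7 a): McCulloch-Pitts neuron for three-input AND
# (weights 1, 1, 1 and threshold theta = 3; fires when the sum reaches theta).
from itertools import product

def mcp_and3(x1, x2, x3, theta=3):
    return 1 if 1 * x1 + 1 * x2 + 1 * x3 >= theta else 0

for x in product((0, 1), repeat=3):
    print(x, "->", mcp_and3(*x))

# every row must match x1 AND x2 AND x3
assert all(mcp_and3(*x) == (x[0] & x[1] & x[2]) for x in product((0, 1), repeat=3))
```

Only (1, 1, 1) reaches the threshold, so the truth table matches AND exactly. No analogous weight/threshold choice exists for XOR, which is the limitation part b) asks about: a single threshold unit can only compute linearly separable functions.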
Problem 8: A perceptron is trained to classify points as either above (class +1) or below
(class -1) a line in 2D space. The training data is:
Pattern   x₁   x₂   Target
   1       2    3     +1
   2       1    1     -1
   3       3    2     +1
   4       0    2     -1
Initial weights: w₁ = 0.2, w₂ = 0.1, bias w₀ = 0.3, learning rate η = 0.4
(a) Apply the perceptron learning algorithm for the first two patterns. Show weight updates
after each pattern.
(b) Determine if these data points are linearly separable by sketching them on a coordinate
system.
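The two updates in part a) can be checked with a short trace. The statement gives no activation function, so this sketch assumes the bipolar step f(net) = +1 if net ≥ 0 else -1 and the rule w ← w + η(t - y)x:

```python
# Problem 8 a): first two patterns only (assumed bipolar step activation
# and update rule w <- w + eta*(t - y)*x; bias on a constant input 1).
patterns = [((2, 3), 1), ((1, 1), -1)]
w0, w1, w2 = 0.3, 0.2, 0.1
eta = 0.4

for (x1, x2), t in patterns:
    net = w0 + w1 * x1 + w2 * x2
    y = 1 if net >= 0 else -1
    err = t - y
    w0 += eta * err
    w1 += eta * err * x1
    w2 += eta * err * x2
    print(f"x=({x1},{x2}) t={t:+d} net={net:.1f} y={y:+d} "
          f"-> w0={w0:.1f} w1={w1:.1f} w2={w2:.1f}")
```

Pattern 1 (net = 1.0, y = +1) is already correct, so nothing changes; pattern 2 (net = 0.6, y = +1 but t = -1) triggers an update of η(t - y) = -0.8 on the bias and on both weights, since x₁ = x₂ = 1.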