
Understanding Neural Networks: A Python Implementation
This document explores a Python implementation of a simple neural network for binary classification. We'll
examine the network architecture, training process, and prediction capabilities. The code demonstrates key
concepts like forward propagation, backpropagation, and the use of activation functions in a concise yet
powerful neural network model.

by Kolinco Mondol Akash 221-15-5444


Neural Network Architecture and Data Preparation
The neural network implemented in this code has a 3-4-1 architecture, meaning it has 3 input nodes, 4 hidden nodes, and 1
output node. This structure is suitable for binary classification tasks with 3-dimensional input data.

The training data (X_train) consists of 8 samples, each with 3 binary features. The corresponding labels (y_train) are binary
values. A single test sample (X_test) is provided to evaluate the model's performance after training.
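The source does not reproduce the dataset itself, only its shape: 8 training samples with 3 binary features each, binary labels, and one test sample. A minimal sketch of such data (the specific values below are assumptions for illustration) might look like this:

```python
import numpy as np

# 8 samples x 3 binary features (values assumed; only the shape
# is stated in the source).
X_train = np.array([
    [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1],
    [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1],
])

# One binary label per sample (assumed values), shaped (8, 1)
# so it lines up with the single output node.
y_train = np.array([[0], [0], [1], [1], [1], [1], [0], [0]])

# A single held-out sample to evaluate the trained model.
X_test = np.array([[1, 0, 1]])

print(X_train.shape, y_train.shape, X_test.shape)
```

Keeping the labels as a column vector of shape (8, 1) rather than a flat array avoids broadcasting surprises when subtracting the network's (8, 1) output during training.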

Network Parameters

• Input nodes: 3
• Hidden nodes: 4
• Output nodes: 1
• Learning rate: 0.1
• Momentum: 0.01 (unused in current implementation)

Activation Function

The network uses the sigmoid activation function and its derivative for both hidden and output layers. The sigmoid function maps input values to the range (0, 1), making it suitable for binary classification tasks.

Weight Initialization

Weights are initialized randomly using numpy's random number generator with a fixed seed (42) for reproducibility. This ensures consistent results across multiple runs of the code.
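The activation function and weight initialization described above can be sketched as follows (the uniform range for the initial weights is an assumption; the source only specifies the 3-4-1 shape and the seed of 42):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(a):
    # Derivative written in terms of the sigmoid's output a = sigmoid(x),
    # which is what backpropagation has on hand after the forward pass.
    return a * (1.0 - a)

# Fixed seed so every run starts from the same random weights.
np.random.seed(42)

# 3-4-1 architecture: 3 inputs -> 4 hidden units -> 1 output.
W1 = np.random.uniform(-1, 1, size=(3, 4))  # input-to-hidden weights
W2 = np.random.uniform(-1, 1, size=(4, 1))  # hidden-to-output weights
```

Expressing the derivative in terms of the sigmoid's output (rather than its input) is a common trick: the forward pass already computed that value, so no extra exponentials are needed during backpropagation.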
Training Process and Model Evaluation
The neural network is trained using a simple implementation of the backpropagation algorithm. The training process involves forward propagation to
compute predictions, followed by backpropagation to update the weights based on the computed error.

1 Forward Propagation
Input data is propagated through the network, applying weights and activation functions at each layer to compute the final output.

2 Error Calculation
The difference between predicted and actual outputs is calculated to determine the model's error.

3 Backpropagation
The error is propagated backwards through the network, adjusting weights to minimize the loss function.

4 Weight Update
Weights are updated using the calculated gradients and the specified learning rate.
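The four steps above can be sketched end to end as a minimal training loop. The 3-4-1 shape, sigmoid activation, seed 42, and learning rate 0.1 come from the text; the dataset values, epoch count, and bias-free weight layout are assumptions made for this sketch:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

np.random.seed(42)  # reproducible initial weights, per the text

# 3-4-1 network (no bias terms in this sketch)
W1 = np.random.uniform(-1, 1, size=(3, 4))
W2 = np.random.uniform(-1, 1, size=(4, 1))
lr = 0.1

# Assumed 8-sample dataset of 3 binary features with binary labels.
X_train = np.array([
    [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1],
    [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1],
])
y_train = np.array([[0], [0], [1], [1], [1], [1], [0], [0]])

for epoch in range(10000):
    # 1. Forward propagation
    hidden = sigmoid(X_train @ W1)
    output = sigmoid(hidden @ W2)

    # 2. Error calculation
    error = y_train - output

    # 3. Backpropagation: scale errors by the sigmoid derivative
    #    and push them back through the weights.
    delta_out = error * output * (1.0 - output)
    delta_hidden = (delta_out @ W2.T) * hidden * (1.0 - hidden)

    # 4. Weight update using the gradients and the learning rate
    W2 += lr * (hidden.T @ delta_out)
    W1 += lr * (X_train.T @ delta_hidden)

# Evaluate on a single test sample.
X_test = np.array([[1, 0, 1]])
prediction = sigmoid(sigmoid(X_test @ W1) @ W2)
print(prediction)  # a value in (0, 1); threshold at 0.5 for a class label
```

Note that the update adds the gradient term because `error` is defined as target minus prediction; with the opposite sign convention the update would subtract. The momentum parameter listed earlier is omitted here, matching the text's remark that it is unused in the current implementation.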
