Neural Network from Scratch with NumPy

A minimal two‑layer feedforward neural network implemented in pure NumPy and trained on the MNIST “Digit Recognizer” dataset. This repository demonstrates how to build, train, and evaluate the network from scratch. You will see how to:

  • Load and preprocess the MNIST “Digit Recognizer” dataset.
  • Initialize weights and biases with Xavier (Glorot) initialization (sketched just after this list).
  • Implement forward propagation (ReLU activation in the hidden layer, softmax output; see the second sketch below).
  • Compute cross‑entropy loss.
  • Implement backward propagation and update parameters via gradient descent (see the final sketch below).
  • Track training cost and validation accuracy over epochs.
  • Generate predictions on the test set for Kaggle submission.
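
The initialization step can be sketched in plain NumPy roughly as follows. This uses the Glorot normal variant (std = sqrt(2 / (fan_in + fan_out))); the hidden width of 64, the seed, and the name `init_params` are illustrative assumptions, not necessarily the repository's exact choices.

```python
import numpy as np

def init_params(n_in=784, n_hidden=64, n_out=10, seed=0):
    # Xavier (Glorot) normal initialization: std = sqrt(2 / (fan_in + fan_out)).
    # The hidden width of 64 is an assumed, illustrative value.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, np.sqrt(2.0 / (n_in + n_hidden)), size=(n_hidden, n_in))
    b1 = np.zeros((n_hidden, 1))
    W2 = rng.normal(0.0, np.sqrt(2.0 / (n_hidden + n_out)), size=(n_out, n_hidden))
    b2 = np.zeros((n_out, 1))
    return W1, b1, W2, b2
```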

No high‑level ML frameworks (TensorFlow, Keras, PyTorch) are used—only NumPy for numerical operations.
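
Concretely, the forward pass and loss from the list above might look like this minimal sketch, assuming inputs are stored one example per column in a (784, m) array; `forward` and `cross_entropy` are illustrative names, not the repository's actual API.

```python
import numpy as np

def forward(X, W1, b1, W2, b2):
    # Hidden layer: affine transform followed by ReLU.
    Z1 = W1 @ X + b1
    A1 = np.maximum(Z1, 0)
    # Output layer: affine transform followed by a numerically stable softmax.
    Z2 = W2 @ A1 + b2
    Z2 = Z2 - Z2.max(axis=0, keepdims=True)
    A2 = np.exp(Z2) / np.exp(Z2).sum(axis=0, keepdims=True)
    return Z1, A1, A2

def cross_entropy(A2, Y_onehot, eps=1e-12):
    # Mean cross-entropy over the batch; Y_onehot has shape (10, m).
    m = Y_onehot.shape[1]
    return -np.sum(Y_onehot * np.log(A2 + eps)) / m
```

Predicted digits, whether for validation accuracy or the Kaggle submission file, are then just `A2.argmax(axis=0)`.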

There is a short video walking through the entire build, from data preprocessing and weight initialization to forward and backward propagation and training.
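
Finally, a hedged sketch of the backward pass and gradient-descent update described in the list above, under the same assumed one-example-per-column layout; the learning rate of 0.1 is illustrative.

```python
import numpy as np

def backward(X, Y_onehot, Z1, A1, A2, W2):
    # Gradients of the mean cross-entropy loss with respect to each parameter.
    m = X.shape[1]
    dZ2 = (A2 - Y_onehot) / m          # softmax + cross-entropy combine to this
    dW2 = dZ2 @ A1.T
    db2 = dZ2.sum(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)      # ReLU passes gradient only where Z1 > 0
    dW1 = dZ1 @ X.T
    db1 = dZ1.sum(axis=1, keepdims=True)
    return dW1, db1, dW2, db2

def update(params, grads, lr=0.1):
    # Plain gradient descent step over [W1, b1, W2, b2].
    return [p - lr * g for p, g in zip(params, grads)]
```

Because softmax and cross-entropy are differentiated together, the output-layer error reduces to `A2 - Y_onehot`, which keeps the backward pass short.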
