While working through Neural Networks from Scratch by Harrison Kinsley and Daniel Kukiela, I implemented every component of a neural network by hand in Python and NumPy: neurons, activation functions, loss functions, backpropagation, and optimizers such as SGD, RMSProp, and Adam. My goal was to develop a deep understanding of how neural networks operate internally, beyond the abstractions provided by high-level libraries like TensorFlow and PyTorch.
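
To give a flavor of what "from scratch" means here, the sketch below shows a dense layer and a ReLU activation written against plain NumPy, in the spirit of the layer objects the book builds up. The class and attribute names are illustrative assumptions, not necessarily the repo's actual API.

```python
import numpy as np

class DenseLayer:
    # Illustrative sketch, not necessarily the repo's exact class.
    def __init__(self, n_inputs, n_neurons):
        # Small random weights, zero biases (one bias per neuron)
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        self.inputs = inputs  # cache for the backward pass
        self.output = inputs @ self.weights + self.biases

    def backward(self, dvalues):
        # Chain rule: gradients w.r.t. parameters and w.r.t. inputs
        self.dweights = self.inputs.T @ dvalues
        self.dbiases = dvalues.sum(axis=0, keepdims=True)
        self.dinputs = dvalues @ self.weights.T

class ReLU:
    def forward(self, inputs):
        self.inputs = inputs
        self.output = np.maximum(0, inputs)

    def backward(self, dvalues):
        # Gradient flows only where the forward input was positive
        self.dinputs = dvalues.copy()
        self.dinputs[self.inputs <= 0] = 0
```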
This project reinforced my knowledge of vectorized computation, matrix algebra, and gradient-based optimization, while building strong intuition for the mathematical and computational structure of deep learning models. The final implementation can train both classification and regression networks entirely from first principles.
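
As one example of the classification path, here is a hedged sketch of a numerically stable softmax combined with categorical cross-entropy, including the simplified combined gradient; the function names and array shapes are assumptions for illustration.

```python
import numpy as np

def softmax_crossentropy_forward(logits, y_true):
    # logits: (batch, classes) raw scores; y_true: (batch,) integer labels
    # Subtract the row max before exponentiating for numerical stability
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)
    # Clip to avoid log(0), then average the negative log-likelihood
    correct = np.clip(probs[np.arange(len(y_true)), y_true], 1e-7, 1 - 1e-7)
    return probs, -np.log(correct).mean()

def softmax_crossentropy_backward(probs, y_true):
    # The combined gradient simplifies to (probs - one_hot) / batch_size
    dlogits = probs.copy()
    dlogits[np.arange(len(y_true)), y_true] -= 1
    return dlogits / len(y_true)
```

Fusing softmax and cross-entropy into a single backward step sidesteps the full softmax Jacobian, a simplification the book itself motivates.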
Reference:
Kinsley, H., & Kukiela, D. (2020). Neural Networks from Scratch in Python. https://nnfs.io
GitHub repo: ethanlewellin/NNS