MNIST classification using a Multi-Layer Perceptron (MLP) with 2 hidden layers. Several weight initializers and batch normalization are implemented.
Updated Jan 20, 2017 - Python
Implementation of key neural network concepts in numpy.
A library for building feed-forward neural networks, convolutional nets, linear regression, and logistic regression models.
My neural network, made from scratch (and love). It learns to play Flappy Bird flawlessly in 3 to 9 generations!
Generic L-layer fully connected neural network implemented "straight in Python" using numpy.
Neural Network from Scratch with Python
A 3-layer linear neural network that classifies the MNIST dataset using TensorFlow.
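
The repositories above all revolve around Xavier (Glorot) initialization. As a quick orientation, here is a minimal sketch of the two common Xavier variants in numpy, not taken from any of the listed projects; the 784-256-128-10 layer sizes are illustrative assumptions matching an MNIST MLP with 2 hidden layers.

```python
import numpy as np


def xavier_uniform(fan_in, fan_out, rng=None):
    # Glorot/Xavier uniform: W ~ U(-limit, limit), limit = sqrt(6 / (fan_in + fan_out))
    rng = rng if rng is not None else np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))


def xavier_normal(fan_in, fan_out, rng=None):
    # Glorot/Xavier normal: W ~ N(0, 2 / (fan_in + fan_out))
    rng = rng if rng is not None else np.random.default_rng()
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))


# Illustrative layer sizes for an MNIST MLP with 2 hidden layers (784-256-128-10).
layer_sizes = [784, 256, 128, 10]
weights = [xavier_uniform(m, n) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]
```

Scaling the weights by the layer's fan-in and fan-out keeps the variance of activations and gradients roughly constant across layers, which is the motivation behind the initializers implemented in these projects.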