Regularized Logistic Regression
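A minimal NumPy sketch of what an L2-regularized logistic regression update typically looks like; the function name, argument names, and default values are illustrative and not taken from the repository.

```python
import numpy as np

def regularized_logistic_step(w, X, y, lr=0.1, lam=0.01):
    """One gradient-descent step for L2-regularized logistic regression.

    All names (w, X, y, lr, lam) are illustrative, not the repository's API.
    """
    z = X @ w
    p = 1.0 / (1.0 + np.exp(-z))              # sigmoid predictions
    grad = X.T @ (p - y) / len(y) + lam * w   # cross-entropy gradient + L2 term
    return w - lr * grad
```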
Implemented a neural network from scratch in Python with just NumPy, no frameworks involved.
An OOP deep neural network with a Keras-like syntax, offering many hyper-parameters, optimizers, and activation functions.
PyTorch implementation of important functions for WAIL and GMMIL
A "from-scratch" 2-layer neural network for MNIST classification built in pure NumPy, featuring mini-batch gradient descent, momentum, L2 regularization, and evaluation tools — no ML libraries used.
This module provides simple neural networks (currently, a basic multilayer perceptron trained with backpropagation and the Leaky ReLU activation function).
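A minimal sketch of the Leaky ReLU activation mentioned in the description, plus its derivative for backpropagation; the slope value alpha=0.01 is an assumption.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Positive inputs pass through; negative inputs are scaled by alpha
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # Derivative used during backpropagation
    return np.where(x > 0, 1.0, alpha)
```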
Generic L-layer 'straight in Python' fully connected Neural Network implementation using numpy.
A framework for implementing convolutional neural networks and fully connected neural networks.
Repository for Assignment 1 for CS 725
Fully connected neural network with Adam optimizer, L2 regularization, Batch normalization, and Dropout using only numpy
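A minimal sketch of the Adam update with an L2 penalty folded into the gradient, covering only the optimizer and regularization parts of this description (batch normalization and dropout are omitted). Function and parameter names are illustrative, not the repository's API.

```python
import numpy as np

def adam_step(w, m, v, grad, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, lam=1e-4):
    """One Adam step with L2 regularization; t is the 1-based step counter."""
    g = grad + lam * w               # L2 regularization term
    m = b1 * m + (1 - b1) * g        # first-moment estimate
    v = b2 * v + (1 - b2) * g ** 2   # second-moment estimate
    m_hat = m / (1 - b1 ** t)        # bias correction
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```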
Multivariate Regression and Classification Using a Feed-Forward Neural Network and Gradient Descent Optimization.
Implementation of optimization and regularization algorithms in deep neural networks from scratch
Mathematical machine learning algorithm implementations
Implementation of linear regression with L2 regularization (ridge regression) using numpy.
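For reference, ridge regression has the closed-form solution w = (X^T X + lam * I)^-1 X^T y; a minimal NumPy sketch follows. The function name, and the assumption that no separate intercept term is handled, are mine rather than the repository's.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X^T X + lam * I) w = X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)
```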
This repository contains the second of two homework assignments for the Machine Learning course taught by Prof. Luca Iocchi.
Multivariate Linear and Logistic Regression Using Gradient Descent Optimization.
This repository contains the code for the blog post "Understanding L1 and L2 Regularization in Machine Learning"; see the post for further details.
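For context, the two penalties such a post contrasts are the L1 term lam * sum(|w_i|) and the L2 term lam * sum(w_i**2); a minimal sketch of adding either to a base loss is shown below. The function signature is illustrative and not taken from the post's code.

```python
import numpy as np

def penalized_loss(w, base_loss, lam, kind="l2"):
    """Add an L1 or L2 penalty to a base loss (e.g. MSE).

    L1: lam * sum(|w|)   -> encourages sparse weights.
    L2: lam * sum(w**2)  -> shrinks weights smoothly toward zero.
    """
    if kind == "l1":
        return base_loss + lam * np.sum(np.abs(w))
    return base_loss + lam * np.sum(w ** 2)
```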
Investigates the effect of an alternative L2 regularization scheme in a neural network: the traditional sum of squared weights is replaced with a multiplicative interaction of weights, and the impact on model behavior and performance is assessed.
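The description does not give the exact form of the multiplicative interaction; one plausible reading, shown here purely as a hypothetical contrast with the standard sum of squares, is the sum of pairwise products over i < j of w_i * w_j.

```python
import numpy as np

def l2_penalty(w, lam):
    # Standard L2 regularization: lam * sum of squared weights
    return lam * np.sum(w ** 2)

def multiplicative_penalty(w, lam):
    # Hypothetical multiplicative-interaction penalty: lam * sum over i<j of w_i * w_j.
    # One plausible reading of the description, not the repository's actual formula.
    s = np.sum(w)
    return lam * 0.5 * (s ** 2 - np.sum(w ** 2))
```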