Regularized Logistic Regression
An OOP deep neural network using a syntax similar to Keras, with many hyperparameters, optimizers, and activation functions available.
PyTorch implementation of important functions for WAIL and GMMIL
A framework for implementing convolutional neural networks and fully connected neural networks.
Repository for Assignment 1 for CS 725
Fully connected neural network with Adam optimizer, L2 regularization, Batch normalization, and Dropout using only numpy
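As a point of reference for entries like the one above, a single Adam update step with the L2 penalty folded into the gradient fits in a few lines of numpy. This is a minimal sketch under assumed names (`w`, `grad`, `m`, `v`) and default hyperparameters, not code from any of the listed repositories:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-8, lam=1e-4):
    """One Adam update with an L2 penalty added to the gradient.

    Illustrative sketch: names and defaults are assumptions, not
    any repository's actual API.
    """
    grad = grad + lam * w                  # gradient of (lam/2) * ||w||^2
    m = beta1 * m + (1 - beta1) * grad     # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad**2  # second-moment estimate
    m_hat = m / (1 - beta1**t)             # bias correction (t starts at 1)
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```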
Multivariate Regression and Classification Using a Feed-Forward Neural Network and Gradient Descent Optimization.
Generic L-layer fully connected neural network implemented "straight in Python" using numpy.
Implementation of optimization and regularization algorithms in deep neural networks from scratch
Mathematical machine learning algorithm implementations
Implementation of linear regression with L2 regularization (ridge regression) using numpy.
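Ridge regression also admits a closed-form solution, which a numpy implementation like the one above could use instead of gradient descent. A minimal sketch, with all names assumed rather than taken from the repository:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam * I)^{-1} X^T y.

    Sketch only; the repository may solve the same problem differently.
    """
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Tiny usage example on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
w = ridge_fit(X, y, lam=0.1)
```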
Multivariate Linear and Logistic Regression Using Gradient Descent Optimization.
This repository contains the second of two homework assignments for the Machine Learning course taught by Prof. Luca Iocchi.
Code accompanying the blog post "Understanding L1 and L2 regularization in machine learning"; see the post for further details.
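The distinction that post covers can be stated in a few lines: both penalties are added to the training loss, but L1 sums absolute weights (encouraging sparsity) while L2 sums squared weights (shrinking them smoothly). A hedged sketch with assumed names, not the post's own code:

```python
import numpy as np

def loss_with_penalty(w, X, y, lam=0.1, penalty="l2"):
    """Mean-squared-error loss plus an L1 or L2 weight penalty (illustrative)."""
    mse = np.mean((X @ w - y) ** 2)
    if penalty == "l1":
        reg = lam * np.sum(np.abs(w))  # L1: pushes weights to exactly zero
    else:
        reg = lam * np.sum(w ** 2)     # L2: shrinks weights toward zero
    return mse + reg
```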
Investigates an alternative to L2 regularization in a neural network model, replacing the traditional sum of squared weights with a multiplicative interaction of weights, and assesses its influence on model behavior and performance.
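The description does not spell out the penalty, so the following is only one possible reading of "multiplicative interaction of weights": a sum of pairwise products w_i * w_j, shown next to the traditional squared-sum penalty it would replace. Every detail here is an assumption, not the repository's actual formula:

```python
import numpy as np

def interaction_penalty(w, lam=0.1):
    """Hypothetical penalty: sum of pairwise products w_i * w_j for i != j,
    computed via the identity (sum w)^2 - sum w^2. NOT the repo's formula."""
    s = np.sum(w)
    return lam * (s ** 2 - np.sum(w ** 2))

def l2_penalty(w, lam=0.1):
    """The traditional penalty being replaced: lam * sum of squared weights."""
    return lam * np.sum(w ** 2)
```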