Example of MLP architecture #93

@pplonski

Description

Thank you for this package. I'm looking for an example of how to implement a simple MLP (Multi-Layer Perceptron) with this package. Any code snippets or tutorials are welcome.

Below is some code that I glued together, but I have no idea how to do backpropagation; I would like to have a fit() method implemented.

Thank you!

from collections import OrderedDict

from numpy_ml.neural_nets.losses import CrossEntropy, SquaredError
from numpy_ml.neural_nets.utils import minibatch
from numpy_ml.neural_nets.activations import ReLU, Sigmoid
from numpy_ml.neural_nets.layers import FullyConnected
from numpy_ml.neural_nets.optimizers.optimizers import SGD

optimizer = SGD()
loss = SquaredError()


class MLP:
    def __init__(self):
        # Two fully-connected layers: 10 ReLU units -> 1 sigmoid output
        self.nn = OrderedDict()
        self.nn["L1"] = FullyConnected(10, act_fn="ReLU", optimizer=optimizer)
        self.nn["L2"] = FullyConnected(1, act_fn="Sigmoid", optimizer=optimizer)

    def forward(self, X, retain_derived=True):
        # Pass X through each layer in order, caching every layer's input
        Xs = {}
        out, rd = X, retain_derived
        for k, v in self.nn.items():
            Xs[k] = out
            out = v.forward(out, retain_derived=rd)
        return out, Xs

    def backward(self, grad, retain_grads=True):
        # Push the loss gradient back through the layers in reverse order
        dXs = {}
        out, rg = grad, retain_grads
        for k, v in reversed(list(self.nn.items())):
            dXs[k] = out
            out = v.backward(out, retain_grads=rg)
        return out, dXs
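
In case it helps frame the question, here is a rough sketch of the fit() method I am imagining on top of the class above. It is only a guess at the API: I hand-derive the squared-error gradient as y_pred - y instead of calling SquaredError.grad (which also seems to want the pre-activation values and the activation function), I assume minibatch() returns an index-batch generator plus the batch count, and I assume each FullyConnected layer has an update() method that applies its optimizer and flushes the stored gradients. Please correct whatever is wrong here.

class TrainableMLP(MLP):
    """Same network as above, plus the fit() loop I would like to have."""

    def fit(self, X, y, n_epochs=100, batchsize=32):
        for epoch in range(n_epochs):
            epoch_loss = 0.0
            # Assumption: minibatch() yields arrays of row indices and also
            # returns the number of batches.
            batch_generator, n_batches = minibatch(X, batchsize, shuffle=True)

            for ix in batch_generator:
                X_batch, y_batch = X[ix], y[ix]

                # Forward pass; retain_derived=True caches the values needed
                # for the backward pass.
                y_pred, _ = self.forward(X_batch, retain_derived=True)

                # Track the scalar loss (assuming the loss object is callable).
                epoch_loss += loss(y_batch, y_pred)

                # Hand-derived gradient of 0.5 * ||y_pred - y||^2 w.r.t. y_pred.
                dLdy = y_pred - y_batch

                # Backward pass stores the weight/bias gradients inside each layer.
                self.backward(dLdy, retain_grads=True)

                # Assumption: update() applies the layer's optimizer and then
                # clears the accumulated gradients.
                for layer in self.nn.values():
                    layer.update()

            print(f"epoch {epoch + 1}/{n_epochs} - loss {epoch_loss / n_batches:.4f}")

As a smoke test I would call it with X of shape (n_samples, n_features) and y of shape (n_samples, 1) with targets in [0, 1], e.g. X = np.random.rand(100, 5) and y = (X.sum(axis=1, keepdims=True) > 2.5).astype(float). Is this the intended way to wire forward/backward/update together, or is there a built-in training loop I am missing?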
