
Sequential MLP implementation #31

Open
AndreiMoraru123 opened this issue Feb 27, 2023 · 0 comments
AndreiMoraru123 commented Feb 27, 2023

Maybe not PR worthy, but the MLP implementation can be abstracted one step further by passing in the layers themselves instead of the input/output sizes yet again, since each individual layer already knows its own dimensions.

As such, I wrote it as:

class MLP:

  def __init__(self, layers):
    self.layers = layers

  def __call__(self, x):
    for l in self.layers:
      x = l(x)
    return x

  def parameters(self):
    return [p for layer in self.layers for p in layer.parameters()]

by which you can define a network more intuitively, much like PyTorch's Sequential:

n = MLP([Layer(3, 6), Layer(6, 3), Layer(3, 1)])
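A minimal standalone sketch of the chaining in __call__ (using a toy Neuron/Layer on plain floats rather than micrograd's Value objects, so the dimensions are easy to follow):

```python
import random

class Neuron:
  # toy neuron on plain floats; a real micrograd Neuron wraps Value objects
  def __init__(self, nin):
    self.w = [random.uniform(-1, 1) for _ in range(nin)]
    self.b = 0.0

  def __call__(self, x):
    return sum(wi * xi for wi, xi in zip(self.w, x)) + self.b

class Layer:
  def __init__(self, nin, nout):
    self.neurons = [Neuron(nin) for _ in range(nout)]

  def __call__(self, x):
    return [n(x) for n in self.neurons]

class MLP:
  def __init__(self, layers):
    self.layers = layers

  def __call__(self, x):
    for l in self.layers:
      x = l(x)
    return x

n = MLP([Layer(3, 6), Layer(6, 3), Layer(3, 1)])
out = n([1.0, 2.0, 3.0])
print(len(out))  # the final Layer(3, 1) yields a single output
```

Each layer's output list simply becomes the next layer's input, which is exactly what PyTorch's nn.Sequential does with its modules.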

To be even more rigorous, a dimension assertion can be added in the __init__:

class MLP:

  def __init__(self, layers):
    self.layers = layers
    for i in range(1, len(layers)):
      assert layers[i-1].nout == layers[i].nin

for which I would have to store nin & nout on the Layer as well:

class Layer:

  def __init__(self, nin, nout):
    self.nin = nin
    self.nout = nout
    self.neurons = [Neuron(nin) for _ in range(nout)]
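With nin and nout stored on each Layer, the dimension check can be exercised directly. A quick sketch (Layer stubbed down to just its shape, since the assertion only reads nin/nout) shows mismatched layers being rejected:

```python
class Layer:
  # shape-only stub; a real Layer would also build its neurons
  def __init__(self, nin, nout):
    self.nin = nin
    self.nout = nout

class MLP:
  def __init__(self, layers):
    self.layers = layers
    for i in range(1, len(layers)):
      # each layer's output width must match the next layer's input width
      assert layers[i - 1].nout == layers[i].nin, (
        f"layer {i - 1} outputs {layers[i - 1].nout} values, "
        f"but layer {i} expects {layers[i].nin} inputs"
      )

MLP([Layer(3, 6), Layer(6, 3), Layer(3, 1)])  # valid chain, no error

try:
  MLP([Layer(3, 6), Layer(4, 1)])  # 6 != 4, rejected at construction
except AssertionError as e:
  print("rejected:", e)
```

Failing at construction time like this is nicer than getting a zip-length surprise deep inside a forward pass.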