Learning end2end with a neural network #17
Hi @jonnor ,
I do plan to add some examples as jupyter notebooks but I'm currently busy at other projects. For example, you want to train a model that will predict the activations, and learn a shared non-negative template jointly, then you can do something like this: import torch
from torch import nn
from torch import optim
from torchnmf.trainer import BetaMu
from torchnmf import NMF
#pick an activation function so the output is non-negative
H = nn.Sequential(AnotherModel(), nn.Softplus())
W = NMF(W=(out_channels, in_channels))
optimizer = optim.Adm(H.parameters())
trainer = BetaMu(W.parameters())
for x, y in dataloader:
# optimize NMF
def closure():
trainer.zero_grad()
with torch.no_grad():
h = H(x)
return y, W(H=h)
trainer.step(closure)
# optimize nueral net
h = H(x)
predict = W(H=h)
loss = ... # you can use other types of loss here
loss.backward()
optimizer.step()
optimizer.zero_grad() |
Hi @yoyololicon - thank you for the response and example code! In this framework, would the loss be something that compares the output of the NMF (decomposed and re-composed) against the target? Like RMS as a simple case, or a perceptual metric for something more advanced?
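For reference, the simplest variant of such a loss is a plain element-wise comparison between the target and the NMF reconstruction. A minimal sketch (the tensors and shapes here are illustrative placeholders, not from the project):

```python
import torch
from torch import nn

# Hypothetical shapes: a batch of magnitude spectrograms (batch, freq, time).
y = torch.rand(8, 128, 100)               # target spectrogram (non-negative)
reconstruction = torch.rand(8, 128, 100)  # stand-in for W(H=h)

# Two simple reconstruction losses that could be plugged into the training loop:
mse = nn.functional.mse_loss(reconstruction, y)  # squared error (RMS-style)
l1 = nn.functional.l1_loss(reconstruction, y)    # absolute error
```

A more perceptual alternative might compare log-compressed spectrograms, or embeddings from a pretrained audio model, instead of raw magnitudes.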
@jonnor |
Hi, thank you for this nice project.
Could one connect a neural network to the NMF module and learn them at the same time? Any example code or tips on how to do that?
I am interested in using a convolutional neural network frontend on spectrogram data, to capture somewhat more complex activations than single stationary spectrogram frames.
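One way such a CNN frontend could look, as a sketch only (the class name, layer sizes, and shapes below are illustrative assumptions, not from the project): it maps a spectrogram to a non-negative activation matrix suitable for the NMF step.

```python
import torch
from torch import nn

# Hypothetical CNN frontend: maps a spectrogram (batch, 1, freq, time)
# to non-negative activations (batch, rank, time).
class ConvFrontend(nn.Module):
    def __init__(self, freq_bins: int = 128, rank: int = 16):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=(5, 5), padding=(2, 2)),
            nn.ReLU(),
            nn.Conv2d(32, rank, kernel_size=(freq_bins, 1)),  # collapse freq axis
            nn.Softplus(),  # keep activations non-negative
        )

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        # (batch, rank, 1, time) -> (batch, rank, time)
        return self.conv(spec).squeeze(2)

spec = torch.rand(4, 1, 128, 100)
h = ConvFrontend()(spec)
print(h.shape)  # torch.Size([4, 16, 100])
```

The temporal convolution lets each activation depend on a local context of frames rather than a single stationary column, which is the kind of structure a frame-by-frame NMF cannot capture on its own.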