Hello,
I've been experimenting with this architecture in nanoGPT. While I can get other architectures to train well there (e.g. RMT), I'm really struggling to get GLA to perform well.
Do you have any tips or code for training? For example, is there a repository you recommend, or key hyperparameter differences from standard transformers?
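For context, this is roughly how I'm swapping GLA into nanoGPT's residual block. It's a minimal sketch rather than my exact code: I'm assuming the `GatedLinearAttention` layer from `fla.layers` with `hidden_size`/`num_heads` arguments, and guarding against it returning a tuple.

```python
import torch.nn as nn
from fla.layers import GatedLinearAttention  # layer name assumed from this repo


class MLP(nn.Module):
    """nanoGPT's feed-forward block, unchanged."""
    def __init__(self, n_embd):
        super().__init__()
        self.c_fc = nn.Linear(n_embd, 4 * n_embd)
        self.gelu = nn.GELU()
        self.c_proj = nn.Linear(4 * n_embd, n_embd)

    def forward(self, x):
        return self.c_proj(self.gelu(self.c_fc(x)))


class GLABlock(nn.Module):
    """nanoGPT-style pre-norm residual block with CausalSelfAttention swapped for GLA."""
    def __init__(self, n_embd, n_head):
        super().__init__()
        self.ln_1 = nn.LayerNorm(n_embd)
        # hidden_size / num_heads argument names are my assumption about the fla API
        self.attn = GatedLinearAttention(hidden_size=n_embd, num_heads=n_head)
        self.ln_2 = nn.LayerNorm(n_embd)
        self.mlp = MLP(n_embd)

    def forward(self, x):
        y = self.attn(self.ln_1(x))
        if isinstance(y, tuple):  # some fla layers also return recurrent state, etc.
            y = y[0]
        x = x + y
        x = x + self.mlp(self.ln_2(x))
        return x
```

Everything else (optimizer, LR schedule, init) is left as stock nanoGPT in this sketch.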
Thanks!