Fix KL computation for dof != 1 #246
Conversation
Apologies, this includes changes from #245 because I screwed up my git commits. Also, the tests fail with
Wonderful, thanks for tracking down the bug! I guess I forgot to implement the ddof parameter in the positive gradient? So then I suppose the KL divergence was wrong in all the cases up until now? Also, I see two PRs now, and the other one seems to have only one commit. Should I close the other one and accept this one?
Yes, this PR includes both changes -- I wanted to make two separate PRs but screwed something up with my git branches. The other one is not needed anymore. But what's happening with the tests?
Yes. The gradient was implemented correctly, but the KL divergence was computed incorrectly, and only gave the right answer for dof=1.
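To illustrate the kind of bug being discussed, here is a minimal sketch of how the low-dimensional affinities and the KL objective depend on the degrees-of-freedom parameter in a t-SNE-style embedding. The function names and setup are hypothetical, not the library's actual API; the point is just that a KL formula hard-coded for the dof=1 (Cauchy) kernel gives the wrong value for any other dof, even when the gradient itself is correct.

```python
import numpy as np

def tsne_q(dist_sq, dof=1.0):
    """Low-dimensional affinities with a Student-t kernel (hypothetical helper).

    q_ij is proportional to (1 + d_ij^2 / dof) ** (-(dof + 1) / 2).
    For dof=1 this reduces to the familiar Cauchy kernel 1 / (1 + d^2).
    """
    w = (1.0 + dist_sq / dof) ** (-(dof + 1.0) / 2.0)
    return w / w.sum()

def kl_divergence(p, q, eps=1e-12):
    """Exact KL(P || Q) = sum_ij p_ij * log(p_ij / q_ij)."""
    p = p / p.sum()
    return float(np.sum(p * np.log(np.maximum(p, eps) / np.maximum(q, eps))))

# Toy data: pairwise squared distances and unnormalized target affinities.
rng = np.random.default_rng(0)
dist_sq = rng.random(10)
p = rng.random(10)

kl_dof1 = kl_divergence(p, tsne_q(dist_sq, dof=1.0))
kl_dof05 = kl_divergence(p, tsne_q(dist_sq, dof=0.5))
# A KL formula that silently assumes dof=1 would report kl_dof1 in both
# cases, even though the true divergence differs whenever dof != 1.
```

Since both distributions are normalized, either value is a valid, non-negative KL divergence; the bug is only visible when comparing the dof=1 shortcut against the exact computation for other dof values.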
Okay, I guess we can close the other PR then. I'm looking at the errors now, and I can't really see anything obvious. The
Perhaps there was a recent change in
I believe I've fixed this in #249. Could you please rebase this onto the latest master, and then hopefully, everything passes. |
Yes but I am traveling now so not sure how soon I'll be able to get to it.
If you can do it yourself, then please go ahead.
Force-pushed from 4016054 to 83ba581
Sorry for the delay; setting this up so that I can push to your PR is a pain, but hopefully the tests pass now.
Great, thanks a lot for taking care of it! And apologies that I could not fix it myself quickly enough; I'm only back at my computer now.
No worries, thanks for finding the fix!
Could you tag it as 1.0.1 at some point? Would be cool if this bug fix gets into conda.
I'm sorry that this took so long. I've now released v1.0.1 with this fix.
This should hopefully fix #238 and, in addition, implement exact KL computation for arbitrary dof.