Potential bug in NegativeLogLikelihood impl #150

@hweom

Description

I've just started exploring deep learning and chose Juice as my starting framework, since I want to stick with Rust. Since I'm pretty new to the domain, this might just be my mistake.

I was looking at NegativeLogLikelihood::compute_output() and I think there is a bug. Instead of

        for &label_value in native_labels {
            let probability_value = native_probabilities[label_value as usize];
            writable_loss.push(-probability_value);
        }

it should be

        let mut offset = 0;
        for &label_value in native_labels {
            let probability_value = native_probabilities[offset + label_value as usize];
            writable_loss.push(-probability_value);
            offset += self.num_classes;
        }

Otherwise we're comparing every label in the batch against the probabilities of only the first sample in the batch, aren't we?
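
For concreteness, here is a minimal standalone sketch (with made-up probabilities, labels, and num_classes, not the actual Juice tensors) showing why the per-sample offset matters when the probabilities are stored as a flattened [batch_size, num_classes] buffer:

        fn main() {
            // Hypothetical flattened probabilities for a batch of 2 samples, 3 classes each:
            // sample 0 -> [0.7, 0.2, 0.1], sample 1 -> [0.1, 0.3, 0.6]
            let probabilities = [0.7f32, 0.2, 0.1, 0.1, 0.3, 0.6];
            let labels = [0usize, 2];
            let num_classes = 3usize;

            let mut loss = Vec::new();
            let mut offset = 0;
            for &label in &labels {
                // Index into the current sample's slice rather than always the first one.
                loss.push(-probabilities[offset + label]);
                offset += num_classes;
            }
            // Prints [-0.7, -0.6]; without the offset, both reads would come from
            // sample 0's probabilities (indices 0 and 2 of the whole buffer).
            println!("{:?}", loss);
        }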

Interestingly, I tried changing it and there was no noticeable effect on the MNIST example...
