I've been trying various datasets with 2 inputs. When the total length of the dataset exceeds about 100, with inputs normalized to the range -1.00 to 1.00, the predict method tends to return NaN. The default tanh activation is being used. Increasing the number of hidden layers and units seems to raise the dataset length at which this happens, but is this to be expected? I would assume the neural network should be able to come up with at least some sort of prediction after several iterations.
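Not a fix, but a generic pre-flight check can help narrow down whether the NaN originates in the training data or inside the network. The sketch below is library-agnostic TypeScript; `findBadRows` is a made-up helper, not part of this library's API:

```ts
// Scan a 2-input dataset for values that are non-finite or outside the
// expected [-1, 1] range before training. If this passes and predict()
// still returns NaN, the problem more likely lies inside the network
// (e.g. exploding weights) than in the input data.
function findBadRows(rows: number[][], lo = -1, hi = 1): number[] {
  const bad: number[] = [];
  rows.forEach((row, i) => {
    if (row.some(x => !Number.isFinite(x) || x < lo || x > hi)) {
      bad.push(i); // remember the index of each offending row
    }
  });
  return bad;
}

// Example: row 1 has an out-of-range value, row 2 contains a NaN.
console.log(findBadRows([[0.2, -0.5], [1.5, 0.0], [NaN, 0.3]])); // [1, 2]
```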
Apologies, after retrying and taking a deeper look it turns out the issue was the inputs being incorrectly normalized on my end: they were always being set to 1. Still, even if every input is the same constant value, should the network return NaN in that case? Feel free to close this issue after answering :)
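For reference, a minimal min-max scaling sketch in TypeScript, not tied to this library (`normalizeColumn` is a made-up helper). It includes a guard for the constant-input case mentioned above; without some special handling, a constant column either collapses to a single value or, if you divide by the (zero) spread, turns every normalized value into NaN:

```ts
// Min-max scale one input column into [lo, hi] (here [-1, 1]).
// A constant column (max === min) has no spread to scale, so map it to the
// midpoint instead of dividing by zero (which would yield NaN for every row).
function normalizeColumn(values: number[], lo = -1, hi = 1): number[] {
  const min = Math.min(...values);
  const max = Math.max(...values);
  if (max === min) {
    return values.map(() => (lo + hi) / 2);
  }
  return values.map(v => lo + ((v - min) / (max - min)) * (hi - lo));
}

console.log(normalizeColumn([10, 20, 30])); // [-1, 0, 1]
console.log(normalizeColumn([5, 5, 5]));    // [0, 0, 0] rather than [NaN, NaN, NaN]
```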
It seems like under certain circumstances a rounding error creeps in and damages everything. I tried a network with 20 hidden layers of 20 units each, and it almost always returned the same result for any input. Also, training on the same data in separate runs gives me different predictions; is that expected behaviour?
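The "same result for any input" symptom is at least consistent with tanh saturation in a deep stack: once pre-activations get large, tanh pins near ±1 and the original input stops mattering. A standalone sketch (no library involved, weights and bias chosen by hand purely for illustration) that shows the effect:

```ts
// Push a value through many tanh layers with a fixed weight and bias.
// Once the pre-activations are large, tanh saturates and every input
// collapses to essentially the same output.
function deepTanh(input: number, layers: number, weight: number, bias: number): number {
  let x = input;
  for (let i = 0; i < layers; i++) {
    x = Math.tanh(weight * x + bias);
  }
  return x;
}

for (const input of [-1, -0.1, 0.1, 1]) {
  // With 20 layers, weight 2 and bias 3, every input ends up at ~0.9999.
  console.log(input, deepTanh(input, 20, 2, 3));
}
```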