In your paper and in my own tests, inference takes about 86 ms on an RTX 4090.
That is much longer than a single frame at 50 fps (20 ms) or even 30 fps (~33 ms).
Because of this, the robot moves jerkily: it stops and waits until the next inference finishes.
Is there a way to make the robot move smoothly? Would a faster GPU bring inference time below the frame time?
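One workaround I'm considering is to decouple inference from the control loop: run the policy in a background thread that fills an action buffer, and have the 50 Hz control loop consume it, holding the last action instead of stopping when the buffer runs dry. Below is a minimal sketch of that idea; `policy`, `get_obs`, and `send_action` are placeholders, and it assumes the policy returns a chunk of future actions per call (not something your repo necessarily supports as-is):

```python
import threading
import time
from queue import Queue, Empty

CONTROL_HZ = 50            # control-loop rate from the paper
CONTROL_DT = 1.0 / CONTROL_HZ

def inference_worker(policy, get_obs, action_queue):
    """Continuously run the slow (~86 ms) policy and enqueue its action chunk."""
    while True:
        obs = get_obs()
        chunk = policy(obs)            # assumed: sequence of future actions
        for action in chunk:
            action_queue.put(action)

def control_loop(send_action, action_queue):
    """Send actions at 50 Hz; repeat the last action if the queue is empty."""
    last_action = None
    while True:
        t0 = time.monotonic()
        try:
            last_action = action_queue.get_nowait()
        except Empty:
            pass                       # buffer ran dry: hold last action, don't stall
        if last_action is not None:
            send_action(last_action)
        time.sleep(max(0.0, CONTROL_DT - (time.monotonic() - t0)))

# action_queue = Queue()
# threading.Thread(target=inference_worker,
#                  args=(policy, get_obs, action_queue), daemon=True).start()
# control_loop(send_action, action_queue)
```

Would something like this (or interpolating between buffered actions) be compatible with how the model is meant to be deployed?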