This repository was archived by the owner on Mar 19, 2024. It is now read-only.

Explore ShardedDDP reduce_buffer_size setting to config #177

Closed
wants to merge 2 commits

Conversation

@prigoyal (Contributor) commented Feb 8, 2021

Summary: as title. In VISSL, we need to set `reduce_buffer_size=0` because some parameters are never actually used, and `find_unused_parameters` is not handled by ShardedDDP. Setting the buffer size to 0 all-reduces the gradients immediately instead of bucketing them.

Differential Revision: D26276800

Differential Revision: D26236748

fbshipit-source-id: 521c73bdf4fdab110b8e10a20c83da80e9bf0fa2
Summary: as title. In VISSL, we need to set `reduce_buffer_size=0` because some parameters are never actually used, and `find_unused_parameters` is not handled by ShardedDDP. Setting the buffer size to 0 all-reduces the gradients immediately instead of bucketing them.

Differential Revision: D26276800

fbshipit-source-id: bd976270f20839a9a3c9b057106a6e91f1c18a34
@facebook-github-bot added the CLA Signed (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) and fb-exported labels on Feb 8, 2021
@facebook-github-bot (Contributor) commented: This pull request was exported from Phabricator. Differential Revision: D26276800

facebook-github-bot pushed a commit that referenced this pull request Feb 9, 2021
Summary:
Pull Request resolved: #177

as title. In VISSL, we need to set `reduce_buffer_size=0` because some parameters are never actually used, and `find_unused_parameters` is not handled by ShardedDDP. Setting the buffer size to 0 all-reduces the gradients immediately instead of bucketing them.

Reviewed By: min-xu-ai

Differential Revision: D26276800

fbshipit-source-id: 4bbe5a6e3a2b36b8a55abb6e120368025356db17