Set computation dtype for PP weights #1350

Open · wants to merge 1 commit into main from mattdavidow-pp-weights-compute-dtype
Conversation

@gobbleturk (Collaborator) commented Mar 6, 2025

Description

We saw extra memory usage: it was hard to fit PP=21, TP=4 for Llama-405B with fsdp_ag_once, which is surprising. Memory usage should be dominated by all-gathering the weights + gradients over FSDP, roughly 405B / (21 * 4) * 4 bytes ≈ 20 GB, but instead we were OOMing at a little over 32 GB on Trillium. The cause was an f32 copy of the fsdp-gathered grads.
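
At a high level, the fix is to keep the gathered weights in the configured computation dtype (bf16) rather than letting a float32 copy stay alive. Below is a minimal sketch of the idea only; names like `gathered_weights` and `compute_dtype` are illustrative and the actual MaxText change may differ:

```python
import jax
import jax.numpy as jnp


def cast_to_compute_dtype(gathered_weights, compute_dtype=jnp.bfloat16):
    """Cast floating-point leaves of the gathered-weight pytree to the
    computation dtype so no float32 copy is kept alive during the step.

    Back-of-envelope for Llama-405B with PP=21, TP=4: the all-gathered bf16
    weights + grads should cost roughly 405e9 / (21 * 4) * 4 bytes ~= 20 GB,
    while an extra float32 copy of the gathered grads costs about 4 bytes per
    gathered parameter, i.e. roughly that much again, consistent with the
    OOM described above.
    """
    return jax.tree_util.tree_map(
        lambda x: x.astype(compute_dtype)
        if jnp.issubdtype(x.dtype, jnp.floating)
        else x,
        gathered_weights,
    )
```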

Notice 1: Once all tests pass, the "pull ready" label will automatically be assigned. This label is used for administrative purposes. Please do not add it manually.

Notice 2: For external contributions, our settings currently require an approval from a MaxText maintainer to trigger CI tests.

Tests

Ran the default 1B model on my v4 devbox with ici_pipeline=2 and pipeline_fsdp_ag_once=[True, False]; an illustrative dtype sanity check is sketched after the numbers.

  • pipeline_fsdp_ag_once=True, old (f32): trace 10.3 GB
  • pipeline_fsdp_ag_once=True, new (bf16): trace 9.5 GB
  • pipeline_fsdp_ag_once=False, old (f32): trace 9.0 GB
  • pipeline_fsdp_ag_once=False, new (bf16): trace 8.5 GB
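
Outside of a profiler trace, a quick way to confirm that no float32 copies survive is to walk the gathered parameter pytree and check leaf dtypes. This is an illustrative sketch, not part of this PR; `params` and the default `compute_dtype` are assumptions:

```python
import jax
import jax.numpy as jnp


def report_float32_leaves(params, compute_dtype=jnp.bfloat16):
    """Print any float32 leaves in a parameter pytree; with the fix applied,
    the gathered weights should all appear in the computation dtype (bf16)."""
    leaves, _ = jax.tree_util.tree_flatten_with_path(params)
    for path, leaf in leaves:
        name = jax.tree_util.keystr(path)
        if leaf.dtype == jnp.float32:
            print(f"float32 leaf (extra memory): {name}")
        elif leaf.dtype != compute_dtype:
            print(f"other dtype {leaf.dtype}: {name}")
```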

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed.

@gobbleturk force-pushed the mattdavidow-pp-weights-compute-dtype branch from 12aa39b to 8803c09 on March 6, 2025 04:38