Planner for PipeDream-2BW #57
Comments
You can use this function for planning: https://github.com/msr-fiddle/pipedream/blob/pipedream_2bw/planner/planner.py#L33. For the performance and memory cost functions, you might want to use direct measurements (from running 100 or so iterations of the respective configuration).
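A minimal sketch of the "direct measurement" approach the maintainer suggests. The helper name `measure_block_time` and the dummy workload are hypothetical (not part of the PipeDream code); the idea is simply to warm up, then average wall-clock time over ~100 iterations of one block's forward (or forward+backward) pass:

```python
import time

def measure_block_time(block_fn, warmup=10, iters=100):
    """Estimate average wall-clock time per call of block_fn.

    block_fn is a placeholder for one pass through a single
    transformer block in the target configuration. For GPU work you
    would also need to synchronize the device before reading the clock.
    """
    for _ in range(warmup):          # discard warm-up iterations
        block_fn()
    start = time.perf_counter()
    for _ in range(iters):
        block_fn()
    return (time.perf_counter() - start) / iters

if __name__ == "__main__":
    # Dummy stand-in for a real block forward pass.
    dummy = lambda: sum(i * i for i in range(1000))
    print(f"avg time per block: {measure_block_time(dummy):.6f}s")
```

The resulting average could then be passed as `computation_time_per_block` when calling the planner.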
Thank you for your very quick answer! I was wondering how I can obtain values for some of the arguments: computation_time_per_block, num_parameters_per_block, num_activations_per_block, and output_activation_size. More specifically,
I'd appreciate it if you could answer these questions.
And yes, we're assuming that these are transformer models where the transformer block is repeated some number of times.
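Given that assumption, the per-block sizes can also be estimated analytically. The sketch below uses the standard GPT-style block layout (QKV plus output projection, a 4x MLP, and two LayerNorms); the function name and the formulas are approximations of mine, not taken from the PipeDream-2BW code:

```python
def transformer_block_costs(hidden_size, seq_len, micro_batch_size,
                            bytes_per_elem=2):
    """Rough analytical estimates for one GPT-style transformer block.

    num_parameters: 12*h^2 + 13*h, i.e. attention (4h^2 + 4h),
    MLP (8h^2 + 5h), and two LayerNorms (4h), weights plus biases.
    output_activation_size: bytes in the block's output tensor
    (micro_batch * seq_len * hidden), assuming fp16 by default.
    """
    h = hidden_size
    num_parameters = 12 * h * h + 13 * h
    output_activation_size = micro_batch_size * seq_len * h * bytes_per_elem
    return num_parameters, output_activation_size

# Example: a GPT-2-small-like block (h=768, seq=1024, micro-batch=8, fp16).
params, act_bytes = transformer_block_costs(768, 1024, 8)
```

num_activations_per_block would similarly be the sum of all intermediate tensors kept for the backward pass, which depends on the exact implementation, so measuring actual memory usage over a few iterations is likely more reliable.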
Thank you again for your kind support! I have a related question about PipeDream (the former version). When I try, the optimizer (
When I set a certain number of GPUs (8/16/32) to train ResNet, most of the generated configurations are blocked soon after training starts.
May I know if this is still available for non-commercial use now?
In the pipedream_2bw branch, we found the runtime that implements PipeDream-2BW.
However, no explanation of the planner is given.
Can we use the planner?