ModuleNotFoundError: No module named 'colossalai.zero.init_ctx' #3852
humanintel started this conversation in Community | General
Replies: 2 comments
- same question
- Modify line 18 of this file and try again.
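The reply above suggests editing the failing import line. The module path `colossalai.zero.init_ctx` was reorganized in later ColossalAI releases, so the example script's import no longer resolves. As a hedged sketch (the candidate paths below are assumptions; verify them against the docs for your installed version), you can probe which module path is actually importable before editing the script:

```python
import importlib.util

def first_importable(candidates):
    """Return the first module path in `candidates` that Python can
    resolve, or None if none of them are importable."""
    for name in candidates:
        try:
            if importlib.util.find_spec(name) is not None:
                return name
        except ModuleNotFoundError:
            # Raised when a parent package (e.g. 'colossalai') is missing.
            continue
    return None

# Hypothetical candidate locations across ColossalAI versions;
# check the release notes for the version you installed.
candidates = ["colossalai.zero.init_ctx", "colossalai.zero"]
print(first_importable(candidates))
```

If this prints `None`, the installed ColossalAI build does not ship either path, which points to a version mismatch between the example repository and the installed package.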
-
Hi,
When I tried to run a sample training script with ColossalAI, I got the following error message:
```
xxx@a930ba8955c6:~/colossal/data$ bash run.sh
WARNING:torch.distributed.run:
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
Traceback (most recent call last):
  File "/home/xxx/colossal/ColossalAI-Examples/language/gpt/train_gpt.py", line 17, in <module>
    from colossalai.zero.init_ctx import ZeroInitContext
ModuleNotFoundError: No module named 'colossalai.zero.init_ctx'
Traceback (most recent call last):
```
I have installed ColossalAI both from source and via pip, but I don't know what might be causing this. Any thoughts?
Thanks,
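One thing worth noting: installing both from source and via pip can leave two copies of the package on `sys.path`, with the older one shadowing the newer. A standard-library-only sketch to see which file Python actually resolves a module from (the module name to probe is up to you):

```python
import importlib.util

def resolve_origin(name):
    """Return the filesystem path Python would import `name` from,
    or None if the module cannot be found at all."""
    spec = importlib.util.find_spec(name)
    return None if spec is None else spec.origin

# Example: check which ColossalAI copy is resolved first on sys.path.
print(resolve_origin("colossalai"))
```

If the printed path points at the pip-installed `site-packages` copy rather than your source checkout (or vice versa), uninstall one of them so only a single, up-to-date build remains.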