I'm working on a neuroscience study; the goal is to calculate the wavelet transform coherence (WTC) between the brain signals of two people. The data contains 34552 points per person.
When I ran the calculation on Windows, R reported:
Error: cannot allocate vector of size 9.5 Gb
On CentOS, the R process was killed by the OOM killer.
Is there any way to complete the calculation?
I recently had this issue; all I did was find more computing power. My university has a virtual computing network that allowed me to use more resources to process the WTC of 50k data points.
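If more hardware isn't available, another option is shrinking the problem itself. Below is a minimal sketch, assuming the WTC is computed with the R biwavelet package's wtc() (an assumption; the thread doesn't name the library, and the signals here are random placeholders). Memory use grows with signal length and scale resolution, so downsampling the series, coarsening the scale grid (dj), capping the largest scale (max.scale), and reducing the significance-test randomizations (nrands) can all cut the footprint:

```r
library(biwavelet)

n  <- 34552                      # points per participant, as in the post
t  <- seq_len(n)
s1 <- cbind(t, rnorm(n))         # placeholder for participant 1's signal
s2 <- cbind(t, rnorm(n))         # placeholder for participant 2's signal

# Option 1: downsample before the transform (here by a factor of 4).
keep  <- seq(1, n, by = 4)
s1_ds <- cbind(seq_along(keep), s1[keep, 2])
s2_ds <- cbind(seq_along(keep), s2[keep, 2])

# Option 2: coarsen the scale grid, cap the largest wavelet scale,
# and use fewer Monte Carlo randomizations for the significance test.
res <- wtc(s1_ds, s2_ds,
           dj        = 1/6,      # coarser than the default 1/12
           max.scale = 1024,     # cap the largest scale examined
           nrands    = 100)      # fewer randomizations than the default 300
plot(res)
```

Whether downsampling is acceptable depends on the frequency band of interest, so check that the decimated sampling rate still resolves it.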