
Problem processing large amounts of data #78

Open
Losses opened this issue Jun 17, 2017 · 1 comment

Comments

@Losses

Losses commented Jun 17, 2017

I'm working on a neuroscience study whose goal is to calculate the wavelet transform coherence (WTC) of brain signals between two people. The data contain 34,552 points for each person.
When I ran the calculation on Windows, R reported:

Error: cannot allocate vector of size 9.5 Gb

On CentOS, the R process was killed by the OOM killer.

Is there any way to finish the calculation?
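
For context, a minimal sketch of the kind of call involved, assuming the wtc() function from the biwavelet R package (the issue does not show the original code, and the signals below are random placeholders):

```r
# Minimal sketch, assuming biwavelet::wtc(); the study's actual data
# and call are not shown in this issue.
library(biwavelet)

n  <- 34552                      # points per participant, as reported above
t  <- seq_len(n)
d1 <- cbind(t, rnorm(n))         # placeholder signal for person 1
d2 <- cbind(t, rnorm(n))         # placeholder signal for person 2

# For series this long, the scale-by-time matrices built during the
# transform and the Monte Carlo significance test (nrands) can exceed
# the RAM of a typical desktop, producing the allocation error above.
res <- wtc(d1, d2)
```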

@Gafre22

Gafre22 commented Nov 1, 2024

I recently had this issue; all I did was find more computing power. My university has a virtual computing network that allowed me to use more resources to process the WTC of 50k data points.
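
If more computing power is not an option, a possible workaround (a sketch only, not validated on this data) is to shrink the problem before calling wtc(): downsample the signals, reduce the Monte Carlo randomizations via nrands, and/or coarsen the scale resolution via dj. All of these trade precision for memory and run time.

```r
# Hedged sketch of reducing memory before calling wtc(); the factor and
# parameter values are illustrative, not recommendations for this study.
library(biwavelet)

n  <- 34552
t  <- seq_len(n)
d1 <- cbind(t, rnorm(n))                     # placeholder signals, as above
d2 <- cbind(t, rnorm(n))

decimate_by <- 4                             # illustrative downsampling factor
keep <- seq(1, n, by = decimate_by)          # (a proper decimation would
d1_small <- d1[keep, ]                       #  low-pass filter first)
d2_small <- d2[keep, ]

# Fewer randomizations (nrands) and a coarser scale resolution (dj)
# shrink the intermediate matrices and the significance-test loop.
res <- wtc(d1_small, d2_small, nrands = 100, dj = 1/6)
```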
