Well, I think it's about time to at least start thinking about the elephant in the room. WDTE uses a lot of RAM very quickly. Optimization hasn't really been a big priority during the initial implementation phase, since things have been changing around so much, but I think the base language itself is pretty fully worked out at this point, so it might make sense to start trying to figure out a good, clean way to fix this little issue.
I recently attempted to implement the Rosetta Code problem for picking a random line from a file. Part of the problem involves picking lines something like 10,000,000 times and seeing how things work out. Well, when I tried to run a test of the WDTE script for doing it, it ran for well over an hour and had still only gotten through the first few thousand iterations. It was also using several gigabytes of RAM. Speed has never been a particular priority of the language, but this is clearly unacceptable.