PHP script timeout #4
Comments
Hi! Have you done any work on modifying the code? I don't think it would be too difficult to accomplish and test in a day. I too think this could be optimized. Here is an interesting read:
The reason it wasn't implemented that way is to leverage the "power" of simultaneous chunk uploads. The issue is that, apparently, uploading multiple chunks simultaneously doesn't help in most scenarios. Still a great library, just possibly not what you were looking for.
@arthur-white Definitely a great library. Thanks for the link, exactly my thoughts. I haven't done any work on the PHP code yet. My PHP skills are very limited, so I don't think I can modify the PHP code the way I want. I can help with modifying the JS code though, but I'm not sure whether behavior like that is intended by the developer at all. If it is, it will probably be implemented in the next version.
It is not terribly hard, although daunting to a novice PHP dev. I'm not an expert, but I think it should be as simple as getting the chunk (its path is stored in $_FILES['file']['tmp_name']) and appending it to what has been uploaded so far. Something like this could work:
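A minimal sketch of the append step described above (untested; in a real handler the chunk path would come from `$_FILES['file']['tmp_name']`, and the target file name would be derived from a request parameter such as ngFlow's `flowFilename` — that naming is an assumption here):

```php
<?php
// Append one uploaded chunk to the file assembled so far.
function appendChunk(string $chunkPath, string $targetPath): void
{
    $in  = fopen($chunkPath, 'rb');
    $out = fopen($targetPath, 'ab'); // append mode: adds to what's already uploaded
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
}
```

Because each request only appends one chunk, no single request has to do the expensive final merge.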
Of course, now you have the job of making the files upload one at a time (easy, set simultaneousUploads to 1) and doing the same for chunks. It should be pretty straightforward, as ngFlow already breaks the file into chunks on the client side. If you need assistance, feel welcome to ask on stackoverflow.com and link me to the question, I'll debug it with you (I might need this myself in the near future with some luck). You can also check the code vvllaadd posted at the bottom of that thread; I'm not sure if it works, but it might be ready for production.
It's probably just me, but uploading larger files (3GB+) is not really possible currently. Chunking files and uploading each chunk to the server works just fine, even for very large files. But the larger the file, the more chunks need to be put together by the server. This process seems to be very time-consuming (in PHP), and if it takes too long the script just times out.
For some people it's possible to just increase the timeout limit, but even then I'm not sure that's the best solution. So I'm wondering if we could bypass this problem by uploading the chunks in the correct order and appending each one to a "temp" file as soon as it's successfully uploaded. In that case there shouldn't be a timeout problem at all, no matter how large the file is. At least with my very limited understanding of PHP.
If there aren't any drawbacks I'm wondering if you could imagine adding something like that?
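The ordered-append approach proposed above could be sketched like this (untested assumptions: chunk numbers start at 1, the client sends chunks strictly in order because simultaneousUploads is 1, and the `flowChunkNumber`/`flowFilename`-style parameters are whatever your client actually sends). Appending each chunk on arrival keeps every request short, so the long final merge step that triggers the script timeout disappears:

```php
<?php
// Append a chunk only if it is the next one expected; a sidecar ".count"
// file records how many chunks have been written so far.
function handleChunk(string $chunkPath, string $targetPath, int $chunkNumber): bool
{
    $countFile = $targetPath . '.count';
    $done = file_exists($countFile) ? (int) file_get_contents($countFile) : 0;
    if ($chunkNumber !== $done + 1) {
        return false; // out of order; the client should retry this chunk
    }
    file_put_contents($targetPath, file_get_contents($chunkPath), FILE_APPEND);
    file_put_contents($countFile, (string) $chunkNumber);
    return true;
}
```

A real handler would call this with `$_FILES['file']['tmp_name']` as the chunk path and respond with an error status on `false` so the client retries.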