
PHP script timeout #4

Open
demrks opened this issue Sep 9, 2014 · 4 comments
demrks commented Sep 9, 2014

It's probably just me, but uploading larger files (3GB+) is not really possible currently. Chunking files and uploading each chunk to the server works just fine, even for very large files. But the larger the file, the more chunks need to be put together by the server. This process seems to be very time consuming (in PHP), and if it takes too long the script simply times out.

For some people it's possible to just increase the timeout limit, but even then I'm not sure that's the best solution. So I'm wondering if we could bypass this problem by uploading the chunks in the correct order and appending each one to a "temp" file as soon as it has been successfully uploaded. In that case there shouldn't be a timeout problem at all, no matter how large the file is. At least with my very limited understanding of PHP.

If there aren't any drawbacks I'm wondering if you could imagine adding something like that?

@arthur-white

Hi! Have you done any work on modifying the code? I don't think it would be too difficult to implement and test in a day. I also think this could be optimized. Here is an interesting read:
flowjs/flow.js#4
Even if it doesn't resolve this for you, it should at least shed some light on large uploads.

@arthur-white

The reason it wasn't implemented that way is to leverage the "power" of simultaneous chunk uploads. The catch is that, apparently, uploading multiple chunks simultaneously isn't actually helpful in most scenarios.
You can't append chunks as they arrive if you don't know their order, but since parallel uploads don't give a great speed boost anyway, it's possible this will be resolved in the next version (?).

It's still a great library, even if this isn't quite the behavior you were looking for.


demrks commented Oct 8, 2014

@arthur-white Definitely a great library. Thanks for the link, exactly my thoughts. I haven't done any work on the PHP code yet. My PHP skills are very limited, so I don't think I can modify the PHP code the way I want. I can help with modifying the JS code, though I'm not sure whether behavior like that is intended by the developer at all. If it is, it will probably get implemented in the next version.

@arthur-white

It's not terribly hard, though it may seem daunting to a novice PHP dev. I'm not an expert, but I think it should be as simple as getting the chunk (its path is stored in $_FILES['file']['tmp_name']) and appending it to what has been uploaded so far.

Something like this could work:

// untested
$chunk = $_FILES['file']['tmp_name'];                      // temp path of the uploaded chunk
$fd = fopen('uploaded_so_far_' . $uniqueIdentifier, 'ab'); // append in binary mode
fwrite($fd, file_get_contents($chunk));                    // append the chunk's bytes
fclose($fd);

Of course, now you have the job of making the files upload one at a time (easy: set simultaneousUploads to 1) so the chunks also arrive one at a time, in order. It should be pretty straightforward, since ngFlow already breaks the file into chunks on the client side. If you need assistance, feel free to ask on stackoverflow.com and link me to the question; I'll debug it with you (with some luck, I might need this myself in the near future).
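For reference, here's a minimal sketch of the client-side flow.js options that would force sequential chunk uploads so the server can safely append chunks in order. The endpoint URL and chunk size are illustrative assumptions, not values from this thread:

```javascript
// Hypothetical flow.js configuration: one chunk in flight at a time,
// so chunks reach the server in order and can be appended directly.
var flowOptions = {
  target: '/upload.php',       // server endpoint that appends each chunk (assumption)
  simultaneousUploads: 1,      // upload one chunk at a time so order is preserved
  chunkSize: 1 * 1024 * 1024   // 1 MB chunks (assumption)
};
// Usage would then be something like: var flow = new Flow(flowOptions);
```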

You can also check the code vvllaadd posted at the bottom of that thread. I haven't verified that it works, but it might even be ready for production.
