Hello colinmarc, I am encountering a problem reading a specific number of bytes, say 128MB, from a FileReader.
The file is 1000MB, and I tried to read it in a loop, 128MB per iteration (except the last one),
but I found that each iteration only returned 128KB.
I also tried reading 100KB per iteration,
but not every iteration returned 100KB consistently; some calls only returned 8192KB.
I found this comment in block_read_stream.go:
// Always align reads to a chunk boundary. This makes the code much simpler,
// and with readers that pick sane read sizes (like io.Copy), should be
// efficient.
I am wondering if the inconsistent Read behavior is related to the chunk-boundary alignment mentioned above. If it is, I would also like to know how that makes io.Copy more efficient.
I also tried ReadAt, which satisfies my need, but I am still curious about the underlying logic of func Read. Thank you!
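For reference, here is a minimal sketch of the kind of loop I mean (the namenode address and file path are placeholders). Since a single Read is allowed to return fewer bytes than the buffer size under the io.Reader contract, wrapping it in io.ReadFull from the standard library, which keeps calling Read until the buffer is full, would be another way to get exact 128MB chunks:

```go
package main

import (
	"fmt"
	"io"
	"log"

	"github.com/colinmarc/hdfs"
)

func main() {
	// Placeholder address and path, just for illustration.
	client, err := hdfs.New("namenode:8020")
	if err != nil {
		log.Fatal(err)
	}
	file, err := client.Open("/tmp/bigfile")
	if err != nil {
		log.Fatal(err)
	}
	defer file.Close()

	buf := make([]byte, 128*1024*1024) // 128MB per iteration
	for {
		// io.ReadFull loops over Read until buf is full, so
		// short reads from individual Read calls don't matter.
		n, err := io.ReadFull(file, buf)
		fmt.Printf("read %d bytes\n", n)
		if err == io.EOF || err == io.ErrUnexpectedEOF {
			break // reached end of file
		}
		if err != nil {
			log.Fatal(err)
		}
	}
}
```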