
Efficient IO for large files (as large as 20 to 40 MB)

Hi,
I was wondering if there is a way of doing IO efficiently on large files. Generally, we use buffered streams, which read/load everything into memory; we then perform the transformation and write the transformed file back. The problem is that if hundreds of users are working on the same file, or on different large files, concurrently, it can take up a lot of memory to load all the files and perform the transformations. I thought of using file streams, but that can slow down the entire processing because of reads/writes from disk. I was wondering if there is an efficient way of doing this in memory itself. Any help would be appreciated.
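For reference, what we do today is roughly the following. This is only a simplified sketch; the file names are made up and transform stands in for our actual processing, but the point is that every request holds the whole file (plus a transformed copy) in memory:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class WholeFileTransform {

    // Placeholder for whatever transformation we actually apply.
    static byte[] transform(byte[] data) {
        return data;
    }

    public static void main(String[] args) throws IOException {
        // The whole 20-40 MB file is held in memory here...
        byte[] data = Files.readAllBytes(Paths.get("input.dat"));
        // ...plus a second copy for the transformed result,
        // and that multiplies with every concurrent user.
        byte[] transformed = transform(data);
        Files.write(Paths.get("output.dat"), transformed);
    }
}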
 

Ak Rahul wrote: Hi,
Generally, we use buffered streams, which read/load everything into memory.



Not true. Buffered streams hold some of a file or stream in a buffer in memory, but not usually the whole stream content unless the stream is shorter than the buffer size. If the whole stream content ends up in memory, then you must have allocated that space and filled it yourself.
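For example, if your transformation can be applied a chunk at a time rather than to the whole file at once, something along these lines keeps only a small, fixed buffer in memory per request (BufferedInputStream's default buffer is 8 KB). This is just a rough sketch; the file names and the transformChunk method are placeholders for whatever you actually need to do:

import java.io.*;

public class StreamTransform {

    // Hypothetical placeholder for a transformation that works on one chunk at a time.
    static void transformChunk(byte[] buffer, int length) {
        for (int i = 0; i < length; i++) {
            buffer[i] = (byte) Character.toUpperCase((char) buffer[i]);
        }
    }

    public static void main(String[] args) throws IOException {
        try (InputStream in = new BufferedInputStream(new FileInputStream("input.dat"));
             OutputStream out = new BufferedOutputStream(new FileOutputStream("output.dat"))) {
            byte[] buffer = new byte[8192];          // only 8 KB held in memory at a time
            int n;
            while ((n = in.read(buffer)) != -1) {    // read the next chunk
                transformChunk(buffer, n);           // transform just this chunk
                out.write(buffer, 0, n);             // write it straight back out
            }
        }
    }
}

Whether this works for you depends on whether the transformation needs to see the whole file at once; if it only needs a line or a block at a time, streaming like this keeps memory use per user roughly constant regardless of file size.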