The original issue was created in
https://coderanch.com/t/448993/Performance/java/JAVA-HEAP-SIZE-files-byte, but I want to move this issue here, as it's more I/O-related:
I have a file that should be converted into a byte[] array.
Everything works fine until the file is larger than the maximum
Java heap size, at which point this error occurs:
Caused by: java.lang.OutOfMemoryError: Java heap space
I use the Apache Commons FileUtils.readFileToByteArray() method to convert the File to a byte[].
The Apache Commons IOUtils.toByteArray(new FileInputStream(myfile)) call also throws a java.lang.OutOfMemoryError.
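For reference, here is a minimal plain-JDK sketch of what both calls do internally (using Java 7's Files.readAllBytes as a stand-in for the Commons IO methods; the class name and method are my own):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WholeFileRead {
    // Loads the entire file into a single byte[]. The array must be
    // as large as the file, so any file bigger than the free heap
    // fails with java.lang.OutOfMemoryError -- the same failure mode
    // as FileUtils.readFileToByteArray() and IOUtils.toByteArray().
    static byte[] readWholeFile(Path file) throws IOException {
        return Files.readAllBytes(file);
    }
}
```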
I know I could set the Java heap space to a higher value, but that is not always possible.
Is there a way to handle such big files without exceeding the Java heap space?
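The kind of processing I imagine would be acceptable is something like the following sketch, where the file is streamed through a small fixed buffer instead of being materialized as one byte[] (the class and method names here are hypothetical):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkedRead {
    // Streams the file through an 8 KiB buffer; heap usage stays at
    // the buffer size regardless of how large the file is.
    static long countBytes(Path file) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        try (InputStream in = Files.newInputStream(file)) {
            int n;
            while ((n = in.read(buffer)) != -1) {
                // Each chunk would be processed here, in buffer[0..n),
                // instead of being accumulated into one big array.
                total += n;
            }
        }
        return total;
    }
}
```

But this only works if the consumer can accept the data chunk by chunk, which is exactly what I am unsure how to arrange.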
I also tried the approach from
http://balusc.blogspot.com/2007/07/fileservlet.html, but without success.
I also tried it this way:
but a java.lang.OutOfMemoryError still occurs.
Is there a way,
maybe with the NIO API, to handle such scenarios efficiently? java.nio offers ByteChannel and FileChannel; maybe transferring from a FileChannel to a ByteChannel would work, but how?
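To make the channel idea concrete, here is a sketch of what I have in mind with FileChannel.transferTo, which copies to any WritableByteChannel without a file-sized array ever existing on the Java heap (the class and method names are my own):

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.channels.WritableByteChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ChannelCopy {
    // Copies the whole file to the destination channel. transferTo
    // may transfer fewer bytes than requested, so it is called in a
    // loop until the full size has been moved.
    static long copy(Path source, WritableByteChannel dest) throws IOException {
        try (FileChannel src = FileChannel.open(source, StandardOpenOption.READ)) {
            long pos = 0;
            long size = src.size();
            while (pos < size) {
                pos += src.transferTo(pos, size - pos, dest);
            }
            return size;
        }
    }
}
```

But this copies to another channel; I still do not see how to get from here to the byte[]-consuming API I have to feed.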