Hello,
I have a WAR application with a page where users can upload files (java.io.File).
These files are converted into a byte[] array and stored as a BLOB in the database. This works fine, but when the file is larger than the maximum Java heap size, an error occurs:
I use Apache Commons' FileUtils.readFileToByteArray() method to convert the File into a byte[]. Apache Commons' IOUtils.toByteArray(new FileInputStream(myfile)) also throws a java.lang.OutOfMemoryError.
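To show what I mean, my conversion is roughly equivalent to the plain-Java sketch below (Files.readAllBytes behaves like FileUtils.readFileToByteArray here): the whole file has to fit on the heap at once, which is exactly where it blows up for large uploads.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class UploadOom {
    // Reads the entire file into a single byte[], equivalent to
    // FileUtils.readFileToByteArray(): the whole file must fit on
    // the heap, so files larger than the heap trigger
    // java.lang.OutOfMemoryError.
    static byte[] toByteArray(Path file) throws IOException {
        return Files.readAllBytes(file);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("upload", ".bin");
        Files.write(tmp, new byte[]{1, 2, 3});
        byte[] content = toByteArray(tmp);
        System.out.println(content.length); // 3
        Files.delete(tmp);
    }
}
```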
I know I could set the Java heap space to a higher value, but that is not always possible. Is there a way to handle such big files without exceeding the Java heap space?
I also tried the approach from http://balusc.blogspot.com/2007/07/fileservlet.html, but without success.
Is there a way, maybe with the new NIO API, to handle such scenarios in a performant way?
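What I imagine is processing the upload in fixed-size chunks instead of materializing a byte[], roughly like the sketch below. The JDBC call mentioned in the comment (PreparedStatement.setBinaryStream) is only my guess at how the database side could take over the streaming; the sketch itself just demonstrates that a bounded buffer keeps heap usage constant regardless of file size.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamUpload {
    // Copies the stream in fixed-size chunks, so only buffer.length
    // bytes are ever on the heap at once, regardless of file size.
    // With JDBC, the InputStream could instead be handed to
    // PreparedStatement.setBinaryStream(1, in, Files.size(file))
    // so the driver streams the data into the BLOB column directly.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("src", ".bin");
        Path dst = Files.createTempFile("dst", ".bin");
        Files.write(src, new byte[100_000]); // 100 KB test payload
        try (InputStream in = Files.newInputStream(src);
             OutputStream out = Files.newOutputStream(dst)) {
            System.out.println(copy(in, out)); // 100000
        }
        Files.delete(src);
        Files.delete(dst);
    }
}
```

Would something along these lines work, or is there a better NIO-based way?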