To build a high-performance, memory-efficient caching mechanism, we are caching Java objects in memory.
Below are the code snippets I wrote to compress and decompress Java objects in memory.
I compress each Java object in memory and store the compressed data in a byte array. Whenever the object is needed, I decompress it in memory, use it, and then discard it (let it be garbage collected) when I'm done.
By keeping compressed bytes instead of the live objects, I save a large amount of memory.
While decompressing the data I get a fatal Java error:
Caused by: java.util.zip.ZipException: oversubscribed dynamic bit lengths tree
I don't know what is causing it. I also wrote several sample programs with large data sets, and they all worked fine as standalone applications, but in the J2EE application the same code fails with real data.
Here is the code to compress the data:
    Object obj = new String("hello world"); // some object to cache
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    GZIPOutputStream gos = new GZIPOutputStream(baos);
    ObjectOutputStream oos = new ObjectOutputStream(gos);
    oos.writeObject(obj);
    byte[] compressedData = baos.toByteArray();
    oos.close();
    gos.close();
    baos.close();
And here is the code to decompress it:
    ByteArrayInputStream bais = new ByteArrayInputStream(compressedData);
    GZIPInputStream gis = new GZIPInputStream(bais);
    ObjectInputStream ois = new ObjectInputStream(gis);
    obj = ois.readObject();
    ois.close();
    gis.close();
    bais.close();
    System.out.println(obj);
Does anybody have any idea how to get rid of this problem?
If it is caused by the GZIP input/output streams, is there an alternative way to compress/decompress data and store it as a byte array in memory?
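For reference, here is a minimal sketch of one alternative I am considering, using `java.util.zip.Deflater`/`Inflater` streams directly instead of GZIP (the class and method names are just illustrative, not from my real code):

```java
import java.io.*;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;
import java.util.zip.InflaterInputStream;

public class DeflateCache {

    // Serialize the object, then compress the serialized bytes with Deflater.
    static byte[] compress(Serializable obj) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DeflaterOutputStream dos =
                new DeflaterOutputStream(baos, new Deflater(Deflater.BEST_SPEED));
        ObjectOutputStream oos = new ObjectOutputStream(dos);
        oos.writeObject(obj);
        oos.close(); // closing flushes and finishes the deflater
        return baos.toByteArray(); // read the bytes only after the stream is closed
    }

    // Decompress the byte array and deserialize the object back.
    static Object decompress(byte[] data)
            throws IOException, ClassNotFoundException {
        ObjectInputStream ois = new ObjectInputStream(
                new InflaterInputStream(new ByteArrayInputStream(data)));
        Object obj = ois.readObject();
        ois.close();
        return obj;
    }

    public static void main(String[] args) throws Exception {
        byte[] packed = compress("hello world");
        System.out.println(decompress(packed)); // prints: hello world
    }
}
```

Note that `toByteArray()` is only called after the output stream chain is fully closed, so the compressed footer is guaranteed to be written. Would this kind of approach be more reliable than the GZIP streams?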