I am using a MySQL database with the org.gjt.mm.mysql.Driver JDBC driver. I have a result set from the db, but when I call getBlob it throws a java.lang.OutOfMemoryError. Has anyone seen this before, or have any suggestions on what may cause it? Thanks
It's most likely caused, as you might expect, by running out of memory. How big a blob are you retrieving? Assuming you have enough memory on your machine, you can increase the amount allocated to Java using the -Xms (for initial heap size) and -Xmx (for maximum heap size) switches to the JVM. Note there is no space between the switch and the value, e.g., java -Xms64m -Xmx128m MyProgram
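A quick way to confirm the switches actually took effect is to ask the JVM for its maximum heap at runtime. A minimal sketch (the class name HeapCheck is just for illustration):

```java
// Run with e.g. "java -Xmx128m HeapCheck" and check that the printed
// figure matches (roughly) the -Xmx value you passed on the command line.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory(); // effective -Xmx
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

If the printed number is smaller than the blob you are fetching, getBlob loading the whole value into memory will fail regardless of the driver.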
I have a project with VERY large blobs (up to 50 MB) and MySQL 5.0.
It is very important that they are put in the database, because their relationship to other domain elements is important to maintain (so storing a pointer to the file on disk is absolutely not an option).
Will I run into any issues with JDBC? I'm using it with 1-2 MB files right now with no problems... but is there a client-side limit to how much data JDBC allows to be sent out from the JVM?
I know that on the server side you can increase the "max_allowed_packet" variable to very high numbers, but are there limitations as far as what you can do from the client side?
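For reference, the server-side variable in question is max_allowed_packet (the client library enforces a matching limit, and Connector/J reads the server's value). A minimal my.cnf fragment, assuming MySQL 5.0, might look like:

```ini
[mysqld]
# Must be at least as large as the biggest single blob you send;
# 64M is an illustrative value, not a recommendation.
max_allowed_packet=64M
```

Any single INSERT or row fetched must fit within this packet size, so for 50 MB blobs it needs to be raised on the server before the client-side question even matters.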
Are Blobs implemented with InputStreams? If so, JDBC could handle an arbitrarily large byte[] as blob content, since InputStreams can be arbitrarily large.
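That is the idea behind the streaming half of the API: on the read side ResultSet.getBinaryStream(col) and on the write side PreparedStatement.setBinaryStream(idx, in, length) let a conforming driver move blob content through a fixed-size buffer instead of materializing the whole value. A sketch of that chunked copy, with an in-memory byte[] standing in for the database so it runs without a connection:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Demonstrates the streaming pattern a spec-conforming driver can use:
// only one 8 KB buffer is ever live, no matter how large the blob is.
public class BlobStreamSketch {
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192]; // fixed working buffer
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] fakeBlob = new byte[1 << 20]; // 1 MB stand-in for blob content
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(fakeBlob), sink);
        System.out.println("copied " + copied + " bytes");
    }
}
```

Whether a given driver actually streams or buffers the whole value internally is driver-specific, which is exactly the "cheating" concern raised below.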
It wouldn't surprise me if some drivers somewhere don't fully conform to the spec, "cheat," and really load the full Blob into memory, but most of the major DBs do it right. You might test this early in your project, though...