java -jar "%DB_HOME%\lib\derbyrun.jar" server shutdown
If I instead continue to add objects, I eventually get an OutOfMemoryError:
c:\Program Files (x86)\Apache Software Foundation\Tomcat 5.5\webapps\Bball\WEB-INF\classes>java -jar "c:\Program Files (x86)\Sun\JavaDB\lib\derbyrun.jar" server start
Security manager installed using the Basic server security policy.
Apache Derby Network Server - 10.4.2.0 - (689064) started and ready to accept connections on port 1527 at 2009-07-12 03:26:03.708 GMT
Exception in thread "DRDAConnThread_4" java.lang.OutOfMemoryError: Java heap space
at org.apache.derby.iapi.error.ErrorStringBuilder.appendln(Unknown Source)
at org.apache.derby.iapi.services.context.ContextManager.cleanupOnError(Unknown Source)
at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
Do I need to do something like:
inside the writeExternal() method? Or is there something else that needs to happen to prevent this exception?
That is, the real error is being obscured: Derby is running out of memory while building the error message itself, probably because it is trying to dump a query that is itself massive.
The short-term fix is to increase the amount of memory allotted to the JVM. You're probably running at the 64 MB default client limit.
Then you can read the full error message and figure out the real problem.
Or it might simply be that you need 100 MB of RAM instead of 64 MB, in which case increasing the memory is itself the fix.
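For example, the answer's suggestion amounts to passing `-Xmx` when launching the server. A sketch, reusing the `%DB_HOME%` variable from the shutdown command above; the 256 MB figure is illustrative, not a recommendation from the thread:

```shell
REM Start the Derby network server with a 256 MB maximum heap
REM instead of the JVM default; raise -Xmx further if the error persists.
java -Xmx256m -jar "%DB_HOME%\lib\derbyrun.jar" server start
```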
Using the Task Manager I found that the database process was using about 1 GB of memory, which doesn't seem right, since when I shut it down, the hard drive space it used dropped to a few tens of MBs.
I've since found that calling out.flush() at the end of each class's writeExternal() method decreases the memory usage to about 500 MB, but that still seems very high for a database holding about 20 MB of data. If I shut the database down, reopen it, and restart my processing (updating and writing to it), its memory usage quickly grows back to about 500 MB.
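The flush-at-the-end-of-writeExternal pattern described above can be sketched as follows. This is a minimal, self-contained example; the class name `Roster` and its field are hypothetical placeholders, not taken from the question's code:

```java
import java.io.ByteArrayOutputStream;
import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.io.ObjectOutputStream;

// Hypothetical Externalizable class illustrating a flush at the end of
// writeExternal(), as described in the question.
public class Roster implements Externalizable {
    private String name = "";

    // Externalizable requires a public no-arg constructor for deserialization.
    public Roster() {}

    public Roster(String name) { this.name = name; }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeUTF(name);
        out.flush();  // push buffered bytes through to the underlying stream
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException {
        name = in.readUTF();
    }

    public static void main(String[] args) throws IOException {
        // Serialize one instance to an in-memory buffer to show the pattern.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(buf)) {
            oos.writeObject(new Roster("team"));
        }
        System.out.println("serialized " + buf.size() + " bytes");
    }
}
```

Note that flush() only moves already-buffered bytes along; it does not release references the ObjectOutputStream retains internally, which is one reason memory can still climb across many writes.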
A major portion of the database access is updating BLOBs (which together take about 3-4 MB of disk space). I've noticed that the updating process gets faster as it runs. Could it be that the database is buffering all the structures written to it? If so, is it reasonable for those buffers to be this much larger than the data itself? And as my database grows, won't the buffering cause more OutOfMemoryErrors?
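Regarding the buffering question: Derby does keep a cache of database pages in memory, and its size is bounded by the derby.storage.pageCacheSize property (a page count, default 1000 pages), so the cache does not grow without limit as the database grows. A hedged sketch of capping it in derby.properties; the value 500 is illustrative only:

```
# derby.properties (in the Derby system directory)
# Cap the page cache at 500 pages instead of the default 1000.
derby.storage.pageCacheSize=500
```

Note this caps only the page cache; memory held by per-connection serialization buffers is a separate matter.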