William Brogden wrote:It occurs to me that you should be building your extract from the database as a Serializable HashMap in the first place - skipping any creation and parsing of strings and never writing that text file.
William Brogden wrote:Let the order of items in the String be that of the database columns and you essentially have high-speed in-memory lookup of a row.
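A minimal sketch of this suggestion: build the map straight from the query results and serialize it, skipping the text file entirely. The class name and sample rows below are illustrative; in the real extract each `put` would pull its key and value from a JDBC `ResultSet`.

```java
import java.io.*;
import java.util.HashMap;

public class ExtractCache {
    // Build the lookup map directly (sample rows stand in for ResultSet
    // columns here) and round-trip it through Java serialization instead
    // of writing and re-parsing a text file.
    @SuppressWarnings("unchecked")
    public static HashMap<String, String> roundTrip() throws IOException, ClassNotFoundException {
        HashMap<String, String> map = new HashMap<>();
        // In the real extract: map.put(rs.getString(1), rs.getString(2) + "|" + ...)
        map.put("key1", "colA|colB");
        map.put("key2", "colC|colD");

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(map);           // HashMap is Serializable
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (HashMap<String, String>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip().get("key1"));
    }
}
```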
William Brogden wrote:If we really have to optimize the memory used, let's look at the need for Strings in the first place. If all your characters are in the ASCII set, i.e., one byte, perhaps your hash value could hold a byte instead of a String, using half the memory. The keys would still be String but the value would only be turned into a String as needed.
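One way this idea could look in practice (a sketch; the class name is made up, and it assumes each value is a single ASCII character):

```java
import java.util.HashMap;

public class ByteValueMap {
    // Store single-byte ASCII values as Byte instead of String;
    // only convert back to a String when the caller asks for one.
    private final HashMap<String, Byte> map = new HashMap<>();

    public void put(String key, String value) {
        map.put(key, (byte) value.charAt(0)); // assumes a one-character ASCII value
    }

    public String get(String key) {
        Byte b = map.get(key);
        return b == null ? null : String.valueOf((char) (b & 0xFF));
    }
}
```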
William Brogden wrote:
How big is the file of extracted database rows? That should give an idea of maximum memory use.
fred rosenberger wrote:Do you need to read the whole file into memory? Can you read one line, process it, then read the next?
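The streaming approach suggested here can be sketched like this (the method name is illustrative, and it reads from a String so it is self-contained; in practice the reader would wrap a `FileReader` over the extract file):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class LineProcessor {
    // Process the extract one line at a time instead of loading the
    // whole file into memory: only the current line is ever held.
    public static int processAll(String content) throws IOException {
        int processed = 0;
        try (BufferedReader reader = new BufferedReader(new StringReader(content))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // real per-line work (parse, look up, write result) goes here
                processed++;
            }
        }
        return processed;
    }
}
```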
Kees Jan Koster wrote:Dear Max,
How much memory are you giving this application? How much of that is used? A simple test might be to double the heap space for the app and see if that brings down the GC times.
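As a concrete illustration of that test (the heap sizes and application name here are placeholders, not the poster's actual settings), doubling the heap is just doubling `-Xmx`/`-Xms` and comparing the GC output:

```shell
# baseline run with GC logging
java -Xms256m -Xmx256m -verbose:gc MyApp

# same app with the heap doubled, for comparison
java -Xms512m -Xmx512m -verbose:gc MyApp
```

If GC times drop sharply with the larger heap, the application was spending its time collecting a heap that was too small for the working set.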
fred rosenberger wrote:You seem to imply that your hash is working, and that things are fairly well spread out.
William Brogden wrote:there are no database connections that remain open
Do I understand you to say you are opening a new DB connection for each line and then closing it?
Tim Holloway wrote:One thing that's worth looking at is how "perfect" your hashes are. If the Hashtables are sub-optimal, you'll spend more time chasing overflow links.
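A rough way to check how well keys spread out, along the lines suggested here (a sketch; the class name is made up, and it uses the same `hash % capacity` bucketing idea a `Hashtable` uses, not its exact internals):

```java
import java.util.HashSet;

public class HashSpread {
    // Rough hash-quality check: count how many distinct buckets a set
    // of keys would occupy in a table of the given capacity. Far fewer
    // buckets than keys means long overflow chains.
    public static int distinctBuckets(String[] keys, int capacity) {
        HashSet<Integer> buckets = new HashSet<>();
        for (String key : keys) {
            buckets.add((key.hashCode() & 0x7FFFFFFF) % capacity);
        }
        return buckets.size();
    }
}
```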
Tim Holloway wrote:Hopefully, you have some way to compile your rules. If you have to interpret them each time you use them, that can be VERY expensive.
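Regex rules are a common case of this: compiling the `Pattern` once and reusing it avoids re-parsing the rule on every input. A minimal sketch (the class name and the pattern itself are illustrative):

```java
import java.util.regex.Pattern;

public class CompiledRule {
    // Compile the rule once at class load; each call only runs the
    // already-compiled matcher instead of re-interpreting the pattern.
    private static final Pattern RULE = Pattern.compile("\\d{4}-\\d{2}");

    public static boolean matches(String input) {
        return RULE.matcher(input).matches();
    }
}
```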
William Brogden wrote:How many database connections do you have open "at one time"?
Are you using a connection pool?
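The point of a pool is that connections are created once and borrowed/returned, not opened and closed per line. A toy generic sketch of the borrow/return mechanic (illustrative only; a real application would use an established pooling library rather than this):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

public class SimplePool<T> {
    // Minimal borrow/return pool: resources (e.g. DB connections) are
    // created once up front and reused, instead of being opened and
    // closed for every row processed.
    private final BlockingQueue<T> idle;

    public SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(factory.get());
        }
    }

    public T borrow() throws InterruptedException { return idle.take(); }

    public void release(T resource) { idle.offer(resource); }

    public int available() { return idle.size(); }
}
```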