
Caching record indices

 
Dushy Inguva
Ranch Hand
Posts: 264
Hello all,
I was looking at this thread, and found that people were acquiring a read lock on the db just to do an unlock (because it does a checkRecordPresent).
I have done it slightly differently: I maintain the indices of all the records (free and alive) in memory. Am I using too much memory?
Another thing is caching. I'm using a plain old WeakHashMap for caching (with a synchronized wrapper around it), but I don't completely like it, because access to it is serialized. I have something like Philippe Maquet's MultiReadSingleWriteSynchronizer myself, but mine is called LockManager (yes, I use the same thing to access the database). Since my Cache is an interface, I could provide a more elaborate caching mechanism. Right now I do not do any cache size control (WeakHashMap, remember?).
Do I need to? Ideally, I could support both forms of caching and allow the user to choose one or the other in the database GUI.
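For readers following along, the setup described above might look roughly like this sketch. The interface and class names (Cache, DefaultCache) come from this thread, but the bodies are my own assumption of what such a minimal implementation could be:

```java
import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;

// Hypothetical sketch of the Cache interface mentioned above.
interface Cache<K, V> {
    V get(K key);
    void put(K key, V value);
}

// A default implementation backed by a synchronized WeakHashMap,
// as described in the post. Note that access is serialized: only
// one thread at a time can read or write the map.
class DefaultCache<K, V> implements Cache<K, V> {
    private final Map<K, V> map =
            Collections.synchronizedMap(new WeakHashMap<K, V>());

    public V get(K key) {
        return map.get(key);
    }

    public void put(K key, V value) {
        map.put(key, value);
    }
}
```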
Any thoughts ?
Thanks
Dushy
 
Andrew Monkhouse
author and jackaroo
Marshal Commander
Posts: 12007
Hi Dushy,
I have done it slightly differently: I maintain the indices of all the records (free and alive) in memory. Am I using too much memory?

For the number of records you currently have? No.
For the number of records that might occur before the IT Manager gets a clue and moves to a commercial database? Probably not.
Another thing is caching. I'm using a plain old WeakHashMap for caching.

Why a WeakHashMap? Surely the records exist in the cache or they don't. What are you having as your WeakReference?
Regards, Andrew
 
Dushy Inguva
Ranch Hand
Posts: 264
Thanks for the reply Andrew,
Why a WeakHashMap? Surely the records exist in the cache or they don't. What are you having as your WeakReference?

My DefaultCache implementation just puts the keys and values in a WeakHashMap. I am not explicitly using any WeakReferences, but I make sure that the keys are not strongly referenced from anywhere else. That should be fine, right?
But right now it is synchronized. I am not sure whether I have to go for a more elaborate caching mechanism which allows:
1. Concurrent read / exclusive write
2. Better cache size control (an LRU algorithm?)
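For reference, both points can be sketched with standard library pieces: a LinkedHashMap in access order gives LRU eviction via removeEldestEntry, and a ReentrantReadWriteLock gives concurrent readers with exclusive writers. This is my own illustrative combination (the LruCache name is hypothetical, not from the thread), and it shows a subtle interaction between the two ideas:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Hypothetical sketch: an LRU-bounded cache guarded by a
// read-write lock.
class LruCache<K, V> {
    private final Map<K, V> map;
    private final ReadWriteLock lock = new ReentrantReadWriteLock();

    LruCache(final int maxEntries) {
        // accessOrder = true: iteration order is least-recently-used first
        this.map = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxEntries;
            }
        };
    }

    V get(K key) {
        // Caveat: with accessOrder = true, get() reorders the map, so it
        // is structurally a write and needs the write lock. A read lock
        // would only be safe with insertion-order (FIFO) eviction.
        lock.writeLock().lock();
        try {
            return map.get(key);
        } finally {
            lock.writeLock().unlock();
        }
    }

    void put(K key, V value) {
        lock.writeLock().lock();
        try {
            map.put(key, value);
        } finally {
            lock.writeLock().unlock();
        }
    }

    int size() {
        // Pure reads like size() can share the read lock concurrently.
        lock.readLock().lock();
        try {
            return map.size();
        } finally {
            lock.readLock().unlock();
        }
    }
}
```

The caveat in get() is worth noting: true LRU ordering makes every lookup a structural modification, which limits how much a multi-read/single-write lock can actually buy you.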
Thanks
Dushy
 
Andrew Monkhouse
author and jackaroo
Marshal Commander
Posts: 12007
Hi Dushy,
I think a WeakHashMap is wrong for what you are doing.
You could add all your records to the WeakHashMap, and ten seconds later the garbage collector might run and remove them all, since there is no strong reference to any of the keys.
Perhaps a better solution for you would be to use a SoftReference. From the API: "Soft reference objects, which are cleared at the discretion of the garbage collector in response to memory demand. Soft references are most often used to implement memory-sensitive caches." Does that sound like what you are trying to achieve?
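A memory-sensitive cache along those lines might look like this sketch (the SoftCache name and layout are my own assumption, not code from the thread). The values are held through SoftReferences, which the garbage collector clears only under memory pressure, unlike the weakly referenced keys of a WeakHashMap, which can vanish at any collection:

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a SoftReference-based cache. Values survive
// ordinary GC cycles and are reclaimed only when memory runs low.
class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new HashMap<>();

    synchronized void put(K key, V value) {
        map.put(key, new SoftReference<>(value));
    }

    synchronized V get(K key) {
        SoftReference<V> ref = map.get(key);
        if (ref == null) {
            return null;
        }
        V value = ref.get();
        if (value == null) {
            map.remove(key); // the reference was cleared by the GC
        }
        return value;
    }
}
```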
Regards, Andrew
 
Dushy Inguva
Ranch Hand
Posts: 264
Thanks Andrew,
Oops... I should be more thorough in reading the docs.
Thanks
Dushy
 