
In-Memory to On-Disk Database

 
Ram Raju
Greenhorn
Posts: 17
I have an application which receives bursts of data that it has to write to a database. The frequent calls are getting expensive. I want to write first to an in-memory database and then do a batch update from it to the on-disk database, triggered by size/number of records or by time. I am hoping this would greatly improve performance. Any suggestions, or can anyone point me to a similar implementation? Any help on how to proceed is appreciated.
 
Ilja Preuss
author
Sheriff
Posts: 14112
Did you already take a look at the batch processing feature of PreparedStatement?
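For reference, here is a minimal sketch of what JDBC batching could look like for this case. The table name, column name, and batch size are made up for illustration; the real schema would come from the poster's application:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class BatchWriter {

    // Splits the pending messages into fixed-size batches so that each
    // executeBatch() call stays a bounded size.
    public static List<List<String>> chunk(List<String> messages, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < messages.size(); i += batchSize) {
            batches.add(messages.subList(i, Math.min(i + batchSize, messages.size())));
        }
        return batches;
    }

    // Sends each batch of inserts to the database in one round trip
    // via PreparedStatement batching.
    public static void insertAll(Connection conn, List<String> messages, int batchSize)
            throws SQLException {
        String sql = "INSERT INTO messages (body) VALUES (?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (List<String> batch : chunk(messages, batchSize)) {
                for (String m : batch) {
                    ps.setString(1, m);
                    ps.addBatch();
                }
                ps.executeBatch(); // one round trip per batch instead of per row
            }
        }
    }
}
```

With MySQL's Connector/J driver it is also worth checking the `rewriteBatchedStatements=true` connection property, which lets the driver rewrite a batch into a multi-row insert.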
 
Stan James
(instanceof Sidekick)
Ranch Hand
Posts: 8791
Just curious ... how does "expensive" manifest itself? What are the symptoms?
 
Ram Raju
Greenhorn
Posts: 17
Thank you for your replies. I receive data as messages over various channels, which I persist in a database, so I don't know when the next message may come, if it comes at all. When I receive a burst (say > 500) of messages, the write to the database lags significantly (it takes a couple of minutes to write 500 times to MySQL).
 
Ilja Preuss
author
Sheriff
Posts: 14112
Originally posted by Ram Raju:
When I receive a burst (say > 500) of messages, the write to the database lags significantly (it takes a couple of minutes to write 500 times to MySQL).


That is far too long. I doubt that the database is the bottleneck in this case.

Did you already profile your code?
 
Roger Chung-Wee
Ranch Hand
Posts: 1683
Maybe it is just a case of opening a new Connection for each update. Can you please post your code?
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
I hope you have separated the Thread and code for writing to the database from the code that collects updates. Trying to do a batch update with a collection of messages that keeps expanding sounds like trouble waiting to happen.
Why would you want an "in-memory database" anyway? Why not just save the raw (or partially processed) form of the update messages in a collection of some sort; when it is time to update the database, pass the collection to a separate Thread and start a new collection.
Bill
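The separation Bill describes can be sketched with a `BlockingQueue` from `java.util.concurrent`: the channel listeners only enqueue, and a dedicated writer thread drains whatever has accumulated and does one batch write per drain. The class and method names here are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class MessageBuffer {

    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // Called by the channel listeners; cheap and never touches the database.
    public void offer(String message) {
        queue.offer(message);
    }

    // Called by the writer thread: waits up to maxWaitSeconds for at least
    // one message, then drains everything that has accumulated so far,
    // up to maxBatch messages. Returns an empty list on timeout.
    public List<String> nextBatch(int maxBatch, long maxWaitSeconds)
            throws InterruptedException {
        List<String> batch = new ArrayList<>();
        String first = queue.poll(maxWaitSeconds, TimeUnit.SECONDS);
        if (first == null) {
            return batch;
        }
        batch.add(first);
        queue.drainTo(batch, maxBatch - 1);
        return batch;
    }
}
```

The writer thread would loop on `nextBatch(...)` and hand each non-empty batch to a single batched insert; `drainTo` copies the pending messages out, which is effectively Bill's "start a new collection" step.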
 
Ram Raju
Greenhorn
Posts: 17
Thank you for your replies. The more I think about it, the more the in-memory database sounds like a bad idea, particularly if the system crashes and there is no way to recover the lost data. I use Hibernate for ORM and let Spring handle the connections, so there is not much I can do there. I will profile the code. Thanks for all the comments.
 
steve souza
Ranch Hand
Posts: 862
I agree that profiling is needed. Several minutes for 500 rows seems VERY slow.

I don't think this is your problem, but thought I would mention that running multiple updates in a transaction can often increase performance.
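In plain JDBC terms, grouping the updates looks like the sketch below: turn off auto-commit so the burst becomes one transaction with a single commit, instead of one implicit commit per statement. (Since the poster uses Spring and Hibernate, the equivalent there would be transaction demarcation rather than hand-written code like this; the helper is only an illustration of the idea.)

```java
import java.sql.Connection;
import java.sql.SQLException;

public class Transactions {

    // Runs the given work as one transaction: a single commit for the whole
    // burst instead of one implicit commit per statement, rolling back on error.
    public static void inTransaction(Connection conn, SqlWork work) throws SQLException {
        boolean previous = conn.getAutoCommit();
        conn.setAutoCommit(false);
        try {
            work.run(conn);
            conn.commit();
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        } finally {
            conn.setAutoCommit(previous); // restore the caller's setting
        }
    }

    public interface SqlWork {
        void run(Connection conn) throws SQLException;
    }
}
```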
 