My application needs to parse some files that will generate more than 1 million records in the database.
Using Hibernate's StatelessSession with PostgreSQL, I've reached a time of 70 seconds for 1 million records (the entity has 4 character columns and a bigint primary key).
I have 2 questions:
1) Since I haven't worked with this much data until now, do you think this is a good time?
2) Do you have any suggestions for improving this time?
William P O'Sullivan wrote:Is this just the parsing process?
WP
For now I have just created one million objects similar to what I will get from the files and saved them. I'm mainly concerned about the database insert process. Reading from the files will be fast, and if there are any problems I can use multithreading for the parsing, but the database insert (the Java object -> database row step) is usually the bottleneck.
Yes, it's in memory. I'm just instantiating the objects in a for loop, setting some of their values based on the loop index, and saving them to the database.
I know that the file processing will add some extra time, but right now I'm more concerned about the saving-to-database step; I have lots of ideas for speeding up the file processing, but for the database save this is all I have so far.
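For reference, my insert loop looks roughly like this (a sketch only; the entity name, its setter, and the SessionFactory setup are placeholders, and transaction handling is minimal):

```java
// Sketch of a bulk insert with Hibernate's StatelessSession.
// StatelessSession bypasses the first-level cache and dirty checking,
// so memory stays flat even over a million inserts.
// MyRecord and sessionFactory are placeholder names.
StatelessSession session = sessionFactory.openStatelessSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < 1_000_000; i++) {
    MyRecord record = new MyRecord();
    record.setCode("c" + (i % 100)); // values derived from the loop index
    session.insert(record);          // issues the INSERT immediately
}
tx.commit();
session.close();
```

Unlike a regular Session, there is nothing to flush or clear here, which is why it suits this kind of one-shot bulk load.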
Using multithreading and StatelessSession I have now reached 35 seconds for 1 million records. It could go even lower with more threads, but for now this is enough.
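The multithreaded version just partitions the ID range and gives each worker its own StatelessSession (Hibernate sessions are not thread-safe, so they must not be shared). The skeleton below is runnable as-is because the actual insert is replaced by a counter, with the real Hibernate calls shown in comments; the class and method names are mine:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class ParallelInsertDemo {
    static final AtomicLong inserted = new AtomicLong();

    // Each worker handles the half-open range [start, end) with its own session.
    static void insertRange(long start, long end) {
        // StatelessSession session = sessionFactory.openStatelessSession();
        // Transaction tx = session.beginTransaction();
        for (long i = start; i < end; i++) {
            // session.insert(buildRecord(i));
            inserted.incrementAndGet(); // stand-in for the real insert
        }
        // tx.commit();
        // session.close();
    }

    public static void main(String[] args) throws Exception {
        long total = 1_000_000;
        int threads = 4;
        long chunk = total / threads;
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            long start = t * chunk;
            long end = (t == threads - 1) ? total : start + chunk;
            pool.submit(() -> insertRange(start, end));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println(inserted.get());
    }
}
```

Splitting by range (rather than handing out records one at a time) keeps the workers independent, so there is no contention apart from whatever the database itself serializes.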