Handling a massive number of records

clojure forum advocate
Posts: 3479
In our application we read one day's worth of records from a MySQL database, make a small change to each retrieved record, and then save the records to a new database.
Please note that each day holds a massive number of records (possibly a million).
What is the best strategy for implementing this?
I'm thinking of using MySQL's LIMIT keyword to read 100 rows at a time; do you have a better approach?
And what about saving each 100 retrieved/modified rows?
Using a JDBC batch?
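The approach described above could be sketched roughly as follows. This is only a sketch, not the original poster's code: the `events` table, its `id` and `day` columns, the `events_copy` target table, and the `transform` step are all hypothetical stand-ins. It uses keyset pagination (`WHERE id > ?` on an indexed auto-increment column) rather than `LIMIT offset, 100`, because a growing OFFSET forces MySQL to scan and discard all the skipped rows, which gets slower and slower over a million records.

```java
import java.sql.*;

public class DayMigration {
    static final int BATCH = 100;

    // Keyset query: seeks past the last id already copied, so the cost
    // of fetching each page stays flat (unlike a growing OFFSET).
    static String pageQuery(String table) {
        return "SELECT id, payload FROM " + table
             + " WHERE id > ? AND day = ? ORDER BY id LIMIT " + BATCH;
    }

    // The "little change" made to each record; trim() is a placeholder.
    static String transform(String payload) {
        return payload.trim();
    }

    public static void migrate(Connection src, Connection dst, Date day)
            throws SQLException {
        dst.setAutoCommit(false); // commit once per batch, not per row
        try (PreparedStatement read = src.prepareStatement(pageQuery("events"));
             PreparedStatement write = dst.prepareStatement(
                 "INSERT INTO events_copy (id, payload) VALUES (?, ?)")) {
            long lastId = 0;
            while (true) {
                read.setLong(1, lastId);
                read.setDate(2, day);
                int rows = 0;
                try (ResultSet rs = read.executeQuery()) {
                    while (rs.next()) {
                        lastId = rs.getLong("id");
                        write.setLong(1, lastId);
                        write.setString(2, transform(rs.getString("payload")));
                        write.addBatch();
                        rows++;
                    }
                }
                if (rows == 0) break;  // no more rows for this day
                write.executeBatch();  // one round trip per 100 rows
                dst.commit();          // keep each transaction small
            }
        }
    }
}
```

Small transactions keep undo/redo logs bounded; a single million-row transaction is a common cause of the thrashing mentioned below.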
Posts: 4107
If it's millions of records, I hope you have the two databases physically sitting near each other, or network latency is going to be high. For such a situation, though, you may consider dumping the database to a single file, transferring the file to the target database machine, and then loading/converting it there.

It seems like a lot of work, but databases are known to thrash at high record counts, and getting JDBC to batch records optimally can be hard.
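On the "JDBC batching is hard" point: if the target is also MySQL, one driver setting worth knowing is MySQL Connector/J's `rewriteBatchedStatements=true`, which lets the driver collapse a batch of single-row INSERTs (built with `addBatch()`/`executeBatch()`) into one multi-row INSERT, saving a network round trip per row. The host and database names below are hypothetical; the property itself is a documented Connector/J option.

```java
public class TargetUrl {
    // Build a batch-friendly JDBC URL for MySQL Connector/J.
    // Without rewriteBatchedStatements=true, executeBatch() still
    // sends one INSERT statement per row over the wire.
    static String batchFriendlyUrl(String host, String db) {
        return "jdbc:mysql://" + host + "/" + db
             + "?rewriteBatchedStatements=true";
    }
}
```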