
save large amount of rows

 
Doua Beri
Ranch Hand
Posts: 60
Hi. I am building an application where users have the possibility to export some information from a MySQL table (the information is returned by a SELECT command) into a file (probably a CSV).
I read some tutorials I found on the net, but there the saving part is done by the MySQL server, and I'm not interested in that solution because the users are connected to a remote MySQL server.

The problem I have is that I can save the result if the SELECT returns a small number of rows, but some queries return 200k+ rows and I want to keep memory usage as low as I can.

Is there any trick, or any way I can read information from a large table?

Thank you.
 
Karan Johar
Greenhorn
Posts: 25
You can first do a COUNT query to find how many rows would be returned. Then, if there are many rows, you can fetch a few at a time (say 10k) and save them. This way you will never hold a lot of rows in memory.
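A rough sketch of that approach in JDBC, using COUNT plus LIMIT/OFFSET paging. The table name my_table, the id column, and the chunk size are assumptions for illustration; adjust them to your schema, and note that the ORDER BY on a unique column is what keeps the pages from overlapping.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public class CsvExporter {

    // Rows fetched per round trip; small enough to keep memory usage flat.
    static final int CHUNK_SIZE = 10_000;

    // Pure helper: the sequence of OFFSET values needed to cover totalRows.
    static long[] chunkOffsets(long totalRows, int chunkSize) {
        int chunks = (int) ((totalRows + chunkSize - 1) / chunkSize);
        long[] offsets = new long[chunks];
        for (int i = 0; i < chunks; i++) {
            offsets[i] = (long) i * chunkSize;
        }
        return offsets;
    }

    // Sketch of the export loop; assumes an open Connection to the remote
    // server and a table "my_table" with a unique "id" column (placeholders).
    static void export(Connection con, String outFile)
            throws SQLException, IOException {
        try (Statement st = con.createStatement();
             PrintWriter out = new PrintWriter(new FileWriter(outFile))) {

            // Step 1: count the rows so we know how many chunks to fetch.
            ResultSet countRs = st.executeQuery("SELECT COUNT(*) FROM my_table");
            countRs.next();
            long totalRows = countRs.getLong(1);

            // Step 2: fetch and write one chunk at a time.
            for (long offset : chunkOffsets(totalRows, CHUNK_SIZE)) {
                ResultSet rs = st.executeQuery(
                        "SELECT * FROM my_table ORDER BY id"
                        + " LIMIT " + CHUNK_SIZE + " OFFSET " + offset);
                ResultSetMetaData meta = rs.getMetaData();
                int cols = meta.getColumnCount();
                while (rs.next()) {
                    StringBuilder line = new StringBuilder();
                    for (int c = 1; c <= cols; c++) {
                        if (c > 1) line.append(',');
                        line.append(rs.getString(c)); // NULLs print as "null"
                    }
                    out.println(line);
                }
            }
        }
    }
}
```

One caveat: large OFFSET values get slower as MySQL still has to skip the earlier rows, so for very deep pages a keyset condition (WHERE id > lastSeenId) is usually faster than OFFSET.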
 
Doua Beri
Ranch Hand
Posts: 60
So you're saying to make MySQL return results with a LIMIT (for example 10k) and save them, then have it return the next 10k, and so on?

Hmm... in other words, split my large query result into small results.

OK. I will try it and see what happens.

Thanks
 