
Storing data in XML files

 
ajse ruku
Ranch Hand
Posts: 196
Hi all,

If my application cannot establish a connection with the server, it stores 8,000 XML files (one file per record). After the connection with the server resumes, it sends these XML files to the server one by one, deleting each file as it is sent.

I want to know whether this is a good design or not. My requirement is to store 8,000 records on the local machine while the server is down, and then send these records one by one when the server comes back up. Please reply if anybody has any thoughts on this.
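For concreteness, here is a rough sketch of that flow in Java. It is not the real application code: the RecordSender interface, the spool directory, and the file-per-record naming are all assumptions.

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;

// Sketch of the "one XML file per record" spool described above.
public class XmlSpool {

    // Stand-in for whatever actually uploads one record to the server.
    public interface RecordSender {
        boolean send(File recordFile);   // true means the server accepted it
    }

    private final File spoolDir;

    public XmlSpool(File spoolDir) {
        this.spoolDir = spoolDir;
        spoolDir.mkdirs();
    }

    // Called while the server is down: one record, one file.
    public void store(String recordId, String recordXml) throws IOException {
        Writer out = new FileWriter(new File(spoolDir, recordId + ".xml"));
        try {
            out.write(recordXml);
        } finally {
            out.close();
        }
    }

    // Called when the connection resumes: send and delete one file at a time.
    public void drain(RecordSender sender) {
        File[] pending = spoolDir.listFiles();
        if (pending == null) {
            return;
        }
        for (int i = 0; i < pending.length; i++) {
            if (!sender.send(pending[i])) {
                break;                    // server is down again; the rest stays on disk
            }
            pending[i].delete();          // a deleted file is a delivered record
        }
    }
}

If the program dies mid-drain, whatever files are still in the spool directory simply get sent on the next run.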

With regards,
Ajse
 
Stan James
(instanceof Sidekick)
Ranch Hand
Posts: 8791
"Store and forward" strategies like this are common enough. I've used them when agents in the field do data entry disconnected from the network and batch load their day's data when they get back to the office. And when the office network shared a phone line with voice and fax. (And we wrote code on stone tablets.)

8,000 small files sounds slow and messy on disk. I'd think about one large file or a local database. Then again, when you read, transmit and delete one at a time you have a good checkpoint for how much has been done so far. If you lose the network connection again or kill the program in mid-batch, you know just where to restart. Worst case you re-transmit one file. So it might be a fine design after all.
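For what it's worth, a minimal sketch of the local-database variant, assuming an embedded database reachable over plain JDBC; the table, the JDBC URL, and the sendToServer method are placeholders, not real application code.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// One table instead of 8,000 small files; same send-then-delete checkpoint.
public class DbSpool {

    private final Connection conn;

    public DbSpool(String jdbcUrl) throws SQLException {
        conn = DriverManager.getConnection(jdbcUrl);
        Statement st = conn.createStatement();
        // SQL syntax (IF NOT EXISTS, LIMIT, LONGVARCHAR) varies by database.
        st.execute("CREATE TABLE IF NOT EXISTS pending_records "
                 + "(id VARCHAR(64) PRIMARY KEY, record_xml LONGVARCHAR)");
        st.close();
    }

    // Server down: queue the record locally instead of writing a file.
    public void store(String id, String recordXml) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO pending_records (id, record_xml) VALUES (?, ?)");
        ps.setString(1, id);
        ps.setString(2, recordXml);
        ps.executeUpdate();
        ps.close();
    }

    // Server up: send one row at a time, deleting each row the server accepted.
    public void drain() throws SQLException {
        PreparedStatement next = conn.prepareStatement(
                "SELECT id, record_xml FROM pending_records LIMIT 1");
        PreparedStatement delete = conn.prepareStatement(
                "DELETE FROM pending_records WHERE id = ?");
        while (true) {
            ResultSet rs = next.executeQuery();
            if (!rs.next()) {
                rs.close();
                break;                     // nothing left to send
            }
            String id = rs.getString("id");
            String xml = rs.getString("record_xml");
            rs.close();
            if (!sendToServer(xml)) {
                break;                     // connection dropped again; rows stay queued
            }
            delete.setString(1, id);
            delete.executeUpdate();        // checkpoint: delivered rows are gone
        }
        next.close();
        delete.close();
    }

    // Placeholder for whatever actually uploads one record to the server.
    private boolean sendToServer(String recordXml) {
        return true;
    }
}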
[ November 30, 2005: Message edited by: Stan James ]
 
Sri Ram
Ranch Hand
Posts: 118
"So it might be a fine design after all."

It is, Stan. This is actually done in my application. Mine is a banking application where customers fill in an application form, which is stored on the agent's PC as an XML file. Some 2,000 - 3,000 applications get generated per day.
It takes approximately 30 seconds for one record to get uploaded, even with lots of constraint checks. In fact, more than 30 tables are affected when one record is processed.
But this is done during off-peak periods, since the number of records being inserted is large and performance improves greatly when fewer people are accessing the system.
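One simple way to kick off that off-peak batch from Java is a java.util.Timer scheduled for a quiet hour; 2 AM and the uploadPendingApplications method below are just placeholders for illustration.

import java.util.Calendar;
import java.util.Timer;
import java.util.TimerTask;

public class OffPeakUploadScheduler {

    public static void main(String[] args) {
        // First run at 2 AM (an example off-peak hour), then every 24 hours.
        Calendar firstRun = Calendar.getInstance();
        firstRun.set(Calendar.HOUR_OF_DAY, 2);
        firstRun.set(Calendar.MINUTE, 0);
        firstRun.set(Calendar.SECOND, 0);
        if (firstRun.getTimeInMillis() < System.currentTimeMillis()) {
            firstRun.add(Calendar.DAY_OF_MONTH, 1);   // 2 AM already passed; start tomorrow
        }

        Timer timer = new Timer();
        timer.scheduleAtFixedRate(new TimerTask() {
            public void run() {
                uploadPendingApplications();           // drain the agent PC's local spool
            }
        }, firstRun.getTime(), 24L * 60 * 60 * 1000);
    }

    // Placeholder for the real batch upload (the 30-odd table inserts per record).
    private static void uploadPendingApplications() {
    }
}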
 