
Tackling large size XML files

I have a requirement to port a large amount of data from one database to another via XML. We estimate the XML file (it includes data from more than twenty tables) to be around 20 MB on the filesystem. During the upload, if I have to load this file into memory as a DOM object, it will take up almost 40 MB, or more. Is there any way to avoid loading the whole XML and instead extract only portions of it, say table by table, and process the data phase by phase? (Due to some technical limitations, we found the option of having a separate XML file per table not viable.) SAX and JDOM have been suggested as possible solutions. I am new to both, and as there is a time constraint, I would appreciate it if anyone could shed more light on the best approach.
I was faced with a similar problem: an RDF file containing thousands of objects, and I solved it with SAX. Use SAX to parse the file; that way you avoid building the whole tree in memory. To create a large document, use a simple PrintWriter; otherwise use JDOM.
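To illustrate the SAX approach, here is a minimal sketch using the standard JAXP `SAXParser`. The element names (`table`, `row`) are hypothetical placeholders for whatever the real export format uses; the point is that the handler sees one event at a time, so memory use stays flat no matter how large the file is:

```java
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;
import javax.xml.parsers.SAXParserFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Streams through the XML one event at a time; only the current
// element's text is buffered, never the whole document tree.
public class RowCounter extends DefaultHandler {
    private int rows = 0;
    private final StringBuilder text = new StringBuilder();

    @Override
    public void startElement(String uri, String localName,
                             String qName, Attributes attrs) {
        text.setLength(0);                   // reset buffer per element
        if (qName.equals("row")) rows++;     // hypothetical per-record element
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length);      // accumulate element text
    }

    public int getRows() { return rows; }

    public static void main(String[] args) throws Exception {
        // Small inline sample; in practice you would pass a FileInputStream.
        String xml = "<table><row><id>1</id></row><row><id>2</id></row></table>";
        RowCounter handler = new RowCounter();
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                   handler);
        System.out.println("rows=" + handler.getRows()); // prints "rows=2"
    }
}
```

In the real migration you would replace the counter with code that fills an insert batch in `startElement`/`characters`/`endElement` and flushes it to the target database whenever a table's worth of rows has been seen, so only one phase of data is in memory at a time.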