I am aware that DOM is a more expensive way to parse an XML document. Not expecting this to be a problem, I built an import program that reads information about employees and organization units for a big company. The input XML has grown, and the other day I suddenly got a "java.lang.OutOfMemoryError".
This happens in the javax.xml.parsers.DocumentBuilder.parse() method.
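The relevant part of my code is nothing special; stripped down, it looks roughly like this (the class name and the element names are simplified for this post, not my real ones):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class ImportSketch {

    // Parse the whole input into an in-memory DOM tree.
    // builder.parse() is the call that throws OutOfMemoryError for me.
    static Document parse(InputStream in) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        DocumentBuilder builder = factory.newDocumentBuilder();
        return builder.parse(in); // entire tree is built on the heap here
    }

    public static void main(String[] args) throws Exception {
        // Tiny stand-in for the real 4.5 MB file
        String xml = "<employees><employee><firstName>A</firstName>"
                   + "</employee></employees>";
        Document doc = parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        System.out.println(doc.getDocumentElement().getTagName());
    }
}
```

After parse() returns I walk the tree and write the data to the database, but the error occurs before I ever get that far.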
Right now I am trying to verify that the reason for this error is that the XML file has grown too large, but people are telling me this should not be a problem. The file contains approximately 13,900 employees with seven XML tags each: first name, last name, ID, e-mail address, two organization IDs, and a leader code. It also contains 1,290 organization units with five tags each: the leader's first and last name, the unit name, the unit ID, and the leader's ID. The file size is 4,542 KB. My machine has 512 MB of memory.
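To be concrete, the structure of the file is roughly like this (the element names here are made up for the post; the real file uses our own naming):

```xml
<import>
  <employee>
    <firstName>...</firstName>
    <lastName>...</lastName>
    <id>...</id>
    <email>...</email>
    <orgId1>...</orgId1>
    <orgId2>...</orgId2>
    <leaderCode>...</leaderCode>
  </employee>
  <!-- ~13,900 employee elements -->
  <orgUnit>
    <leaderFirstName>...</leaderFirstName>
    <leaderLastName>...</leaderLastName>
    <unitName>...</unitName>
    <unitId>...</unitId>
    <leaderId>...</leaderId>
  </orgUnit>
  <!-- ~1,290 orgUnit elements -->
</import>
```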
Now I need to know: have I crossed the limit of what a DOM object can handle? Is there any way around this problem other than switching to sequential SAX parsing or breaking up the file (which, unfortunately, would be very complicated)?
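For comparison, my understanding is that the SAX alternative I would like to avoid processes the elements one at a time as events, so nothing like the full tree is ever held in memory. A minimal sketch of the idea (again with made-up element names, not my real schema):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxSketch {

    // Count employee elements without building a tree in memory.
    // In the real program each startElement event would instead
    // trigger the database insert for that record.
    static int countEmployees(InputStream in) throws Exception {
        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(in, new DefaultHandler() {
            public void startElement(String uri, String localName,
                                     String qName, Attributes attributes) {
                if (qName.equals("employee")) {
                    count[0]++;
                }
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        String xml = "<import><employee/><employee/></import>";
        System.out.println(countEmployees(
                new ByteArrayInputStream(xml.getBytes("UTF-8"))));
    }
}
```

The reason I would rather not do this is that my import logic cross-references employees and units, which is much more natural with the whole document in hand.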
Is the limit on what a DOM object can hold related to the memory of the machine, or is the limit in the XML framework itself? Are there any other parameters that can affect this situation?
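For example, if the real ceiling is the JVM's maximum heap rather than the machine's 512 MB, I assume I could raise it with the -Xmx switch when starting the program (the Sun JVM's default maximum heap is only 64 MB), something like this (the class name is just a placeholder):

```shell
# Raise the maximum heap from the 64 MB default to 256 MB
java -Xmx256m MyImportProgram input.xml
```

Is that the kind of parameter I should be looking at?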
[ June 10, 2002: Message edited by: Mats Andersson ]