hi,
I have a situation here. I am currently storing huge amounts of data (half a GB to 1 GB, at most 2 GB) in text files (CSV style) and then parsing them using simple
Java streams. I read each file once, calculate some summaries, and fill in an Oracle table. However, I am now thinking of storing the data in XML format instead of text format and using SAX for parsing. The file size would surely shoot up, maybe double, but more important is parsing performance. Is XML suited for this amount of data? Will SAX parsing be any better than simply reading the text file using Java streams and tokenizing it?
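To make the comparison concrete, here is a stripped-down sketch of the two approaches (the column layout, the element and attribute names like `row`/`value`, and the summing are just placeholders for illustration, not my real schema):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.StringReader;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class ParseComparison {

    // CSV approach: read line by line and tokenize on commas.
    static double sumCsv(String csv) throws Exception {
        double total = 0;
        try (BufferedReader reader = new BufferedReader(new StringReader(csv))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",");
                total += Double.parseDouble(fields[1]); // second column holds the value
            }
        }
        return total;
    }

    // SAX approach: stream the XML and accumulate inside a handler callback,
    // never building a DOM, so memory stays flat regardless of file size.
    static double sumXml(String xml) throws Exception {
        final double[] total = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        DefaultHandler handler = new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if ("row".equals(qName)) {
                    total[0] += Double.parseDouble(attrs.getValue("value"));
                }
            }
        };
        parser.parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                     handler);
        return total[0];
    }

    public static void main(String[] args) throws Exception {
        String csv = "a,1.5\nb,2.5\nc,4.0";
        String xml = "<rows><row id=\"a\" value=\"1.5\"/>"
                   + "<row id=\"b\" value=\"2.5\"/>"
                   + "<row id=\"c\" value=\"4.0\"/></rows>";
        System.out.println("CSV sum: " + sumCsv(csv)); // prints 8.0
        System.out.println("XML sum: " + sumXml(xml)); // prints 8.0
    }
}
```

Both versions stream the input and keep only the running summary in memory, which is what I would need at these file sizes; the XML version just has the extra cost of tag overhead and parser callbacks.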
Can someone please throw some light on this issue?
thanks,
.....jw