We have data of size 10 TB (terabytes), stored across multiple disks. The metadata (data describing the data: filename, location, author, description, etc.) can run to gigabytes, say 5 GB. For a web-based application, should the metadata be stored in XML files or in a database like Oracle, MySQL, etc.?
Since the data is going to grow in the future, scalability is required. Which approach will give better performance?
A typical use case: a user wants to find data matching particular criteria, e.g. all files generated between a specified start date and end date, then extract the required data and analyse it to produce statistics, generate plots, etc. Results are generated at runtime, so the user should get good performance.
As the XML file will be large, DOM (which loads the whole document into memory) is out. But is a SAX parser scalable, and does it give good performance?
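To make the trade-off concrete, here is a minimal sketch of the SAX approach in Python, assuming a hypothetical metadata schema like `<files><file name="..." created="YYYY-MM-DD"/></files>` (the element and attribute names are illustrative, not from the original post). SAX keeps memory usage constant no matter how big the file is, but note that every query still has to read the entire document:

```python
# Streaming (SAX) scan of a metadata XML file for a date-range query.
# Hypothetical schema: <files><file name="..." created="YYYY-MM-DD"/></files>
import xml.sax


class DateRangeHandler(xml.sax.ContentHandler):
    """Collects filenames whose 'created' date falls in [start, end]."""

    def __init__(self, start, end):
        super().__init__()
        self.start, self.end = start, end
        self.matches = []

    def startElement(self, name, attrs):
        if name == "file":
            created = attrs.get("created", "")
            # ISO dates (YYYY-MM-DD) compare correctly as plain strings.
            if self.start <= created <= self.end:
                self.matches.append(attrs.get("name"))


def find_files(source, start, end):
    """source may be a filename or a file-like object; returns matching names."""
    handler = DateRangeHandler(start, end)
    xml.sax.parse(source, handler)
    return handler.matches
```

This runs in constant memory, so it "scales" in that sense, but the cost of each query is proportional to the whole file size, unlike an indexed lookup.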
Originally posted by Jeanne Boyarsky: Ashish, databases are designed for search. There are performance optimizations, such as indexes. While XML allows search, it involves reading the whole file, which is going to be slower than an index lookup.
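The index point can be illustrated with a small sketch, using SQLite as a stand-in for Oracle/MySQL (the table and column names are made up for the example):

```python
# Indexed metadata search, using SQLite as a stand-in for Oracle/MySQL.
# Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metadata (
        filename TEXT,
        location TEXT,
        author   TEXT,
        created  TEXT   -- ISO date string, YYYY-MM-DD
    )
""")
# The index lets the database jump straight to the matching rows
# instead of scanning every record (or, in the XML case, the whole file).
conn.execute("CREATE INDEX idx_created ON metadata(created)")

conn.executemany(
    "INSERT INTO metadata VALUES (?, ?, ?, ?)",
    [("a.dat", "/disk1/a.dat", "alice", "2021-01-10"),
     ("b.dat", "/disk2/b.dat", "bob",   "2021-06-01")],
)

rows = conn.execute(
    "SELECT filename FROM metadata WHERE created BETWEEN ? AND ? ORDER BY created",
    ("2021-01-01", "2021-03-01"),
).fetchall()
```

With the index in place, the date-range query touches only the relevant rows, which is the scalability property the XML file lacks.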
A counter argument would be that the file system plus something like Lucene would give a far quicker (and richer) search capability than an RDBMS can provide.
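Lucene itself is a Java library, but the data structure that makes it fast, the inverted index, is easy to sketch. This toy Python version (illustrative only; real Lucene adds analyzers, ranking, and on-disk segment files) shows why full-text lookups don't need to scan every document:

```python
# Toy inverted index: the core idea behind Lucene-style full-text search.
from collections import defaultdict


def build_index(docs):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index


def search(index, *terms):
    """Document ids containing ALL of the given terms (AND query)."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()
```

A query is then just set intersections over precomputed term lists, rather than a pass over all the metadata.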
Replicating an XML document structure in database entities is a lot of maintenance. I'd avoid it if at all possible.
Does your data require referential integrity or other constraints? If not, I'd go for the file system every time.