XML Provider Performance

Topics: General
Jun 24, 2011 at 10:57 AM


I'm considering Composite C1 for an upcoming project. This is a read-intensive application - in other words, the finished site will have a much higher number of reads compared to writes. I'm trying to make up my mind between SQL Server and XML.

Like most developers, I guess, my initial reaction to the idea of using an XML-based system was "no way" - but when I got to read your comments here: http://docs.composite.net/C1/Data/DataFAQ.aspx?q=Should+I+use+XML+or+SQL+Server%3f I decided to give it more consideration, as I am really attracted to the idea of reading the content into memory and serving from there, since this will make things really fast.

My question here relates to a comment in the FAQ: "If you have very large amounts of data or you have frequent data updates performance can become an issue" - now, I fully understand the comment about frequent updates, and that isn't an issue for me. BUT, I don't quite get the "very large amounts of data" - what constitutes a large amount of data? Is the issue the initial read time, so that once the XML file is parsed, the site will perform well? Or is the issue when the amount of data in the XML files exceeds web server memory?

Sorry to ask difficult questions, but I really like what I see of Composite C1 and I'd like to understand the boundaries at an early stage.

Many thanks


Jun 24, 2011 at 11:03 AM

It would be the memory available to the application that defines 'very large amounts of data' - if there is a physical limit lower than available memory, I am not aware of it.
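To illustrate why memory is the bound: the pattern behind an in-memory XML provider is to pay the parse cost once at startup and then serve every read from an in-memory structure. This is a minimal conceptual sketch in Python (not Composite C1 code - the product is .NET, and the file layout and data names here are invented for illustration):

```python
# Sketch of the "parse once, read from memory" pattern an XML-backed
# provider uses. The steady-state limit is the memory the parsed data
# occupies, not disk read speed: larger files mean a longer one-time
# parse and a larger resident structure.
import xml.etree.ElementTree as ET
from io import StringIO

# Hypothetical data file, standing in for a store of page records.
xml_source = StringIO(
    "<pages>"
    "<page id='1' title='Home'/>"
    "<page id='2' title='About'/>"
    "</pages>"
)

# One-time cost at startup: parse the XML and build an index.
tree = ET.parse(xml_source)
pages_by_id = {p.get("id"): p.get("title") for p in tree.getroot()}

# Steady-state reads are plain dictionary lookups - no file I/O at all,
# which is why a read-heavy site performs well once loading is done.
print(pages_by_id["2"])  # About
```

So for a read-heavy site the practical questions are how long the startup parse takes and whether the parsed data fits comfortably in the memory available to the application.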

Jun 24, 2011 at 11:09 AM

Thanks - this is exactly the answer I'd hoped for!