One of the neater properties of XML is the aforementioned portability. That's pretty cool, but most databases let you export data without tremendous difficulty. To me, the real win comes when you have data that doesn't fit the standard table format.
A recent project of mine was to migrate an existing set of static HTML pages into a template-driven, dynamic set of pages. The interesting property of the existing information was that its content lived at varying depths; that is, section 1.1 had some information, while other areas nested several levels deeper. In addition, any section could have a quiz and/or workbook associated with it. Overall, this lent itself to a structure that would be hard to implement efficiently with standard tables. It proved much easier, conceptually, to plonk all of the data in one XML file, read it at startup, and just grab the needed data out of the data structure thus created.
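To sketch what "read it at startup, then grab what you need" might look like: here's a minimal Python version using the standard library's ElementTree. The element and attribute names (`section`, `quiz`, `workbook`, `id`) are my own invention for illustration, not the schema from the actual project.

```python
import xml.etree.ElementTree as ET

# Hypothetical course file; names are illustrative only.
COURSE_XML = """
<course>
  <section id="1" title="Intro">
    <content>Welcome.</content>
    <section id="1.1" title="Basics">
      <content>Some information.</content>
      <quiz href="quiz-1.1.html"/>
      <section id="1.1.1" title="Details">
        <content>Deeply nested material.</content>
        <workbook href="wb-1.1.1.html"/>
      </section>
    </section>
  </section>
</course>
"""

def index_sections(elem, table=None):
    """Walk the tree once at startup; build a flat id -> element map
    so arbitrarily deep sections are a single dictionary lookup."""
    if table is None:
        table = {}
    for sec in elem.findall("section"):
        table[sec.get("id")] = sec
        index_sections(sec, table)
    return table

root = ET.fromstring(COURSE_XML)
sections = index_sections(root)

# Grab whatever the template needs for a given section number.
sec = sections["1.1.1"]
print(sec.get("title"))                  # Details
print(sec.find("content").text)          # Deeply nested material.
print(sec.find("workbook") is not None)  # True
```

The nice part is that the nesting depth never shows up in the lookup code: a section three levels deep and one seven levels deep are retrieved the same way, with any attached quiz or workbook hanging right off the element.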
Generally, though, I would agree with you -- most databases are optimized for getting data in and out, and for accessing it quickly when needed. When the data structure gets hairy, though, I may ask you to pass the XML, please.