Well, for one thing, any text inside the XML document will be parsed, at vast expense in parser time and object creation.
The great disadvantage of using the SOAP body for large amounts of text was recognized early - leading to SAAJ and similar kludges.
I clearly remember people screaming on SOAP newsgroups about requests taking many minutes to process - because the XML parser was grinding away at some huge payload that really didn't belong in the SOAP body at all.
That's also why RESTful architectures are preferred if you don't absolutely need the SOAP WS-* stack.
posted 7 years ago
Just reading a bit more into this. According to Martin Kalin in 'Java Web Services Up and Running', base64 encoding causes data bloat: payloads end up about 1/3 larger than the raw data. Therefore, sending a binary attachment via SAAJ with no encoding actually results in a smaller overall payload than a SOAP message with embedded base64-encoded data.
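You can see that 1/3 overhead directly with the standard `java.util.Base64` API - every 3 raw bytes become 4 encoded characters (a quick sketch, just to illustrate the bloat Kalin describes):

```java
import java.util.Base64;

public class Base64Bloat {
    public static void main(String[] args) {
        byte[] raw = new byte[3_000_000]; // ~3 MB of binary data
        String encoded = Base64.getEncoder().encodeToString(raw);
        System.out.println("raw bytes:     " + raw.length);
        System.out.println("encoded chars: " + encoded.length());
        // Base64 maps every 3 input bytes to 4 output characters,
        // so the encoded payload is ~33% larger than the raw data.
        System.out.printf("overhead: %.0f%%%n",
                100.0 * (encoded.length() - raw.length) / raw.length);
    }
}
```

Running this prints 3,000,000 raw bytes against 4,000,000 encoded characters - a 33% overhead before the XML parser even touches it.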
The downside of going with raw data is that the receiver has to deal with it and convert it into something meaningful.
It is also hard to use this approach with a document-style service, and there are interoperability issues as well.
MTOM is backed by the W3C. It uses XOP (XML-binary Optimized Packaging) to optimize the payload, so in terms of efficiency and interoperability it wins.
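For what it's worth, turning MTOM on in JAX-WS is mostly a matter of annotation wiring. A minimal sketch (the `@MTOM` annotation, `DataHandler`, and `MTOMFeature` are the standard JAX-WS/JAF APIs; the `FileService` class itself is a made-up example):

```java
import javax.activation.DataHandler;
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.soap.MTOM;

// @MTOM tells the JAX-WS runtime to send binary parts as raw MIME
// attachments (via XOP) instead of inline base64 text in the body.
@MTOM(threshold = 1024) // only optimize parts larger than 1 KB
@WebService
public class FileService {

    @WebMethod
    public void upload(DataHandler content) {
        // content arrives as a streamed attachment,
        // not a base64 string the parser has to chew through
    }
}
```

On the client side the same optimization is enabled by passing `new MTOMFeature()` when creating the port from the `Service` object.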