I've been working in IT for about fifteen years. I remember in 1998, when some of my colleagues and I were analyzing and testing all the applications (the millennium problem - remember?). At that time, we identified 105,000 applications in the bank where I worked. More than one hundred thousand individual applications were working together - or not. That has always been one of the biggest and most expensive problems in IT: making applications work together. Interchanging data.
Some of the applications were highly integrated. But some of the applications were either old or too complex - or maybe it just didn't pay to integrate them.
I remember several occasions where we needed to transfer data from one application to another. This could be ad hoc or on a regular basis. We always had to use the lowest common denominator: CSV (comma separated values) - or just plain text.
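To see why CSV is the lowest common denominator, here's a minimal sketch in Python. The account records and field names are hypothetical - the point is that the file carries values but no types or meaning, so every receiving application has to know the structure in advance:

```python
import csv
import io

# Hypothetical account records exported from one application.
rows = [
    {"account": "1001", "name": "Jensen", "balance": "2500.00"},
    {"account": "1002", "name": "Nielsen", "balance": "-120.50"},
]

# Serialize to CSV - the lowest common denominator.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["account", "name", "balance"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# The receiving application parses it back. Field meaning and types
# are implicit: nothing in the file says "balance" is a decimal amount,
# or which currency it is in. That knowledge lives outside the data.
parsed = list(csv.DictReader(io.StringIO(csv_text)))
print(parsed[0]["balance"])
```

The round trip works, but only because both sides happen to agree on column order, names and semantics - exactly the agreement we kept having to negotiate by hand.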
It was obvious to us that if we wanted better data interchange, we had to find some kind of standard data structure. I guess that's what data architecture is about. Very soon we built applications based on the same basic set of coding rules and data structures. No big deal really - as long as you do it from the beginning and as long as you stick to the one standard. If one single application didn't follow the 'rules', it would ruin the completeness of the application system.
But what if we had, let's say, two different sets of rules for code and data? We could still keep the bank open. But it would require yet another set of rules... for the bridge between the two sets of application systems. This sometimes happens when two banks decide to merge. But the bridging will always be a temporary thing. It's too expensive in the long run. Eventually one of the two systems will have to give up.
Two standards are not a good solution in the long run. They can only be a temporary fix. And CSV is too expensive.
Oh, by the way, the big problem was the 105,000 existing applications :-)