I think you are really asking a broader question: "is having a strict definition of a file format a good thing for a rapidly evolving project?"
To answer your immediate question, though: yes, they are. An XML schema gives you a strict definition of the format, answers a lot of questions about validity, provides great documentation, and lets you know with confidence that a specific version of the document has a specific form.
They are not the be-all and end-all, though: they define structure, not semantics, so you can still change the "meaning" of a tag between versions of the document without having to change the schema. That causes just as much trouble.
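To make the structure-versus-semantics point concrete, here is a minimal, hypothetical XSD fragment (the element names are invented for illustration):

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <!-- The schema can say "quantity is an integer". Whether that
             integer means "units ordered" or "units shipped" is semantics,
             which the schema cannot express and which can silently change
             between versions without any schema change. -->
        <xs:element name="quantity" type="xs:integer"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

A document with `<quantity>abc</quantity>` fails validation, but a version that quietly redefines what the number means still validates perfectly.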
To answer the question I think you are asking: yes, the XML schema is a good thing.
It forces you to confront a painful fact: your data exchange format is constantly changing versions, and that means you have to adapt your system to account for it.
If you only had the IOS model, where you take a "rough guess" at what this version means, you open the door to all sorts of trouble in the long run. For example, it becomes trivial for someone to assume that "element foo being present means version 1.2, so tag bar means ...".
That is great, right until version 2.0 adds back the foo element, with a different meaning, and tag bar isn't even there. Welcome to "inconsistent behaviour" city.
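The "rough guess" approach above can be sketched as follows (element names and version numbers are hypothetical, matching the foo/bar example in the text):

```csharp
using System.Xml.Linq;

static class VersionSniffer
{
    // Fragile: infers the version from which elements happen to be present,
    // instead of reading an explicit version marker.
    public static string GuessVersion(XDocument doc)
    {
        if (doc.Root?.Element("foo") != null)
            return "1.2"; // wrong the moment 2.0 reintroduces <foo>
                          // with a different meaning
        return "1.0";
    }
}
```

An explicit version attribute on the root element, validated by the schema, removes the guesswork entirely.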
If you use XML without schemas, or JSON, or something else that doesn't impose that cost on you, then only a tiny bit of the problem goes away. You still have to deal with all four versions of the input, but you have fewer tools to help you out.
You should, in my opinion, generally prefer to make the pain of changes proportional to their real, long-term cost. Changing the data exchange format has a high long-term cost: you have to deal with compatibility, data upgrades, and that sort of thing.
If that costs little, you will be tempted to do it a lot, and then you will pay the maintenance cost tomorrow. If it costs more now, you might think harder: can you do more than one thing in this change? Can you get away without it? Can you do it smarter?
In short, I think your real problem is that your file format changes often, not that you used (or didn't use) XML schemas.
(Also, are your users really happy if the server or client randomly drops content that is represented in the newer version? I would be surprised if that remains true forever - another of those long term costs you need to recognise somewhere...)
XML has two very important attributes that make it attractive for data transfer between heterogeneous systems:
- You can pass it through firewalls, and
- You can usually find reader/writer libraries already written to create and parse it.
If you're looking for something less verbose that still has both of these attributes, you can try using JSON.
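As a rough illustration of the verbosity difference, the same record in both formats (field names are hypothetical):

```json
{ "name": "Ada", "age": 36 }
```

versus the XML equivalent, `<person><name>Ada</name><age>36</age></person>`, where every field name is written twice. JSON also passes through firewalls over HTTP and has parser libraries in virtually every language, so it keeps both attributes listed above.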
If you're simply transferring data between two homogeneous databases on the same network, there are probably easier ways. For example, Microsoft SQL Server has at least three different ways to transfer data between databases: Bulk Insert, SSIS, and Replication.
Best Answer
You can put the generated classes for each schema version into an assembly of its own, then load the right assembly dynamically at runtime once you have identified the schema version. This technique is often used to create plug-ins (see here for an example).
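A minimal sketch of that loading step, assuming hypothetical names (one `SchemaV{n}.dll` per version, each containing a `MySchemas.Importer` type):

```csharp
using System;
using System.Reflection;

static class ImporterFactory
{
    // Sketch only: picks the assembly that matches the detected schema
    // version and instantiates its importer via reflection.
    public static object CreateImporterFor(string schemaVersion)
    {
        // e.g. schemaVersion "2" -> SchemaV2.dll, compiled from that
        // version's generated classes.
        Assembly assembly = Assembly.LoadFrom($"SchemaV{schemaVersion}.dll");
        Type importerType = assembly.GetType("MySchemas.Importer");
        return Activator.CreateInstance(importerType);
    }
}
```

The returned object's concrete type differs per version, which is exactly why the next paragraph reaches for `dynamic` rather than a shared interface.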
To avoid having all 200 of your classes implement a common interface, your import code can use the C# 4.0 `dynamic` keyword to access schema objects, regardless of which assembly they are loaded from, as long as the method signatures do not change between versions. See here for an example. The drawback is that you lose type safety; you will have to decide whether that is acceptable in your case.