Java – Does decoupling trump DRY in REST

api, coupling, dry, java, rest

I am building a REST API to expose most of the functionality of an existing Java API. Both APIs are for internal use within my organization; I do not have to design for external use. I have influence over both APIs but am implementing the REST one. The Java API will continue to be used for local applications (it's not being "retired"), but the REST API will be used for significant new development.

Some of the Java API classes are simply data (beans with properties, getters, setters). And at least some of these make sense to transmit (in some form) over the REST API as data (which will be marshalled to XML or JSON). For example, a class that stores info about a server machine. I am faced with the following choice for these data classes: Do I…

  1. expose the original Java class (or a subclass) directly in the REST API, or
  2. make a new data transfer class (DTO pattern) specifically for the REST API?

Either way I'll have REST data transfer classes; the question is whether to annotate the originals or create new ones (which may be near copies of the originals). There may be other choices, but I'll focus mainly on those two.
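For concreteness, here is a minimal sketch of what the two options can look like. The `ServerInfo` bean and its fields are hypothetical stand-ins for one of the existing data classes, not the actual API:

```java
// Option #1: expose the existing Java API bean (or a subclass) directly.
// ServerInfo.java -- existing Java API data class (hypothetical)
public class ServerInfo {
    private String hostname;
    private int cpuCount;

    public String getHostname() { return hostname; }
    public void setHostname(String hostname) { this.hostname = hostname; }
    public int getCpuCount() { return cpuCount; }
    public void setCpuCount(int cpuCount) { this.cpuCount = cpuCount; }
}

// Option #2: a separate transfer class owned by the REST API.
// ServerInfoDto.java -- may start as a near copy, but it can now evolve
// and be versioned independently of the Java API class.
public class ServerInfoDto {
    private String hostname;
    private int cpuCount;

    public ServerInfoDto() { }

    // Translation from the internal class happens once, at the REST boundary.
    public ServerInfoDto(ServerInfo source) {
        this.hostname = source.getHostname();
        this.cpuCount = source.getCpuCount();
    }

    public String getHostname() { return hostname; }
    public void setHostname(String hostname) { this.hostname = hostname; }
    public int getCpuCount() { return cpuCount; }
    public void setCpuCount(int cpuCount) { this.cpuCount = cpuCount; }
}
```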

Arguments for #1:

  • DRY (don't repeat yourself)
  • Faster to implement
  • Easier to upgrade REST API

Arguments for #2:

  • What if REST API needs to be versioned separately from Java API? (This is somewhat likely.)
  • What if there are significant changes to the Java data classes such as removal of properties, adding of behavior, or changes to class hierarchy? (This is also somewhat likely.)

Bottom line is that it seems like a tradeoff between DRY (#1) and decoupling (#2).

I'm leaning toward starting with #1 and then, if problems arise, moving to #2 later, following the agile guideline of not building what you can't prove you need. Is this a bad idea? Should I start with #2 if I think I may end up there anyway?

Are there major arguments/consequences missing from my lists?

Best Answer

Good question. Simply put: decouple. That's the way to go here; you do not want to be tied to a particular version of the Java API.

There's one scenario where you don't need to decouple: when your technology sends non-type-specific objects over the wire. In that case you can use the current Java objects now (the YAGNI route), and replacing them later with custom types you write yourself is a simple drop-in, because no Java type information crosses the wire to break anything. So, in short: if type information doesn't cross the wire, you could take the YAGNI route here.

Just be very careful and watchful that upgrades to your Java API don't change any of these objects. Otherwise, decouple now and don't worry about it; if you have a choice, it mostly comes down to how much time you have.

However, if type information does go across the wire in your protocol, then a flat drop-in of new types when the underlying Java types change may not be possible and may instead become a larger effort. If that's the case, going YAGNI now means accruing technical debt and risk tied to your underlying technology.
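As a rough illustration of the "no type information on the wire" case, here is a sketch assuming Jackson as the marshalling technology (only one possible choice) and the hypothetical `ServerInfo` bean from above. The JSON carries property names and values only, so a later DTO that serializes to the same shape is a transparent drop-in for clients:

```java
import com.fasterxml.jackson.databind.ObjectMapper;

public class WireFormatDemo {
    public static void main(String[] args) throws Exception {
        ServerInfo info = new ServerInfo();   // existing Java API bean (hypothetical)
        info.setHostname("build-server-01");
        info.setCpuCount(8);

        // Output is plain JSON with no Java class name in it:
        // {"hostname":"build-server-01","cpuCount":8}
        String json = new ObjectMapper().writeValueAsString(info);
        System.out.println(json);

        // A ServerInfoDto that produces the same shape would be
        // indistinguishable to clients; swapping it in stays a local change.
    }
}
```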

Personally I would just decouple now.
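For a sense of what the decoupled version costs in practice, here is a minimal JAX-RS sketch, again using the hypothetical `ServerInfo`/`ServerInfoDto` classes from above; `ServerService` is a made-up stand-in for however you reach the existing Java API:

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/servers")
public class ServerResource {

    // Hypothetical stand-in for the existing Java API's lookup mechanism.
    static class ServerService {
        ServerInfo findByName(String name) {
            ServerInfo info = new ServerInfo();
            info.setHostname(name);
            info.setCpuCount(4);
            return info;
        }
    }

    private final ServerService serverService = new ServerService();

    @GET
    @Path("/{name}")
    @Produces(MediaType.APPLICATION_JSON)
    public ServerInfoDto getServer(@PathParam("name") String name) {
        ServerInfo internal = serverService.findByName(name); // internal Java API object
        return new ServerInfoDto(internal);                   // translate at the boundary
    }
}
```

The extra code is mostly this thin translation step at the boundary, which is also the single place you would absorb future changes to the Java data classes.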

Also, DRY doesn't really play into this: you didn't write the underlying pieces, so you aren't duplicating code, and therefore you won't have the pain of bugs repeated all over, which is the main concern of DRY (nor the general maintainability issues, which again you won't have because you don't have duplicates to maintain).
