The N-Layer POCO/DTO quandary

Tags: .net, orm, separation-of-concerns

Back when there were only the (evil) DataSets and the Microsoft application blocks, the objects you passed between layers were either DataSets/DataTables or DTOs/POCOs. I belong to the camp that prefers DTOs/POCOs.

Now, with this sudden wave of OR/M layers like SubSonic, Entity Framework and NHibernate, should I still be using my favourite POCOs? That is what I mostly do today: when working with ASP.NET WebForms, 99% of the time I end up binding controls through an ObjectDataSource rather than through the data source controls specific to each framework.
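For concreteness, here is a minimal sketch of that binding style (not from the original post, and the CustomerDto/CustomerRepository names are invented): a plain POCO plus a repository class, with the ObjectDataSource needing only a type name and a select method.

    using System.Collections.Generic;

    namespace MyApp
    {
        // A plain POCO/DTO: no base class, no ORM references, trivially serializable.
        public class CustomerDto
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public string City { get; set; }
        }

        // Any class with a parameterless constructor and a public select method
        // can back an ObjectDataSource.
        public class CustomerRepository
        {
            public IList<CustomerDto> GetCustomers()
            {
                // Fetch via the data layer of choice (ADO.NET, SubSonic, NHibernate, EF, ...)
                // and map the results into DTOs here.
                return new List<CustomerDto>();
            }
        }
    }

    /* In the .aspx page, the ObjectDataSource points at the repository:

       <asp:ObjectDataSource ID="CustomersSource" runat="server"
           TypeName="MyApp.CustomerRepository"
           SelectMethod="GetCustomers" />

       <asp:GridView ID="CustomersGrid" runat="server"
           DataSourceID="CustomersSource" AutoGenerateColumns="true" />
    */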

Should I give up this love for POCOs, pass around IQueryables or entities instead, and make use of the other data source controls?

What are the pros and cons of using these objects instead of DTOs? How will it affect my application's design and performance?

EDIT: When would I get to use the other data sources, such as LinqDataSource and EntityDataSource?
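For comparison, those framework-specific controls bind straight to the ORM model rather than to POCOs. A rough sketch of the markup, with the context and container names (NorthwindDataContext, NorthwindEntities) invented for illustration:

    <!-- LINQ to SQL: LinqDataSource exposes a table off a DataContext. -->
    <asp:LinqDataSource ID="CustomersLinqSource" runat="server"
        ContextTypeName="MyApp.NorthwindDataContext"
        TableName="Customers" />

    <!-- Entity Framework: EntityDataSource exposes an entity set off the model. -->
    <asp:EntityDataSource ID="CustomersEntitySource" runat="server"
        ConnectionString="name=NorthwindEntities"
        DefaultContainerName="NorthwindEntities"
        EntitySetName="Customers" />

    <asp:GridView ID="CustomersGrid" runat="server"
        DataSourceID="CustomersLinqSource" AutoGenerateColumns="true" />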

Best Answer

Long live POCOs and DTOs. They are lightweight, easily serializable, strongly typed and bindable. I have never had any performance issues with them, and they have always kept my higher-level code cleaner.
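To make the "cleaner higher-level code" point concrete, here is a hedged sketch of the usual compromise: keep the ORM entities and the IQueryable inside the data layer, and project into DTOs at the layer boundary. All type names here (Order, Customer, OrderSummaryDto, OrderQueries) are invented for illustration.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Stand-ins for ORM entities (in a real app these would come from EF/NHibernate/SubSonic).
    public class Customer { public int Id { get; set; } public string Name { get; set; } }
    public class Order
    {
        public int Id { get; set; }
        public DateTime PlacedOn { get; set; }
        public decimal Total { get; set; }
        public Customer Customer { get; set; }
    }

    // The DTO the UI layer binds to: flat, serializable, no ORM baggage.
    public class OrderSummaryDto
    {
        public int OrderId { get; set; }
        public string CustomerName { get; set; }
        public decimal Total { get; set; }
    }

    public static class OrderQueries
    {
        // Works against any IQueryable source (an ORM table or an in-memory list
        // via AsQueryable), projecting into DTOs so entities never leak upwards.
        public static IList<OrderSummaryDto> RecentOrders(IQueryable<Order> orders, int count)
        {
            return orders
                .OrderByDescending(o => o.PlacedOn)
                .Take(count)
                .Select(o => new OrderSummaryDto
                {
                    OrderId = o.Id,
                    CustomerName = o.Customer.Name,
                    Total = o.Total
                })
                .ToList();
        }
    }

The resulting DTO list can be handed to an ObjectDataSource, serialized across a service boundary, or cached, without the ORM context tagging along.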