Switching data access strategy far into a project

Tags: auto-generate, data-access-layer, manual, .net

In a project we have implemented the data access layer (DAL) with a visual designer that auto-generates a lot of code (in our case: strongly typed DataSets and TableAdapters in .NET).
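
For context, using the generated code looks roughly like this — a minimal sketch where `ProjectDataSet` and `CustomersTableAdapter` are placeholder names standing in for whatever the designer actually generated in our project:

```csharp
// Sketch of the designer-generated approach in use. ProjectDataSet and
// CustomersTableAdapter are hypothetical names; the real ones come from
// the .xsd designer files in the project.
public static class GeneratedDalExample
{
    public static void RenameFirstCustomerCity()
    {
        var customers = new ProjectDataSet.CustomersDataTable();
        var adapter = new CustomersTableAdapter();

        adapter.Fill(customers);       // generated SELECT
        customers[0].City = "Oslo";    // strong-typed column access
        adapter.Update(customers);     // generated UPDATE for the changed row
    }
}
```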

However, I find it troublesome to edit and add new things to the DAL, and hard to see the changes in source control. We have started coding new data access by manually writing the SQL statements (in our case: ADO.NET SqlCommands and the like), which seems cleaner to me to edit, and especially to review in source control.
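
For illustration, here is a minimal sketch of that hand-written style (the table and column names are placeholders, not from our actual schema):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class CustomerQueries
{
    // Hand-written data access returning a DataTable, matching how we
    // currently handle model data.
    public static DataTable GetCustomersByCity(string connectionString, string city)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name, City FROM Customers WHERE City = @city", connection))
        {
            // Parameterized to avoid SQL injection.
            command.Parameters.Add("@city", SqlDbType.NVarChar, 100).Value = city;

            var result = new DataTable();
            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(result);   // Fill opens and closes the connection itself
            }
            return result;
        }
    }
}
```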

But I'm also worried about mixing the two methods of data access. What would you suggest? Stick with the auto-generation method, continue converting to manual SQL statements when changes are needed, or something else?

Edit: Inspired by the nice answers that address the general problem of switching data access strategy, I have generalized the formulation of the question.

The handling of the model data is not very object-oriented: we use .NET DataTables instead of custom objects.
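
To illustrate the difference, a small contrast sketch (the `Customer` class below is hypothetical; we have no such types today):

```csharp
using System.Data;

// Hypothetical domain object for contrast with the DataTable style.
public class Customer
{
    public string Name { get; set; }
}

public static class ModelAccessContrast
{
    public static string NameFromTable(DataTable customers)
    {
        // DataTable access: the column is addressed by string,
        // so a typo in "Name" fails only at runtime.
        return (string)customers.Rows[0]["Name"];
    }

    public static string NameFromObject(Customer customer)
    {
        // Custom-object access: a typo in .Name fails at compile time.
        return customer.Name;
    }
}
```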

Best Answer

If the choice is between converting to manual ADO.NET and continuing with DataSets and TableAdapters, I think you're better off staying with the DataSets. You get CRUD for free, and thus spend less time writing and maintaining SQL that doesn't add any value.

From the way you phrase your question, it doesn't sound like you're going for a more object-oriented approach either; if you were, that would be an argument for moving away from the DataSet-and-TableAdapter approach.

You might also want to do some research or prototyping in the OR-mapper-plus-plain-objects space if you have increasingly complex business logic to handle, though it's less effective for the RAD approach. If I were you, I'd check out LINQ to SQL (if you have a simple schema and object structure and are happy with a 1:1 mapping between objects and tables) or NHibernate. Entity Framework isn't mature enough yet; the next version will be better, but it is still potentially a long time coming.
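
To give an idea, here's a minimal LINQ to SQL sketch, assuming a designer-generated `NorthwindDataContext` whose `Customers` table is mapped 1:1 to a `Customer` class (the names are illustrative, not from the question):

```csharp
using System.Linq;

public static class LinqToSqlExample
{
    public static void TrimContactNames()
    {
        // NorthwindDataContext is a hypothetical designer-generated DataContext.
        using (var db = new NorthwindDataContext())
        {
            var osloCustomers =
                from c in db.Customers
                where c.City == "Oslo"
                select c;

            // Entities are change-tracked; SubmitChanges() generates the UPDATEs.
            foreach (var customer in osloCustomers)
                customer.ContactName = customer.ContactName.Trim();

            db.SubmitChanges();
        }
    }
}
```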
