As long as you can turn on MSDTC on all of the clients and servers that the transaction involves, MSDTC works like magic. It is about as simple as wrapping all of your calls in a TransactionScope and calling TransactionScope.Complete() only once everything has succeeded. Definitely look into the security/performance issues this can cause in certain circumstances, though. It is generally inadvisable to run distributed transactions over the internet, since that widens the attack surface, but it works fine on an intranet.
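For illustration, here is a minimal sketch of that pattern, assuming two SQL Server connections; the connection strings, table, and amounts are hypothetical placeholders:

    using System.Data.SqlClient;
    using System.Transactions;

    static void TransferFunds(string sourceConnStr, string destConnStr)
    {
        using (var scope = new TransactionScope())
        {
            using (var source = new SqlConnection(sourceConnStr))
            using (var debit = new SqlCommand(
                "update Accounts set Balance = Balance - 100 where Id = 1", source))
            {
                source.Open(); // enlists in the ambient transaction
                debit.ExecuteNonQuery();
            }

            using (var dest = new SqlConnection(destConnStr))
            using (var credit = new SqlCommand(
                "update Accounts set Balance = Balance + 100 where Id = 1", dest))
            {
                dest.Open(); // a second server escalates the transaction to MSDTC
                credit.ExecuteNonQuery();
            }

            // The commit only happens here; if an exception skips this call,
            // Dispose rolls the whole distributed transaction back.
            scope.Complete();
        }
    }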
It depends a bit on how much abstraction you need. Everything is a compromise; for example, EF and NHibernate introduce great flexibility for representing the data in interesting and exotic models - but as a result they do add overhead. Noticeable overhead.
If you don't need to be able to switch between database providers, or support different per-client table layouts, and if your data is primarily read, and if you don't need to be able to use the same model in EF, SSRS, ADO.NET Data Services, etc - then, if absolute performance is your key measure, you could do far worse than look at dapper. In our tests of both LINQ-to-SQL and EF, we find that EF is significantly slower in terms of raw read performance, presumably due to the abstraction layers (between the storage model etc) and materialization.
Here at SO, we are obsessive-compulsive about raw performance, and we're happy to take the development hit of losing some abstraction in order to gain speed. As such, our primary tool for querying the database is dapper. It even allows us to use our pre-existing LINQ-to-SQL model, but simply: it is heaps faster. In performance tests, it delivers essentially the same performance as writing all the ADO.NET code (parameters, data-readers etc) manually, but without the risk of getting a column name wrong. It is, however, SQL based (although it is happy to use SPROCs if that is your chosen poison). The advantage of this is that there is no additional processing involved, but it is a system for people who like SQL. Which I consider: not a bad thing!
A typical query, for example, might be:

    int customerId = ...
    var orders = connection.Query<Order>(
        "select * from Orders where CustomerId = @customerId",
        new { customerId }).ToList();
which is convenient, injection-safe, etc - but without tons of data-reader goo. Note that while it can handle both horizontal and vertical partitions to load complex structures, it will not support lazy-loading (but: we're big fans of very explicit loading - fewer surprises).
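As an aside, the "complex structures" case above typically goes through dapper's multi-mapping overload, which splits one joined row into multiple objects; a hedged sketch follows (the Order/Customer classes and column layout are my assumptions, not from the original):

    using System.Linq;
    using Dapper;

    // Hypothetical POCOs matching the columns returned by the query below.
    public class Customer { public int Id { get; set; } public string Name { get; set; } }
    public class Order
    {
        public int Id { get; set; }
        public int CustomerId { get; set; }
        public Customer Customer { get; set; }
    }

    // One row per order, split into two objects at the second "Id" column.
    var orders = connection.Query<Order, Customer, Order>(
        @"select o.*, c.*
          from Orders o
          join Customers c on c.Id = o.CustomerId
          where o.CustomerId = @customerId",
        (order, customer) => { order.Customer = customer; return order; },
        new { customerId },
        splitOn: "Id").ToList();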
Note in this answer I am not saying that EF isn't suitable for high-volume work; simply: I know that dapper is up to it.
Best Answer
I would like to point out that Entity Framework (full name: ADO.NET Entity Framework) is an ORM (Object Relational Mapper) that uses ADO.NET under the hood for connecting to the database. So the question "should we use ADO.NET or EF?" doesn't really make sense in that respect. Unless you re-architect your application, adding EF to the mix is simply adding another layer on top of ADO.NET.
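To make that concrete, below is roughly the kind of ADO.NET plumbing that any ORM ends up executing underneath; the connection string, table, and Order class are placeholder assumptions:

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static List<Order> GetOrders(string connStr, int customerId)
    {
        var orders = new List<Order>();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "select Id, CustomerId from Orders where CustomerId = @customerId", conn))
        {
            cmd.Parameters.AddWithValue("@customerId", customerId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    orders.Add(new Order
                    {
                        Id = reader.GetInt32(0),        // ordinals/names are easy to get wrong -
                        CustomerId = reader.GetInt32(1) // exactly the boilerplate an ORM hides
                    });
                }
            }
        }
        return orders;
    }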
I sincerely doubt that ADO.NET is the source of your performance problems, however. It sounds like the problem exists in your stored procedures themselves.
I think Luc Franken's comments are on the right track, though. You need to measure and determine exactly where the delays are happening. Then concentrate on fixing exactly that problem. Anything else is groping around in the dark.
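As a crude starting point, even a Stopwatch around the database call will separate time spent inside the stored procedure from time spent elsewhere in the application; the procedure name and connection string here are placeholders:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Diagnostics;

    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand("dbo.usp_GetReportData", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        conn.Open();

        var sw = Stopwatch.StartNew();
        cmd.ExecuteNonQuery();   // or ExecuteReader plus full materialization
        sw.Stop();

        // If this number is large, the problem is in the procedure/query itself,
        // not in ADO.NET or whatever ORM sits on top of it.
        Console.WriteLine($"usp_GetReportData: {sw.ElapsedMilliseconds} ms");
    }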