The "fastest" way to do data operations is a `SqlConnection` working with a sproc (assuming the sprocs themselves are efficient). Period. Using an ORM like Linq to Sql, Entity Framework, or NHibernate will not make your code run faster. Furthermore, if an ORM is not used properly, a single task may trigger many database calls (the classic N+1 problem) - something that would not be an issue with "normal" data access.
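To make the "raw" approach concrete, here is a minimal ADO.NET sketch calling a stored procedure. The connection string, the sproc name (`dbo.GetOrdersForCustomer`), its `@CustomerId` parameter, and the result-set columns are all made up for illustration - substitute your own:

```csharp
using System;
using System.Data;
using Microsoft.Data.SqlClient; // System.Data.SqlClient on older stacks

public static class OrderData
{
    public static void PrintOrders(string connectionString, int customerId)
    {
        using var conn = new SqlConnection(connectionString);
        using var cmd = new SqlCommand("dbo.GetOrdersForCustomer", conn)
        {
            CommandType = CommandType.StoredProcedure
        };
        cmd.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;

        conn.Open();
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            // One round trip, one result set - no per-row queries.
            Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetDecimal(1):C}");
        }
    }
}
```

Note the single round trip: the sproc returns everything in one result set, which is exactly the behavior a lazily-loading ORM can silently break.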
Does that mean ORMs are useless? Definitely not. As you said, writing a custom data layer requires a lot of extra coding - and, ideally, a lot of extra testing. Not just simple unit tests, either; the number of integration tests would increase as well. There's a real productivity trade-off to consider here.
ORMs - when used properly - can maintain a decent performance level (although slower than raw data access) while promoting patterns that raise the overall quality of your code. They encourage consistency and easier testability. You could, for example, keep your data access in a repository and the business code in a service layer. That way you can unit test the service layer and reserve integration tests for the data layer. Granted, something like this could definitely be done without an ORM - but it's far easier with one.
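A sketch of that repository/service split. All the names here (`ICustomerRepository`, `CustomerService`, `FakeCustomerRepository`) are hypothetical; the point is that the business logic depends only on an interface, so it can be unit tested with an in-memory fake while the real repository (ORM-backed or not) gets integration tests:

```csharp
using System.Collections.Generic;
using System.Linq;

public record Customer(int Id, string Name, bool IsActive);

// The data layer hides the ORM (or raw SQL) behind this interface;
// its concrete implementation is covered by integration tests.
public interface ICustomerRepository
{
    IEnumerable<Customer> GetAll();
}

// Business logic depends only on the interface - no database needed
// to unit test it.
public class CustomerService
{
    private readonly ICustomerRepository _repo;
    public CustomerService(ICustomerRepository repo) => _repo = repo;

    public IReadOnlyList<Customer> GetActiveCustomers() =>
        _repo.GetAll().Where(c => c.IsActive).ToList();
}

// Trivial in-memory fake for unit tests.
public class FakeCustomerRepository : ICustomerRepository
{
    private readonly List<Customer> _data;
    public FakeCustomerRepository(IEnumerable<Customer> data) => _data = data.ToList();
    public IEnumerable<Customer> GetAll() => _data;
}
```

Swapping the fake for an EF- or NHibernate-backed implementation requires no change to `CustomerService` - that's the testability win.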
ORMs are slower than raw data access, but they buy you maintainability and testability. I use ORMs extensively, and if the performance hit is holding you back, consider this: the .NET Framework and the JRE are many times slower than raw C code, and far slower still than hand-written x86 assembly - yet most of us happily accept that trade-off every day.