My opinion is that Code First's automatic database creation is only for development. I answered similar questions on Stack Overflow where I described both how to upgrade the database and why automatic functionality is bad in production:
Upgrading the database is a semi-manual task. There should be no untested automatic magic behind it; moreover, EF 4.1 currently doesn't have any such magic available (there is only a presentation about features the ADO.NET team is working on).
You can also check this question to better understand how web sites are upgraded.
Personally, I've tried making one huge schema for all my entities on a fairly complex but small project (~300 tables). We had an extremely normalized database (fifth normal form, and I say that loosely) with many many-to-many relationships and extreme referential integrity enforcement.
We also used a "single instance per request" strategy, which I'm not convinced helped either.
When doing simple, reasonably flat, explicitly defined listings, lookups, and saves, the performance was generally acceptable. But when we started digging into deep relationships, the performance seemed to take drastic dips. Compared to a stored proc in this instance, there was no comparison (of course). I'm sure we could've tweaked the code base here and there to get the performance improved; however, in this case we just needed a performance boost without analysis due to time constraints, so we fell back to the stored proc (still mapped through EF, because EF provided strongly typed results). We only needed that fallback in a few areas. When we had to traverse all over the database to create a collection (using .Include() unsparingly), the performance degraded noticeably, but maybe we were asking too much.
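To make the trade-off concrete, here's a minimal sketch of the two approaches against the EF 4.1 DbContext API. The model (`ShopContext`, `Customer`/`Order`/`Item`, and the `dbo.GetOrderSummaries` stored procedure) is entirely hypothetical, invented for illustration:

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Illustrative model only -- names are not from the original project.
public class Customer { public int Id { get; set; } public ICollection<Order> Orders { get; set; } }
public class Order    { public int Id { get; set; } public ICollection<Item> Items { get; set; } }
public class Item     { public int Id { get; set; } }

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

// Plain class used to map the stored proc's result set (still strongly typed).
public class OrderSummary
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

public static class Queries
{
    // Deep eager loading: convenient, but each Include() widens the generated
    // SQL, which is where the drastic performance dips showed up.
    public static Customer LoadDeep(ShopContext db, int customerId)
    {
        return db.Customers
                 .Include("Orders.Items")   // traverses two levels of relationships
                 .Single(c => c.Id == customerId);
    }

    // Fallback: a stored procedure run through EF, so results stay strongly
    // typed. 'dbo.GetOrderSummaries' is a hypothetical proc name.
    public static List<OrderSummary> LoadFast(ShopContext db, int customerId)
    {
        return db.Database
                 .SqlQuery<OrderSummary>("EXEC dbo.GetOrderSummaries @p0", customerId)
                 .ToList();
    }
}
```

The point isn't that stored procs always win; it's that mapping them through EF lets you keep the typed surface while sidestepping the generated SQL in the few hot spots that need it.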
So based on my experience, I would recommend creating a separate .edmx per intent. Only generate what you'll be using based on the scope of that need. You may have some smaller-scoped .edmx files for specific tasks, and then some large ones where you need to traverse complex relationships to build objects. I'm not sure where that magic spot is, but I'm sure there is one... lol...
Honestly though, aside from a few pitfalls we kind of saw coming (complex traversing), the huge .edmx worked fine from a "working" perspective. But you'll have to watch out for the "fixup" magic that the context does behind the scenes if you don't explicitly disable it, as well as keeping the .edmx in sync when changes are made to the database. It was sometimes easier to wipe the entire surface and re-create the entities, which took about 3 minutes, so it wasn't a big deal.
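For reference, these are the two knobs on the EF 4.1 DbContext that tame the behind-the-scenes fixup and change-detection work (`ShopContext` and `Customers` are illustrative names, not from the original project):

```csharp
using System.Data.Entity;
using System.Linq;

public static class ReadOnlyListing
{
    public static void Run()
    {
        using (var db = new ShopContext())   // hypothetical DbContext subclass
        {
            // Stops EF from scanning every tracked entity for changes on each
            // query/Add/Attach call -- a big win when loading large graphs.
            db.Configuration.AutoDetectChangesEnabled = false;

            // For read-only listings, skip change tracking (and fixup) entirely.
            var customers = db.Customers.AsNoTracking().ToList();

            // If you modified entities while detection was off, call
            // db.ChangeTracker.DetectChanges() (or re-enable the flag)
            // before SaveChanges(), or those edits won't be picked up.
            db.Configuration.AutoDetectChangesEnabled = true;
        }
    }
}
```

`AsNoTracking()` is the cheaper default for anything you'll never save back; toggling `AutoDetectChangesEnabled` is for the cases where you're tracking a lot of entities but know when changes actually happen.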
This was all with EntityFramework 4.1. I'd be really interested in hearing about your end choice and experience as well.
And regarding your question on NHibernate, that's a can-of-worms question in my opinion; you'll get barking on both sides of the fence... I hear a lot of people bashing EF for the sake of bashing, without working through the challenges and understanding the nuances unique to EF itself. And although I've never used NHibernate in production, generally, if you have to manually and explicitly create things like mappings, you're going to get finer control; however, if I can drag-n-drop, generate, and start CRUD'ing and querying using LINQ, I could give a crap about granularity.
I hope this helps.
Best Answer
The answer is YES, of course you can use Code First in production. I have seen a lot of projects succeed with it. You could use EF's migration engine; there are lots of examples around. Or you could try another migration framework, like FluentMigrator.
And there is no danger in using migrations in production if you follow good deployment practices: e.g. you have DEV and STAGE servers where you can test your migrations before production, and you back up the database before each PROD release. Ideally, all of these processes are managed from your Continuous Integration server.
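As a sketch of that workflow: these are the standard Package Manager Console commands for EF's migration engine (note it shipped after EF 4.1, in EF 4.3). The migration name below is illustrative, and the `-Script` variant is what you'd feed to STAGE/PROD via your CI pipeline rather than letting the tool touch those databases directly:

```
# One-time setup: scaffolds a Migrations folder in the project.
Enable-Migrations

# Scaffold a migration from pending model changes (name is illustrative).
Add-Migration AddCustomerEmailColumn

# Apply pending migrations to the DEV database.
Update-Database

# For STAGE/PROD: generate a SQL script instead, so it can be reviewed,
# backed by a fresh DB backup, and run as part of the CI/CD release.
Update-Database -Script -SourceMigration $InitialDatabase
```

Keeping production on the scripted path is what makes the practice in the answer above safe: the same script you tested on STAGE is the one that runs on PROD.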