Copying 100 billion records per day to Oracle – how?

oracle

I have received a request to make a 100-billion-record database available on Oracle as well. Unfortunately, replication is not an option, so I guess the simplest solution is to create a new Oracle database and copy all the data into it once per day.

What kind of Oracle server would handle this well? Is there anything specific I need to take care of?

Best Answer

Not enough detail to give a quality answer, but I think 'server' is going to be 'servers'.

If you have 100,000,000,000 records at 100 bytes each, that's 9,536,743 MB (roughly 10 TB) per day, before any incidental I/O for indexing, etc. Divide that by the 86,400 seconds in a day and you get about 110 MB per second, and even that assumes an even distribution of the load over a full 24 hours. That's right about at the theoretical maximum for GigE.
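To make the arithmetic easy to check, here is a minimal back-of-the-envelope sketch in Python. The 100-byte average record size is the assumption made above, and the constant names are illustrative, not from the original:

```python
# Back-of-envelope throughput estimate for the daily copy.
# BYTES_PER_RECORD is the assumed average row size from the answer above.
RECORDS_PER_DAY = 100_000_000_000
BYTES_PER_RECORD = 100
SECONDS_PER_DAY = 24 * 60 * 60              # 86,400

bytes_per_day = RECORDS_PER_DAY * BYTES_PER_RECORD
mib_per_day = bytes_per_day / 2**20         # binary megabytes, matching the figure above
mib_per_second = mib_per_day / SECONDS_PER_DAY

# Gigabit Ethernet: 10^9 bits/s = 125 MB/s decimal, ~119 MiB/s binary.
gige_ceiling_mib_s = 1_000_000_000 / 8 / 2**20

print(f"{mib_per_day:,.0f} MB/day")         # 9,536,743 MB/day
print(f"{mib_per_second:,.1f} MB/s")        # ~110.4 MB/s sustained
print(f"GigE ceiling: {gige_ceiling_mib_s:,.1f} MB/s")
```

Sustaining ~110 MB/s against a ~119 MB/s link ceiling leaves essentially no headroom, which is why the numbers point toward multiple servers or faster interconnects.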

In other words, even under these simple assumptions you're going to be maxing out 'normal' network bandwidth and disk I/O.

Something tells me that you really want to think this design through.