I work in a small programming team supporting a larger organisation. This year our manager has decided we are going to use Oracle APEX to handle the vast majority of our company data.
This would be fine, except we only have one APEX server, and our manager has decreed that everything happens in that one instance. Our team is developing apps, our manager demos them, and our internal clients use them, all at the same time, which for obvious reasons is already causing problems!
I can only expect this to get worse as we become more heavily invested in APEX, the apps grow more complex, and the number of users grows. I've heard that best practice is to have separate development, testing, and production environments, but why is this the case?
The question: Why should we have separate development, testing, and production environments?
Best Answer
You have several activities going on concurrently: developers building new apps, your manager demoing them, and internal clients doing real day-to-day work with them.
Do you want all of these happening in the same environment? Do you want the business to grind to a halt because a new test has pushed your servers into swapping to disk and is consuming every core on the processor? Do you want your tests to grind to a halt because a developer made a convoluted fork bomb out of a scaling experiment? Do you want code that you only thought worked, because it was held together with a developer's twine and duct tape in the test environment, to run in production? Do you want developers working with potentially sensitive production data (I know this isn't a concern in all businesses, but it is in a lot of them)?
What prevents these issues from happening?
Separate environments.
So what do you need?
You need separate environments.
To put it formally
You need separate environments for the following reasons:

- Performance isolation: a runaway test or scaling experiment cannot starve production of CPU, memory, or disk.
- Stability for users: internal clients are not interrupted by half-finished development work or demos.
- Quality gates: code has to prove itself in testing before it is allowed anywhere near production.
- Data protection: developers do not have to work directly with potentially sensitive production data.
For your context, a new technology platform
Maybe this isn't truly production yet (since the platform is still new to you), but you'll get your separate environments once the business starts to rely on it, either because someone is wise enough to foresee the risk or because you all learn it the hard way.
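To make the idea concrete, here is a minimal sketch of one common way to keep environments apart in application code: every connection detail is looked up by environment name, never hard-coded. The `APP_ENV` variable and the hostnames below are hypothetical placeholders, not anything from your actual setup.

```python
# Minimal sketch: resolving a database connection string per environment.
# APP_ENV and these hostnames are made-up examples; substitute your own.
import os

DSN_BY_ENV = {
    "dev":  "apexdev.internal:1521/devpdb",
    "test": "apextest.internal:1521/testpdb",
    "prod": "apexprod.internal:1521/prodpdb",
}

def dsn_for(env=None):
    """Return the connection string for the requested environment.

    Defaults to the APP_ENV environment variable, falling back to 'dev'
    so a developer can never accidentally point at production.
    """
    env = env or os.environ.get("APP_ENV", "dev")
    if env not in DSN_BY_ENV:
        raise ValueError(f"Unknown environment: {env!r}")
    return DSN_BY_ENV[env]
```

The point of the fallback to `"dev"` is deliberate: the safe environment is the default, and reaching production requires an explicit, visible choice.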