I recently worked for a company that had a single environment: yes, even development was done against production databases. I worked there for a while, and at first I was aghast at how weird it was. Then, when I left and went back to multiple environments, I was aghast all over again at how complex it all is, not to mention that there isn’t much ‘force’ pushing you to learn how to recover from mistakes in production (which led to more downtime than I ever saw with one environment). I was curious about when multiple environments became ‘standard’, but it seems to have been the norm for as long as the internet can remember. Can someone who has been writing software since before the internet recall why we started doing things this way?
Story Published at: April 6, 2023 at 01:10PM