I disagree with Brian's answer on that other question.
As far as I know, there's no implicit concept in any other language. Scala's change history implies that implicits were a generalization of view bounds, and a view bound is, itself, a generalization of automatic type conversion, which is very common indeed.
Implicits then enabled type classes, but I'd be very surprised if that was the original intent.
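To make the lineage concrete, here is a minimal sketch of the kind of automatic conversion that view bounds generalized and that implicit conversions now express (the `Meters` type and all names are mine, purely illustrative):

```scala
import scala.language.implicitConversions

object ViewDemo {
  case class Meters(value: Int) {
    def +(other: Meters): Meters = Meters(value + other.value)
  }

  // The "view": wherever an Int appears but a Meters is needed,
  // the compiler may insert this conversion.
  implicit def intToMeters(i: Int): Meters = Meters(i)

  // Int has no `+` accepting Meters, so the compiler rewrites this
  // to intToMeters(3) + Meters(4).
  val total: Meters = 3 + Meters(4)
}
```

This is the same job a view bound (`def f[A <% Meters](a: A)`) used to do, expressed through the more general implicit mechanism.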
Edit
The release notes for Scala 2.0 (2006) say:
> Views in Scala 1.0 have been replaced by the more general concept of implicit parameters
That doesn't mean, of course, that implicit parameters were introduced with the goal of replacing views.
However, Odersky clearly likes it when one concept can replace multiple ones. In that sense, it may well be the case that Odersky wanted type classes, but did not want to introduce a mechanism exclusively for them, and therefore came up with something that let him remove one concept (views) and replace it with a more generic one that handles both views and type classes.
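As a sketch of how the one mechanism covers the second use case, here is the implicit-parameter encoding of a type class (illustrative names of my own, not taken from Odersky's paper):

```scala
object TypeClassDemo {
  // The type class: a capability described as a trait.
  trait Show[A] {
    def show(a: A): String
  }

  // An instance for Int, marked implicit so the compiler can supply it.
  implicit val intShow: Show[Int] = new Show[Int] {
    def show(a: Int): String = "Int(" + a.toString + ")"
  }

  // The implicit parameter list: callers need not pass an instance;
  // the compiler finds one for the concrete A.
  def display[A](a: A)(implicit s: Show[A]): String = s.show(a)
}
```

Calling `TypeClassDemo.display(42)` compiles because the compiler resolves `intShow` implicitly; there is no view or conversion involved, yet the same `implicit` keyword drives both patterns.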
In case anyone is interested, Poor Man's Type Classes, referred to by Brian back at Stack Overflow, is dated 2006 as well. Clearly, Odersky was aware of the link between implicits and type classes when he introduced them.
Yet, I stand by my claim. :-)
Undefined behavior is used in situations where it is not feasible for the spec to specify the behavior, and it has always been written to allow absolutely any behavior possible.
The extremely loose rules for UB are helpful when you think about what a spec-conforming compiler must go through. You may have sufficient analysis horsepower to emit an error when you write some bad UB in one case, but add a few layers of recursion and now the best you can do is a warning. The spec has no concept of "warnings," so if the spec had mandated a behavior, it would have had to be "an error."
The reason we see more and more side effects of this is the push for optimization. Writing a spec-conforming optimizer is hard. Writing a spec-conforming optimizer which also happens to do a remarkably good job of guessing what you intended when you went outside the spec is brutal. It is much easier on the compilers if they get to assume UB means UB.
This is especially true for gcc, which tries to support many, many instruction sets with the same compiler. It is far easier to let UB yield UB behaviors than it is to try to grapple with all the ways every single piece of UB code could go wrong on every platform, and factor that into the early phases of the optimizer.
The origins of indented code can probably be found in ALGOL: