Is it still worthwhile to learn, for example, how to tune a 555 timer with resistors and capacitors, when you can write a timer program for a microcontroller in a human-readable programming language?
Or, to put it another way, are there problems that ICs are good for that microcontrollers are not?
Mostly, microcontrollers have replaced discrete ICs. I find that even if I could design a circuit with a 555, it's likely that the same circuit will need to be tweaked in a few weeks to do something else, and a micro preserves that flexibility.
But there are a few exceptions.
Discrete logic is still faster than most microcontrollers. The propagation delay and switching times for discrete logic are in the 1-10 ns range. To match that with a microcontroller, you have to be able to implement whatever logic you need in 1 instruction, and have a clock in the 100 MHz to 1 GHz range. You can do that, but maybe not on a breadboard in your garage.
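To make that concrete, here's a minimal sketch (register names `PORT_IN`/`PORT_OUT` are placeholders, not any real part's I/O map) of emulating a 2-input NAND gate in a polling loop, plus the back-of-envelope latency math:

```c
#include <stdint.h>

/* Stand-ins for memory-mapped I/O registers -- hypothetical names. */
volatile uint8_t PORT_IN, PORT_OUT;

/* One pass of a software NAND gate: read both inputs, write the output. */
void nand_poll_once(void) {
    uint8_t in = PORT_IN;
    uint8_t a = in & 1;
    uint8_t b = (in >> 1) & 1;
    PORT_OUT = (uint8_t)!(a && b);
}

/* Worst-case response time in nanoseconds for a polling loop that takes
 * `cycles` instruction cycles per pass at a clock of `clock_hz`. */
double poll_latency_ns(unsigned cycles, double clock_hz) {
    return cycles * 1e9 / clock_hz;
}
```

Even if the loop body compiles down to about 5 cycles, a 16 MHz micro responds in roughly 5 / 16 MHz ≈ 312 ns, two orders of magnitude slower than the 1-10 ns of a discrete gate; at 1 GHz and 1 cycle you're finally in the 1 ns range.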
A good example of this is the HCTL2020 quadrature decoder. It takes in the two quadrature pulse trains from a motor encoder and tells you which way the motor is spinning. It's implemented as a non-programmable chip for the sake of speed.
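In software, quadrature decoding is a table lookup per sample, but you have to sample faster than the fastest edge rate or you miss counts, which is exactly where the dedicated chip wins. A sketch of the usual software approach (the `qdec_*` names and struct are mine, not from any library):

```c
#include <stdint.h>

/* Direction table indexed by (previous AB state << 2) | current AB state.
 * Valid Gray-code transitions give +1 or -1; anything else counts as 0. */
static const int8_t qdec_table[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0
};

typedef struct {
    uint8_t prev;   /* last 2-bit A/B sample */
    long    count;  /* accumulated position */
} qdec_t;

/* Feed one 2-bit A/B sample; call this faster than the encoder's edge rate. */
void qdec_step(qdec_t *q, uint8_t ab) {
    ab &= 3;
    q->count += qdec_table[(q->prev << 2) | ab];
    q->prev = ab;
}
```

Every missed sample is a potentially lost count, so the hardware decoder's nanosecond-scale logic buys you headroom a polled micro can't match at high motor speeds.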
Another interesting area where both digital logic and microcontrollers fail is in signal filtering. If you have an analog signal that you want to filter digitally, you have to sample it at some rate. No matter how fast you sample it, noise in the signal at frequencies above half your sampling frequency (the Nyquist frequency) will get aliased down to lower frequencies, where it may interfere with your signal. You can solve this problem with a low-pass filter, made of a cap and a resistor, before your sampling occurs. After the sampling, you're screwed. (Of course, it's frequently the case that the noise won't overlap your signal in frequency, and then a digital filter will work great.)