Other pins on the same chip carry logic level signals, which will cause measurable currents into the input impedances of those pins, as well as further switching activity within the DAC.
Those currents will cause voltage drops across the GND bond wires.
If it's a high resolution DAC (above 16 bits), those voltage drops can be comparable to the smallest steps of the analog output signal, and considerably larger than those steps by the time you reach 20 bits.
Remember that the digital input signals are a million times larger in amplitude (for a 20 bit DAC), with fast switching edges, and in close proximity to the analog output and ground.
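To put a rough number on that "million times" claim, here is a quick sketch. The 3.3 V logic swing and 2.5 V full-scale output are assumed values for illustration, not from any particular part:

```python
# Rough numbers (assumed): 3.3 V logic swing on the digital pins,
# 2.5 V full-scale analog output, 20-bit DAC.
logic_swing = 3.3            # V, amplitude of the digital input edges
full_scale = 2.5             # V, analog output range
bits = 20

lsb = full_scale / 2**bits   # smallest analog step the DAC must resolve
ratio = logic_swing / lsb    # how much bigger the digital edges are

print(f"1 LSB = {lsb * 1e6:.2f} uV")        # about 2.4 uV
print(f"digital/LSB ratio = {ratio:.2e}")   # over a million
```

So a ground bounce of even a few millivolts, harmless to the logic, is hundreds of LSBs on the analog side.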
Now separating analog and digital grounds can minimise the pollution on the analog ground, but even so, they will be connected at some point, and without extraordinary care, some coupling between them will occur.
Providing both true and inverted analog outputs is relatively cheap and simple. Both outputs contain this noise, as they are both referenced to the same analog ground. But it is common-mode noise, allowing a differential amplifier to reject it at a location relatively remote from the DAC itself.
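A short worked example of how much the differential approach buys you. The 5 mV noise figure and 80 dB CMRR are assumed for illustration; real parts will vary:

```python
# Assumed numbers: 5 mV of ground-bounce noise appearing identically on
# both DAC outputs, received by a diff amp with 80 dB CMRR.
cm_noise = 5e-3    # V, common-mode noise on both outputs
cmrr_db = 80.0     # dB, common-mode rejection ratio of the amplifier

# CMRR in dB converts to a linear rejection factor of 10^(dB/20).
residual = cm_noise / 10**(cmrr_db / 20)
print(f"residual noise = {residual * 1e6:.2f} uV")   # 0.50 uV
```

That takes 5 mV of common-mode garbage down to half a microvolt, back below the LSB level of a high-resolution converter.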
Best Answer
First, some microcontrollers DO have D/A converters. However, these are far less common than A/D converters.
Aside from the technical issues, the main reason is market demand. Think about it. What kind of application would require a real D/A? It is quite rare to want a micro to produce a reasonably high speed analog signal unless the point is signal processing. The main market for that, however, is audio, and that needs a lot more resolution than you can build with the same process used to make the digital microcontroller. So audio will use external A/Ds and D/As anyway. DSPs intended for such applications have communication hardware built in to talk to such external devices, like I2S.
Otherwise, for ordinary control applications, the strategy is to convert to digital as early in the process as possible and then keep things digital. This argues for A/Ds, but D/As are useless since you don't want to go back to analog.
Things that microcontrollers typically control are controlled with PWM (Pulse-Width Modulation). Switching power supplies and class D audio inherently work on pulses. Motor control, solenoid control, etc, is all done with pulses for efficiency. You want the pass element to be either fully on or fully off because an ideal switch can't dissipate any power. In large systems, or where input power is scarce or expensive (like battery operation), the efficiency of switching systems is important. In a lot of medium cases the total power used isn't the issue, but getting rid of wasted power as heat is. A switching circuit that dissipates 1 W instead of 10 W may cost a little more in electronic parts than the 10 W linear circuit, but is a lot cheaper overall because you don't need a heat sink with associated size and weight, possibly forced air cooling, etc. Switching techniques are also usually tolerant of a wider input voltage range.
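The 1 W versus 10 W comparison above can be sketched with a made-up but representative operating point (12 V in, 5 V out, about 1.4 A load, 90% switcher efficiency — all assumed numbers):

```python
# Assumed operating point: 12 V supply, 5 V output, 1.43 A load.
v_in, v_out, i_load = 12.0, 5.0, 1.43
eff_switcher = 0.90                 # assumed switcher efficiency

p_out = v_out * i_load              # ~7.2 W actually delivered to the load

# A linear pass element drops the full (v_in - v_out) at the load current.
p_linear = (v_in - v_out) * i_load  # ~10 W burned as heat

# A switcher only loses its inefficiency fraction.
p_switch = p_out * (1 / eff_switcher - 1)   # well under 1 W lost

print(f"linear dissipation:   {p_linear:.1f} W")
print(f"switcher dissipation: {p_switch:.1f} W")
```

The linear circuit wastes more power than it delivers; the switcher's loss is small enough to skip the heat sink entirely.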
Note that PWM outputs, which are very common in microcontrollers, can be used to make analog signals in the unusual cases where you need them. Low pass filtering a PWM output is the easiest and nicest way to make an analog signal from a micro, as long as you have sufficient resolution*speed product. Filtered PWM outputs are nicely monotonic and highly linear, and the resolution versus speed tradeoff can be useful.
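Here is a quick sketch of that resolution*speed tradeoff with a first-order RC filter. The 48 MHz timer clock, 10-bit resolution, and 1 kHz filter corner are all assumed example values:

```python
import math

# Assumed setup: 48 MHz timer clock, 10-bit PWM, simple RC low-pass filter.
f_clk = 48e6
bits = 10
f_pwm = f_clk / 2**bits          # PWM carrier frequency: more bits, lower carrier

fc = 1e3                         # Hz, assumed RC corner frequency

# First-order rolloff: attenuation of the PWM fundamental at f_pwm.
atten = 1 / math.sqrt(1 + (f_pwm / fc)**2)

print(f"PWM carrier = {f_pwm / 1e3:.1f} kHz")
print(f"ripple fundamental attenuated to {atten * 100:.2f}% of its raw level")
```

Each extra bit of resolution halves the carrier frequency, which either raises the ripple or forces a lower filter corner (and thus a slower analog output) — that is the resolution versus speed tradeoff in concrete terms.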
Did you have anything specific in mind you wished a micro had a D/A converter for? Chances are this can be solved with low pass filtered PWM, or would need an external D/A for higher resolution*speed anyway. The gap between filtered PWM and external converters is pretty narrow, and the range of applications that actually need such a signal is also narrow.