Noise is added as RSS (Root Sum of Squares), but each noise term itself is calculated as the square root of the variance, i.e. the 1-sigma value: the standard deviation.
\$V_n = \sqrt{E[(X-\mu)^2]}\$, where \$\mu = E[X]\$
or equivalently
\$V_n = \sqrt{E[X^2] - (E[X])^2}\$
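As a sanity check, here is a minimal Python sketch (using NumPy; the sample data and sigma values are illustrative assumptions, not from the original post) showing both variance formulas above and the RSS combination of two uncorrelated sources:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent (uncorrelated) noise records, e.g. voltage samples.
x = rng.normal(0.0, 1.0, 100_000)  # true sigma_x = 1.0
y = rng.normal(0.0, 2.0, 100_000)  # true sigma_y = 2.0

# Each noise term is the 1-sigma value: sqrt(variance), the standard deviation.
vn_x = np.sqrt(np.mean((x - np.mean(x)) ** 2))     # sqrt(E[(X - u)^2])
vn_y = np.sqrt(np.mean(y ** 2) - np.mean(y) ** 2)  # sqrt(E[Y^2] - (E[Y])^2)

# Uncorrelated sources combine as RSS (root sum of squares)...
vn_rss = np.sqrt(vn_x ** 2 + vn_y ** 2)

# ...which matches the standard deviation measured directly on the sum.
vn_sum = np.std(x + y)
print(vn_rss, vn_sum)  # both close to sqrt(1^2 + 2^2) ~= 2.236
```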
I think this is a case of a confusingly worded Wikipedia article. That passage would seem to suggest that if you ever have two noise temperatures, you can just add them together. As you've aptly explained, that doesn't make any sense.
Rather, \$T_{sys}\$ is a figure of merit that's calculated by measuring the noise added by some component. The input noise is \$T_{ant}\$. After passing through the component, the noise will be \$T_{eq}\$, which must be greater than or equal to \$T_{ant}\$. The difference, \$T_{sys} = T_{eq} - T_{ant}\$, is by definition the noise added by the component.
If \$T_{sys} = 0\$, you have an ideal component which adds no noise.
If \$T_{sys} \ll T_{ant}\$, you have a realistic component, like a good LNA, which adds only negligible noise, and the signal-to-noise ratio (SNR) is not significantly decreased.
Thus \$T_{sys}\$ makes a convenient figure of merit: by comparing it with the input noise temperature, it's easy to see how significant the noise added by a component will be. If the input noise is already high, there's not much reason to spend more money on components with a lower \$T_{sys}\$.
By using an LNA with a very low \$T_{sys}\$, the signal and the noise can be amplified with a minimal decrease in SNR. Once that amplification is done, the noise presented to the following stages is much higher than \$T_{ant}\$ (because all the noise power was amplified along with the signal), so all the components that follow can have a much higher \$T_{sys}\$ (and thus a lower cost) without an unacceptable impact on SNR.
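To make the LNA-first argument concrete, here is a minimal Python sketch of the standard cascaded noise temperature (Friis) formula; the gains and noise temperatures below are made-up illustrative values, not from the original post:

```python
# Cascaded noise temperature (Friis): each stage's contribution is
# divided by the total gain preceding it, so an early high-gain LNA
# suppresses the noise contribution of everything that follows.

def cascaded_noise_temp(stages):
    """stages: list of (linear_gain, noise_temp_K) tuples, in signal order."""
    total_t = 0.0
    gain_so_far = 1.0
    for gain, t_noise in stages:
        total_t += t_noise / gain_so_far
        gain_so_far *= gain
    return total_t

# Illustrative stage values (assumed for this example):
lna    = (100.0, 30.0)     # 20 dB gain, 30 K noise temperature
mixer  = (0.5, 1000.0)     # lossy, noisy stage
if_amp = (1000.0, 500.0)   # cheap high-gain amplifier

with_lna    = cascaded_noise_temp([lna, mixer, if_amp])
without_lna = cascaded_noise_temp([mixer, if_amp])

print(f"with LNA first:    {with_lna:7.1f} K")    # 30 + 1000/100 + 500/50 = 50 K
print(f"without LNA first: {without_lna:7.1f} K") # 1000 + 500/0.5 = 2000 K
```

With the LNA in front, the noisy stages behind it barely matter; without it, they dominate.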
Best Answer
If the noise sources are uncorrelated, then the power (variance) of their sum equals the sum of their powers (variances), because all the covariances between them are zero and only the variances remain.
An example with two noise sources (with zero mean):
\$\sigma_N^2 = E[N^2] = E[(X+Y)^2] = E[X^2]+E[Y^2]+2E[XY] = \sigma_X^2 + \sigma_Y^2 +2\sigma_{XY}\$
If \$X\$ and \$Y\$ are uncorrelated then \$\sigma_{XY}=0\$ and \$\sigma_N^2 = \sigma_X^2 + \sigma_Y^2 \$, thus:
\$\sigma_N = \sqrt{\sigma_X^2 + \sigma_Y^2} \$, where \$\sigma_N\$, \$\sigma_X\$ and \$\sigma_Y\$ are the RMS values of \$N\$, \$X\$ and \$Y\$.
The same can be demonstrated for any number of sources, if required. The result ends up being the same: the power of the sum equals the sum of the powers.
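For a quick numerical confirmation, here is a minimal NumPy sketch (the sigma values are arbitrary illustrative choices) showing that the power of the sum equals the sum of the powers for uncorrelated sources:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Three independent zero-mean noise sources with different powers.
sigmas = [0.5, 1.0, 2.0]
sources = [rng.normal(0.0, s, n) for s in sigmas]

total = sum(sources)

# Power (variance) of the sum vs. sum of the powers (variances).
print(np.var(total))              # close to 0.25 + 1 + 4 = 5.25
print(sum(s**2 for s in sigmas))  # 5.25

# Equivalently, the RMS of the sum is the RSS of the individual RMS values.
print(np.std(total), np.sqrt(sum(s**2 for s in sigmas)))  # both ~= 2.291
```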