However, when I move to the remote location, even though the wifi signal strength shows 3/4 or 4/4 bars, the packet loss is massive and the latency varies wildly: for a stretch it will report pings of about 50 ms, then spike to 800+ ms, and about 10% of the packets time out altogether.
If the signal strength is good, why would this distance create such horrible packet loss?
Signal strength is only one metric... also consider the signal-to-noise ratio, which is often the problem in scenarios like this.
Wifi latency and packet loss are cousins of each other. 802.11 frames contain a sequence number that is ACK'd... if the sequence number isn't ACK'd (due to loss or a bit error in the original frame), the sender retransmits the frame up to a certain number of times. These 802.11 retransmissions show up as increased latency, or as outright packet loss if the interference is bad enough.
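To see why retransmissions turn interference into both latency spikes and loss, here's a toy model: each transmission attempt fails independently with some probability, and the sender gives up after a retry limit. The error probabilities and the retry limit of 7 are illustrative numbers, not real 802.11 timing or configuration.

```python
# Toy model: how per-frame retransmissions inflate latency and loss.
# Numbers are illustrative, not real 802.11 parameters.

def frame_stats(p_err: float, retry_limit: int = 7):
    """Return (probability the frame is lost outright, expected number
    of attempts for frames that eventually get through)."""
    p_lost = p_err ** retry_limit  # every attempt failed
    # Expected attempts, conditioned on eventual success (truncated geometric)
    num = sum(k * (1 - p_err) * p_err ** (k - 1)
              for k in range(1, retry_limit + 1))
    return p_lost, num / (1 - p_lost)

# Clean channel: 1% per-attempt error -> almost no loss, ~1.01 attempts/frame
print(frame_stats(0.01))
# Noisy channel: 60% per-attempt error -> ~2.8% loss, retries stack up
print(frame_stats(0.60))
```

The surviving frames take several attempts each (that's the 800+ ms pings), and the frames that exhaust the retry limit are the timeouts.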
I have literally seen 802.11g latency of over 40 seconds (yes... seconds) when I was only 50 feet from the LWAP. That particular environment had a lot of tools that also operated in the 2.4 GHz band, so obviously the potential for errors was quite high.
What could cause such interference? I'm in a residential neighborhood, and there is absolutely nothing in the line of sight between the laptop and the signal repeater other than two walls (with whatever electrical wiring is in them).
Wifi operates in (mostly) open, unlicensed spectrum bands regulated by the FCC... Bluetooth, microwave ovens, cordless phones, toy cars; we can only speculate about the source of the interference.
You could try using a directional antenna with a focused beam (i.e. a yagi or a cantenna) on your stations... those might help if the interference is not in the direct path to your wifi source.
Finally, if you have a wireless sniffer or access to a Linux system (suggestion: a BackTrack Linux LiveCD), then you can diagnose your wifi problems with Wireshark / tshark. Cisco also has a good reference for Wireshark 802.11 display filters, which help filter out noise so you can focus on the problems at hand.
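One quick sanity check with tshark is the retry rate: `wlan.fc.retry` is the real Wireshark field for the frame-control Retry bit, and a high fraction of retried frames confirms the retransmission story above. The capture filename and the "trouble" threshold below are illustrative assumptions; a sketch of the post-processing, assuming you exported the field with something like `tshark -r wifi.pcap -T fields -e wlan.fc.retry`:

```python
# Sketch: estimate the 802.11 retry rate from tshark field output.
# Assumes a monitor-mode capture exported as one "0" or "1" per line via:
#   tshark -r wifi.pcap -T fields -e wlan.fc.retry
# (wifi.pcap is a hypothetical filename.)

def retry_rate(lines):
    """Fraction of frames with the Retry bit set."""
    bits = [line.strip() for line in lines if line.strip() in ("0", "1")]
    return sum(b == "1" for b in bits) / len(bits) if bits else 0.0

# Stand-in for the tshark output lines:
sample = ["0", "0", "1", "0", "1", "0", "0", "0", "1", "0"]
print(f"retry rate: {retry_rate(sample):.0%}")  # 30% in this sample
```

On a healthy network the retry rate stays in the single digits; a rate like the sample's would point at interference or a marginal link.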
A CCI signal of -80 dBm is high enough to cause problems. If the APs hear each other at around -80 dBm, then a client in between the two APs would hear both APs at a higher level. That's where co-channel interference happens.
The problem is not the APs, but the clients. CCI doesn't just corrupt frames (any noise will do that). CCI interferes with the clear channel assessment (CCA) function in each client. When a client is ready to transmit, it checks that the channel is clear (that no one else is transmitting). If the signal is strong enough that the radio can decode it as 802.11, the CCA test fails and the radio defers transmitting. A nearby transmitter on the same channel will make the client wait for a clear channel even though the AP it's associated with is ready to listen, reducing throughput.
The co-channel signal strength should be 20 dB below the AP's strength at the edge of the cell. Since the cell boundary is usually designed for -67 dBm, CCI should be at -87 dBm or lower.
Best Answer
The medium for wireless communication is the air/space around you, and it is limited: there is only so much of it. The more clients that connect, the more of the medium is used and the less there is for others. The signal strength itself is not really compromised; it depends on the room parameters (humidity, type of walls, other radio frequencies in operation).

There is a limited number of clients that can connect. The limit is generally bound to the specs of the access point, but in essence it comes down to the amount of available bandwidth/space in the air within that given frequency. So, to answer your question: the more users connecting to an access point, the less performance in general you can expect, because of the shared bandwidth/medium (space). Signal strength is more of a constant; how far that signal reaches, or how many clients your access point can serve, is based on other factors.
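A back-of-the-envelope way to see the shared-medium effect: a wifi cell is one half-duplex channel, so in the best case each client gets an equal slice of the airtime. The 100 Mbps cell capacity below is an illustrative assumption, and real 802.11 contention overhead makes the split worse than this naive division.

```python
# Back-of-the-envelope: a wifi cell is one shared half-duplex medium,
# so adding clients divides the available airtime. Capacity is illustrative.

def per_client_throughput(cell_capacity_mbps: float, n_clients: int) -> float:
    """Naive equal-airtime split; real contention overhead makes it worse."""
    return cell_capacity_mbps / n_clients

for n in (1, 5, 20):
    print(f"{n:2d} clients -> {per_client_throughput(100.0, n):.1f} Mbps each")
```

In practice it degrades even faster than 1/N, because slow or distant clients occupy disproportionate airtime per byte sent.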