I've got a WPA wireless network set up that uses a total of four WRT54GL routers running Tomato 1.21 as access points (on channels 1, 6, and 11; the two APs furthest from each other both use channel 11). Each AP generally supports about 5 clients at a time (though sometimes as many as 10). Generally, everything works, but the clients furthest from the access points occasionally lose their connections, and some users have reported periods when they are completely unable to obtain an IP. I am virtually certain this is caused by poor signal strength resulting from distance from the APs and/or RF interference from other APs in the building. From the Device List in Tomato, I can see that the users having connectivity issues generally show an RSSI of around -89 dBm with a quality of 10 (the noise floor is -99 dBm).

I have done everything possible to improve signal strength: careful router placement, optimizing channel usage based on RF surveys, using 84 mW transmit power, and upgrading to high-gain omnis. That leaves nothing to do but tweak the advanced wireless settings to marginally improve problems related to weak signal and/or RF interference, so I've been reading everything I can find on these boards and elsewhere about changing the fragmentation threshold, RTS threshold, and beacon interval. However, I'm left with the following questions.

1) There seems to be disagreement about fragmentation threshold and RTS threshold settings. I generally understand what the settings do, but I am reluctant to change them when there doesn't seem to be a consensus about exactly what they should be. Some (including the Linksys Technical Troubleshooting Wizard) recommend that both be set to 2304. I have also seen people insist that fragmentation be set to 2306 and RTS to 2304.
A few recommend 2306 for both thresholds, and some advise 2306 for fragmentation and 2307 for RTS (though, by my limited understanding, RTS is simply disabled when its value is higher than the fragmentation threshold). Which of these settings is best? And more importantly, WHY is it the best?

2) With respect to beacon interval, I'm a little less clear on exactly WHY this would improve connectivity, mostly because it's hard to see what difference a client would notice between, say, 10 beacons a second and 20 beacons a second. I've seen both 75 ms and 50 ms recommended to replace the default of 100 ms. For a network of my size (4 APs, averaging 5 users each), will increasing the number of beacons (and hence the RF traffic even when the network is idle) pose a problem? If not, should I use 75 ms or 50 ms? Will more frequent beacons decrease the lifespan of the radio?

Since these settings will affect all users, I want to make sure I'm using settings that will be beneficial on the whole. The last thing I want is to inadvertently make things worse, and since I can't test directly from the standpoint of each user, I was hoping someone could explain in greater detail what settings I should try and, more importantly, why. Thank you for your help!
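To put a number on the signal problem, here's a quick sanity check of the figures I quoted from the Device List, assuming (as the numbers seem to suggest) that Tomato's "quality" column is simply RSSI minus noise floor, i.e. the SNR in dB:

```python
def snr_db(rssi_dbm, noise_dbm):
    """Signal-to-noise ratio in dB, assuming quality = RSSI - noise floor."""
    return rssi_dbm - noise_dbm

rssi = -89   # dBm, reported for the struggling clients
noise = -99  # dBm, reported noise floor
print(snr_db(rssi, noise))  # 10 dB, matching the "quality of 10" I see
```

If that reading is right, 10 dB of SNR is marginal for a stable 802.11g link, which would fit the symptoms.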
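To make sure I've got the threshold interaction right (question 1 above), here is my understanding sketched out in Python. The frame and threshold sizes are in bytes; the specific frame lengths are just illustrative, not anything measured on my network:

```python
def fragments(frame_len, frag_threshold):
    """Split a frame into fragments no larger than the fragmentation threshold."""
    sizes = []
    remaining = frame_len
    while remaining > 0:
        chunk = min(remaining, frag_threshold)
        sizes.append(chunk)
        remaining -= chunk
    return sizes

def uses_rts(transmission_len, rts_threshold):
    """An RTS/CTS exchange precedes a transmission only if it exceeds the RTS threshold."""
    return transmission_len > rts_threshold

frag_threshold = 2306
rts_threshold = 2307  # higher than any fragment can be

frame = 2304  # a maximum-size 802.11 MSDU
for frag in fragments(frame, frag_threshold):
    print(frag, uses_rts(frag, rts_threshold))  # prints "2304 False"
```

Since fragmentation caps every transmission at 2306 bytes, nothing ever exceeds an RTS threshold of 2307, so RTS/CTS never fires; that's why I said the 2306/2307 combination effectively disables RTS. Please correct me if I've got that wrong.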
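Regarding question 2, I tried a back-of-envelope estimate of the airtime beacons actually consume, assuming beacons go out at the 1 Mbps basic rate and a beacon frame of roughly 100 bytes (both typical assumptions on my part, not measured values):

```python
BEACON_BYTES = 100           # assumed beacon frame size
BASIC_RATE_BPS = 1_000_000   # lowest 802.11b/g basic rate, at which beacons are sent

airtime_per_beacon_s = BEACON_BYTES * 8 / BASIC_RATE_BPS  # 0.0008 s per beacon

for interval_ms in (100, 75, 50):
    beacons_per_s = 1000 / interval_ms
    overhead = beacons_per_s * airtime_per_beacon_s
    print(f"{interval_ms} ms interval: {overhead:.2%} of airtime")
```

By this rough math, going from 100 ms to 50 ms only doubles the overhead from under 1% to under 2% of airtime per AP, so the extra idle traffic seems negligible even with my four APs, but I'd appreciate confirmation that I'm not missing something (preamble, PLCP overhead, etc.).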