Hello. I've already read a lot of information about power settings in mW, but some things are still unclear to me. I have a WRT54G v2.0 and a WAP54G v2.0 (same CPU). Linksys says the WRT54G v2.0's typical power is 15 dBm = 31.6 mW, and that this is a firmware limit only; the hardware can do more, but they don't say how much. The Broadcom spec gives a typical power of 15 dBm for 802.11g and 20 dBm for 802.11b.

My setup: DD-WRT v2.3 SP1 on both devices. The router is outdoors in a box with an 8 dBi omni antenna. The WAP54G is also outdoors in a box, with the stock antennas, about 500 m from the router. At 251 mW on the WAP54G I get -76 signal, 22% quality, 36 Mbit/s, and only 10 TX errors. At 86 mW on the WRT54G I get -77 signal, 22% quality.

Now, if the chip is designed for 31 mW, is everything above that just noise and distortion, or not? When I set the router with the omni to 251 mW, the speed drops from 54 to 24 Mbit/s. Mr. BS told me that's because the distance is too small and the signal gets distorted. I'm using channel 13, and I have it to myself.

So which is better? More power means more noise but strong signals, plus a possible interference risk between my own devices on that channel (1 router, 3 WAP54Gs, 5 wireless cards). Less power means less noise and a clean, quiet channel for me and my devices, but weak signals. The cards seem to be 63 mW, if that figure is real. So what's the better choice: 31 mW with a clear channel but a weak signal from the omni, or 84 to 100 mW (120 mW max, I don't want more) with more noise, interference, heat, distortion, signal reflections, and network lag, but a strong signal?

I know the best amplifier in radio is the antenna, but I don't have the money for more and better antennas. Does anyone know the real quality limit of the output power on the Broadcom chip? They report 15 dBm. Can the hardware do more without distortion and signal reflection? If the hardware can't do more than 15 dBm, then is everything above 15 dBm a bad choice? I hope you can explain this to me. Thanks. And again: what is the hardware power limit in mW?
251 mW is a driver limit; what about the hardware?
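For anyone checking my numbers: the conversion between dBm and mW is mW = 10^(dBm/10). A quick Python sketch (my own helper names, just for illustration) to verify the figures quoted above; it also shows that 251 mW is about 24 dBm, i.e. 9 dB (roughly 8x) above the 15 dBm typical rating:

```python
import math

def dbm_to_mw(dbm):
    """Convert power in dBm to milliwatts: mW = 10^(dBm/10)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert power in milliwatts to dBm: dBm = 10 * log10(mW)."""
    return 10 * math.log10(mw)

# Checking the figures from the post:
print(round(dbm_to_mw(15), 1))   # 15 dBm -> 31.6 mW (Linksys "typical" rating)
print(round(dbm_to_mw(20), 1))   # 20 dBm -> 100.0 mW (Broadcom 802.11b typical)
print(round(mw_to_dbm(251), 1))  # 251 mW -> 24.0 dBm (DD-WRT driver maximum)
print(round(mw_to_dbm(84), 1))   # 84 mW  -> 19.2 dBm
```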