Discussion in 'MustDie Firmware' started by blackpantherus, Jul 21, 2005.
What is the best value for the transmitter power? What does it depend on?
:? Can you clarify your question? :?
Well, I hear a lot of people who don't want to take a risk and blow the router go with 43 mW, but that's just them. I have 120 mW on one router and 150 mW on the other. They run fine, just get a bit hot, but that is solved with a fan/heatsink.
OK, I have the Linksys WAP54G and I have just updated to MustDie firmware on my access point. I see that by default the transmitter power is 22, and I can adjust it anywhere from 1 to 84...
My question is: what is the best value for transmitter power to get the best signal?
It really depends on your needs. If 22 is doing good right now, then leave it. If you need to go higher, then go higher until you hit a value that is good enough.
It depends on too many things to actually be able to pin it down exactly. But there are some fairly easy guidelines which we could talk about.
First, understand that there are two radio transmitters involved. Let's generalize and say that one is an AP and one is a Client, even though they might be called different things in other configurations.
If you increase the transmit power of your AP but do not also increase the transmit power of your Client, then (all else being equal) there is some distance between the two at which the signal the AP receives from the Client is no longer sufficient. The Client still receives the AP just fine, but your connection drops anyway. No increase in AP power output is going to keep it from dropping the connection at that distance, because the weak direction is the one you didn't boost.
If all things actually were equal, the two signals would drop out at just about the same distance. But of course all things are never equal! Hence it sometimes happens that the AP is still receiving a good signal while the Client is not, and the connection is dropped. In that one case it is true that increasing the power output from the AP will give you a benefit!
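To make the asymmetry concrete, here is a rough Python sketch. The numbers are purely illustrative assumptions (84 mW AP, 28 mW Client, 100 m apart, free-space path loss with no walls), not measurements from any real WAP54G:

```python
import math

def fspl_db(distance_m, freq_mhz=2437):
    """Free-space path loss in dB (2.4 GHz channel 6 assumed)."""
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

def dbm(milliwatts):
    """Convert a power in mW to dBm."""
    return 10 * math.log10(milliwatts)

ap_tx = dbm(84)       # AP boosted to 84 mW  (~19.2 dBm)
client_tx = dbm(28)   # Client stuck at 28 mW (~14.5 dBm)
loss = fspl_db(100)   # path loss is the same in both directions (~80.2 dB)

print(f"Client hears AP at {ap_tx - loss:.1f} dBm")     # ~ -60.9 dBm
print(f"AP hears Client at {client_tx - loss:.1f} dBm") # ~ -65.7 dBm
# Boosting only the AP improved the AP->Client direction by ~4.8 dB,
# but the Client->AP direction is unchanged -- it still fails first.
```

The point the sketch illustrates: the path loss term cancels out of the comparison, so the weaker transmitter always sets the limit.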
So the question is actually, how often is that likely to happen, and at what power levels does it happen?
In fact it is common for path fades and the like to result in a 3 dB difference between the two signals. But the difference is very rarely larger than 6 dB, simply because most changes affect the transmit and receive paths equally.
Given the above, it is probably useful to increase your AP's power output 3 dB beyond what it takes to normally have equal signals at the Client and the AP. A 6 dB increase from the AP is less useful, but perhaps still worth having on rare occasions. (Note that increasing power also increases interference to others, so your rare benefit should be weighed against the potential for not-so-rare interference...)
Now, if we assume (and this may not be correct at all!) that the default AP power is just about right, then twice the power (3 dB) is good, and four times the power (6 dB) is probably not really of any value.
The typical AP defaults to 28 milliwatts of output, and setting it to 56 mW might very well give noticeable results. If you insist on getting every last ounce of benefit, go for 112 mW. More than that is just massaging your ego and annoying the neighbors.
Personally, I can't see setting it higher than maybe 84 mW, which is about a 4.8 dB increase over the default.
All of that said, if you have a point-to-point link and use high-gain directional antennas... go for 251 mW. But you must do it on both ends to benefit from it.