One of the big advantages of building web products is that their effectiveness can be measured immediately. One lucrative example is click-through web advertising.
Another example is A/B testing, where you put up two different versions of a web page and each visitor randomly sees one version or the other. You can then compare the effectiveness of the two versions in terms you care about, such as the percentage of visitors who buy your product.
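A common way to implement the random split is to hash a stable visitor identifier into a bucket, so that a returning visitor always sees the same version. Here is a minimal sketch in Python (the function name and visitor id are illustrative, not from any particular testing tool):

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a variant.

    Hashing the visitor id (rather than flipping a coin on every
    page view) ensures the same visitor always sees the same version.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# A visitor keeps the same variant across visits:
print(assign_variant("visitor-12345"))  # same result every time for this id
```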
Here is one example from the Firefox download page.
The only difference between these two versions is the text on the download button: "Try Firefox 3" in the first, "Download Now - Free" in the second. Over almost 300,000 trials, 9.7% of visitors downloaded Firefox with the first version, while 10.1% did with the second. By repeatedly running A/B tests, changing one thing at a time, you can incrementally increase your conversion rate -- using actual evidence rather than guesses or conventional wisdom.
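Is a 0.4 percentage-point difference over that many trials meaningful? A quick way to check is a two-proportion z-test. A rough sketch, assuming the trials split evenly between the two versions (the exact split isn't given in the original report):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Assuming the ~300,000 trials split roughly evenly:
z, p = two_proportion_z_test(conv_a=14550, n_a=150000,   # 9.7% of 150k
                             conv_b=15150, n_b=150000)   # 10.1% of 150k
print(f"z = {z:.2f}, p = {p:.4f}")
```

Under that assumption the test gives z of about 3.7 and a p-value well under 0.001, so the difference is very unlikely to be chance.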
This and many other examples appear on the abtests.com web site, which provides a fascinating insight into the technique. People share the results of their A/B tests there, results that are rarely made public because most companies regard them as proprietary.
What is particularly interesting is that these are not all good A/B tests. Some are badly designed, and some have such small sample sizes that the results are not statistically significant. By reading through the comments people make, you can learn best practices to follow and bad practices to avoid.
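To get a sense of why small samples mislead, consider how much traffic you need to reliably detect a lift as small as the Firefox one. A back-of-the-envelope power calculation, using the standard two-proportion approximation (my own illustration, not taken from the site):

```python
from math import ceil
from statistics import NormalDist

def required_sample_size(p_base, p_target, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a given lift."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
    z_beta = NormalDist().inv_cdf(power)            # desired power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_base - p_target) ** 2)

# Detecting a lift from 9.7% to 10.1% takes a lot of traffic:
print(required_sample_size(0.097, 0.101))  # roughly 87,500 visitors per variant
```

That works out to nearly 90,000 visitors per variant, which is consistent with the Firefox test needing almost 300,000 trials. A test run on a few hundred visitors simply cannot distinguish a change that small from noise.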
All in all, abtests.com is an interesting site, and I hope people keep sharing their tests there.