I think I understand how receivers are rated; please correct me if I'm wrong.
Most good receivers are rated using a test signal covering all frequencies from 20 Hz to 20,000 Hz. The volume is turned up while the distortion level is measured; typically, once distortion reaches somewhere between 0.05% and 0.07%, they stop and take the power output at that point as the rating. The volume control's 0 dB mark is then set to correspond to that point.
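If I've understood that right, the procedure is basically a loop: raise the level, measure distortion, stop at the threshold, and call the power at that point the rating. Here is a minimal Python sketch of that idea; the amplifier model (simulated_thd) and every number in it are invented purely for illustration, not taken from any real measurement:

```python
# Toy model of the rating procedure: distortion stays low, then rises
# sharply as the amplifier approaches clipping. All figures are made up.

def simulated_thd(power_watts, clip_watts=120.0):
    """Hypothetical THD curve (in percent) versus output power."""
    ratio = power_watts / clip_watts
    return 0.01 + 0.06 * ratio ** 8

def rate_power(thd_limit_pct=0.07, step_watts=0.5):
    """Raise output until THD hits the limit; the power there is the rating."""
    power = step_watts
    while simulated_thd(power) < thd_limit_pct:
        power += step_watts
    return power

print(f"Rated power at 0.07% THD: {rate_power():.0f} W")  # ~120 W here
```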
Some receivers are tested with all channels driven at once, others one channel at a time, which gives an inaccurate power rating.
Some receivers are rated at a higher distortion threshold. Sony, for example, rates its receivers at 0.6% distortion (almost ten times more) and its ES line at 0.15%.
Two questions...
1) Why isn't there an industry standard for power ratings, so customers aren't blindsided by an all-in-one Onkyo system that claims 1000W total power but is measured one channel at a time with a 1 kHz test tone?
2) In cases where a higher distortion threshold is allowed when rating the power, don't those receivers experience clipping as they near 0 dB, possibly damaging the speakers?
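On question 2, a quick numpy experiment shows why a high distortion threshold and clipping go hand in hand. This is only a sketch: it hard-clips an ideal 1 kHz tone at arbitrary levels and measures THD from the spectrum, so the numbers describe the math, not any particular receiver:

```python
import numpy as np

fs, f0, n = 48_000, 1_000, 48_000        # sample rate, test tone, 1 s of samples
t = np.arange(n) / fs
tone = np.sin(2 * np.pi * f0 * t)

def thd_percent(signal):
    """THD via FFT: harmonic energy (2nd-10th) relative to the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal))
    fund = spectrum[f0]                   # 1 Hz resolution, so bin f0 = 1 kHz
    harmonics = spectrum[2 * f0 : 10 * f0 + 1 : f0]
    return 100 * np.sqrt(np.sum(harmonics ** 2)) / fund

for clip in (1.0, 0.99, 0.9):            # 1.0 = no clipping
    clipped = np.clip(tone, -clip, clip)
    print(f"clip level {clip:.2f}: THD = {thd_percent(clipped):.2f}%")
```

Even barely grazing the clip point pushes THD well past the 0.05-0.07% range, so a power figure taken at 0.6% distortion is effectively a figure at the onset of clipping.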
Pay attention to the all-channels [5 channel] rating and/or the output at clipping; this will tell you what you need to know about a receiver's power reserves. Onkyo and Yamaha deploy a protection circuit that cuts power in 5-channel mode after a few seconds to avoid overtaxing the power supply, overheating, etc. It is debatable whether this is a problem in real-world usage, but both have plenty of power in most situations and no problems at all in 2-channel mode. It would be nice to have a uniform standard, but in the meantime look at the all-channels ratings and, of course, let your ears be your guide.
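To put rough numbers on that: the gap between a 2-channel rating and an all-channels rating is usually modest in dB terms. A quick sketch with made-up per-channel figures (not any particular model's specs):

```python
import math

two_channel_watts = 110.0   # watts/channel, two channels driven (hypothetical)
all_channel_watts = 65.0    # watts/channel, all five driven (hypothetical)

shortfall_db = 10 * math.log10(two_channel_watts / all_channel_watts)
print(f"All-channels output trails the 2-channel rating by {shortfall_db:.1f} dB")
```

That comes to roughly 2.3 dB in this example: real, but only noticeable on the loudest peaks, which is part of why the protection behavior is debatable in real-world use.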