Should I set my gain at the frequency where the load has its lowest impedance or its highest? I've been thinking about this for a while. Most of the time I just use a 50 Hz tone at 0 dB.
Set the gain on the amp to match the INPUT line voltage, not the output voltage. You're dealing with two different parts of the amp here. If the amp has an adequate power supply, it shouldn't clip on the output at any load, as long as the input signal isn't clipping and the gain on the amp matches that input voltage.
If you want to use output voltages, though, then use the lowest impedance; that will be the amp's highest output.
When I check the output voltage it reads 12 V or so. I'm looking to get 600 W at 2 ohms, so that's about 34.6 V. I know that 12 V is not right. That's less than 50 W a sub, and it sounds like more than that. The DMM is set to AC and everything should be right; I even changed the batteries.
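For reference, here's a minimal sketch of where that 34.6 V target comes from: P = V^2 / R rearranged to V = sqrt(P * R). The 600 W and 2 ohm numbers are just the values from this post.

```python
import math

# Target RMS output voltage for setting gain with a sine test tone.
# P = V^2 / R  ->  V = sqrt(P * R)
rated_power_w = 600   # rated RMS power from the post
load_ohms = 2         # load impedance from the post

target_v_rms = math.sqrt(rated_power_w * load_ohms)
print(f"Target output voltage: {target_v_rms:.1f} V RMS")  # ~34.6 V
```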
Volts × amps = watts (that's the power formula; combine it with Ohm's law and you get P = V²/R). For the output, though, I'd multiply the input-side result by 0.8 for a class D amp to account for the roughly 80% efficiency.
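To illustrate that supply-side arithmetic, here's a rough sketch; the 13.8 V charging voltage and the 80% efficiency are assumptions for the example, not measured values.

```python
# Rough supply-side estimate using the 80% efficiency figure above:
# output power = supply volts * supply amps * efficiency, so working backward:
supply_v = 13.8        # assumed charging voltage, not measured
efficiency = 0.8       # assumed class D efficiency
output_power_w = 600   # desired output power from the post

input_power_w = output_power_w / efficiency   # power pulled from the electrical system
current_draw_a = input_power_w / supply_v     # approximate battery/alternator current
print(f"Input power: {input_power_w:.0f} W, current draw: {current_draw_a:.1f} A")
```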