Hey guys, I want to adjust my amp settings to minimize distortion. My amp lets me adjust the input sensitivity according to output voltage. Do I set the sensitivity according to the rated output power or the birth sheet's peak power?
I would set it to the rated output. For example, if it's a 1000 W RMS amp at 1 ohm but the birth sheet says more, just set it to the 1000 W RMS figure. That's the best way to reduce distortion and the chance of clipping.
Now, I have a procedure for doing this; can you confirm it's right? Turn the head unit's volume up to 3/4 while playing a 1 kHz sine wave, then, with the speakers disconnected, measure the output voltage from each channel with a multimeter and adjust the amp's sensitivity until it matches the target output voltage.
So if an amp is rated 50 W @ 4 ohm, then from P = V^2 / R, the target is V = sqrt(P * R) = sqrt(50 * 4) ≈ 14.14 V?
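For reference, here's a minimal Python sketch of that calculation, solving P = V^2 / R for V. The power/impedance pairs below are just illustrative examples, not ratings from any specific amp:

import math

def target_voltage(rated_watts: float, load_ohms: float) -> float:
    # RMS voltage at the amp's rated (unclipped) output power,
    # from P = V^2 / R  =>  V = sqrt(P * R)
    return math.sqrt(rated_watts * load_ohms)

print(target_voltage(50, 4))    # ~14.14 V, the 50 W @ 4 ohm case above
print(target_voltage(1000, 1))  # ~31.62 V for a 1000 W RMS @ 1 ohm amp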
So if I'm measuring 12 V, I should increase the sensitivity until the output comes up by another 2.14 V to read 14.14 V total, right?
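One note of caution: the gain control scales the output multiplicatively, so rather than thinking of it as adding 2.14 V, just turn the sensitivity up slowly until the meter reads 14.14 V. As a rough sketch of how small that adjustment actually is, expressed in dB (using the 12 V reading above):

import math

measured_v = 12.0   # current output at 3/4 head-unit volume
target_v = 14.14    # sqrt(50 W * 4 ohm) from the post above

# voltage ratio converted to decibels: 20 * log10(V_target / V_measured)
gain_db = 20 * math.log10(target_v / measured_v)
print(f"raise gain by about {gain_db:.2f} dB")  # ~1.43 dB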
Use a 50 Hz sine wave. You can find them online for free. Depending on the voltage output from your head unit, choose the high or low input-voltage setting on the amp.
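If you'd rather generate a test tone than download one, here's a minimal sketch using Python's standard wave module. The frequency, duration, and filename are placeholders (50 Hz per the advice above, or 1 kHz for full-range channels), and it writes a full-scale, unclipped 0 dBFS sine tone:

import math
import struct
import wave

RATE = 44100       # samples per second
FREQ = 50          # test-tone frequency in Hz
SECONDS = 30       # long enough to adjust the gain while it plays
AMPLITUDE = 32767  # full-scale 16-bit, i.e. 0 dBFS

with wave.open("test_tone_50hz.wav", "wb") as wav:
    wav.setnchannels(1)   # mono
    wav.setsampwidth(2)   # 16-bit samples
    wav.setframerate(RATE)
    frames = b"".join(
        struct.pack("<h", int(AMPLITUDE * math.sin(2 * math.pi * FREQ * n / RATE)))
        for n in range(RATE * SECONDS)
    )
    wav.writeframes(frames)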