Say you have a 500W amplifier that's 2-ohm stable, and a sub that needs about 500W. Would it be worse to run it at 1 ohm set to put out 500W, or to run it at 4 ohms pushing everything the amp has to reach 500W? Just something I was wondering that was bothering me.
I think what he's trying to say is he has a DVC 2-ohm sub that can be wired to 1 or 4 ohms, and an amp rated for 500W RMS at 2 ohms. He wants to give the sub 500W RMS. Does he run it at 1 ohm and clip the sub, or run it at 4 ohms with everything turned up and still clip the sub? Get a new amp is my opinion.
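A rough way to see why 4 ohms forces you to crank everything: treating the amp as an ideal voltage source (real amps aren't, and most make less than double their power as the load halves, but the direction of the math holds), its 500W-at-2-ohms rating implies a maximum output voltage, and power into other loads follows P = V²/R. The numbers below use the hypothetical 500W amp from the thread:

```python
import math

RATED_POWER = 500.0  # watts RMS (the hypothetical amp in the thread)
RATED_LOAD = 2.0     # ohms, the load the amp is rated stable into

# Max RMS output voltage implied by the rating: P = V^2 / R
v_max = math.sqrt(RATED_POWER * RATED_LOAD)  # about 31.6 V

for load in (4.0, 2.0, 1.0):
    power = v_max**2 / load  # idealized: assumes the amp can hold v_max
    print(f"{load:4.0f} ohm load -> {power:6.1f} W (idealized)")
```

At 4 ohms the idealized ceiling is only about 250W, so hitting 500W means driving the input past where the output can follow, i.e. clipping. At 1 ohm the math says 1000W, but a 2-ohm-stable amp usually can't supply that current and overheats or shuts down instead.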
I don't have a problem with any of my stuff; it's just something I was wondering. Someone asked me that question and I wasn't too sure about it. I know both are bad, but will a clipped signal hurt your amp more, or your sub? I was just using those specific numbers as an example, nothing in particular.
Makes sense. I just wasn't sure how bad a clipped signal is on an amp compared to a sub. I know clipping caused by a voltage drop is like death, but I wasn't too sure about a plain clipped signal. If an amp goes bad from a clipped signal, will it just fry and be over with all at once, or would the sound change slightly over time?
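Part of why clipping cooks subs is that a clipped wave carries more average power than a clean sine with the same peak voltage, so the voice coil heats up more than the "same volume" would suggest. A quick numeric sketch (the 2x gain and the ±1.0 clip level are just illustrative numbers, not anything from a real amp):

```python
import math

def rms(samples):
    """Root-mean-square of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

N = 100_000
t = [2 * math.pi * i / N for i in range(N)]

clean = [math.sin(x) for x in t]                       # clean sine, peak 1.0
driven = [2.0 * math.sin(x) for x in t]                # gain cranked 2x
clipped = [max(-1.0, min(1.0, s)) for s in driven]     # amp rails chop it at +/-1.0

# Same peak voltage, but the flattened tops pack in more average power
p_clean = rms(clean) ** 2    # ~0.50 (relative units)
p_clipped = rms(clipped) ** 2  # ~0.78, roughly 56% more heat in the coil
print(p_clean, p_clipped)
```

Both waves never exceed 1.0, yet the clipped one delivers about 1.56x the average power; push the gain further and it approaches a square wave at double the clean sine's power. That extra heat, not the flat-topped shape itself, is what usually kills the sub.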