Ok, let me first state that I know very little about the actual functional specifications of engine internals. I'm trying to learn, but it's pretty much a foreign language to me (I'm a computer tech/network engineer). What I'm trying to understand is engine compression. Not the basic concept, which I think I have (before supercharging, swap the pistons to lower the compression ratio to make room for the extra air volume the supercharger crams in), but more of the "why".
Let me explain where my confusion stems from (using an E-Force for reference).
The stock 3v Mustang GT compression ratio is 9.8:1 (according to various web sites). The E-Force (like many other superchargers) is marketed as a "bolt-on", not really alluding to any need to change the internals (to forged) or to drop the compression (a Livernois block I looked at was offered at 9.1:1 or 9.7:1). I assume the Livernois 9.1:1 is for supercharged applications and the 9.7:1 is for N/A.
So, understanding that boost = more compressed air = more air in the cylinder to burn and expand on ignition, why do you need to lower the compression? In other words, why give up horsepower/torque just to gain it back? Why not just leave the compression the same?
Assuming (yes, I make a lot of assumptions) this is due to some natural limit on how much compression a cylinder can take before it essentially blows apart, is there a compression-ratio-to-boost "chart" anywhere that could give a guideline on what compression ratio to aim for over a given horsepower range?
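Being a computer guy, I tried to put numbers on this myself. The closest thing to a "chart" I've come across is a rule of thumb for an "effective compression ratio" under boost; the formula and the 5 psi pulley figure below are my assumptions, not anything from Edelbrock or Livernois, and real knock limits also depend on fuel octane, intercooling, and timing:

```python
# Rough rule-of-thumb "effective compression ratio" under boost.
# Assumed approximation (a common forum rule of thumb, not a spec):
#   effective CR = static CR * (boost_psi + 14.7) / 14.7
# where 14.7 psi is sea-level atmospheric pressure.

ATM_PSI = 14.7

def effective_cr(static_cr: float, boost_psi: float) -> float:
    """Approximate effective compression ratio at a given boost level."""
    return static_cr * (boost_psi + ATM_PSI) / ATM_PSI

# Stock 3v at 9.8:1 with an assumed ~5 psi low-boost pulley:
print(round(effective_cr(9.8, 5.0), 1))  # ~13.1
# Same 5 psi on a 9.1:1 lowered-compression build:
print(round(effective_cr(9.1, 5.0), 1))  # ~12.2
```

If that rule of thumb is anywhere near right, it at least shows why lowering the static ratio buys headroom: the boosted 9.1:1 engine ends up near where a high-compression N/A engine lives, while the stock 9.8:1 pistons under the same boost sit noticeably higher.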
Reason for asking: I recently called an engine builder here in Houston and got a quote on swapping the internals in my stock 3v block to forged. They mentioned dropping the compression (I don't remember exactly to what), but I'm wondering if they base their compression ratios on the 1200-1500 HP engines they've built in the past. Perhaps the compression doesn't need to be AS low for me (I'd probably be happy with 600-700 HP), and making it that low would actually "cost" me some power.
On the flip side, is there an HP "level" above which it is generally understood that you DO need to change the compression ratio? If the E-Force and many others advertise themselves as "bolt-on" power-adders, then they are also (in my opinion) implying they're "safe" at the stock 9.8:1 compression. I'm assuming that only holds for the stock low-boost pulley, so when does it truly become necessary to drop the compression ratio inside the cylinder?
As I said, I know little to nothing about the inner workings of these engines. The only way to gain knowledge is to ask questions, and though this may seem like a stupid question to some, I wouldn't ask if I honestly had the slightest idea. I only hope I didn't make this too confusing for those who would like to help me understand.