Sometimes it happens that you come to understand something that you immediately
realize everyone must have known all along, so you kind of keep your mouth shut
and carry on, feeling a tad sheepish but glad you did finally get it. (At
least, I hope it happens to other people!)
This happened to me lately when I wondered why 90 psi is considered good
compression on my old tractor engine - then it struck me - it's only 6:1. My
revelation was that (I thought) the compression ratio is simply multiplied by
atmospheric pressure to get whatever psi I should see in an engine - ignoring
losses from friction and all that stuff; I mean, just conceptually. Ever so
simple!
D'oh!
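To put numbers on it, here's my own back-of-envelope sketch (taking sea-level
atmospheric as roughly 14.7 psi - that figure is my assumption):

```python
# My naive model: gauge reading = compression ratio times atmospheric
# pressure, ignoring friction, valve timing, leakage, and all that.
ATM_PSI = 14.7  # assumed sea-level atmospheric pressure, psi

def naive_compression_psi(compression_ratio):
    """Naive estimate: atmospheric pressure times compression ratio."""
    return compression_ratio * ATM_PSI

print(naive_compression_psi(6.0))  # ~88 psi - right around the 90 I see
```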
But then, not a month later, I saw a posting by someone on another board that
totally deflated my theory. He claimed that, far from there being losses, in
the ideal case my 6:1 example should actually show something like 180 psi.
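For what it's worth, here's my guess at the arithmetic behind his 180 psi
figure (I'm assuming he's treating it as an ideal adiabatic squeeze of air,
with gamma around 1.4 - I can reproduce the number, but the why is exactly
what I'm missing):

```python
# Guess at the 'ideal case': adiabatic compression, where the air heats
# up as it's squeezed, so pressure rises faster than the volume ratio.
ATM_PSI = 14.7   # assumed sea-level atmospheric pressure, psi
GAMMA = 1.4      # assumed heat-capacity ratio for air

def adiabatic_compression_psi(compression_ratio):
    """Ideal adiabatic estimate: P = P_atm * CR**gamma."""
    return ATM_PSI * compression_ratio ** GAMMA

print(adiabatic_compression_psi(6.0))  # ~181 psi - his '180 or so'
```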
I don't get it. I thought I had it all figured out, and life was pretty good
there for a while. My question is, how could it NOT be that atmospheric
pressure is being multiplied by the compression ratio to end up at the psi I
see on my compression gauge?
Please, scientists, if you could keep it simple, I would appreciate an
explanation.
Again, I don't care about valves having to be open for so long, or rings
having to be by Total Seal; if you could just clear up my thinking, my life
will be good again.
TIA,
Jim (who was hoping he would see 150 psi on his rebuilt TR3 engine) Wallace