Actually, I don't think this is just a theory. The cooling capacity will
be reduced once the flow rate exceeds a certain point. As a former
chemical engineering student, I recall learning this in a class on heat
transfer (yes, it was one whole semester just on this topic). It relates
to the coolant flow changing from what is called "laminar" flow to
"turbulent" flow. There was a time when I might have dug up my old
textbook, which I am sure I still have. But with the Internet, there is
no longer any need for such an old carbon-based form of research.
Perhaps a search on the terms I mentioned will give you something more
definitive than my recollections of a class a few decades in the past.
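For what it's worth, the laminar/turbulent distinction is conventionally
characterized by the Reynolds number for pipe flow. Here is a minimal
sketch; the tube diameter and flow speeds are illustrative values I
picked, not actual MGB radiator specs, and the fluid properties assume
plain water rather than a glycol coolant mix:

```python
def reynolds_number(velocity_m_s, diameter_m,
                    density_kg_m3=998.0, viscosity_pa_s=1.0e-3):
    """Reynolds number Re = rho * v * D / mu for flow in a tube.

    Defaults approximate water near 20 C; hot coolant with
    antifreeze would have different density and viscosity.
    """
    return density_kg_m3 * velocity_m_s * diameter_m / viscosity_pa_s

def flow_regime(re):
    """Rough textbook thresholds for internal pipe flow."""
    if re < 2300:
        return "laminar"
    elif re < 4000:
        return "transitional"
    return "turbulent"

# Illustrative only: a 6 mm tube at a slow and a brisk flow speed.
for v in (0.05, 1.0):
    re = reynolds_number(v, 0.006)
    print(f"v = {v} m/s -> Re = {re:.0f} ({flow_regime(re)})")
```

The point of the sketch is just that raising the flow speed pushes the
flow from the laminar regime toward the turbulent one, which is where
the heat-transfer behavior changes.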
David Councill
67 BGT
72 B
dcouncill@msubillings.edu
-----Original Message-----
From: owner-mgs@autox.team.net [mailto:owner-mgs@autox.team.net] On
Behalf Of Barrie Robinson
Sent: Tuesday, December 06, 2005 10:11 AM
To: mgb-v8@autox.team.net; mgs@autox.team.net
Subject: Cooling related to water flow
Some while back there was some discussion related to the flow rate of
water through the radiator. Some opinions were that if water was passed
through the radiator at a hefty flow it would reduce the cooling
capability - thus not a good idea. My theory is that such a premise does
not (and forgive the pun) hold water. Perhaps the reduced cooling was
due to water pump cavitation and not to the flow rate. If someone would
point me towards any authoritative dissertations I would be most
grateful, and thank you in advance for any responses.
Regards
Barrie Robinson