I’m just a fabricator and accordingly a bit lacking in theoretical understanding. Maybe y’all can help me with this question about measuring vacuum.
I’m running a vacuum pump/bagging system at 1000 m (3300 ft) elevation. Not super high, but high enough to lose a little atmospheric pressure.
If I had a gauge that was calibrated at sea level and brought it here, it would read about 85 mm Hg (roughly 3.3" Hg) in normal atmosphere, without the pump on, right? And then if I had a pump that could achieve a 750 mm Hg vacuum, the gauge would hit 750 mm Hg with the pump on? And the clamping force would be about 11 kPa less than at sea level.
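In case it helps to see how I’m working the numbers, here’s a quick sketch using standard-atmosphere values (the ~674 mm Hg ambient figure at 1000 m comes from tables, and actual weather will shift it a bit):

```python
# Rough sketch of the numbers, using standard-atmosphere values (approximate;
# local weather moves the ambient pressure around a bit).

SEA_LEVEL_MMHG = 760.0        # standard atmosphere at sea level
AMBIENT_1000M_MMHG = 674.0    # standard atmosphere at ~1000 m elevation
MMHG_TO_KPA = 0.1333          # 1 mm Hg is about 0.133 kPa

# What a gauge zeroed at sea level would show here with the pump off:
pump_off_reading = SEA_LEVEL_MMHG - AMBIENT_1000M_MMHG
print(f"pump off, sea-level-zeroed gauge: ~{pump_off_reading:.0f} mm Hg")    # ~86 mm Hg

# Clamping pressure lost compared to running the same setup at sea level:
print(f"lost clamping pressure: ~{pump_off_reading * MMHG_TO_KPA:.1f} kPa")  # ~11.5 kPa
```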
So what happens with a gauge that’s calibrated here instead? Seems to me that since the gauge is measuring a relative difference, if it’s zeroed at 1000 m then the most I could possibly see is about 675 mm Hg, even with the 750 mm Hg pump running.
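Same kind of back-of-the-envelope sketch for the locally zeroed gauge — note that the 10 mm Hg ultimate pressure for a "750 mm Hg" pump is just my assumption about what that rating means:

```python
# Ceiling for a relative (gauge-pressure) vacuum gauge zeroed at 1000 m:
# it can never read more than the local ambient pressure.
AMBIENT_1000M_MMHG = 674.0    # standard-atmosphere value at ~1000 m (approximate)

max_possible_reading = AMBIENT_1000M_MMHG   # ~675 mm Hg, and only with a perfect vacuum

# If a "750 mm Hg" pump rating means it bottoms out near 760 - 750 = 10 mm Hg
# absolute (my assumption about the rating), the locally zeroed gauge shows:
PUMP_ULTIMATE_MMHG_ABS = 10.0
expected_reading = AMBIENT_1000M_MMHG - PUMP_ULTIMATE_MMHG_ABS   # ~664 mm Hg

print(f"ceiling: ~{max_possible_reading:.0f} mm Hg, with this pump: ~{expected_reading:.0f} mm Hg")
```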
I’ve got two gauges, and at my (worn-out) pump’s max vacuum one reads 680 mm Hg and the other reads 630 mm Hg. Both are calibrated with the rubber plug. Obviously they are cheap gauges and one or both are wrong, but I’m just wondering what I could see theoretically.
Thanks!