I was doing an experiment on water hammer in pipes. During the experiments, in several cases, the transducer reported negative pressure, as low as -1.9 bar gauge! But I have read that it is practically impossible for the pressure to go below absolute zero (a perfect vacuum, which is about -1.013 bar gauge). I thought it might be due to some calibration issue with the transducer, so I recalibrated it and double checked the experimental values. I did the same experiment again, but the pressure still dropped to -1.9 bar gauge (about -190 kPa) when cavitation occurred inside the pipe.
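To make the inconsistency concrete, here is a minimal sketch of the gauge-to-absolute conversion (assuming a standard atmosphere of 1.013 bar; the exact local barometric value would shift the numbers slightly), showing that a reading of -1.9 bar gauge implies a negative absolute pressure:

```python
# Convert a gauge reading to absolute pressure.
# Assumes standard atmospheric pressure of 1.013 bar (an assumption;
# substitute the local barometric value for a precise check).
P_ATM_BAR = 1.013

def gauge_to_absolute(p_gauge_bar):
    """Absolute pressure (bar) from a gauge reading (bar)."""
    return p_gauge_bar + P_ATM_BAR

# A perfect vacuum corresponds to -1.013 bar gauge, so any gauge reading
# below that implies a (physically impossible) negative absolute pressure.
reading = -1.9  # bar gauge, as reported by the transducer
print(gauge_to_absolute(reading))  # about -0.887 bar absolute
```

So a sustained -1.9 bar gauge would mean roughly -0.89 bar absolute, which is below a perfect vacuum; hence the puzzle.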
The pressure transducer I am using is a 0-30 bar transducer (I forget the company name), and I am measuring the values at 50,000 samples per second. The operating pressure inside the pipe under steady flow conditions is 1.7 bar gauge, and that is being shown correctly.
Can anyone give a possible explanation for this?
The graph shows the actual pressure vs. time trace obtained during the analysis. The sampling frequency is 100,000 Hz, and the amplitude axis shows the pressure value in bar, on the gauge scale. The time axis is in units of 10 microseconds.
