I have a Dakota Ultrasonics gauge. We test almost every block that comes through. Some need the bores shifted while boring to preserve the major thrust side, some don't qualify, some are OK.

My gauge allows me to physically calibrate the observed reading and sound velocity against a known measurement on the block. The front main web with both sides machined is a good place to do this. Mine also has a "probe zero" function that must be performed before each use. Simply measure the spot with a micrometer, test it with the ultrasonic gauge, then calibrate the observed reading.
Changing the observed measurement to the actual measurement will automatically change the sound velocity in the gauge. That velocity will always vary depending on the nickel, tin, and lead content of the iron in each individual block.
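The math behind that calibration step is simple pulse-echo geometry: thickness equals velocity times half the round-trip time, so forcing the observed reading to match the known measurement rescales the velocity by the same ratio. This is a minimal sketch of that relationship, not Dakota's actual firmware; the function name and numbers are illustrative.

```python
# Hedged sketch of pulse-echo velocity calibration (not the gauge's actual code).
# thickness = velocity * round_trip_time / 2, so correcting the observed
# reading to a known thickness rescales velocity by the same factor.

def recalibrated_velocity(v_old, observed, actual):
    """New sound velocity (in/us) after forcing the observed reading
    to the known, micrometer-verified thickness (inches)."""
    return v_old * (actual / observed)

# Illustrative example: gauge was set to 0.1825 in/us and read 0.180"
# where a micrometer shows 0.200"; this block's iron is faster.
v_new = recalibrated_velocity(0.1825, 0.180, 0.200)
print(round(v_new, 4))  # 0.2028
```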

Assuming that every block has a sound velocity of 0.1825 in/us leaves a LOT of room for error. Cast iron can vary from 0.138 to 0.220 in/us.
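To see how big that error can get, here is a quick sketch of what a gauge left at the default 0.1825 in/us would display across the cast-iron velocity range. The function and the 0.200" example wall are assumptions for illustration.

```python
# Hedged sketch: what an uncalibrated gauge displays. The gauge times the
# round trip, then converts to thickness using its ASSUMED velocity.

def displayed_thickness(true_thickness, v_true, v_assumed=0.1825):
    """Reading shown by a gauge set to v_assumed (in/us) for a wall of
    true_thickness (inches) in iron whose real velocity is v_true."""
    round_trip = 2 * true_thickness / v_true   # what the gauge actually measures
    return v_assumed * round_trip / 2

# A true 0.200" wall across the quoted cast-iron range:
low = displayed_thickness(0.200, 0.220)   # fast iron: gauge under-reads
high = displayed_thickness(0.200, 0.138)  # slow iron: gauge over-reads
print(round(low, 4), round(high, 4))  # 0.1659 0.2645
```

So the same 0.200" wall can read anywhere from roughly 0.166" to 0.264" if you never calibrate, which is exactly why the velocity has to be verified per block.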

To me, a data sheet without the sound velocity information (verified against a comparative analog measurement) written right on it would be useless.

I had a shop bring me a block with a data sheet showing some cylinders in the .070" range. He didn't believe it, so I did my calibration and test. I found the block to be well over .125" everywhere.

Moral of the story: probe zero, calibrate your sound velocity, and record all data.

Last edited by cedarmachine; 04/15/15 09:03 AM.