My Rockwell tester is kept indoors, in a clean place away from the shop, with central heating and cooling. ISO recommends testing at 23 °C ±5 °C.
Thermostat reads 23.3 °C / 74 °F.
I was just using it earlier today, and it read everything fine.
With the $30 indenter, readings trended 1 HRC low.
Switched back.
Everything back to normal.
Same temperature, same day, same tester, same operator, same procedure.
I use a LOT of precision measuring instruments in my work, though a hardness tester isn't one of them. My handheld Rockwell tester has to be calibrated before every use to allow for temperature differences. Many of the precision gages at work are "user calibrated" no less than once per shift. Air spindles for bores, slide gages for ODs, indicator mics, regular mics, height stands, roll testers for gears: all are checked and rechecked during normal use. When you're measuring ±.0001, and .4999-.5002 is a tolerance you see nearly every day, it's that critical. Temperature affects readings.
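To put a number on the temperature point above, here's a minimal back-of-the-envelope sketch. It assumes a typical carbon-steel expansion coefficient (roughly 6.4e-6 in/in per °F; exact alloys vary), which is not from the post itself:

```python
# Rough thermal-expansion estimate: how much a small temperature shift
# moves a part that is toleranced to a few ten-thousandths of an inch.
ALPHA_STEEL = 6.4e-6  # in/in per deg F, typical carbon steel (assumed value)

def thermal_growth(length_in, delta_t_f, alpha=ALPHA_STEEL):
    """Change in length for a part of length_in heated by delta_t_f."""
    return length_in * alpha * delta_t_f

# A nominal 0.5000" bore measured 5 deg F warmer than its master:
growth = thermal_growth(0.5000, 5.0)
print(f"{growth:.6f} in")  # about 0.000016"
```

About 16 millionths for only a 5 °F swing, a meaningful slice of a .4999-.5002 window, which is why those gages get rechecked every shift.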
If your tester is not in a temperature-controlled environment, that might be part of the new issue since you changed indenters. I'd calibrate the tester to read what the test block is verified to, and roll with it. After all, if you have a 60 HRC test block, you want your machine to read 60 HRC when testing it. If it does, you ARE in line with other testers. Like a lot of other measuring devices, hardness testers only show us a correlation between a known value and an unknown one, such as your test block and your knife steel. "If my machine reads 60 HRC on a verified 60 HRC block, then I can be reasonably certain that when it reads 62 on my knife tang, it is, within acceptable tolerances, 62 HRC."
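That test-block correlation logic can be sketched in a few lines. This is purely illustrative with made-up numbers; a real tester is adjusted mechanically or via its own calibration routine, not in software like this:

```python
# Sketch of the correlation logic: check the machine against a certified
# test block, derive an offset, and apply it to workpiece readings.
BLOCK_CERTIFIED = 60.0  # HRC value stamped on the verified test block (example)
BLOCK_MEASURED = 59.0   # what the machine actually reads on that block (example)

offset = BLOCK_CERTIFIED - BLOCK_MEASURED  # here, a +1.0 HRC correction

def corrected(reading_hrc, correction=offset):
    """Apply the test-block correction to a raw machine reading."""
    return reading_hrc + correction

print(corrected(61.0))  # a raw 61 on the knife tang reports as 62.0 HRC
```

The point is the same as in the quote: the block anchors the unknown reading to a known value, so the correction carries over to the knife within acceptable tolerances.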