Pardon my French!
This is a term used by tool & die makers to indicate unobtainable levels of (perceived) precision. Why do I bring this up?
Last week, I was advising a reader on selecting pin gages for use in measuring chamber throats. The discussion revolved around which gages to buy, and whether or not he needed both plus- and minus-tolerance gages (no, in case you’re wondering.) He was concerned about their tolerance of .0002″ (that’s 2/10,000ths of an inch, or 1/20th the thickness of an average human hair. In machinist parlance, that would be “2 tenths.”) As I explained to him, in practice it’s not really possible to measure to that level.
As I thought about my answers to his questions, I flashed back to a conversation related to the posts I’ve made about measuring tools. A fellow who identified himself as a gunsmith recently contacted me to argue about my advocacy of quality measuring tools. “I don’t need any of them overpriced tools – I use [insert name of well-known retailer of low-end Chinese tools here], and I can measure down to a ten-thousandth!” I asked him if what he was measuring was under the same environmental conditions as the calibration on his micrometer, and he replied “my mic reads to a tenth – it don’t need to be calibrated!”
When a measuring instrument is calibrated – that is, checked against known standards and certified as to accuracy – the environmental conditions of that calibration are recorded. The calibration is really only valid for those same conditions; if the temperature goes up or down, that accuracy is not guaranteed.
How much difference does a change in temperature make? I did a little experiment. I got out my Grade 2 Brown & Sharpe gage blocks and picked out the .125″ block. (The tolerance for Grade 2 blocks is +/- .000002″, or two-millionths of an inch.) The calibration certificate gives the deviation from the nominal dimension, in millionths of an inch, for each block. In the case of my .125″ block, it has no recorded deviation – in other words, it is guaranteed to measure .125000″ at 68 degrees F. Coincidentally, that is the temperature my shop generally maintains outside of the coldest winter and warmest summer months.
After checking the temperature, I pulled out my best Etalon (Swiss) micrometer and the .125 block. I handled the mic with gloves while I secured it in its stand; the block was handled with insulated tweezers (yes, there are such things.) I measured the block under these conditions, and not surprisingly it measured .1250″ on the nose.
I took the block out of the micrometer, held the non-measuring surfaces between my thumb and forefinger for about a minute, then remeasured. Guess what? Just that small amount of heat had caused the gage to grow to a bit more than .1251″ (a typical mic only reads to a ten-thousandth, and this fell just between the .1251″ and .1252″ marks.) Had I held on to it longer, it would have grown a bit more. Had I held the mic in my hand while measuring, it too would have been thrown “off” by the expansion of its metal frame.
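If you want a back-of-envelope feel for why handling matters, the physics is just linear thermal expansion: the change in length is the expansion coefficient times the length times the temperature change. Here’s a quick sketch; the coefficient (roughly 6.4 millionths per inch per degree F for gage-block steel) and the 20-degree warm-up are my assumed round numbers, not measurements from the experiment above:

```python
# Back-of-envelope linear thermal expansion: dL = alpha * L * dT.
# Assumptions (illustrative, not measured): steel with
# alpha ~ 6.4e-6 in/in per deg F, warmed ~20 deg F by handling.

ALPHA_STEEL = 6.4e-6  # in/in per degree F, typical for hardened steel

def expansion(length_in, delta_t_f, alpha=ALPHA_STEEL):
    """Change in length (inches) for a given temperature change (deg F)."""
    return alpha * length_in * delta_t_f

# The .125" block itself, warmed 20 F:
block_growth = expansion(0.125, 20)   # ~16 millionths of an inch

# A 1" span of a micrometer frame, warmed the same 20 F:
frame_growth = expansion(1.0, 20)     # ~128 millionths -- over a tenth!

print(f"block: {block_growth:.6f} in, frame: {frame_growth:.6f} in")
```

Notice that the short block itself only picks up a few tens of millionths, but the much longer frame of the instrument moves well over a tenth with the same warm-up – which is why the tool in your bare hand is at least as big a problem as the part.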
That’s why they’re called “bullshit tenths” – because, without knowing exactly the temperature of both the micrometer and work, and at what temperature the micrometer was last calibrated, you really don’t know to the ten-thousandth of an inch how big that part really is. In other words, until you’ve met all of the above, you can’t measure to a ten-thousandth of an inch, no matter how optimistic you are!
Since pin gages are usually held in the hand, as is the piece being measured, it would not be possible to get closer than several ten-thousandths. Factor in the other environmental variables, and it’s clear that a) the gages are more accurate than they need to be for the job asked of them; b) you can’t measure to the limit of the gages, so you don’t need both the plus and minus coverage; and c) worrying about their allowed +/- .0002″ isn’t at all productive. Save your stomach lining for more important things.
Hope this all makes sense!
-=[ Grant ]=-