Hello all. I am just looking for some clarification here. We need gauges made for this component to check true position, and the gauge drawing we received a while back calls out a true position tolerance of 0.010 mm for each individual pin relative to the datum. The gauge needs to be made with a tight OD tolerance so the concentricity measurement is correct, but it also has to check the bolt circle diameter to verify true position. As you can imagine, 10 µm is very difficult to hold across 6 individual pins when manufacturing a gauge.

When I brought this up with our parent company, they told me the gauge was designed correctly to DIN 1938-1, and that the position tolerance is so tight because the pins can be at their largest or smallest size, or even a different size per pin. The pins on the gauge are Ø10.015 ± 0.003 mm, but are allowed to wear down to Ø10.005 mm. The ID on the gauge (which checks the OD of the component) is Ø94.998 ± 0.003 mm, but is allowed to wear up to Ø95.008 mm.
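For context on why I'm questioning this, here is the rough worst-case stack-up as I see it. This is only a sanity-check sketch: the gauge-side numbers come from our drawing, but the component hole size and position tolerance (`hole_mmc`, `hole_pos_tol`) are placeholders I made up for illustration, since I can't share the part print.

```python
# Rough worst-case stack-up for a functional position gauge.
# Gauge-side numbers are from our gauge drawing; the component-side
# numbers (hole_mmc, hole_pos_tol) are made-up placeholders.

# Gauge pin: Ø10.015 ± 0.003 mm new, allowed to wear down to Ø10.005 mm
pin_high = 10.018          # largest new pin
pin_wear_limit = 10.005    # smallest allowed worn pin
pin_pos_tol = 0.010        # true position tolerance per pin (the 10 µm in question)

# Placeholder component hole, checked at MMC by the gauge pins:
hole_mmc = 10.100          # hypothetical MMC hole diameter
hole_pos_tol = 0.075       # hypothetical position tolerance at MMC

# Functional-gauge idea: pin diameter sits near the hole's virtual
# condition, so pin size spread, wear allowance, and pin position
# error all eat into the tolerance available to a marginal part.
size_band = pin_high - pin_wear_limit         # 0.013 mm over gauge life
gauge_consumption = size_band + pin_pos_tol   # 0.023 mm worst case

print(f"pin size band incl. wear: {size_band:.3f} mm")
print(f"worst-case gauge consumption: {gauge_consumption:.3f} mm "
      f"({100 * gauge_consumption / hole_pos_tol:.0f}% of the placeholder part tolerance)")
```

Even with those made-up part numbers, the gauge's own size band plus wear allowance already consumes a big slice of the part tolerance, which I assume is the parent company's argument for keeping the pin position tolerance at 0.010 mm.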
My question is: is there any actual reason this true position tolerance on the gauge has to be that tight according to that DIN spec? What specifically in DIN 1938-1 calls for it? We have a copy of the standard, but skimming through it I was not able to find such a requirement.