Probe Calibration Standard Deviation & Artifact for verification


  • Probe Calibration Standard Deviation & Artifact for verification

    Hi everyone. Happy New Year!
    I've done several reviews and searched several related topics, but couldn't find anything on this.

    1) For the standard deviation of the probe tip calibration, where does the value usually come from? Are there any related specs for reference?
    Or is it set according to the CMM's limits, or is it up to us to set the value?
    Currently I'm using PC-DMIS 2016.0, and the default value is 0.01.

    2) I'm still new to the NADCAP M&I requirements, so for the artifact used for verification, are we really required to use a big artifact for a big CMM?
    Currently I'm using a Hexagon Global Advantage.





  • #2
    1) As far as I know about the standard deviation, you usually want the value to be as close to zero as possible. I think it's usually determined by the main or lead programmer; I don't think there is any reference document on it. Depending on part tolerances, the number of points, touch-trigger vs. scanning, and comparison to hand tools, you can figure out more or less how much deviation is acceptable. Less than 0.005" is acceptable on my touch-trigger CMMs and 0.003" on my scanning CMMs (a rough sketch of that check follows below). Anything higher than that and I do a thorough cleaning of the probes, CMM head, and scales, re-home the CMM, and restart the probe calibration.
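
    A minimal sketch (plain Python, not PC-DMIS code) of that acceptance check: compute the standard deviation of the calibration hit radii and compare it to a limit. The 0.005"/0.003" limits come from the post above, and the sample radii are made-up values for illustration only.

    ```python
    import math

    def calibration_std_dev(radii):
        """Sample standard deviation of the measured hit radii (same units as input)."""
        mean = sum(radii) / len(radii)
        variance = sum((r - mean) ** 2 for r in radii) / (len(radii) - 1)
        return math.sqrt(variance)

    # Assumed acceptance limits in inches, mirroring the values quoted above.
    ACCEPTANCE = {"touch_trigger": 0.005, "scanning": 0.003}

    def calibration_ok(radii, probe_type):
        """True if the calibration std dev is within the limit for this probe type."""
        return calibration_std_dev(radii) <= ACCEPTANCE[probe_type]

    # Hypothetical radii (inches) measured on the calibration sphere.
    radii = [0.3752, 0.3749, 0.3751, 0.3748, 0.3750]
    print(calibration_std_dev(radii), calibration_ok(radii, "touch_trigger"))
    ```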

    2) Not sure about the NADCAP side, but the same size sphere can be used on any size machine; I don't think bigger machines need bigger calibration spheres in order to operate. Per Hexagon, steel spheres are preferred for touch-trigger probes and ceramic for scanning.



    • acgarcia commented:
      It could be lower; most of the time the std dev is out in the fifth decimal place, under 0.0005".

    • JEFMAN commented:
      It depends on the number of hits, but I would say that's very, very large as a calibration acceptance limit...

    • Aliffadhol commented:
      Actually, the artifact is a sample part that is used for inspection. It looks similar to part validation, to ensure the inspection results match the drawing.

  • #3
    I think the std dev result (sigma) from a calibration means that roughly 95% of the hits fall within ±2 sigma of the measured radius.
    So if you get 0.0005 mm (which is a normal result for an LSPX1 with a short tip), it means that about 95% of the radii measured around the sphere are within ±1 µm (a small simulation of this is sketched below).
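
    A rough illustration of that ±2 sigma interpretation, assuming the hit errors are approximately normally distributed. This is plain Python, not PC-DMIS; the sphere radius, hit count, and sigma value are just example numbers.

    ```python
    import random
    import statistics

    mean_radius = 12.7   # mm, nominal radius of an assumed 25.4 mm calibration sphere
    sigma = 0.0005       # mm, the reported calibration standard deviation

    # Simulate 1000 hits and count how many radii land inside the ±2*sigma band.
    hits = [random.gauss(mean_radius, sigma) for _ in range(1000)]
    inside = sum(abs(r - mean_radius) <= 2 * sigma for r in hits)

    print(f"std dev = {statistics.stdev(hits):.5f} mm, "
          f"{100 * inside / len(hits):.1f}% of radii within ±2 sigma (±0.001 mm)")
    ```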



    • #4
      You might find something useful here: https://www.npl.co.uk/resources/gpgs (Good Practice Guide No. 42 covers CMM verification elements).



      • #5
        My 30+ year old Validator is set to 0.005 mm for max STDDEV, and I generally get 0.002 to 0.003 mm.



        • KIRBSTER269 commented:
          which is more dependable, you or the Validator?

        • Matthew D. Hoedeman commented:
          that would be me, if you count all the years I've been running it. I've missed fewer days than it did before it got a retrofit.
