Topic | Calibration/verification AC far out of spec?

    • #2524 Reply
      vortexico
      Guest

      I am one of the few people who have a Fluke calibrator (5522A) at my disposal.
      I ordered 2 meters, and on both of them the AC ranges (voltage and current) do not comply with the 1% spec.

      Sammy linked an Excel file with a few measurements in it that were taken with a Fluke 5080A. Those were in spec.

      I measured Meter 1 first, but I couldn’t generate a report of it, so it’s a screenshot.
      Today I measured Meter 2 and did it in the right software, so I could generate a nice calibration report.
      Yes, I work in a Fluke calibration laboratory…

      As you can see: the only signals that are in spec are the 50Hz ones, and even at 500Hz the signals do not meet the tolerance.

      I made a few mistakes myself, but it would cost too much time to do it all over again.
      1. IDC 100mA: I used too low a resolution (100mA instead of 100.0mA),
      so 1 digit is 1% of reading and therefore 100% of the tolerance.
      We always generate with 1 digit more,
      but then 1 digit is still 10% of the 1% spec. Hence the ?
      2. 10kOhm range: same story: I used 9kOhm instead of 9.000kOhm.
      3. 0 Ohm: I used too small a tolerance; I should have measured at 0.0 Ohm.
      The real value was 0.7 Ohm, by the way.

      @James
      – what did you do to verify the meters?
      – how did you come up with the 1kHz bandwidth?

      Another thing: I think the VAC/IAC ranges are expressed in Vpp and Ipp,
      but the measurements are in Vrms and Irms.
      So the ranges should actually be 0.8V, 42V and 420V.
      The same goes for IAC, of course.

    • #2545 Reply
      admin
      Keymaster

      Hi Vortexico,

      Thank you so much for hooking up to the calibrator! This is really helpful information.

      Regarding AC calculation: By default the AC calculation is done by sampling at 4kHz with a sample buffer 256 samples long. The calculation is done on board the meter itself between the two zero crossings nearest the edges of the buffer.
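
      In rough C, the idea is something like this. It is just a simplified sketch of the approach, not the production firmware – the buffer handling, types and crossing detection here are illustrative assumptions:

          #include <math.h>
          #include <stddef.h>

          /* Sketch: compute RMS over whole cycles only, by trimming the sample
             buffer to the zero crossings nearest its edges. */
          static float buffer_rms(const float *buf, size_t n)
          {
              size_t start = 0, end = n - 1;

              /* first positive-going crossing from the left */
              while (start + 1 < n && !(buf[start] <= 0.0f && buf[start + 1] > 0.0f))
                  start++;
              /* last positive-going crossing from the right */
              while (end > start && !(buf[end - 1] <= 0.0f && buf[end] > 0.0f))
                  end--;

              if (end <= start) {       /* no full cycle found: use the whole buffer */
                  start = 0;
                  end = n - 1;
              }

              double sum_sq = 0.0;
              for (size_t i = start; i < end; i++)
                  sum_sq += (double)buf[i] * buf[i];
              return (float)sqrt(sum_sq / (double)(end - start));
          }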

      The 4kHz rate performs well for 60Hz waveforms, which is what the vast majority of users measure. But you’re right to point out that it doesn’t behave well at higher frequencies, and when I was validating the numbers before the meter went to production I was using an 8kHz sample rate. I don’t want to make more trouble for you, but if it’s not hard I would love to see what the calibrator numbers look like when measuring at 8kHz and a 256-sample depth.

      So to try to directly answer your questions:

      – what did you do to verify the meters?
      With the pre-production prototypes, I hooked a few up to a signal generator in parallel with a known good meter – I think a Fluke 115 – and swept through a range of amplitudes and frequencies while checking the outputs matched within 1%.

      The final calibration is all DC though, so not all the meters that go out into the wild have their bandwidth checked.

      – how did you come up with the 1kHz bandwidth?

      By design, this is where the low-pass filters on the inputs are 1% down (or 0.5%… I would need to dig through my notebooks to remember).

      I tested it experimentally as described above.

      – Another thing: I think the VAC/IAC ranges are expressed in Vpp and Ipp
      But the measurements are in Vrms and Irms.

      I understand why you would think that because I’m reusing the DC labels, but no, the range is actually expressed as RMS. So the 600V range can actually measure DC up to around 1000V.

      Thank you for using the calibrator – it’s a level of hardware I didn’t have access to when designing this meter. Those AC numbers >60Hz are disappointing, so I’d like to get to the bottom of this. If you can, I’d really like to know how it behaves with the sample rate pumped to 8K.

      Best
      ~James

    • #2560 Reply
      Anonymous
      Guest

      You mean you measure at the 2 zero crossings of the 4kHz sample frequency, right?
      Because measuring at the zero crossings of the signal would be strange.
      So you actually measure with 8kS/s then?

      …60Hz waveforms, what the majority…
      Americans… :P Most of the world uses 220V-240V @ 50Hz :P
      But that shouldn’t make a big difference in the measurements.

      *****************************************************
      About my measurement method (interesting for most people to know):
      I connect the Fluke 5522A calibrator directly to the Mooshimeter through a shielded test cable.
      Only for current do I use standard twisted banana cables, because shielded cables are not made for higher currents.
      For lower resistances (<10k) I use compensated 2-wire (4-wire to the calibrator), which is generated with 1mOhm accuracy.

      For high resistances with tight tolerances (the high-end meters) we use special measurement banana cables which are better insulated.
      That is where leakage current comes into play. We keep the + cable away from the – cable, of course.
      Our accredited lab also has to

      The calibrator generates the resistances, DC/AC voltages and currents with very nice specs.
      For example, I generate the 240mV with 0.00242% uncertainty, the 9A with 0.0556% and the 1MOhm with 0.0034%.

      Besides that, the 5522 is measured every year in our accredited lab.
      The measured deviations are logged to ini files and used in the uncertainty shown on the report.
      The interesting part: it doesn’t matter if it actually generates 1.1V when we want 1V;
      as long as it only drifts 13ppm per year, we still have an accuracy of 0.0013%.

      I can take a few more measurements at 8kHz with 256 samples;
      even better would be 8kHz/512 samples, since that covers the same 64ms time window as 256 samples at 4kHz.

      *****************************************************
      About the 1kHz:
      The spec website states:
      Frequency: Better than 1% accuracy up to 1kHz
      Sampling: 4kHz analog bandwidth for most measurements

      *****************************************************
      Your measurement method:
      I don’t know what the amplitude tolerance of your frequency generator is,
      but I guess about 2%, assuming it was recently calibrated.
      The PM5136, for example, has 2% AC voltage accuracy.

      Fluke 115:
      45 – 500Hz: 1% + 3 digits
      500Hz – 1kHz: 2% + 3 digits
      Since the bandwidth of the meter is only 1kHz, you have to cross-reference it to know what it does at 1kHz.
      So I think you measured with 3% tolerance…

      This is the reason why cheap DMMs often don’t meet their specs:
      they didn’t care, or they didn’t have the equipment to verify the specs.

      Of course I don’t blame you for not having such costly equipment :P
      We can buy a small house with that money… (it’s 70k with the 1.1GHz scope option)

      So to be continued…

    • #2561 Reply
      admin
      Keymaster

      Hi Vortexico,

      “You mean you measure at the 2 zero crossings of the 4kHz sample frequency, right?
      Because measuring at the zero crossings of the signal would be strange.”

      The RMS is calculated between two zero crossings of the signal. Catching a partial cycle throws off the RMS reading.

      “So you actually measure with 8kS/s then?”
      Not sure I understand. Right now the only way to measure with 8ksps is to set it manually through the app. I did early testing at 8ksps as well.

      Actually, now that you’ve got me thinking about it, there’s a filter that was in early versions of the firmware and didn’t get added back in later that might account for some of the error at higher signal frequencies. In early versions of the firmware the zero crossings were fractional – the firmware was clever enough to say the zero crossing happened 5/16ths of the way between samples 4 and 5, for example. This got taken out somewhere along the line, I think because processing time became an issue and because the effect was minimal at 50/60Hz, but it might be having a larger effect at higher frequencies. Let me take a look.
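
      For reference, the fractional crossing was essentially just linear interpolation between the two samples straddling zero – something along these lines (a from-memory sketch, not the shipping code):

          #include <stddef.h>

          /* Sketch: fractional zero-crossing position by linear interpolation.
             If buf[i] and buf[i+1] straddle zero, the crossing lies at
             i + buf[i] / (buf[i] - buf[i+1]) sample periods into the buffer. */
          static float fractional_crossing(const float *buf, size_t i)
          {
              float a = buf[i];      /* sample before the crossing */
              float b = buf[i + 1];  /* sample after the crossing (opposite sign) */
              return (float)i + a / (a - b);
          }

      At 4kHz a 1kHz signal only has 4 samples per cycle, so rounding the RMS window to whole samples instead of using the fractional crossing can shift it by a sizeable fraction of a cycle, which would fit with the error growing at higher frequencies.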

      “Americans… :P Most of the world uses 220V-240V @ 50Hz :P”
      Don’t worry! I know! I just didn’t think it was an important distinction to make in this discussion :)

      “About the 1kHz:
      The spec website states:
      Frequency: Better than 1% accuracy up to 1kHz
      Sampling: 4kHz analog bandwidth for most measurements”

      Yup. If I can’t get this figured out and there’s a mistake, I’ll downgrade the frequency spec. Obviously I’d prefer not to do that; I think this is a firmware issue that can be hunted down.

      “…Since the bandwidth of the meter is only 1kHz, you have to cross-reference it to know what it does at 1kHz.
      So I think you measured with 3% tolerance…”

      Yes, the setup I described is 3% tolerance. I did some other tests too which had much better tolerance. Sorry, it’s been almost a year since I was doing this validation, so it’s not all fresh in my head.

      I checked my lab meter and oscilloscope against a traceable 0.01% DC reference immediately beforehand (voltagestandard.com sells a couple of nice options). I think once I got above 100Hz I switched over to calculating RMS on my oscope. I don’t think I calibrated for signal attenuation on my oscope because I wasn’t really worried about it; it should be flat way up past 10MHz, so 1kHz didn’t worry me. So yes, my initial description was of a 3% check, but I did some more that were much better. I wasn’t really expecting this depth of review :)

      Anyway, I’m very interested to know what you find at 8ksps. You’ve got me thinking about AC measurement past 60Hz again. I am not in a place where I can check right now but I have a strong suspicion that the AC error you’re seeing is numerical in nature and patchable. I’ll let you know what I find when I can get to it (probably early next week).

      Thanks again
      ~James

    • #2580 Reply
      Anonymous
      Guest

      I promised you a few more measurements at 8kHz.
      See here the difference.
      The differences are not very large, but you can see that at the 8kHz sample rate the values are a bit closer to what they should be. Still more than 5% off, though.

      I don’t know what the best algorithm for calculating RMS is, or whether it is true RMS or not, but I assume you chose the best one.
      Not having that in-depth knowledge is probably why I don’t understand why you sample at the 2 zero crossings of the signal.
      Isn’t it sufficient to calculate a running average (of the absolute values, of course)?
      Let’s say: calculate the average of the last 256/512 samples every 64 samples.
      Or average 64 samples and use that as 1 value in the running average.
      (An 8-value running average should give the same result as above, but probably with less processor time.)
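
      Roughly what I have in mind, just as a sketch (and to be clear, averaging absolute values gives the rectified mean, not true RMS, so for a sine you would still need the form factor of about 1.11 to display an RMS value):

          #include <math.h>

          #define BLOCK_LEN  64
          #define NUM_BLOCKS 8            /* 8 x 64 = the last 512 samples */

          static float block_avg[NUM_BLOCKS];
          static int   block_idx;

          /* Feed one 64-sample block; returns the running rectified mean
             over roughly the last 512 samples. */
          static float running_abs_average(const float *samples)
          {
              float sum = 0.0f;
              for (int i = 0; i < BLOCK_LEN; i++)
                  sum += fabsf(samples[i]);
              block_avg[block_idx] = sum / BLOCK_LEN;
              block_idx = (block_idx + 1) % NUM_BLOCKS;

              float total = 0.0f;
              for (int i = 0; i < NUM_BLOCKS; i++)
                  total += block_avg[i];
              return total / NUM_BLOCKS;
          }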

      About equipment again:
      Watch out with using scopes as a reference as well. Digital scopes tend to have a DC accuracy of 1.5 – 2.5% plus some zero error (even the high-end ones).
      I would recommend searching for a 6.5-digit bench DMM.
      Most of the time those are accurate and have low drift over the years.
