
UT61E Protocol Description

Posted on March 18, 2013

Intro:

The protocol used by the UT61E is quite simple: each packet contains 14 bytes. The meter streams packets constantly as the screen updates, at around 2 packets a second. The 14 bytes are basically a string containing the range, digits, function and status.

The serial interface settings are:
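The settings table from the original post didn’t survive the re-post. The values below are the ones commonly reported for the UT61E (19230 baud, 7 data bits, odd parity, 1 stop bit); treat them as an assumption rather than something taken from this post. The framing helper is a minimal sketch of how a reader of the continuous stream might resynchronise on packet boundaries:

```python
# Serial parameters commonly reported for the UT61E -- an assumption here,
# since the original post's settings table is missing.
UT61E_SERIAL_SETTINGS = {
    "baudrate": 19230,
    "bytesize": 7,       # 7 data bits
    "parity": "odd",
    "stopbits": 1,
}

PACKET_LENGTH = 14  # each packet is 14 bytes, terminated by CR LF


def frame_packets(stream: bytes) -> list:
    """Split a raw byte stream into complete 14-byte packets.

    The meter streams continuously (about 2 packets per second), so a
    reader may start mid-packet; we resynchronise on the CR LF terminator
    and drop any fragment that isn't exactly 14 bytes long.
    """
    packets = []
    buf = bytearray()
    for b in stream:
        buf.append(b)
        if len(buf) >= 2 and buf[-2:] == b"\r\n":
            if len(buf) == PACKET_LENGTH:
                packets.append(bytes(buf))
            # Wrong length means we joined mid-packet; discard and resync.
            buf.clear()
    return packets
```

In practice you would feed `frame_packets` whatever bytes arrive from the serial port (e.g. via pyserial); resynchronising on CRLF means a reader that attaches mid-stream still recovers after at most one discarded fragment.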

Describing the protocol:

This snippet shows you how the data is separated. I handle the 14 bytes as a string, which simplifies the process of separating each portion, and since this is a low-sampling-rate, low-priority application, it’s not a resource hog.
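The snippet itself didn’t survive the re-post, so here is a minimal stand-in for the string-slicing approach described above. The field positions (range, five digit bytes, function, status, four option bytes, then CRLF) are an assumption based on common descriptions of this protocol, not taken from the original code:

```python
def split_packet(packet: str) -> dict:
    """Slice a 14-byte packet, handled as a string, into its portions.

    Field positions are an ASSUMPTION based on common descriptions of
    this protocol family, not taken from the original (lost) snippet.
    """
    assert len(packet) == 14, "each packet is exactly 14 bytes"
    return {
        "range":    packet[0],      # range byte
        "digits":   packet[1:6],    # five display digits
        "function": packet[6],      # selected function (V, A, Ohm, ...)
        "status":   packet[7],      # status byte (sign, overload, ...)
        "options":  packet[8:12],   # four option/flag bytes
        "crlf":     packet[12:],    # end-of-packet delimiter
    }
```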

I believe the last two bytes are the “end of packet” delimiter (CRLF), however I currently cannot test this. I wrote the code a long time ago and forgot to comment on this slight detail, but as you can see I’ve defined the CRLF constants, so it must be there.

A list of constants:

Here are some constants from my (unpublished) UT61E library…
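The constants list didn’t survive the re-post either. As a stand-in, here are the structural constants the text above does pin down (14 bytes per packet, CRLF at the end), plus a trivial sanity check built from them; the actual function and status codes live in the controller’s datasheet and aren’t reproduced here:

```python
# Structural constants grounded in the protocol description above:
# 14 bytes per packet, terminated by CR LF.
CR = "\r"
LF = "\n"
CRLF = CR + LF
PACKET_LENGTH = 14


def is_complete_packet(packet: str) -> bool:
    """Cheap sanity check before splitting a packet into its fields."""
    return len(packet) == PACKET_LENGTH and packet.endswith(CRLF)
```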

That’s all for now. I currently don’t have a serial interface to test with (ain’t got the USB cable either), so I can’t finish the library in order to post it, but hopefully I’ll get it done eventually.

The software supplied by UNI-T is pretty bad, and it’s Windows-only; hence the drive to write my own.

As it is, it should give someone a head start if they’re about to write their own front-end. The datasheet for the UT61E’s controller explains it all in fine detail, but I can’t recall the part number at the moment.

This whole thing was part of a bigger picture, but I had to give up the concept due to lack of funding.

Alright, enough of this shoulda, coulda, woulda!

Cheers,
Gus

UT61E Calibration – It’s easy as 1,2,3…

Posted on June 19, 2012

Where it all began…

Alright, my UT61E seemed to be off in DCV mode by a few millivolts. Nothing to be worried about, but me being me, I went ahead and got some precision voltage references, built a voltage reference box, and proceeded to calibrate it to within 0.05%.

There doesn’t seem to be a lot of information on how to calibrate most meters, unless they are high-end and high-volume (such as some Flukes, but then again you’d typically send those out for calibration to get them certified…), so I’m going to describe the process and where to locate the right pot. Thankfully, in this case the silkscreen did most of the work for me, but you may still be wondering “is this the right pot?” – Why, yes, it turns out it is.

Before you begin:

It’s imperative to have a precision voltage reference (or a known-good, calibrated meter) and a stable voltage supply to work with. It’s also ideal to have more than one reference to test against.

Make sure you use short test leads to connect to your reference, if the reference doesn’t do any compensation of its own. Another important aspect of this process is to know your reference’s specs, and to actually perform the calibration in a place with a temperature between 20 and 25 degrees Celsius.

Another important aspect:

Keep in mind your battery should be new, or at 80% of its charge or better, before you begin the calibration. I disregarded this and kept my poker face throughout the process.

Turns out this meter is quite stable and has a broad operating voltage with very little deviation compared to my other meters. I will plot a comparison chart demonstrating this in the near future.

Usually the calibration is performed at the base range; in this case, the 22,000-count meter has a 2.2000 V base range, so I began calibrating with my 2.048 V reference.
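To put the 0.05% target into counts: on the 2.2000 V base range of a 22,000-count meter, one count is 0.1 mV. A quick bit of arithmetic (all numbers from the text above) shows how tight the window really is:

```python
# Allowed deviation at the 0.05% target for each reference used here.
references = [2.048, 4.096, 10.000]  # volts
target = 0.0005                      # 0.05 %

for v in references:
    allowance_mv = v * target * 1000
    print(f"{v:.3f} V reference: +/- {allowance_mv:.3f} mV")

# On the 2.2000 V base range of a 22,000-count meter, one count is
# 2.2000 / 22000 = 0.1 mV, so the 0.05 % window around the 2.048 V
# reference (+/- 1.024 mV) is only about 10 counts wide.
```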


There’s an “auto calibration” mode in this meter, but it doesn’t seem to do what it claims. Despite fiddling with the buttons, I never managed to notice a single difference in the readout. What exactly is it calibrating, anyway?

From what I can see, the “CAL” display appears when it’s simply gathering the initial MIN, MAX and PEAK values after you press the pertinent button; it’s not performing a DCV self-calibration whatsoever.

Calibration Steps:

  1. Remove the 3 screws from the back, pop the battery cover out.
  2. Remove the back cover by lifting the back and wiggling it upwards.
  3. Locate the multi-turn potentiometer near the resistor array.
  4. Power up the meter and proceed to DCV mode. At this point you can choose to use your battery or a regulated power supply set to nominal voltage.
  5. Connect your precision voltage reference.
  6. Fiddle with the pot until you meet your voltage standard. Let it set for a few minutes, then re-adjust if needed.
  7. If you have other references (in my case 2.048 V, 4.096 V and 10.000 V), it would be convenient to use them as well; this will allow you to compare against the other ranges and calibrate so that you meet your standard across all of them.
  8. Make sure you switch ranges manually; the readout should be within the range’s maximum resolution.
  9. Put it back together, double check to see that the cover shielding makes contact with the spring. Otherwise the meter will be prone to EMI.
  10. Enjoy your DCV calibrated meter!

You should repeat the verification process at least once a month for all your meters – unless they’re certified, in which case don’t touch them! And yes, I lied about the steps being 3.

Notice: if one of your ranges is way off, something horrible has happened to your precision resistor array. You may be able to offset it by soldering a potentiometer across the right leads, but this is something I haven’t had the horror of playing with so far.

Some photos of the process:

1 screw, the battery cover pops out, two more screws and the lid is released.

Looks OK. I don’t know why some people complain about it; the construction quality is adequate for the price.

This is the DCV calibration potentiometer. A bit out of focus, too.

That’s a neat battery holder!

Good ’nuf. Note to self: remove the dust from in between the display next time.

That’s all for now, enjoy!

PS: If anyone wants the schematics for the reference box, I wouldn’t mind uploading them; just ask. But keep in mind the magic happens in the precision reference ICs; the rest of the circuit is just there to ensure proper stability and power-input protection. The former is rather important: you don’t want your reference to be oscillating, which was the case with the original design – ouch. A bit of extra capacitance on the outputs solved it, though.