How to use an ADC with a non-zero lower reference voltage input

adc, dc/dc converter, voltage measurement

I need to convert, linearly, a voltage in the range of about 1 V to 2 V into the range 0 to 1 V for input to an ADC.

This is the background: I fiddle with electronics and programming as a hobby to give me interesting and challenging projects — but on this one I am stumped.

I have built a power supply to give me approximately 0 to 1 A, and I want to add an ammeter to it. I will use a Sparkfun ACS712 Low Current Sensor Breakout board, which I can adjust to give a voltage output range of 1.000 to 2.024 V over the input range of 0 to 1024 mA, so that a 10-bit ADC with these as its lower and upper reference voltages will give me a nice 1 mA per count (1 mV per mA) output. I will be happy with 1% or so accuracy.
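(To spell out the arithmetic: the sensor sits at 1.000 V at 0 mA and rises 1 mV per mA, so at 1024 mA it reads 1.000 V + 1024 × 0.001 V = 2.024 V, i.e. exactly 1.024 V of span for the 1024 counts of a 10-bit ADC.)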

I intend, if I can, to use a Microchip PIC12F675 to do this conversion, but it allows adjustment of only the upper reference value. To display the current reading I will use an I2C link to a Sparkfun 7-Segment Serial Display COM-11442.

So my question is: please, how do I convert the 1.000 to 2.024 V range into 0.000 to 1.024 V to suit the ADC?

As an alternative, is there a different small PIC I could use with a 10-bit ADC that accepts a variable lower Vref, and also an I2C serial module? I have looked on Microchip's website, but the needle, if it is there, is in a bewilderingly impenetrable haystack of data.

I want to use a PIC because I can program them easily in Assembler, which is a process I view with somewhere between enjoyment and love, and I have the software and hardware to do it.

This is a specific question; please don't suggest other ways I can sample the current, as I can do that myself. Thank you.

Best Answer

You said you only care about 1% accuracy, which is less than 7 bits of resolution. You can therefore use the 1.000-2.024 V signal directly. Even with a 10 bit A/D and a 0-3.3 V full range, you still get about 320 counts over your span, which is more than 3 times your requirement. There is no need to shift or scale anything.
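Running the numbers from the question: the span is 2.024 V - 1.000 V = 1.024 V, and (1.024 / 3.3) × 1024 ≈ 318 counts of the 3.3 V full scale.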

If you use a divider to create Vref+ instead of using the 3.3 V supply internally, then you get even more resolution. If you can bring it down to 2.1 V, for example, to leave a little margin, then you get about 500 counts over your range. That's lots more resolution than accuracy unless you use a separate precision reference. Consider that a divider made from 1% resistors will cause significantly more error than a 10 bit A/D using that reference. To get 1% accuracy, a fixed external reference is probably the simplest way. A 2.048 V reference is almost perfect here.
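To show how clean that makes the conversion, here is a sketch in C rather than PIC assembler (the function name is mine; the voltages are the ones from the question):

```c
#include <stdint.h>

/* With Vref+ = 2.048 V, one 10-bit count is 2048 mV / 1024 = 2 mV,
 * so the 1.000-2.024 V sensor span covers (2024 - 1000) / 2 = 512 counts.
 */
int16_t adc_to_ma(uint16_t count)
{
    /* 2 mV per count, minus the sensor's 1.000 V zero-current offset */
    return (int16_t)(2 * count - 1000);
}
```

Note that nothing here needs a Vref- input: the 1.000 V offset is just a subtraction in firmware.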

Some PICs do have an optional Vref- input, but tying it to anything other than ground is going to decrease accuracy. Basically you'd be trading off accuracy to get more resolution, which makes no sense when you already have lots of resolution and accuracy is on the edge.

Your desire to get the raw A/D counts to represent some arbitrary "round" value is silly. Don't burden your measurement system with having to meet this arbitrary spec. Do the best job of taking the measurement, then the rest is simple conversion in firmware. You have a digital processor that can easily apply a scale and offset, instantaneously in human terms. The conversion to decimal will probably take more cycles, although that will look instantaneous too.
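For example, even in the plain 0-3.3 V case above, the whole scale-and-offset step is a few integer operations (again a sketch in C, not PIC assembler; the rounding and names are my assumptions):

```c
#include <stdint.h>

/* Raw 10-bit count against a 3.3 V full scale:
 *   millivolts = count * 3300 / 1024
 *   mA         = millivolts - 1000    (1 mV per mA, 1.000 V zero offset)
 */
int16_t adc_to_ma_3v3(uint16_t count)
{
    int32_t mv = ((int32_t)count * 3300L + 512L) >> 10;  /* counts -> mV, rounded */
    return (int16_t)(mv - 1000);                         /* remove the 1.000 V offset */
}
```

The same two operations work for any reference voltage; only the scale constant changes.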

Basically, think about what you really want to get out, prioritize your requirements accordingly, and don't specify implementation details (like what one A/D count should represent). Your top priority should be accuracy, given your specs, since everything else pretty much falls out with a 10 bit A/D.
