Electronic – How to drop 19VDC from a 0-60VDC source to get an accurate linear 0-48VDC signal

Tags: dc, sensor, signal, voltage

I've been using the Stack Exchange website for years, but never asked or answered a question, so first I'd like to say thank you for all of the help over the years. On to the problem:

I want to measure the output voltage of a pair of solar panels that can have a peak open circuit voltage of more than 48VDC. I want to use that with an LM3914N to indicate output between 19VDC and a maximum indicated 48VDC – but I want the circuit to be able to handle up to 60VDC from the solar panels. This means that the voltage divider will "see" from 0-29VDC for a solar panel output in the 19-48VDC range, and divide that by 5.8 to get the driving signal for the LM3914N.
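The intended subtract-and-divide mapping can be written out as a quick sanity check. This is a minimal sketch of the transfer function described above (the function name is illustrative, not part of any circuit):

```python
def panel_to_signal(v_panel: float) -> float:
    """Intended transfer function: subtract the 19 V offset,
    then divide the remainder by 5.8; below 19 V the signal
    sits at 0 V."""
    return max(v_panel - 19.0, 0.0) / 5.8

# A 48 V panel output maps to the 5 V full-scale signal:
# panel_to_signal(48.0) -> 5.0  (since (48 - 19) / 5.8 = 5.0)
```

Note that with this ratio a 60 V panel voltage would produce about 7.07 V, so the LM3914N input needs separate protection or headroom for the overvoltage range.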

Right now, I have a breadboard prototype that uses a 19V Zener diode (avalanche diode) in series with a voltage divider of ~75kΩ total resistance, which provides a 0VDC signal at ~19VDC panel output, rising to 5VDC at 48VDC panel output. The current through the zener and resistor divider is <600µA maximum, and the LM3914N signal input impedance is relatively high (~20kΩ), so power dissipation isn't a problem at these levels.

My question is: Is there a more precise, efficient, SIMPLE, and preferably, inexpensive method of dropping the panel output by (as close as is practically possible) exactly 19VDC under the fairly wide input range from 0VDC to 60VDC? Is there a way to use something like a TL431 in place of the zener to do that? (The TL431/TL431A from TI, OnSemi, and Fairchild are only rated for a maximum cathode voltage of +37V, so that would not meet the specs, but others that I'm unaware of might.) If so, a simple schematic showing how to use a TL431 or similar to do this would be a great help.

What I have now is not as accurate as I would like at the low end of the input range, because the zener is not as linear as I want it to be. It's consistent, but not very linear from ~18-23VDC panel output, improving significantly as the solar panel output voltage rises above 23VDC. I eventually plan to log this data, and I would like the data to be accurate and high resolution; ideally, ≤1mV. I can increase the current through the zener to improve the accuracy somewhat, but I also want to keep the parasitic power losses and associated heat production as low as practical.

I am asking for a simple, efficient, and inexpensive solution to this problem, but I'm interested in learning the most accurate practical ways to do this, so any additional information about this is also appreciated.

Best Answer

If you're going to want to log the data in the future, I assume you'll need to feed the signal into an ADC for conversion into digital form.

Because of that, you might as well go that way from the start and skip the LM3914N.

You can scale and shift the incoming voltage so that the range you are interested in, plus the overvoltage range, covers a range of (for instance) 0-5V: subtract 19V, then scale the remainder down.

How do you do that? This answer will probably help you: https://electronics.stackexchange.com/a/18265/4245 - subtract 19V from the incoming voltage using a suitable op-amp. Then use a voltage divider to take that remaining voltage range down to the range of the ADC (3.3V, 5V, whatever).
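As a numeric model of that subtract-then-divide chain: pick the divider ratio so the worst-case 60V input still fits the ADC's full scale. This sketch assumes an ideal subtraction stage and ignores op-amp offset and rail limits; the constant names are illustrative:

```python
V_OFFSET = 19.0   # volts removed by the op-amp subtraction stage
V_IN_MAX = 60.0   # worst-case panel voltage the circuit must survive
ADC_FS   = 5.0    # ADC full-scale voltage (use 3.3 for a 3.3 V ADC)

# Divider ratio chosen so the worst case just reaches full scale:
RATIO = ADC_FS / (V_IN_MAX - V_OFFSET)   # 5 / 41, about 0.122

def adc_input(v_panel: float) -> float:
    """Voltage presented to the ADC after subtract-and-scale."""
    return max(v_panel - V_OFFSET, 0.0) * RATIO

# adc_input(60.0) -> 5.0 (full scale); adc_input(48.0) is about 3.54 V,
# leaving the 48-60 V band as recognizable overvoltage headroom.
```

The design choice here is deliberate: by scaling 60V (not 48V) to full scale, the ADC never clips in the overvoltage region, at the cost of slightly less resolution over the 19-48V band of interest.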

You can then read the value from the ADC, log it, and display it on any LED display of your choice in any format you like, using any MCU that has the right IO options for you (the Arduino platform is a popular choice).

Once it's converted by the ADC you just have a set of numbers. It's then largely up to you how you divide those numbers up. Depending on your scaling resistors, ADC values (say, for a 10-bit ADC) from 0 to 900 could be 0% to 100% (19-48V) and displayed on an LED bargraph. The remaining 901-1023 would be "overvoltage" and could trigger a different LED to start blinking as a warning... The world is then really your oyster.