Renewable Energy Innovation


Measuring voltage is a core requirement for the open charge controller project, as we use the battery voltage to monitor its state of charge.

Hence we need to be able to reliably and accurately measure voltage. I have written quite a bit before on voltage measurement.

Here are the details on measuring voltage specifically for the open charge regulator project.

Voltage specifications

The charge regulator is designed to work with 12V and 24V systems, so it must be able to measure the voltage of a fully charged 24V battery bank. This is in the region of 28V DC, hence we need to be able to measure at least 28V DC. Let's design for a range of 0-32V DC, so there is some headroom.

Reference voltage

The regulator needs to have accurate voltage monitoring, even with a varying input power supply.

The microcontroller used (an ATtiny85) can perform analogue to digital conversion using either an external reference voltage, the power supply voltage, an internal 1.1V reference or an internal 2.56V reference.

Using an internal reference makes the most sense, as it is buffered from a variable supply voltage and requires no additional components.
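
Selecting the internal 2.56V reference is just a register setting on the ATtiny85. Here is a minimal avr-gcc sketch of the ADC setup; the input channel (ADC2 on PB4) and the 8MHz clock are my assumptions, not fixed by the design:

    #include <avr/io.h>
    #include <stdint.h>

    // Set up the ADC. REFS2:REFS1:REFS0 = 110 selects the internal 2.56V
    // reference with no external bypass capacitor. MUX1 selects ADC2 (PB4),
    // which is an assumption - change it to suit the actual input pin.
    void adc_init(void)
    {
        ADMUX = (1 << REFS2) | (1 << REFS1) | (1 << MUX1);
        // Enable the ADC with a /64 prescaler: 8MHz/64 = 125kHz, inside
        // the 50-200kHz range needed for full 10-bit accuracy.
        ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);
    }

    // Take one conversion and return the 10-bit result (0-1023).
    uint16_t adc_read(void)
    {
        ADCSRA |= (1 << ADSC);          // start conversion
        while (ADCSRA & (1 << ADSC))    // wait until complete
            ;
        return ADC;
    }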

The main issue with the internal reference is part-to-part variation. The reference is relatively stable for any one device, but it varies between ICs due to manufacturing tolerances. The datasheet for the ATtiny25/45/85 states that the 1.1V reference can vary between 1.0V and 1.2V, and the 2.56V reference between 2.3V and 2.8V. This will cause issues unless the variation can be cancelled out.

Due to the variation in the reference voltage (each microcontroller will be different) we must perform some form of calibration on each unit. This is explained in more detail in my "accurate voltage measurement" post here.

Basically we apply a known, accurately measured voltage to the microcontroller and upload the calibration code, which stores a calibration value in the microcontroller's EEPROM. Once this is done, the main code can use that calibration factor to make voltage measurements that are corrected for the variation in the internal reference voltage.
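
As a sketch of how that might look (the EEPROM layout and the microvolts-per-count fixed-point scaling are my own illustrative choices, not necessarily those of the actual project code):

    #include <avr/eeprom.h>
    #include <stdint.h>

    // One EEPROM word holds the calibration factor in microvolts per
    // ADC count (roughly 32500 with the divider described below).
    uint16_t EEMEM cal_factor;

    // Calibration step: run once with a known, accurately measured
    // voltage on the input (e.g. 12.000V = 12000000uV, checked with a
    // good multimeter), then store the resulting scale factor.
    void calibrate(uint32_t known_uv)
    {
        uint16_t raw = adc_read();      // from the ADC sketch above
        if (raw > 0)
            eeprom_update_word(&cal_factor, (uint16_t)(known_uv / raw));
    }

    // Normal operation: convert a raw reading to millivolts using the
    // stored factor. Reference and resistor tolerances cancel out, as
    // both were present when the factor was measured.
    uint16_t read_millivolts(void)
    {
        uint16_t uv_per_count = eeprom_read_word(&cal_factor);
        return (uint16_t)(((uint32_t)adc_read() * uv_per_count) / 1000UL);
    }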

Another benefit of using a calibration factor is that inaccuracies in the actual resistor values used are also cancelled out (as long as they stay constant).

Circuit

We are using a non-isolated voltage divider to reduce the voltage, as shown below. This is simple and low cost. A 5V1 zener diode protects the microcontroller input from any spikes or over-voltage, and a 100nF capacitor smooths the signal to filter out any fast transients.

Voltage divider (ignore the values given for R1 and R2)

The maximum input voltage is 32V DC. This should correspond to the maximum voltage to the ADC on the microcontroller, which is 2.56V DC (as we are using the internal 2.56V reference).

So we know the input and output voltages are related by the equation:
Vout = Vin x R2/(R1 + R2)

For our specifications, and using a lower resistor (R2) of 10k ohms, we get:
2.56V = 32V x 10k/(R1 + 10k)

Rearranging gives R1 = R2 x (Vin/Vout - 1) = 10k x (32/2.56 - 1), so R1 = 115k ohm.

(Check here for the theory on voltage dividers.) 

With 10kΩ as the lower resistor (R2), R1 works out to 115kΩ. The nearest standard resistor value to 115k is 120kΩ, so let's use that value. Total resistance is then 130kΩ, so at 32V the current through the divider is 0.25mA, consuming 0.0079W (7.9mW). This is very low power consumption.

Resolution

We need to think about the resolution of this. The microcontroller has a 10-bit ADC, so the 2.56V reference range is converted into 2^10 = 1024 levels, and each level is equivalent to 2.56/1024 = 0.0025V (2.5mV) at the ADC. Referred back through the potential divider, this gives input voltage steps of 0.0025 x ((120+10)/10) = 0.0325V (32.5mV). This resolution is perfectly adequate for this application (approximately 0.1% of the 33.3V full-scale range).
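
As a cross-check of the arithmetic, the nominal (uncalibrated) conversion comes out as a neat integer expression, since each count is 32.5mV, i.e. 65/2 millivolts:

    // Nominal conversion for the 2.56V reference and 120k/10k divider:
    // millivolts = raw x 2.5mV x 13 = raw x 32.5mV = raw * 65 / 2.
    uint16_t raw_to_mv_nominal(uint16_t raw)
    {
        return (uint16_t)(((uint32_t)raw * 65U) / 2U);
    }

A full-scale reading of 1023 gives 33247mV, matching the 33.3V figure above; in practice the calibrated EEPROM factor replaces this nominal one.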
