Watt/Amp Hour, Watt/Amp Minute Confusion

power

I am somehow very confused about amp hours, watt hours, and minutes. I am trying to size a battery to see how much load I can get away with before it hits the 50% discharge limit.

I plan on connecting a modified sine wave inverter (3.6 W idle), a microcontroller (14 W max) that will be attached to the inverter, an actuator (motor1), and a rotisserie motor (motor2).

My problem is that motor1 has an inrush current of 4.6 A and motor2 has one of 10 A. Talking to the makers of both, it appears these inrushes are nearly instantaneous, lasting about 0.2 s in the worst case, and occur when the motors start and stop due to inertia.

Otherwise motor1 draws 3.2 A and motor2 draws 5.5 A (both run on 12 V). The motors will only be on when something needs to be turned, and each will be in motion for at most 2 minutes at a time.

This is a school experiment, so I'll estimate that I would run each motor 30 times in a day during the same time the controller and inverter are on. How would I go about calculating how much load I would be consuming?

Best Answer

I suggest that you ignore the start-up inrush: it lasts only about 0.2 s, so its contribution to the daily energy is negligible.

So, you have a total load of 3.2 + 5.5 = 8.7 A at 12 V for 2 minutes, 30 times per day.

That is 30 × 2 min = 1 hour at 8.7 A, i.e. 8.7 Ah per day for the motors.

The inverter draws 3.6 W, which at 12 V is 3.6 / 12 = 0.3 A. Over a day that is 0.3 A × 24 h = 7.2 Ah.

The microcontroller (14 W seems a lot, as the other person commented) draws 14/3.6 times as much power as the inverter, so it will use 14/3.6 × 7.2 = 28 Ah.

Total = 8.7 + 7.2 + 28 ≈ 44 Ah per day.

That controller is what will require a big battery; the motors need hardly any capacity at all...
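
If it helps to double-check the arithmetic, here is a minimal Python sketch of the same calculation. It assumes the inverter and controller run 24 h/day (as above) and simply doubles the daily total for the 50% discharge limit you mentioned; the variable names are only for illustration.

```python
# Rough battery-sizing sketch for the numbers above (assumed duty cycle,
# adjust to your actual usage). Inrush is ignored, as suggested.

V_BATT = 12.0  # system voltage, V

# Motors: 3.2 A + 5.5 A running current, 2 min per run, 30 runs per day
motor_current = 3.2 + 5.5                       # A
motor_hours_per_day = 30 * 2 / 60.0             # 30 runs x 2 min = 1 h
motor_ah = motor_current * motor_hours_per_day  # 8.7 Ah/day

# Inverter: 3.6 W idle, assumed on 24 h/day
inverter_ah = (3.6 / V_BATT) * 24               # 0.3 A x 24 h = 7.2 Ah/day

# Microcontroller: 14 W, assumed on 24 h/day
controller_ah = (14.0 / V_BATT) * 24            # ~1.17 A x 24 h = 28 Ah/day

total_ah = motor_ah + inverter_ah + controller_ah
print(f"Daily consumption: {total_ah:.1f} Ah")        # ~43.9 Ah/day

# To stay above 50% depth of discharge, the battery needs roughly
# twice the daily consumption in capacity.
print(f"Battery for 50% DoD: {2 * total_ah:.0f} Ah")  # ~88 Ah
```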