I'm using a Kill A Watt-type device to measure the wattage drawn by some LED fixtures, and I'm getting some results I don't quite understand. Please shed some light on this.
My setup is as follows:

Outlet (approx. 217 V) ---> Meter ---> Power strip (3 outlets) ---> LED light #1
                                                                \-> LED light #2
LED light #1: 50 watts consumed @ 0.395 A
LED light #2: 51 watts consumed @ 0.278 A
#1 and #2 together: 43 watts consumed @ 0.590 A

??
Basically, my meter is telling me I'm consuming less electricity with two lights than with a single light? I'm pretty sure this has to do with parallel circuits and Ohm's law, or maybe power factor, but I would like to understand it fully.
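To sanity-check the power factor angle, here is the arithmetic I'm working from (a quick Python sketch; the 217 V figure and the watt/amp readings are my measurements above, and power factor = real watts / apparent VA):

```python
# Apparent power (VA) = volts * amps; power factor = real power (W) / apparent power (VA).
V = 217.0  # measured outlet voltage

# name -> (watts reported by meter, amps reported by meter)
readings = {
    "LED #1": (50, 0.395),
    "LED #2": (51, 0.278),
    "both":   (43, 0.590),
}

for name, (watts, amps) in readings.items():
    va = V * amps          # apparent power in volt-amps
    pf = watts / va        # dimensionless power factor
    print(f"{name}: {va:.1f} VA, power factor {pf:.2f}")
```

If I've done this right, the two lights together show far more apparent power (VA) but a much lower power factor than either light alone, which may be where the "impossible" watt reading comes from.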
I checked around the house with similar two-light fixture setups and I see the same kind of results, even with fluorescent lights.
Can somebody explain what I'm seeing?