From what is offered, is taken.
If 2A is offered but only 1.6A is drawn, then you could have offered just 1.6A: when the device is not taking all that is offered, you can offer it less with no downside.
Below I’m talking about steady DC power only, not changing signals, where impedance matters (a changing current creates a changing magnetic field, which induces a voltage opposing the change). For data (a changing signal) it is a much more involved problem. I’m just talking power below.
If, say, the phone had the electronics to accept 9V 2A = 18W, and it drew all that was offered (a USB meter would read 18W), then a less electrically efficient cable would make the phone charge slower. But if the phone drew only 15W, you could afford cable losses of up to 3W with no downside.
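To make that headroom idea concrete, here is a minimal sketch using the 18W / 15W numbers from the example above (the function name is mine, just for illustration):

```python
def cable_loss_budget(offered_w, device_draw_w):
    """How many watts of cable loss can be tolerated before
    charging slows down (zero or negative: any loss slows it)."""
    return offered_w - device_draw_w

# Charger offers 9V x 2A = 18W, phone only draws 15W:
print(cable_loss_budget(18, 15))  # 3W of headroom for the cable to waste
```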
The way you measure cable efficiency: put a USB meter at the end near the charger and another at the far end, and measure the voltage drop. That drop is energy put in by the charger but wasted as heat within the cable, so less energy comes out of the cable.
Ohm’s Law:
So suppose you had a USB meter and you measured 5V 1.8A going into a cable and 4.8V 1.8A coming out: a 0.2V drop. The current cannot drop (if it did you’d have a fault, which is a bigger problem). Then R = V / I = 0.2 / 1.8 ≈ 0.11 Ohms.
That means the power lost in the cable as heat is V × I = 0.2 × 1.8 = 0.36W (equivalently R × I²). Over, say, 8 hours, that is about 2.9Wh. If your Powercore holds 72Wh, you lost 4% of its energy in the cable.
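The whole calculation can be sketched in a few lines of Python, plugging in the two meter readings from the example (the function name and argument order are my own):

```python
def cable_stats(v_in, v_out, current_a, hours, bank_wh):
    """Resistance, heat loss and wasted energy for a cable,
    from two USB-meter readings (steady DC, same current at both ends)."""
    v_drop = v_in - v_out
    resistance = v_drop / current_a   # Ohm's law: R = V / I
    loss_w = v_drop * current_a       # power wasted as heat in the cable
    lost_wh = loss_w * hours          # energy wasted over the session
    fraction = lost_wh / bank_wh      # share of the power bank's capacity
    return resistance, loss_w, lost_wh, fraction

r, p, wh, frac = cable_stats(5.0, 4.8, 1.8, 8, 72)
# r ~ 0.11 Ohm, p ~ 0.36 W, wh ~ 2.9 Wh, frac ~ 4%
```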
If the power offered by the charger exceeds what the device is drawing, that is a sign the cable losses are not slowing things down.
There is a difference in importance for wall chargers vs portable chargers. A wall charger wastes only a trivial cost of mains electricity, so energy lost in the cable at most makes the device charge slower (if it draws all that is offered). If a portable charger loses energy in the cable, you are carrying a heavier and more expensive portable charger than necessary. So bad cables, with more losses, have greater consequences for portable chargers. But there is a way out.
Longer cables lose more energy, but in the case of a portable charger there is little need to have longer than 1ft-3ft as there is no wall from which the cable has to stretch.
In the era of Power Delivery, where you have a range of voltages offered ( W = V × I, Power = Volts × current in Amps ), cable losses become less important. Take the above example of a 0.1 Ohm cable. Suppose the charger and device could choose between 5V and 15V and needed 10W. 10W at 5V means 2A; 10W at 15V means 0.67A. The power loss in the cable at 5V is therefore I²R = 2 × 2 × 0.1 = 0.4W, but at 15V it is 0.67 × 0.67 × 0.1 ≈ 0.045W: the cable loss is (the ratio of voltages squared, 3²) 9 times lower. So for any given wattage, the cable choice becomes less salient at higher voltages.
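The voltage comparison is the same I²R sum, worth seeing side by side (again a hypothetical helper, using the 0.1 Ohm cable from the example):

```python
def cable_loss_w(power_w, volts, cable_ohms):
    """Heat dissipated in the cable (I^2 * R) when delivering
    power_w to the device at the negotiated voltage."""
    current = power_w / volts        # I = W / V
    return current ** 2 * cable_ohms

low  = cable_loss_w(10, 5, 0.1)     # 0.4 W lost at 5 V
high = cable_loss_w(10, 15, 0.1)    # ~0.044 W lost at 15 V
print(low / high)                   # ~9, the (15/5)^2 ratio
```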
So there are people out there making these mistakes:
- under-delivery. They have a charger-device combination which is negatively impacted by cable losses, they are using a cheaper or longer cable and so slowing device charging.
- over-delivery. They have a charger-and-cable combination capable of more than the device can consume. So they have over-spent (wasted money) on a too-fast charger and/or a too-expensive cable.
As a rough rule, a bigger battery can ingest energy at a higher rate. So phones with small batteries tend not to benefit from higher wattage, while laptops and bigger tablets tend to benefit. Whether cable losses matter also depends on the device’s charging electronics relative to its battery. It is hard to predict, but a small higher-wattage device (a phone) will probably tolerate a worse cable, such as a longer one, better than a larger device will.
So what I suggest is:
- stop looking at Watts alone; think Volts, Amps and Watt-hours. Voltage drop in a cable (the measured difference between each end) is energy lost: Volts × Amps = Watts, energy per second, and Watt-hours = power × time = energy. Energy wasted out of the wall matters less than energy wasted out of a small portable charger.
- don’t look at marketing.
- buy a USB meter.
- buy the cheapest cable, measure it, if it’s a bad cable then use it where it matters least (lower power devices which typically are smaller so say flashlights, headphones), if a good cable use it where it matters most (generally higher power, larger devices).
- when travelling and using portable chargers, err on the side of shorter cables and ones you test frequently. Cable losses increase as a cable ages, and you use an ageing cable in different contexts over its life.
- use the right cable, and charger for the device and context.
Does that all make sense? The wisdom here is largely 100 to 200 years old and uses the level of maths a child can do.