Short version: proactive, slow charging beats reactive, fast charging.
“I have a laptop supporting X Watts (say X = 45), so I need an X-Watt portable charger” — or so the common assumption goes.
A reminder of how your device uses power:
- it ingests power from its USB input
- first priority is servicing its electronics: screen, CPU, storage, etc.
- any unused power is sent to the battery, if it’s not already full
- to avoid overheating the battery, it is typically charged at roughly 0.75 Watts per Watt-hour of capacity, so a 45Wh battery ingests at about 33W until it reaches 85% full
- from 85% to 95% full it ingests at half that rate, e.g. the 33W drops to 16W
- from 95% to full it halves again, e.g. 16W drops to 8W; then the battery stops drawing power altogether
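The taper rule above can be sketched in a few lines of Python (the 0.75 W-per-Wh ratio and the 85%/95% breakpoints are the rough rules of thumb from this list, not exact figures for any particular device):

```python
def charge_power_w(capacity_wh, soc_percent):
    """Approximate charge power a device battery accepts, per the rough
    taper rule: 0.75 W per Wh of capacity, halving at 85% and again at
    95% state of charge, zero when full. Rule of thumb, not a spec."""
    base = 0.75 * capacity_wh          # e.g. 45 Wh -> ~33 W
    if soc_percent < 85:
        return base
    elif soc_percent < 95:
        return base / 2                # ~16 W
    elif soc_percent < 100:
        return base / 4                # ~8 W
    else:
        return 0.0                     # full: battery stops drawing power

print(charge_power_w(45, 50))   # 33.75
print(charge_power_w(45, 90))   # 16.875
print(charge_power_w(45, 100))  # 0.0
```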
Recharging efficiency of the internal device battery is usually around 80%; energy is lost in the DC-DC step and the battery chemistry.
Discharging efficiency is usually around 95%.
So the energy coming out of your device’s battery is roughly three-quarters of what you put in at the USB port (0.80 × 0.95 ≈ 0.76).
So for the example laptop’s 45Wh battery, you need to have pushed in about 56Wh of energy (45 / 0.8) to fill it.
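The same round-trip arithmetic as a quick Python sketch (the 80% and 95% efficiency figures are the rough estimates above, not measured values):

```python
CHARGE_EFF = 0.80     # charging losses: DC-DC step + battery chemistry (assumed)
DISCHARGE_EFF = 0.95  # discharging losses (assumed)

battery_wh = 45
energy_in = battery_wh / CHARGE_EFF      # what the USB port must supply: ~56 Wh
usable_out = battery_wh * DISCHARGE_EFF  # what the electronics get back: ~43 Wh
round_trip = CHARGE_EFF * DISCHARGE_EFF  # ~0.76 of USB energy survives the cycle

print(f"push in {energy_in:.0f} Wh, get back {usable_out:.0f} Wh")
```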
A fully charged device needs far less power to keep it charged than to recharge it, since only the electronics, not the battery, draw power.
You can typically infer this from the device’s battery life in hours. Say a laptop has a 45Wh battery and 6 hours of actual battery life: it was consuming its stored energy at about 7.1W (45 / 6 × 0.95). So even though you may feel you need a 45W charger, you only actually need about 7W, and a 10W charger would suffice to keep such a charged device charged while in use.
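That inference is a one-liner; here it is as a small sketch, with 0.95 being the assumed discharge efficiency from above:

```python
def sustain_power_w(battery_wh, runtime_h, discharge_eff=0.95):
    """Average power the electronics actually draw, inferred from the
    rated battery capacity and the runtime you observe in practice."""
    return battery_wh / runtime_h * discharge_eff

# The 45 Wh / 6 h laptop from the text:
print(round(sustain_power_w(45, 6), 1))  # 7.1 -> a 10 W charger keeps it topped up
```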
The limiting factor at the low end is the minimum voltage your device accepts. Don’t look at Watts: Voltage determines whether it works at all, then current determines how well it works. Some laptops need 15V, others 20V. Mine is fine on 5V.
Portable chargers, e.g. the Anker Powercore range, generally go up in cost with capacity (26800mAh costs more than 20000mAh, which costs more than 10000mAh) and with performance (60W costs more than 45W… 30W… 20W… 10W).
Once you know your device’s minimum voltage need, check the maximum voltage each Powercore supports, to ensure they overlap; otherwise the Powercore won’t work for your device.
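That compatibility check is just a range-overlap test; a hypothetical sketch (the voltage lists below are made-up examples, not real product specs):

```python
def can_power(device_min_v, charger_output_voltages):
    """True if the charger can supply a voltage at or above the device's
    minimum. (USB-PD chargers offer a fixed set of levels, commonly
    5/9/15/20 V; the device picks one it accepts.)"""
    return any(v >= device_min_v for v in charger_output_voltages)

# Hypothetical laptop needing at least 15 V:
print(can_power(15, [5, 9, 12]))      # False: tops out at 12 V, won't work
print(can_power(15, [5, 9, 15, 20]))  # True: 15 V (and 20 V) qualify
```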
So let’s draw this together in two scenarios.
Expensive and wasteful scenario:
Say you own a laptop with a 60W maximum need (20V 3A), but it will actually charge from 30W; you find it accepts 15V; and it has a 45Wh internal battery.
You buy a 60W-output Anker Powercore 26800, which holds roughly 100Wh of energy.
You use the laptop until the battery is nearly empty, then plug in the Powercore, recharge the laptop until full, disconnect the Powercore, use the laptop until empty again, then plug the Powercore back in.
As recharging and then discharging the laptop battery loses energy at each step, only about 70% of each Wh survives end to end, so you got roughly 1.5 laptop recharges (100 / 45 × 0.7 ≈ 1.5).
You then, being the dumb user in this scenario, come here saying “my 100Wh Anker Powercore gave me 1.5 laptop recharges when it should have given more than 2. What rubbish.” And of course such a user would be wrong.
Less expensive and more efficient scenario:
You have researched your laptop’s minimum voltage need, found that 15V is enough, and bought a 30W Powercore for less than the 60W version.
You begin with the laptop charged or very nearly full.
You plug in your Powercore and keep it plugged in. As your laptop only needs about 7W on average, because it doesn’t have to recharge its internal battery, your 30W Powercore easily keeps up and the battery level never drops.
As your energy never went via the laptop’s internal battery, only the ~95% conversion loss applied: the 100Wh delivered about 95Wh to the electronics, and one battery’s worth of runtime is 45 × 0.95 ≈ 43Wh, so you got the equivalent of about 2.2 laptop battery recharges (95 / 42.75 ≈ 2.2).
So this intelligent user bought a lower-cost Powercore and got the equivalent of 0.7 more laptop recharges out of it than the dumb user.
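Putting the two scenarios side by side in a quick Python sketch (100Wh pack, 45Wh laptop battery; the 0.7 end-to-end figure and the 0.95 conversion efficiency are the rough estimates used above):

```python
POWERCORE_WH = 100      # ~26800 mAh pack, as in both scenarios
LAPTOP_BATT_WH = 45

# Reactive: Powercore -> laptop battery -> electronics. Roughly 70% of
# the pack's energy survives the conversion plus charge/discharge cycle.
reactive = POWERCORE_WH * 0.7 / LAPTOP_BATT_WH

# Proactive: Powercore -> electronics directly. Only the ~95% conversion
# loss applies, and one "recharge equivalent" of runtime costs
# 45 Wh * 0.95 of delivered energy.
proactive = (POWERCORE_WH * 0.95) / (LAPTOP_BATT_WH * 0.95)

print(round(reactive, 1), round(proactive, 1))  # 1.6 2.2 (the text rounds 1.56 down to 1.5)
```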
And since a portable charger is, by definition, always with you, there are very few excuses not to plug it in at the start, rather than keeping it in your bag to use later.