So Anker makes the 21W and 15W products. The bigger the solar panel the better, because in less-than-ideal conditions the voltage drops below 5V; if it drops too low, the device won't recharge, and each device has a different tolerance threshold for low voltage.
3A? Chuckle. Reviews measure about half the claim: the 15W panel around 7W, the 21W around 10W. So the "21W" delivers about 2A at 5V in perfect conditions.
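To turn those measured wattages into the current you can actually expect at the USB port, just divide by the 5V port voltage. A quick sketch using the review figures above:

```python
# Rough USB-port current from measured (not advertised) panel output.
# Wattage figures are the review measurements quoted above.
def usb_current_amps(measured_watts, usb_volts=5.0):
    """Current available at a 5V USB port for a given real-world wattage."""
    return measured_watts / usb_volts

print(usb_current_amps(7))   # "15W" panel measured at 7W: 1.4 A in full sun
print(usb_current_amps(10))  # "21W" panel measured at 10W: 2.0 A in full sun
```

That 2A figure is why the advertised "3A" only matters on the spec sheet.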
In general, I'd recommend you do NOT use an intermediate Powercore to store solar power. The primary reason is that if you use an intermediate device you lose roughly 30% of the energy in all the conversions.
With an intermediate battery you have:
- solar panel → 5V regulator → cable → 5V-to-3.7V charger → chemical energy store → 3.7V-to-5V boost → cable → 5V-to-4.2V charger → chemical energy.
Without an intermediate battery you have:
- solar panel → 5V regulator → cable → 5V-to-4.2V charger → chemical energy.
Note the fewer steps.
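You can put rough numbers on those two chains. The per-stage efficiencies below are assumptions for illustration, not measurements; real converters vary, but the shape of the result is the point:

```python
# Cumulative efficiency of each charging chain. Stage efficiencies are
# illustrative assumptions, not measured values for any specific hardware.
STAGE_EFF = {
    "5V regulator": 0.92,
    "cable": 0.97,
    "charge (5V->battery)": 0.85,  # buck conversion + charge losses
    "boost (battery->5V)": 0.85,   # boost conversion losses
}

def chain_efficiency(stages):
    """Multiply the per-stage efficiencies along a charging chain."""
    eff = 1.0
    for stage in stages:
        eff *= STAGE_EFF[stage]
    return eff

direct = chain_efficiency(
    ["5V regulator", "cable", "charge (5V->battery)"])
via_bank = chain_efficiency(
    ["5V regulator", "cable", "charge (5V->battery)",
     "boost (battery->5V)", "cable", "charge (5V->battery)"])

print(round(direct, 2))            # 0.76
print(round(via_bank, 2))          # 0.53
print(round(via_bank / direct, 2)) # 0.7 -- the bank hop costs ~30%
```

With these assumed figures the intermediate bank throws away roughly 30% of whatever the panel delivered, which matches the rule of thumb above.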
I know exactly what you're asking, but that information is scant. I own plenty of Powercores and a solar panel; I could discharge my Powercores, insert a meter between them and the solar panel, test in different conditions, and publish a review. That would take me a few days. The issue is my solar panel is not an Anker; I bought a Choetech 21W and put a meter on it to test it. Each solar panel behaves differently in the sun, so any conclusion I formed would likely be unique to my configuration.
There are no reviews of which Powercore is most tolerant, so my advice is theoretical. The 21W would output up to 3A (although reviews say 2A) across 2 ports in perfect conditions, so in perfect conditions you'd want something capable of ingesting 3A. You can do that with either two batteries at 2A input each, or one with at least a 3A input: say, two Powercore 10000, one Powercore 26800 using its dual input, or one Powercore II 20000 with its dual input. If you're off-grid, any single device failure hurts more, so of these choices I'd go with two Powercore 10000. In ideal conditions plug in both. In less-than-ideal conditions plug in only one, so they don't fight each other for voltage and leave neither getting anything useful.
So: first recharge your device until it's about 85% full. That avoids the losses of intermediate charging, and works even in less-than-ideal conditions.
Next, in perfect conditions, plug in both 10000s.
Next, in less-than-perfect conditions, plug in just one of the 10000s.
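For planning purposes, a rough estimate of how long one Powercore 10000 takes to refill from realistic panel output. The 2A input, 5V port voltage, and 85% charge efficiency are illustrative assumptions, not measurements of any specific Anker model:

```python
# Rough refill time for a power bank from a solar panel's real-world output.
# All defaults are illustrative assumptions, not measured figures.
def hours_to_charge(capacity_mah, battery_v=3.7, input_a=2.0,
                    input_v=5.0, eff=0.85):
    """Hours to fill a bank of capacity_mah given steady input power."""
    energy_wh = capacity_mah / 1000 * battery_v  # stored energy at nominal voltage
    input_w = input_a * input_v * eff            # usable power after charge losses
    return energy_wh / input_w

print(round(hours_to_charge(10000), 1))  # 4.4 hours in steady full sun
```

In practice sun is rarely steady, so treat this as a best case: with two banks sharing the panel, or with clouds, it stretches well beyond a single morning.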
Key to efficiency is avoiding heat. Place the solar panel where air circulates around it, in a light-colored area, with whatever you're recharging in the shade and as cool as possible. Use good-quality cables, long enough to get your device into shade, and use a meter to test each cable. Solar panels make use of UV (ultraviolet), so you get the most energy when the sun is at its zenith, less at sunrise/sunset, and less in heat. Your best time of day is dawn until noon; the worst is towards the end of the day.
Before you set out, meter everything and discover what it actually does, not what is claimed: you can have a bad cable, a bad solar panel, or a bad Powercore. Apple products hate solar, so tend to avoid anything Apple. Chances are you'll find a locally unique combination that is more efficient, and you can then take that combination out with you for off-grid reliance.
About 1 in 4 cables I tested were bad. Anker cables had the fewest problems.
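One way to judge a cable with a meter is its voltage drop under load: read the voltage at the source end and at the device end while current is flowing, and the difference tells you the cable's resistance. A sketch with hypothetical readings:

```python
# Judge a cable by its voltage drop under load. The readings below are
# hypothetical examples, not measurements of any particular cable.
def cable_resistance_ohms(v_source, v_device, current_a):
    """Round-trip cable resistance from voltages measured at each end."""
    return (v_source - v_device) / current_a

# Example: 5.10 V at the panel end, 4.62 V at the device end, drawing 1.5 A
r = cable_resistance_ohms(5.10, 4.62, 1.5)
print(round(r, 2))  # 0.32 ohms
```

A good short USB cable should come in well under that; a high-resistance cable wastes exactly the voltage margin you can't spare when the panel is already sagging below 5V.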
One of my portable power banks (non-Anker) reacted badly to solar. In less-than-ideal sunshine its charging circuitry woke up (it saw a voltage), but the bank then began discharging: the voltage was too low to push energy into the battery, while the awakened circuitry drained more power than came in, so it was a net negative. I didn't find that problem with, say, the Powercore 10000.
If you had to rely on less-than-scientific evidence, I'd say the Powercore 10000: it's the one I own and it appears tolerant of solar, but it's not necessarily the proven most-tolerant Powercore. To establish that I'd need days, if not weeks, of my own testing.