A little technical background first, then the answer to your question:
The semiconductor in the solar panels releases electrons in the presence of sunlight. The energy of those electrons (Voltage) is a function of the frequency of the light. Ultraviolet (UV) is the strongest light, and there is more of it at altitude, when the sun is higher in the sky (the light passes through less of the atmosphere, which absorbs UV), and when light reflects off surfaces like sand or snow.
The quantity of electrons (Current) is a function of the brightness of the light, so basically how thick the cloud cover is.
So, as an extreme example, at night the light reflected off the moon is enough that the voltage will be there, but the current won't.
So you get a variable voltage based on spectrum, and a variable current based on brightness.
Then you have the need to adhere to the USB specification, which is realistically a minimum of 4.2V to about 5V. The DC-DC converter in the solar panel compensates for a weaker spectrum (less UV) by stepping up the voltage, but stepping the voltage up steps the current down. So what you get is a fairly fixed 4.2-5V output and a varying current.
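As a rough illustration of that trade-off (the 90% converter efficiency and the example figures below are my own assumptions, not Anker's specs), the converter roughly conserves power minus losses, so you can estimate the current available at the USB side like this:

```python
# Rough sketch of the step-up trade-off: power is roughly conserved minus
# conversion losses, so raising the voltage lowers the available current.

def usb_output_current(panel_volts, panel_amps, usb_volts=5.0, efficiency=0.9):
    """Estimate USB-side current from panel-side voltage and current.

    `efficiency` is an assumed converter efficiency (~90%), not a measured figure.
    """
    panel_watts = panel_volts * panel_amps      # power coming off the panel
    usable_watts = panel_watts * efficiency     # minus conversion losses
    return usable_watts / usb_volts             # amps available at the USB voltage

# Hypothetical dull-sky example: panel producing 4.0 V at 1.5 A
print(round(usb_output_current(4.0, 1.5), 2), "A at 5 V")   # ~1.08 A
```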
If you have followed this so far, you can predict that a physically bigger panel produces more current to begin with, so even after the step-down the output current is higher.
The bigger you make the panel, the more often its output current will be above the minimum input current your device requires, so the more often the current is actually useful. This is non-linear: doubling the panel size makes the output usable disproportionately more often, as the sketch below illustrates.
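Here is a toy model of that threshold effect; the sun-level samples and the 0.5A device minimum are invented purely to illustrate the shape of the argument:

```python
# Toy model of the threshold effect: a device refuses to charge below some
# minimum input current, so doubling the panel can more than double the
# energy you actually capture. All numbers are invented for illustration.

conditions = [0.1, 0.15, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0]  # fraction of full sun, sampled over a day
device_minimum_amps = 0.5                                # assumed minimum current the device accepts

def captured(panel_peak_amps):
    """Sum the current actually delivered, ignoring samples below the device minimum."""
    return sum(sun * panel_peak_amps
               for sun in conditions
               if sun * panel_peak_amps >= device_minimum_amps)

small = captured(1.0)   # ~2.4 units captured
big = captured(2.0)     # ~6.2 units captured
print(f"Doubling the panel gives {big / small:.1f}x the captured energy")  # ~2.6x
```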
Get the biggest panel you can afford, or are willing to pack, so in the Anker range the 21W.
A Powercore is more tolerant of varying voltage and current than most devices; it will recharge its cells on a lower current than most devices will accept, so you waste less of the energy. However, putting the energy into the Powercore and then taking it out again to put into a device wastes energy, as the rough sums below show. So if you are in strong sunshine, it is more efficient to plug your device in directly until it is at least 70% charged. Once you get to 70%-85% charged your device will move into trickle charge and you're wasting energy, so once your devices are nearly fully charged, recharge the Powercore instead.
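To put rough numbers on that round trip (the 90% in / 90% out efficiencies and the 10Wh figure are assumptions for illustration, not measured values):

```python
# Rough round-trip comparison: panel -> device directly, vs panel -> Powercore -> device.
# The efficiency figures are assumptions for illustration, not measured values.
# Losses inside the device itself are ignored because they apply in both cases.

panel_energy_wh = 10.0        # energy available from the panel over some period (hypothetical)
charge_efficiency = 0.90      # assumed loss putting energy into the Powercore's cells
discharge_efficiency = 0.90   # assumed loss boosting it back to 5 V for the device

direct = panel_energy_wh                                                    # 10.0 Wh reaches the device
via_powercore = panel_energy_wh * charge_efficiency * discharge_efficiency  # ~8.1 Wh

print(f"Direct: {direct:.1f} Wh, via Powercore: {via_powercore:.1f} Wh")
```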
A good technique is to plug in your device and the Powercore together, so any unused energy the phone cannot take goes into the Powercore instead.
It is worth buying a USB meter to measure the current, because in really good conditions you get more power than one device can take and the rest is wasted. For that case, the Powercore 26800 and the Powercore II Elite have dual inputs, so they can take the full output in perfect sunny conditions.
To answer your question: if you get the 21W, you'd expect to see about 1A (5W) of output in suboptimal conditions, which is still useful energy in hazy cloud, and this goes up to about 2.1A (roughly 10-11W) in ideal conditions. Most phones cannot ingest power as fast as 11W, so if you buy the meter you can find the maximum ingest current of your phone, tablet, etc., learn how that varies with the brightness of the sun, and then plug in devices and the Powercore to make best use of the energy.
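For a feel of what those two extremes mean in charge time (the battery size and the 85% in-phone charging efficiency below are assumptions, and this ignores the slow trickle-charge tail):

```python
# Back-of-envelope charge-time estimate for the 21W panel's output range.
# Battery size and charging efficiency are assumptions for illustration.

phone_battery_wh = 11.1        # e.g. a ~3000 mAh, 3.7 V battery (hypothetical phone)
charging_efficiency = 0.85     # assumed losses inside the phone while charging

for volts, amps, label in [(5.0, 1.0, "hazy cloud"), (5.0, 2.1, "ideal sun")]:
    watts = volts * amps
    hours = phone_battery_wh / (watts * charging_efficiency)
    print(f"{label}: {watts:.1f} W in, roughly {hours:.1f} h for a full charge")
```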
In less-than-ideal conditions, a higher-end tablet with a higher minimum input current will simply stop charging, so consider an intermediate Powercore to capture the useful energy and present it to the device at a higher current.
In a camping situation I tend to take a lower-end phone and a smaller tablet, both of which accept a lower current, as these tolerate less-than-ideal conditions and are collectively smaller to carry.