There have been plenty of answers and posts on this already; some have touched on what
I believe to be the most important difference, others haven't.
Simply put, in the real world, people want to keep costs to a minimum for a given job.
With "mains" applicatons, that's 240V AC (in my part of the world, 110/120 in other parts). Under those conditions for a given CURRENT, the permissable voltage drop is several volts. Running a 240V motor drawing 10 amps, its not the end of the world if you only have 230V at the end of the run.
Most of what's done here is working at substantially lower voltages - often 12V.
If you tried to pull the same 10A at 12V and lost 10V to line resistance, the 2V you've got left isn't much damn good!
This, in a nutshell, is why "mains" ratings for cables are so different - the permissible drop (which is dissipated as heat in the cable) is much less of an issue there than in our field. While we might not be too concerned that the cable has got to 50 or 60 degrees C, the fact that we're "wasting" over 80% of our power IS a problem.
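For the numerically inclined, here's a little Python sketch of the two cases above. The 1 ohm of loop resistance is my own assumption, picked so it matches the 10V drop at 10A in the examples; the point is how differently the same cable behaves at 240V versus 12V.

```python
def run_losses(supply_v, current_a, line_r_ohms):
    """Return (voltage left at the load, fraction of power wasted in the cable)."""
    drop = current_a * line_r_ohms          # V = I * R, lost along the run
    v_at_load = supply_v - drop
    p_cable = current_a ** 2 * line_r_ohms  # dissipated as heat in the wire
    p_total = supply_v * current_a          # total power drawn from the source
    return v_at_load, p_cable / p_total

# Same cable (assumed ~1 ohm round trip), same 10 A, two supply voltages
for supply in (240.0, 12.0):
    v_load, wasted = run_losses(supply, current_a=10.0, line_r_ohms=1.0)
    print(f"{supply:5.0f} V supply -> {v_load:5.1f} V at the load, "
          f"{wasted:4.0%} of the power lost as heat in the cable")
```

Running it shows the 240V run losing about 4% in the cable (you still get 230V), while the 12V run wastes over 80% and leaves you with 2V - which is the whole argument in two lines of output.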
YMMV, and of course my opinion isn't worth a damn :)