The efficiency depends considerably on the rates of charge and discharge relative to the internal resistance of the cells. To charge a Ni-Fe up quickly, you might put in (e.g.) 1.7 volts, whereas you could charge it slowly with (e.g.) 1.4 volts. The extra 0.3 volts at the fast rate is wasted energy.
The same goes for discharge. With a small load the voltage might be (e.g.) 1.3 volts, but under a heavy load it might supply only (e.g.) 1.0 volts, again losing 0.3 volts.
So for heavy, fast-charged use: 1.0 V (out) / 1.7 V (in) = about 59% efficiency. On top of that, the amp-hours available at a high rate no doubt drop as well.
On the other hand, for (e.g.) a solar home with gradual charging and light loads: 1.3 V (out) / 1.4 V (in) = 93% efficiency.
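The two voltage-ratio estimates above can be sketched in a few lines of Python (a simple sketch of my reasoning, ignoring amp-hour losses, not a full battery model):

```python
def voltage_efficiency(v_out, v_in):
    """Energy efficiency estimated as the ratio of average discharge
    voltage to average charge voltage (amp-hour losses ignored)."""
    return v_out / v_in

# Heavy load, fast charge: 1.0 V out vs 1.7 V in
fast = voltage_efficiency(1.0, 1.7)

# Solar home, gentle charge and light loads: 1.3 V out vs 1.4 V in
gentle = voltage_efficiency(1.3, 1.4)

print(f"fast: {fast:.0%}, gentle: {gentle:.0%}")  # fast: 59%, gentle: 93%
```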
But let's examine a few more numbers: lead-acids probably fare as badly if not worse under high loads. Their amp-hour specs are given for a very gradual C/20 (20-hour) discharge rate, and, going by one typical "size 27" "deep cycle" battery spec, a "105 amp-hour" lead-acid will only last 38 minutes at 75 amps, about 45 amp-hours. That's well under 50% of the rated capacity without even considering the voltage drop or recharging it. (At 25 amps it's up to 67 amp-hours, and at 5 amps you get the whole 105 A-H.)
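Tabulating those spec-sheet numbers makes the capacity falloff at high currents plain (the amp-hour figures are the ones quoted above; the percentages are just arithmetic on them):

```python
rated_ah = 105  # the battery's 20-hour rating

# discharge current (A) -> delivered capacity (Ah), from the spec sheet
delivered = {5: 105, 25: 67, 75: 45}

for amps, ah in delivered.items():
    print(f"{amps:>2} A: {ah:>3} Ah ({ah / rated_ah:.0%} of rating)")
# -> at 75 A, only 43% of the rated amp-hours come out
```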
Of course, you shouldn't try to draw much more than 50-60% of the charge out of a lead-acid if you want it to keep on working. As I mentioned recently, adding sodium sulfate will make it considerably more forgiving of deeper discharges and much longer lasting. From a Ni-Fe or Ni-MH you can draw off 90% of the energy with no problems.
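Putting the depth-of-discharge limits in numbers (a quick sketch using the figures above; the 55% midpoint for lead-acid is my pick from the 50-60% range):

```python
def usable_ah(rated_ah, depth_of_discharge):
    """Amp-hours you can safely draw before stopping."""
    return rated_ah * depth_of_discharge

# For the same "105 Ah" rating:
print(usable_ah(105, 0.55))  # lead-acid at ~55% DoD -> 57.75 Ah
print(usable_ah(105, 0.90))  # Ni-Fe / Ni-MH at 90% DoD -> 94.5 Ah
```

So even before rate effects, the nickel chemistries let you actually use far more of what you paid for.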
I'm not sure the self-discharge of Ni-Fe is much different from that of Ni-Cd or some Ni-MHs. Lead-acids are certainly not without self-discharge either, but that's enough comparisons for me for today.
--Craig