I do not believe the difference between the maximum attainable net data rate and the actual net data rate with interleaving+FEC should be considered a miscalculation, or a sign that the modem is failing to account for something. It's the modem answering a hypothetical question: "If the interleaving+FEC parameters could be adjusted for maximum bandwidth, what would the bandwidth be?"
The difference comes from the basic ATTNDR calculation method, which leaves several parameters unspecified.
ATTNDR is then calculated by adjusting the FEC ratio for the maximum coding gain with the minimum overhead. It has to stay within the maximum interleaving delay, but it does not have to meet the minimum INP requirement.
For the actual net data rate, by contrast, the FEC and interleaving parameters must be set to provide at least the minimum INP level specified in the line profile.
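To make the gap concrete, here is a toy sketch, not the G.997.1 formula, of picking the best FEC/interleaving setting with and without the minimum-INP constraint. All the numbers (rates, coding gains, the two candidate settings) are made-up assumptions for illustration only.

```python
# Illustrative sketch: why an ATTNDR-style estimate, which ignores the
# minimum INP requirement, comes out >= the actual net data rate.
# All figures below are hypothetical, not from any real modem.
from dataclasses import dataclass

@dataclass
class FecSetting:
    name: str
    overhead: float        # fraction of gross rate spent on RS check bytes
    coding_gain_db: float  # assumed SNR gain from this FEC configuration
    delay_ms: float        # interleaving delay this setting produces
    inp_symbols: float     # impulse noise protection this setting provides

GROSS_KBPS = 9000    # hypothetical gross rate with no coding gain
KBPS_PER_DB = 180    # rough assumed "extra rate per dB of coding gain"
MAX_DELAY_MS = 8.0   # maximum delay from the line profile
MIN_INP = 2.0        # minimum INP from the line profile

settings = [
    FecSetting("light FEC, shallow interleave", 0.03, 3.0, 1.0, 0.5),
    FecSetting("heavy FEC, deep interleave",    0.12, 4.5, 7.5, 2.0),
]

def net_rate(s: FecSetting) -> float:
    gross = GROSS_KBPS + s.coding_gain_db * KBPS_PER_DB
    return gross * (1 - s.overhead)

# ATTNDR-style: best net rate within the delay bound, INP ignored.
attndr = max(net_rate(s) for s in settings if s.delay_ms <= MAX_DELAY_MS)

# Actual: must also meet the minimum INP from the line profile.
actual = max(net_rate(s) for s in settings
             if s.delay_ms <= MAX_DELAY_MS and s.inp_symbols >= MIN_INP)

print(f"ATTNDR-style estimate: {attndr:.0f} kbit/s")
print(f"Actual net data rate:  {actual:.0f} kbit/s")
```

With these made-up numbers the light-FEC setting wins on throughput but fails the INP test, so the "actual" pick is the heavier setting and the ATTNDR-style figure comes out higher, exactly the gap being discussed.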
"So more interleave depth means more reliability and better throughput performance (though of course increased latency)."
It's not really the interleaving depth that is the important parameter; it's the delay and the INP. Consider tickmike's recent post: an interleaving depth of 416, much larger than normal for ADSL2, but with a fairly typical delay of 7.47 ms (probably the maximum delay specified in the line profile was 8 ms).
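A quick sketch of why depth alone doesn't pin down the latency. This uses the common approximation that one DMT symbol lasts 0.25 ms (4000 symbols/s) and the interleaver delay is roughly S × D / 4 ms, where S is DMT symbols per RS codeword (fractional S is allowed in ADSL2) and D is the depth; the exact G.992.3 formula is more involved.

```python
# Rough approximation only: interleaver delay in ms, assuming ~4000 DMT
# symbols per second, S symbols per RS codeword, and interleave depth D.
def interleave_delay_ms(S: float, D: int) -> float:
    return S * D / 4.0

# The same depth of 416 yields very different delays depending on S,
# which is why a large depth can still mean a modest, typical delay.
for S in (1 / 16, 1 / 8, 1):
    print(f"S={S:<7} D=416 -> ~{interleave_delay_ms(S, 416):.2f} ms")
```

With a small fractional S, a depth of 416 lands in the same few-milliseconds range as tickmike's 7.47 ms figure, while S = 1 would blow far past any sane delay bound.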
I suspect that the FEC and very shallow interleaving depths typically used with retransmission (G.INP) are there to optimize bandwidth via the coding gain, not really to provide additional noise protection.