" The sensitivity (grain) at the receiving end."
I take it that "grain" should be "gain"?!
It is a bit more complex than just gain: the modem generates quantisation noise in its ADC, on top of the noise produced in its front-end amplifier. There is also a minimum signal level needed to hold sync.
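To put a number on that quantisation noise: the usual rule of thumb for an ideal N-bit converter with a full-scale sine input is SNR ≈ 6.02N + 1.76 dB (real front ends do worse once amplifier noise is added). A minimal sketch:

```python
def ideal_adc_snr_db(bits):
    """Ideal quantisation-limited SNR (dB) of an N-bit ADC
    for a full-scale sine input: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

# Each extra bit buys roughly 6 dB of headroom before
# quantisation noise dominates the front-end noise floor.
print(ideal_adc_snr_db(12))  # 12-bit converter
print(ideal_adc_snr_db(16))  # 16-bit converter
```

In practice the signal rarely uses the converter's full scale, so the achievable SNR is lower by however many dB of headroom the gain setting leaves unused, which is exactly why the gain choice below matters.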
The gain is a compromise between avoiding overload on a strong signal and holding sync on a weak one. The AR7 chip, AFAIR, wasn't too good on weak signals; perhaps it needed an adaptive gain control to maximise its usable signal range, and some modem designers didn't bother? Also, my modem will hang on to sync down to 0 dB S/N margin and below. I doubt the error rate is usable there, but losing sync and taking 30 seconds to re-acquire it isn't a good response to a single noise spike, so it "works" below the "usable" S/N.
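The adaptive gain control idea above can be sketched as a simple AGC loop: measure the output level, then nudge the gain multiplicatively toward a target RMS, clamped so a strong signal can't drive the gain into overload territory. This is an illustrative toy, not any particular chip's algorithm; the target level, smoothing factor, and gain limits are all made-up values.

```python
import math

def update_gain(gain, samples, target_rms=0.5, alpha=0.1,
                gain_min=0.01, gain_max=100.0):
    """One AGC iteration (toy sketch, hypothetical parameters).

    Measures the RMS of the gained signal and nudges the gain
    toward the level that would hit target_rms; alpha sets how
    aggressively, and the clamp bounds the usable gain range.
    """
    rms = math.sqrt(sum((gain * s) ** 2 for s in samples) / len(samples))
    if rms > 0.0:
        # Multiplicative update: full correction would be (target/rms);
        # raising it to alpha smooths the response against noise spikes.
        gain *= (target_rms / rms) ** alpha
    return min(max(gain, gain_min), gain_max)

# A weak input (amplitude 0.01) gets pulled up toward the target level.
block = [0.01 * math.sin(2 * math.pi * k / 32) for k in range(32)]
g = 1.0
for _ in range(200):
    g = update_gain(g, block)
```

The gain ceiling is the interesting design choice here: set it too low and weak signals never reach the level needed to hold sync, which is one plausible reading of the weak-signal behaviour described above.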
So, in response to the original query: yes, the modem design can have a lot of influence, and it is complex.
For a deeper discussion see:
http://www.eetimes.com/electronics-news/4138114/Taking-the-Noise-Out-of-ADSL-Modem-Designs