I'm not sure about broadband (not my area of expertise), but I'd guess it could be anything from reliability to management. On any network, when you truly saturate a link you start to get weird behaviour: service degrades, time-critical things stop working, and ping times go through the roof, to name minor examples. At a guess, a small amount of bandwidth is reserved to keep the PPP session alive, guaranteeing those keepalive packets always get through. In schools it's normal to reserve 1Mb/s on core links for management; that way we're guaranteed remote access even if users are hammering the connection, a backup is running, or, worst case, someone has plugged in an unmanaged switch, created a network loop, and triggered a broadcast storm the managed switches can't shut down.
Overheads and hardware limitations also play a part. On a home network with a 1Gb/s link you'll never see a transfer at the theoretical 125MB/s; it's always around the 900Mb/s mark, because protocol overheads consume the rest of the bandwidth even though they're not measured. And that's the best case of transferring one large file. Transfer many small ones and it only gets worse.
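To give a feel for where that missing bandwidth goes, here's a rough back-of-the-envelope sketch in Python. It assumes a standard 1500-byte MTU, TCP without options, and the usual Ethernet framing costs; real transfers also lose throughput to ACKs, retransmissions, and hardware limits, so treat the figure as an upper bound, not a prediction.

```python
# Rough goodput estimate for a single TCP transfer over gigabit Ethernet.
# Assumptions: 1500-byte MTU, no VLAN tag, TCP header without options.

LINK_RATE_BPS = 1_000_000_000  # 1 Gb/s line rate

MTU = 1500           # bytes of IP packet per Ethernet frame
ETH_OVERHEAD = 38    # preamble(8) + MAC header(14) + FCS(4) + inter-frame gap(12)
IP_HEADER = 20
TCP_HEADER = 20      # 32 if TCP timestamps are enabled

payload = MTU - IP_HEADER - TCP_HEADER   # application bytes carried per frame
wire_bytes = MTU + ETH_OVERHEAD          # bytes actually occupying the wire

efficiency = payload / wire_bytes
goodput_bps = LINK_RATE_BPS * efficiency

print(f"Efficiency: {efficiency:.1%}")   # ~94.9%
print(f"Goodput: {goodput_bps / 1e6:.0f} Mb/s "
      f"({goodput_bps / 8 / 1e6:.0f} MB/s)")
```

Even in this idealised case roughly 5% of the line rate is eaten by headers and framing before a single application byte is counted, which is why measured transfers sit well below the advertised link speed.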