>> if this calculator is correct
There's also one on the main site, but I rounded a bit more.
Output Power 
a lower dBm on broadband means the line has less noise, and so fewer milliwatts are used in TX as there's less background line noise?
No, not necessarily

The overriding factor when it comes to power is PSD masks & power cutback. These control the amount of power that is given to a line, and shape the maximum number of bits that can be loaded at any particular frequency.
IIRC BT use 5 different power profiles for ADSL/ADSL2+ (I haven't ever seen anything specific about VDSL); these masks are profiled depending entirely upon your line length.
I'm assuming the profiles will be something like: very short, short, medium, long and very long... though I've no further info on how they categorise line lengths. The profiles and PSD masks control the maximum amount of power given to each individual tone. The masks principally shape power across frequencies to prevent crosstalk within the sub-channels and between types of DSL that may use tones differently, i.e. Annex A/M. The idea behind the profiles is that short lines are given less power than longer lines, so that short lines' signals don't drown out longer lines.
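To make the mask idea a bit more concrete, here's a minimal Python sketch of how a PSD mask level (in dBm/Hz) turns into a per-tone power cap, given the standard 4.3125kHz tone spacing - the mask values and band labels below are made-up illustrations, not BT's real profiles:

```python
import math

TONE_SPACING_HZ = 4312.5  # standard DMT tone spacing for ADSL/VDSL

def tone_power_cap_dbm(mask_dbm_per_hz):
    """Convert a PSD mask level (dBm/Hz) into the max power for one tone (dBm)."""
    return mask_dbm_per_hz + 10 * math.log10(TONE_SPACING_HZ)

# Hypothetical mask levels for two bands on a "short" profile line --
# illustrative numbers only, not real BT figures.
mask = {"U1": -80.0, "U2": -50.0}   # dBm/Hz

for band, psd in mask.items():
    print(band, round(tone_power_cap_dbm(psd), 1), "dBm max per tone")
# A heavily cut-back band (U1 here) ends up with far less power per tone,
# so far fewer bits can be loaded on those tones.
```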
For ADSL2+ you will find that the use of masks ensures that no one's power goes much above 20dBm. The starting point is 18dBm, but there's also a little bit held in reserve for bitswap or for when the line is struggling. This means that all lines (except the very shortest) should always see downstream power in the region of 18-20dBm. Under-performing lines that are capable of much higher speeds but aren't reaching their full potential will also show reduced power.
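Just to put those figures into linear terms (a quick sketch of the dBm-to-milliwatt conversion; the "reserve" interpretation is mine):

```python
def dbm_to_mw(dbm):
    """dBm -> milliwatts."""
    return 10 ** (dbm / 10)

print(round(dbm_to_mw(18), 1))  # ~63.1 mW at the 18dBm starting point
print(round(dbm_to_mw(20), 1))  # ~100.0 mW at the ~20dBm ceiling
# So the ~2dB held in reserve is roughly an extra 37mW of headroom
# that can be released for bitswap or a struggling line.
```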
With FTTC I don't know the profiles, but each of the sub-channels (U0, U1, U2, U3, D1, etc.) will have its own profile based on line length. The reason my U1 power is -28dBm is that my line must be classed (guessing) as short, so it's being throttled back to make sure U1 doesn't get too much power, which means better signal quality for the longer lines.
Every single tone in every sub-channel will have its own power shaping going on, depending upon which profile BT class your line under. Although our routers may show a total power for upstream & downstream, those figures are just the aggregate of all the tones in use.
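As a rough illustration of that aggregation, here's a sketch that combines the downstream band powers from my table further down - the assumption being that the total is simply the linear (milliwatt) sum of the per-band figures:

```python
import math

def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    return 10 * math.log10(mw)

# Downstream band powers from the table below (dBm)
bands = {"D1": 11.8, "D2": 7.4, "D3": 7.4}

total_mw = sum(dbm_to_mw(p) for p in bands.values())
print(round(total_mw, 1), "mW total")        # ~26.1 mW
print(round(mw_to_dbm(total_mw), 1), "dBm")  # ~14.2 dBm aggregate downstream
# Note you can't just add the dBm values (11.8 + 7.4 + 7.4) -- dBm is a log
# scale, so you always convert to milliwatts before summing.
```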
I'm trying to think of an easy way to explain it; the best I can come up with is to use my own line as an example.
These are my power profiles
VDSL Band Status     U0      U1     U2    U3    U4    D1     D2     D3
TX Power (dBm):    -6.6   -28.0    4.0   N/A   N/A  11.8    7.4    7.4
Note how my U1 band has very little power. Now look at the effect this has on bit loading (cap below): hardly any bits are loaded in this sub-channel. However, my line is good; the effect is that it forces my line to load more bits into the U2 sub-channel, leaving U1 cleaner (less crosstalk) for longer lines that can use those frequencies.
PSD masks and profiles are an important factor in ADSL/VDSL. They help reduce crosstalk and give longer lines less chance of being drowned out by short lines - which is exactly what would happen if BT didn't control the channels using masks and all lines were given the same amount of power.
----
ETA
I've also uploaded SNR-per-tone, QLN & Hlog graphs to show that we have to be very careful about assuming that fewer dBm means less background noise.
If you look at the SNR per tone & QLN, you could assume that those tones were noisier... but this isn't the case. What has happened is that power cutback has been applied, which means my signal over those tones isn't as strong. In turn this causes less bit loading.
If someone didn't know what was going on, they could assume just by looking at the SNR & QLN that I had a problem in U1... when in fact I don't. My Hlog confirms that the line is actually very good, and it's power cutback that's causing the lower SNR and bit loading.
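To show why a lower SNR here doesn't mean more noise, here's a small sketch using the usual per-tone relationship SNR ≈ TxPSD + Hlog - QLN (all in dB) together with the SNR-gap bit-loading approximation - the PSD, Hlog and QLN numbers are hypothetical, not read off my graphs:

```python
import math

def snr_db(tx_psd, hlog, qln):
    """Per-tone SNR in dB: received signal PSD (TxPSD + Hlog) minus noise PSD (QLN)."""
    return (tx_psd + hlog) - qln

def bits_per_tone(snr, gap_db=9.75, max_bits=15):
    """Rough bit-loading estimate via the SNR-gap approximation, capped at 15 bits."""
    return min(max_bits, max(0, math.floor(math.log2(1 + 10 ** ((snr - gap_db) / 10)))))

hlog = -20.0   # hypothetical: a good tone with only 20dB of attenuation
qln  = -120.0  # hypothetical quiet-line noise floor (dBm/Hz)

full_power = snr_db(-40.0, hlog, qln)  # no cutback applied to this tone
cut_back   = snr_db(-60.0, hlog, qln)  # 20dB of power cutback applied

print(full_power, "dB ->", bits_per_tone(full_power), "bits")  # 60.0 dB -> 15 bits
print(cut_back,   "dB ->", bits_per_tone(cut_back),   "bits")  # 40.0 dB -> 10 bits
# Same Hlog, same QLN (so the same noise) -- only the transmit PSD changed,
# yet both the SNR per tone and the bit loading drop.
```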