Wouldn't a low atmospheric pressure increase radio interference too? I'd imagine it won't take much loss at higher frequencies to shave off 40 Mbit downstream. Upstream to me looks to still be within normal variance from a resync.
Plus, if it was just a single resync, it could be pure fluke that it synced lower.
Day-to-day fluctuations are much more of an 'issue'. Atmospheric pressure has minimal influence relative to changes in noise generation nearby. If it made that much difference, waiting for a really foggy day to do a resync would be a good plan.
Taking 40 Mbit off means losing 3 dB of margin over ~40 MHz of spectrum, or 6 dB over ~20 MHz, and so on. That's not insignificant: it's the noise over that spectrum doubling or quadrupling in intensity, or the received signal power halving or being cut by three quarters. The loss costs the same regardless of where in the spectrum it sits: dropping from BPSK to nothing at the edge, or from 4096-QAM to 2048-QAM in the golden band, is still the same loss. I doubt the top end is carrying anything anyway; the spectrum is likely loaded far more heavily towards the lower end, with much of the top unusable. His entire sync could be accommodated under 50 MHz if the bit-loading algorithm were so inclined. It certainly won't be getting much from 87.5 MHz and up: FM radio.
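To make the arithmetic concrete, here's a back-of-envelope sketch. It assumes G.fast's 51.75 kHz subcarrier spacing and ~48k data symbols per second, plus the usual rule of thumb that each ~3 dB of SNR margin is worth ~1 bit per tone per symbol; the function name and exact constants are mine, not from any standard tool.

```python
# Rough sync-rate cost of a uniform margin loss over a slice of G.fast spectrum.
# Rule of thumb assumed: ~3 dB of SNR per bit of constellation loading.

TONE_SPACING_HZ = 51.75e3  # G.fast subcarrier spacing
SYMBOL_RATE = 48_000       # approx. G.fast data symbols per second

def rate_loss_mbps(bandwidth_hz: float, margin_loss_db: float) -> float:
    """Approximate rate lost when margin drops uniformly across a band."""
    tones = bandwidth_hz / TONE_SPACING_HZ
    bits_per_tone_lost = margin_loss_db / 3.0  # ~3 dB per bit
    return tones * bits_per_tone_lost * SYMBOL_RATE / 1e6

# 3 dB over 40 MHz and 6 dB over 20 MHz come out the same, ~37 Mbit/s,
# i.e. in the right ballpark for a 40 Mbit sync drop:
print(round(rate_loss_mbps(40e6, 3.0)))  # ~37
print(round(rate_loss_mbps(20e6, 6.0)))  # ~37
```

The point the numbers make: halving the band while doubling the dB loss is a wash, which is why it doesn't matter where in the spectrum the bits fall off.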
This kinda thing can be SRA steadily lowering the sync so it drops over time, or, given how obsessively the guy monitors stats, a one-off event. On upstream, a reminder that on G.fast there is no upstream band with lower attenuation: it's time-division, not frequency-division, multiplexing.