I think the point of going "full tilt" is to shave off a fraction of a millisecond of response time ... which only happens if the key starts being debounced at just the right moment within the USB port's one-millisecond cycle.
But some firmwares I have seen stop scanning the other keys while debouncing just one key, which of course delays events from any other keys that happen to be pressed or released at the same time.
My own (unreleased) firmware for the AVR keeps the microcontroller in sleep mode by default, with the main loop woken by an interrupt every millisecond. That works fine, of course. I have not actually measured it, but I believe it uses less power as well. All keys are also debounced on every pass, using a bitfield representation of keyswitch state.
If there were an interrupt triggered when the host actually polled for the report, which set a timer that would wake up the main loop so that it finished right before the host polled again... then that code could shave off a fraction of a millisecond like a "full tilt" firmware while still saving power and debouncing all keys at once.
But you can't be sure that the host will poll at the same point within every cycle. If the poll time drifts, or if more than one event is triggered at once, the code would no longer finish within bounds and the penalty could be one full extra millisecond.
And frankly, I see no reason to hunt for fractions of milliseconds....
BTW: right now my code uses a timer, but the AVR also has a USB interrupt that triggers every USB millisecond cycle while the USB port is active.
I have been thinking of rewriting the code to use a much slower timer instead (set more suitably for waking up from standby mode), and to use the USB cycle interrupt to wake the normal main loop and reset that timer. Then, once USB standby mode has been entered, the USB cycle interrupt would simply stop triggering, letting the slow timer run its course.