Hmm, on second thought, you could also just render an "overscan" region and have the IMU work with the display to shift the pixels into view while the processor keeps the overscan updated. When movement pauses you can re-render everything (overscan included) and start over again on the next movement.
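Roughly something like this, as a sketch only: render a buffer larger than the display, then on every refresh crop out the display-sized window whose offset matches the rotation the IMU has measured since the overscan was rendered. The margin size, pixels-per-radian scale, and all the names below are made up for illustration, not any particular headset's API.

```python
import numpy as np

DISPLAY_W, DISPLAY_H = 1280, 720
MARGIN = 128                    # extra pixels rendered on each side (assumed)
PIXELS_PER_RADIAN = 1100.0      # rough display scale factor (assumed)

def render_overscan(scene, head_pose):
    """Slow path: render the scene into a buffer larger than the display."""
    h = DISPLAY_H + 2 * MARGIN
    w = DISPLAY_W + 2 * MARGIN
    return np.zeros((h, w, 3), dtype=np.uint8)   # placeholder for a real render

def shift_into_view(overscan, pose_at_render, pose_now):
    """Fast path: pick the display-sized window whose offset matches the
    head rotation the IMU has reported since the overscan was rendered."""
    dyaw = pose_now["yaw"] - pose_at_render["yaw"]
    dpitch = pose_now["pitch"] - pose_at_render["pitch"]
    dx = int(np.clip(dyaw * PIXELS_PER_RADIAN, -MARGIN, MARGIN))
    dy = int(np.clip(dpitch * PIXELS_PER_RADIAN, -MARGIN, MARGIN))
    x0, y0 = MARGIN + dx, MARGIN + dy
    return overscan[y0:y0 + DISPLAY_H, x0:x0 + DISPLAY_W]
```

The renderer only needs to refresh the overscan when it can keep up; every vsync just re-crops with the newest IMU sample, which is where the latency win comes from.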
Eye tracking would allow even more latitude. The eye picks up very little detail outside the central area of focus, and the visual system is very good at inventing information to fill in any gaps: during saccades, when the eyeball shifts position, we are essentially blind, yet the brain fools us into seeing a continuous image.
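In code terms you could imagine something like the toy sketch below: spend detail only near the tracked gaze point, and treat fast gaze movement (a saccade) as a cheap moment to do expensive updates. The thresholds and names here are invented purely to illustrate the idea.

```python
import math

FOVEA_DEG = 5.0                  # full detail inside this angle (assumed)
PERIPHERY_DEG = 20.0             # reduced detail out to here (assumed)
SACCADE_VELOCITY_DEG_S = 180.0   # rough "eye is moving fast" threshold (assumed)

def detail_level(angle_from_gaze_deg):
    """Render detail falls off with angular distance from the gaze point."""
    if angle_from_gaze_deg < FOVEA_DEG:
        return 1.0
    if angle_from_gaze_deg < PERIPHERY_DEG:
        return 0.5
    return 0.25

def in_saccade(prev_gaze_deg, gaze_deg, dt):
    """Crude saccade detector: if the gaze point is moving fast enough,
    sensitivity is suppressed, so that's a good slot for big re-renders."""
    speed = math.hypot(gaze_deg[0] - prev_gaze_deg[0],
                       gaze_deg[1] - prev_gaze_deg[1]) / dt
    return speed > SACCADE_VELOCITY_DEG_S
```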
In Peter Watts' sci-fi novel "Blindsight"[1] there is a description of aliens that could program their bodies on the fly to move only during the saccades of a single human observer in order to hide themselves[2].