
Hmm, on second thought, you could also just render the "overscan" and have the IMU work with the display to shift the pixels into view while the processor works to keep the overscan updated. On a pause you can re-render everything (overscan included) and start over again on the next movement.

     +------------------------------+
     |   <<< "over scan area" >>>   |
     |   +----------------------+   |
     |   |   viewable by user   |   |
     |   |                      |   |
     |   |                      |   |
     |   |                      |   |
     |   |                      |   |
     |   +----------------------+   |
     |   <<< "over scan area" >>>   |
     +------------------------------+
This might leave enough "headroom" to keep up with potential movements at very low latencies.
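
Rough sketch of what I mean in Python, assuming a numpy-style 2D frame buffer and a hypothetical imu object that reports yaw/pitch in degrees; the constants are made up:

    OVERSCAN = 128              # extra pixels rendered on each side (made-up margin)
    VIEW_W, VIEW_H = 1280, 720  # visible panel size
    PIX_PER_DEG = 15.0          # display pixels per degree of head rotation (made up)

    def scanout_window(frame, yaw_at_render, pitch_at_render, imu):
        """Pick the sub-rectangle of the overscanned frame to actually display,
        shifted by however far the head has moved since the frame was rendered."""
        dyaw = imu.yaw() - yaw_at_render
        dpitch = imu.pitch() - pitch_at_render
        # Convert the head movement into a pixel offset, clamped to the overscan margin.
        dx = max(-OVERSCAN, min(OVERSCAN, int(dyaw * PIX_PER_DEG)))
        dy = max(-OVERSCAN, min(OVERSCAN, int(dpitch * PIX_PER_DEG)))
        x0, y0 = OVERSCAN + dx, OVERSCAN + dy
        return frame[y0:y0 + VIEW_H, x0:x0 + VIEW_W]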



Eye tracking would allow even more latitude. The eye picks up very little detail outside of the central area of focus. It is also very good at inventing information to fill in any gaps - for example, during the periods when the eyeball shifts position we are essentially blind, but the brain fools us into seeing a continuous image.
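
As a rough illustration (this is foveated rendering, not something from the article), a gaze point from an eye tracker could drive per-tile resolution, with full detail only near the fovea; the eccentricity thresholds here are just placeholders:

    import math

    def tile_scale(tile_center_deg, gaze_deg):
        """Resolution scale for a screen tile, given its center and the
        tracked gaze point, both in degrees of visual angle."""
        ecc = math.hypot(tile_center_deg[0] - gaze_deg[0],
                         tile_center_deg[1] - gaze_deg[1])
        if ecc < 5.0:
            return 1.0    # foveal region: render at full resolution
        if ecc < 15.0:
            return 0.5    # near periphery: half resolution
        return 0.25       # far periphery: quarter resolution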



In Peter Watts' sci-fi novel "Blindsight"[1] there is a description of aliens that could program their bodies on the fly to move only during the saccades of one human observer in order to hide themselves[2].

[1] https://en.wikipedia.org/wiki/Blindsight_(Watts_novel)

[2] http://www.rifters.com/real/shorts/PeterWatts_Blindsight.pdf page 236


What an awesome concept! Thanks for the link, I think I'll be adding this to my reading list.


I love that book so much. I recommend it often but nobody ever takes me up on it.


Thanks for the link!

I had to go to the mirror and try the little experiment described in the intro.

Not totally convinced yet, gotta try it with someone as a witness. ;)


I am skeptical of this solution, because in VR, when you turn your head, all the angles change. It is more complicated than just shifting the image.
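
To put a number on that: under a pure yaw rotation, pixels near the center and near the edge move by different amounts, so a single 2D shift can only ever be an approximation. A back-of-the-envelope sketch (simple pinhole projection, made-up FOV and resolution):

    import math

    def reprojected_x(x_px, width_px, hfov_deg, yaw_deg):
        """Horizontal position of a pixel after the view yaws by yaw_deg,
        assuming a pinhole projection."""
        f = (width_px / 2) / math.tan(math.radians(hfov_deg) / 2)  # focal length in pixels
        angle = math.atan2(x_px - width_px / 2, f)                 # pixel's angle off-center
        return width_px / 2 + f * math.tan(angle - math.radians(yaw_deg))

    # For a 5-degree yaw at 90-degree horizontal FOV on a 1280-wide panel:
    # a center pixel shifts ~56 px, a near-edge pixel ~92 px -- not a uniform translation.
    print(reprojected_x(640, 1280, 90, 5) - 640)    # ~ -56
    print(reprojected_x(1200, 1280, 90, 5) - 1200)  # ~ -92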


Might work to first order, though. Could make the effects much smaller.


Carmack talks a bit about one optimization he used in the Doom 3 demo: https://www.youtube.com/watch?v=wt-iVFxgFWk#t=1h51m45s



