Thanks a lot for the answers
The view camera may be attached to the player ship, but the camera position can be anywhere in the universe, as the 1.84 mobile external camera and the external camera OXPs before it demonstrate. Rendering to two viewports in OpenGL is also possible, although I suspect that implementing this properly in Oolite would be trickier than it looks and could carry a noticeable performance cost. This is the opinion of someone who is not a rendering expert, so it could be wrong, but that's what I gather after a fairly quick Google search.
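To illustrate the two-viewport idea, here is a minimal C sketch (not Oolite code; the struct and function names are my own invention) that splits one window into side-by-side left/right eye viewports. Each rectangle would be handed to glViewport() before rendering that eye's pass, so the scene is drawn twice per frame:

```c
#include <assert.h>

/* Hypothetical helper: split a window into two side-by-side viewports,
 * one per eye. Each rect would be passed to glViewport(x, y, w, h)
 * before rendering that eye's view of the scene. */
typedef struct { int x, y, w, h; } Viewport;

static void split_stereo(int win_w, int win_h, Viewport *left, Viewport *right)
{
    left->x  = 0;          left->y  = 0;
    left->w  = win_w / 2;  left->h  = win_h;

    right->x = win_w / 2;  right->y = 0;
    right->w = win_w / 2;  right->h = win_h;
}
```

Drawing the scene once per viewport is what suggests the roughly doubled render cost mentioned below.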
Well, I was expecting at most a 2x performance impact, plus perhaps an unknown memory impact.
If it is implemented, I can't see how it would directly influence other projects. On this subject, the other virtual reality project you are talking about is most likely the headtrack branch on GitHub. This was an experiment that worked (quite well, actually), but it was not merged into the main source because it was quite tricky to set up and required the installation of third-party software to function. It currently conflicts with master, but I would think the existing conflicts could be resolved relatively painlessly (it doesn't hurt to be a bit optimistic every now and then) if we wanted to revive it.
Yes, it's this one. I should go and test whether it still works.
Also, regarding VR headset integration: I had a look at the Oculus SDK about a year ago, and it looked to me like it needed rendering to off-screen buffer objects in order to work, which I don't think is something we do in Oolite yet (I believe we render directly to the screen). So incorporating this or a similar SDK into the game could involve a serious amount of low-level work.
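For reference, off-screen rendering in OpenGL usually means attaching a texture to a framebuffer object and drawing into that instead of the default framebuffer. A minimal sketch follows; it assumes an existing GL context of at least version 3.0 and available GL headers, and is illustrative only, not something taken from the SDK or from Oolite:

```c
#include <GL/gl.h>

/* Hypothetical sketch: create a framebuffer object backed by a color
 * texture, so a scene (e.g. one eye's view) can be rendered off-screen.
 * Requires a current GL context; error handling is minimal. */
static GLuint make_offscreen_target(GLsizei w, GLsizei h, GLuint *out_tex)
{
    GLuint fbo, tex;

    /* Color texture that will receive the rendered image. */
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, 0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    /* Framebuffer object with the texture as its color attachment. */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return 0;  /* setup failed */

    *out_tex = tex;
    return fbo;  /* bind this FBO before rendering off-screen */
}
```

Retrofitting a renderer that draws straight to the screen to go through targets like this is the kind of low-level work being described.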
I wouldn't know
Here is the reference for future analysis:
https://developer3.oculus.com/documenta ... dg-render/