AMD64 would bring more cache pressure on the CPU (64-bit pointers are twice the size), but also a lot more addressable memory for caching the world, plus more registers and a better instruction set… For a game, I expect the extra memory would make the biggest difference.
You give examples of blobs, not DRM. Unless you are referring to the kernel's Direct Rendering Manager, I have not encountered any DRM in years of using Linux on the desktop.
Sensors could get by with some sort of interferometry, rebuilding an image from multiple pixels spread across multiple sensors. The signal processing would be similar to what medical scanners use. We could also see a convergence with radar, combining those pixels and algorithms with an active source.
MySQL's warnings are silent, in the sense that the web application will never act on them. Guessing instead of failing early is a poor choice for a relational database, and is only explained by MySQL favouring ease of use and porting over data integrity.
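To make that concrete, here is a minimal sketch of the "silent warning" behaviour, assuming mysql-connector-python and a throwaway "test" database (the credentials and table are made up for illustration): with strict mode off, an oversized value is truncated and stored anyway, and the only trace is a warning the application has to ask for explicitly.

    # Minimal sketch, assuming mysql-connector-python and a scratch "test" database.
    import mysql.connector

    cnx = mysql.connector.connect(user="demo", password="demo",
                                  host="127.0.0.1", database="test")
    cnx.get_warnings = True   # off by default, so most apps never even fetch warnings
    cur = cnx.cursor()

    cur.execute("SET SESSION sql_mode = ''")   # the old, non-strict behaviour
    cur.execute("CREATE TEMPORARY TABLE t (name VARCHAR(5))")
    cur.execute("INSERT INTO t (name) VALUES (%s)", ("far too long",))
    print(cur.fetchwarnings())  # a 'Data truncated' warning, not an error

    cur.execute("SELECT name FROM t")
    print(cur.fetchall())       # [('far t',)] -- the value was silently mangled
    cnx.close()

With strict mode enabled (the default since MySQL 5.7, via STRICT_TRANS_TABLES) the same INSERT fails outright, which is the fail-early behaviour the parent is asking for.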
It still implies the Direct3D implementation stayed at 270 fps. I wonder how Windows+OpenGL fared before the optimisations; presumably it was lower than 270, or OpenGL would have been the baseline.
The way US politicians are funded, they do not give a shit about this hypothetical lone inventor. This is just a romantic ideal used by the pro-patent lobby to glamorise patents.