A sequence of rational numbers need not converge to a rational number, so you can't play the game with rationals. As a trivial example, consider the sequence of rational numbers 3, 3.1, 3.14, 3.141, 3.1415, ... converging to pi.
You certainly can play the game with rationals, and Bob can win!
The idea is that if Alice's sequence looks to be converging to some rational number α, well, α came up at some point in Bob's enumeration of all rationals and was (by hypothesis) a legal move. So Bob played it, and then played something strictly smaller on his next move, trapping Alice's increasing sequence below that smaller number forever. So Alice's sequence cannot converge to α.
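Here's a small simulation of that strategy, as I understand it (the enumeration scheme and Alice's particular moves are my own choices for illustration): Bob walks through an enumeration of the rationals in (0, 1), plays the n-th rational whenever it's a legal move, and otherwise plays the midpoint. Either way, the n-th rational ends up outside the interval the players are squeezing, so it can't be Alice's limit.

```python
from fractions import Fraction
from math import gcd

def rationals_in_unit_interval():
    """Enumerate every rational in (0, 1) exactly once: reduced p/q for q = 2, 3, ..."""
    q = 2
    while True:
        for p in range(1, q):
            if gcd(p, q) == 1:
                yield Fraction(p, q)
        q += 1

def play(rounds):
    """Alice plays a strictly increasing sequence, Bob a strictly decreasing one
    above it.  Bob's strategy: at round n, if the n-th enumerated rational q_n
    is a legal move (strictly between the current bounds), play it; otherwise
    play the midpoint.  Either way q_n can no longer be Alice's limit."""
    a, b = Fraction(0), Fraction(1)          # current bounds of the game
    processed = []
    gen = rationals_in_unit_interval()
    for _ in range(rounds):
        a = (2 * a + b) / 3                  # Alice: some strictly increasing choice
        q = next(gen)
        processed.append(q)
        b = q if a < q < b else (a + b) / 2  # Bob: play q_n if legal, else midpoint
    return a, b, processed

a, b, processed = play(200)
# Every enumerated rational has been pushed outside the surviving interval,
# so Alice's limit (which lies in [a, b]) can equal none of them.
assert all(q <= a or q >= b for q in processed)
```

If Bob played q_n, every later move of his is strictly smaller, so the limit lands strictly below q_n; if q_n was illegal, it was already at or below Alice's position or at or above Bob's, and stays there.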
I kind of get it, but isn't that a circular proof? It seems you're citing Cantor's diagonal argument as a necessary precondition for the argument to hold for the reals and not the rationals. Specifically, the "it will converge in the same infinite number space" step was stated in the original argument with a reason that should apply equally to the rationals, so I think it's an erroneous step?
Well, my math is quite rusty, but if I remember correctly you can "define" the reals using Dedekind cuts[1] made of rational numbers; in this way you can prove directly that a bounded increasing sequence converges to a real number (i.e., that the reals are complete), without assuming anything about the uncountability of the reals.
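Sketching how I remember completeness falling out of cuts (this is the standard construction, from memory, so take the details with a grain of salt):

```latex
% A real number is a cut: a set $A \subsetneq \mathbb{Q}$ that is nonempty,
% downward closed ($q \in A$ and $p < q$ imply $p \in A$), and has no
% greatest element.
%
% Given an increasing sequence of cuts $A_1 \subseteq A_2 \subseteq \cdots$
% that is bounded above by some cut $B$, the union
%     $\sup_n A_n \;:=\; \bigcup_n A_n \;\subseteq\; B$
% is again a cut, so the sequence converges to a real number.  No counting
% argument about the reals is needed anywhere.
```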
You can also define nothing up front: start from the proof that Bob can enforce that Alice's sequence doesn't converge to any rational number.
Next step, look at the process of playing the game, and show that you can define operations like adding any two games or comparing them for equality etc.
(In some sense, that would be similar to showing that Dedekind cuts define a sensible structure, with the added complication that the game doesn't have to converge to arbitrarily small intervals: the players could agree to leave a finite interval like [1/4, 3/4] untouched, e.g. Alice plays 1/4 - 1/n and Bob plays 3/4 + 1/n.
You'd just be defining interval arithmetic on real numbers.)
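A toy version of that interval arithmetic, with exact rational endpoints (the class and method names here are my own invention, just to make the idea concrete): a finished game is an interval, addition adds endpoints, and comparison is only partial because overlapping intervals are "possibly equal".

```python
from fractions import Fraction

class Interval:
    """A closed interval [lo, hi] with rational endpoints -- a 'finished game'
    where Alice stopped at lo and Bob at hi."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = Fraction(lo), Fraction(hi)

    def __add__(self, other):
        # Sum of two games/intervals: endpoints add.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __lt__(self, other):
        # Definitely less than: every point of self is below every point of other.
        return self.hi < other.lo

    def overlaps(self, other):
        # Overlapping intervals can't be distinguished -- comparison is partial.
        return self.lo <= other.hi and other.lo <= self.hi

x = Interval(Fraction(1, 4), Fraction(3, 4))   # the untouched [1/4, 3/4] from above
y = Interval(Fraction(1, 10), Fraction(1, 5))  # a thin interval entirely below it

assert y < x
assert (x + y).lo == Fraction(7, 20) and (x + y).hi == Fraction(19, 20)
```

Note that equality of games would have to be "intervals overlap", which isn't transitive, which is exactly the sense in which a non-shrinking game pins down an interval rather than a single real.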