Yes, I can quite easily think of Graham's number - I call it G.
You might think that I'm saying that in jest, but it is actually the way of things. There are quite literally an infinite number of numbers. Why should Graham's be important enough for us to consider? For that matter, why should we consider 1?
The answer lies in part in the question - why - it must be important in some way. That is exactly the case - we only consider numbers that, for some reason, relate to problems we are facing, whether in mathematics, programming, or grocery shopping, and even then not on their own merits. Past a certain point, you don't hold numbers in your head as little balls, but rather as their decimal encoding. That's a clever hack, but not without its downsides. Ask one group of people how much budget they would allocate, if they were in a position to decide, to save 10,000 birds, and ask another group the same about 100,000 birds. The mean of the second answer would not be ten times that of the first (unless your subjects know about and actively try to work around that particular bias).
But back to Graham's number - if you think the only way to imagine it is to hold its decimal notation in your head, the obvious question is why? Why not hexadecimal? Octal? Why any base-p encoding at all? The usual reason to use base-10 is to quickly get an idea of the magnitude of a number relative to things we are familiar with, and to do some simple arithmetical manipulations on it. Both of these are pointless for Graham's number. Relative to familiar things like billions and trillions it's quite a ways off the chart, and adding, subtracting, or even raising it to powers of such things makes practically no difference.
Besides which, in this case it isn't just pointless, it's impossible. Jokes about collapsing into black holes aside, there aren't enough bits of information in the universe to encode the decimal representation of Graham's number. There aren't enough bits of information to encode the number of digits of that. Not enough even to encode the number of digits of the number of digits. Interestingly, there also aren't enough bits to encode the number of times you would have to repeat taking the base-10 logarithm to get back down to a number on the scale of the universe.
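To make the last point concrete, here's a minimal sketch (plain Python, nothing specific to Graham's number assumed): each extra level in a power tower like 3^3^3^... costs exactly one more application of log10, so the count of logarithms is essentially the height of the tower, and for Graham's number even that height can't be written down.

```python
import math

def log10_count(x):
    """How many times must log10 be applied before x drops below 10?"""
    count = 0
    while x >= 10:
        x = math.log10(x)
        count += 1
    return count

print(log10_count(3 ** 3))       # tower of height 2 (27)            -> 1 iteration
print(log10_count(3 ** 3 ** 3))  # tower of height 3 (~7.6 trillion) -> 2 iterations
# A tower of height 4 already has ~3.6 trillion digits, too big to even build here,
# but it would take exactly 3 iterations. Graham's number dwarfs any tower whose
# height we could write down, so the iteration count itself is beyond encoding.
```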
Which goes to show that it's not just your mind: no entity in the universe, mathematician or otherwise, can accomplish the feat of representing an arbitrary number in decimal notation. And what is the point, really? G is a perfectly good symbol, and so is g_64. Usefulness is what matters, not imagining a string of balls. For that matter, I bet you've been perfectly content to use Pi on at least a few occasions, even though in terms of representability in decimal it is infinitely worse than Graham's number.
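For anyone curious what g_64 actually stands for, here's a small sketch of the recursion behind it, using Knuth's up-arrow notation (only the tiny cases are evaluated; the rest stays in comments, because nothing could evaluate them):

```python
def up(a, n, b):
    """Knuth's up-arrow a ↑^n b: one arrow is exponentiation,
    each extra arrow iterates the previous operation."""
    if n == 1:
        return a ** b
    if b == 1:
        return a
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 2, 2))  # 3↑↑2 = 3^3  = 27
print(up(3, 2, 3))  # 3↑↑3 = 3^27 = 7,625,597,484,987
# 3↑↑↑3 = 3↑↑(3↑↑3) is already a power tower of 3s about 7.6 trillion levels tall.
# Graham's number: g_1 = 3↑↑↑↑3, then g_k = 3 ↑^(g_{k-1}) 3, and G = g_64.
# The loop below is the entire definition, and it would never finish a single pass:
# g = 4                      # number of arrows for g_1
# for _ in range(64):
#     g = up(3, g, 3)        # each result becomes the arrow count for the next step
```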
> The mean of the second answer would not be ten times that of the first
I'm not sure how good an example this is to argue your point, because the second answer should only be ten times the first if the cost structure is a linear function with a zero constant term. Which seems like an awfully big assumption -- you might get economies of scale (nonlinear, decreasing derivative), or you might pick the low-hanging fruit first and then have to go after more difficult birds (nonlinear, increasing derivative), or you might have fixed costs that give you a constant term.
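To put that in numbers (all cost figures below are invented purely for illustration): under a linear model with zero fixed costs the 100,000-bird budget is exactly 10x the 10,000-bird budget, a fixed cost pulls the ratio below 10, and increasing marginal costs push it above 10.

```python
FIXED = 50_000      # hypothetical one-time setup cost
PER_BIRD = 10       # hypothetical cost per bird

def linear(n):                 # zero constant term: cost scales exactly with n
    return PER_BIRD * n

def with_fixed_costs(n):       # same marginal cost, plus a fixed setup cost
    return FIXED + PER_BIRD * n

def increasing_marginal(n):    # later birds are harder (and costlier) to reach
    return PER_BIRD * n ** 1.2

for cost in (linear, with_fixed_costs, increasing_marginal):
    ratio = cost(100_000) / cost(10_000)
    print(f"{cost.__name__}: 100k/10k budget ratio = {ratio:.2f}")
# linear:              10.00  (exactly ten times)
# with_fixed_costs:     7.00  (fixed cost gets amortized)
# increasing_marginal: 15.85  (10**1.2, low-hanging fruit picked first)
```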
That being said, many people do have a lot of difficulty conceptualizing the difference between millions, billions and trillions [1].