This is so out of touch with the modern web. Lots of totally legitimate websites, including major news outlets and Wikipedia, render their pages on the client using JavaScript. Whether that's a good thing is a separate issue (personally I see nothing wrong with it), but it should be obvious that a useful search engine in 2019 needs to be able to index JavaScript-rendered content.
I just noticed I haven't been allowing any JS whatsoever from Wikipedia, and everything still looks fine: I'm logged in, can edit, etc. (I've fixed that now; all allowed.)
But that's not my point: how did we go from "search engines shouldn't execute JS" to "you're out of touch if you think you can use the web without JS"?
I, for one, would welcome a force with the size and influence of Google telling devs to cut the JavaScript crap. To me, it’s like violating the conditional rendering rule: there are plausible reasons to want to do it, but Google is right to push back on it.
It’s not as if totally legitimate websites don’t have reasonable technical alternatives here.
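One such alternative (my assumption about what's meant here) is plain server-side rendering: instead of shipping an empty shell plus JSON that the browser assembles with JavaScript, the server emits finished HTML that any crawler can index without executing anything. A minimal sketch, with a hypothetical `render_article` helper:

```python
# Sketch of server-side rendering: the server builds the complete
# HTML page, so no client-side JavaScript is needed to see content.
from html import escape


def render_article(title: str, paragraphs: list[str]) -> str:
    """Return a finished HTML page for the given article content."""
    # Escape user-supplied text so it can't inject markup.
    body = "\n".join(f"<p>{escape(p)}</p>" for p in paragraphs)
    return (
        "<!doctype html>\n"
        f"<html><head><title>{escape(title)}</title></head>\n"
        f"<body><h1>{escape(title)}</h1>\n{body}\n</body></html>"
    )


page = render_article("Hello", ["First paragraph.", "Second & third."])
print(page)
```

The same template can still be progressively enhanced with JS on the client; the point is that the indexable content doesn't depend on it.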