
This is so out of touch with the modern web. Lots of totally legitimate websites, including major news outlets and Wikipedia, render their web pages on the client using JavaScript. Whether or not this is a good thing is a separate issue (personally I see nothing wrong with it), but it should be obvious that a useful search engine in 2019 needs to be able to index JavaScript-rendered content.
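For a sense of what indexing JavaScript-rendered content entails on the crawler side, here's a minimal sketch (assuming Puppeteer; the function name and URL are placeholders, not anything any search engine documents): render the page in a headless browser first, then extract the text.

    // Render a page in a headless browser before extracting text to index.
    // Sketch only: assumes Puppeteer is installed; the URL is a placeholder.
    import puppeteer from 'puppeteer';

    async function fetchRenderedText(url: string): Promise<string> {
      const browser = await puppeteer.launch();
      try {
        const page = await browser.newPage();
        // 'networkidle0' waits until the page stops making network requests,
        // giving client-side frameworks a chance to render their content.
        await page.goto(url, { waitUntil: 'networkidle0' });
        return await page.evaluate(() => document.body.innerText);
      } finally {
        await browser.close();
      }
    }

    fetchRenderedText('https://example.com').then(console.log);

A plain HTML fetch stops before any of the rendering work above, which is exactly the gap being argued over.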



I just noticed I haven't been allowing any JS whatsoever from Wikipedia. Everything looks fine. I'm logged in and can edit, etc. (I fixed that now; all allowed).

But that's not my point: how did we go from "search engines shouldn't execute JS" to "you're out of touch if you think you can use the web without JS"?


I guess the assumption was that search engines should be able to access the web, like humans do.


I, for one, would welcome a force with the size and influence of Google telling devs to cut the JavaScript crap. To me, it’s like violating the conditional rendering rule: there are plausible reasons to want to do it, but Google is right to push back on it.

It’s not as if totally legitimate websites don’t have reasonable technical alternatives here.
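The obvious one being server-side rendering: put the content in the initial HTML response and layer JavaScript on top as an enhancement. A toy sketch (Express is just for illustration here; the route and article data are made up):

    // Serve the article content in the initial HTML so it is readable
    // without JavaScript. Sketch only: Express, with a fake data source.
    import express from 'express';

    const app = express();

    // Stand-in for a database or CMS.
    const articles: Record<string, { title: string; body: string }> = {
      'hello-world': { title: 'Hello, world', body: 'Readable without JavaScript.' },
    };

    app.get('/articles/:slug', (req, res) => {
      const article = articles[req.params.slug];
      if (!article) return res.status(404).send('Not found');
      // Crawlers and no-JS clients get the content in the first response.
      res.send(
        `<!doctype html><html><body><h1>${article.title}</h1><p>${article.body}</p></body></html>`
      );
    });

    app.listen(3000);

With that shape, the JavaScript becomes optional for reading the page rather than a requirement.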


> I, for one, would welcome a force with the size and influence of Google telling devs to cut the JavaScript crap.

Wasn't this basically AMP though? The solution to the monster web developers were creating?


Hacker News hates that too.


No, that was a big ol’ web cache.

The solution to this is: “you have one year to make your webpages readable by the Googlebot without JavaScript. Have a nice day.”



