Not meaning to pick at anything here, but to me web dev necessarily requires knowledge and configuration of a web server.
What's being sent to the browser in a production setting is an HTTP message, not just the web content.
Many features of a non-trivial web page/app will at least require HTTP headers to be set. The "file://" protocol handles this by mocking the full HTTP response for you in the browser, but any deviation from that response on an actual web server has the potential to break your page. I wouldn't rely on opening files directly.
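To make that concrete, here is a hedged sketch of two common things that work over http:// but typically break over file:// (filenames are placeholders):

```html
<!-- Works over http://, typically fails over file:// -->
<script type="module" src="main.js"></script> <!-- module scripts are subject to CORS; file:// has an opaque origin -->
<script>
  fetch('data.json').then(r => r.json()); // fetch of a sibling file is blocked under file:// in most browsers
</script>
```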
You can also use the http-server npm package, but it has an issue with non-ES6 module resolution. I have started using Vite since it's lightweight, but it also provides nice-to-haves like module resolution for different module types, and hot reloading.
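For anyone curious, getting started is only a couple of commands (assuming npm is installed; `my-app` is a placeholder name):

```shell
# in an existing directory of plain HTML/CSS/JS:
npm install --save-dev vite
npx vite            # dev server with hot reloading
npx vite build      # production bundle in dist/

# or scaffold a fresh project from a template:
npm create vite@latest my-app
```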
I am not affiliated with Vite in any way, I just really like it for vanilla web dev.
I also used http-server, but pressing F5 or including live.js gets old quickly. Vite is a breath of fresh air, both for vanilla formats and for “apps” with SCSS, TS, etc. Finally someone made a no-BS dev watcher and packager.
The only thing to note for purists: it creates a .vite folder, which you have to gitignore.
Also following this approach; it's so liberating not to have "all these dependencies" and not to have to learn them.
Been writing vanilla CSS, HTML, and JS (and sometimes using static site generators) to help split the logic between `/content` (markdown, org, txt, json) and `/assets` (js, css, pdf).
When all of this is plugged into git, deployed/published with CI/CD, and served as static pages (Jamstack style, git-provider pages), things are great!
I started to really learn about computers (i.e. using Debian instead of Windows) in the mid-2000's, but it wasn't until Chrome and Node.js popped up everywhere at the start of the next decade that I really began to dig into hardcore application development.
So it was really a no-brainer for me to go along on the entire JS ecosystem ride of the 2010's, purely for the purpose of trying to get my computer to do what I wanted it to do (as opposed to making "apps" for everyone else).
In 2012, I started working on the code of the project now acronymically known as LOTW (Linux on the Web [1]), but I suppose it could just as easily have gone by the letters WDWT (Web Dev Without Tools).
> The browser has matured enough that you can build a complex app without any of the above tools
The irony of this sentence is that there was a time before these tools existed when we also did complex web development in browsers as they were, for at least a good 15-20 years.
Nostalgia is powerful. I’m building way more complex and objectively better things now than I was then. The frustrations involved might not be any lesser, but it’s hard to argue the capabilities and level of productivity aren’t improved.
As another commenter mentioned, the built-in Python server is nice.
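For reference, it's a one-liner (Python 3; serves the current directory, port 8000 is the default):

```shell
python3 -m http.server 8000   # serves ./ at http://localhost:8000
```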
I would also like to give a shout-out to Caddy if you need a more configurable web server. After dealing with complex config in Apache and nginx, Caddy is a real treat to use.
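To illustrate the difference, a minimal Caddyfile for serving static files looks roughly like this (a sketch; `./public` and the port are assumptions):

```
localhost:8080

root * ./public
file_server
```

Compared to the equivalent Apache or nginx config, there's very little ceremony, and HTTPS is automatic for real domains.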
JS modules are great, but the problem isn't the size of the codebase or the bandwidth available, it's the latency, which AFAIK you can't test for in browsers. Five chain-dependent files at 200 ms RTT once around the planet add a second of load time. Sure, you can add a CDN, but low-traffic websites will end up with latency spikes due to cache misses.
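One standard mitigation that doesn't require a bundler is to declare the deeper modules up front, so the browser fetches the whole chain in parallel instead of discovering it one round trip at a time. A sketch, assuming main.js imports utils.js, which in turn imports helpers.js (paths are hypothetical):

```html
<link rel="modulepreload" href="/js/utils.js">
<link rel="modulepreload" href="/js/helpers.js">
<script type="module" src="/js/main.js"></script>
```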
> So you can't open your index.html over the file:// protocol.
It's not a dealbreaker but that is a significant inconvenience for me. I do local development over file:// a lot.
One thing I'm wondering about is dynamic loading of modules. Say my main.js should only load extra.js under certain circumstances. How does that look?
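For what it's worth, that's exactly what dynamic `import()` is for: it returns a promise for the module, which is fetched and evaluated on first use and cached afterwards. A sketch (in a real app you'd call `import('./extra.js')`; here a `data:` URL module stands in for extra.js so the snippet is self-contained):

```javascript
// Hypothetical stand-in for './extra.js' so this snippet runs anywhere:
const extraSource = 'export function init() { return "extra loaded"; }';
const extraUrl = 'data:text/javascript,' + encodeURIComponent(extraSource);

// Load the extra module only under certain circumstances; the module
// loader fetches it on the first call and caches it after that.
async function maybeLoadExtra(condition) {
  if (!condition) return null;
  const extra = await import(extraUrl); // in real code: await import('./extra.js')
  return extra.init();
}

maybeLoadExtra(true).then(result => console.log(result)); // "extra loaded"
```

Browsers only fetch extra.js when the `import()` line actually runs, so the conditional load costs nothing until it's needed.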