My transition to an Ubuntu workstation (ryannjohnson.com)
208 points by ryannjohnson on Oct 23, 2019 | 231 comments



I don't get the author's complaint about reinstalling Windows every few months; imho it is completely insane. Even in the age of Win95/98 I never needed to reinstall more than maybe once a year, and I was in school at the time with all software downloaded from shady sources. Now there is zero need to reinstall at all - I only do it when I change the hardware platform completely - storage, MB, etc.

Same with "unsolicited popups". What are you using that does this? Windows itself shows me reboot needed popup once a few months maybe and that's about it. Some software can show it's own popup when I launch it and it is often relevant.

Windows is not perfect and Linux is great (and not perfect either), but really, some complaints about Windows seem to come straight from the late 90s, with custom virus-packaged Win98 builds on torrents.


Even with a paid Windows 10 Pro you get ads for games right in the Start menu. It's insane.


My favorite Windows feature is how the start menu search is so slow. I hit the windows key, type the name of the app, hit enter and I get an Edge window with what I typed as a Bing query.


I cut the start menu out completely and replaced it with Keypirinha. Instant fuzzy search with no shady web requests, just like on Linux with xfce4-whiskermenu/rofi.

The script looks something like this. It's written from memory and I don't have a Windows machine nearby to test it, but it's better than nothing.

    $path = "$env:windir\SystemApps\Microsoft.Windows.StartMenuExperienceHost*"
    Get-Process *StartMenuExperienceHost* | Stop-Process -Force
    takeown /f $path /r /d y
    icacls $path /grant "${env:UserName}:F" /inheritance:d /t /c
    Remove-Item -Recurse -Force $path


I try to stay away from Windows, but Keypirinha is my go to as well when I'm forced onto it. That and VirtuaWin make Windows a lot easier to deal with.


Start menu 'web search' is the first thing I disable. Of all the stupid things in windows 10, web search is quite possibly the worst. It's a feature nobody wants except Microsoft.


This comment makes me realize I should go disable it right now. Thank you


Sadly in the latest win 10 update it's no longer a simple setting in group policy. You have to muck around in the registry.

https://www.howtogeek.com/224159/how-to-disable-bing-in-the-...


Create dword with value 0 for BingSearchEnabled in [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Search]

I had already disabled Cortana; if you haven't, also create a CortanaConsent dword with value 0.
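
If you'd rather script it, the same tweak in PowerShell is roughly this (untested sketch; sign out and back in, or restart Explorer, for it to take effect):

    $key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Search'
    # both values are DWORDs set to 0, as described above
    New-ItemProperty -Path $key -Name BingSearchEnabled -PropertyType DWord -Value 0 -Force
    New-ItemProperty -Path $key -Name CortanaConsent -PropertyType DWord -Value 0 -Force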


It may be because I am running a Windows Insider build (19008.1), but this registry tweak doesn't seem to work for me. Any other tips?


Disable websearch, and just start typing right away.

I agree that a lot of Win10 defaults are derpy. But just tune it a bit and it's reasonably power-user friendly. Maybe not what you want to support personally, but sufficient for professional use.


Windows is like desktop linux from the 00s now. There are several different UI toolkits, all of the settings buried in separate areas, and the desktop defaults are completely unusable without a lot of tweaking.


How do you disable it?


Registry >.> https://superuser.com/a/1325836

Like I said, maybe not the crap you want to put up with in your personal life, but as a professional tool... worth tweaking to suit your needs.

(Oh, and in a full professional environment, can also be set by group policy)


Which displays ads next to the search results, which makes MS money. Coincidence? I think not.

Though as Hanlon's Razor says: "Never attribute to malice that which is adequately explained by stupidity."

Part of me wants Microsoft to be so evil they have deliberately made the start menu search so impossibly poor that you end up opening a web search half the time and making them money. But I know deep down, that they are just incompetent and can't be bothered with the desktop anymore.


> "Never attribute to malice that which is adequately explained by stupidity."

Yes, but: "Sufficiently advanced stupidity is indistinguishable from malice"


At the same time I've had instant search in menus in Linux (gnome, kde, xfce, whatever, you name it).


Change the lock screen wallpaper away from 'Windows Spotlight' to an image of your choice.

FWIW on a Windows PC, Windows Spotlight shows me nature and city photos with a bit of info about the place. I think I've seen a game wallpaper maybe once in all the time that PC has run Windows 10. It was pretty (some sort of anime-style game?) and not exactly a typical ad.

Looking at Google Images, looks like most of these 'game ads' are similar -- but I can understand how they can be annoying if you aren't a gamer.


I think they mean tiles in the start menu advertising various games like Candy Crush-ish stuff.

If memory serves, I removed them and moved on with my day. Think the occasional big update will add a new one, but then you remove it as well and you're good for a while.

That's interesting though. I game heavily on my Windows 10 instance. Most web browsing is gaming related and this is done in Edge as well. All I get are the scenic/animal lock screen images. Really wish they'd get some images that support 3440x1440 because it looks pretty rough (Windows Store looks even worse).


I actually like the Spotlight screen wallpapers a lot.


So do I. I think if Apple had a default screensaver/lockscreen image server and every now and then showed a photo of the new Apple Watch or something, the Apple people would oooh and aaaah over how amazing it was.

I run Windows 10 (pro) with nearly all the default settings. It took me a while to even realize what these "ads" were that people were complaining about. A free game that gets included with the OS! That's it.


Surprisingly, it's the feature I love the most in W10.

The quality of the images and the curation is surprising.


The first things to do on a new Windows 10 install are to run decrapifier.ps1 and install Classic Shell.

Problem solved. It is a shame that the default is so garbage but there's an easy solution.


I have yet to see an ad in the start window. Yes it is anecdotal but maybe my experience is more current.


I work at an MSP and install Windows almost daily. On most systems, I install Windows Professional version 1903, the latest version. On all of those, Candy Crush, Skype, OneDrive, Office, and a myriad of other App Store apps autoinstall. Each of those are basically ads for paid products. This is on the small-business-oriented "Professional" version of Windows, mind you.

Not only do they install, but they immediately fill the start menu tiles with this spam.


I could be wrong, but I'm pretty sure those autoinstalls are placeholders. If you go into add/remove programs, they're around ~120kb give or take. I think this goes along the lines of the "Office" install that would come with Windows 7 or 8 installs where you click it and it tries to rush you through the process of acquiring Office.


That seems to be right in line with parent calling them advertisements.


The parent (unless we mean further up chain) didn't call them advertisements, but stated that they are store apps that auto install. There is certainly a placeholder there so they are for sure advertisements, but I don't believe you could take this fresh install, take away internet access and click on Candy Crush and start playing it while on a flight. You'd need to download the actual app first.


I said "Each of those are basically ads for paid products." OneDrive and Office 365 do fully autoinstall, and if it's impossible for a layman to differentiate between an ad and a native app, I consider it at least partially autoinstalled.


OneDrive seems to be built into Windows 10 as far as I'm aware, but you must configure it to do much of anything. There was a proper OneDrive application at one time, but I don't know how much of a thing it has been since Windows 8 (10 it just seems part of Windows Explorer plus some hooks in 10 proper). It doesn't show as a tile for me on either my personal or work machine, but that might not be the case for everyone. If I click on it in the applications list on the left, it opens Windows Explorer to the OneDrive "folder".

Office 365 is interesting because on a clean install, I still had to go to office.com to install it. I didn't have icons for Word, Excel, Powerpoint, etc. and certainly not the GBs of data already consumed by it. I'm looking at my work machine at the moment and it has an "Office" tile that I can sign in and use the web versions with, but it doesn't appear to be a full Office install.


My girlfriend's got Windows 10 on her computer; the other day she asked me if I could fix the start menu for her. I hadn't really looked at it before then. It was full of crap she didn't install, and there was a popup in the bottom corner advertising an Office 365 subscription. She just wanted me to get rid of the tiles. There's no option for this, only an option to make them bigger. So in the end I sat and manually removed them. 2/3 of them were links to crap like Amazon and eBay, Candy Crush and other shit that just came preinstalled. Nothing she actually uses on her computer ever ended up pinned there.


They have previously served up game wallpapers with a link to more info about the game, or a link to the Windows Store page for that game.

E.g., googling turns this up: https://www.howtogeek.com/243263/how-to-disable-ads-on-your-...

But as I mentioned in my other comment, for me it's been literally once and I leave 'Windows Spotlight' on, because I like the random city/nature photos it shows.


>Same with "unsolicited popups". What are you using that does this?

Open Chrome.

"Try Microsoft Edge!"


For me it was OneDrive. Not sure how to explain it, but it felt like every Nth time I hit CTRL+S to save my work, Windows would raise a "Hey, try Microsoft OneDrive!" modal which stole focus from what I was doing.

One way to absolutely guarantee I will never use your software is to constantly nag me about trying it.


My favorite is that if you turn on "Focus Assist", which is supposed to suppress popups so you can focus, it creates a notification alerting you that you have Focus Assist on. Not just once, but every time you turn on your computer.


I haven't seen this beyond 1809


Now with Chrome! (without the chrome)


"Now - there is zero need to re-install at all, I only do it when I change hardware platform completely - storage, MB etc."

This isn't true.

It's routine for goofy conflicts even within MSFT's own stack (Office + Windows + whatever) to put you in a place where reinstalling Windows is the easiest fix.

Plus, if you install and uninstall lots of software (e.g., if you work at a software company), C:\Windows just gets bigger and bigger over time. The only way to reclaim that space is to wipe & reinstall.


I do plenty of installs/uninstalls of MS software - Office re-install, Office upgrade, MSVS install/uninstall (all of this due to other reasons, not because it was "slow") - and I don't see any regressions because of this, including on my older work laptop with a 3-year-old continuous Win7 install.

As for the size of the Windows dir - it's not "leftovers" from uninstalls (though those exist too, of course), it is a feature of Windows: 95% of it is Windows updates. The whole chain of updates is stored locally and can't be safely deleted (according to MS engineers). Yes, it grows very big, especially if the starting point was an older version/SP of Windows. In extreme cases, e.g. installing the release version of Win7 and then applying all service packs and updates, it will increase by 15-20GB. But it is a known, expected "feature" and it doesn't slow down your PC, it only clutters your C:\ partition. And a re-install will not reclaim this space; it will be taken again immediately as soon as updates arrive on your fresh install of Windows.

If by chance you are talking about clutter from 3rd-party apps, then it is not a Windows problem. E.g. some drivers like to keep older installers in appdata, and some apps keep backups of data there - the worst offender is the Garmin updater, which creates 12GB backups with every map update. That's not a Windows issue.


It grows very big even if you start with Win 10.

Honestly, it's indefensible to have a 30GB Windows folder.


Office hasn't had goofy conflicts with Windows since it switched to App-V-based installs nearly a decade ago (just after Office 2010, IIRC). (3rd Party Extensions to Office may still do dumb things, but the base Office install itself won't.)


last year I had to (ie, on MSFT's suggestion) reinstall Windows to fix a conflict with Outlook and Office, all with modern revs, so ... ?


> I don't get author's complaint about reinstalling Windows every few months, imho it is completely insane.

The author is not claiming it's necessary, only that they like to reset their computer to a known good state.


The basic OS install is rarely what you would consider a "good state" (unless you have superficial use cases, maybe). That being said, I'm surprised people don't use recovery points/custom Windows install images more (especially in a business environment when installing multiple computers).

They can save you a lot of pain.


The author also said they take an image of their "good state" and revert to that, not that the OS is in that state by default.


Install -> happy -> slows down -> random crashes -> slows down more -> reinstall -> repeat

One thing I think happens is Registry gets abused by apps and such over the years.


This is one of those 90s inspired urban tales. Even a decade old CPU can rip through the entire registry without breaking a sweat.

And it is just a hierarchical database, nothing special about it. It shouldn't be very different than accessing files on the file system.

(there were some bugs at the past that caused Windows Explorer to slow down but those were fixed long ago and they were just bugs with a single application, not something inherently slow with the registry itself)


It's not the registry itself, it's using the word "registry" as a synecdoche for all the things that may be installed there. And hooked into Windows subsystems.

Speaking of Explorer, one of the classic ways to end up with a Windows system that's slow for no readily apparent reason is shell extensions. Either on the rightclick menu or in the thumbnail engine. At one point I had a folder I couldn't navigate to without getting an error popup that mentioned Nvidia in it - a chain of handlers had been installed such that JPEG decoding was delegated to the graphics card, so trying to create and cache the thumbnail of a corrupt file crashed in an unrelated-looking place.


Well yeah, but it isn't specific to the registry or even to Windows; you could get that with pretty much any extensible system. In Window Maker, for example, you can set up the application menu to use a dynamically filled entry, so I wrote a shell script that scans my Steam folder to fill in the installed games. Any mistake there could have had a similar outcome (e.g. launching Steam every time I right-clicked on the desktop, or generating an endless .\.\.\.\.\.\.\. list from not skipping the . directory, or whatever). KDE and GNOME have many extensions, and people already complain about GNOME's performance when it comes to shell extensions.


It's not that the registry grows large. It's because it accumulates cruft. If your software relies on information in the registry to make decisions, it'll often make decisions based on outdated information a long uninstalled (or upgraded) program left there.

It's the wrong solution to an old problem: having your configuration in hundreds of .INI files was a problem because opening and reading files took a long time. Instead of solving the problem and making opening and reading hundreds of files fast (which would be awesome for a number of other uses of a computer), they created a new file that every application would read and write to. It seemed like a great idea at the time.

Every problem has a solution that's simple, elegant, and wrong.


> It's not that the registry grows large. It's because it accumulates cruft.

This seems to be the correct answer, but not for the reason you think ("outdated information"). The registry is a highly indexed database, because it needs to be quick at answering if say a low-level kernel-level driver needs something in real time for an operation. Many of those indexes are variations on Most-Recently-Used (MRU) caches. The more applications that make random queries to the Registry, the more those MRU caches churn and the more likely the MRU caches are full of small, one-off application data versus important stuff that needs to be queried as quickly as possible. (Over time Windows has seemed to work to compensate by changing indexing strategies, but at a high level a lot of the slow down issues with registry still boil down to index problems.)

The Registry wasn't intended to be a general application configuration store, per the original registry usage guidelines. Certainly that ship sailed.

(Also, the key problem with INI files was not that opening/reading them took a long time, it was the same centralization problem: almost every Windows 3.x application stored their INI in C:\WINDOWS for a number of dumb reasons. When Windows 9x started enforcing basic directory ACLs and not every program under the sun could read/write to arbitrary files in C:\WINDOWS, people scurried for the Registry as that seemed the closest to what they were already doing, squirreling config data in C:\WINDOWS for dumb reasons. It took a while for developers to realize they should just put config files in whatever format they prefer in %LOCALAPPDATA% or such like and stop trying to centralize them.)


Thank you. It's very interesting.

In the Windows 3.1 days I had some interesting problems when importing data into an MS Access database that would contain reports for our senior execs to read while offline (the web was not a thing back then); import throughput slowed down to a near-halt after 30 megs of data or so. Our solution was to do the import in chunks and defragment the database between them.


> If your software relies on information in the registry to make decisions, it'll often make decisions based on outdated information a long uninstalled (or upgraded) program left there.

Why is your software looking at registry entries that it didn't create?


> Why is your software looking at registry entries that it didn't create?

I may want to know where Photoshop is installed so I can register a plugin. Or know about the printers installed. Or know more about the environment than the OS is willing to tell me. Or those may be entries that older versions of other applications of mine did create. Or Windows. Or older versions of Windows.

There are many reasons to look into registry entries that aren't the ones you wrote.


I agree with everything that you said, except the part about Windows Explorer slowdowns being fixed. I still find it to be an order of magnitude slower than Gnome's file manager for simple cases. I understand some of the reasons why this is, but it's still pretty nasty.


I might have agreed with you up to ~2 months ago (my 2 Win7 installs had been running for 3-4 years each) until a real, normal, official Radeon graphics card update completely hosed the install. Unusable and unrecoverable from the save point thing, whatever it's called.

Sure, it happens rarely - but it really depends on your usage patterns. If you often have to try out software on a regular basis (I've always used Windows VMs for work stuff when I needed to evaluate things from $randomVendor) it might accelerate this. Or sometimes it's a piece of hardware. yes, some are 90s tales but to some degrees it has persisted.


So why does a search of the registry with regedit take so long?


Plus, between Chocolatey, having the data synced with a Synology NAS, and making the effort of finding the command line for all my customizations of Windows, I have deploy scripts that mean I am operational in less than 30min.
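
The package half of such a script is basically a one-liner along these lines (the package names are just examples):

    choco install -y git 7zip vscode firefox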


> Same with "unsolicited popups". What are you using that does this?

Probably refers to the "notification center" in the bottom right next to the systray.


Agreed.

Windows ME taught me a lot of storage, backup and organizational skillsets, because I did have to refresh it every month. Literally. For two years (yay for Norton Ghost).

To your point however, since Windows 7, I really haven't had a need to refresh any of my or family's systems.

It does take me ~10min to get rid of some junk upon new install, but only because I'm pedantic. In reality the couple of pre-installed things don't bother me.

I don't really know about "ads". I don't see them as screensaver (just nice rotating landscapes), and again, once I uninstall initial Start Menu games and crap, I don't see them there either.

Really, the most annoying thing is that I have to persuade it several times I've tried Edge and am making my choice with Firefox :P


Windows eats itself slowly over time, and it requires reinstalling to right the ship.

I don't know why this is the case, but it's been the case since the 95/98 era atleast (I don't have experience previous to these).

A reinstall isn't required, but it absolutely runs better once you do so. The only improvement I've seen is that generally it will take longer before windows needs a reinstall. It's possible you just don't realize how much performance/speed you've lost over time.


Windows gets a big update every few months. Those would always fail for me because I had a custom partition layout (Hackintosh) and I would have to reinstall every time.

Popups? Maybe he means the new notification system that was included with... 8? I hate notifications that cover part of my screen while I use my computer/mobile, so I have them disabled. I can understand that they are annoying for some, and they are not trivial to completely disable.


I mean, yes, there are popups. But my Linux installation does exactly that too; every login I get a software update popup by default, and if I don't want to see it I need to change options manually (and update settings in Linux are far from trivial).

As for bad upgrades - I had Linux upgrades which broke something in the video path and I was stuck with a broken X server and no GUI. And I didn't manage to repair that state even after extensive searching on Stack Exchange, so I just deleted everything and reinstalled. Another time I tried to install custom software and ended up in an error hell of incompatible versions of libc, libtirpc, etc. and it simply didn't work (it was on Debian), but at the same time it worked just fine on another VM with a different distribution.

No OS is perfect currently.


Oh, definitely. Linux is worse, that's why I use Windows. Doesn't mean Windows is flawless, evidently.


You're running mis-licensed software on a wonky partition and you're complaining about Microsoft!?


Aha? What does Windows care if I have partitions at the end of the drive that it can't even read?


Well, I had to re-install Windows 7 many times due to what I believe is Visual Studio. After many updates it just fills my hard drive with the library redist packages etc. Then things stop working correctly. When I got a new machine for Windows 10 I did not install Visual Studio; no more problems so far.


> Even when I was running Windows, I would wipe my computer every few months. I kept a "Backup Image" handy with all my settings already installed. I've spent far too many hours trying to undo damage I've done to my systems by installing random software from the internet onto my workstation; I've come to value the option of resetting my computer to a known, healthy state immensely.

I often thought I was the only one. On Windows, macOS and even Ubuntu I was reinstalling my OS at least three or four times a year. I had install scripts and GitHub repos designed to reconfigure my machine after each install. The process always took a full day.

Now I'm on NixOS. I have three .nix text files, .emacs, .bashrc and that's about all I need to backup in order to clone my environment on any machine.

I've also stopped reinstalling my OS all the time, as I always know my machine's state, just by looking at those few files.
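
For a sense of scale, one of those .nix files is roughly this shape (a sketch with placeholder names, not the actual file):

    # configuration.nix -- sketch only; the user and package names are placeholders
    { config, pkgs, ... }:
    {
      environment.systemPackages = with pkgs; [ emacs git firefox ];
      services.openssh.enable = true;
      users.users.alice = {
        isNormalUser = true;
        extraGroups = [ "wheel" "networkmanager" ];
      };
    }

Running nixos-rebuild switch after editing it is what makes those few files the single source of truth for the machine's state.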


I was puzzled by the article author's statement on this. My main daily machine is Windows 10. It's been running the same 'copy' of Windows 10 since late 2016, has gone through all the major updates since then, and has had many applications installed and removed on it in that time. Office has gone from v2010 through v2013 to Office 365 (effectively v2019), and I'm always playing with new software. The machine still runs perfectly fine, with no strange problems, errors or slowdowns.

To rebuild it to the state it currently is, would take me several days of installing and configuring, for very little benefit. Imho, there is no reason to reinstall your OS every few months.


For me, Windows XP was the last version of Windows I found the need to reinstall regularly. I skipped Vista, and from 7 onwards, I never felt the need to do a reinstall.

I set up my current desktop with Windows 8.1 more than 5 years ago, upgraded to Windows 10, and still use the exact same installation to this day. It even survived a migration to a new SSD when I upgraded the PC a few years back.

The “needs regular reinstalls” meme had merit 10+ years ago. These days, not as much.


I used to update my memes and references every few months, but since Windows 10 came out I haven't needed to reinstall my memebase as often so I am still running some old memes.


Fun fact: Nowadays Windows reinstalls itself for every major feature update. But settings are copied over so you don't have to reinstall all your applications.


And since I think v1803 or earlier, you can basically do an OS X-like in-place reinstall keeping your files and apps (or just files), not to mention the actual Reset feature built into recovery that can do this without an install disk/USB. Quick, simple, and works very well unless the OS is shredded by bad GPOs or other changes most users won't be able to do themselves.

I don't think the author was up to speed on caring for their Windows install, or simply messed with it enough to break it over time and should have been prepared for the issues that brings. I've stuck to a pretty clean W10 install since mid 2015, so when I reinstall I'm up and running after Ninite and a few other installs I keep on a file share, like specific drivers or apps I don't want to download again. W10 is fairly robust unless you do things to break it like click through UAC prompts, edit Windows dir files without backups, or install unvetted software from the internet (spoiler: the same wisdom applies to OSX and Linux).


I can't speak for the experience of Windows 10 because I stopped using Windows altogether some time around 7. But I entirely sympathise with OP's habit. Maybe not as frequently, but at least once or twice per year, I reformatted my disk and reinstalled Windows, because if I didn't the whole system became sluggish and unusable. Looking back on it, I think it was also a way of exerting control over a system I knew wasn't truly ever mine.


If you want this and still want the polish of Fedora + Gnome, try SilverBlue - https://silverblue.fedoraproject.org/

Built on Project Atomic (https://www.projectatomic.io/) and OSTree (https://fedoraproject.org/wiki/Changes/WorkstationOstree)

You can achieve the exact same thing. e.g. here - https://discussion.fedoraproject.org/t/minimal-or-custom-sil...


To put it kindly, Fedora has never been a polished experience, and GNOME still hasn't fixed its memory leak that they spent a decade pretending not to notice.


The memory leak in GNOME Shell is fixed in version 3.28.4 for me. There was a memory leak on Ubuntu 14.04, but they fixed it in 14.04.2. The "a decade" claim needs some evidence.


The memory leak is the result of a poorly designed interface between compiled code and javascript. The result being that memory wasn't truly leaked in the traditional fashion but rather was consumed rapidly and freed on no sane timescale.

This mismatch between compiled code and javascript existed from inception to now. The fix if I understand it correctly is to really aggressively free memory in a fashion that appears grossly inefficient but in practice performs acceptably.

The release of gnome 3 was in spring of 2011 the "fix" was released in spring of 2018. It would be more accurate to say that it was terribly broken only for the first 7 years of its existence.

Because their "fix" is a hacky workaround for the fact that the only true fix is to throw everything away and start over the bug is actually back in 3.34

https://gitlab.gnome.org/GNOME/gnome-shell/issues/1740

This brings us to an 8th year of shipping a broken product heading for year 9.

The claim that this was a bug that sprang into being in Ubuntu 14.04 and was fixed in a point release is totally incorrect.


I've used Fedora (albeit not with Gnome) for more than 5 years at my job and didn't have major issues with it. If you're the kind of person who thinks Fedora = Gnome, then that's not true. You can have Fedora and run KDE, XFCE and whatever else you want. But I agree, Gnome sucks.


Fedora, being a testbed for RHEL, will always have the goal of testing things before they are ready and will NEVER have the goal of providing an optimal experience.


NixOS feels great, I love it. The only gripe I have with it is the inability to just run a custom binary on it. For me the explanations on how to get it to work or write a ?Nixpkg? file were not very clear. Now I'm back on Ubuntu and missing the simplicity of NixOS; everything was just so simple and light. I've used Ubuntu, Arch, Manjaro and elementary OS before, but NixOS felt the nicest.

I don't want to distrohop anymore, Arch was awesome and NixOS feels like the next step. But it doesn't really feel ready for a (maybe) less experienced user. It feels a bit like gentoo in the way that you need to re-link the libs for non-nixos packages or package a binary you just built yourself.

I think the thing I'm going to try next is PopOS with only the Nix package manager, it feels like a good middle ground between living in configs and no hassle with non-entry level documentation.


Sounds like we've made exactly the same distro-hopping journey. I went from Arch -> NixOS -> Ubuntu, and am now torn between Pop!_OS, Fedora Silverblue, or going back to NixOS. I thought Ubuntu would be nirvana - just use the defaults, and use the computer as a tool rather than endlessly tweaking it forever - but it's less "just works" than I was led to believe, and I miss knowing the exact state of my system just from a set of nix files. I'd love to know if your Pop!_OS + Nix solution works.


It's kind of a pain, but you can use buildFHSUserEnv [0] to run a custom binary. If memory serves, this also works by writing the config for buildFHSUserEnv in a .nix file, running it with nix-shell, and then running the desired binary "the old-fashioned way."

[0]: https://nixos.org/nixpkgs/manual/#sec-fhs-environments
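
If I remember the shape right, the shell.nix looks something like this (a sketch; the library list depends on what the binary links against):

    # shell.nix -- placeholder library names
    { pkgs ? import <nixpkgs> {} }:
    (pkgs.buildFHSUserEnv {
      name = "fhs-env";
      targetPkgs = pkgs: with pkgs; [ zlib openssl ];
      runScript = "bash";
    }).env

Then nix-shell drops you into an FHS-looking environment where you can run the binary "the old-fashioned way."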


Both reinstalling every X months and NixOS seem like an awfully unproductive use of time.

It's ok if that's your hobby. But it just isn't necessary. I've been running the same install of MacOS (with updates) for eight years and it's working fine.


Think of it as testing a backup.


I’d rather use a vm for that.


Basic devops management.


Reproducibility is a little less convenient, but you can ask apt to dump which packages you have explicitly installed, and then clone the machine by asking it to install those packages again.
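
Roughly something like this (a sketch; package selections don't always map cleanly across releases):

    # on the old machine
    apt-mark showmanual > my-packages.txt
    # on the new machine
    xargs -a my-packages.txt sudo apt-get install -y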


I never felt the need to reinstall OSX... EVER. I was doing it very often with Windows and sometimes with Ubuntu (haven't used it that much personally outside of work).


The "hardware as commodity" is an ideology that I've been playing with for a while. I have my personal cloud that I can access via VPN, including a Windows desktop running in a VM, personal photos and home videos sync'd to a NAS using SyncThing.

De-coupling the portable device (which, due to its portability, is much more prone to breakage or theft) from the data that it captures or has access to.

My journey to this ideology was close calls with data loss due to both hardware failure and theft. Rather than backing up to a central location, have the data "live" at the central location, then back it up off-site somewhere.

This can also have both negative and positive security implications in the event of a lost or stolen device, but those scenarios are likely not quite as bad as if you've got a whole whack of data on the device as well.

This can be done across platforms too, as far as my experience goes. It's unrelated to the author's transition to Linux / Ubuntu. My stable consists of android phones, iPads, Chromebooks (which don't really count because they embody this ideology already), laptops running both Linux and Windows. Although I'm also most of the way through transitioning away from Windows (but RDP is just so good for controlling a remote GUI).


I feel like the "hardware is a commodity" ideal seems to conflict with this paragraph from the author:

> There are some other luxuries I've lost in the transition. For instance, if I want to plug in a USB microphone, headphones, and two displays to my computer all at once, I'll have to do some configuration through the command line to make them work correctly. Even getting the mouse to move the way I intend isn't as easy as Mac and Windows make it. Doing all these things is possible; it's just more work.

Reinstalling everything sounds like a pain if you have to mess around like that to get basic hardware working.


I have a remote work environment using x2go. It's as good as or better than RDP.


What do you use for your personal cloud?


It's a mish-mash. A bunch of VMs running on ProxMox, Wireguard is the link from the outside world, QNAP NAS for document storage and running SyncThing server for phone photos and videos backup, a couple of Windows VM's we RDP into for banking and centralised documents and spreadsheets. Also PiHole running on a VM so my 'roaming' browsing is nice and clean. Also, although off-topic-ish, ZoneMinder and LMS (lightweight music server - which the author linked to in comments a week ago or so: https://github.com/epoupon/lms) VMs accessible via the VPN.

Nothing particularly "integrated" like Sandstorm or NextCloud, but I haven't had the existential need for that level of accessibility.

The QNAP NAS (like Synology) has personal cloud apps, but I don't tend to like doing things the easy way. Going down that path also locks services into a particular hardware dependency. VMs are easily backed up and restorable to ProxMox running on new / different hardware if necessary.


Not the person you are replying to, but my personal cloud is Openstack deployed via kolla-ansible with an external ceph cluster for storage.


> I used to salivate over computers. News of the next most-powerful processor or RAM upgrade was irresistible for a period of my life.

I can relate to that. Of course I'd love to have an 8-way Xeon Platinum with a terabyte of RAM (so I could use Slack) to run Emacs and my terminal sessions, but I can't really justify that. To say nothing about the noise.

Or build a Cray-2 shaped box with a beefy GPU on each of its 11 lobes immersed in Novec bubbling like a real Cray 2.

Or a Connection Machines CM-2a-like box with 16 boards hosting 8 RPi-like boards each with each core/thread lighting up one of 32 LEDs on the front. Or fewer SBCs on each if I figure out how to light up LEDs based on SIMD usage in them - 128 nodes is quite a lot.

I deeply admire the art that goes into making a Rome, a POWER9, a 28-core Xeon or an IBM z15. I designed a stack-based CPU in college and I know how hard it is to get everything in place. It's just that I have no use for any of that. Quite frankly, my Xeon tower server is almost 90% idle at any given time, its memory almost completely deserted, as is my Core i3 and my Celeron laptops.


I used to be the same way; I'd pore over data sheets and whatever else, I wanted to optimize _everything_.

Then one day, I dunno, I snapped. I think some minor tweak broke something and I just gave up. I wanted to not care about it ever again.

As it happens, I ended up buying a Mac, but I imagine you could just as easily get the same effect with Windows. The trend continued with iPhone and iPad; I don't care about specs, other than "can this hold my pics and music" or "can this run the 5 tools I need to work".

I guess I only started caring about specs again when Apple introduced broken-by-design keyboards and the ridiculous touchbar. Somehow that snapped me, again. Like suddenly someone put a tack on my comfy seat.


I like your "dream machines" - reminds me of one of mine:

8 stacked cube ThermalTake V1 aka megacube (supposedly they are designed to do this)

best mini-itx mobo in each with highest-end CPU per and max RAM; 7 as netboot, 1 as master

highest-end NVidia GPU in the master, with the remainder filled with the best NVidia TPUs or whatever instead of video cards

fans and lights as needed/wanted

Essentially to end up with one helluva nice TensorFlow box.

I just can't justify the space, power, cost, etc that it would take to build such a beast. It would really be "for show" because it honestly could be made much smaller with probably better cooling and such with a proper case.

I haven't really priced it out - just a dream that will remain so.

Now - on a different level - and something I do intend to try - is what I call my "Poor man's Jetson TX" design:

Basically take a cheap Mini-ITX motherboard, drop a decent CPU in it (something fast with 4-8 cores), put 8-16 gig of RAM in it, then take a NVidia 750 TI SC Mini and mount it horizontally over the CPU and such (use a 1U server cooler for the CPU), using one of those flexible PCIe x16 risers. I've got all the parts to try it.

It would be nice for running models for an embedded system, and costs way less than the real deal. I don't think it's as powerful, but I think it would come close. Or maybe not as efficient - not sure. But cheap - very cheap.


I had terrible tearing with my default Ubuntu 19.04 install last week, too. I'm using an Intel integrated GPU. The solution was to create the file /etc/X11/xorg.conf.d/20-intel.conf (and the directory, as it didn't already exist but does get checked by X11 if present) with the following content:

  Section "Device"
    Identifier "Intel Graphics"
    Driver "intel"

    Option "TearFree" "true"
  EndSection
No more screen tearing, in my case!


It's quite irritating that these settings aren't sane defaults that the driver falls back on, but at least it's simple to configure.


One's sanity is another's insanity. I dislike a vsync'd desktop (and it is my #1 annoyance with Windows since 8; I just put up with it), so for me the sane option is the existing default one.


Could you elaborate why?


Due to responsiveness. When i move or resize a window, the UI feels too sluggish with the window (or its edge) being several frames behind the mouse cursor (which is composited by the GPU on top of the most recent framebuffer state).

Without a compositor running (which introduces its own lag due to having to draw all toplevel windows in its own backbuffer) and v-sync, everything is up-to-date.

There is tearing, of course, but the only time it bothers me is when i watch a video - in which case i enable vsync in the video player. But other than that i prefer the responsiveness.

FWIW it is the same in games too, i always play with vsync disabled and i always notice when a first person game (where i have direct control of the camera with the mouse) is rendering behind the current state (some games use previous game state(s) to keep the GPU busy). And yeah it annoys me when it happens.

Some people suggest high refreshrate monitors, but to me that feels like a workaround that lowers the problem's impact, but doesn't make the problem go away.


Is the maximum 13ms of extra latency (assuming a modest 75Hz vsync) really that noticeable? With modern desktops there's got to be multiple frames of lag before you even hit the vsync?


Is the real solution here to use a display with adaptive sync?


Adaptive sync is not being used for composited desktop, only for apps that are fullscreen (i.e. no need to coordinate several different processes to random intervals).


Is it? It's a global configuration change in a text file that doesn't exist by default.


I mean, you're hardly patching and recompiling the kernel. But you're right that it's not very discoverable.


On a similar note, does anyone have any idea how to improve 4K performance with Intel graphics (i7-6600U)? For me it's basically unusable as everything (even loading basic webpages like this) slows down, whereas on Windows it works perfectly. For now I've downscaled to 1440p, but that isn't a great solution as now my screen is blurry. I'm using Arch, but a fresh Fedora Workstation install has the same issue.


There are similar workarounds for Nvidia and AMD cards as well. For the GeForce I used previously, I remember having to set "Force Composition Pipeline" in the nvidia-settings application. For the Radeon I'm on now, I had to do some config file magic, but I don't remember what it was anymore.
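
If anyone else is hunting for the Radeon bit, my guess is it was the amdgpu driver's own TearFree option - same pattern as the Intel snippet above (untested, from memory):

  Section "Device"
    Identifier "AMD Graphics"
    Driver "amdgpu"
    Option "TearFree" "true"
  EndSection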


I switched to using PopOS, it's completely removed the headache I previously had with Nvidia cards on Linux.


I'm on 18.04 and this totally just worked for me, too!


Thank you, this just fixed tearing on this T460p / Ubuntu 18.04.

It's both funny and sad that this is all it took. Linux on the desktop sure is configurable.


And this is why a Linux setup isn't 'objectively better'. As almost everything else, it depends on your priorities.


> I'm able to focus better. I never get unprompted system popups on my screen except during boot.

i3 helped me so much with focus. I can also move around and manage my entire workspace with my keyboard which feels more concise.

It's not really a time efficiency gain so to speak, more like a mental headroom gain that lets me be more productive. I'm not flicking between windows and spaces with alt-tab to try and find what I'm looking for. I'm just going right to what I want with a keyboard shortcut. It feels a lot more like working at a purpose built workstation rather than a consumer OS that I've stuffed software development tools into.


I tried getting into i3 but having to use the command line for things like selecting wifi, changing screens and battery saver modes is so cumbersome.


I had the same issues running i3 on top of Ubuntu, but Manjaro + i3 (https://manjaro.org/download/community/i3/) ships with a nice GUI for managing this stuff.


You can use the gnome utilities for this kind of stuff, I use gnome's network manager to select wifi and gnome-settings with i3-gnome for other tweaks. https://github.com/i3-gnome/i3-gnome

For screen management I use arandr.


The default i3 status bar actually has a tray for all of those things to run in. You just have to make sure to add the programs to your i3 config to be started.

For example, for my network widget I added "exec --no-startup-id nm-applet" to my i3 config file.
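
The whole tray setup ends up being just a few exec lines in the i3 config (the applet names here are examples; use whatever you have installed):

    exec --no-startup-id nm-applet
    exec --no-startup-id blueman-applet
    exec --no-startup-id pasystray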


There are some taskbar utilities you can install but they are far, far inferior and more buggy compared to what you get with most popular window managers.

I wish someone would just give me GNOME (whichever version or fork) with all the standard utilities, as well as settings for enabling tiling behaviour and shortcut keys.


For others who are curious about tiling window managers, please also consider Awesome. It has a Lua backend and is GPL vs i3's BSD (for those of us who care about that sort of thing).


Also consider Regolith Linux. It's just Ubuntu + pre-configured tiling, and it works perfectly. Excluding the normal amount of Ubuntu install time, I was able to learn it and be tiling in <10 minutes.

By far my favorite desktop environment I've ever had. Super fast, super clean, no distractions.


I tried it on both the distros I've used lately (Fedora & Clear Linux). In both cases it popped up without any scaling on my hdpi laptop screen. Not being able to even see enough to be able to research what to do about this, I gave up and went back to Gnome. Same with i3.


If you run i3 inside a gnome-session [1], you get gnome-settings-daemon running, which will set your dpi according to the preferences set in gnome.

[1] https://news.ycombinator.com/item?id=21332778


I have no doubt it can be done, but that route pursued for every little issue leads to a nightmare of computer-fiddling that is not how I personally wish to spend my life. For OS hobbyists only.


>For instance, if I want to plug in a USB microphone, headphones, and two displays to my computer all at once, I'll have to do some configuration through the command line to make them work correctly.

tbh this kind of thing is why I keep leaving linuxes/unixes/etcs. the amount of background knowledge you need to have to do this without spending hours each time is immense, and I didn't grow up with it so I don't have it.


This is something I do regularly and I have no problems on Ubuntu 18.04. What's the matter exactly?


I was wondering the same thing. I've been on Ubuntu for 2 years now and I've never run into an issue here.

EDIT: I see it now. He's running Ubuntu SERVER with i3 window manager installed and not the actual desktop distro. Just a guess, but that's the root of the issue.


Yeah it beats me why people do this and then complain about Linux as a desktop.


I feel like it's an open problem at this point. I think Sway is the nicest experience (given that it's very similar to i3, while being more user-friendly and not hauling all the Xorg cruft), especially with multiple monitors and stuff like input configuration all centralized.

An idea: Provide a GUI (or web-based with local server, like Fish's configuration) application that lets you make changes like Wi-Fi, sound, displays, time and date, basic window manager configuration and all the other things that you'd need to go into a config file to fix. This could be paired with Sway or another tiling window manager and made into a distro for people currently using DEs to inch their way into a tiling WM.


Let's move all that into systemd services and make the DE just a DE that talks dbus to them :)


I'm running Debian 10 with KDE and use a microphone through a mostly-unplugged USB interface (it's plugged into the Wintendo by default), headphones and two displays. None of that was configured from the command line.


Yeah, that's the kind of asinine problem which I just don't have the bandwidth to deal with any more. My computer is owned by Apple and I don't know what it's doing and I can't run a proper Docker, but at least when I connect my headphones it just works.


He points out it's more work, and it surely is. I've shifted in the last 18 months from MacOS to Windows 10, to Linux (Fedora), back to Windows 10, and now back again to Linux (Clear). The Windows/Linux oscillation demonstrates that I'm not really happy with either. I'm staying with Linux for now out of stubbornness. But I wish there were better options, and am saddened that the 2019 desktop OS scene is as dire as it is.


> He points out it's more work, and it surely is.

Well, technically you are using an oddball distro initially created for headless embedded systems. He is using a server distro with an i3 DE shoehorned into it.


True on CL (which I wouldn't quite recommend for normal use though its performance is really tempting for developers). But I found the same with Fedora earlier this year. And Ubuntu in the more distant past. It's more-or-less worth it to me, though I do vacillate. The available OS options are all rubbish, each in their distinct way.


Opensuse has kde, which is similar to Windows. It also handles stuff like this very well, try it.


To people focusing on the specific instance instead of the "I need to have tons of knowledge to fix what Windows/OSX don't have problems with" side of it all:

Every single time I've tried (and I try yearly or so), I've had to resort to the command line within one hour to fix something. Usually file permissions, though I haven't touched anything but the built-in software and updates at that point, so it's extra ridiculous.

Most recently: Elementary OS has broadly been a great experience. But I had to choose a touchscreen-driver kernel, and spend about 2 hours figuring out how to get an on-screen keyboard to work on the lockscreen. Both of those are ridiculous.

THAT is what keeps driving me away. And it's WORSE that it's not the same problem each time because knowing how to fix N doesn't help me fix N+1.


Yup, sounds like I haven't missed a thing in ~17 years away from desktop Linux.


Try regular desktop Ubuntu then. It just works for that use case.


I will say, using a more full featured desktop environment can help with the transition (KDE or Gnome). The issues you had with usb devices requiring you to drop into a shell go away. Things like external displays just work, removing the awkward xrandr fiddling before a presentation.


For the screen tearing, I noticed you're using i3 which doesn't include a compositor. If you install Compton, that will fix the tearing issue. :)


With nvidia's driver and multiple displays I still get tearing with an active compositor. Took me a long time to discover the "Force Composition Pipeline" and "Force Full Composition Pipeline" options in nvidia-settings which fix it.
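
For reference, the same setting can also be applied from a script, something like this for a single display (the MetaMode string gets hairier with multiple monitors, so treat it as a sketch):

    nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"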


> Plus, open source has come a long way in terms of stability and reliability over the past 5-10 years

I'd say the same about Windows ;-)

No. Seriously. My Linux environments have been rock solid since the early 2000's.


Using Ubuntu since 2012 on i5 with 8GB RAM. Eclipse for Java Dev work. Thunderbird for Mail. Firefox my favourite browser. Skype for Call.

Never regretted anything.

The thing I like about this setup is that it does not get in the way of getting things done.


Gaming on Linux is fine these days. Surely not a blocker for switching for the most part (except for some edge cases like some multiplayer games that depend on some weird anti-cheats).


My experience with my setup has been quite different. I'm not that experienced, but with my Ubuntu and i3 setup with the standard drivers from Nvidia I couldn't get PlayOnLinux or Wine running. I can't exactly tell what the problem was in the end, but it definitely is not as easy as on Windows or even Mac imo.


I surely won't recommend Nvidia for anything Linux these days. Get a decent AMD card. That said, Nvidia should work with Wine too. Most problems with Wine are caused by missing dependencies, like 32-bit libraries that are often needed for 32-bit games naturally.


Nvidia with the binary-blob drivers are fine. Not great, but fine. Some updates broke things and required work but got em fixed.

That said, looking at AMD for the next (linux) build, both in CPU and graphics card.


I wouldn't call them fine if you consider general progress of the Linux desktop. Their integration with the whole stack is simply broken, due to Nvidia refusing to upstream their drivers. It's the reason they for years couldn't support PRIME (Optimus) and Wayland use cases, and despite their very slow efforts to address that, a lot of it is still broken for their blob (like XWayland use case).

Basically, if you care about the progress of the Linux desktop, don't use Nvidia, since it's only holding it back.


Yes, that's what I've learned along the way, but back when I bought my Thinkpad I didn't think about stuff like this, as I also initially ran Windows.


> I surely won't recommend Nvidia for anything Linux these days

Probably not an option for serious compute work that relies on CUDA.

EDIT: Compute, not computer.


Didn't AMD plan to provide a CUDA shim that works with their ROCm? Besides, you shouldn't be using CUDA lock-in to begin with.


> Didn't AMD plan to provide CUDA shim, that works for their ROCm?

Can't find any new developments on that

> Besides, you shouldn't be using CUDA lock-in to begin with.

For some teams, not choosing CUDA is a luxury they don't have.


For Steam games, enable Proton for all games. For all others, try Lutris.

Proton is a very good default option because Valve worked to make a lot of games playable with it.

Lutris has specific profiles for specific game/software releases, so it can install the required Wine version and winetricks, and sometimes even patches.


Steam with Proton did work for some games, but not with i3, only with the default Ubuntu window manager. I'm not familiar enough with this stuff to really tell where the problems are, but I'll try Lutris, thanks!


Nvidia continues to ruin the experience for people.


Yes, definitely should have avoided them in the first place


I still cannot get the nVidia driver to offer a subset of the native resolution. E.g. my display is 2880x1800, but I wish to game in 1440x900 or similar. Not a problem for other OSes, but the nVidia Linux driver doesn't allow this.


Idk about nVidia since they insist on being a special snowflake in the Linux graphics ecosystem, but for everything else you can add arbitrary display modes with xrandr and then either let your display do the upscaling (might not support all resolutions) or configure your GPU to do it.
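
The usual recipe looks something like this (a sketch: DP-1 is an example output name, and the Modeline numbers should be whatever cvt prints on your system):

    cvt 1440 900 60            # prints a Modeline; copy its values into --newmode
    xrandr --newmode "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
    xrandr --addmode DP-1 "1440x900_60.00"
    xrandr --output DP-1 --mode "1440x900_60.00"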

Most displays also just advertise a number of smaller resolutions - does yours not do that?


I can set the desktop scaled resolution, but fullscreen games always revert to 2880x1800, most are unplayable.


Screen tearing is usually a compositor issue. I still find it terrible that this is not dealt with well by default in Ubuntu in 2019.


Something I've appreciated after replacing Windows with Ubuntu on most of my machines is the ability to take its SSD out and put it in another computer. Really handy for travelling.

While I could get a small laptop for portability, it's pretty hard to match the price/performance of a desktop. I also like how I can put together a desktop that is really quiet (the PC has an NH-D15) even while under high load.


The problems you mention that you haven't solved are solvable. Are you looking for advice? I admit, I don't quite understand the reason for the opening post.


The things I would miss most leaving Mac...

1. Uploading my Signature in Preview and being able to sign and edit PDF documents on any of my Apple Devices.

2. Screen rendering quality, on Mac everything seems to be sharper and more crisp than Linux on the same device.

3. Keynote. I so want a cross-platform replacement that is nearly as good.

4. iMessage integration with phone. I can send sms/iMessages from my Mac.

But I am planning to switch anyways. As all these companies move to a "service model", I feel like I am only leasing my software, and tools that make the hardware work. Furthermore, it feels impossible to get under the hood anymore.

Would love any advice or recommendations for the above.


For 2: that's funny, for me Mac screens look like a blurry mess while Linux is the sharpest thing around with infinality patches.

For 4: there is KDE Connect (you don't need to run KDE to have it). Works only with Android though, afaik.


Also messages.google.com with Android. But with the obvious caveat that you're giving extra data to Google.


Are infinality patches still being updated?

Those were about the only reason I could tolerate a standard Centos or RHEL desktop, but I'd heard the maintainer 'disappeared' and nobody has picked them back up.


For some time there wasn't, but now someone is on it again:

    aur/cairo-infinality 1.17.2+11+gdfe3aa6d8-1 (38) (0,00)
    aur/cairo-infinality-remix 1.17.2+17+g52a7c79fd-1 (6) (1,33)
    aur/fontconfig-infinality 2.13.1+12+g5f5ec56-1 (310) (0,00)
    aur/fontconfig-infinality-remix 2.13.1-2 (6) (1,33)
    aur/fontconfig-infinality-ultimate 2.13.1-1 [installed] (21) (0,00)
    aur/fonts-meta-base 1-2 [installed] (90) (0,09)
    aur/fonts-meta-extended-lt 3-1 [installed] (91) (0,09)
    aur/freetype2-demos-infinality 2.10.0-4 (470) (0,01)
    aur/freetype2-demos-infinality-remix 2.10.1-1 (7) (1,44)
    aur/freetype2-docs-infinality 2.10.0-4 (470) (0,01)
    aur/freetype2-infinality 2.10.0-4 (470) (0,01)
    aur/freetype2-infinality-remix 2.10.1-1 (7) (1,44)
    aur/freetype2-ultimate5 2.10.1-1 [installed] (12) (1,00)
    aur/gimp-font-rendering-fix 1-1 (5) (0,00)
    aur/grip-git 20120917-1 (37) (0,00)
    aur/jdk7-openjdk-infinality 7.u171_2.6.13-1 (27) (0,00)
    aur/jdk8-openjdk-infinality 8.u172-3 (33) (0,00)
    aur/jre7-openjdk-headless-infinality 7.u171_2.6.13-1 (27) (0,00)
    aur/jre7-openjdk-infinality 7.u171_2.6.13-1 (27) (0,00)
    aur/jre8-openjdk-headless-infinality 8.u172-3 (33) (0,00)
    aur/jre8-openjdk-infinality 8.u172-3 (33) (0,00)
    aur/lib32-fontconfig-infinality 2.13.1+12+g5f5ec56-1 (0) (0,00)
    aur/lib32-fontconfig-infinality-ultimate 2.13.1-1 (3) (0,00)
    aur/lib32-freetype2-infinality-ultimate 2.9.1-2 (Out of Date) (9) (0,04)
    aur/ttf-dejavu-ib 2.37-2 (0) (0,00)


Preview is amazing. I’ve never used another PDF reader that didn’t make me sad every time I had to use it. It’s nice to like PDFs. Signing and moving pages between or to new PDFs are my most-used features.

And it’s the best basic image viewer / light editor I’ve used.


For 1, Xournal can open PDFs, insert text, images, and annotations, and export back to PDF.


LibreOffice Draw is amazing at editing PDF documents as well. It feels like something that should be advertised more, though I can imagine why they don't.


3 - LibreOffice Impress.

For Numbers, I like Gnumeric over LibreOffice Calc, but nothing really holds a candle to Excel, unfortunately.


Calc actually works much better than Excel for spreadsheets with odd character sets.


Will check this out. It's been ages since I last tried LibreOffice. Fingers crossed.


I also tried using Ubuntu with i3 for several months. I went back to an iMac, for two reasons:

1) a 5K 27" screen means I can have 3 or sometimes even 4 columns of code on screen in my Emacs at the same time, and there is no way to get 5K with a PC

2) the totally broken clipboard in Linux causes constant annoyance: I want reliable, predictable copy/paste for text that works the same in every application, with the same keystrokes, and that does not auto-copy on selection (this is important, because I often want to paste OVER a selection)

All the other complaints I had were minor: I was able to somehow get the hardware to work, I did not connect external monitors (which has always been a disaster in Linux), stuff mostly worked. But these two were showstoppers.


I don't understand your clipboard remark. Ctrl+C and Ctrl+V works everywhere.

Emacs by default uses a different copy-yank shortcut system, but that too has worked perfectly well with the clipboard out of the box for some years now.

In order to have the clip stay alive when the sender closes, your desktop environment has to do something I think, but that too works out of the box in GNOME, I just tested it. If that was your problem, you were perhaps just missing something in your i3 setup? I think it is to be understood that choosing a more fiddly tool means having to spend more time fiddling with it. A default install of the big distros is not fiddly.
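
If that's what was missing, the i3 fix is usually just autostarting a clipboard manager - clipit is only an example here, any manager that takes over the selection should do:

    # ~/.config/i3/config
    # keep clipboard contents alive after the source window closes
    exec --no-startup-id clipit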

In addition to that, you also have the convenient middle-button instant copy-paste thing.


CTRL C / V works in the terminal?


By default, Ctrl-C is already taken for something else in the terminal (it sends SIGINT), so the default copy/paste shortcut is Shift-Ctrl-C/V. But yeah, sure, you can open the settings and change it to plain Ctrl-C/V if you want.


> and there is no way to get 5K with a PC

I am not sure I understand what you mean? You can even run 8k screens, though gaming at that resolution is quite demanding on your GPU. What is the problem with 5k?

2) That is a setting in your clipboard manager, e.g. klipper. "Same in every application": You have three bindings to choose from (CUA, emacs, vim) and are free to pick just one.


> 1) a 5K 27" screen means I can have 3 or sometimes even 4 columns of code on screen in my Emacs at the same time, and there is no way to get 5K with a PC

I assume that you are using the native 5k resolution instead of the default 1440p resolution?

In any case, yes, it is sad that there are still no commodity 5K monitors; I want integer scaling for 1440p.


Iiyama sells a 27" 5K monitor that works over a single DP 1.4 link (unlike the previous models from Dell, HP and Philips that used two DP 1.2 links).

While pretty niche, it does exist.


Yeah, it's pretty hard to find, even in the US.


I am not sure I understand what you mean by "there is no way to get 5K with a PC" - there are 5K displays you can buy. There are also ultrawide displays.


> 2) the totally broken clipboard in Linux causes constant annoyance: I want reliable, predictable copy/paste for text that works the same in every application

This is probably my number 1 rage point with Microsoft Office products. No matter what I do, it persists in trying to do some kind of fucked rich-text copy/paste that mangles whatever I paste it into. It's so bad that I've taken to pasting into Notepad first and copying from there, to try to ensure that I only get plain text.


Shift + Ctrl + V strips the formatting.


Note that this is not limited to Office. It works pretty much everywhere on Linux and Windows.


Tangentially related, but I hate the by-default-enabled middle mouse paste in Firefox. I use autoscroll (hold middle mouse) for browsing, and it keeps sending a paste to the site. Why is it enabled by default??

At best it's a mild inconvenience; at worst it leaks company data to outside websites that bother to capture it.


Because this is how copy-paste always worked on X11.

I'm used to it, so I hate apps that don't respect it. The option to disable it is an app-specific option to override the environment's default behaviour.
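
In Firefox's case, if I remember the pref name correctly, that app-specific switch is a single about:config entry, which you can also pin in a user.js:

    // user.js - disable middle-click paste in Firefox only, leaving the X11 selection itself alone
    user_pref("middlemouse.paste", false);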


Is there a way to globally disable middle mouse paste on X11? I don't really use it anywhere, and when I do, it's mostly by accident.


It's enabled because it's a very old standard, just like Ctrl-C + Ctrl-V.

If you were using Ctrl-V to scroll down, I guess you'd have the same kind of problem and could level the same complaint.

Tangentially related, but applying constant pressure to a mouse button to scroll sounds like it could cause stress problems for your fingers long term.


I use quick flicks, so the button isn't held down for long in total. It also requires a pretty minimal amount of force to press. The concern is appreciated, though.


Why doesn't middle mouse paste work on Windows? Why is it disabled by default??

Why does the middle mouse button scroll in Firefox instead of pasting?

It's a major inconvenience which makes the default system unusable for me.


Middle mouse doesn't scroll by default as far as I know; autoscroll has to be enabled in the settings. The functionalities do seem to overlap when scrolling is enabled, which isn't expected.

Auto-pasting clipboard contents has security implications, whilst not having it is only an inconvenience.


I am addicted to the keyboard shortcuts and touchpad in OSX. I would love to know if there is a Linux OS that can replicate the keyboard experience of OSX. I can live with a subpar touchpad experience but not the keyboard.


Keyboard shortcuts are completely configurable in many Linux desktop environments. I use XFCE, where I simply open up Keyboard settings to edit them.

On my laptop that has a Synaptics touchpad that supports advanced gestures in Windows (with 2, 3 and 4 fingers), I was also able to get them to work under Linux with very little trouble and with complete configurability - I can assign any gesture to any action.
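
As one example of how that kind of mapping can look (your tool of choice may differ - libinput-gestures is just a common option, and the xdotool commands below are placeholders): it maps a finger count plus direction to an arbitrary command.

    # ~/.config/libinput-gestures.conf
    # map finger count + swipe direction to an arbitrary command
    gesture swipe left 3 xdotool key super+Right
    gesture swipe right 3 xdotool key super+Left
    gesture swipe up 4 xdotool key super+d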


>>Maybe I'm spoiled with Linux, but I like that I can install the OS without worrying about a license. Windows does this, and I don't know if it's the financial burden I don't like, the mental overhead of keeping track of where my licenses are, or perhaps even just the idea that I don't really have full control of my software.

I don't recall the last time I had to worry about licensing. The licence gets bound to the particular hardware ID and it activates on its own; you don't even have to type in the licence code anywhere during installation. A very weird argument to use against Windows nowadays.


I also highly recommend Fedora. The polish is far higher than Ubuntu's.


And just using a Gnome shell with almost no customization AND NOT WORRYING AT ALL ABOUT THAT is a huge productivity boost.

If we hadn't wasted so many hours writing the perfect WM theme, selecting the perfect colors, and making the perfect icons and backgrounds, we'd be long past the Singularity and enjoying our superintelligent agents taking care of our every need.


I don't see any problem with re-installing Windows as often as you like, though.

By being a little disciplined, you can have a fresh install of Windows, configured with all the apps you need, from go to whoa in a little over a couple of hours.

I like having a scriptable install of Windows in case I completely lose a system while travelling; it's easy for me to completely restore everything and be productive again within a few hours (just buy a new laptop).

Choco really helps here too.

  choco feature enable -n allowGlobalConfirmation
  choco install googlechrome
  choco install firefox 
  choco install 7zip
  choco install vlc 
  choco install git
  choco install paint.net 
  choco install sql-server-management-studio
  choco install vscode 
  choco install visualstudio2019community
gets me to 90% of where I need to go.


Boxstarter makes it even easier, as you can just hit Win+R and type https://boxstarter.org/package/nr/{packages from choco separated by commas}. It bootstraps Chocolatey and installs everything, and I can be up and running on a new system in 10 minutes.

https://boxstarter.org/


> in little over a couple hours

That seems a couple of hours too long to me. Copying over from a backup takes a bit of time, but not hours.


I can get 2/3 of those with https://ninite.com/7zip-chrome-firefox-paint.net-vlc-vscode/ and when I want to update any of them I just double-click the same installer file.


you know you can do

   choco install googlechrome firefox 7zip vlc git ...

?


Related: How do people manage their dotfiles? I have mine in a git repository, but then I have a hand-maintained bash script to make symlinks from the right paths to those files in the repo.
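
The script is nothing fancy - roughly this shape, assuming the repo mirrors $HOME (e.g. ~/dotfiles/.vimrc, ~/dotfiles/.gitconfig):

    #!/usr/bin/env bash
    # link every top-level dotfile in the repo into $HOME
    repo="$HOME/dotfiles"
    for src in "$repo"/.[!.]*; do
        name="$(basename "$src")"
        if [ "$name" != ".git" ]; then
            # -n replaces an existing symlink instead of descending into it
            ln -sfn "$src" "$HOME/$name"
        fi
    done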


Similar to you I keep mine in a git repository, but I use dotbot[1] to manage the symlinks to the proper paths.

[1] https://github.com/anishathalye/dotbot


To make that easier, I use stow.
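
Roughly like this, assuming one directory per "package" inside the dotfiles repo (e.g. ~/dotfiles/vim/.vimrc, ~/dotfiles/git/.gitconfig):

    cd ~/dotfiles
    stow vim git     # symlinks the contents of each package directory into $HOME
    stow -D vim      # -D removes the symlinks again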


yadm. It's a very thin wrapper around git with a couple of quality-of-life improvements (like being able to have encrypted files, plus the ability to execute a bootstrap script on the first pull of your dotfiles).


git + gnu stow


I liked the last point in particular. It's one of the reasons I don't keep dotfiles around and why even my vim configuration is minimal.

I'd rather get used to the defaults, in case I need to use another computer in a pinch, than get used to a customized system and have to google how to do every little thing when I'm on another one. I picked up that habit when I had to work on remote servers nearly every day.

That's probably a bit zealous of me, but I've gotten used to it.


I switched from Windows 10 to Debian 10 because of Docker's incompatibility with Windows 10. What a pain it is to install that stuff. It's back to 1990: let's install everything from the command line. It took me over a day just to get wireless working on an Acer laptop. When that was finally working, I had to spend a day deleting stuff because otherwise it wouldn't update. Sudo became my friend. It's an insane experiment for 2019.


It was maybe three or four hours after switching from Windows to Linux that I began being the biggest Linux snob in the entire world.


Is 2019 the year of the Linux desktop?


No, but 2020 is!

Meet The Linux Desktop That Will Embarrass Windows 10 And macOS in 2020 [0]

[0] https://www.forbes.com/sites/jasonevangelho/2019/10/18/meet-...


More re-arranging of the deck chairs will not make it the year of the Linux Desktop. Linux Desktop evangelists have been getting this wrong for 2 decades now.


It is for me. Sold my MacBook and switched to Linux. Just need to settle on a flavour.


Microsoft seems to think so.


When you can get Office on a Linux desktop I'll believe that.


Doesn't everyone use the online version of Microsoft Office now? For non-professional use, anyway.


It was a lighthearted remark, but I was primarily thinking of WSL. Let's face it: tiling window manager enthusiasts aside, relatively few people use Linux because of the glories of its GUI. WSL gives most people all the "desktop Linux" they are likely to want.


The KDE experience is far better than Windows 10's.


Gnome Shell is far better than Windows 10 for me.

From time to time I end up needing to figure out why someone's Windows laptop won't connect to my home wifi (where Linux, Macs, BSDs, Blackberrys and everything else connect flawlessly), and I have to dig through the vendor-specific control panels that vary from vendor to vendor, from Windows version to Windows version, from OEM to OEM...


Windows would become a very serious contender to macOS if it became "unixy" like it.


It's telling of HN's demographic bias that you'd think Windows "would become a very serious contender to macOS".

There are far more people using Windows than macOS, even among developers, even if you only look at the US.


I'm pretty sure the GP meant that Windows would become much more attractive to current Mac users, and I agree. A lot of serious computer work is done on Macs, because the Mac client has so much in common with the Linux server. If MS put a Windows shell on top of a Linux kernel, Windows would suddenly become a better complement to Linux servers than the Mac ever was, while still enabling work tools such as Photoshop to still be used (unlike, say, an Ubuntu client). That would make Windows a serious contender for the market of people doing production work on Macs.


You might want to pay more attention to WSL2 then.

I'm a web developer who switched from Ubuntu to Windows earlier this year and WSL has very much been a "best of both worlds" solution. Continuous improvements to VS Code and Windows Terminal have made this transition very painless and WSL2 promises to eliminate most of my remaining pet peeves (e.g. improving Docker integration).

The only downside is that the filesystem is much slower but that's only a problem when compiling native software.


With WSL2, the filesystem is faster than 'native' Windows (as long as you stay within the ext4-formatted image), which obviates one of the big Windows irritants for developers.

The one downside of WSL from my perspective is all the small impedance mismatches, wslpath hacks, etc. that result from having to work with the two filesystems. And though VS Code's remote extension makes working on WSL entirely seamless, the situation is messier with other tools. IntelliJ, for example, has some WSL support in individual plugins, but it's patchy enough that I found running the Linux version via an X server more convenient.
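
For anyone who hasn't hit them, the wslpath hacks are mostly just translating paths at the boundary between the two worlds (the example outputs here are illustrative):

    # run inside WSL
    wslpath -w ~/project            # Linux path  -> Windows path, e.g. \\wsl$\Ubuntu\home\me\project
    wslpath -u 'C:\Users\me\src'    # Windows path -> Linux path,  e.g. /mnt/c/Users/me/src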


Yeah, you elaborated on what I was trying to get at. I own both a MacBook Pro and a PC desktop. If Windows were as easy to develop on as macOS, I would move my main work there.

There are developments there that I appreciate, like Windows support for Homebrew.


In these days when 32 gigs is the norm for developer laptops, I'd assume anything that can run a VM full screen is good enough.


What does HN think of storing a backup of your pics on Google Photos?

It's free and has unlimited storage for typical-quality photos. Is there any downside to this besides privacy concerns?


It might be a good idea to try a Linux distro.



