From my experience using Windows since elementary school: it offers lots of games, and because of its dominant market position people tend to use Windows; banks, schools, and software are usually dominated by Windows, whether through behind-the-scenes deals or not. Linux offers fewer games and less software for work, which is why it has fewer users than Windows.
But the downside of having tons of games is distraction, and there are viruses all over Windows. It is paid, of course, and upgrades sometimes leave users at a loss: “I pay, and then after I upgrade it gets slower.”
Linux, on the other hand, was developed by Linus Torvalds together with tons of other programmers, because UNIX was commercial/paid and they were students with no money, so they worked together to create an OS that is open source. If you are interested, you can create your own Linux OS by following the Linux From Scratch book.
With Linux you can learn more, in the same way that learning C++ is more rewarding than learning Python. And I say nothing about learning Julia!
If you learn Linux more deeply, you will realize you can create a better OS than macOS, which is itself still a UNIX derivative.
Think of it like this: Windows is an expensive car like a Lexus, but the maintenance cost is high; when something goes wrong you have no idea how to fix it, and it needs more money. Linux is your own sports car that you built from scratch, like a Cadillac or Ferrari, which you understand from the engine to the wheels. Whether it is Linux From Scratch or an already popular distro like Ubuntu, Kali Linux, Fedora, openSUSE, Debian, CAELinux, Arch Linux, or EndeavourOS, once you learn one Linux you will comprehend the others easily.
You think hackers use Windows?
Again: if you want to build your own rocket, you learn Linux; if you have money and want to buy a commercial rocket, you use Windows. But if you get lost or anything happens in space, then you are doomed, having no knowledge to tweak the rocket/spaceship to battle aliens and avoid the asteroids out there.
For me the biggest difference between Linux and Windows or macOS is: “How much do you want to tell the OS what to do, versus the OS telling you what to do?”
Linux-based OSes have tremendous flexibility. You can do everything from routing and inspecting packets on an embedded OpenWrt router, to desktop machines, to supercomputer clusters… but you have to know what you are doing to make each thing work effectively.
Windows and especially macOS have more of a “do it our way, and then it just works” philosophy. So as long as you can shoehorn your needs into the standard set of things, you will get farther with less input required.
That being said, there are Linux distros designed for more of a “common everyday desktop users” experience. So, mileage varies a lot.
So I might have a slightly different perspective: I work in a Psychology Dept., and our lab does human neuroscience (mostly software written in Matlab and Python). As the most tech-savvy person, I usually fill the tech-support role.
So technologically Linux has many advantages, but if a person using it has limited tech skills and still has a computational job to do, it is a nightmare. Software does not install easily (e.g., permissions on one folder have changed and it can’t be removed; installing does not work, uninstalling does not work), drivers break, and symlinks to critical files sometimes are simply not created.
I can handle 90% of those weird issues by googling and troubleshooting them one by one. My lab mates get lost in the file system (I also get lost from time to time) and get scared of hundred-line errors in the terminal. Windows also breaks, but being GUI-oriented makes it much more approachable for them and easier to fix issues on their own.
I personally prefer Windows, mostly because I am used to the architecture and heavily use software that is not on Linux (like the Office suite). Also games. I find it a good balance between taking care of many things while still leaving room for manual tweaking in the terminal, if necessary.
One important (from our point of view) technical difference, mentioned briefly somewhere above: Linux is much more precise with time. If I remember correctly, the default Windows timer tick is only about 64 Hz (roughly 15.6 ms, though applications can raise it to about 1 kHz), while Linux kernels commonly tick at 250 or 1000 Hz and offer high-resolution timers beyond that, so for many applications the Windows default is simply too slow.
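If you want to see what your own platform’s clocks actually report, Python’s standard library exposes this directly. A minimal sketch (the reported resolutions vary by OS, kernel, and Python build, so treat the numbers as indicative):

```python
import time

# Ask Python what it knows about the platform's clocks.
# On a typical Linux box the monotonic clock reports nanosecond
# resolution; on Windows the legacy tick-based clocks are far coarser.
for name in ("monotonic", "perf_counter", "time"):
    info = time.get_clock_info(name)
    print(f"{name}: resolution={info.resolution}, "
          f"implementation={info.implementation}")
```

Comparing the output of this on a Linux and a Windows machine makes the difference in timer granularity concrete.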
Well, most of this will not happen if you use proper packages, like .deb packages from Ubuntu or Debian. It will also not happen if you use Julia. That drivers break: yes, this can indeed happen, in particular with proprietary drivers like those from Nvidia, so you have to be careful and google a solution if it happens. On the other hand, on Windows many drivers are missing that are already included in the Linux kernel, like USB-serial drivers; in that case the situation for Linux users is easier…
In any case, if someone is moving from Windows and trying Linux, use one of the user-friendly distros. Mint, for example, is very easy; there is not much that can go wrong with it, including the use of proprietary drivers. The interface is plain, boring, and natural, à la Windows 95 but more refined, which is quite a virtue.
Whenever my Windows computers are slower than I would expect, you can be certain that the antimalware software is a primary reason (on a work PC even more so, as one has less control over it).
That software must be among the top energy consumers. It’s like a giant heating device spread across the world.
Wow, what a thread I made about this theme, hehe. I agree with mkoculak and Freya_the_Goddess. Especially, about games.
I found a quote comparing Windows and Linux: “Linux and Windows differ in cost, versatility, and stability, with each looking to improve in its perceived weak areas. The major areas of perceived weakness usually cited have included the poor ‘out-of-box’ usability of the Linux desktop for the mass market, and Windows’ vulnerability to malware. Both are areas of fast development on both sides. Linux is not a complete operating system in the way the Windows operating environment is: Linux doesn’t have a built-in GUI. Users are free to choose among various graphical user interfaces for Linux, each providing a different look and feel, whereas Windows offers one much easier, unified graphical interface, with no alternatives to choose from… Both Windows and Linux allow hardware to communicate with the software in a similar way, but Linux has some compatibility problems which can cause trouble when trying to install hardware drivers.”
The usability of Linux for the average non-developer user has gotten a lot better in the past 10 years or so. It used to be that you unavoidably had to run some shell commands or edit config files by hand to get basic things done in Linux (and we tend to underestimate how hard such things are for non-programmers to do precisely and without screwing something else up). Nowadays, when things are working properly, the usability and discoverability of the GUI are much better. You can get almost everything done from the GUI itself. The only problem is when something goes wrong; then you might have to dig deep and end up in a rabbit hole.
That’s sort of true for Windows as well, to be fair (having to edit registry entries or group policies, for example), but the interfaces tend to be somewhat nicer and more unified, and the instructions online are easier to discover, both because of popularity and because there’s not as much variety in Windows as there is among Linuxes.
The device compatibility situation is also significantly better nowadays. It used to be that it was just expected that one or two components of your computer wouldn’t work with Linux, sometimes with hacky workarounds and sometimes none. Nowadays things mostly just work, and there’s also a lot of info available online about which hardware is compatible with Linux and to what extent.
This one’s sort of a theoretical point. It’s true that the Linux kernel isn’t necessarily coupled with any particular GUI, but in practice any user is going to download and install a specific distro, and most likely one of the popular ones at that. Those come with a particular desktop environment - usually GNOME/its derivatives, or KDE - and most users are gonna stick with that default.
I understand now that, nowadays, these OSes are much better than they were before. And based on your answer, and the answers above, even if they are different, they are all made for the convenience of users, and yet in some ways they are similar…
Actually, Linux does have a GUI built in. “Linux” is ambiguous: it can mean a Linux OS (as in a Linux distro), and those most commonly have a GUI. It can also mean just the Linux kernel (what Linus Torvalds made and is responsible for), which originally didn’t have any graphics, but actually does by now.
If you’re simply choosing between Windows and Linux, then, while I don’t like promoting Windows, it actually does ship a Linux kernel too (in addition to its own, via WSL 2), and you can add a full Linux distro and get full Linux GUI apps there too:
Linux has generic framebuffer support since 2.1.109 kernel.[9]
That’s what I mean by the Linux kernel having graphics; maybe it’s a stretch to call it a GUI, and it’s not much used. For most people, “GUI” would imply mouse support and windows on the screen. Historically the Linux kernel didn’t have any mouse support (only keyboard and text), but by now it has USB support (USB didn’t exist back then), and USB supports mice and more.
[A Linux kernel alone is never useful, as you must boot into something more, e.g. a shell, and the days of the shell being text-based, or of that being the only option provided, are pretty much gone.]
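If you’re curious whether the kernel on the machine in front of you has an active framebuffer, here is a quick Linux-only check from Python. The paths `/dev/fb0` and `/proc/fb` are where the generic framebuffer shows up, but neither is guaranteed to exist (headless servers, or non-Linux systems, won’t have them):

```python
import os

def framebuffer_present():
    # On Linux, the kernel's generic framebuffer (if compiled in and
    # active) appears as /dev/fb0 and is listed in /proc/fb.
    # Neither path exists on non-Linux systems or headless setups.
    return os.path.exists("/dev/fb0") or os.path.exists("/proc/fb")

print("framebuffer device present:", framebuffer_present())
```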
I researched some info (for my personal experience, not for an assignment or something) about the Windows and Linux operating systems. I’m comparing these operating systems to learn what the real differences between them are, the minuses and pluses of both.
I think that if you are really interested in a sober answer to this question, as it applies to you, you need to narrow it down and define your particular use case.
If you are “just” interested in people’s feelings about this topic, regardless of their use-cases, and don’t actually want to come to any conclusion, then, well - this is the right way to go about it…
I have gotten to know a zillion different people and their setups in easily 50 organizations, with hardware ranging from handhelds to (IBM-) z/OS-mainframes, with mirroring and multiple instant redundancies over large geographical distances, to be robust against nuclear strikes or natural catastrophes.
Each of these settings has its own requirements, metrics, and criteria for even comparing options.
So what are you interested in running Linux vs. Windows on?
o 1-user-PC
o shared-PC (multi-user)
o server
What do you want to be doing on that piece of hardware?
o play games
o develop software, locally
o develop software, remotely
o what kind of software is being developed (what is the software’s target OS)?
o use applications (which ones: just general office-stuff, audio-, video-editing, or something very specific)?
What’s the “social” context of your use-case / do you have any external dependencies / requirements of interoperability, that are not met by platform-independent standards?
…
There are likely use cases, defined by single requirements in the 3 categories I outlined, which already don’t leave you with more than one option:
o Run a high-performance web or database server => Linux in 99% of the cases
o Play the newest games, regularly => Windows in 99% of the cases
o Test & benchmark all the latest PC-hardware, regularly => Windows in 99% of the cases.
o Work as a general IT-Consultant, typically within top-500 corporations, on site => Windows in 99% of the cases
…and if none of those apply, then you actually have a choice: there is no single beats-them-all criterion, and the general question of pros and cons starts making sense. But all the more, that question needs to be rooted in some requirement for the comparison to be made in any sane way.
For most people, the question is: can the OS run the applications I want? Both can run Julia (and most apps made with Julia), and Linux is best supported (Windows is supported too, but I see more bugs and slowdowns reported by other users for it, and a bit for macOS too).
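When an application does need to behave differently per OS, the usual portable check is a one-liner. A minimal Python sketch (Julia has analogous predicates like `Sys.islinux()` and `Sys.iswindows()`):

```python
import platform
import sys

# platform.system() returns "Linux", "Windows", or "Darwin" (macOS).
os_name = platform.system()
print("Running on:", os_name)

# sys.platform gives a lower-level tag: "linux", "win32", "darwin".
print("sys.platform:", sys.platform)
```

Branching on these values is how cross-platform apps paper over the OS differences discussed in this thread.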
If you want convenience, meaning access to all the (Windows) apps, e.g. games, then Windows is your option. You can run many Windows games with wine/PlayOnLinux (and even non-game Windows apps), but I wouldn’t bet on it working for all. Some say Linux is the future of PC gaming…
If you want a free as in freedom OS (and no DRM), then Windows can’t be your choice, and Linux is your best bet, likely Ubuntu (though I would avoid the latest LTS, it had bugs), or Linux Mint.
If you want a server, then even Microsoft runs Linux most of the time on their Azure, i.e. more than their own Windows OS [API] (and Microsoft even has a little known Linux distro, besides adding Linux to their Windows).
Fun anecdotal fact: In a professional software-dev - context (where money is being made in one way or another from developing or maintaining IT-systems), I’ve found that the higher the requirements for that software are, the more likely it was, overall, that Windows was being used on personal laptops or desktops. Not because those laptops or desktops would meet any higher standards, but because everything critical (in whatever way) was located on servers, running Linux, some other Unix or even something more obscure and specialized, anyways, and Windows was just the least common denominator for meeting all the other (non-critical) requirements.
I’ve never really had to revive legacy code myself, so I cannot speak on that from first-hand experience, but what you are implying (easy revival of software in the Linux world after a long time) is literally the opposite of what I have consistently heard from colleagues over the years, some of whom I don’t know as the complaining types.
Check out this discussion, which seems to be genuine, non-inflammatory, and held among professionals, about this very subject.
Sorry, it seems I was not clear. I do not revive old code; I keep developing a package that I compile with the latest VS and that links with more than 15-year-old DLLs (from a Matlab that old) for the graphical display. Can anyone imagine that in the Unix world? Even libraries that are a year old (or less) start to be incompatible for linking with code compiled today.
This was compiled with the (roughly) latest VS and does not use the modern Java-based graphical Matlab libs.
That brings back memories! Stuck in one version of the OS because the next version had bugs/incompatibilities, but then having less and less package updates, descending into a sort of chain dependency hell where one package wouldn’t update because a package that it depends on wouldn’t, which was because its dependency wouldn’t, and so on… And trying to build them locally would often run into the same problem the commenter above describes.
I had hoped that AppImages would solve this kind of problem, but apparently they too are built with some baked in assumptions about library versions (glibc in particular), and aren’t as self-contained as they appear to be at first glance.
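You can actually see which libc (and version) your own running interpreter is linked against, which is exactly the assumption an AppImage bakes in at build time. A minimal check using the standard library (the result is best-effort and may come back empty on non-glibc systems):

```python
import platform

# Reports the C library the running executable is linked against,
# e.g. ("glibc", "2.35") on a typical Linux distro. On systems where
# this can't be detected, both strings may be empty.
lib, version = platform.libc_ver()
print("libc:", lib or "unknown", "version:", version or "unknown")
```

If an AppImage was built against a newer glibc than this reports, that is the kind of mismatch that breaks the “self-contained” illusion.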
There’s a subtle semantic distinction here that makes a big difference. When Windows apps depend on Windows libraries, they have a large API surface to work with, since a lot of functionality is bundled together and made available through them. And Microsoft bends over backwards to maintain compatibility through all of it.
In contrast, the set of programs that can reasonably be built to depend only on core Linux libraries is not so large, and most software in practice is going to need to depend at least on GNU/Linux (or GNU+Linux if you prefer). Those don’t adhere to compatibility as strictly as the kernel does, and that applies even more to the long tail of other, non-GNU but important libraries. A couple of comments on the HN thread mention this too:
He started saying that syscalls are stable and that it’s a gtk/GUI library problem and that if you were to, say, if you somehow loaded all the old libraries, it would work.
Which might be true and everything, but come on. That’s the equivalent of pushing your hands on your ears and shouting.
Binary compatibility is very important for the kernel folks, but apparently much less so for maintainers of glibc and other libraries.
Maintaining backwards compatibility is an enormous (and boring) undertaking though, so this is understandable from the volunteer-developer point of view. There have been some horror stories from Microsoft developers about lengths they had to go (and the kludgy code they had to maintain) to keep supporting all the various versions of options and the abuses of APIs that software developers had committed, just because Microsoft did not want to lose out on software availability this way.
Yes, it’s not terribly hard to make code work forever in Linux/Unix. You just have to NOT use shared libraries (and the dependencies you get from Linux package managers).
If you statically link in all your dependencies (or don’t, and distribute the .so files with your app, as Julia does), then you’re good. There’s one main library that most programs dynamically link to, and that’s libc (and it has sometimes broken in the past).
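To make the “most things dynamically link to libc” point concrete, here is a small POSIX-only sketch using `ctypes`: it grabs the C library that is already dynamically loaded into the current process and calls one of its functions directly (this particular trick does not work the same way on Windows, where `CDLL(None)` is not supported):

```python
import ctypes
import os

# CDLL(None) returns a handle to the symbols already loaded into this
# process; on Linux/macOS that includes the dynamically linked libc.
libc = ctypes.CDLL(None)

# Call libc's getpid() directly; it must agree with os.getpid(),
# since both ultimately ask the kernel for the same value.
print(libc.getpid() == os.getpid())
```

The fact that this works at all, without loading anything, is the dynamic-linking dependency in action.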
The downside of doing this (which is done to some degree on Windows) is that you then miss out on security and other updates… and you have to update your apps (preferably with autoupdate), which puts users at risk. So there are pros and cons, and in the Linux/Unix world, with more dynamic linking, you are likely more secure.
It is wise to strike some middle ground and dynamically link only to security-critical components, such as networking libraries, and maybe libc. Julia is also like a mini-distro (Karpinski’s words, as I recall, so the distros’ package managers are becoming redundant), and Julia has reproducibility for all code.
You CAN also get rid of the default C library (GNU glibc): statically link against musl, or use Cosmopolitan (to have code that works on Linux, Windows, macOS, etc.). With those, the only interface you talk to is the Linux kernel (or the Windows API), which Linus is very hardline about keeping stable (“Don’t break userspace”):