
Apple won't support it, therefore support and maintenance will mean working around Apple's proprietary hardware and special drivers. Linux is not meant to live in that environment; I'd argue improving support for other OEMs is a more fruitful goal.


If anyone has been following the nvidia/Linux story, this nails it on the head. Apple has never played well with FOSS and isn't going to start now. Apple's implementation will always be ahead, better, with fewer bugs, and the Linux port will always be a shitty experience that takes several weeks to get working properly, and then it'll be slower than expected.

Get yourself a laptop with an nvidia card and Ubuntu. Even today it's a garbage experience that takes hours/days of debugging to get right, and then it's still way worse than the Mac/Windows experience.


Apple have a FOSS kernel, one of the three remaining web renderers, CUPS, and a bunch more smaller things. They’re not the most religious of FOSS companies, but it’s a bit unfair to say they don’t play well in the places that they play.

If you take FOSS to mean only GPL, then yes, I get your point.


Yeah, I think it's more often FOSS that doesn't play well with Apple. It's understandable - Apple keeps a lot of desirable software for itself - but perhaps unfortunate.

Federighi has said in one of his interviews that Apple wants people to hack on the M1 machines, so I'm hopeful that they will be more open than on iOS.

All of that said, I won’t hold my breath.


> Apple wants people to hack on the M1 machines

Of course they do. Like they wanted people to hack on OS X 20 years ago; and once they reached critical mass, they pulled up the drawbridges (dropping anything GNU, dropping Java, restricting access to the OS, pushing the App Store, etc etc).

Chances they're going to do exactly the same with M1 and its follow-ups are 99.99%... if anything because their management is largely drawn from the very same people who executed that strategy.


They dropped anything GNU because of the GPLv3, which was specifically designed to stop what the authors saw as exploitation of others' work.

I can see both sides here - the viral nature of GPLv3 is anathema to a company like Apple, so simply walking away was inevitable; and people were exploiting loopholes in the GPLv2 (I have no idea whether Apple was, but there are documented cases of others doing it).


It's just one datapoint - there are plenty more. Simply speaking, when they are the underdog and have to attract developers, they open up; and when they don't need it, they close doors. That's just what they do, it's a perfectly rational strategy (if cynical). They are hardly the only ones at this game, Microsoft does it too. I am just pointing out that promises of openness with Apple typically come with an expiry date.


Apple and Microsoft have never pretended to be “open” and their behaviour is quite predictable - Apple most of all. You might not like their behaviour but they haven’t tried to trick anyone.

Contrast with Google and the infamous tweet about Android:

https://mobile.twitter.com/Arubin/status/27808662429


XNU is not FOSS, and 99% of the foundation classes, Aqua, and drivers are all proprietary.


Why is XNU not FOSS? Has something changed? Is Wikipedia wrong? It says that XNU is APSL which is approved by OSI and FSF.

While I agree that much of the good stuff is proprietary (but certainly not all, eg WebKit), that doesn’t mean that the free stuff isn’t free.

https://en.m.wikipedia.org/wiki/Apple_Public_Source_License


WebKit was likely "saved" by the original KHTML license being GPL. At the time, Apple were wise enough (or desperate enough) to figure that they could work with such a license, although they were eventually careful to chisel out anything they could into BSD-licensed modules. And still they had to be dragged into the light more or less kicking and screaming (e.g. they had no public VCS until KDE people kicked up a stink in the press, and were just throwing huge swaths of code "over the fence" like they still do with XNU).


KHTML is LGPL, isn’t it?


If I remember correctly, KDE used to be GPL when WebKit started, and was later relicensed. The difference is relatively irrelevant anyway; the way they used KHTML, they would have had to release sources even under the LGPL. Had KDE used BSD, MIT, or Apache back then, we likely wouldn't have had WebKit.


I’m trying to find a specific commit to disprove this, but that’s not my recollection. Certainly this blog post from 2005 indicates LGPL: https://web.archive.org/web/20050428230122/http://www.kdedev...


You might be right. Still, it shows Apple's attitude at the time, and how the license helped change their ways. Note how the post complains they are simply using OS X APIs... without the modification-release clauses, WebKit as a reusable library would never have happened.


Indeed. Git repo at https://github.com/apple/darwin-xnu.

Note that that is 2 years behind. You can get a newer source dump from https://opensource.apple.com/source/xnu/xnu-6153.141.1/.

So, open source, but development isn’t done in the open. You can’t really expect you’ll be able to see recent commit messages or even just regular source dumps.


The development methodology has nothing to do with whether something is FOSS or not.

If you're complaining that you want to see daily updates, that's a completely different thing than claiming it's not FOSS (as OP did).


FOSS means being able to develop it the same way they do. That ought to mean being able to see to VCS history, bug trackers and so on - their own developers would use those things when developing.


It really doesn’t mean that. Neither Stallman’s four freedoms[1] nor Perens’s Open Source Definition[2] have anything to say about code history, bug trackers, development standards etc etc, and they’re the only commonly accepted definitions for what free and open source software is.

[1] https://www.gnu.org/philosophy/free-sw.html.en

[2] https://opensource.org/docs/osd


> The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.

> In order for freedoms 1 and 3 (the freedom to make changes and the freedom to publish the changed versions) to be meaningful, you need to have access to the source code of the program. Therefore, accessibility of source code is a necessary condition for free software. Obfuscated “source code” is not real source code and does not count as source code.

IMO exactly the same logic applies to code history and bug trackers: you need access to those things to be able to study how the program works. A code dump is much the same thing as obfuscated source code: you can build the program from it, but you can't understand the program from it.


Right, but, like, that’s just your opinion man.

FOSS means you get the source code, everything else is gravy.


Source "in the preferred form for making modifications". IMO that means a VCS checkout rather than a code dump.


A code dump is nothing remotely like obfuscated source code. They run competitions for obfuscated source code, take a look and compare.

Seriously, at this point I think you have to be just defending a position that you know is wrong but since you originally stated it, you’re not backing down. If not then let’s just agree to disagree...


I've been struggling with nvidia on my laptop for quite some time, but I definitely wouldn't say it took debugging. There are clear limitations, primarily that on older chips the gpu never fully powers down when not offloading, but that was pretty clear from the start. And that's on Arch, using a port of a tool that was made for Ubuntu. On newer computers, it works even better. No matter what, there isn't much room for debugging, be that a good or bad thing.


I have a Dell XPS 9500 (the 2020 model) with an nVidia GPU. I run Pop!_OS as my daily driver and it works with the GPU just fine.


Can it run sway?


I haven't attempted sway specifically, but I don't see why not. The OS installs with nouveau on wayland by default, and runs perfectly fine (at a ~60-70% performance level, based on Linux/Proton gaming FPS). I use GNOME on Xorg, so have no issues with the proprietary driver.


Looks like that’d be the GTX 1650 Ti? That’s of the NV160 family, and per https://nouveau.freedesktop.org/FeatureMatrix.html (assuming it’s up to date), that lacks all 2D and video acceleration, and power management. That sounds rather like a mediocre toy for a large fraction of users. And Wayland needs Nouveau, because the NVIDIA proprietary driver is a hostile environment that does everything its own way rather than the way everyone else does things.


Then use the Intel card and you'll have full acceleration + wayland. Or don't buy the laptop. I was simply answering the question.

You're being unnecessarily antagonistic here. I have a modern 2020 laptop, it works for all of my needs with an nvidia GPU.


Look, the whole context here is “NVIDIA cards have bad Linux support”. And that’s just what we demonstrated from saati’s comment onwards in the example of Sway: that if you want to go Wayland, you’re stuck with bad functionality. Because graphics cards and computers are powerful enough, many people will be able to live with this crippled functionality and might not even notice it, but compare it with the story for AMD dedicated GPUs, or AMD or Intel integrated GPUs, and the point is substantiated: if you want to actually use the GPU fully, you just can’t do so properly.

(And a large portion of the problems people are talking about are with Nouveau; people that can use the proprietary driver—specifically, people using X rather than Wayland—will have a better time of it, though still worse than with GPUs of other brands.)


I've owned an Ubuntu laptop with an NVIDIA card since 2014 (HP ZBook 15 with a Quadro K1100M). Not perfect, but not garbage. Hours of debugging/workarounds accumulated over 6 years, yes; days, no.

The remaining problems on Ubuntu 20.04:

1. The brightness control keys don't work. They generate the event they are supposed to generate then I see an error in syslog. Workaround: two hotkeys bound to windows+fn+brightness up/down to run a bash script that increases / reduces the backlight level.

2. The screen has worked at 40 Hz since Ubuntu 18.04. I expected it to be unusable, but actually I can't notice the difference from 60 Hz. Nouveau works at 60 Hz but is unusable for other reasons (I don't remember the details; I checked months ago)

One problem I had because of the NVIDIA driver: the laptop wouldn't shut down, only restart. Workaround: I press the power button right after the restart, when the BIOS shows up. Not a big deal, because I shut down only a couple of times per year.
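The backlight workaround in point 1 can be sketched roughly like this (the sysfs device name and the 10% step size are assumptions; drivers expose different names, e.g. intel_backlight or nvidia_0, and the demo writes to a fake sysfs directory so it runs without root):

```shell
# step_backlight: adjust screen brightness by writing to sysfs,
# as a substitute for broken brightness keys.
# usage: step_backlight up|down /sys/class/backlight/<device>
step_backlight() {
  dir=$2
  max=$(cat "$dir/max_brightness")
  cur=$(cat "$dir/brightness")
  step=$((max / 10))                  # move in ~10% increments
  case "$1" in
    up)   new=$((cur + step)); if [ "$new" -gt "$max" ]; then new=$max; fi ;;
    down) new=$((cur - step)); if [ "$new" -lt 0 ]; then new=0; fi ;;
  esac
  echo "$new" > "$dir/brightness"     # needs write access to the sysfs file
}

# Demo against a fake sysfs directory (safe to run anywhere):
fake=$(mktemp -d)
echo 1000 > "$fake/max_brightness"
echo 500  > "$fake/brightness"
step_backlight up "$fake"
cat "$fake/brightness"                # prints 600
```

In practice you'd bind the desktop hotkeys to something like `step_backlight up /sys/class/backlight/<device>` and arrange write permission on the sysfs file (udev rule or sudo).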


I was able to fix the brightness controls on Ubuntu by switching to Linux 5.8, if you haven't already tried that.
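On Ubuntu 20.04 the jump to Linux 5.8 is typically done via the HWE (hardware enablement) kernel stack; this is an assumption about the route used here, and mainline-kernel PPAs are another option:

```shell
# Ubuntu 20.04 ships Linux 5.4 by default; the HWE stack tracks a
# newer kernel series (5.8 as of 20.04.2):
sudo apt install --install-recommends linux-generic-hwe-20.04
sudo reboot
# afterwards, check the running kernel:
uname -r
```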


> not garbage

> The brightness control keys don't work.

> The screen works at 40 Hz

I think you need to use a laptop (with factory OS) produced within the last 2 decades, because your garbage detector seems to be broken.


I didn't downvote you, because there are tradeoffs and they are very subjective.

BTW, I do use Windows and macOS a few times per month, via remote desktop, to run two programs of a customer's that only run there. Not a complete experience, but add to it continued (Windows) or occasional (Mac) use of those OSes since their very first releases (80s/90s).

I'm accepting those two nuisances, one very small, the other invisible, to avoid other much larger problems:

1. Not using Windows, which is a pretty horrible experience. Good for gaming, which I don't care about anymore on a PC; bad UX (and I don't mean only the GUI). And I'm targeting Linux servers anyway.

2. macOS and its top bar with the menu at the top. It was OK on the very first Mac, because the screen was so tiny that it actually saved space. It became perplexing or infuriating when screens got larger. Unfortunately it stuck, and it will be like this forever. And Macs don't have physical buttons on the touchpad.

So I'm happy with my GNOME desktop, configured with: the top bar at the bottom, merged with a Windows-like task bar; no dock; hotkeys to swap virtual desktops by customer project; no animations; visible permanent scrollbars (showing only their outline on the window's background color).
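For what it's worth, a couple of those tweaks are plain gsettings switches (illustrative commands assuming a stock GNOME session; the bar/taskbar merge needs a shell extension such as Dash to Panel, which gsettings alone can't do):

```shell
# Disable GNOME's window/workspace animations
gsettings set org.gnome.desktop.interface enable-animations false

# Keep scrollbars permanently visible instead of overlaid/auto-hiding
gsettings set org.gnome.desktop.interface overlay-scrolling false
```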


You have your preferences about UI and GUI, and that's OK. But a 40 Hz screen and non-working brightness controls are a non-starter for all but the most dedicated users. 144 Hz is common on decent laptops nowadays, along with automatic brightness.

To me, this trade-off is very simple.


As I wrote, it's very subjective. I've been working at 40 Hz for 2 years and I didn't notice the difference. I press windows-fn-F9/F10 instead of only fn-F9/F10, not a big deal. Automatic brightness control as on phones? I would probably disable that anyway.

By the way, brightness control didn't work in 2014, then started working, then it stopped working again. Kernel versions / NVIDIA drivers, who knows.


Exactly this.

For many of us the disadvantages of using Linux are tiny compared to the advantages:

- from 30-50% faster compiles (vs Windows)

- instant git (vs Windows)

- choice of Desktop environment (vs both)

- choice of hardware (vs Mac)


> Get yourself a laptop with an nvidia card and Ubuntu. Even today it's a garbage experience that takes hours/days of debugging to get right, and then it's still way worse than the Mac/Windows experience.

This. Writing this from a laptop running Ubuntu with an nvidia card and I can feel my body tense up just reading it.

I remember several nights in college just struggling to get the drivers to work and finding the right cuda spec from the repository, after combing through multiple sources.

It's still a nightmare, and I always dread that the Stack Overflow and Ask Ubuntu posts in the bookmarks folder I've reserved for this arduous process will be obsolete when the GPU crashes again for the umpteenth time.


The new XPS 15 I am writing this on works fine on Ubuntu. So did the Thinkpad I used before it and the XPS 13 before that. I spent 0 time debugging. I am not sure how the Windows experience would compare, but I am fairly happy, with maybe the exception of battery life, which from what I gather kinda sucks on the XPS.


I disagree with this on a few points aside from the hyperbole:

> Apple's implementation will always be ahead, better, with fewer bugs...

No, it's not and really hasn't ever been. Every platform has a shortcoming and claiming it's ahead of the game is a biased stance. The M1 specs may be impressive now, but the same can be said of every "new" SoC chip.

>... linux port will always be a shitty experience...

People who really love Linux and FOSS actually prefer to tune their environment to their expectations. The weeks of tinkering are part of a hacker mindset that is slowly eroding, and I'll take the weeks-long config experience over a locked environment.

To sum it all up, Linux and consumer open systems have a different target audience from Apple's. Apple users are like car lessees: they just want to hop in and drive their car until the lease is up. Linux users are like the mechanic working out of their garage who drives a hodgepodge vehicle they pieced together. To say that the Apple experience is better is just hype, and calling the personalization/optimization process "shitty" is a closed-source/closed-minded view.


> Linux is not meant to live in that environment,

Linux is like the cockroach of the OS world; it can do just fine in any environment. After the apocalypse, I'm pretty sure it's the OS the cockroaches are going to use.

If it hadn't been for reverse engineered proprietary drivers, Linux would have never gotten to the point it is now.


On the other hand, we have basically one single model per year to support. That hardware is used by millions and usually users keep it for 5 or more years.


That's not too big an advantage - e.g. the major problems with GPU drivers don't seem to be caused by too many companies producing them (2 or 3, depending on how you count) or too many models (a new architecture appears only once every couple of years).

(I won't trust myself to speculate on what is making GPU drivers so difficult to get right)


How is that different from simply picking the best option from some other manufacturer each year and supporting only that? There's no mandate that you need to support every device a company makes.


The Linux community won't go along with that. People keep buying laptops that don't support Linux then ask for the community to support them.



