kshri24
Game development is STILL a highly underrated field. Plenty of advancements/optimizations (in both software and hardware) can be directly traced back to game development. Hopefully, with RAM prices shooting up the way they are, we go back to keeping optimization front and center and reduce all the bloat that has accumulated industry-wide.
hinkley
A number of my tricks are stolen from game devs and applied to boring software. Most notably, resource budgets for each task. You can’t make a whole system fast if you’re spending 20% of your reasonable execution time on one moderately useful aspect of the overall operation.
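The budgeting trick can be sketched in a few lines. This is a minimal illustration with hypothetical names and made-up budget numbers, not the commenter's actual setup: each subsystem gets an explicit slice of the overall latency target, and anything that blows through its slice gets flagged.

```python
import time
from contextlib import contextmanager

# Hypothetical per-task budgets: a 100 ms request budget split
# across subsystems (numbers are illustrative).
BUDGETS_MS = {"auth": 5, "db_query": 40, "render": 30, "logging": 2}

overruns = []

@contextmanager
def budget(task):
    """Time a task and record it if it blows through its budget."""
    start = time.perf_counter()
    try:
        yield
    finally:
        spent_ms = (time.perf_counter() - start) * 1000
        if spent_ms > BUDGETS_MS[task]:
            overruns.append((task, round(spent_ms, 1)))

# Simulate a "logging" step that takes ~10 ms against a 2 ms budget.
with budget("logging"):
    time.sleep(0.01)

print(overruns[0][0])  # -> logging
```

The point is that a moderately useful feature eating 20% of the budget shows up immediately in the overruns list, instead of hiding inside the total.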
ksec
I think one could even say gaming as a sector single-handedly moved most of the personal computing platform forward since the '80s and '90s. Before that it was probably military and corporate. From the DOS era, overclocking CPUs to push benchmarks, DOOM, 3D graphics APIs from 3dfx Glide to DirectX. Faster HDDs for faster game load times. And for 10-15 years it was gaming that carried CUDA forward.
abustamam
Yes please! Stop making me download 100+gb patches!
ffsm8
The large file sizes are not because of bloat per se...
It's a technique which supposedly helped at one point in time to reduce loading times, Helldivers being the most notable example of removing this "optimization".
However, this is by design, specifically as an optimization. Can't really call that bloat in the parent's context of inefficient resource usage.
MarleTangible
Over time they're going to touch things that people had been waiting years for Microsoft to do. I don't have an example in mind at the moment, but it's a lot better to make the changes yourself than to wait for the OS or console manufacturer to take action.
asveikau
I was at Microsoft during the Windows 8 cycle. I remember hearing about a kernel feature I found interesting. Then I found Linux had already had it for a few years at that point.
I think the reality is that Linux is ahead on a lot of kernel stuff. More experimentation is happening.
wmf
I was surprised to hear that Windows just added native NVMe which Linux has had for many years. I wonder if Azure has been paying the SCSI emulation tax this whole time.
mycall
Linux is behind Windows wrt the (hybrid) microkernel vs. monolith question, which helps with having drivers and subsystems in user mode and supporting multiple personalities (Win32, POSIX, OS/2 and WSL subsystems). Linux can hot-patch the kernel, but replacing core components is risky, and drivers and filesystems cannot be restarted independently.
7bit
And behind on a lot of stuff. Microsoft's ACLs are nothing short of one of the best-designed permission systems there are.
On the surface, they are as simple as Linux's UGO/rwx model if you want them to be, but you can really, REALLY dive into the technology and apply super-specific permissions.
b00ty4breakfast
When the hood is open for anyone to tinker, lots of little weirdos get to indulge their ideas. Sometimes those ideas are even good!
IshKebab
Yeah, and Linux is waaay behind in other areas. Windows has had a secure attention sequence (Ctrl-Alt-Del to log in) for several decades now. Linux still doesn't.
pjmlp
And behind in anything related to kernel security, sandboxing, user space drivers, and 3D graphics drivers.
Without Proton there would be no "Linux" games.
It would be great if Valve actually continued Loki Entertainment's work.
dijit
yeah, but you have IO Completion Ports…
io_uring is still a pale imitation :(
6r17
Tbh I'm starting to think Microsoft won't be able to keep its position in the OS market. With Steam doing all the hard work and having a great market to play in, the vast range of distributions to choose from, and, most importantly, how easy it has become to create an operating system from scratch, they not only lost all possible appeal, they seem stuck on some really weird fetishism with their taskbar and just haven't given me any kind of reason to be excited about Windows.
Their research department rocks, however, so this isn't a full bash on Microsoft; I just feel like they are focusing on other, way more interesting stuff.
Arainach
Kernel improvements are interesting to geeks and data centers, but open source is fundamentally incompatible with great user experience.
Great UX requires a lot of work that is hard but not algorithmically challenging. It requires consistency and getting many stakeholders to buy in. It requires spending lots of time on things that will never be used by more than 10-20% of people.
Windows got a proper graphics compositor (DWM) in 2006 and made it mandatory in 2012. macOS had one even earlier. Linux fought against Compiz, and while Wayland feels inevitable, vocal forces still complain about and argue against it. Linux has a dozen incompatible UI toolkits.
Screen readers on Linux are a mess. High contrast is a mess. Setting font size in a way that most programs respect is a mess. Consistent keyboard shortcuts are a mess.
I could go on, but these are problems that open source is not set up to solve. These are problems that are hard, annoying, not particularly fun. People generally only solve them when they are paid to, and often only when governments or large customers pass laws requiring the work to be done and threaten to not buy your product if you don't do it. But they are crucially important things to building a great, widely adopted experience.
embedding-shape
> Tbh i'm starting to think that I do not see microsoft being able to keep it's position in the OS market
It's a big space. Traditionally, Microsoft has held the multimedia, gaming, and many professional segments, but with Valve making a big push into the first two and Microsoft not even giving it a half-hearted try, it might just be that corporate computers keep running Microsoft, people's home media equipment is all Valve, and hipsters (and others...) keep using Apple.
pjmlp
First Valve has to actually start pushing for proper Linux games; until then Windows can keep enjoying its 70% market share, with game studios using Windows, business as usual.
Also, Raspberry Pis are the only GNU/Linux devices most people can find in retail stores.
m4rtink
Add to that all the bullshit they have been pushing on their customers lately:
* OS-level ads
* invasive AI integration
* dropping support for 40% of their installed base (Windows 10)
* forcing useless DRM/trusted-computing hardware (TPM) as a requirement to install the new and objectively worse Windows version, with even more spying and worse performance (Windows 11)
With that I think their prospects are bleak, and I have no idea who would install anything other than SteamOS or Bazzite in the future given this kind of Microsoft behavior.
benoau
"It just works" sleep and hibernate.
"Slide left or right" CPU and GPU underclocking.
dijit
“it just works” sleep was working, at least on basically every laptop I had the last 10 years…
until the new s2idle stuff that Microsoft and Intel have foisted on the world (to update your laptop while sleeping… I guess?)
pmontra
Sleep and hibernate don't just work on Windows unless Microsoft works with laptop and board manufacturers to make Windows play nice with all those drivers. It's inevitable that it's hit-and-miss on any other OS that manufacturers don't care much about. Apple does nearly everything inside their own walls; that's why it just works.
Krssst
On my Framework 13 AMD: sleep just works on Fedora. Sleep is unreliable on Windows; if my fans are all running at full speed during a game and I close the lid, it will start sleeping and eventually wake up with all fans blaring.
devnullbrain
I don't understand this comment in this context. Both of these features work on my Steam Deck. Neither of them have worked on any Windows laptop my employers have foisted upon me.
tremon
That requires driver support. What you're seeing is Microsoft's hardware certification forcing device vendors to care about their products. You're right that this is lacking on Linux, but it's not a slight on the kernel itself.
seba_dos1
Both of these have worked fine for the last 15 years or so on all my laptops.
packetlost
Kernel level anti-cheat with trusted execution / signed kernels is probably a reasonable new frontier for online games, but it requires a certain level of adoption from game makers.
dabockster
This is part of Secure Boot, which Linux people have raged against for a long time, mostly because the main key-signing authority was Microsoft.
But here's the rub: no one else bothered to step up as a key signer. Everyone has instead whined for 15 years and told people to disable Secure Boot and the loads of trusted-computing tech that depends on it, instead of actually building and running the infrastructure necessary for a Secure Boot authority outside of big tech. Not even Red Hat/IBM, even though they have the infra to do it.
Secure Boot and signed kernels are proven tech. But the Linux world absolutely needs to pull its head out of its butt on this.
duped
> I don't have an example in mind at the moment
I do: MIDI 2.0. It's not that they're not doing it, just that they're doing it at a glacial pace compared to everyone else. They have reasons for this (a complete rewrite of the Windows media services APIs and internals), but it has taken years and delays to do something that shipped on Linux over two years ago and on Apple more like five (although there were some protocol changes over that time).
mstank
Valve... please do GitHub Actions next
xmprt
I wonder what Valve uses for source control (no pun intended) internally.
shantara
I’ve heard from several people who game on Windows that the Gamescope side panel, with OS-wide tweakables for overlays, performance, power, frame limiters and scaling, is something they miss after playing on a Steam Deck. There are separate utilities for each, but nothing as simple and accessible as Gamescope.
amlib
A good one is shader pre-caching with Fossilize. Microsoft is only now getting around to it, and it still pales in comparison to Valve's solution on Linux.
delusional
> Valve is practically singlehandedly dragging the Linux ecosystem forward in areas that nobody else wanted to touch.
I'm loving what Valve has been doing, and their willingness to shove money into projects that have long been under-invested in, BUT. Please don't forget all the volunteers who developed these systems for years before Valve decided to step up. All of this is only possible because a ton of different people spent decades slowly building a project that for most of its lifetime seemed like a dead-end idea.
Wine as a software package is nothing short of miraculous. It has been monumentally expensive to build, but is provided to everyone to freely use as they wish.
Nobody, and I do mean NOBODY, would have funded a project that spent 20 years struggling to run Office and Photoshop. Valve took it across the finish line into a commercially useful project, but they could not have done that without the decade-plus of work before that.
aeyes
Long before Valve there was CrossOver, which sold a polished version of Wine, making a lot of Windows-only enterprise software work on Linux.
I'm sure there have been more commercial contributors to Wine beyond Valve and CodeWeavers.
mixmastamyk
Like giving the Han Solo award to the Rebel Fleet. ;-)
cosmic_cheese
One would've expected one of the many desktop-oriented distros (some with considerable funding, even) to have tackled these things already, but somehow desktop Linux has been stuck in the awkward middle ground of "it technically works, just learn to live with the rough edges" until Valve finally took the initiative. Go figure.
johnny22
Please don't erase all the groundwork they've done over the years to make it possible for these later enhancements to happen. It wasn't like they were twiddling their thumbs this whole time!
cosmic_cheese
That's not my intention at all. It's just frustrating how little of it translates to impact that's readily felt by end users, including those of us without technical inclination.
rapind
It's not just Valve taking the initiative. It's mostly because Windows has become increasingly hostile and just plain horrible over the years. They'll be writing textbooks on how badly Microsoft screwed up their operating system.
pwthornton
I'm a Mac user, but I recently played around with a beefy laptop at work to see how games ran on it, and I was shocked at how bad and user-hostile Windows 11 is. I had previously used Windows 98, 2000, XP, Vista, and 7, but 11 is just so janky. It's festooned with Copilot/AI jank and seems to be filled with ads and spyware.
If I didn't know better, I'd assume Windows was a free, ad-supported product. If I ever pick up a dedicated PC for gaming, it's going to be a Steam Machine and/or Steam Deck. Microsoft is basically lighting Xbox and Windows on fire to chase AI clanker slop.
WackyFighter
That isn't it. Generally, wherever the majority of users are is where the majority of focus goes.
The vast majority of people using Linux on the desktop before 2015 were either hobbyists, developers, or people who didn't want to run proprietary software for whatever reason.
These people generally didn't care about a lot of the fancy tech mentioned, so this stuff didn't get fixed.
cosmic_cheese
There’s some truth to that, but a lot of (maybe most) Linux desktop users are on laptops and yet there are many aspects of the Linux laptop experience that skew poor.
I think the bigger problem is that commercial use cases suck much of the air out of the room, leaving little for end user desktop use cases.
iknowstuff
There's far more of that, starting with the lack of a stable ABI in GNU/Linux distros. Eventually Valve or Google (with Android) is going to swoop in with a user-friendly OS that devs can target as an actual single platform.
thewebguyd
The enterprise distros do provide that, somewhat.
That's why RHEL, for example, has such a long support lifecycle: you can develop software targeting RHEL specifically and know you have a stable environment for 10+ years. Red Hat sells a stable (as in unchanging) OS to target for X number of years.
cosmic_cheese
I don't have a whole lot of faith in Google, based on considerable experience with developing for Android. Put plainly, it's a mess, and even with improvements in recent years there's enough low-hanging fruit for improving its developer story that much of it has fallen off the tree and stands a foot thick on the ground.
ninth_ant
Ubuntu LTS is currently on track to be that. Both in the server and desktop space, in my personal experience it feels like a rising number of commercial apps are targeting that distro specifically.
It’s not my distribution of choice, but it’s currently doing exactly what you suggest.
LeFantome
Valve has been pretty clear that Win32 is the platform.
singron
Isn't that the steam linux runtime? Games linked against the runtime many years ago still run on modern distros.
bilekas
I do agree. It's also thanks to gaming that the GPU industry was in such a good state when AI came along to consume it. Game development has always been at the frontier of software optimisation techniques and ingenious approaches to constraints.
baq
I low-key hope the current DDR5 prices push them to drag Linux memory and swap management into the 21st century too, because hard-locking on low memory got old a while ago.
the_pwner224
It takes a solid 45 seconds for me to enable zram (compressed RAM as swap) on a fresh Arch install. I know that doesn't solve the issue for 99% of people who don't even know what zram is / have no idea how to do it / are trying to do it for the first time, but it would be pretty easy for someone to enable that in a distro. I wouldn't be shocked if it is already enabled by default in Ubuntu or Fedora.
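On a systemd-based distro the whole thing is one small config file; a sketch assuming the zram-generator package is installed (values are illustrative, in MB):

```ini
# /etc/systemd/zram-generator.conf
[zram0]
zram-size = min(ram / 2, 8192)
compression-algorithm = zstd
```

After a reboot (or starting systemd-zram-setup@zram0.service), `swapon` should show /dev/zram0 as active swap.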
m4rtink
Zram has been enabled by default on Fedora since 2020.
MrDrMcCoy
Zswap is arguably better. It confers most of the benefits of zram swap, plus being able to evict to non-RAM if cache becomes more important or if the situation is dire. The only times I use zram are when all I have to work with for storage is MMC, which is too slow and fragile to be written to unless absolutely necessary.
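Turning zswap on is typically just a matter of kernel boot parameters; a sketch using the upstream zswap module parameter names (the values are illustrative):

```ini
# appended to GRUB_CMDLINE_LINUX in /etc/default/grub
zswap.enabled=1 zswap.compressor=zstd zswap.max_pool_percent=20
```

Unlike zram, this still requires a regular swap device behind it to evict to.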
johnny22
That just pushes the problem away; it doesn't solve it. I still hit that limit when I ran a big compile while some other programs were using a lot of memory.
ahepp
what behavior would you like to see when primary memory is under extreme pressure?
baq
See macOS or Windows: grow swap automatically up to some sane limit, show a warning, give the user an option to kill stuff; on headless systems, kill stuff. Do not page out critical system processes like sshd or the compositor.
A hard lock which requires a reboot or god forbid power cycling is the worst possible outcome, literally anything else which doesn’t start a fire is an improvement TBH.
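Part of this already exists in userspace today; a minimal systemd-oomd sketch (assuming systemd-oomd is installed, and the thresholds are illustrative, not a full replacement for what macOS/Windows do):

```ini
# /etc/systemd/oomd.conf
[OOM]
DefaultMemoryPressureLimit=60%
DefaultMemoryPressureDurationSec=20s
```

With this, cgroups that keep memory pressure above the limit for the duration get killed instead of the whole box locking up.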
jhasse
Same as Windows does. Instead, the system freezes.
marcodiego
I thought that was fixed after MGLRU.
stdbrouw
I feel like all of the elements are there: zram, zswap, various packages that improve on default oom handling... maybe it's more about creating sane defaults that "just work" at this point?
gf000
I think it's more of a user space issue, that the UI doesn't degrade nicely. The kernel just defaults to a more server-oriented approach.
PartiallyTyped
To be fair, Proton is based on DXVK, which started as some guy's project because he wanted to play Nier: Automata on Linux.
The guy is Philip Rebohle.
foresto
Yes, and when Valve caught wind of his early efforts, they paid him to work on it full time.
https://www.gamingonlinux.com/2018/09/an-interview-with-the-...
robotnikman
And thanks to him I was able to play and finish Nier Automata on the Steam Deck!
raverbashing
Let's be honest
Linux (and its ecosystem) sucks at having focus and direction.
They might get something right here and there, especially related to servers, but they spin their wheels an awful lot.
See how slow Wayland progress has been. See how some distros moved to it only after a lot of kicking and screaming.
See how a lot of peripherals that are "newer" (sometimes a model that's been 2 or 3 years on the market) only barely work in a newer distro, or have weird bugs.
"but the manufacturers..." "but the hw producers..." "but open source..." whine
Because Linux lacks a good hierarchy for isolating responsibility, instead going for "every kernel driver can do whatever it wants" together with interfaces that keep flip-flopping with every new kernel release. Notable (good) exception: USB userspace drivers. And don't even get me started on the whole mess that is Xorg drivers.
And then you have a Rube Goldberg machine in the form of udev, D-Bus and whatnot, or whatever newer solution solves half the problems and creates a new collection of bugs.
cosmic_cheese
Honestly, I can't see it remaining tenable to keep things like drivers in the kernel for too much longer… both due to the sheer speed at which the industry moves and due to the security implications involved.
mikkupikku
> SCX-LAVD has been worked on by Linux consulting firm Igalia under contract for Valve
It seems like every time I read about this kind of stuff, it's being done by contractors. I think Proton is similar. Of course that makes it no less awesome, but it makes me wonder about the contractor to employee ratio at Valve. Do they pretty much stick to Steam/game development and contract out most of the rest?
ZeroCool2u
Igalia is a bit unique in that it serves as a single corporate entity for organizing a lot of sponsored work on the Linux kernel and open source projects. You'll notice in their blog posts that they collaborate with a number of other large companies seeking to sponsor very specific development work; for example, Google works with them a lot. I think it really just simplifies the logistics of paying folks to do this kind of work, plus the Igalia employees get shared efficiencies and savings on things like benefits.
butlike
Oh ok, so Igalia owns the developer sweatshops now. Got it.
dan-robertson
This seems to be a win-win where developers benefit from more work in niche areas, companies benefit by getting better developers for the things they want done, and Igalia gets paid (effectively) for matching the two together, sourcing sufficient work/developers, etc.
ksynwa
I don't know much about Igalia but they are worker owned and I always see them work on high skill requirement tasks. Makes me wish I was good enough to work for them.
the_mitsuhiko
It's a cooperative sweatshop in that sense.
saagarjha
And the developers own Igalia.
zipy124
Just because work is outsourced to contractors does not mean it is a sweatshop....
chucky_z
This isn’t explicitly called out in any of the other comments, in my opinion, so I’ll state it. Valve as a company is incredibly focused internally on its business. Its business is games, game hardware, and game delivery. For anything outside of that purview, instead of trying to build a huge internal team, they contract out. I’m genuinely curious why other companies don’t do this more often, because it seems incredibly cost-effective. They hire top-level contractors to do top-tier work on hyper-specific areas, and everyone benefits. I think this kind of work is why Valve gets a free pass on some real heinous shit (all the gambling stuff) and maintains incredible goodwill. They’re a true “take the good with the bad” kind of company. I certainly don’t condone all the bad they’ve put out, and I also have to recognize all the good they’ve done at the same time.
Back to the root point. Small company focused on core business competencies, extremely effective at contracting non-core business functions. I wish more businesses functioned this way.
javier2
Yeah, I suppose this workflow is not for everyone. I can only imagine Valve has very specific issues or requirements in mind when they hire contractors like this. When you hire like this, I suspect what one really pays for is a well-known name that will be able to push something important to you into upstream Linux. It's the right way to do it if you want it resolved quickly. If you come in as a fresh contributor, landing features upstream can take years.
smotched
What are the bad practices Valve is doing in gambling?
crtasm
Their games and systems tie into huge gambling operations on third-party sites.
If you have 30 minutes for a video, I recommend People Make Games' documentary on it: https://www.youtube.com/watch?v=eMmNy11Mn7g
mewse-hn
Loot-box-style underage gambling in their live-service games: TF2 hats, Counter-Strike skins, "trading cards", etc. etc.
msh
Lootboxes comes to mind.
butlike
A small company doesn't have the capital to contract out library work like that. Same story as it's always been.
tayo42
I feel like I rarely see contracting out work go well. This seems like an exception.
OkayPhysicist
The .308 footgun with software contracting stems from a misunderstanding of what we pay software developers for. The model under which contracting seems like the right move is "we pay software developers because we want a unit of software", like how you pay a carpenter to build you some custom cabinets. If the union of "things you have a very particular opinion about, and can specify coherently" and "things you don't care about" completely covers a project, contracting works great for that purpose.
But most of the time you don't want "a unit of software", you want some amorphous blob of product and business wants and needs, continuously changing at the whims of business, businessmen, and customers. In this context, sure, you're paying your developers to solve problems, but moreover you're paying them to store the institutional knowledge of how your particular system is built. Code is much easier to write than to read, because writing code involves applying a mental model that fits your understanding of the world onto the application, whereas reading code requires you to try and recreate someone else's alien mental model. In the situation of in-house products and business automation, at some point your senior developers become more valuable for their understanding of your codebase than their code output productivity.
The context of "I want this particular thing fixed in a popular open source codebase that there are existing people with expertise in", contracting makes a ton of sense, because you aren't the sole buyer of that expertise.
magicalhippo
If you have competent people on both sides who care, I don't see why it wouldn't work.
The problem seems, at least from a distance, to be that bosses treat it as a fire-and-forget solution.
We haven't had any software done by outsiders yet, but we have hired consultants to help us with specifics, like changing our infra and moving local servers to the cloud. They've been very effective and helped us a lot.
We had talks first, though, so we found someone we could trust to have the knowledge, and we were knowledgeable enough ourselves to determine that. We then followed up closely.
TulliusCicero
Valve contracts out to actually competent people and companies rather than giant bodycount consulting firms.
zipy124
This is mostly because the title of contractor has come to mean many things. In its original form, outsourcing temporary work to experts in the field, it still works very, very well. Where it fails is when a business contracts out business-critical work, or contracts with a general company rather than experts.
to11mtm
I've seen both good and bad contractors in multiple industries.
When I worked in the HFC/Fiber plant design industry, the simple act of "Don't use the same boilerplate MSA for every type of vendor" and being more specific about project requirements in the RFP makes it very clear what is expected, and suddenly we'd get better bids, and would carefully review the bids to make sure that the response indicated they understood the work.
We also had our own 'internal' cost estimates (i.e. if we had the in house capacity, how long would it take to do and how much would it cost) which made it clear when a vendor was in over their head under-bidding just to get the work, which was never a good thing.
And, I've seen that done in the software industry as well, and it worked.
That said, the main 'extra' challenge in IT is that many of the good players aren't going to be the ones beating down your door the way the Big 4 or a WITCH consultancy will.
But really, at the end of the day, the problem is that the people picking things like vendors are often business people who don't really know (or necessarily -care-) enough about the specifics.
And worse, sometimes they're the ones writing the spec and not letting engineers review it. [0]
[0] - This once led to an off-shore body shop getting a requirement along the lines of 'the stored procedures and SQL called should be configurable' and sure enough the web.config had ALL the SQL and stored procedures as XML elements, loaded from config just before the DB call, thing was a bitch to debug and their testing alone wreaked havoc on our dev DB.
WD-42
Igalia isn’t your typical contractor. It’s made up of competent developers that actually want to be there and care to see open source succeed. Completely different ball game.
abnercoimbre
Nope. Plenty of top-tier contractors work quietly with their clientele and let the companies take the credit (so long as they reference the contractor to others, keeping the gravy train going.)
If you don't see it happening, the game is being played as intended.
tapoxi
Valve is actually extremely small, I've heard estimates at around 350-400 people.
They're also a flat organization, with all the good and bad that brings, so scaling with contractors is easier than bringing on employees that might want to work on something else instead.
sneak
300 people isn’t “extremely small” for a company. I don’t work with/for companies over 100 people, for example, and those are already quite big.
zipy124
300 is extremely small for a company of their size in terms of revenue and impact. Linus Media Group and their other companies, for instance, are over 100 people, and are much smaller in impact and revenue than a company like Valve, despite not being far off their number of employees (within an order of magnitude)...
tester756
300 people running Steam, creating games and maintaining Steam Deck / Linux and stuff?
Yes, 300 is quite small.
frakkingcylons
I think a better way to think of it is in terms of revenue per employee. Valve is WAY up there.
PlanksVariable
Of course smaller companies exist — there are 1 person companies! But in a world where many tech companies have 50,000+ employees, 300 is much closer to 100 or 10 and they can all be considered small.
And then you consider it in context: a company with huge impact, brand recognition, and revenue (about $50M/employee in 2025). They’ve remained extremely small compared to how big they could grow.
hatthew
the implied observation is that valve is extremely small relative to what it does and how big most people would expect it to be
mindcrash
Proton is mainly a co-effort between in-house developers at Valve (with support on specific parts from contractors like Igalia), developers at CodeWeavers and the wider community.
For contextual, super specific, super specialized work (e.g. SCX-LAVD, the DirectX-to-Vulkan and OpenGL-to-Vulkan translation layers in Proton, and most of the graphics driver work required to make games run on the upcoming ARM based Steam Frame) they like to subcontract work to orgs like Igalia but that's about it.
everfrustrated
Valve is known to keep their employee count as low as possible. I would guess anything that can reasonably be contracted out is.
That said, something like this which is a fixed project, highly technical and requires a lot of domain expertise would make sense for _anybody_ to contract out.
treyd
They seem to be doing it through Igalia, which is a company based on specialized consulting for the Linux ecosystem, as opposed to hiring individual contractors. Your point still stands, but from my perspective this arrangement makes a lot of sense, and the Igalia employees have better job security than they would as individual contractors.
izacus
This is what "a company funding OSS" looks like in real life.
There have been demands on HN lately to do more of that. This is what it looks like when it happens: a company paying for OSS development.
wildzzz
It would be a large effort to stand up a department solely focused on Linux development, just like it would be to shift game developers to writing Linux code. Much easier to just pay a company to do the hard stuff for you. I'm sure the Steam Deck hardware was the same: Valve did the overall design and requirements, but another company did the actual hardware development.
redleader55
It's worth mentioning that sched_ext was developed at Meta. The schedulers are developed collaboratively by several companies, not just Meta or Valve or Igalia, and the development is done in a shared GitHub repo: https://github.com/sched-ext/scx.
999900000999
That's the magic of open source. Valve can't say ohh noes you need a deluxe enterprise license.
senfiaj
In this case yes, but on the other hand Red Hat won't publish the RHEL code unless you have the binaries. The GPLv2 license requires you to provide the source code only to those you provide the compiled binaries to. In theory Meta could apply its own proprietary patches to Linux and not publish the source code, as long as it runs that patched Linux only on its own servers.
dralley
RHEL source code is easily available to the public - via CentOS Stream.
For any individual RHEL package, you can find the source code with barely any effort. If you have a list of the exact versions of every package used in RHEL, you could compose it without that much effort by finding those packages in Stream. It's just not served up to you on a silver platter unless you're a paying customer. You have M package versions for N packages - all open source - and you have to figure out the correct construction for yourself.
cherryteastain
Can't anyone get a RHEL instance on their favorite cloud, dnf install whatever packages they want sources of, email Redhat to demand the sources, and shut down the instance?
dfedbeef
RHEL specifically makes it really annoying to see the source. You get a web view.
kstrauser
I'm more surprised that the scheduler made for a handheld gaming console is also demonstrably good for Facebook's servers.
giantrobot
Latency-aware scheduling is important in a lot of domains. Getting video frames or controller input delivered on a deadline is a similar problem to getting voice or video packets delivered on a deadline. Meanwhile housecleaning processes like log rotation can sort of happen whenever.
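The deadline framing here can be sketched as earliest-deadline-first (EDF) selection. This is a toy illustration of the idea, not how sched_ext schedulers are actually written (those run as BPF programs in the kernel); task tuples and names are made up:

```python
import heapq

def edf_schedule(tasks):
    """Earliest-deadline-first: run whichever released task's deadline is soonest.

    tasks: list of (release_time, deadline, duration, name) tuples.
    Returns the order in which tasks finish. Run-to-completion, no preemption.
    """
    tasks = sorted(tasks)          # by release time
    ready, order, now, i = [], [], 0, 0
    while i < len(tasks) or ready:
        # Admit everything released by the current time.
        while i < len(tasks) and tasks[i][0] <= now:
            rel, dl, dur, name = tasks[i]
            heapq.heappush(ready, (dl, dur, name))
            i += 1
        if not ready:              # idle until the next release
            now = tasks[i][0]
            continue
        dl, dur, name = heapq.heappop(ready)
        now += dur                 # run the most urgent task to completion
        order.append(name)
    return order

# A frame-delivery task with a 16 ms deadline runs before log rotation,
# even though both became runnable at the same time.
print(edf_schedule([(0, 100, 5, "logrotate"), (0, 16, 2, "frame")]))
# → ['frame', 'logrotate']
```

The housekeeping task still runs; it just yields the front of the queue to anything with a tighter deadline, which is the whole trick for both game frames and voice packets.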
bigyabai
I mean, part of it is that Linux's default scheduler is braindead by modern standards: https://en.wikipedia.org/wiki/Completely_Fair_Scheduler
3eb7988a1663
Part of that is the assumption that Amazon/Meta/Google all have dedicated engineers who should be doing nothing but tuning performance for 0.0001% efficiency gains. At the scale of millions of servers, those tweaks add up to real dollar savings, and I suspect little of how they run is stock.
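The scale argument is easy to make concrete with back-of-envelope numbers. All figures below are invented for illustration, not Meta's actual fleet size or costs:

```python
# Hypothetical figures, purely to show the shape of the argument.
servers = 2_000_000                   # assumed fleet size
cost_per_server_year = 1_500          # assumed USD: power + amortized hardware
efficiency_gain = 0.001               # a 0.1% fleet-wide improvement

savings = servers * cost_per_server_year * efficiency_gain
print(f"${savings:,.0f}/year")
```

Even a tenth of a percent pays for a team of scheduler engineers, which is why the hyperscalers run almost nothing stock.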
accelbred
CFS was replaced by EEVDF, no?
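For context, a simplified sketch of the difference: CFS picks the runnable task with the smallest virtual runtime, while EEVDF restricts the choice to tasks that are still owed CPU time ("eligible") and, among those, picks the earliest virtual deadline, which folds latency sensitivity into the pick. The field names and the eligibility rule below are illustrative, not the kernel's actual implementation:

```python
def cfs_pick(tasks):
    # CFS: the most "owed" task, i.e. smallest virtual runtime, runs next.
    return min(tasks, key=lambda t: t["vruntime"])

def eevdf_pick(tasks):
    # EEVDF (simplified): only tasks at or below the average vruntime are
    # eligible; among those, the earliest virtual deadline wins. A short
    # requested slice means a near deadline, so latency-critical tasks
    # jump ahead without starving anyone long-term.
    avg = sum(t["vruntime"] for t in tasks) / len(tasks)
    eligible = [t for t in tasks if t["vruntime"] <= avg]
    return min(eligible, key=lambda t: t["vruntime"] + t["slice"])

tasks = [
    {"name": "batch", "vruntime": 10, "slice": 100},  # long slice, latency-tolerant
    {"name": "frame", "vruntime": 12, "slice": 5},    # short slice, latency-critical
    {"name": "hog",   "vruntime": 40, "slice": 50},   # already over its fair share
]
print(cfs_pick(tasks)["name"])    # batch: smallest vruntime wins
print(eevdf_pick(tasks)["name"])  # frame: earliest virtual deadline among eligible
```

That deadline term is what the CFS heuristics never modeled directly, and it's the same knob the sched_ext schedulers get to experiment with freely.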
ranger207
A lot of scheduler experimentation has been enabled by sched_ext: https://lwn.net/Articles/922405/
jorvi
I mean.. many SteamOS flavors (and Linux distros in general) have switched to Meta's Kyber IO scheduler to fix microstutter issues.. the knife cuts both ways :)
bronson
Kyber is an I/O scheduler. Nothing to do with this article.
Brian_K_White
The comment was perfectly valid and topical and applicable. It doesn't matter what kind of improvement Meta supplied that everyone else took up. It could have been better cache invalidation or better usb mouse support.
HexPhantom
Exactly. Once the work is upstream and open, it stops being "Valve's thing" and just becomes part of the commons
sintax
Well if you think about it, in this case the license is the 30% cut on every game you purchase on steam.
Sparkyte
I've been using Bazzite Desktop for 4 months now and it has been my everything. Windows is just abandonware now even with every update they push. It is clunky and hard to manage.
aucisson_masque
Isn't bazzite a gaming focused distribution ? It seems weird to install it on a PC that does 'my everything'.
I wouldn't make excel spreadsheet on the steam deck for instance.
0x1ch
Bazzite is advertised for gamers, but from my understanding it's just Fedora Atomic wrapped up to work well on Steam Deck-adjacent hardware, with gaming as a top priority. You'd still get the same level of quality you'd expect from Fedora/RHEL (I would think).
Sparkyte
Precisely, I like its commitment to Fedora Atomic. Fedora is, in my opinion, the best user-experience Linux out there - and not just because Linus Torvalds said it was his favorite. Probably not the best for servers or as a base for a console OS, but as a daily driver, consistency is more important. Keeping things in Flatpaks makes it easy to manage what's installed, too.
pawelduda
Why not? It has full desktop mode with Plasma and can be docked like PC
Sparkyte
Gaming or not, stability is important. An OS that focuses on gaming will typically focus on stability: neither bleeding edge nor lagging behind in support. It has to update enough to work with certain games, and stay behind enough to avoid weird support issues.
So Bazzite in my opinion is probably one of the best user experience flavors of Fedora around.
Yes you can do more than gaming on Bazzite.
hinkley
I think you’ve forgotten or aren’t aware that before 3d graphics cards took over, people would buy new video cards to ostensibly make excel faster but then use them to play video games. It was an interesting time with interesting justifications for buying upgrades.
tra3
I'm curious how this came to be:
> Meta has found that the scheduler can actually adapt and work very well on the hyperscaler's large servers.
I'm not at all in the know about this, so it would not even occur to me to test it. Is it the case that if you're optimizing Linux performance you'd just try whatever is available?
laweijfmvo
almost certainly bottom-up: some eng somewhere read about it, ran a test, saw positive results, and it bubbles up from there. this is still how lots of cool things happen at big companies like Meta.
balls187
How well does Linux handle game streaming? I'm just now getting into it, and now that Windows 10 is dead, I want to move my desktop PC over to Linux and formally end my relationship with Microsoft.
Kholin
It works well. I've used Sunshine as the stream server and Moonlight as the client to play games on my Steam Deck; my PC runs openSUSE Tumbleweed with KDE Plasma. There may be some key-binding issues, but they can be solved with a little setup.
tayo42
Interesting to see server workloads take ideas from other areas. I saw recently that some of the k8s-specific OSes do their updates like Android devices.
ahartmetz
I keep being puzzled by the unwillingness of developers to deal with scheduling issues. Many developers avoid optimization, almost all avoid scheduling. There are some pretty interesting algorithms and data structures in that space, and doing it well almost always improves user experience. Often it even decreases total wall-clock time for a given set of tasks.
HexPhantom
Something built to shave off latency on a handheld gaming device ends up scaling to hyperscale servers, not because anyone planned it that way, but because the abstraction was done right
erichocean
Omarchy should adopt the SCX-LAVD scheduler as its default; it helps conserve power on laptops.
shmerl
Can't find scxctl in Debian. Was it never packaged?
Valve is practically singlehandedly dragging the Linux ecosystem forward in areas that nobody else wanted to touch.
They needed Windows games to run on Linux, so we got massive Proton/Wine advancements. They needed better display output for the Deck, and we got HDR and VRR support in Wayland. They needed smoother frame pacing, and we got a scheduler that Zuck is now using to run data centers.
It's funny to think that Meta's server efficiency is being improved because Valve paid Igalia to make Elden Ring stutter less on a portable Linux PC. This is the best kind of open source trickle-down.