fxtentacle
Related to your answer, I would say the reason is that it works well enough for now and can always be patched later. Back in the good old days we remember, software was frozen on a gold master disc, which was then tested for weeks or months before its public release. The fact that bugs could not easily be fixed in the field meant they would incur support costs, or lost revenue from people returning their purchased software box.
In my opinion, that is the true reason the old native software was developed to such a high standard. Once online stores and shrink-wrap agreements made it impossible to return buggy software, the financial incentives shifted towards shipping a partially broken product.
Who cares about pleasing customers with good performance when you can keep them hostage instead?
voidnap
Your examples of engines are less about "it works" and more that it does a thing we couldn't do before, and does it better than the previous thing. Neither of those is especially true of React.
React was an instant hit because it had the Facebook brand behind it and everyone was tired of Angular. But ultimately, React has worse outcomes for developers, users, and businesses. On the web, React websites are bloated: they run slower, their JavaScript payloads are larger, and they take longer to load.
Your suggestion -- that it works and then gets more efficient later -- would make sense if we lived in a world where React moved off the virtual DOM model. A virtual DOM is a fine first attempt or prototype, but we can do better. We know how. Projects like SolidJS do do better. React has not caught up, yet it is still very popular. This whole "It worked badly, but it worked. Later came efficiency" thing is complete nonsense.
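To make the distinction concrete, here's a toy sketch (illustrative only, not SolidJS's real implementation): with fine-grained signals, only the computations that actually read a changed value re-run, whereas a virtual-DOM library re-runs the whole render function and diffs its output on every change.

```typescript
// Minimal signal/effect sketch. A signal records which effects read it,
// and on write it re-runs only those readers.
let currentEffect: (() => void) | null = null;

function signal<T>(value: T) {
  const subscribers = new Set<() => void>();
  return {
    get(): T {
      if (currentEffect) subscribers.add(currentEffect); // track the reader
      return value;
    },
    set(next: T) {
      value = next;
      subscribers.forEach((fn) => fn()); // re-run only the readers
    },
  };
}

function effect(fn: () => void) {
  currentEffect = fn;
  fn(); // first run registers the dependencies it reads
  currentEffect = null;
}

const name = signal("Ada");
const age = signal(36);

let nameRenders = 0;
let ageRenders = 0;
effect(() => { name.get(); nameRenders++; });
effect(() => { age.get(); ageRenders++; });

age.set(37); // only the age effect re-runs; a vdom would re-render both
```

After `age.set(37)`, the name effect has still run only once, which is the whole point: no diffing pass over untouched UI.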
And there are loads of businesses that started off with an angular app, started to migrate to react, then started to migrate to react hooks, now switching to whatever the latest methodology is. Time and again you find these products, always endlessly migrating to the new thing, most of them never finishing a migration before beginning a new one. So these products end up being a chimera of four different frameworks held together with pain.
This isn't a good outcome for businesses, or for users, and it's not a good developer experience. React is stagnant, surviving off being the default, the status quo, supported by tech companies that have long since stopped innovating and subsist on rent-seeking. Developers choose React because nobody was ever fired for buying IBM, because they can look busy at their job, and because they buy a new phone and laptop every year with the latest hardware to compensate for the deteriorating software they ship.
Ygg2
> React was an instant hit because it had the facebook brand behind it and everyone was tired of angular.
Ok, but why was everyone tired of Angular? Sure, web frameworks are examples of Fad Driven Development to the extreme, but AngularJS was pure, unmitigated ARSE.
Made ten two-way bindings on a page? That's 100 cross-connections. Made 100? That's 10,000.
Click one way through fields A and B, then start typing: they show the same data. Click through A and C: now those two are bound, but B isn't. Click B, then C: congrats, all three of your bindings suddenly start filling in.
It was a combination of shitty performance scaling and unintuitive Angular data flow that primed everyone for React to take over.
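To make that scaling concrete, here's a toy sketch of AngularJS-style dirty checking (illustrative, not the real `$digest` implementation): every pass re-evaluates every watcher, and any change forces another full pass, so one edit to a chain of bindings costs on the order of n² checks.

```typescript
// Toy dirty-checking loop in the spirit of AngularJS's digest cycle.
type Watcher = { get: () => unknown; last: unknown; onChange: () => void };

const watchers: Watcher[] = [];
let checks = 0; // count individual watcher evaluations

function watch(get: () => unknown, onChange: () => void) {
  watchers.push({ get, last: get(), onChange });
}

function digest() {
  let dirty = true;
  while (dirty) {          // keep looping until a fully clean pass
    dirty = false;
    for (const w of watchers) {
      checks++;
      const v = w.get();
      if (v !== w.last) {
        w.last = v;
        w.onChange();      // may invalidate other watchers
        dirty = true;
      }
    }
  }
}

// A chain of n bound fields, each copying from its neighbour. Registered in
// worst-case order, each digest pass propagates the change across one link,
// so a single edit takes ~n passes of n-1 checks each.
const n = 10;
const model: number[] = new Array(n).fill(0);
for (let i = n - 1; i >= 1; i--) {
  watch(() => model[i - 1], () => { model[i] = model[i - 1]; });
}

model[0] = 1;
digest();
// checks is now 90: ten bound fields, ~100 watcher evaluations for one edit
```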
JKCalhoun
I would prefer the old-school approach of wrapping the "Claude bits" in a per-platform framework (or, if you really can't be bothered, a platform-specific command-line tool that could then be called natively from the program).
The UI wrapper then could be Electron, or something a little more platform-native you hand off to some junior engineers.
patrick451
> The first thing you need when you make something new is making it work, it is much better that it works badly than having something not working at all.
It is better for something to not exist than for a shitty version to exist. Software doesn't get better over time, it gets worse. If you make a bad, suboptimal choice today chances are that solution becomes permanent. It's telling that all of your examples of increasing efficiency are not software.
If you aren't going to do it well, don't do it.
corroclaro
An obscure Java error? In a Clojure editor made in Java?
Are you sure it wasn't just unfamiliarity with Java errors in general?
Clojure popped out of the _senior_ Java camp. It often lives within that mindshare.
observationist
They could have done better. They chose the path of least resistance, putting in the least amount of effort, spending the least amount of resources into accomplishing a task.
There's nothing "good" about electron. Hell, there are even easier ways of getting high performance cross platform software out there. Electron was used because it's a default, defacto choice that nobody bothered with even researching or testing if it was the right choice, or even a good choice.
"It just works". A rabid raccoon mashing its face on a keyboard could plausibly produce a shippable electron app. Vibe-bandit development. (This is not a selling point.) People claiming to be software developers should aim to do better.
Gooblebrai
> They could have done better. They chose the path of least resistance, putting in the least amount of effort, spending the least amount of resources into accomplishing a task
You might as well tell reality to do better: the reality of physics (water flows downhill, electricity moves through the best conductor, systems settle where the least energy is required) and the reality of business (companies naturally move toward solutions that cost less time, less money, and less effort).
I personally think that some battles require playing within the rules of the game, not wishing for new rules. Make something that requires less effort and fewer resources than Electron but is good enough, and people will be more likely to use it.
observationist
Shaming the use of Electron? I'll do that every day and twice on Sunday. Same with nonsense websites that waste gigabytes on bloat, spam users with ads, and feed the adtech beast. And I'll lay credit for this monument to enshittification we call the internet at the feet of Google, Facebook, and Microsoft.
Using Electron and doing things shittily is a choice. If you're ever presented with a choice between doing something well and not, do the best you can. Electron is never the best choice. It's not even the easiest or most efficient choice. It's the lazy, zero-effort, default, ad-hoc choice, picked by someone who really should know better but couldn't be bothered to even try.
3oil3
I agree with you; I even think it's shameful. When I saw it was Electron, I sighed so long I almost choked. Can't even cmd+g or shift+cmd+f to search, the context menu has nothing. Can't even swipe, no gestures, etc. Electron is better than nothing, and I'm grateful, but it tastes bitter. As for performance: if I remember correctly, somebody once asked here "what's the point of 512GB RAM on the Mac Studio?" And someone replied "so you can run two Electron apps".
pjmlp
Nah, some developers are lazy, that is all; let's not beat around the bush with that one.
Most of those Electron folks would not manage to even write C applications on an Amiga, use Delphi, VB, or whatever.
Educated on Node, they do not know anything else.
Even doing a TUI seems like a revelation to current generations, something quite mundane and quite common in 1980s text-based computing: Turbo Vision, Clipper, curses.
hedgehog
Let's assume for the moment the developers are doing about the best they can for the goals they're given. The slowness is a product decision, not an engineering problem. They are swimming in cash and could write a check to solve the problem if they cared. Evidence is they have other priorities.
pjmlp
Caring, now that is an interesting word. It is clear they don't care about their customers' hardware or user experience, only about themselves.
hedgehog
To be fair to all involved I'm committing a bit of a category error here, a company of any size is not so much a coherent entity as a flock of individuals with their own motivations. As the popular saying goes: the company ships its org chart.
zadikian
At least it seems like a lot more apps are cross-platform than before. I wouldn't call the native devs lazy for not making a Mac version of their Windows app.
pjmlp
Agreed, yet back in the day we managed to do that even with applications written in assembly, in some cases.
Uphill both ways, should be easy for a company doing C compilers with LLMs.
codebje
Early games frequently took the approach of inventing an interpreted machine code in which the bulk of the game would be written, with an assembly interpreter that would need to be rewritten for each target ISA and modified for the specific peripheral mix of each target machine.
The approach runs slower than a game written directly in assembly, but the cost to port to different architectures is much lower.
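A toy sketch of the idea (opcodes invented for illustration, in the spirit of SCUMM or the Z-machine): the bytecode stays identical across platforms, and only the small interpreter loop needs porting.

```typescript
// Tiny stack-machine interpreter. Game logic is compiled once to this
// bytecode; only run() would be rewritten per target machine.
const PUSH = 0, ADD = 1, PRINT = 2, HALT = 3;

function run(program: number[], out: number[] = []): number[] {
  const stack: number[] = [];
  let pc = 0; // program counter
  while (true) {
    const op = program[pc++];
    switch (op) {
      case PUSH:  stack.push(program[pc++]); break;
      case ADD:   stack.push((stack.pop() as number) + (stack.pop() as number)); break;
      case PRINT: out.push(stack.pop() as number); break; // a real VM would call a
      case HALT:  return out;                             // per-platform display routine
    }
  }
}

// "Game" bytecode: compute 2 + 3 and display it.
run([PUSH, 2, PUSH, 3, ADD, PRINT, HALT]); // → [5]
```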
Sort of like Electron trades off native performance and look-and-feel to make multi-platform apps much more achievable.
IMO the OS vendors failed everyone by refusing to even attempt to agree on a common API for UI development, paving the way for web browsers to become the real OS and, ultimately, embedded browsers to be the affordable and practical way to be cross platform.
carefree-bob
Honestly I'm grateful to apps that otherwise wouldn't be available outside of windows.
reactordev
They wrote a React TUI renderer, that’s what they did. Shame…
I understand why, but there is such beauty in the simplicity of ansi.
port11
I think your view is lazy. The article explains some of the actual reasons, none of which have to do with laziness.
I’ve built for Electron and did a course on Swift for macOS apps. Not out of laziness, but I don’t think I’d ever build native for Macs. And the Windows folk have been complaining for a long time about the native APIs.
Now, native on mobile, that’s something else. I’ve been stuck on RN/Expo because that’s what the resources the business had allowed for, but native Kotlin is much more enjoyable (AFAIK). Swift… dunno, still icky.
JKCalhoun
FTA:
"Looks could be good, but they also can be bad, and then you are stuck with platform-consistent, but generally bad UI (Liquid Glass ahem)."
Since the discussion was specifically about platform-consistency, odd that the author would decide that personal taste might take priority over platform-consistency.
"It changes too often, too: the app you made today will look out of place next year, when Apple decides to change look and feel yet again."
Seems to be arguing the exact opposite? If you adopt a native API for controls, windows, etc., your app will change next year to look completely in place (perhaps for better or worse according to the author).
Go DIY on UI and your app might still, in 2026, have brushed aluminum and "lickable" buttons.
port11
I read it as criticism on how the native APIs don’t give you a sure-fire way of getting to a consistent native UI. Case in point is Apple’s own software, with some of its apps looking completely out of place despite being built on native UI.
sehugg
Realize, though, that just grabbing a frame buffer is not a thing anymore. To render graphics you need GLES support through something like ANGLE, vectors and fonts via Skia, Unicode, etc. A web browser has those things. Any static binary bundling those things is also gonna be pretty large.
And JavaScript is very good at backwards compatibility once you remove the churn of frameworks (unfortunately, Electron doesn't guarantee compatibility quite as far back).
pjmlp
And CPUs are only sand powered by electricity.
I do realise the need for abstractions and they do exist, provided there is actually the interest to learn them.
senadir
Do you think that writing Delphi, a language no one is buying, makes you hardworking?
pjmlp
If no one would be buying Delphi, Embarcadero would not be in business, yet here they are.
https://www.embarcadero.com/products/delphi
One of the related conferences just took place last October,
Ygg2
There is a difference between being in business and thriving.
I've been in a few companies that managed to eke out a living by maintaining a piece of software no one in their sane mind would still maintain. Sometimes a government gig, sometimes the private sector.
Shoutout to my boys that in 2018 maintained a Java 1.3 app. Still going strong to this day (it was migrated to Java 8 last time I checked).
EDIT: ~21 Delphi apps in the world! Woohoo! Delphi number #525
sharts
Facts.
The sad reality is everyone wanting to have fancy looking pages/apps as quickly and easily as possible.
And now the web (and increasingly the desktop) is littered with the lowest common denominator of platforms, with all sorts of crazy optimizations that still can't be as snappy as a Windows 95 app on a 200 MHz / 16 MB desktop.
At this point we may as well just use electron and nodejs for fighter jets and missile defense systems. Surely it’s fast enough for that, too.
Ygg2
What's the alternative for cross-platform GUIs? Qt? Swing? Avalonia?
Electron is bad, but it's the least bad of all other cross-platform GUIs.
> Most of those Electron folks would not manage to even write C applications on an Amiga, use Delphi, VB, or whatever.
That's true of most people on the planet. No one has access to them.
hresvelgr
Something that isn't touched on as much is that in the time between old-school native apps and Electron apps, design systems and brand languages have become much more prevalent, and implementing native UI often means compromising design and brand elements. Most applications used to look more or less the same; nowadays two apps on the same computer can look completely different. No one wants to compromise on design.
This mentality creates a worse experience for end users, because every application has its own conventions and no one wants to be dictated to about what good UX is. The best UX in every single instance I've encountered is consistency. Sure, some old UIs were obtuse (90% weren't), but they were obtuse in predictable ways that someone could reasonably navigate. The argument here is between platform consistency and application consistency: should all apps on the platform look the same, or should the app look the same on all platforms?
edit: grammar
pavlov
If I look at the Notion and Linear desktop apps, they’re essentially identical in styling and design. They’re often considered the best of today’s web/Electron productivity apps, and they have converged on a style that’s basically what Apple had five years ago.
IMO that’s a fairly strong argument that the branding was always unnecessary, and apps would have been better off built from a common set of UI components following uniform human interface guidelines.
lelandfe
I do notice those things occupying your "essentially," and your "basically." The success of worse designed stuff is a hard thing to argue against, though.
zigzag312
> The best UX in every single instance I've encountered is consistency.
While I agree that consistency is hugely important, I have also seen a lot of cases where it made the UX worse. The reason is that, unfortunately, UX isn't so simple. There isn't a single UX rule that is always true. UX design rules (best practices, guidelines, principles) are a good starting point, but in a lot of situations multiple rules conflict with each other. UI/UX design is about tradeoffs most of the time. A good designer will know when breaking a specific rule will actually improve the UX.
Consistency is very important, but sometimes a custom UI element will be the best tool for the job. For example, imagine UI for seat selection in a movie theater ticket booking app. A consistent design would mean using standard controls users are already familiar with, but no standard control will provide high quality UX in this situation (not without heavy modifications).
But I still agree with you that a lot of bad UX is due to inconsistency. There needs to be a good reason each time consistency is broken, and often it is broken for the wrong reasons.
ttd
Some random thoughts, since I've had a similar train of thought for a while now.
On one hand I also lament the amount of hardware-potential wastage that occurs with deep stacks of abstractions. On the other hand, I've evolved my perspective into feeling that the medium doesn't really matter as much as the result... and most software is about achieving a result. I still take personal joy in writing what I think is well-crafted code, and I also accept that that may become more niche as time goes on.
To me this shift from software-as-craft to software-as-bulk-product has some similarities to the "pets vs cattle" mindset change when thinking about server / process orchestration and provisioning.
Then there's also the dismay at JS becoming even more entrenched as the lingua franca. There's every possibility that in a software-as-bulk-product world, LLM-driven development could land on a safer language due to efficiency gains from e.g. static type checking. Economically, I wonder if adoption of a different lingua franca could manifest by way of increasing LLM development speed/throughput.
usrnm
> LLM-driven development could land on a safer language
Why does an LLM need to produce human readable code at all? Especially in a language optimized around preventing humans from making human mistakes. For now, sure, we're in the transitional period, but in the long run? Why?
jerf
From my post at https://jerf.org/iri/post/2026/what_value_code_in_ai_era/ , in a footnote:
"It has been lost in AI money-grabbing frenzy but a few years ago we were talking a lot about AIs being “legible”, that they could explain their actions in human-comprehensible terms. “Running code we can examine” is the highest grade of legibility any AI system has produced to date. We should not give that away.
"We will, of course. The Number Must Go Up. We aren’t very good at this sort of thinking.
"But we shouldn’t."
brewtide
Once again, communication remains key.
mjr00
Because the traits that make code easy for LLMs to work on are the same ones that make it ideal for humans: predictable patterns, clearly named functions and variables, one canonical way to accomplish a task, logical separation of concerns, clear separation of layers of abstraction, etc. Ultimately, human readability costs very little.
mandevil
I can't even imagine what "next token prediction" would look like generating x86 asm. Feels like 300 buffer overflows wearing a trench-coat, honestly.
zadikian
It'd just run out of tokens
recursive
So humans can verify that the code is behaving in the interests of humanity.
ianbicking
In a sense they do use their own language; they program in tokenized source, not ASCII source. And maybe that's just a form of syntactic sugar, like replacing >= with ≥ but x100. Or... maybe it's more than that? The tokenization and the models coevolve, from my understanding.
If we do enough passes of synthetic or goal-based training of source code generation, where the models are trained to successfully implement things instead of imitating success, then we may see new programming paradigms emerge that were not present in any training data. The "new language" would probably not be a programming language (because we train on generating source FOR a language, not giving it the freedom to generate languages), but could be new patterns within languages.
davorak
> For now, sure, we're in the transitional period, but in the long run? Why?
Assume that after the transitional period it will still be humans working with AI tools to build things, with humans actually adding value to the process. Will the human+AI pairing where the AI can explain what it built in detail, and the human leverages that to build something better, be more productive than the pairing where the human does not leverage those details?
That 'explanation' will be, or can act as, the human-readable code or its equivalent. It does not need to be any coding language we know today, however. The languages we have today are already abstractions and generalizations over architectures, OSs, etc., and that 'explanation' will be different but in the same vein.
IncreasePosts
For one thing, because it would be trained on human readable code.
ttd
Well, IMO there's not much reason for an LLM to be trained to produce machine language, nor a functional binary blob appearing fully-formed from its head.
If you take your question and look into the future, you might consider the existence of an LLM specifically trained to take high-level language inputs and produce machine code. Well, we already have that technology: we call it a compiler. Compilers exist, are (frequently) deterministic, and are generally exceedingly good at their job. Leaving this behind in favor of a complete English -> binary blob black box doesn't make much sense to me, logically or economically.
I also think there is utility in humans being able to read the generated output. At the end of the day, we're the conscious ones here, we're the ones operating in meatspace, and we're driving the goals, outputs, etc. Reading and understanding the building blocks of what's driving our lives feels like a good thing to me. (I don't have many well-articulated thoughts about the concept of singularity, so I leave that to others to contemplate.)
il-b
Somehow, a CAD program, a 3D editor, a video editor, broadcasting software, a circuit simulation package, etc are all native applications with thousands of features each - yet native development has nothing to offer?
rapnie
Besides going full native, a Tauri [0] app might have been another good alternative given they already use Rust. There are pros and cons to that choice, of course, and perhaps Tauri was considered and not chosen. Tauri plus Extism [1] would have been interesting, enabling polyglot plugin development via wasm. For Extism see also the list of known implementations [2].
TimFogarty
I have been using Tauri for a macOS app I'm making[1] and it has been great. The app is only 11MB and I've had most of the APIs I'd need.
However, there are still some rough edges that have been annoying to work with. I think for my next project I will actually go back to electron. There are two issues that caused me pain:
1. I can't use Playwright to run e2e tests on the tauri app itself. That's because the webview doesn't expose the Chrome DevTools Protocol, and the tauri-driver [2] does not work on MacOS.
2. Security Scoped Resources aren't fully implemented which means if a user gets the app through the app store the app won't be able to remember file permissions between runs [3]. It's not too much of an issue since I probably won't release it on the app store, but still annoying.
But I hope Tauri continues to grow and we start seeing apps use it more.
moogly
I have also used Tauri for one of my private apps, and using the OS's webview just doesn't work for me, so for my next stuff I'm probably going to use Electron as well since you can embed the webview. Yeah, it's bloated, but I'm so tired of things not working properly on Wayland without disabling this and that with random env vars and not able to do a fully OOTB single portable AppImage build on Linux. I can either make it work in Kubuntu + Arch (building on Ubuntu), or Arch + Fedora (building on Arch), but not all 3.
I tried Uno Platform and AvaloniaUI last year, but I had similar problems there with external drag 'n' drop not working on Wayland, plus the difficulty of writing your own advanced components, of which there are oodles to choose from when using React/Vue/Solid/Svelte.
I'm not rewriting that other app in Electron, so for Tauri (the development of which largely seems to have stalled?) I'm hoping this[1] will solve my Linux hurdles. Going to try that branch out.
[1]: https://github.com/tauri-apps/tauri/pull/12491
And this is just desktop Linux. I used to care about Windows but stopped building for that.
adisinghyc
Such a real problem; I tested the webdriver myself. There really should be something to automate e2e tests via an MCP for Tauri as well.
Joeboy
I find it a bit odd how much people talk up the Rust aspect of Tauri. For most cases you'll be writing a Typescript frontend and relying on boilerplate Rust + plugins for the backend. And I'd think most of the target audience would see that as a good thing.
francisl
I'm working on a project using Tauri with htmx. I know, a bit uncommon. But the backend part uses axum and htmx. No JS/TS UI. It's fast, reliable, and it works well. Plus it's easy to share/reuse the lib with the server/web.
rapnie
I am considering a Tauri app, but still wondering about architecture design choices, which the docs are sparse about. For instance, the web side may constitute a more full-blown webapp, say NextJS, and include the database persistence, say SQLite-based, on the web side too, closest to the webapp. That goes against the sandboxing (and likely against best practice), where all platform-related side effects are handled platform-side, implemented in Rust code. I wonder if it is a valid choice. There is a trade-off between ease of use and straightforwardness vs. stricter sandboxing.
jemmyw
At least with Tauri it's easy to both make the choice and change it later if you want to. I think the docs are sparse because it's your decision to make. I've done it both ways and there are pros and cons. If you use the sqlite plugin and write the actual statements on the JS side then you don't need to worry about the JS<->Rust interface and sharing types. Easier to just get going. If you write your own interface then you probably want to generate TS types from Rust. I think a big advantage to the Rust interface way is that it makes it easier to have the web side be dual purpose with the same code running on the web and in Tauri - the only difference being whether it invokes a tauri call or an API call.
arjie
I built a vibe-coded personal LLM client using Tauri, and if I'm being honest the result was much worse than either Electron or just ditching it and going full ratatui. LLMs do well when you can supply them a verification loop, and Tauri just doesn't have the primitives to expose one. For my personal tools, I'm very happy with ratatui or non-TUI CLIs in Rust, but for GUIs I wouldn't use it. Just not a good dev experience.
headcanon
+1 for Tauri, I've been using it for my recent vibe-coded experimental apps. Making rust the "center of gravity" for the app lets me use the best of all worlds:
- declarative-ish UI in typescript with react
- rust backend for performance-sensitive operations
- I can run a python sidecar, bundled with the app, that lets me use python libraries if I need it
If I can and it makes sense to, I'll pull functionality into rust progressively, but this give me a ton of flexibility and lets me use the best parts of each language/platform.
It's fast too, and doesn't use a ton of memory like Electron apps do.
EduardoBautista
Also, Rust's strong, strict type system keeps Claude honest. It seems as if the big LLMs have been trained on a lot of poorly written TypeScript, because they tend to use type assertions such as `as any` and eslint-disable comments.
I had to add strict ESLint and TypeScript rules to keep guardrails on the coding agents.
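For illustration, the guardrails were along these lines (these are real compiler options, but the selection is a hypothetical excerpt, not my exact config):

```jsonc
// tsconfig.json (excerpt): "strict" turns on noImplicitAny and friends,
// which blocks silently-untyped code slipping through
{
  "compilerOptions": {
    "strict": true,
    "noUncheckedIndexedAccess": true
  }
}
```

On the ESLint side, `@typescript-eslint/no-explicit-any` set to "error" flags `as any`, and `eslint-comments/no-use` (from eslint-plugin-eslint-comments) bans disable comments outright, so the agent has to actually fix the type error.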
rapnie
I added a list of known Extism implementers to my comment above, to take inspiration from should Extism be attractive to consider for you.
rothific
My team is building a cross platform app with Tauri that is mobile, web, and desktop in one codebase and we've had almost nothing bad to say. It's been great. Also the executable size and security are amazing. Rust is nice. Haven't done as much with it yet but it will come in useful soon as we plan to implement on-device AI models that are faster in Rust than WebGPU.
oooyay
I use something similar to Tauri called Wails: https://wails.io/ that's Go-based.
tvink
Looks cool, but the phrase 'build applications with the flexibility and power of go' made me chuckle. Least damn flexible language in this whole space.
YmiYugy
This might be a "the grass is greener on the other side" situation, because I do a lot more web than native dev, but in my experience native, while just as quirky as web, will usually give you low-level APIs to work around design flaws. On the web it too often feels like you can either accept a slightly janky result or throw everything away and use canvas or WebGL. Here are some recent examples I stumbled across:
- try putting a semi-transparent element on part of an image with rounded corners, and you will observe unfixable anti-aliasing issues in those corners
- try animating an input together with the on-screen keyboard
- try doing a JS-driven animation in a real app (staying off the main thread feels hopeless, and Houdini animation worklets never materialized)
I don't think it's that native has nothing to offer. I think that developing (in the case of desktop) for 3 different platforms, each with its own complication of what native UI is, is a nightmare. macOS has SwiftUI (incomplete), UIKit, and AppKit; Linux in practice GTK/Qt; Windows WinUI 3 (fundamentally broken), with WPF and WinForms still hanging around.
rayiner
> I think that developing (in the case of desktop) for 3 different platforms, each with its own complication of what native UI is, is a nightmare. macOS has SwiftUI (incomplete), UIKit, and AppKit; Linux in practice GTK/Qt; Windows WinUI 3 (fundamentally broken), with WPF and WinForms still hanging around.
Wouldn’t it be a good use of AI to port the same app to several native platforms?
YmiYugy
Yes, it would, but depending on the app it could put you in a ton of hurt:
- AI has gotten a lot better on less popular tech, but there is still a big capability gap between native frameworks and the blessed React + Tailwind stack
- you will get something that is likely in the right shape but littered with a million subtle bugs, and fixing them without intimate knowledge of the platform is really hard
odiroot
I'd still take native KDE/Plasma apps over Electron any day. Just the performance and memory usage alone is worth it.
Sublime Text feels so much snappier than VSCode, for another example. And I can leave it running for weeks without it leaking memory.
etothet
"The real reason is: native has nothing to offer."
I get it, but this is a bit dramatic.
One of the biggest challenges I've found with using non-native tools (and specifically the various frameworks that let you write JavaScript that compiles to native code) is that there is much less of a guarantee that the third-party solution will continue to support new OS versions. There's much less of that risk with first-party solutions.
Additionally, those third parties are always chasing the first-party vendor for features. Being far behind the regular cadence of releases can be quite inconvenient, despite any advantages initially gained.
iamsaitam
The meaning of native in these discussions is "no web technologies", because Qt gets thrown around, and that's about as native to macOS as Electron, just in a different manner.
zozbot234
That's the exact same way AAA games are native, which well, they are. As the article itself makes clear, the OS-default toolkit doesn't really have a privileged status on macOS today, any more than it does on Windows or many Linux distros.
andyjohnson0
I felt that this article didn't provide strong justifications for some of its assertions.
> Native APIs are terrible to use, and OS vendors use everything in their power to make you not want to develop native apps for their platform.
Disagree. I'm most familiar with Windows and Android, but native apps on those platforms, and also on Mac, look pretty good when using the default tools and libraries. Yes, it's possible to use (say) Material Design and other UX-overkill approaches in native apps, but that's a choice, just like it is for web apps.
And OS vendors are very much incentivised to make native development as easy and painless as possible, because lock-in.
> That explains the rise of Electron before LLM times,
Disagree. The "rise of Electron" is due to the economics of skill-set convergence on JS, the ubiquity of the JS/HTML/CSS/Node stack platform, and many junior developers knowing little or nothing else.
As for the rest: minor variations in traffic light positioning and corner radii are topical but hardly indicators of decaying platforms.
bloomca
The rise of Electron was purely because you can share the codebase for real with the web app (for lots of apps it is their main focus) and get cross-platform support for free.
Native apps are not bad to develop when using Swift or C#, they are nice to use and their UI frameworks are fine, it's just that it requires a separate team. With Electron you need much less, simple as that.
> As for the rest: minor variations in traffic light positioning and corner radii are topical but hardly indicators of decaying platforms.
I think it shows how important the platform itself is to the company. The System Settings app on macOS is literally slow to change sections (the detail page updates ~500ms after clicking).
I personally love to develop desktop apps but business-wise they rarely make sense these days.
bdangubic
> Disagree. The "rise of Electron" is due to the ubiquity of the JS/HTML/CSS/Node stack, and many junior developers knowing nothing else.
with all due respect - hard disagree. In what place on Earth do Junior Devs make these types of decisions?? Or decision makers going “we got these Juniors that know JS so it is what it is…”
nitwit005
I don't believe they were implying they would make the decision. It's expensive to have your team learn new skills from scratch, and management won't want to pay for that if they don't have to.
andyjohnson0
This is indeed what I meant. Thanks for stating it with more clarity than I was able to.
bdangubic
I have been coding for 30 years now and I have never encountered a technical decision like choosing a technology (e.g. Electron) for anything important to the company being made with "oh, we must use X because so-and-so knows X".
Maybe if there was a toss-up between X and Y or something like that, but to flat-out pick Electron because you have people that know JS is madness.
eviks
While that's not what the author meant - in all places on Earth where these people grow up and become powerful enough to make those decisions (but also before that, in their own little apps).
lapcat
It's weird for the author to mention Mac window buttons and corner radius as reasons to use Electron, because while the main content of Electron app windows is HTML, the Electron windows themselves and the window chrome are native, with the same buttons and corner radius as other apps on the system.
Electron is a native wrapper for web content. The wrapper is still native.
> Native APIs are terrible to use, and OS vendors use everything in their power to make you not want to develop native apps for their platform.
I'm honestly not quite sure what the author means here.
Web APIs are equally “terrible” in my opinion. In any case, you have to release an Electron app on Mac the same way you release any native app on Mac. The benefit of using web APIs is not that they are non-terrible but that you can share the same code as your website. And of course you can more easily find web developers than native developers. But that has nothing to do with whether or not the API is terrible. It’s just supply and demand.
I’ll take AppKit and autolayout any day over CSS, ugh. CSS is the worst.
lxgr
> with the same buttons and corner radius as other apps on the system
I just checked: No, the corner radius is different. I'm personally not very bothered by that, but it's just empirically true.
> Electron is a native wrapper for web content. The wrapper is still native.
In my view, the problem isn't that it's a wrapper, but rather that it's a bad wrapper of a bad runtime (i.e. the incredibly bloated JS/web stack).
zadikian
UIKit etc. never made sense to me even after years, and CSS also didn't make sense, but right out of the box I understood React. And with hooks, it's way less boilerplate than the UIKit ways.
Separate from that, Apple doesn't seem to mind breaking native macOS apps, to the point where most devs treat native code like a liability on Mac but ok on Windows.
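For context on the hooks model mentioned above, it can be sketched as a toy. This is not React's real implementation, just the call-order mental model: state lives outside the component function and is looked up by the order in which hooks are called.

```javascript
// Toy model of a useState-style hook. State persists across renders
// in an array, indexed by hook call order within the component.
let state = [];
let cursor = 0;

function useState(initial) {
  const i = cursor++;
  if (state[i] === undefined) state[i] = initial;
  const setState = (v) => { state[i] = v; };
  return [state[i], setState];
}

// A "component" is just a function; each render resets the cursor.
function render(component) {
  cursor = 0;
  return component();
}

// Example component using two hooks.
function Counter() {
  const [count, setCount] = useState(0);
  const [label] = useState("clicks");
  return { text: `${label}: ${count}`, increment: () => setCount(count + 1) };
}

let ui = render(Counter);
console.log(ui.text); // "clicks: 0"
ui.increment();
ui = render(Counter);
console.log(ui.text); // "clicks: 1"
```

Because state is indexed by call order, this also illustrates why real hooks must not be called conditionally: skipping a hook on one render would shift every index after it.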
I would say that the real reason is that "it works". As simple as that.
The first thing you need when you make something new is to make it work; it is much better to have something that works badly than something that does not work at all.
Take for example the Newcomen engine, with an abysmal efficiency of half a percent. You needed about 90 times more fuel than with an engine today, so it could only be used in the mines where the fuel was.
It worked badly, but it worked. Later came efficiency.
The same happened with locomotives. So bad efficiency at first, but it changed the world.
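The 90x fuel figure is a back-of-envelope consequence of those efficiencies (taking ~0.5% for the Newcomen engine and assuming ~45% for a modern engine, both round figures):

```javascript
// Fuel needed scales inversely with thermal efficiency.
// 0.5% (Newcomen) and 45% (modern) are rough, assumed figures.
const newcomenEff = 0.005;
const modernEff = 0.45;
const fuelRatio = modernEff / newcomenEff;
console.log(Math.round(fuelRatio)); // -> 90
```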
The first thing AI people had to do was make it work on all OSes. Yeah, it works badly, but it works.
We downloaded some Clojure editor made in Java to test whether we were going to deploy it in our company. It gave us some obscure Java error under different OS configurations, like Linux or Mac. We discarded it. It did not work.
We have engineers and we could fix those issues, but it is not worth it. The people that made this software do not understand basic things.
We have Claude working in hundreds of computers with different OSes. It just works.