Brian Lovin
/
Hacker News
Daily Digest email

Get the top HN stories in your inbox every day.

antirez

If this had been available in 2010, Redis scripting would have been JavaScript and not Lua. Lua was chosen based on the implementation requirements, not the language ones (small, fast, ANSI C). I appreciate certain ideas in Lua, and people love it, but I was never able to like it, because it departs from a more Algol-like syntax and semantics without good reasons, for my taste. This creates friction for newcomers. I love friction when it opens up useful new ideas and abstractions that are worth it: if you learn Smalltalk or Forth and are lost for a while, that's part of how different those languages are. But I think for Lua this is not true enough: it feels like it departs from what people know without good reasons.

norir

I don't love a good deal of Lua's syntax, but I do think the authors had good reasons for their choices and have generally explained them. Even if you disagree, I think "without good reasons" is overly dismissive.

Personally though, I think the distinctive choices are a boon. You are never confused about what language you are writing because Lua code is so obviously Lua. There is value in this. Once you have written enough Lua, your mind easily switches in and out of Lua mode. Javascript, on the other hand, is filled with poor semantic decisions which for me, cancel out any benefits from syntactic familiarity.

More importantly, Lua has a crucial feature that Javascript lacks: tail call optimization. There are programs that I can easily write in Lua, in spite of its syntactic verbosity, that I cannot write in Javascript because of this limitation. Perhaps this particular JS implementation has TCO, but reading the release notes I doubt it.

I have learned as much from Lua as I have from Forth (Smalltalk doesn't interest me), and my programming skill has increased significantly since I switched to it as my primary language. Lua is the only lightweight language that I am aware of with TCO. In my programs, I have banned the use of loops. This is a liberation that is not possible in JS or even C, where TCO cannot be relied upon.

In particular, Lua is an exceptional language for writing compilers. Compilers are inherently recursive and thus languages lacking TCO are a poor fit (even if people have been valiantly forcing that square peg through a round hole for all this time).

Having said all that, perhaps as a scripting language for Redis, JS is a better fit. For me though Lua is clearly better than JS on many different dimensions and I don't appreciate the needless denigration of Lua, especially from someone as influential as you.

kbenson

> For me though Lua is clearly better than JS on many different dimensions and I don't appreciate the needless denigration of Lua, especially from someone as influential as you.

Is it needless? It's useful specifically because he is someone influential: someone might say "Lua was antirez's choice when making Redis, and I trust and respect his engineering, so I'm going to keep Lua as a top contender for my project", so him being clear about his choices and reasoning is useful. And if you think he has a responsibility to be careful what he says because of that influence, that's all the more reason for him to explain his thoughts on it, then and now.

bakkoting

Formally JavaScript is specified as having TCO as of ES6, although for unfortunate and painful reasons this is spec fiction - Safari implements it, but Firefox and Chrome do not. Neither did QuickJS last I checked and I don't think this does either.

undefined

[deleted]

tracker1

ES is now ES2025, not ES6/2015. There are still platforms that don't even implement enough to fully shim ES5, let alone ES6+. Portions of ES6 require buy-in from the hosting/runtime environment that isn't even practical for some environments... so I feel the statement itself is kind of ignorant.

shawn_w

>Lua is the only lightweight language that I am aware of with TCO.

Scheme is pretty lightweight.

fnord123

Which scheme implementation? Guile?

ksec

> I think the distinctive choices are a boon. You are never confused about what language you are writing because Lua code is so obviously Lua. There is value in this.

This. And not just Lua: having a different kind of syntax for scripting languages or very high-level languages signals that it is something entirely different, and not C as in a systems programming language.

The syntax is also easier for people who don't intend to make programming their profession, but simply want something done. In the old days people would design simple programming languages for beginners: the ActionScript / Flash era, and even HyperCard before that. Unfortunately the industry is no longer interested in that, and if anything intends to make everything as complicated as possible.

teo_zero

> Lua has a crucial feature that Javascript lacks: tail call optimization.

I'm not familiar with Lua, but I expect TCO to be a feature of the compiler, not of the language. Am I wrong?

mananaysiempre

You’re wrong in the way in which many people are wrong when they hear of a thing called “tail-call optimization”, which is why some people have been trying to get away from that term in favour of “proper tail calls” or something similar, going at least as far back as R5RS [1]:

> A Scheme implementation is properly tail-recursive if it supports an unbounded number of active tail calls.

The issue here is that, in every language that has a detailed enough specification, there is some provision saying that a program that makes an unbounded number of nested calls at runtime is not legal. Support for proper tail calls means that tail calls (a well-defined subgrammar of the language) do not ever count as nested, which expands the set of legal programs. That’s a language feature, not (merely) a compiler feature.

[1] https://standards.scheme.org/corrected-r5rs/r5rs-Z-H-6.html#...
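To make the distinction concrete: CPython does not implement proper tail calls, so even a syntactically perfect tail call counts as a nested call, and a program that makes an unbounded number of them is illegal at runtime. A minimal sketch:

```python
def countdown(n):
    # Syntactically a tail call: nothing happens after the recursive call.
    if n == 0:
        return "done"
    return countdown(n - 1)

# In a properly tail-recursive language this would run in constant
# stack space; CPython counts every call as nested, so it overflows
# at the recursion limit (about 1000 frames by default).
try:
    countdown(1_000_000)
    outcome = "completed"
except RecursionError:
    outcome = "overflowed"

print(outcome)  # overflowed
```

A Scheme implementation, by contrast, is required to run the equivalent program to completion.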

kerkeslager

I don't think you're wrong per se. This is a "correct" way of thinking of the situation, but it's not the only correct way and it's arguably not the most useful.

A more useful way to understand the situation is that a language's major implementations are more important than the language itself. If the spec of the language says something, but nobody implements it, you can't write code against the spec. And on the flip side, if the major implementations of a language implement a feature that's not in the spec, you can write code that uses that feature.

A minor historical example of this was Python dictionaries. Maybe a decade ago, the Python spec didn't specify that dictionary keys would be retrieved in insertion order, so in theory, implementations of the Python language could do something like:

  >>> abc = {}
  >>> abc['a'] = 1
  >>> abc['b'] = 2
  >>> abc['c'] = 3
  >>> abc.keys()
  dict_keys(['c', 'a', 'b'])
But the CPython implementation did return all the keys in insertion order, and very few people were using anything other than the CPython implementation, so some codebases started depending on the keys being returned in insertion order without even knowing that they were depending on it. You could say that they weren't writing Python, but that seems a bit pedantic to me.

In any case, Python later standardized that behavior as a language feature, so now the ambiguity is resolved.
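Since Python 3.7, insertion-order preservation is part of the language specification, so the behavior codebases used to depend on by accident is now guaranteed on any conforming implementation:

```python
# Insertion order is a guaranteed language feature as of Python 3.7,
# not a CPython implementation detail.
abc = {}
abc['a'] = 1
abc['b'] = 2
abc['c'] = 3
print(list(abc.keys()))  # ['a', 'b', 'c']
```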

It's all very tricky though, because for example, I wrote some code a decade ago that used GCC's compare-and-swap extensions, and at least at that time, it didn't compile on Clang. I think you'd have a stronger argument there that I wasn't writing C--not because what I wrote wasn't standard C, but because the code I wrote didn't compile on the most commonly used C compiler. The better approach to communication in this case, I think, is to use phrases that say what you're doing: instead of saying "C", say "ANSI C", "GCC C", "Portable C", etc.--phrases that communicate which implementations of the language you're supporting. Saying you're writing "C" isn't wrong, it's just omitting a very important detail: which compilers can compile your code. I'm much more interested in effectively communicating which compilers can compile a piece of code than in pedantically gatekeeping what's C and what's not.

naasking

If the language spec requires TCO, I think you can reasonably call it part of the language.

lioeters

> as my primary language

I'd love to hear more about how it is: the state of the library ecosystem, language evolution (wasn't there a new major version recently?), pros/cons, and reasons to use it compared to other languages.

About tail calls: in other languages I've sometimes found that converting a recursive algorithm to a flat iterative loop with an explicit stack/queue works well. But it can be a pain, and less elegant or intuitive than TCO.
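The flat-loop conversion described here can be sketched in Python (the tree structure is made up for illustration): the call stack is replaced by an explicit list, and the loop runs until it drains.

```python
def tree_sum(node):
    # node is a (value, children) pair; the explicit stack replaces
    # the recursive call stack, so depth is bounded by the heap,
    # not the (much smaller) call stack.
    total = 0
    stack = [node]
    while stack:
        value, children = stack.pop()
        total += value
        stack.extend(children)
    return total

tree = (1, [(2, []), (3, [(4, [])])])
print(tree_sum(tree))  # 10
```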

alexdowad

Lua isn't my primary programming language now, but it was for a while. My personal experience on the library ecosystem was:

It's definitely smaller than many languages, and this is something to consider before selecting Lua for a project. But, on the positive side: With some 'other' languages I might find 5 or 10 libraries all doing more or less the same thing, many of them bloated and over-engineered. But with Lua I would often find just one library available, and it would be small and clean enough that I could easily read through its source code and know exactly how it worked.

Another nice thing about Lua when run on LuaJIT: extremely high CPU performance for a scripting language.

In summary: A better choice than it might appear at first, but with trade-offs which need serious consideration.

tracker1

Yeah, you can usually rewrite a TCO-based algorithm without recursion, though the implementation is often messier... In practice, with JS, if I know I'm going to wind up more than 3-4 calls deep, I'll optimize it to avoid the stack overflow.

Also worth noting that some features in JS may rely on application/environment support and may raise errors that you cannot catch in JS code. This is often fun to discover and painful to try to work around.

xonix

Re: TCO

Does the language give any guarantee that TCO was applied? In other words, can it give you an error when the recursion is not in tail-call form? Because I can imagine writing a recursion and relying on it being TCO'd when it's not. I would prefer a language with some form of explicit TCO modifier for a function. Is there any language that has this?

ZiiS

At least in Lua the rule is simply "the last thing a function does", which is unambiguous: `return f()` is always a tail call and `return f() + 1` never is.
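In a language without guaranteed TCO you can also make the tail position explicit yourself with a trampoline: the "tail call" returns a thunk instead of calling directly, and a driver loop invokes thunks until a real value comes back. A Python sketch (function names are illustrative):

```python
def trampoline(fn, *args):
    # Keep invoking returned thunks; stack depth stays constant
    # no matter how many "tail calls" occur.
    result = fn(*args)
    while callable(result):
        result = result()
    return result

def countdown(n):
    if n == 0:
        return "done"
    return lambda: countdown(n - 1)  # explicit tail call as a thunk

print(trampoline(countdown, 100_000))  # runs far past the recursion limit
```

The cost is that the tail position is now visible in the code, which is exactly the property the explicit-modifier languages mentioned below (Scala's `@tailrec`, `[[clang::musttail]]`) give you at compile time instead.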

alexisread

Although it’s a bit weird, Able Forth has the explicit word ~

https://github.com/ablevm/able-forth/blob/current/forth.scr

I do prefer this as it keeps the language more regular (fewer surprises)

stellartux

Sounds a bit like Clojure's "recur". https://clojuredocs.org/clojure.core/recur

draven

Scala has the @tailrec annotation which will raise a warning if the function can’t be TCO’d

garaetjjte

C, with [[clang::musttail]]

hajile

JS has required proper tail calls (PTC) for a decade now. Safari's JavascriptCore and almost every implementation except v8/spidermonkey (and the now defunct chakra) have PTC.

v8 had PTC, but removed it because they insisted it MUST have a new tail call keyword. When they were shot down, they threw a childish fit and removed the PTC from their JIT.

brabel

> it feels like it departs from what people know without good reasons.

Lua was first released in 1993. I think that it's pretty conventional for the time, though yeah it did not follow Algol syntax but Pascal's and Ada's (they were more popular in Brazil at the time than C, which is why)!

Ruby, which appeared just 2 years later, departs a lot more, arguably without good reasons either? Perl, which is 5 years older and was very popular at the time, is much more "different" than Lua from what we now consider mainstream.

rwmj

We had a lot of problems embedding Ruby in a multithreaded C program, as the garbage collector tries to scan memory between the threads (more details here: https://gitlab.com/nbdkit/nbdkit/-/commit/7364cbaae809b5ffb6... )

Perl, Python, OCaml, Lua and Rust were all fine (Rust wasn't around in 2010 of course).

rurban

I'm reviving _why's syck right now. Turns out my fork from 2013 was still the most advanced. It doesn't implement the latest YAML specs, and all of their new insecurities, which is a good thing. And it's much, much faster than the SAX-like libyaml.

But since syck uses the Ruby hashtable internally, I got stuck in the gem for a while. It fell out of their stdlib, and is not really maintained either. PHP had the latest updates for it. And Perl (me) extended it to be more recursion-safe, and added more policies (what to do on duplicate keys: skip or overwrite).

So the Ruby bindings are troublesome because of its GC, which with threading now requires a global VM instance. And using the Ruby alloc/free pairs.

PHP, Perl, Python, Lua, Io, Cocoa: all no problem. Just Ruby, because of its too-tight coupling. Looks like I finally have to decouple it from Ruby.

rapind

> Ruby, which appeared just 2 years later, departs a lot more, arguably without good reasons either?

I doubt we ever would have heard of Ruby without its syntax decisions. From my understanding, its entire raison d'être was readability.

rwmj

It's essentially Perl for people who don't like punctuation marks.

nurettin

    def ruby(is)
      it = is 
      a = "bad"
      example()
      begin
        it["had"] = pascal(:like)
      rescue
        flow
      end
    end

zeckalpha

Pascal and Ada are Algol syntaxed relative to most languages.

jhgb

> yeah it did not follow Algol syntax but Pascal's and Ada's

Not quite sure what you mean by that; all of Lua, Pascal, and Ada follow Algol's syntax much more closely than C does.

a-french-anon

I don't think you understand his point. Ruby has a different syntax because it presents different/more language features than a very basic C-like language; it's inspired by Lisp/SmallTalk, after all. Lua doesn't but still decided to change its looks a lot, according to him.

rweichler

I read this comment, about to snap back with an anecdote how I as a 13 year old was able to learn Lua quite easily, and then I stopped myself because that wasn't productive, then pondered what antirez might think of this comment, and then I realized that antirez wrote it.

aidenn0

I think the older you are the harder Lua is to learn. GP didn't say it made wrong choices, just choices that are gratuitously different from other languages in the Algol family.

le-mark

I’m tickled that one of my favorite developers is commenting on another of my favorites work. Would be great if Nicolas Cannasse were also in this thread!

cxr

It wouldn't fix the issue of semantics, but "language skins"[1][2] are an underexplored area of programming language development.

People go through all this effort to separate parsing and lexing, but never exploit the ability to just plug in a different lexer that allows for e.g. "{" and "}" tokens instead of "then" and "end", or vice versa.

1. <https://hn.algolia.com/?type=comment&prefix=true&query=cxr%2...>

2. <https://old.reddit.com/r/Oberon/comments/1pcmw8n/is_this_sac...>

nine_k

Not "never exploit"; Reason and BuckleScript are examples of different "language skins" for OCaml.

The problem with "skins" is that they create variety where people strive for uniformity to lower the cognitive load. OTOH transparent switching between skins (about as easy as changing the tab sizes) would alleviate that.

brabel

> OTOH transparent switching between skins (about as easy as changing the tab sizes) would alleviate that.

That's one of my hopes for the future of the industry: people will be able to choose the code style, and even the syntax family (what you're calling a skin), that they prefer when editing code, and it will be saved in whatever the "default" is for the language. Or even something like the Unison language: store the AST directly, which allows cool stuff like de-duplicating definitions and content-addressable code (an idea I first came across in Joe Armstrong's amazing talk "The Mess We're In" [1]).

Rust, in particular, would perhaps benefit a lot given how a lot of people hate its syntax... but also Lua for people who just can't stand the Pascal-like syntax and really need their C-like braces to be happy.

[1] https://www.youtube.com/watch?v=lKXe3HUG2l4

dualogy

> transparent switching between skins (about as easy as changing the tab sizes)

One of my pet "not today but some day" project ideas. In my case, I wanted to give Python/GDScript syntax to any & all of the curly languages (a potential boon to all users of non-Anglo keyboard layouts), one by one, via a VSCode extension that implements a virtual filesystem over the real one and translates the syntaxes back and forth during the load/edit/save cycle. Then run the whole live LSP in the background against the underlying real source files and resurface it in the same extension, with line-number mappings etc.

Anyone, please steal this idea and run with it, I'm too short on time for it for now =)

cibyr

People fight about tab sizes all the time though.

rao-v

One day Brython (Python with braces, allowing copy-pasted code to auto-indent) will be well supported by LSPs and world peace will ensue

twic

  SyntaxError: not a chance

xigoi

What editor are you using that does not have a way to paste code with proper indentation?

kevin_thibedeau

VB.Net is mostly a reskin of C# with a few extras to smooth the transition from VB.

procaryote

Lowering the barrier to create your own syntax seems like a bad thing though. C.f. perl.

CapsAdmin

It sounds like you're trying to articulate why you don't like Lua, but it seems to just boil down to syntax and semantics unfamiliarity?

I see this argument a lot with Lua. People simply don't like its syntax because we live in a world where C-style syntax is more common, and the departure from it seems unnecessary. So saying "well actually, in 1993 when Lua was made, C-style syntax was less familiar" won't help, because in the current year, C syntax is more familiar.

The first language I learned was Lua, and because of that it seems to have a special place in my heart. The reason is that around 2006, the sandbox game "Garry's Mod" was extended with scripting support and chose Lua, for seemingly the same reasons as Redis.

The game's author famously didn't like Lua, its unfamiliarity, its syntax, etc. He even modified it to add C style comments and operators. His new sandbox game "s&box" is based on C#, which is the language closest to his heart I think.

The point I'm trying to make is just that Lua is familiar to me and not to you for seemingly no objective reason. Had Garry chosen a different language, I would likely have a different favorite language, and Lua would feel unfamiliar and strange to me.

junon

GP is the creator of Redis. I would imagine he knows Lua well given that Redis has embedded it for around a decade.

CapsAdmin

In that case, my point about Garry not liking Lua despite choosing it for Garry's Mod, for seemingly the same reasons as antirez, is very much on point.

I haven't read antirez'/redis' opinions about Lua, so I'm just going off of his post.

In contrast I do know more about what Garry's opinion on Lua is as I've read his thoughts on it over many years. It ultimately boils down to what antirez said. He just doesn't like it, it's too unfamiliar for seemingly no intentional reason.

But Lua is very much an intentionally designed language, driven in cathedral-style development by a bunch of professors who seem to obsess about language design. Some people like it, some people don't, but over 15 years of talking about Lua to other developers, "I don't like the syntax" is ultimately the fundamental reason I hear from developers.

So my main point is that it just feels arbitrary. I'm confident the main reason I like Lua is because garry's mod chose to implement it. Had it been "MicroQuickJS", Lua would likely feel unfamiliar to me as well.

c-smile

Lua syntax is pretty good for DSL (domain specific language) cases / configuration definitions.

For example Premake[1] uses Lua as it is - without custom syntax parser but with set of domain specific functions.

This is pure Lua:

   workspace "MyWorkspace"
      configurations { "Debug", "Release" }
   
   project "MyProject"
      kind "ConsoleApp"
      language "C++"
      files { "**.h", "**.cpp" }
   
   filter { "configurations:Debug" }
      defines { "DEBUG" }
      symbols "On"
   
   filter { "configurations:Release" }
      defines { "NDEBUG" }
      optimize "On"

In that sense Premake looks significantly better than CMake with its esoteric constructs. Having regular and robust PL to implement those 10% of configuration cases that cannot be defined with "standard" declarations is the way to go, IMO.

[1] https://premake.github.io/docs/What-Is-Premake
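What makes this read declaratively is that Lua lets you omit the parentheses when a function's sole argument is a string or table literal, so `workspace "MyWorkspace"` is just `workspace("MyWorkspace")`. A rough Python analogue of the pattern (names are made up for illustration): plain functions that build up shared configuration state, with the full language available for the hard cases.

```python
config = {"projects": []}

def project(name):
    # Each call starts a new project entry that later calls refine.
    config["projects"].append({"name": name, "files": []})

def files(patterns):
    # Attach file patterns to the most recently declared project.
    config["projects"][-1]["files"].extend(patterns)

project("MyProject")
files(["**.h", "**.cpp"])

print(config["projects"][0])
# {'name': 'MyProject', 'files': ['**.h', '**.cpp']}
```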

vegabook

Lua has been a wild success considering it was born in Brazil, and not some high wealth, network-effected country with all its consequent influential muscle (Ruby? Python? C? Rust? Prolog? Pascal? APL? Ocaml? Show me which one broke out that wasn't "born in the G7"). We should celebrate its plucky success which punches waaay above its adoption weight. It didn't blindly lockstep ALGOL citing "adooooption!!", but didn't indulge in revolution either, and so treads a humble path of cooperative independence of thought.

Come to think of it I don't think I can name a single mainstream language other than Lua that wasn't invented in the G7.

s3graham

I appreciate your point, but Python was invented in .nl which wouldn't be G7 strictly speaking.

jabl

In the same vein Pascal was invented by Niklaus Wirth in Switzerland.

garganzol

JavaScript in 2010 was a totally different beast, standardization-wise. Lots of sharp corners and blank spots were still there.

So, even if an implementation like MicroQuickJS existed in 2010, it's unlikely that too many people would have chosen JS over Lua, given all the shortcomings that JavaScript had at the time.

kybernetikos

While you're not wrong that JS has come a long way in that time, it's not the case that it was an extremely unusual choice at the time - Ryan Dahl chose it for node in 2009.

simonw

Clarification added later: One of my key interests at the moment is finding ways to run untrusted code from users (or generated by LLMs) in a robust sandbox from a Python application. MicroQuickJS looked like a very strong contender on that front, so I fired up Claude Code to try that out and build some prototypes.

I had Claude Code for web figure out how to run this in a bunch of different ways this morning - I have working prototypes of calling it as a Python FFI library (via ctypes), as a Python compiled module and compiled to WebAssembly and called from Deno and Node.js and Pyodide and Wasmtime https://github.com/simonw/research/blob/main/mquickjs-sandbo...

PR and prompt I used here: https://github.com/simonw/research/pull/50 - using this pattern: https://simonwillison.net/2025/Nov/6/async-code-research/

simonw

Down to -4. Is this generic LLM-dislike, or a reaction to perceived over-self-promotion, or something else?

No matter how much you hate LLM stuff I think it's useful to know that there's a working proof of concept of this library compiled to WASM and working as a Python library.

I didn't plan to share this on HN but then MicroQuickJS showed up on the homepage so I figured people might find it useful.

(If I hadn't disclosed I'd used Claude for this I imagine I wouldn't have had any down-votes here.)

claar

I think many subscribe to this philosophy: https://distantprovince.by/posts/its-rude-to-show-ai-output-...

Your github research/ links are an interesting case of this. On one hand, late AI adopters may appreciate your example prompts and outputs. But it feels like trivially reproducible noise to expert LLM users, especially if they are unaware of your reputation for substantive work.

The HN AI pushback then drowns out your true message in favor of squashing perceived AI fluff.

simonw

Yeah, I agree that it's rude to show AI output to people... in most cases (and 100% if you don't disclose it.)

My simonw/research GitHub repo is deliberately separate from everything else I do because it's entirely AI-generated. I wrote about that here: https://simonwillison.net/2025/Nov/6/async-code-research/#th...

This particular case is a very solid use-case for that approach though. There are a ton of important questions to answer: can it run in WebAssembly? What's the difference to regular JavaScript? Is it safe to use as a sandbox against attacks like the regex thing?

Those questions can be answered by having Claude Code crunch along, produce and execute a couple of dozen files of code and report back on the results.

I think the knee-jerk reaction pushing back against this is understandable. I'd encourage people not to miss out on the substance.

colesantiago

It is because you keep over promoting AI almost every day of the week in the HN comments.

In this particular case AI has nothing to do with Fabrice Bellard.

We can have something different on HN like what Fabrice Bellard is up to.

You can continue AI posting as normal in the coming days.

simonw

Forget about the AI bit. Do you think it's interesting that MicroQuickJS can be used from Python via FFI or as a compiled module, and can also be compiled to WebAssembly and called from Node.js and Deno and from Pyodide running in a browser?

... and that it provides a useful sandbox in that you can robustly limit both the memory and time allowed, including limiting expensive regular expression evaluation?
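For the time and memory limits specifically, one coarse OS-level pattern (not how MicroQuickJS enforces them, just an illustration of the general idea, and Unix-only) is to run the untrusted code in a child process with rlimits applied before it starts:

```python
import resource
import subprocess
import sys

def run_limited(code, cpu_seconds=2, mem_bytes=512 * 1024 * 1024):
    # Apply CPU-time and address-space limits in the child before exec;
    # the wall-clock timeout catches sleeps that burn no CPU.
    def set_limits():
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))

    return subprocess.run(
        [sys.executable, "-c", code],
        preexec_fn=set_limits,
        capture_output=True, text=True, timeout=cpu_seconds + 5,
    )

print(run_limited("print(2 + 2)").stdout.strip())  # 4
```

An in-process interpreter like a WASM-hosted JS engine can enforce much finer-grained budgets (per-allocation memory accounting, instruction counting), which is what makes it attractive over process-level limits alone.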

I included the AI bit because it would have been dishonest not to disclose how I used AI to figure this all out.

jmull

I don't know why people are downvoting your comment, but it could be considered a low-effort post: here's (a link to) something I prompted AI with, here's (a link to) what it produced (the whole repo).

I would guess people don't know how you expect them to evaluate this, so it comes off as spamming us with a bunch of AI slop.

(That C can be compiled to WASM or wrapped as a python library isn't really something that needs a proof-of-concept, so again it could be understood as an excuse to spam us with AI slop.)

halfmatthalfcat

I downvoted because I'm tired of people regurgitating how they've done this or that with whatever LLM of the week on seemingly every technical post.

If you care that much, write a blog post and post that, we don't need low effort LLM show and tell all day everyday.

alex_suzuki

I think the people interacting with this post are just more likely to appreciate the raw craftsmanship and talent of an individual like Bellard, and coincidentally might be more critical of the machinery that in their perception devalues it. I count myself among them, but didn’t downvote, as I generally think your content is of high quality.

garganzol

Thank you for sharing.

A lot of HN people got cut by AI in one way or another, so they seem to have personal beefs with AI. I am talking about not only job shortages but also general humbling of the bloated egos.

foobarchu

> I am talking about not only job shortages but also general humbling of the bloated egos.

I'm gonna give you the benefit of the doubt here. Most of us do not dislike genAI because we were fired or "humbled". Most of us dislike it because of a) the terrible environmental impact, b) the terrible economic impact, and c) the general non-production-readiness of the results once you get past common, well-solved problems.

Your stated understanding comes off a little bit like "they just don't like it because they're jealous".

wartywhoa23

I'm constantly encountering this "bloated ego" argument every time the narrative is being steered away to prevent monetary losses for AI companies.

Especially so when it concerns AI theft of human music and visual art.

"Those pompous artists, who do they think they are? We'll rob them of their egos".

The problem is that these ego-accusations don't quite come from egoless entities.

yeasku

Because it adds nothing to the conversation.

johnfn

How does executing MicroQuickJS from Python not have anything to do with MicroQuickJS?

johnfn

I appreciate all your work and I did not downvote you. One suggestion, though, is that the README looks very AI generated, which makes the project feel low effort, like you just said “hey Claude do a security analysis of this package”. I don’t think this is actually what you did, but it’s hard to know. It’s also very difficult to identify the highlights. Just a few handwritten sentences would be better.

simonw

The README is indeed AI generated, as is everything else in that simonw/research repository - it's my public demo of the asynchronous research process I use.

MobiusHorizons

What is the purpose of compiling this to web assembly? What web assembly runtimes are there where there is not already an easily accessible (substantially faster) JS execution environment? I know wasmtime exists and is not tied to a JS execution engine like basically every other web assembly implementation, but the uses of wasmtime are not restricted from dependencies like v8 or jsc. Usually web assembly is used for providing sandboxing, something a JS execution environment is already designed to provide, and is only used when the code that requires sandboxing is native code, not JavaScript. It sounds like a good way to waste a lot of performance for some additional sandboxing, but I can't imagine why you would ever design a system that way if you could choose a different (already available and higher performance) sandbox.

simonw

I want to build features - both client- and server-side - where users can provide JavaScript code that I then execute safely.

Just having a WebAssembly engine available isn't enough for this - something has to take that user-provided string of JavaScript and execute it within a safe sandbox.

Generally that means you need a JavaScript interpreter that has itself been compiled to WebAssembly. I've experimented with QuickJS itself for that in the past - demo here: https://tools.simonwillison.net/quickjs - but MicroQuickJS may be interesting as a smaller alternative.

If there's a better option than that I'd love to hear about it!

santadays

GraalVM supports running javascript in a sandbox with a bunch of convenient options for running untrusted code.

https://www.graalvm.org/latest/security-guide/sandboxing/

MobiusHorizons

This is generally the purpose of JavaScript execution environments like v8 or jsc (or quickjs although I understand not trusting that as a sandbox to the same degree). They are specifically intended for executing untrusted scripts (eg web browsers). Web assembly’s sandboxing comes from js sandboxing, since it was originally a feature of the same programs for the same reasons. Wrapping one sandbox in another is what I’m surprised by.

kettlecorn

As I noted in another comment Figma has used QuickJS to run JS inside Wasm ever since a security vulnerability was discovered in their previous implementation.

In a browser environment it's much easier to sandbox Wasm successfully than to sandbox JS.

MobiusHorizons

That’s very interesting! Have they documented the reasoning for that approach? I would have expected iframes to be both a simpler and a faster sandboxing mechanism, especially in compute-bound cases. Maybe the communication overhead is too high in their workload?

EDIT: found this from your other comment: https://www.figma.com/blog/an-update-on-plugin-security/ they do not address any alternatives considered.

undefined

[deleted]

miki123211

Since you're here and this is likely to become professionally relevant for me pretty soon, what is the best way you know of for securely running Python inside Python?

I was looking for something like Pyodide but runnable from Python, but that doesn't seem to exist quite yet. I can get a Python interpreter to run in wasmtime, but that doesn't have the Pyodide goodies like micropip etc. Sadly, Pyodide itself seems fully married to JS, as it's built with Emscripten rather than targeting WASI.

I'm almost tempted to just go with a small binary embedding V8 and running Pyodide inside V8 isolates or something.

(I know I can do this via Firecracker / GVisor / whatever, that is not the solution I'm looking for.)

zellyn

I’m horribly biased but I think it’s a combination of: (1) knee-jerk reaction to similar-looking but low-value comments, and (2) most people not having played around with LLM coding agents and messed around with their own agents enough to immediately jump to excitement at simple, safe sandboxing primitives for that purpose.

And +1000 on linking to your own (or any other well-written) blog.

incognito124

You should take a look at https://judge0.com/

sublimefire

Look at how others implement quickjs and restrict its runtime for sensitive workloads [1], should be similar.

But there are other ways, e.g. run the logic isolated within gvisor/firecracker/kata.

[1] github.com/microsoft/CCF under src/js/core

strbean

Curious if you have a specific use case for sandboxed JS that you would share?

simonw

I want to build software features where users can configure some additional JavaScript to run in order to customize that software.

One example: given this database table run this JavaScript function against every value in this column to calculate a value to be stored in another column.

Or once a day fetch the JSON from this URL and transform it with this JavaScript and store it here.
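A rough sketch of the "derived column" idea described above, with hypothetical data and field names. Note that `new Function` offers no sandboxing at all; the whole point of running the user's code inside QuickJS/MicroQuickJS compiled to WASM is to evaluate `userCode` safely instead of like this:

```javascript
// Hypothetical sketch: apply a user-supplied JS expression to every
// value in one column to populate another column.
// WARNING: new Function() runs with full privileges; a sandboxed engine
// would evaluate userCode in an isolated interpreter instead.
const rows = [{ price: 10 }, { price: 25 }];
const userCode = "value * 2"; // untrusted, user-provided expression
const transform = new Function("value", `return (${userCode});`);
const derived = rows.map(r => ({ ...r, doubled: transform(r.price) }));
// derived: [{ price: 10, doubled: 20 }, { price: 25, doubled: 50 }]
```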

steren

Check out this sample of using gVisor to spin up code sandboxes (potentially running on Cloud Run): https://github.com/GoogleCloudPlatform/cloud-run-sandbox

pizlonator

This engine restricts JS in all of the ways I wished I could restrict the language back when I was working on JSC.

You can’t restrict JS that way on the web because of compatibility. But I totally buy that restricting it this way for embedded systems will result in something that sparks joy.

groundzeros2015

He already has a JS engine which doesn’t make these restrictions

pizlonator

Yeah QuickJS is great.

I bet MQJS will also be very popular. Quite impressive that bro is going to have two JS engines to brag about in addition to a lot of other very useful things!

alexdowad

> Quite impressive...

Yes, quite! Monsieur Bellard is a legend of computer programming. It would be hard to think of another programmer whose body of public work is more impressive than Bellard's.

Unfortunate that he doesn't seem to write publicly about how he thinks about software. I've never seen him as a guest on any podcast either.

I have long wondered who the "Charlie Gordon" who seems to collaborate with him on everything actually is. Googling the name brings up a young ballet dancer from England, but I doubt that's the person in question.

andai

> You can’t restrict JS that way on the web because of compatibility.

Well, now we can run this thing in WASM and get, I imagine, sane runtime errors :)

jacobp100

Since you’re on the topic, what ever happened to the multi threading stuff you were doing on JSC? Did it stop when you left Apple? Is the code still in JSC or did it get taken out?

pizlonator

I never really started on it other than writing up how to do it

simonw

If anyone wants to try out MicroQuickJS in a browser here's a simple playground interface for executing a WebAssembly compiled version of it: https://tools.simonwillison.net/microquickjs

It's a variant of my QuickJS playground here: https://tools.simonwillison.net/quickjs

The QuickJS page loads 2.28 MB (675 KB transferred). The MicroQuickJS one loads 303 KB (120 KB transferred).

azakai

Looks like those sizes could be improved significantly, as the builds include names etc. I would suggest linking with

emcc -O3

(and maybe even adding --closure 1 )

edit: actually the QuickJS playground looks already optimized - just the MicroQuickJS one could be improved.

simonw

Nice. Got it down from 229KB to 148KB! Thanks for the tips.

https://github.com/simonw/research/pull/5

That's now live on https://tools.simonwillison.net/microquickjs

julenx

Thanks for sharing! The link to the PR looks like a mispaste. I found https://github.com/simonw/tools/pull/181 which seems to be what was intended.

kamranjon

I was interested to try Date.now(), since it's mentioned as the only part of the Date implementation that is supported, but was surprised to find that it always returns 0 in your MicroQuickJS version; your QuickJS variant returns the current Unix time.

simonw

Good catch. WebAssembly doesn't have access to the current time unless the JavaScript host provides it by injecting a function, so the WASM build would need to be hooked up specially to support that.
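To illustrate the "host injects the clock" pattern (this is a sketch, not the actual MicroQuickJS build): a hand-assembled WASM module below imports a function `env.now` and re-exports it, and the Node.js host wires `Date.now` into that import.

```javascript
// A minimal hand-assembled WASM module: it imports env.now (() -> f64)
// and re-exports it as "now". WASM has no clock of its own; the host
// must supply one through the import object.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // \0asm magic + version
  0x01, 0x05, 0x01, 0x60, 0x00, 0x01, 0x7c,             // type section: () -> f64
  0x02, 0x0b, 0x01, 0x03, 0x65, 0x6e, 0x76,             // import section: "env"
  0x03, 0x6e, 0x6f, 0x77, 0x00, 0x00,                   //   "now", func, type 0
  0x07, 0x07, 0x01, 0x03, 0x6e, 0x6f, 0x77, 0x00, 0x00, // export "now" = func 0
]);
const instance = new WebAssembly.Instance(
  new WebAssembly.Module(bytes),
  { env: { now: () => Date.now() } } // the host provides the clock
);
const t = instance.exports.now(); // nonzero once the import is wired up
```

Without the `env.now` entry in the import object, instantiation fails; with it, the module sees real time. An Emscripten-built interpreter does the same thing at a larger scale through its generated JS glue.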

xigoi

At last, I can run JavaScript in my browser. The world is now complete.

throwaway290

The most important thing about any new JS runtime in 2025, how do I use it from JS? /s

ea016

Well, as Jeff Atwood famously said [0], "any application that can be written in JavaScript, will eventually be written in JavaScript". I guess that applies to embedded systems too

[0] https://en.wikipedia.org/wiki/Jeff_Atwood

arendtio

Well, wasn't Fabrice Bellard the guy who built a virtual machine with JS so that you could run Linux within the browser?

https://bellard.org/jslinux/vm.html?cpu=riscv64&url=fedora33...

tombert

Fabrice is an absolute legend. Most people would be content with just making QEMU, but this guy makes TinyC and FFmpeg and QuickJS and MicroQuickJS and a bunch of other huge projects.

I am envious; I will never be anywhere near his level of productivity.

avaer

Not to detract from his status as a legend, but I think the kind of person that singlehandedly makes one of these projects is exactly the kind of person that would make the others.

I forgot about FFmpeg (thanks for the reminder), but my first thought was "yup that makes perfect sense".

umvi

Not just programming either; he invented a mathematical technique for calculating the nth hex digit of pi

keepamovin

I know it's not true, but it would be funny if Bellard had access to AI for 15 years (time-traveler, independent invention, classified researcher) and that was the cause of his superhuman productivity.

AI will let 10,000 Bellards bloom - or more.

kzrdude

And thanks to that we can run Linux in a PDF as well.

anthk

And FFmpeg, the standard codec suite for Unix today. And QEMU, the foundation of KVM-based virtualization. Plus TCC, a great small compiler compared to GCC/Clang, although cparser has better C99 coverage. Oh, and a DVB transmitter reusing the MHz-range radiation from a computer screen by tweaking the xvidtune values in X. It's similar to what Tempest for Eliza does.

tacone

Sounds a bit like rule 35 of the Internet.

undefined

[deleted]

vitaminCPP

Please don't use js in medical devices.

achenet

Attempt at humor: okay, would you rather your beloved great aunt's pacemaker fail because the software in it was written in C and there's a use-after-free memory error, or because the software in it was written in JavaScript and, since someone used `==` instead of `===`, a boolean that should have been `false` is `true`?
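The coercion trap the joke alludes to is real. A quick illustration with a hypothetical sensor value arriving as a string:

```javascript
// Loose equality (==) converts operand types before comparing;
// strict equality (===) does not.
const reading = "0";            // e.g. a value that arrived as a string
console.log(reading == false);  // true:  "0" coerces to 0, false coerces to 0
console.log(reading === false); // false: different types, no coercion
```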

timschumi

It's unfortunate that he uploaded this without a meaningful commit history; it would be interesting to see how long it takes a programmer of his caliber to bring up a project like this.

That said, judging by the license file this was based on QuickJS anyway, which makes the comparison moot.

k4rli

It does say "public repository of...", implying there's a non-public one with the real history. Not sure why he didn't upload the main one, though.

zarzavat

If he's anything like me (doubtful but roll with it), the commit history when prototyping is probably something like "commit", "commit", "fixed a bug", etc.

incognito124

Maybe he just oneshotted it

agumonkey

Maybe claude code uses bellard as agent

MisterTea

Claude is really Bellard sitting in his kitchen, sipping coffee, casually replying to code requests while getting ready for his day.

saagarjha

I’d expect much better results, honestly

khazhoux

"You're right! I apologize for the confusion. I am, in fact, Fabrice Bellard. Comment allez-vous?"

ale

It's Fabrice so there's a chance he did

undefined

[deleted]

foresto

I wonder if this could become the most lightweight way for yt-dlp to solve YouTube Javascript challenges.

https://github.com/yt-dlp/yt-dlp/wiki/EJS

(Note that Bellard's QuickJS is already a supported option.)

qbane

Not likely:

> It only supports a subset of Javascript close to ES5 [...]

I have not read the code of the solver, but solving YouTube's JS challenge is so demanding that the team behind yt-dlp ditched their JS emulator written in Python.

undefined

[deleted]

AndyKelley

That's a great idea, but if they did, then YouTube could retaliate by specifically using features that MicroQuickJS does not support.

foresto

Of course... The arms race is eternal. :)

leptons

There's no reason it has to be lightweight; what it has to do is solve YouTube challenges without workarounds for its limited JavaScript syntax.

silverwind

Likely not, given that it only implements ES5.

polyrand

Not sure about the impact of these; I guess it depends on the context where this engine is used. But there already seem to be exploits for the engine:

https://x.com/itszn13/status/2003707921679679563

https://x.com/itszn13/status/2003808443761938602

MattGrommes

I'm not an embedded systems guy (besides using esp32 boards) so this might be a dumb question but does something like this open up the possibility of programming an esp32/arduino board with Javascript, like Micro/Circuit Python?

halfmatthalfcat

There are already libraries/frameworks that have supported this:

* espruino (https://www.espruino.com/)

* elk (https://github.com/cesanta/elk)

* DeviceScript (Microsoft Research's now defunct effort, https://github.com/microsoft/devicescript)

niutech

And also Duktape (https://duktape.org)

cxr

That's been possible with Moddable/Kinoma's XS engine, which is standards-compliant with ES6 and beyond.

<https://www.moddable.com/faq#comparison>

If you take a look at the MicroQuickJS README, you can see that it's not a full implementation of even ES5, and it's incompatible in several ways.

Just being able to run JS also isn't going to automatically give you any bindings for the environment.

redfloatplane

Sort of related: About ten years ago there was a device called the Tessel by Technical Machine which you programmed with Javascript, npm, the whole nine yards. It was pretty clever - the javascript got transpiled to Lua VM bytecode and ran in the Lua VM on the device (a Cortex M3 I believe). I recently had Claude rewrite their old Node 0.8 CLI tools in Rust because I wasn't inclined to do the javascript archeology needed to get the old tools up and running. Of course then I put the Tessel back in its drawer, but fun nonetheless.

niutech

There are still Espruino JS devices.

matt_trentini

It's a good _start_; much more code needs to be written to allow control of the hardware of those devices (GPIO, I2C etc).

15155

Yes. The key enabling feature is that it doesn't require malloc().

keepamovin

Fabrice, Mr Bellard, O Indefatigable One, if you are reading this, I would love for you to make a JavaScript that compiles to assembly and works across Windows PE, macOS and Linux. Surrendering the various efficiencies of the V8 JIT bytecode in favor of AOT is entirely acceptable for the concision, speed and the chance to "begin again" that this affords. In fact, I believe you may already be working on such an idea! If you are not (highly doubtful) I encourage you to ponder it, and if we are so lucky and the universe wills it, you shall turn the hand of your incomparable craftsmanship upon this worthy goal, and doubtless such a magnificent creation shall be realized by you in a surprisingly short amount of time!

nektro

you might be interested in the future of https://porffor.dev/

keepamovin

Oh, this is excellent! I'm so happy with this. This is EXCELLENT! and what a beautiful website. Is this your project?

So cool. Where did the name come from? I am so stoked and glad that we are going to have a JS to native binary compiler. The best thing ever!

I was going to set up an AI automation to run on this against the autotests, but as I got started, I felt - why not just create a new language where I can pick my own concurrency paradigms and syntax? So I went with that instead.

So glad someone is doing this. What more do you know about this project, kind person?

aapoalas

This is Oliver Medhurst's project: https://goose.icu/

It has stable funding and a full-time development team of 1.

bmc7505

Interesting. I wonder if mqjs would make it feasible to massively parallelize JavaScript on the GPU. I’m looking for a way to run thousands of simultaneous JS interpreters, each with an isolated heap and some shared memory. There are some research projects [1, 2] in this direction, but they are fairly experimental.

[1]: https://github.com/SamGinzburg/VectorVisor

[2]: https://github.com/beehive-lab/ProtonVM

conoro

As a long-time Espruino user I was immediately interested.

At first glance Espruino has broader coverage including quite a bit of ES6 and even up to parts of ES2020. (https://www.espruino.com/Features). And obviously has a ton of libraries and support for a wide range of hardware.

For a laugh, and to further annoy the people annoyed by @simonw's experiments, I got Cursor to butcher it and run as a REPL on an ESP32-S3 over USB-Serial using ESP-IDF.

Blink is now running so my work here is done :-)

  led.init(48)
  
  function blink() {
    led.rgb(0, 0, 255)
    setTimeout(function() {
      led.off();
      setTimeout(blink, 500)
    }, 500)
  }
  blink()