reikonomusha
zozbot234
> We don't "need" Lisp machines. We "need" Lisp software. What made Lisp machines extraordinary wasn't the hardware, it was the software. Nothing today is impeding one from writing such software, except time, energy, willpower, and/or money.
Discussed here: https://news.ycombinator.com/item?id=30800520. The main issue is that Lisp, for all its inherent "power", has very limited tools for enforcing modularity boundaries in code and "programming in the large". So everything ends up being a bespoke solo-programmer project; there is no real shared development. You can see the modern GC-based/"managed" languages, perhaps most notably Java, as Lisps that avoided this significant pitfall. This might explain much of their ongoing success.
reikonomusha
I think many people have conjectures such as this one, but I don't think it's a tech problem, or a "Lisp is too powerful for its own good" problem. It's a "people aren't writing software" problem. History has demonstrated umpteen times that developing large, sophisticated, maintained, and maintainable projects in Lisp is entirely possible. Modern Common Lisp coding practice gravitates toward modular, reusable libraries, organized as ASDF "systems" and namespaced with Common Lisp "packages".
SulphurCrested
The obvious answer as to why people aren't writing software is that almost all of the people able to write good software don't like the language and are writing software in some other language or for some other platform.
I know Lisp enough to have written programs of a few thousand lines in it. I'm not even slightly fazed by functional programming and (tail-)recursion instead of loops. I've read Steele's Common Lisp book from cover to cover. Someone even tried to get me to interview for a job writing Lisp (I politely told them I thought their system could not practically be implemented in Lisp and was, several years and tens of millions of dollars later, eventually proven right).
And I don't think the language has any redeeming features other than garbage collection and documentation, neither of which is notable in 2022. I'm someone familiar with the language who could quickly become productive in any decent Lisp, and that's what I think of Lisp. Can you imagine what a person new to the forest of parentheses, weird identifiers and rejection of 500 years of operator precedence notation thinks?
taneq
> History has demonstrated umpteen times that developing large, sophisticated, maintained, and maintainable projects in Lisp is entirely and demonstrably possible.
The thing is, developing maintained, maintainable projects is work, and everything I see about Lisp seems to have an undertone of being done, to some degree, for fun. It’s a language that scratches a specific itch; people feel clever about writing elegant code in it.
But the sad truth is that 99% of real application code doesn’t need to be clever or elegant. It needs to be simple and cheap and maintainable by anyone.
coryrc
> people aren't writing software
Maybe not OSS, but somebody is keeping Lispworks and Franz in business.
lispm
Already in the 80s, the Lisp Machine OS had over a million lines of code, with support for programming in the large. The development environment was networked from the start: a server kept the definitions of things on the network (machines, users, file systems, networks, gateways, printers, databases, mailers, ...). Source code and documentation were usually shared within the team and versioned via a common file server.
Nowadays, there are large applications, written by groups/teams, that have been worked on for three or more decades. For example, the ACL2 theorem prover has its roots in the 70s and is still used and maintained today, with users in the chip industry.
Common Lisp was specifically designed for the development and production use of complex programs. The military at the time wanted a single Lisp dialect for these; applications often came with their own Lisp, which made deployment difficult. A standard language for projects was therefore required.
These were usually developed by teams in the range of 5 - 100 people. Larger teams are rare.
zozbot234
"Designed for complex programs in the 1980s" is not really up to current standards. Moore's law means that the complexity of overall systems could grow by multiple orders of magnitude in the 1980-to-2022 timeframe.
throw10920
> The main issue is that Lisp, for all its inherent "power", has very limited tools for enforcing modularity boundaries in code and "programming in the large"
I don't see any mention of "modular" or "boundaries" in the post you linked, so I'm assuming that it doesn't add extra context to your point.
You say "very limited tools for enforcing modularity boundaries", which I take to mean that you believe Lisps have constructs for creating modularity boundaries (e.g. unexported symbols in Common Lisp) but don't enforce them (e.g. I can still use an unexported symbol in CL by writing foo::bar). In that case, I don't think this is actually an issue.
Programmers are capable of doing all kinds of bad things with their code, and shooting themselves in the foot, yet I've never seen an indication that the ability to shoot yourself in the foot with a language noticeably contributes to its popularity (see: C, C++, Perl, Ruby).
Moreover, specializing to Common Lisp, it's not like CL allows you to accidentally access an unexported symbol in a package - you have to either deliberately use :: (which is bad style, and takes more effort than typing :) or you get a hard crash. This is opposed to the above listed languages, which allow you to shoot yourself in the foot in a large number of extremely diverse and interesting manners, often without giving you advance warning - and yet are still far more successful.
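To make that concrete, here is a minimal Common Lisp sketch of such a boundary (the package and function names are hypothetical, chosen just for illustration):

```lisp
;; Hypothetical package: only AREA is part of the public interface.
(defpackage :geometry
  (:use :cl)
  (:export :area))

(in-package :geometry)

(defun area (radius)
  (* pi radius radius))

(defun internal-helper (radius)    ; unexported, an internal detail
  (* 2 radius))

(in-package :cl-user)

(geometry:area 1.0d0)              ; fine: single colon, exported symbol
;; (geometry:internal-helper 1.0)  ; reader error: symbol not exported
(geometry::internal-helper 1.0)    ; works, but the double colon is a loud,
                                   ; deliberate breach of the boundary
```

The single-colon form is checked at read time, so an accidental reference to an internal symbol fails immediately; crossing the boundary requires visibly typing `::`.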
------------
I don't believe that the lack of success of Lisps is due to technical deficiencies.
wvenable
Lisp is clever and also a bit inhumane. It only ever appeals to the small subset of the population that likes that. Lisp has some niceties that are fundamentally inherent to its design, but it's just not enough to overcome how awful it is for people.
But this is not a technical deficiency of Lisp, it's more of a technical deficiency of humans.
johnny22
What do you think it is?
syngrog66
agreed, and well said
in broad strokes, imo, Lisp's endless plasticity of infinitely nested parentheses is both its greatest strength and... its greatest weakness
I love it at write time, hate it at read/maintain time. that's why I've avoided it for serious work. Python, Java, Go, Rust etc. are all easier for my brain & eyes to quickly parse on the screen
which is unfortunate, because I still drool at Lisp's linguistic power
reikonomusha
Lisp at "read/maintain" time in a good environment is probably the most enjoyable time for me working with Lisp, assuming it's on a code base that followed some sort of coding standards. Introspection, debugging, and navigation in a Lisp environment are incredible.
Turing_Machine
> You can see the modern GC-based/"managed" languages, perhaps most notably with Java, as Lisps that avoided this significant pitfall.
An interesting perspective. From my POV, it's hard to think of a less Lisp-like language than Java. COBOL, maybe.
metroholografix
The greatness of Lisp (at least when it comes to end-user empowerment), and (I think) the only differentiating factor that most other languages have still not caught up to, is the cybernetic philosophy with its roots in Licklider (man-computer symbiosis) and Engelbart.
Building an environment that strongly and uncompromisingly expresses this philosophy at its core is a serious undertaking in terms of time investment. Emacs has been in continuous development for 37 years and while it is still not as good as Genera, it's certainly "good enough" for lots of people and definitely captures the spark of this philosophy.
In the Common Lisp world, we've had plenty of tries (MCL, Hemlock, McCLIM) but they've all failed to get people to coalesce and generate progress.
Maybe the fact that Emacs exists is a curse in that people realize the barrier they'll have to clear to make something substantially better and decide to devote their energies into more easily realizable efforts.
GregorMendel
Anyone interested in the computing holes that can be filled by lisp machines should check out Urbit. There is a vibrant and growing community of people building a network of personal servers running a lisp-like functional OS. It uses an identity system secured by the Ethereum blockchain, and it has created a bottom-up economic incentive for developers to participate. They are starting to solve unique problems that couldn't be addressed on currently prevalent platforms. Urbit is an affirmation; we can ditch the nihilism.
convolvatron
i associate lisp machines with power, simplicity, and a delightfully shallow learning curve.
urbit to me is exactly the opposite
Gollapalli
And it's 4x more expensive than it was supposed to be to buy a planet.
Ethereum was a mistake.
olah_1
Since they moved to Layer 2 it has been amazing. No eth fees anymore.
They should just move to their own chain entirely. Maybe this is a baby step towards that. Each Galaxy/Star is already basically a staker in a Proof of Stake network.
TacoT
It's $3. www.azimuth.shop
chevill
>They are starting to solve unique problems that couldn't be addressed on currently prevalent platforms.
Got any examples?
lonjil
> Yet, it appears the median and mean Lisp programmer is producing Yet Another (TM) test framework, anaphoric macro library, utility library, syntactic quirk, or half-baked binding library to scratch an itch. Our Lisp programming environments are less than what they were in the 80s because everybody feels the current situation with SLIME and Emacs is good enough.
I don't think this is true; not anywhere close. Most such examples are small and probably only took a few hours to produce, while "superlative" stuff takes many man-hours to create. So just by seeing that there are many throwaway testing frameworks or whatever, you cannot tell where most of the work hours actually go. A half-baked binding library takes 20 minutes to make, while a proper high-quality rendering engine takes hundreds if not thousands of hours.
The Lisp population is thin on people making cool shit because the Lisp population in general is thin.
agumonkey
I'd say it's both. It seems most Lisping is done high on the stack. Some are doing assembler-level Lisp (or Scheme), but there is less bare-metal / OS / systems-oriented Lisp.
I wonder what the Lisp OS guys are thinking about OS / UI these days.
13415
Personally, I think the problem is that Common Lisp is just another programming language, whereas Lisp really shines when it provides a full-fledged programming environment. Nowadays, it would seem best to create such an environment on top of commodity hardware as a "virtual machine" that abstracts away from the hardware in a general, portable way. However, a good environment (1) needs a purpose, and (2) somebody needs to write it. Lisp currently fails on both points. The purpose used to be symbolic AI and NLP among other things. Nowadays it could be the same, or a web framework with integrated distributed computing and database, or scientific computing, or a server for certain business services, etc. There are many application domains for which a virtual "Lisp machine" would be beneficial, but it needs to be developed for one of those real-world applications, not just as a toy like existing attempts at building Lisp machines. And in my opinion the problem really is (2), developer power / size of the community. If you exclude super-expensive legacy Lisps, the current Lisp community doesn't even have a fully supported Lisp-native editor (portable Hemlock is not good enough) and also doesn't have good enough object persistence / native Lisp-based databases. Both are the foundations of any integrated Lisp machine.
People sometimes claim CL+Emacs+SLIME provides the full interactive experience. I just can't agree with that at all. I have tried, and the experience was not substantially different from Go development, or development in any other fast-compiling language with Emacs. In some respects, it's even worse than with various modern languages, even though most of those languages are strictly inferior to CL from a pure language perspective. If editing files in Emacs is all there is to the allegedly great Lisp experience, and developers at the same time have to deal with all those idiosyncrasies of CL such as CL's filesystem path handling, ASDF, and tons of poorly documented libraries, then I can't really see the advantages of CL. The language is powerful, don't get me wrong, but a truly interactive experience is totally different. Smalltalk managed to keep this experience, but for some reason the Lisp community seems to have lost this vision. I guess the developer community is just not large enough.
Anyway, before someone tries to build another "close to metal" Lisp machine or tries to revive any old Lisp machine, I'd rather wish the community would focus on creating truly integrated development environments that abstract away from the host system and are fully hackable and editable from the ground up while maintaining security and multi-user access. A "virtual Lisp" machine with virtual OS, so to say. If that's developed for a certain purpose like building and deploying extremely portable web applications, I believe it can have a great future.
Sorry for the long rant. This is just my impression after having programmed in various Lisp dialects for the past three decades.
coryrc
> the experience was not substantially different from Go
I think the (or at least a) reason for that is an (otherwise good) shift to production being immutable. When you aren't allowed to interact with and change code in a running system, you lose a massive advantage of CL over simpler languages. When the answer to every error is "send a 4xx or 5xx to the client", having a powerful error-handling system is similarly pointless. When you only write CRUD programs like everyone else, you're just plugging together others' libraries, not creating novel techniques or fancy math. In this world all of CL's advantages are negated.
metroholografix
Common Lisp on Emacs via SLIME is not competitive with Smalltalk with respect to the "interactive experience", since Emacs is not the native substrate of CL but essentially an out-of-process experience. If you want to experience CL at its best, you need to run Symbolics Genera.
Emacs with Emacs Lisp on the other hand offers a great interactive experience that also manages to easily surpass every modern Smalltalk incarnation in practicality and size of development community. So if running Genera isn't easily doable, this will give you a taste of what Lisp interactivity is all about.
thorondor83
To me development with SLIME is much better than with a fast-compiling language.
- Debugger is always ON.
- I can inspect the data I'm working with.
- I can redefine things without starting everything all over, avoid losing current context. Fast restart is not the same.
- I can evaluate pieces of code without the need of a REPL. Point to an s-expression and evaluate that piece of code, inspect the result.
I don't see how Smalltalk is much more interactive. It is more powerful at graphics and tools integration, but SLIME provides an interactive enough experience IMO, and it is significantly better to any fast compiling + restart language.
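As a small, illustrative sketch of the redefinition point above (the names here are made up): recompiling a single function in the running image preserves all accumulated state, which a restart would discard.

```lisp
;; State built up in the running image; we don't want to lose it.
(defparameter *events* '())

(defun record-event (event)
  (push event *events*))

(record-event :started)

;; Later: recompile just this function (C-c C-c in SLIME). The image
;; keeps running and *EVENTS* keeps its contents.
(defun record-event (event)
  (push (cons (get-universal-time) event) *events*))

(record-event :resumed)

;; (inspect *events*) opens the inspector on the live data.
```

With a compile-and-restart language, getting back to this point means replaying whatever produced the old state.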
johnisgood
I prefer Factor for that.
jodrellblank
> "We don't "need" Lisp machines. We "need" Lisp software."
Nobody goes into Java because their self-identity is "a Java programmer" to gather a team of people to create a Java machine running Java software to unleash the power of Java for the masses by enabling them to do everything in Java for the sake of doing everything in Java, By Java, With Java, For Java. And if they did talk like that, they would be a Sun Microsystems marketing pamphlet from 1999, or a joking reference to Zombo.com, or suspected of having zero interesting ideas and defaulting to Ouroboros-navel-gazing.
Adobe Photoshop Lightroom is C++ and Lua. Blender is C++ and Python. Excel is C++ and Visual Basic for Applications. LibreOffice Calc is C++ and Python. These are large, popular, programmable systems which exist today and are good enough; good enough for people to spend lots of money on them, good enough for people to spend years skilling up in them, good enough that once they existed people wanted them to keep existing and they haven't faded into the past.
The added allure of an imaginary rebuild of them like "if you took the lid off Excel you'd see VBA inside so you could rework the way it handles multiple sheets using only a skill you have and software design skills and Excel-internals knowledge you don't have" would get a hearty side-eye and slowly backing away from most Excel users. "Everything inside looks the same" is as attractive as "if you open your car trunk you'll see leather seats and plastic dashboard components making it move" or "if you sit in this car you're sitting on engine parts because the top priority is that a welder in a scrapyard can build the entire car without leaving their comfort zone". There are certainly people who want that, but the way the world hasn't developed that way suggests it isn't particularly desirable. Even when such things have been built, people can today use a Smalltalk, an APL, save their running work in a memory-dump and reload it and rewrite parts of it in itself, people flocked to Jupyter notebooks instead.
> "[1] Kandria is a neat platformer developed entirely in Common Lisp"
https://cdn.akamai.steamstatic.com/steam/apps/1261430/ss_a3f...
Without mocking a team who has built, developed, polished and planned to release a project, because that is respectable, it also looks like computer gaming of the early-to-mid-1990s Intel 386/486 era; reminiscent of Prince of Persia, Gods, Lemmings, Monkey Island. But it needs an Intel i5, 4GB RAM and 1GB storage. It's not even released yet and has no reviews, but you describe it as 'superlative' ("Of the highest order, quality, or degree; surpassing or superior to all others") - are you rating it so highly based on it being written in Lisp, or what?
reikonomusha
I don't know how to respond to the whole "Lisp programmer identity" stuff; it doesn't seem relevant to anything I said. I also didn't suggest anybody rewrite anything in it. The success of Lisp doesn't depend on the existence of fancy machines, it depends on people choosing to write software in it. That's basically all I meant to say.
As for Kandria, did you play the demo, or did you just look at screenshots and system requirements and make your brazen judgment? I don't think Kandria is at all amateur or sloppy, regardless of to which aesthetic era you think it belongs. Many have claimed that it's not even possible to write a game that doesn't blow because Lisp is dynamically typed and garbage collected. Now the goalposts have moved to, "well, it takes 1GB of memory and doesn't even look like it's from 2022."
I commend anybody who ships.
jodrellblank
It's relevant in the sense of being a reply to "we need software written in Lisp" and how, if you substitute Java and say "we need software written in Java", people would just shrug and ask "why do we?". People are saying "we need software written in Rust" and other people are asking "why?" and one answer is "to avoid the memory and race condition problems we have from C and C++ code". Maybe correct or not, maybe compelling or not, but it's a practical, outward-looking, concrete reason. The answer for Lisp from your comment is "The success of Lisp doesn't depend on the existence of fancy machines, it depends on people choosing to write software in it" and that's the kind of self-referential Ouroboros loop I mentioned. "OK, but why?". The success of COBOL depends on people choosing to code in COBOL, but nobody uses that as a reason to support COBOL. People who identify as "Lisp programmers" are going to care about that because if it dies, their identity dies (which is daft because, as you say, anyone can choose to write a Lisp environment at any time).
> "Many have claimed that it's not even possible to write a game that doesn't blow because Lisp is dynamically typed and garbage collected. Now the goalposts have moved to, "well, it takes 1GB of memory and doesn't even look like it's from 2022.""
JavaScript appeared, took over the world, demonstrated good games in a dynamically typed and garbage collected language at least a decade ago. Goalposts move, time moves on. Some AI people complain "once you wanted AI to beat chess, now that's not good enough? Stop moving goalposts!". Software today can recognise people, generate images from text descriptions, complete sentences, describe photos, self-drive cars over constrained environments, walk robots over rough terrain and jump onto ledges, steady cameras on flying drones following a person. DeepBlue beating Kasparov was impressive in 1996, it's not impressive now. There are AI experts today who were born after that.
Especially contrasted with "it's the best language", "superlative applications", linking something which looks like software of 25 years ago on 10,000x less powerful computers is a big difference in expectations. (It may actually be an amazing game, hence me asking what it was that made you say it is, before it's released). Years ago a company writing Transport Tycoon in assembler was very impressive. Now a single person can write a math animation video generator in Python as a hobby side-project while being a double-major student.[1] Expectations ramp up, year on year, and "coding on the libraries of giants" is a real effect. Pythagoras calculating the length of a hypotenuse was impressive. A school student doing it today isn't.
Hacker News is written in Arc. It's impressive to build a language and build a forum in that language, even though forums existed years ago. But if someone claimed it was the best language which needs to be preserved because it can do the best things, and then used HN as the example, anyone who has used a modern forum with all the trimmings would do the "yes, Grandpa, everything was better in the old days" polite smiling and nodding.
Electron isn't bad because it lacks Lisp, it's bad because it's sluggish and ramps up fans and drains battery life. WhatsApp isn't great because it was written in Erlang, it's great because it connected hundreds of millions of people on all kinds of featurephones and early smartphones. Visual Studio Code isn't very customisable because it's written in JavaScript, it's customisable because they built it to have lots of extension points. The answer to why we need software written in Lisp is like the answer to why we need software written in APL or Prolog: we don't. We also don't need software written in Java or C# or Python or Ruby. We may need software written in C or x64 assembler, because of hardware lock-in. Tools are for doing things with, not for falling in love with.
p_l
Regarding Kandria:
You might have heard of this thing called "Art", and that it has styles. Not only is one of those styles called "pixel art", celebrating exactly that kind of limitation; Art as a whole is often discussed in terms of the self-imposed limits used in creating a work.
That said, a game can deliberately target such style, and yet hide considerable richness of implementation (consider: Noita, Dwarf Fortress).
Another thing about Shinmera and his team is that they produce not only interesting stories and interesting games, but also code I'd argue is art too.
jodrellblank
> "That said, a game can deliberately target such style, and yet hide considerable richness of implementation (consider: Noita, Dwarf Fortress)."
It definitely can, and Monkey Island and other SCUMMVM games are examples of having lots of gameplay and comedy, despite constrained graphics. As are card games and board games, for that matter. Fancy 3D isn't the be-all, end-all. In more recent years SpeedRunner[1] isn't pixel art but it's quite simply styled, and is fun for leaning so heavily on a single game mechanic.
I'm not meaning to diss Kandria which might very well be a great game. I meant to call out the gulf between Lisp as "the best language" and "superlative applications" being developed with it then linking to Kandria might set one up to think of the "best games" of recent years, by popularity or profitability or ambitiousness or multiplayability, or replayability, or storyline, or VR support, e.g. Pokemon Go, Grand Theft Auto series, Dark Souls series, Fortnite, Fifa, Half Life Alyx, Tony Hawks Pro Skater remakes, Spiderman PS5, Elite Dangerous, Roblox and Minecraft, Final Fantasy series. Which all seem to have done alright without Lisp, and no game development houses have done the Paul Graham "Lisp as secret weapon" to make a game nobody else can make with other tools.
[1] https://steamuserimages-a.akamaihd.net/ugc/58020036161797293...
chubot
Meh the problem is "Which Lisp?" There are dozens of incompatible Lisps. Even this site is written in a Lisp dialect written by its author (Arc).
In fact I conjecture that this is the reason Unix is more popular than Lisp -- because Lisps don't interoperate well. They haven't built up a big ecosystem of reusable code.
Whereas Python, JavaScript, R, C, C++, and Rust programmers can reuse each others' code via Unix-style coarse-grained composition. (Not just pipes -- think about a web server running behind nginx, or git reusing SSH and HTTP as transports.)
You can also use link time composition. It takes some work but it's better than rewriting your Common Lisp code from scratch in Clojure.
-----
Honest question: how do you communicate between two Lisp processes on two different machines? I know Clojure has EDN (which is sort of like JSON : JavaScript), but I haven't heard of the solutions for other Lisps.
I wrote about this problem here: A Sketch of the Biggest Idea in Software Architecture http://www.oilshell.org/blog/2022/03/backlog-arch.html
> The lowest common denominator between a Common Lisp, Clojure, and Racket program is a Bourne shell script (and eventually an Oil script).
I'll definitely update it if there's something I'm missing.
I would say the design of Unix is "rotting", but the answer is to IMPROVE Unix. Not dream of clean slate designs that will never be deployed. Plus this post doesn't actually propose anything. If you actually start trying to build your Lisp machine, I believe you will run into dozens of reasons why it's not a good idea.
bitwize
You are aware that Lisp machines understood several different flavors of Lisp? The Symbolics ones understood Zetalisp and Common Lisp at least. Were they on the market today, they could be convinced to run Clojure and Scheme as well. There are a few old-timers still using Symbolics hardware while developing Common Lisp applications that run on modern hardware.
In fact, Symbolics shipped compilers for other non-Lisp programming languages, including C and Ada. These interoperated smoothly with Lisp code, much more so than they do under Unix. In this demo, Kalman Reti compiles a JPEG decoder written in C, and replaces the Lisp JPEG decoder that came with the system with it, yielding a performance boost:
chubot
OK interesting, I will check out the link.
I still think various Lisps don't interoperate enough today, but I'm not very familiar with the Lisp machines of the past. If it can interoperate with C and Ada that's interesting. But I also wonder about interop with JavaScript :) i.e. not just existing languages but FUTURE languages.
These are the M x N problems and extensibility problems I'm talking about on the blog.
fiddlerwoaroof
If you’re fine with ES3, there’s https://marijnhaverbeke.nl/cl-javascript/ (I’ve been intending to use Babel to compile modern JS to ES3 and then load it into CL, but no time)
Or parenscript: https://parenscript.common-lisp.dev/
Or CLOG: https://github.com/rabbibotton/clog
And, there’s always the option of writing an HTTP or websocket server and serving JSON.
I personally use the SLIME repl in emacs for a ton of things I used to use bash one-liners for and you can push it pretty far (this is an experiment, I’ve wrapped it up more nicely into my .sbclrc since):
https://twitter.com/fwoaroof/status/1502881957012668417?s=21...
bitwize
I'm not sure what you mean by "The lowest common denominator between a Common Lisp, Clojure, and Racket program is a Bourne shell script (and eventually an Oil script)." You can get two Lisp programs (or two Python programs, or two mixed-language programs, etc.) to intercommunicate without involving the shell at all.
It'd be more accurate to say "The lowest common denominator between a Common Lisp, Clojure, and Racket program is sexpr notation." By using sexprs over a pipe, named pipe, or network socket, you can very easily get any of those Lisps to intercommunicate deeply structured data with any other. This is how SLIME and its SWANK protocol work. I don't even think the shell is involved; Emacs handles spawning the inferior Lisp and setting up the communication channel itself.
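A minimal sketch of that pattern in Common Lisp, assuming the usocket library is available (the function names here are my own, for illustration):

```lisp
;; Assumes usocket is loaded, e.g. (ql:quickload :usocket) via Quicklisp.

;; Sender: PRIN1 the form onto the socket stream.
(defun send-sexpr (host port form)
  (usocket:with-client-socket (socket stream host port)
    (declare (ignore socket))
    (prin1 form stream)
    (finish-output stream)))

;; Receiver: READ reconstructs the structured data directly,
;; no parser or schema required.
(defun receive-sexpr (port)
  (usocket:with-socket-listener (listener "0.0.0.0" port)
    (usocket:with-server-socket (conn (usocket:socket-accept listener))
      (read (usocket:socket-stream conn)))))
```

The receiving side gets real lists, symbols, and numbers back, not a string to be parsed; that's the sense in which sexprs are the lowest common denominator between Lisps.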
The thing the Lisp machines had was a very robust ABI. Lisp has a specific notion about what a function is in memory. This is largely because Lisp is always "on" on a Lisp machine, there is no point at which you cannot take advantage of the entire Lisp runtime, including the compiler, in your own programs. Accordingly the Lisp machine C compiler output Lisp functions containing code compiled from C, that could be called by Lisp functions directly (and vice versa). Presumably a JavaScript runtime for Lisp machines would be able to do the same thing.
By contrast, C has no notion of what a function is; the C ABI used by many operating systems presents a function as simply a pointer in memory that gets called after some parameters are pushed to the stack or left in registers. (How many parameters there are, and their types and sizes, is unspecified and simply agreed upon by caller and callee.) Nothing about memory management is provided by the runtime either; all of that has to be provided by user programs. All this adds friction to interoperation by function call, and makes IPC the most straightforward way to interoperate.
But oh well. Our computers are all slowly turning into JavaScript machines anyway, so maybe those Lisp/Smalltalk happy days can return again soon.
foobarian
> In fact I conjecture that this is the reason Unix is more popular than Lisp -- because Lisps don't interoperate well. They haven't built up a big ecosystem of reusable code.
And why are there so many? IMO the language is too flexible for its own good. It promotes this curious intellectual solo-competition where you try to prove you are worthy of the elite who get all the fancy FP stuff and build all this bespoke functionality.
It's almost impossible for Lisp to be popular. To be popular means a lot of people use it, but that means it can't be much more complicated than the median language. But because Lisp lets individual programmers push intellectual boundaries, it self-selects itself out of that pool. Any big collaborative project will attract this type of developer, and soon the lesser developers (who don't dare object for fear of appearing dumb) are unable to contribute.
Just my opinion, if a little dramatic.
zozbot234
Haskell has its share of fancy FP stuff, and people manage to develop workable things in it. I still think Lisp really is too dynamic to be useful beyond a small scale of development.
Jtsummers
> Honest question: how do you communicate between two Lisp processes on two different machines? I know Clojure has EDN (which is sort of like JSON : JavaScript), but I haven't heard of the solutions for other Lisps.
Probably TCP or UDP based protocols like essentially every cross-network communication in every language today.
EDIT: Also, it should be noted that JSON does not, itself, allow you to communicate across a network. It's just a serialization format. You still have to select some protocol for actually doing the communication. If your answer to the question "How do you communicate between two JavaScript processes on two different machines?" is "JSON", you've failed at actually answering the question.
chubot
Right, so that is what I'm getting at. If you have two Lisp programs on different machines, or two programs written in different Lisp dialects, the way you compose them is basically "Unix" -- serialize to some common format and send over a pipe / socket.
gmfawcett
What alternatives are you possibly allowing for, if you're taking IP and wire protocols off of the table?
armitron
The canonical Lisps still widely used today are Common Lisp, Scheme and Emacs Lisp. They all belong in the same family, and syntax / semantics are close. Porting code from Scheme to Common Lisp can be a lot easier than going from Python 2 to Python 3.
Clojure is something else entirely which is why a lot of people don't consider it a Lisp.
> Honest question: how do you communicate between two Lisp processes on two different machines?
If you want to use built-in object serialization, there is print and read.
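For example, a minimal round-trip in standard Common Lisp, with a string stream standing in for the network stream (the `round-trip` helper name is invented here for illustration):

```lisp
;; PRINT writes an object in a readable form; READ reconstructs it.
;; A string stream stands in for a socket stream in this sketch.
(defun round-trip (object)
  (with-input-from-string (in (with-output-to-string (out)
                                (print object out)))
    (read in)))

(round-trip '(1 2.5 "hello" (nested list)))  ; => (1 2.5 "hello" (NESTED LIST))
```

Over a real network you would pass the socket's stream to PRINT and READ instead of the string streams.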
iak8god
> Common Lisp, Scheme and Emacs Lisp... all belong in the same family
Could you say more about what you mean by this? Is there another family of Lisps that excludes these three? I've met people who make a big deal about lisp-1 vs lisp-2 (https://en.wikipedia.org/wiki/Lisp-1_vs._Lisp-2), and which is the right way to be a Lisp, but I think maybe those people just enjoy being pedantic.
armitron
I was referring to Clojure which going by syntax/semantics does not belong in the same family as Common Lisp, Scheme, Emacs Lisp.
zozbot234
An unappreciated means of code reuse under *nix is the static and dynamic library. This seems to be the go-to whenever you need something more involved than simply reusing a full binary via pipes.
scj
C's greatest feature is that it trivially maps onto .so files. Linking to and creating .so files isn't just cheap, it's effectively free.
Most higher level languages I've worked with seem to focus on using .so files rather than producing them.
This means the lowest common denominator for the unix ecosystem is what C can provide. Otherwise stated, unix is marching forward at the pace of C.
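To illustrate the "consuming .so files" direction from Lisp, here is a sketch that assumes the third-party CFFI library is available (e.g. loaded via Quicklisp):

```lisp
;; Call strlen from the C library already linked into the Lisp image.
;; CFFI converts the Lisp string to a foreign char* for the call.
(cffi:foreign-funcall "strlen" :string "hello" :int)  ; => 5
```

Producing a .so from Lisp is the harder, rarer direction, which is the commenter's point.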
samatman
This is one of the things I like so much about Lua, LuaJIT in particular: it's designed around being just another .so file. From another C library's perspective, Lua is a library for manipulating a call stack, and the LuaJIT FFI makes short work of adapting a header file so that structures and functions can be used from within Lua.
pjmlp
No wonder: given that C was created to turn UNIX from an OS originally written in assembly into a portable one, UNIX is effectively a C Machine.
fiddlerwoaroof
> Honest question: how do you communicate between two Lisp processes on two different machines?
Depends what level of integration you want: custom tcp/udp protocols, HTTP and websockets are all pretty easy. But, you can also use something like this over a trusted network/vpn: https://github.com/brown/swank-client
spacemanmatt
> how do you communicate between two Lisp processes on two different machines?
Same as every other language: you pick a protocol and use it on both sides. Many of us already have enough JSON in play that it makes sense to start there.
convolvatron
> Honest question: how do you communicate between two Lisp processes on two different machines?
using continuations with sexps is insanely powerful
send (+ 1 3 (lambda (x) `(handle-response ,x)))
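Spelled out as a runnable toy (the `handle-wire-message` name is invented for illustration, and evaluating sexps from an untrusted peer is unsafe in practice):

```lisp
;; Toy sketch: a peer receives an s-expression as text and evaluates it.
;; Code and data share one representation, so the "protocol" is just READ.
;; (Never EVAL untrusted input in a real system.)
(defun handle-wire-message (text)
  (eval (read-from-string text)))

(handle-wire-message "(+ 1 3)")  ; => 4
```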
rst
Some of this needs checking -- you could not run Unix on Symbolics hardware. LMI did have machines that ran both OSes -- but Unix was running on a separate 68000 processor; see, e.g. http://www.bitsavers.org/pdf/lmi/LMI_lambdaOverview_1982.pdf
(3600-series Symbolics machines also had a 68k "front end processor", but no Unix port was provided for it; they also ultimately had a C compiler that could generate code for the "Lisp processor", but the code it generated was intended to run in the Lisp environment.)
It's also worth noting that systems-level code for Symbolics machines (and, I presume, LMI as well) made frequent use of "unsafe subprimitives", misuse of which could easily crash the machine. And, unfortunately, if you needed to, say, get anything close to hardware bandwidth out of the disk drives, some of this became well-nigh unavoidable, due to poor performance of the OS-level file system (LMFS).
lispm
What one could do was run hardware Lisp Machines from Symbolics on VME boards inside a Sun: the UX400 and UX1200.
Later Open Genera was sold as a Virtual Lisp Machine running on a DEC Alpha / UNIX system.
skissane
Apparently Open Genera now even runs under macOS on Apple M1s: https://twitter.com/gmpalter/status/1359360886415233029
I think the big problem with Genera is the licensing. Although it comes with source code, it is proprietary software, and buying a license is expensive. I think the owners of the Symbolics IP have prioritised squeezing the maximum revenue out of a declining user base over trying to grow that user base.
I'm surprised "Open Source LispOS" projects have largely failed to gain traction. Writing your own OS is (at least in some ways) easier than it used to be (especially if you target virtualisation rather than bare metal). There seem to be a lot more people saying "LispOS is what we need!" than actually writing one or contributing to an existing effort to write one.
mportela
This post would benefit from further expanding some of these statements.
> UNIX isn’t good enough anymore and it’s getting worse
Why exactly?
> A new operating system means we can explore new ideas in new ways.
LISP machines were not only OSes but also hardware. Is the author also proposing running this OS on optimized hardware or simply using our x86-64/AMD/M1 CPUs?
> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.
Sure, but it also requires rewriting a lot of these things, introducing and fixing new bugs... It feels like the good ol' "let's rewrite this program" that quite frequently doesn't live up to the expectations [1].
[1] https://vibratingmelon.com/2011/06/10/why-you-should-almost-...
traverseda
>> UNIX isn’t good enough anymore and it’s getting worse
>Why exactly?
Personally? We're in a bit of a transition point, and a lot of the technologies aren't working together like they used to.
An example, on my laptop I want to run android apps. The way to do this that actually works well (waydroid) only supports wayland. Unfortunately I use x2x to control another display remotely, and x2x doesn't work properly under wayland, and never will due to wayland's security choices.
So like, what am I supposed to do here? Not run android apps? Not use tools like barrier/synergy/x2x?
This is one of many many frustrations I've had from this new generation of wayland/systemd/etc. Hopefully it gets better eventually but it does feel a lot like the rug is constantly being pulled out from under me for no good reason...
Now I don't think a lisp machine is going to fix that mind you, but it is a concern.
sseagull
I am finding that everything is just becoming more and more fragmented. Programming languages, ecosystems, frameworks, blah.
I have had ideas for some applications, but can’t do them because the libraries I need are written in different languages, which don’t interoperate well (and one I am not familiar in).
Every post here looking for recommendations has many responses with different packages/ecosystems doing the same thing.
Sometimes I feel like there are too many developers and not enough of them really interested in the actual hard problems. So they just make another python package manager or webapp framework.
eternityforest
I think they are too invested in hard problems, but not invested enough in tedious problems.
I don't want a new language, or a new preprocesser, or a new way of thinking about programs... I just want a one click way to take a folder of HTML that looks like a static site, and package it up into cross platform apps with all the proper API access.
I don't care about cryptocurrency and global decentralized databases; I just want to be able to run a site by buying a NAS appliance, putting files on it, and sharing the QR code on it without any signups and cloud accounts.
The hard problems are mostly solved. There's probably some long dead random github repo that does anything you want. They're just not packaged for all platforms and maintained professionally.
I don't need a phone without any binary blobs at all... I just want a mesh network feature and a fair trade stamp.
Because... it sucks to maintain something novel even if it's all just pieced together from npm. It sucks to be the one responding to issues. It sucks to actually run a project and keep track of devops stuff.
All the tech has been invented dozens of times over, but never polished, and always ruined with some unnecessary immutable log feature or performance destroying random access low latency routing thing like IPFS has.
pjmlp
Isn't it nice? We're back to the 8- and 16-bit home computers, when we had plenty of choice and real fights on the school playground about which was better.
tacitusarc
I’m curious what you would classify as a hard problem. Personally, I think writing a good, easy to use Python package manager that both gains adoption and addresses the myriad of corner cases is a hard problem.
PaulDavisThe1st
So a perfect use-case for the obligatory XKCD post about "too many standards, invent one more to try to solve the issue => too many standards plus one".
zozbot234
You can actually start a Wayland compositor/session in a X window. That plus existing solutions for Wayland network transparency should be enough.
traverseda
That's what I'm doing now, but it's a pretty bad user experience and definitely isn't seamless. Also completely breaks any integration with my app menu meaning I have to rewrite all the desktop files to launch the app in cage. I imagine someone will eventually make a seamless wayland compositor for x though.
>That plus existing solutions for Wayland network transparency should be enough.
If you're saying it will fix my x2x use case, well it won't. Wayland's security model fundamentally prevents this use case. Maybe someone will add extensions to it eventually but right now the way compositors handle mouse capture seriously prevents this use case, and I'm skeptical that all the different compositors will agree on a solution any time in the next 10 years...
So I'm stuck using X on both displays for the foreseeable future if I want to use x2x/synergy like functionality, and I'm certain that it's going to become harder and harder to keep using X over time...
pjmlp
Xerox PARC workstations could run Interlisp-D, Smalltalk, Mesa/XDE, and Mesa/Cedar, thanks to this little thing RISC failed to kill: microcoded CPUs.
pmoriarty
> > UNIX isn’t good enough anymore and it’s getting worse
> Why exactly?
Two reasons:
1 - systemd (which is moving linux towards becoming a systemd OS)
2 - developers and companies moving ever more towards web apps (which will eventually make the underlying OS irrelevant, as the browser becomes the OS) (incidentally, web assembly seems to herald the end of the open/transparent web too, as we'll eventually all be running opaque binary blobs in our web browsers)
amelius
> 1 - systemd (which is moving linux towards becoming a systemd OS)
Or, a microservices OS.
DonHopkins
There's an old book all about just that, which included and popularized Richard P. Gabriel's paper, "The Rise of Worse Is Better":
https://en.wikipedia.org/wiki/The_UNIX-HATERS_Handbook
https://web.mit.edu/~simsong/www/ugh.pdf
>The year was 1987, and Michael Travers, a graduate student at the MIT Media Laboratory, was taking his first steps into the future. For years Travers had written large and beautiful programs at the console of his Symbolics Lisp Machine (affectionately known as a LispM), one of two state-of-the-art AI workstations at the Lab. But it was all coming to an end. In the interest of cost and efficiency, the Media Lab had decided to purge its LispMs. If Travers wanted to continue doing research at MIT, he discovered, he would have to use the Lab’s VAX mainframe.
>The VAX ran Unix.
>MIT has a long tradition of mailing lists devoted to particular operating systems. These are lists for systems hackers, such as ITS-LOVERS, which was organized for programmers and users of the MIT Artificial Intelligence Laboratory’s Incompatible Timesharing System. These lists are for experts, for people who can—and have—written their own operating systems. Michael Travers decided to create a new list. He called it UNIX-HATERS:
Date: Thu, 1 Oct 87 13:13:41 EDT
From: Michael Travers <mt>
To: UNIX-HATERS
Subject: Welcome to UNIX-HATERS
In the tradition of TWENEX-HATERS, a mailing list for surly folk
who have difficulty accepting the latest in operating system technology.
If you are not in fact a Unix hater, let me know and I’ll remove you.
Please add other people you think need emotional outlets for their
frustration.
https://www.amazon.com/UNIX-Haters-Handbook-UNIX-Haters-line...
https://www.goodreads.com/en/book/show/174904.The_UNIX_Hater...
https://wiki.c2.com/?TheUnixHatersHandbook
>I'm a UnixLover, but I love this book because I thought it was hysterically funny. Many of the war stories are similar to experiences I've had myself, even if they're often flawed as a critique of Unix itself for one reason or another. But other UnixLovers I've loaned the book to found it annoying rather than funny, so YMMV.
>BTW the core group of contributors to this book were more Symbolics Lisp Machine fans than ITS or Windows fans. ITS had certain technical features superior to Unix, such as PCLSRing as mentioned in WorseIsBetter, but having used it a bit myself, I can't see that ITS was superior to Unix across the board. The Lisp Machine on the other hand, although I never used it, was by all accounts a very sophisticated environment for programmers. -- DougMerritt
https://news.ycombinator.com/item?id=13781815
https://news.ycombinator.com/item?id=19416485
>mtraven on March 18, 2019 | next [–]
>I founded the mailing list the book was based on. These days I say, Unix went from being the worst operating system available, to being the best operating system available, without getting appreciably better. (which may not be entirely accurate, but don't flame me).
>And still miss my Lisp Machine. It's not that Unix is really that bad, it's that it has a certain model of computer use which has crowded out the more ambitious visions which were still alive in the 70s and 80s.
>Much as the web (the Unix of hypertext) crowded out the more ambitious visions of what computational media for intellectual work could be (see the work of Doug Engelbart and Ted Nelson). That's a bigger tragedy IMO. Unix, eh, it's good enough, but the shittiness of the web makes humanity stupider than we should be, at a time when we can ill afford it.
https://medium.com/@donhopkins/the-x-windows-disaster-128d39...
mportela
I really appreciate your writing this reply (esp. the links). Thanks a lot, mate!
kkfx
>> UNIX isn’t good enough anymore and it’s getting worse
> Why exactly?
Besides the defects well stated in the Unix-Haters Handbook, Unix has been violating its own principles for many years. The original Unix idea was: desktops like the Xerox Smalltalk workstations are too expensive and complex for most needs, so instead of a real revolution with an extraordinary outcome, we limit ourselves to the most common needs in exchange for far lower costs. No GUIs, no touchscreens, no videoconferencing or screen sharing [1], just a good-enough CLI with a "user language" (shell scripts) for small-potatoes automation and a bit of IPC for more... Well... For more, there is a "system language" (C) that's easy enough for most really complex tasks.
That was a success because no one really likes revolutions and long-term goals, especially if they demand big money, while many like quick-and-done improvements at a low price.
Within a few years, however, Unix started to feel the need for something more than a CLI, and GUIs started to appear. Unfortunately, unlike the original Xerox-style desktops, those UIs were not "part of the system, fully integrated into it" but just hackish additions with no interoperability, just single apps that at most could cut and paste.
> Sure, but it also requires rewriting a lot of these things, introducing and fixing new bugs... It feels like the good ol' "let's rewrite this program" that quite frequently doesn't live up to the expectations
We need desktops again, which means not just "endpoints" or "modern dumb terminals of modern mainframes named cloud", but desktop computing. Since desktop development has been essentially abandoned for many years, and even back then was in bad shape, we need to restart from the classic desktops. The LispM was ancient and hackish, but it is still the best desktop we have had in human history, so a good starting point. We have some kind of LispM OS/OE here: Emacs, still alive and kicking, so there is something to work with. Emacs is already a WM (EXWM), has countless features, and is already "plugged into" modern bootloader OSes for hardware, drivers, and services. It just needs to evolve.
[1] yes, you are reading correctly and no, I'm not wrong, I'm talking about the famous NLS "Mother of all the Demos" from 1968 https://youtu.be/yJDv-zdhzMY
tytrdev
I agree, especially with the statement that Unix isn’t good enough and getting worse.
I feel like that was one of the core assumptions and point of the article, but it didn’t have any explanation beyond “multiple programming languages.” Feels a bit flat to me.
PaulHoule
What ramblings.
Optane is the best performing SSD but the worst performing RAM you ever had. It is too expensive at any speed, even if Intel is losing money on it. HP memristors are vaporware.
LISP machines, Java machines, and similar architectures specialized for complex language runtimes are a notorious dead end. They just can’t keep up with performance-optimized RISC, pipelined, superscalar, SIMD, etc. architectures paired with compilers and runtimes that implement efficient abstractions (e.g. garbage collection, hotspot compilers) on top of those very fast primitives.
lispm
Before Lisp Machines were killed in the market it was clear that new architectures were needed and a few were under development, even RISC like CPUs. They weren't released.
But Lisp at that time was already fast enough on standard RISC chips (MIPS, SPARC, ALPHA, POWER, ...). Later the 64bit RISC chips also provided enough memory space. SPARC also had some tricks for Lisp implementors.
Currently the assembler-coded Ivory emulator is 80 times faster on Apple's M1 than the last Ivory hardware (the Ivory microprocessor from Symbolics was released at the end of the 80s).
imglorp
Speed is relevant for some use cases, sure, but not at all for a ton of others. Memory, disk and CPU are almost free in this new world, so why are we computing like it's 1990 still? It's time for some different abstractions than file -> process -> file.
The vast productivity gains of Smalltalk and Lisp came from discarding those abstractions, leaving programmers free to adopt others.
Presumably OP posted this after noticing Phantom came up a few days ago. https://news.ycombinator.com/item?id=30807668
bigbillheck
> Memory, disk and CPU are almost free in this new world, so why are we computing like it's 1990 still?
Elsewhere on this very site you'll find no end of complaints about, say, Electron apps.
PaulHoule
For general purpose computing applications expand to fill the performance available (that includes real value and bloat!)
I dabble in microcontrollers for fun and there it's different. I am an AVR-8 fanatic and sometimes I think "this is so fast" and "2K of RAM is plenty" and "I can fit CRC-32 tables in 32k of flash because that's what counts as an 'operating system' for me"
Then there are the applications where it just doesn't have the power and I am so glad to have a box of RP2040's because in 2022 the most important attribute of a microcontroller is that it is available.
undefined
zozbot234
The RISC-V folks are working on additions for special support of "complex language runtimes". Pipelined, SIMD and superscalar are all well and good, but what kills pure software-side support is always heavy branching and dispatching. These operations are genuinely much faster and more power-efficient when implemented in hardware.
DonHopkins
How is the ARM not a "JavaScript Machine"?
https://stackoverflow.com/questions/50966676/why-do-arm-chip...
>Why do ARM chips have an instruction with Javascript in the name (FJCVTZS)?
https://community.arm.com/arm-community-blogs/b/architecture...
PaulHoule
That instruction is a very small hack that uses just a few transistors to speed up a bit of data conversion that JS runtimes do frequently. That’s a far cry from a specialized chip.
nanochad
> You could open up system functions in the editor, modify and compile them while the machine was running.
Why would you want to do that other than hot patching a system that can't go down? Testing new changes requires more time than rebooting. If you just want to test simple changes, most debuggers can do that.
> Everything worked in a single address space, programs could talk to each other in ways operating systems of today couldn’t dream of.
And with a single address space you have win9x security.
> A modern UNIX system isn’t self-contained. I have 4 UNIX systems on my desk (Desktop, laptop, iPhone, iPad). I’m continuously using the cloud (iCloud for photos, GitHub for text files, Dropbox for everything else) to sync files between these machines. The cloud is just a workaround for UNIX’s self-contained nature
This is just your usage habits. Nothing is stopping you from using NFS or SSHFS. Someone who feels the need to use iCloud for whatever trivial convenience it provides is unlikely to benefit from a Lisp machine's ability to edit code on the live system.
> Then we add a gazillion programming languages, VMs, Containers, and a million other things, UNIX is a bloated mess of workaround for its own problems. We need a replacement, something that can be built for the modern world using technologies that are clean, secure, and extendable
The same thing will happen with any OS given enough time. Lisp is also not secure. It's prone to side channel and eval bugs.
> eliminate memory leaks and questions of type safety,
Lisp is not type safe.
reikonomusha
> Lisp is not type safe.
It is type safe. While Lisp is not statically typed, its typing discipline is strong: operations performed on incompatible types signal recoverable errors.
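For instance, in a conforming implementation's safe code:

```lisp
;; Adding a number to a string signals a TYPE-ERROR condition,
;; which a handler can intercept and recover from -- no memory
;; corruption, no undefined behavior.
(handler-case (+ 1 "two")
  (type-error () :recovered))  ; => :RECOVERED
```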
Too
Crashing at runtime, recoverable or not, is usually not what people mean when they say type safe. Spare me the static-vs-strong academia. Type safe, spoken in practical, everyday terms, normally means enforced at compile time with IDE autocompletion support, usually implying static typing.
zozbot234
It's not crashing at runtime, it's crashing at compile time. Or rather, a purely REPL-focused language like Lisp dispenses with the phase separation between compile- and run-time altogether. But then this applies just as much to dependently-typed languages, which come from the "compile time type safety" line of research. You can't be okay with those while dismissing Lisp.
maydup-nem
Recoverable error handling at runtime is usually not what people mean when they say crashing.
undefined
Decabytes
> Lisp is not type safe.
Typed Racket is, and that is why I love it
maydup-nem
> Testing new changes requires more time than rebooting.
no it doesnt
> Lisp is not type safe.
yes it is
zozbot234
> And with a single address space you have win9x security.
Address space != protection boundaries. These are nearly orthogonal concerns. Where single address spaces might become less useful today is in dealing with Spectre vulnerabilities, though formalizing more explicit requirements about information domains (as in multilevel security, which is a well-established field of OS research) might help address those.
mark_l_watson
I don’t really agree. I had a Xerox 1108 Lisp Machine in the 1980s and loved it, but special purpose Lisp hardware seems like a waste of effort. I set up an emulator for the 1108 last weekend, and yes, I really did enjoy the memories, and things ran an order of magnitude faster than on the 1108 in the 1980s.
Then, I appreciated my M1 MacBook Pro running SBCL, LispWorks, Haskell, Clojure, and various Scheme languages - all with nice Emacs based dev setups. Life is really good on modern hardware.
lispm
The 1108 wasn't really special purpose Lisp hardware. One could run other operating systems on it. What made it special purpose was the loaded microcode for the CPU.
> Life is really good on modern hardware.
Agreed: On modern CPUs.
More support for the additional hardware features like GPUs, media processing engines and the neural network engines (see the M1 Pro/Max/Ultra) would be welcome.
mark_l_watson
The best bet for getting GPU deep learning support, I use Anaconda/conda, using the Apple M1 channel. That said, I usually use my Linux GPU rig or Colab for deep learning.
mst
I feel like a lot of posts like this are pining for the complete lisp machine -user environment- and overestimating how necessary/important the hardware architecture would be to getting back to that today.
I can manage to context switch between different lisps fine but I do sometimes wonder in e.g. a slime+SBCL setup how much that context switching is costing me.
throw10920
I think that, while the idea is solid (Unix is poorly-designed and we should have better) some of the specific ideas mentioned are lacking:
> Everything worked in a single address space, programs could talk to each other in ways operating systems of today couldn’t dream of.
No! Bad! We have enough problems securing software on separate VMs running on the same metal, single address spaces are completely out of the question until someone manages to build a feasible trusted compiler system.
> Then we add a gazillion programming languages, VMs, Containers, and a million other things, UNIX is a bloated mess of workaround for its own problems.
A lot of these problems could happen with a Lisp machine - you could have a billion different Lisps, for instance (although, to be fair, with better (i.e. non-Unix) OS design you wouldn't need containers).
> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.
This is partially true, but a lot of the complexity in modern software doesn't come from Unix, but just...bad design decisions. Webtech doesn't really care whether it's running on Windows or Unix, after all.
Also, high-level CPUs are a bad idea: http://yosefk.com/blog/the-high-level-cpu-challenge.html
I think the good in this post is along the lines of: text bad, typed IPC good, runtime-aware OS good, standardized VMs good, interactive systems (Lispy stuff, Jupyter) > batch-processing systems (Unix, C).
ogogmad
I've started using Mathematica recently. I quite like it: I've used Sympy before, which was good, but nowhere near as "good" as Mathematica. How does it compare to the Lisp Machine operating systems? There's some vague resemblance to Lisp in treating symbols as a basic type of object. In the Mathematica use-case, these symbolic values are used to stand for algebraic variables or unknowns. Undeclared variables by default have symbolic type, with their own names being their values. (I know that other CASes do similar things here). Also, algebraic manipulations produce expressions which double as Mathematica code, which resembles the meta-programming features of Lisp. There's even glimpses of reactive programming in the way you construct interactive plots.
I know this is "uncouth" because it's commercial software, but Mathematica is one of the most interesting programs I've ever used. [edit] Might something like this be the future?
nine_k
> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.
Oh man, wat?
I love lisp as much as the next guy.
But you absolutely can have library mess, memory leaks, and millions of lines of code using a Lisp. You arguably can have a "multi-language" mess, too, because Lisp gives you wonderful tools to create DSLs; I'd say creating a language that fits your needs, and then using it, is the right way to use Lisp.
I use Emacs daily, and see how an all-Lisp environment can make for a good, productive interactive experience. More efforts in this area would be quite welcome, but this is a shell, not a kernel.
I still suppose that systems software, and especially the key parts of an OS, need a language more like Rust than like Lisp, with a good affinity to raw hardware, and a ton of static guarantees.
amelius
The main problem with Unix right now is security / permission control. Unix was built from the perspective that users potentially don't trust each other, but users all magically trust the applications that are run. In the age of the internet, this doesn't hold anymore, and we need strong permission control.
daly
Who cares if Lisp is popular?
The "lisp epiphany" is real. Either you "get it" or "you don't".
I've been writing lisp programs for 50 years. I've been paid to program in 60 different languages but nothing compares with lisp.
There is an intellectual "distance" between a problem and its machine solution. I call this the "impedance problem". Lisp lets you think at the most abstract and write to the most specific.
Writing changed the world. But if you give most people a blank piece of paper they don't know what to do with such freedom. Lisp is the "blank piece of paper" of programming languages. Everything, literally everything, comes from you.
I loved my Symbolics machine. It was the closest expression of a "thinking platform" I've ever used. IDEs are horrible for thinking, ever interrupting at every keystroke.
Lisp isn't "popular" because it provides a "thinking platform" you can shape to your thoughts.
Lisp will never be popular. The reason should be obvious.
mbrodersen
A number of Lisp fans seem to care. That’s why we regularly see articles on HN trying to convince other developers to use Lisp. It has been going on for years. Article after article written by frustrated Lisp fans, not understanding why the language they love is not mainstream. Often making bizarre claims about non-Lisp developers not being smart enough to “get” Lisp or whatever. Not having a clue that most developers care about a lot more than just the programming language. The best way to show the “power” of Lisp is to develop commercially successful software using it. That will do way more to convince smart developers to try out Lisp than writing yet another Lisp article trying to “sell” Lisp.
daly
For the record, I made (and make) no claims about "non-Lisp developers not being smart enough".
My claim is that Lisp is perfect for thinking about a new idea or a new approach.
For example, I implemented a program that merged Expert Systems and Knowledge Representation into a single system (KROPS) that allowed a domain expert to express their knowledge as rules or as facts. Anything the system learned by either method could be expressed in either representation. Thus, the two representations were "unified".
Another effort involved Human-Robot Cooperation to change a car tire (TIRES). The system could interact with the human through pseudo-natural language, learn rules dynamically, and expand its knowledge base of the current situation in real time. So the system self-modifies and learns through human interaction on the task.
Both of these systems required self-modifying code, which is rather difficult to do in other languages. In Lisp it is trivial.
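To illustrate the point (a hedged sketch of my own, not the actual KROPS or TIRES code): because Lisp code is ordinary list data, a running program can build a new function from data it acquired at runtime and compile it on the spot. All names here (`learn-rule`, `apply-rule`, `flat-tire-p`) are invented for the example.

```lisp
;; Sketch only: rules arrive as plain lists (data) and become
;; compiled functions (code) while the system is running.
(defvar *rules* (make-hash-table :test #'eq))

(defun learn-rule (name params body)
  "Install a new rule at runtime by constructing and compiling
its definition from list data."
  (setf (gethash name *rules*)
        (compile nil `(lambda ,params ,@body))))

(defun apply-rule (name &rest args)
  "Invoke a previously learned rule by name."
  (apply (gethash name *rules*) args))

;; The system "learns" a rule from interaction, then uses it immediately:
(learn-rule 'flat-tire-p '(pressure) '((< pressure 10)))
(apply-rule 'flat-tire-p 5)   ; => T
```

The same mechanism works in reverse: because the rule's source is kept as a list, the system can also inspect, rewrite, and re-compile its own rules, which is the essence of the self-modification described above.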
As for commercial sales, witness:
Axiom, a 1.2-million-line computer algebra program written in Common Lisp, was sold commercially by the Numerical Algorithms Group.
YESOPS, an IBM Expert System program implemented in Common Lisp, was sold commercially.
mbrodersen
> For the record, I made (and make) no claims about "non-Lisp developers not being smart enough".
I am glad to hear that.
mbrodersen
It’s great that there is commercially successful software written in Lisp. However, a lot more needs to be written in Lisp to compete with the hundreds of thousands of commercially successful applications written in other languages. I don’t think that will ever happen. But that’s fine, of course. Pick the language that makes you happy and work with that. It doesn’t have to be a popular language to make you happy.
lispm
> frustrated Lisp fans
Why be frustrated? For example I'm using the latest Apple Silicon laptop and there are a dozen Lisp (and related) systems already ported to it. Years ago it took a lot longer to move to a new platform, especially for open source software implementations with native code compilers.
Happy times.
mbrodersen
I agree. Nothing stops Lisp fans from using Lisp for their own projects. The fact that Lisp isn’t popular, and probably never will be, really shouldn’t matter.
I'm as big a Lisp fan as can be. I'm a proud owner of Symbolics and TI hardware: a MicroExplorer, a MacIvory, two 3650s, and two 3620s. Not to mention an AlphaServer running Open Genera.
Today, we have computers that run Lisp orders of magnitude faster than any of those Lisp machines. And we have about 3–4 orders of magnitude more memory, with 64 bits of integer and floating-point goodness. And Lisp is touted to have remained one of the most powerful programming languages (I think it's true, but don't read into it too much).
Yet, it appears the median and mean Lisp programmer is producing Yet Another (TM) test framework, anaphoric macro library, utility library, syntactic quirk, or half-baked binding library to scratch an itch. Our Lisp programming environments are less than what they were in the 80s because everybody feels the current situation with SLIME and Emacs is good enough.
We don't "need" Lisp machines. We "need" Lisp software. What made Lisp machines extraordinary wasn't the hardware, it was the software. Nothing today is impeding one from writing such software, except time, energy, interest, willpower, and/or money.
Don't get me wrong, there are some Lisp programmers today developing superlative libraries and applications [1], but the Lisp population is thin on them. I'd guess that the number of publicly known, interesting (by some metric), and maintained applications or libraries that have sprung up in the past decade probably fits on one side of a 3"x5" index card. [2]
Though I won't accuse the article's author of such, I sometimes find, in a strange way, that pining for the Lisp machines of yore is actually a sort of mental gymnastic to absolve one of not having written anything interesting in Lisp, and to excuse one from ever doing so.
[1] Just to cherry-pick a recent example, Kandria is a neat platformer developed entirely in Common Lisp by an indie game studio, with a demo shipping on Steam: https://store.steampowered.com/app/1261430/Kandria/
[2] This doesn't mean there aren't enough foundational libraries, or "batteries", in Lisp. Though imperfect, this is by and large not an issue in 2022.