Syzygies
jasonwatkinspdx
Years ago I talked to an older APL pro who got lucky enough to work on a project where he wrote APL interactively on a Cray. I can't remember most of the details, but holy cow, that must have been like being handed a fighter jet in the biplane era.
tdeck
> The tough questions were “dead key problems”. How do you write the following program, if the following keys are broken?
I've never heard of this style of programming question. It's fascinating that we've gone through so many styles of interview over the decades, from pair programming against unit tests to Google-style whiteboard algorithms to early-2000s Microsoft-style brain teasers and "why are manhole covers round".
Syzygies
It's very APL specific. Keys were APL symbols, operators in one keystroke. So if the sum key wasn't available, you'd decode base 1. That sort of thing...
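For readers who don't know APL's decode (⊥): it evaluates a digit vector as a polynomial in a given base, so with base 1 every positional weight is 1 and decode degenerates into a plain sum, which is why it can stand in for a broken + key. A rough sketch of the idea in Python (not APL, purely for illustration):

```python
def decode(base, digits):
    """Evaluate digits as a number in the given base, like APL's base⊥digits."""
    result = 0
    for d in digits:
        result = result * base + d
    return result

print(decode(10, [1, 2, 3]))  # → 123
print(decode(1, [1, 2, 3]))   # → 6, i.e. the sum, without ever pressing the sum key
```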
drno123
This, sir, is why I read HN comments. Thank you.
GordonS
Also just wanted to say thanks for sharing!
I grew up with PCs and a Commodore 64 in the 80's, which I think was the golden era of personal computing, but I absolutely love stories from earlier!
Syzygies
My grandfather had to drop out of high school, after his dad died from an infection. He unloaded trains for the A&P grocery chain, and proposed that they reorder shipments to save money. He retired in charge of half of New England. His computer center was a bank of women at fancy adding machines.
Fresh out of college, my father helped a senior programmer at Kodak automate testing for their film developing chemical baths. This replaced dozens of people. Kodak found them other positions, and moved my dad to the research labs. He used computers from the beginning, and told me many stories. On a family night in the 1960's I got to play the original "space wars".
In 1974 he came up with the color sensor pattern now used in nearly all digital cameras.
dboreham
I also got to play space wars on a pdp-1 but my father didn't invent digital photography :(
Wondering though...I visited the Rochester labs in the mid-80s -- hardware I had a hand in was being used there for projects that I suspect were secret at the time such as computer simulation of photochemistry and warping of photoint images to conform to map data. I wonder if I met your father?
scottlawson
He invented the Bayer pattern??!
gnicholas
Fellow Swattie here, 2004 vintage. The CS department has changed dramatically even since my time there. We took classes in the observatory building, but they’ve long since outgrown that space. They’re now in the new Science Center building, and I heard they’ve actually outgrown that space as well. It’s a very popular major!
dboreham
Read that as "We took classes in observatory building". Imagining the prof explaining how to grind a mirror, how revolvers can be used to make a very high resolution angular positioning system, and so on.
gnicholas
Sadly, the only linkage between the intended purpose of the building and the classes we took inside was the Sun workstations.
retzkek
Thanks for sharing, I love these types of stories. Really makes me pine for the "old" days, and wonder if there's a parallel universe where technology took a very different route such that languages like APL, Lisp, and Smalltalk are used instead of JavaScript, Java, and C#, and what that world looks like.
> Some Lisp programmers had similar experiences, back in the day.
About 20 years ago (so not quite so far back) I was an engineering intern in an industry (nuclear energy) with two main tools: heavy number crunching software in Fortran, and Excel for everything else. The plant I was working at had just gotten some software for managing and tracking fuel movement (reactor cores are composed of several hundred fuel bundles, which are re-arranged every 1-2 years), and my task was to set it up and migrate historical data from Excel spreadsheets, either by entering data manually with the GUI (which wasn't really that good) or using the primitive built-in import/export functions (CSV-based probably). Good intern task, right?
At some point I noticed this odd window running in the background whenever I started the program: "Gold Hill Common Lisp". Hm, what's this, it seems to have some kind of command line... and so I dived down the rabbit hole of the CL REPL and live image manipulation. I discovered the apropos command (or maybe GHCL had an actual UI for it?), which let me learn about all the internal data structures and methods, which I was able to use to quickly configure the plant specifics and import data.
"Oh, you're done already? OK next we need to get these custom reports out, talk to the vendor about implementing them. And see if you can now export data into our old report generator" (another spreadsheet of course). So I dutifully started the requisition process to get custom reports added, but while that was working through the system, I was stretching my new-found Lisp knowledge to not just dump out report data, but add the functionality to the UI. Coming from a background in C and Fortran I was fully ingrained with "write, compile, run" being how things worked. Imagine how much it blew my mind when I found out I could call a function in the REPL and actually add a menu to the running program!
One feature of the software was it could be used for "online" fuel movement tracking, which was traditionally done on paper, in duplicate. It's probably still done that way for good reasons, but still nice to have that electronic tracking. I was so proud when we were demonstrating it to the reactor operators, and they asked if we could add some little functionality (the details escape me), and I was able to say "yep no problem!" No requisitions, then back-and-forth with the vendor, eventually getting the feature in a year maybe. Really wish all software was so powerful (although admittedly my hijinks were a bit of a QA nightmare, but the software wasn't considered safety-related since there were many checks and paper records).
Fast-forward a couple years, after much coursework in Fortran and Matlab, I'm about to graduate and am now interviewing with the vendor. Question comes up "so what would you change about our software?" "Well, the interface is a bit clunky, I'd probably want to re-write it in a modern language like C++" :facepalm:.
Only years later, re-discovering CL, along with Racket and Clojure, did it occur to me how much was wrong with that response, and how sad that the key lesson of that semester internship had gone over my head.
Syzygies
I love Lisp and Scheme too. Though like sex at 19, nothing has ever felt quite like the first months with APL.
For "production" math research, nothing comes close to Haskell for me. Every time I consider straying, the ease of parallelism draws me back.
I have written a fair bit of Scheme, using my own preprocessor that avoids most parentheses. I stay clear of that loaded debate, this is for personal use. The code is poetic, though not as dense as APL or Haskell. As Bill Joy once opined, what you can fit on a screen matters.
My own favorite Haskell code is a very terse implementation of monadic parsing. Typed trees are not far off from strings, and then one has algebraic data types as parsing. (Pattern matching is parsing a bit at a time, without reifying the activity for program control.)
APL gets unexpected mileage from a core of array handling. I dream of a Lisp-like language tuned to parse algebraic data types as its core activity, with macros as its "middle of the plate" pitch rather than as a strapped-on afterthought. (I'm not trolling here, this is my honest opinion. In 10,000 runs of the simulation, I doubt Lisp macros would be this clumsy in most runs.)
Ultimately we all long for better language tools to express pattern as code. My electric guitar never sounded like what's in my head, and for code I'm still reaching too. Though APL was wonderful in its day.
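For readers unfamiliar with the monadic-parsing style mentioned above: a parser is a function from input to a list of (result, remaining-input) pairs, and `bind` threads results between parsers. A minimal Python sketch of that idea, per Hutton and Meijer's classic formulation (the Haskell original would be far terser; all names here are illustrative):

```python
def item(s):
    """Consume one character; fail (empty list) on empty input."""
    return [(s[0], s[1:])] if s else []

def bind(p, f):
    """Sequence parser p, feeding each of its results into the parser-producing f."""
    return lambda s: [pair for (a, rest) in p(s) for pair in f(a)(rest)]

def char(c):
    """Match exactly the character c."""
    return bind(item, lambda a: (lambda s: [(a, s)]) if a == c else (lambda s: []))

# Parse the two-character sequence "ab", returning both characters.
ab = bind(char('a'), lambda a: bind(char('b'), lambda b: lambda s: [((a, b), s)]))
print(ab("abc"))  # → [(('a', 'b'), 'c')]
print(ab("xbc"))  # → []
```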
FullyFunctional
Playing with J (~ APL) certainly feels magical (though I can never remember the syntax a day later), and APL, like Lisp, gets a lot of leverage from a powerful vocabulary on a rich data structure (arrays and lists respectively). However the "One Great Datastructure" flattens the domain and doesn't self-document, nor constrain you from unintended uses, the way a rich type system does, so I find reading and maintaining Lisp (and I assume the same applies to APL) frustrating and tedious.
Writing this, I'm reminded how J felt full of "tricks", much like Perl: there are tricks you can use to get the result you want that aren't necessarily the most faithful expression of the problem.
pjmlp
> Thanks for sharing, I love these types of stories. Really makes me pine for the "old" days, and wonder if there's a parallel universe where technology took a very different route such that languages like APL, Lisp, and Smalltalk are used instead of JavaScript, Java, and C#, and what that world looks like.
Easy, here is some time travel,
"Alan Kay's tribute to Ted Nelson at "Intertwingled" Festival"
https://www.youtube.com/watch?v=AnrlSqtpOkw
"Eric Bier Demonstrates Cedar"
https://www.youtube.com/watch?v=z_dt7NG38V4
"Yesterday's Computer of Tomorrow: The Xerox Alto │Smalltalk-76 Demo"
https://www.youtube.com/watch?v=NqKyHEJe9_w
However, to be fair, Java, C# and Swift, alongside their IDEs, are the closest among mainstream languages to that experience, unless you get to use Allegro, LispWorks or Pharo.
Qem
Nice links. For a taste of what programming in Pharo is like, see https://avdi.codes/in-which-i-make-you-hate-ruby-in-7-minute...
pattisapu
This post is almost Pynchonesque! Great story and thanks for sharing!
rattray
Very fun story, thank you for sharing! Small question of clarification:
> I took a professor’s 300 line APL program translated literally from BASIC, and wrote a ten line program that was much faster.
The professor translated BASIC->APL, and you translated APL->more concise APL?
Syzygies
Yes [fixed]. And in an interpreted language, moving the work to a single operation was a big win.
hhyndman
Back in university (1974), I took a course in AI. The prof wanted us to write a brute-force solution to solve the 8-queens problem -- any language!
I wrote the solution in APL in about an hour and it only had 3 lines of code. The rest of the class spent days on their terminals and keypunches trying to solve it. Most solutions took hundreds of lines of code.
I got a D from my professor. I questioned why and was told that it was unreadable, and that the solution was inefficient. This annoyed me because he didn't know APL, and I figured that since I solved the problem in one hour, while the rest took days, it was very efficient.
I protested the result with the department head (who liked APL) and ended up getting an A+. As you can imagine, all the rest of my assignments, written in a variety of languages, were graded by that AI prof with significant prejudice.
I passed nonetheless. I loved APL and ended up working for one of the major APL providers as my first job out of school.
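The APL program itself isn't shown, but the brute-force idea translates directly: place one queen per row, try every column permutation (which rules out row and column attacks by construction), and keep the arrangements where no two queens share a diagonal. A hedged Python sketch of that approach:

```python
from itertools import permutations

def queens(n=8):
    """Brute-force n-queens: filter column permutations by diagonal safety."""
    return [p for p in permutations(range(n))
            if len({p[i] + i for i in range(n)}) == n      # anti-diagonals all distinct
            and len({p[i] - i for i in range(n)}) == n]    # main diagonals all distinct

print(len(queens(8)))  # → 92 solutions
```

Not three lines of APL, but close in spirit: the whole search is one expression over an array of candidates.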
travisjungroth
> I questioned why and was told that it was unreadable, [...] this annoyed me because he didn't know APL.
I often tell people that Spanish is unreadable if you don't know Spanish. This also applies to language features! It's only fair to call things "unreadable" if you have the full context to understand but still find it hard.
S4M
> I got a D from my professor. I questioned why and was told that it was unreadable, and that the solution was inefficient.
That is such a bad faith argument, how can a brute force solution be efficient or inefficient?
jiggawatts
Constant factors, heuristics, memory usage, etc...
There was a discussion on array programming languages here recently where someone proudly showed off a K language solution to a simple problem, stating that the K solution was efficient because it could solve it in 1.2 microseconds. I used Rust to solve it in 5.5 nanoseconds, which is over 200x faster. Both used "brute force" with no cleverness, but there's "bad" brute force and "good" brute force.
I've had a similar experience at university while using Haskell. It's a very elegant lazy pure functional language, but it's glacially slow. The natural, low-friction path is very inefficient. The efficient path is unidiomatic and verbose.
I hear people making similar observations about F# also. It's fast to program in, but if you also need performance then it is no better than more mainstream languages -- in fact worse in some ways because you're "going against the grain".
whb07
Compared to Rust/C/C++? Sure!
But Haskell vs most others, it’s faster and compiles down to a binary executable.
F# runs on .NET and is comparable to Haskell but not AOT.
still_grokking
> I've had a similar experience at university while using Haskell. It's a very elegant lazy pure functional language, but it's glacially slow. The natural, low-friction path is very inefficient. The efficient path is unidiomatic and verbose.
Well, if you run a Haskell program on a "C-Machine", of course a comparable program in a "C-Language" will be faster — as it doesn't need to bridge any gap in execution semantics.
The point is: almost all modern computers are "C-Machines". Modern CPUs even go a long way to simulate a "PDP-7 like computer" to the outside world, even though they work quite differently internally. (The most effective optimizations, like cache hierarchies, pipelining, out-of-order execution, JIT compilation to native instructions [CPU-internal "micro-ops"], and some more magic, are "hidden away"; they're "transparent" to programmers and often not even accessible to them.) So not only is there nothing but "C-Machines", those "C-Machines" are even highly optimized to execute "C-Languages" as efficiently as possible, but nothing else! If you want to feed in something that's not a "C-Language", you first have to translate it to one. That transformation will almost always make your program less efficient than writing it (by hand) in a "C-Language" directly. That's obvious.
On the other hand, running Haskell on a "Haskell-Machine"¹ is actually quite efficient. (I think, depending on the problem, it even outperforms a "C-Machine" by some factor; I don't remember the details and would need to look through the papers to be sure…) On such a machine an idiomatic C or Rust program would be "glacially slow", of course, for the same reason as the other way around: the need to adapt execution semantics before such "non-native" programs can run will obviously make the "translated" programs much slower than programs built in languages much closer to the execution semantics of the machine's hardware-implemented evaluator.
That said, I understand why we can't have dedicated hardware evaluators for every (significantly different) kind of language. Developing and optimizing hardware is just too expensive and takes too much time. At least if you'd like to compete with the status quo.
But I could imagine, in the future, some kind of high-level "meta language" targeting FPGAs which could be compiled down to efficient hardware-based evaluators for programs written in it. Maybe this could even end the predominance of "C-Languages" when it comes to efficient software? Actually the inherently serial command-stream semantics of "C-Languages" aren't well suited to the parallel data-flow hardware architectures we've been using at the core for some time now (where we even do a lot of gymnastics to hide the true nature of the metal by "emulating a PDP-7" to the outside world, as that's what "C-Languages" are built for and expect as their runtime).
To add to the topic of the submission: Are there any HW implementations of APLs? Maybe on GPUs? (As this seems a good fit for array processing languages).
coopreme
Great story. It's stories like these that make me still hold onto my undergrad work (as bad as it is). Maybe one day my C#, Visual Basic, ASP, PHP, T-SQL and other esoteric projects will be looked at as relics of the past.
tosh
If you want to learn more about array programming languages there is a new podcast series at https://www.arraycast.com with some banter, philosophy, history and a collection of related resources https://www.arraycast.com/resources
olodus
Found it when their first ep was posted here on HN a few months ago. Had seen array langs before but never dared to sit down with them. Their pod made me take the plunge. These langs are fascinating. As someone who likes func programming normally it feels related but with reduce on steroids.
lokedhs
I'm taking the opportunity to mention my project, which implements a language that is inspired by, and mostly compatible with, APL. It has some major differences, such as being lazily evaluated and supporting first-class functions.
It also supports defining syntax extensions, which the standard library uses to provide imperative syntax. This means you can mix traditional APL with your familiar if/else statements, etc.
At this point there isn't much documentation, and the implementation isn't complete, so I'm not actually suggesting that people run out to try it unless they are really interested in APL. I just took this opportunity since APL is mentioned so rarely here.
https://github.com/lokedhs/array
There is an example of a graphical mandelbrot implementation in the demo directory, that may be interesting.
useful
As someone who has used APL professionally to maintain a legacy codebase https://en.wikipedia.org/wiki/Write-only_language
Anyway, I like reduce, shape, membership, find, and/or, and ceiling/floor. I actually like dealing with arrays in this way.
IMO, that is why numpy/matlab is so much better than APL.
sundarurfriend
A lot of people seem to have trouble with symbol-based languages, e.g. regular expressions, some parts of Perl, or APL in this case. That seems to be part of the appeal of Python too, for a lot of people: it's unusually low on non-alphanumeric symbols. I wonder if it has something to do with "Head-voice vs. quiet-mind" [1]. I'm generally on the non-verbal quiet-mind side, and find APL-like languages very intuitive and appealing. Debugging or maintaining them doesn't feel any more difficult than more verbal languages either.
[1] http://web.archive.org/web/20210207121250/http://esr.ibiblio...
jacoblambda
I think that quiet vs verbal mind personality difference is really what separates whether people like which languages.
I personally can't stand languages that are "spoken description". I understand the appeal to others but the languages just don't mesh with my way of thought. When I'm programming or building a system I'm thinking in the sense of abstract transformations and structures not in any spoken structure. Often times for me it's easier to draw out what I'm thinking of rather than explain it since there's not necessarily a verbal representation behind what I'm thinking of until I sit down and try to come up with one.
Syzygies
There was a study of Harvard undergraduates that demonstrated Greek letters made math harder.
I tell my students that Columbia undergraduates are of course smarter, but still...
xelxebar
Interesting. I was under the impression that Iverson intended APL to also read almost like English, provided you knew the names of the operators and idioms.
Aaron Hsu has some talks showing this off about his co-dfns compiler.
jonstaab
What is your setup like? I was messing around with it just now using Homebrew's gnu-apl package, and it seems like a toy language; for example, scripting mode is sort of bolted on top of interactive mode, since you have to add an ")OFF" command at the end of your script. How do you handle modules?
mlochbaum
GNU APL is mostly a reimplementation of APL2 from the 80s, with some additions that in my opinion do nothing to get it out of the 80s. Dyalog has namespaces, but scripting support is only due to be released in the next version, 18.1.
So I don't know of any APL that allows module-defining scripts. This is really unfortunate since there's no technical reason to prevent it. With lexical scoping (Dyalog has it, GNU doesn't), it's easy to define a module system and I did this in my APL reboot called BQN: https://mlochbaum.github.io/BQN/doc/namespace.html .
lokedhs
BQN is really impressive, and implements a language which is similar to APL, but without a lot of the legacy baggage that Dyalog has gathered over the years.
For someone who wants to get started with array languages and has no need to be compatible with APL, this is probably the best place to start.
It also has good documentation, unlike my array language. I need to put a lot of effort into it to get even close to what BQN did.
neolog
I see it's self-hosted. How much code needs to be written in another language in order to bootstrap the whole thing?
useful
Various interpreters have ways to make external calls via COM/web/etc. APL is basically Python calling C++/C#/Java/etc.
Seeing pure APL for XML parsing is... interesting. Most interpreters support saving/reading functions in a more procedural way.
BiteCode_dev
I suppose if array languages get popular enough, we will get a module to use them as a DSL for libraries like numpy, just like we have regex for strings, instead of a whole language dedicated to them.
It would be a win/win: you gain the strength of J, K and APL for array processing, without their weaknesses for everything else.
And just like with regex, you'll get fat disclaimers in doc telling you to not get carried away too much as it can quickly become unreadable.
robomartin
APL is not unreadable any more than mathematical or musical notation is. Sure, to someone who doesn't know the notation it looks like an incomprehensible mess. So do math, music, Greek, Chinese, Arabic, Hebrew, etc.
I used APL professionally every day for ten years. I can read it. I can touch-type it. And I don’t need a specially-labelled keyboard (even thirty years later).
This should not be surprising at all. A pianist does not need labeled keys and people familiar enough with the above-listed spoken languages can touch-type them without much effort.
While, sadly, APL has no practical application in modern software engineering (it stagnated and became irrelevant and impractical) it is wrong to look at the brilliant use of notation as a tool for the concise communication and expression of ideas and list it as a negative. Not being able to speak, read or write Chinese does not make it a bad language.
0xdeadbeefbabe
> Not being able to speak, read or write Chinese does not make it a bad language.
Well that's the problem. Not being able to read or write APL makes it more fun to learn.
O_H_E
In Julia: https://github.com/shashi/APL.jl
bluenose69
Thanks for posting this intriguing link. A glance at the Jupyter notebook on this site will bring a smile to a Julia user who grew up on (or has somehow encountered) APL.
snicker7
What an incredibly concise, elegant implementation.
gugagore
The regex DSL only works for strings. I lament that I cannot use something regex-like to match general sequences, e.g. a sequence of tokens, instead of only strings (sequence of characters).
The operations could be the same. There are classes, and operators for matching 0, 1, or more repetitions, etc.
Array languages are powerful especially when you have arrays with arbitrary elements, including arrays themselves. Good luck using a regex DSL to match a sequence of strings, where you might want to define a string class (analogous to a character class) as a regex itself.
mumblemumble
> I lament that I cannot use something regex-like to match general sequences, e.g. a sequence of tokens
You can. It's not typically built into a programming language's standard library. But there are plenty of general-purpose automata-building libraries out there, and some of them do provide DSLs. At least to the extent that the regular expressions you're using are actually regular (many aren't), all a regex is is a domain-specific specialization of nondeterministic finite automata.
I sometimes lament that I don't see them, or hand-coded automata, more often. This was CS101-level stuff when I was in college, and it's pretty easy to code your own even if you don't have a good library available in your language. And, for problems where they're appropriate, using them typically yields a result that's simpler and easier to maintain than whatever ad-hoc alternative you might see instead.
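One common workaround (my own illustration, not something from the thread): encode each distinct token as a single private-use character, then reuse the ordinary regex engine on the encoded string. A Python sketch, where the tiny pattern language treats `*`, `+` and `?` as quantifiers on the preceding token:

```python
import re

def token_match(pattern_tokens, tokens):
    """Match a 'regex over tokens': pattern_tokens mixes literal tokens with
    the quantifiers '*', '+', '?' applied to the preceding token."""
    alphabet = {}
    def enc(tok):
        # Assign each distinct token a Unicode private-use character.
        if tok not in alphabet:
            alphabet[tok] = chr(0xE000 + len(alphabet))
        return alphabet[tok]
    pat = "".join(t if t in "*+?" else re.escape(enc(t)) for t in pattern_tokens)
    return re.fullmatch(pat, "".join(enc(t) for t in tokens)) is not None

# 'if' '(' expr* ')' as a token pattern:
print(token_match(["if", "(", "expr", "*", ")"],
                  ["if", "(", "expr", "expr", ")"]))  # → True
print(token_match(["if", "(", "expr", "*", ")"],
                  ["if", ")"]))                       # → False
```

This only covers truly regular patterns, of course; anything needing backreferences or nesting wants a real automaton or grammar library.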
bmn__
> using a regex DSL to match a sequence of strings, where you might want to define a string class (analogous to a character class) as a regex itself.
http://p3rl.org/retut#Defining-named-patterns
https://docs.raku.org/language/grammar_tutorial#The_technica...
If you want a demo, reply with a concrete example I can implement.
jasonwatkinspdx
The now defunct Viewpoints Research org chased an idea similar to this. It was a meta language that was based around PEGs, intended to allow the easy creation of DSLs. I imagine the papers are still up somewhere. It was called OMeta.
hcarvalhoalves
There are a few, here's one: https://clojure.org/guides/spec#_sequences
b2gills
I'm not sure of the need to do that. Not for parsing anyway.
A well designed regex system is enough without tokenization.
The parser for the Raku language, for example, is a collection of regexes composed into grammars. (A grammar is a type of class where the methods are regexes.)
We could probably do the token thing with multi functions, if we had to.
    multi parse ( 'if',  $ where /\s+/, …, '{', *@_ ) {…}
    multi parse ( 'for', $ where /\s+/, …, '{', *@_ ) {…}
    …
Or something like that anyway. (Note that `if` and `for` etc. are keywords only when they are followed immediately by whitespace.)
I'm not sure how well that would work in practice, as hypothetically Raku doesn't start with any keywords or operators. They are supposed to seem like they are added the same way keywords and operators are added by module authors. (In order to bootstrap it we of course need to actually have keywords and operators there to build upon.)
Since modules can add new things, we would need to update the list of known tokens as we are parsing. Which means that even if Raku did the tokenization thing, it would have to happen at the same time as the other steps.
Tokenization seems like an antiquated way to create compilers. It was needed as there wasn't enough memory to have all of the stages loaded at the same time.
---
Here is an example parser for JSON files using regexes in a grammar to show the simplicity and power of parsing things this way.
    grammar JSON::Parser::Example {
        token TOP       { \s* <value> \s* }
        rule object     { '{' ~ '}' <pairlist> }
        rule pairlist   { <pair> * % ',' }
        rule pair       { <string> ':' <value> }
        rule array      { '[' ~ ']' <arraylist> }
        rule arraylist  { <value> * % ',' }

        proto token value {*}
        token value:sym<number> {
            '-'?                          # optional negation
            [ 0 | <[1..9]> <[0..9]>* ]    # no leading 0 allowed
            [ '.' <[0..9]>+ ]?            # optional decimal point
            [ <[eE]> <[+-]>? <[0..9]>+ ]? # optional exponent
        }
        token value:sym<true>   { <sym> }
        token value:sym<false>  { <sym> }
        token value:sym<null>   { <sym> }
        token value:sym<object> { <object> }
        token value:sym<array>  { <array> }
        token value:sym<string> { <string> }

        token string {
            「"」 ~ 「"」 [ <str> | 「\」 <str=.str_escape> ]*
        }
        token str { <-["\\\t\x[0A]]>+ }
        token str_escape { <["\\/bfnrt]> }
    }
A `token` is a `regex` with `:ratchet` mode enabled (no backtracking).
A `rule` is a `token` with `:sigspace` also enabled (whitespace becomes the same as a call to `<.ws>`).
The only one of those that really looks anything like traditional regexes is the `value:sym<number>` token. (Raku replaced non-capturing grouping `(?:…)` with `[…]`, and character classes `[eE]` with `<[eE]>`.)
This code was copied from https://github.com/moritz/json/blob/master/lib/JSON/Tiny/Gra... but some parts were simplified to be slightly easier to understand. Mainly I removed the Unicode handling capabilities.
It will generate a tree based structure when you use it.
my $json = Q:to/END/;
{
"foo": ["bar", "baz"],
"ultimate-answer": 42
}
END
my $result = JSON::Parser::Example.parse($json);
say $result;
The above will display the resultant tree structure like this:

    「{
        "foo": ["bar", "baz"],
        "ultimate-answer": 42
    }
    」
     value => 「{
        "foo": ["bar", "baz"],
        "ultimate-answer": 42
    }
    」
      object => 「{
        "foo": ["bar", "baz"],
        "ultimate-answer": 42
    }
    」
       pairlist => 「"foo": ["bar", "baz"],
        "ultimate-answer": 42
    」
        pair => 「"foo": ["bar", "baz"]」
         string => 「"foo"」
          str => 「foo」
         value => 「["bar", "baz"]」
          array => 「["bar", "baz"]」
           arraylist => 「"bar", "baz"」
            value => 「"bar"」
             string => 「"bar"」
              str => 「bar」
            value => 「"baz"」
             string => 「"baz"」
              str => 「baz」
        pair => 「"ultimate-answer": 42
    」
         string => 「"ultimate-answer"」
          str => 「ultimate-answer」
         value => 「42」
You can access parts of the data using array and hash accesses.

    my @pairs  = $result<value><object><pairlist><pair>;
    my @keys   = @pairs.map( *.<string>.substr(1,*-1) );
    my @values = @pairs.map( *.<value> );

    say @pairs.first( *.<string><str> eq 'ultimate-answer' )<value>;
    # 「42」
You can also pass in an actions class to do your processing at the same time.
It is also a lot less fragile. See https://github.com/moritz/json/blob/master/lib/JSON/Tiny/Act... for an example of an actions class.
---
Note that things which you would historically talk about as a token such as `true`, `false`, and `null` are written using `token`. This is a useful association as it will naturally cause you to write shorter, more composable regexes.
Since they are composable, we could do things like extend the grammar to add the ability to have non string keys. Or perhaps add `Inf`, `-Inf`, and `NaN` as values.
    grammar Extended::JSON::Example is JSON::Parser::Example {
        rule pair { <key=.value> ':' <value> }  # replaces the existing rule

        # adds to the existing set of value tokens
        token value:sym<Inf>  { <sym> }
        token value:sym<-Inf> { <sym> }
        token value:sym<NaN>  { <sym> }
    }
This is basically how Raku handles adding new keywords and operators under the hood.
RyanHamilton
I've written an open source version of q: http://www.timestored.com/jq/ It's implemented in Java. Your idea is interesting. Allowing q and Java intermixed... combining it inside Java like LINQ in C# would be interesting.
mlochbaum
Since this seems to have brought TryAPL down, there are other options listed at [0]. In particular, ngn/apl[1] is a JavaScript implementation and runs client-side. But it's limited relative to Dyalog (used on TryAPL) and no longer under development.
dang
> Since this seems to have brought TryAPL down
Apparently that always happens:
Try APL in your browser - https://news.ycombinator.com/item?id=9774875 - June 2015 (27 comments)
Try APL online - https://news.ycombinator.com/item?id=4090097 - June 2012 (4 comments)
sondr3
I had a blast learning to write and read APL for a course at my university where we chose and presented the papers from HOPL IV. If you want a fairly quick and easy read about the history of APL I can heartily recommend the paper "APL Since 1978". A small taste: `twoSum ← {1↑⍸⍺=(⍵∘.+⍵)}`, a dyadic function to find the indices of (the) two elements in an array that sum to ⍺, for example: `9 twoSum 2 7 11 15` will return `0 1`. Though I doubt I'll ever write any larger programs with it, I've had a lot of fun with it.
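For comparison, a rough Python rendition of that dyadic function (zero-based indices, matching the APL example's output; names are my own): it mimics forming the outer sum ⍵∘.+⍵, finding where it equals ⍺ (⍸), and taking the first index pair (1↑).

```python
def two_sum(target, xs):
    """Return the first (i, j) pair, in row-major order, with xs[i] + xs[j] == target."""
    for i, a in enumerate(xs):
        for j, b in enumerate(xs):
            if a + b == target:
                return (i, j)
    return None

print(two_sum(9, [2, 7, 11, 15]))  # → (0, 1)
```

Note that, like the APL outer product, this pairs each element with itself too, so `two_sum(6, [3, 1, 5])` returns `(0, 0)`.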
tosh
> APL Since 1978
dang
One past thread:
APL Since 1978 [pdf] - https://news.ycombinator.com/item?id=23510433 - June 2020 (21 comments)
Jtsummers
If APL interests you, I worked through (most of) Mastering Dyalog APL [0] a while back. It is very well paced and organized. The vast majority of it also works with GNU APL, though not all.
cardanome
Does anyone know of an input method for APL that works similar to an IME that you would use for Japanese?
Basically you type the name of the operator in latin characters and get the proper symbol autocompleted.
I only see direct key to symbol mappings which might be fine for a full time APL dev but offer a bit too much of a learning curve for just trying it out.
WorldMaker
The Windows emoji keyboard (Win+. or Win+; whichever is more comfortable for you) has a lot of the Unicode math symbols under its Symbols tab (marked with an omega). It has a pretty good IME-ish type to search experience for regular emoji, but doesn't support type to search under the Symbols tab. (I wish it did and hope it is something they consider adding.)
Jtsummers
The linked site uses a couple methods:
`i => ⍳ (iota)
ii<tab> => ⍳
In some editors you can change the prefix character (in Emacs I changed it to . almost immediately). Also in Emacs (though I didn't try this with APL) you can use a TeX-based input method, so if you type: \iota
You will get ⍳
abrudz
The RIDE interface (https://github.com/dyalog/ride) allows you to type double-backtick and then a search word. Screenshot: https://i.imgur.com/kagYC73.png
keutoi
Emacs has a good quail-completion system for `GNU-APL`.
papito
Topical! One of the most recent CoRecursive episodes has a guest with a fascinating take on APL, even though the episode itself is completely unrelated.
https://corecursive.com/065-competitive-coding-with-conor-ho...
Get yo actuary tables on.
agbell
Thanks for sharing!
A question I had with APL was how you actually type it in; it turns out you just use a backtick as a prefix, like a leader key in vim. Conor walked me through solving something with TryAPL in this video:
Avshalom
You can also set it as an alt layout on your keyboard. I'm on Linux, so I set it up so that holding the Win key switches to APL.
patrickthebold
The main reason I want to learn APL is for the white-boarding exercises during interviews. Most places will let you write in your strongest language.
gaogao
Yes, but go with something too far off the beaten path and you just get https://aphyr.com/posts/341-hexing-the-technical-interview, which while really fun, doesn't get callbacks.
(I've always wanted to try doing a white-boarding interview in a visual language like Scratch)
patrickthebold
I've been meaning to write 'bullshitting the technical interview' where I neither know how to solve the problem nor APL but end up convincing the interviewer that I know both.
haskellandchill
Haha using APL in interviews (outside of finance) would be legendary.
randomswede
When I was more regularly doing coding interviews, I would pick a problem suited for the strengths of the candidate's self-identified strongest language, but basically go "use whatever language you want, but beware that if it's not one of <short list>, I will have to transcribe your code and run it for the final assessment".
I had Haskell, OCaml, and Ruby thrown at me (no, none of those were on the list). None ran on the first try. I could rehab the Ruby to working (it was a silly mistake, only a small amount of score knocked off), but I could not rehab the Haskell or the OCaml, so those ended up as a "recommend no-hire".
Bit of a shame: had they chosen the Python their CV indicated they preferred, it may well have been a "strong hire". But failing to produce runnable code that can't easily be fixed, in a language explicitly recommended against, indicates some possible red flags.
haskellandchill
Bit of a shame? It was your decision. I don't see that as a red flag unless the candidate had time to produce running code on their own development setup. In a timed interview producing running code is irrelevant in my opinion, but hey you chose your rules and got your results.
In the mid-seventies at Swarthmore College, we were mired in punched-card Fortran programming on a single IBM 1130. The horror: a machine less powerful than the first Apple II. My job, six hours a week, was to reboot after each crash. People waited hours for their turn to crash the machine. I let a line form once people had their printouts. I'd find the single pair of brackets in a ten-line listing, and I'd explain how their index was out of bounds. They thought I was a genius.
Late one Saturday night, I made a misguided visit to the computer center while high, smelled sweat and fear, and spun to leave. Too late, a woman's voice: "Dave! I told Professor Pryor he needed you!" We didn't know that Fred Pryor was the economics graduate student freed in the 1962 "Bridge of Spies" prisoner exchange. Later he'd learn that I was his beagle's favorite human, and I'd dog-sit and find steaks left for me that I couldn't afford, but for now I feared him. So busted! Then I heard this voice: "See these square brackets? See where you initialize this index?" He was spectacularly grateful.
One cannot overstate the rend in the universe that an APL terminal presented, catapulting me decades into the future. I quickly dreamed in APL. For $3 an hour for ten hours (a massive overcharge) I took a professor’s 300 line APL program translated literally from BASIC, and wrote a ten line APL program that was much faster. One line was classic APL, swapping + and * in an iterated matrix product for max and min. The other nine lines were input, output. The professor took years to realize I wasn’t also calling his code, then published my program.
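Swapping + and × for max and min turns an iterated matrix product into a computation over the (max, min) semiring; iterating it on a capacity matrix yields widest (bottleneck) paths. That interpretation is my guess at the kind of problem involved, since the comment doesn't say what the professor was computing. A minimal sketch:

```python
def maxmin_product(A, B):
    """One step of the (max, min) 'matrix product' described above:
    the ordinary sum becomes max, the ordinary product becomes min.
    Iterating this on an adjacency matrix of edge capacities computes
    the widest path between every pair of nodes."""
    n = len(A)
    return [[max(min(A[i][k], B[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

# Capacities: 0 -> 1 with width 5, 1 -> 2 with width 3, no direct 0 -> 2.
A = [[0, 5, 0],
     [0, 0, 3],
     [0, 0, 0]]
C = maxmin_product(A, A)
print(C[0][2])  # -> 3, the bottleneck of the two-hop path 0 -> 1 -> 2
```

In APL this substitution is a one-character change to the inner product, which is why the rewrite could shrink 300 lines of translated BASIC to one line of real work.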
Summer of 1977 I worked as a commercial APL programmer. Normally one never hires college students for the summer and expects them to be productive. The New York-based vice president was taking the train every day to Philadelphia because the Philly office was so far underwater, and desperate to try anything to save himself the commute. He knew Swarthmore had a terminal, and heard about me. At my interview I made a home-run derby of the questions from the Philly boss. The VP kept trying to intervene so he could put me in my place before hiring me. The tough questions were “dead key problems”. How do you write the following program, if the following keys are broken?
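One classic dead-key answer, explained elsewhere in the thread: if the sum key is broken, APL's decode (⊥) in base 1 still sums a vector, because every positional weight becomes 1. A hedged Python model of decode, to show why the trick works:

```python
def decode(base, digits):
    """Model of APL's decode (⊥): evaluate digits in a given base,
    e.g. 2 ⊥ 1 0 1 is 5. With base 1, every positional weight is 1,
    so 1 ⊥ v is just the sum of v -- the dead-key workaround for +."""
    value = 0
    for d in digits:
        value = value * base + d  # Horner's rule
    return value

print(decode(2, [1, 0, 1]))      # -> 5
print(decode(1, [2, 7, 11, 15])) # -> 35, same as summing the vector
```
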
Our client was a mineral mining company, our task a reporting system. The reports were 2-dimensional projections of a 9-dimensional database. The accountants wanted all totals to be consistent across reports, and to be exactly the sums of their rounded components. I broke the news to our team that we needed to start over, rounding the 9-dimensional database once and for all before generating each report. This took a few weeks; I wrote plenty of report-generation helper routines. My coworkers overheard me say on a phone call that I was being paid $5 an hour, and at the time I didn't understand which way they were shocked. I didn't have much to do for the rest of the summer.
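The discipline described here, round once at the finest granularity and derive every total from the rounded cells, can be sketched in miniature (a 2-D stand-in for the 9-D database; the function names are illustrative, not from the original system):

```python
# Round the base cells exactly once; every report then sums the same
# rounded values, so totals agree across reports and each total is
# exactly the sum of its rounded components.
def round_once(cells, ndigits=0):
    return [[round(v, ndigits) for v in row] for row in cells]

def row_totals(rounded):
    return [sum(row) for row in rounded]

def col_totals(rounded):
    return [sum(col) for col in zip(*rounded)]

cells = [[1.24, 2.56], [3.45, 4.44]]
r = round_once(cells)
# Any two projections of the rounded data agree on the grand total:
assert sum(row_totals(r)) == sum(col_totals(r))
```

Rounding inside each report instead, after projecting, is what breaks consistency: two reports can then round the same underlying figure differently.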
The mining company VP found me one morning, to ask for a different report, a few pages. He sketched it for me. He found me a few hours later to update his spec. He loved the printout he saw, imagining it was a prototype. “It’s done. I can make your changes in half an hour.”
At a later meeting he explained his own background in computing, how it had been key to his corporate rise. Their Fortran shop would take a month to even begin a project like I had knocked off in a morning, then weeks to finish it. He pleaded with me to pass on Harvard grad school and become his protege.
Some Lisp programmers had similar experiences, back in the day. Today, APL just sounds like another exotic language. In its heyday it was radical.