Hacker News
Daily Digest email

Get the top HN stories in your inbox every day.

emmanueloga_

Warning: Diving into SCIP/Lisp/Scheme can transform how you think about programming... food for thought is always welcome! But applying those ideas wholesale to OOP codebases often backfires or gets pushback from teammates. Languages handle paradigms differently, and going against the grain usually leads to worse code.

Example: After Lisp, you might replace every for loop with forEach or chain everything through map/reduce. But unless you’re working in a language that fully embraces functional programming, this approach can hurt both readability and performance.

At the end of the day, it’s grounding to remember there’s mutable memory and a CPU processing the code. These days, I find data-oriented design and “mechanical sympathy” (aligning code with hardware realities) more practical day-to-day than more abstract concepts like Church numerals.

hmmokidk

The thing is, I find immutable is safer. Less side effects is more predictable. OO is used often when things can just be functions. I love gamedev, and OO makes a lot of sense there because all of the computations and unsafe code is kind of okay. But for webdev FP makes so much more sense. For SAAS something like elixir enables me to write more reliable / less buggy / better tested code.

oersted

Even in GameDev there is a strong trend towards data-oriented programming and stateless logic, led by the ECS (Entity-Component-System) paradigm.

ECS is all about composition rather than inheritance, and decoupling logic from data. It is not strictly immutable, for performance reasons, but it has a similar character to the immutable functional state-management frameworks in WebDev (Redux, Elm and co).

It's not just about maintainability, it actually can be awkward to fit certain patterns into ECS, but it has significant advantages in terms of performance (particularly being CPU cache-friendly) and being able to massively parallelize safely and without having to think too much about it. It can also be a helpful abstraction for distributed computing and networking in multiplayer.
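A toy sketch in Python of the ECS shape described above (all names are illustrative; real engines use packed arrays for cache-friendliness rather than dicts):

```python
# Toy ECS sketch: components live in per-type tables keyed by entity id,
# and systems are plain functions that iterate over entities that have
# the components they need. Composition replaces inheritance.

positions = {}   # entity id -> (x, y)
velocities = {}  # entity id -> (dx, dy)

def spawn(eid, pos, vel=None):
    positions[eid] = pos
    if vel is not None:
        velocities[eid] = vel

def movement_system(dt):
    # Only entities with BOTH a position and a velocity are touched.
    for eid in positions.keys() & velocities.keys():
        x, y = positions[eid]
        dx, dy = velocities[eid]
        positions[eid] = (x + dx * dt, y + dy * dt)

spawn(1, (0.0, 0.0), (1.0, 2.0))
spawn(2, (5.0, 5.0))            # static scenery: no velocity component
movement_system(0.5)            # entity 1 moves; entity 2 is untouched
```

Because each system only reads and writes its own component tables, systems that touch disjoint components can be run in parallel without coordination, which is the parallelism win mentioned above.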

delusional

If I could get one wish, it would be to ban "makes sense" from any engineering discussion ever. Facts do not "make sense"; the world is not a "sense-making" machine. We make sense of the world and of the facts. Something feeling intuitive, which is what people often mean when they claim something "makes sense", just means it slots into your existing experience.

fud101

“Young man, in mathematics you do not understand things. You just get used to them.” - John von Neumann

anhner

How do you deal with lack of types? (I know elixir is adding types but from my understanding it's nothing like e.g. typescript)

I'm thinking about learning elixir but lack of types is kind of a turn off for me.

TacticalCoder

> How do you deal with lack of types?

Not GP but I'm using Clojure for both the front-end (ClojureScript) and the "back-end" (server running Clojure), sharing Clojure code between the two.

Clojure is not typed, but I use Clojure spec. It's not a type system, but it's really nice. You can spec everything, including spec'ing functions: for example, you can say "while in dev, verify that this function indeed returns a collection that is actually sorted every time it is called". I'm not saying "no types + Clojure spec" beats types, but it exists and it helps solve some of the things types are useful for.

https://clojure.org/guides/spec
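For readers who don't know Clojure, here is a rough Python analogue of that dev-time function check (the `check_sorted` decorator is hypothetical, not part of any library):

```python
# Dev-time contract in the spirit of clojure.spec's fdef/instrument:
# every call verifies that the function's return value is sorted.
def check_sorted(f):
    def wrapped(*args, **kwargs):
        result = f(*args, **kwargs)
        assert list(result) == sorted(result), f"{f.__name__} returned unsorted data"
        return result
    return wrapped

@check_sorted
def merge(a, b):
    # Hypothetical function under test.
    return sorted(a + b)

merge([3, 1], [2])  # the wrapper checks the result on every call
```

As in Clojure, the idea is that the check runs during development and testing, and can be stripped or disabled in production.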

shawa_a_a

Not the other commenter, but my team has been using Elixir in production (soft real-time distributed systems) for several years to great success. The approachable syntax has been great for folks new to the language coming on board and sort of, not realising they’re “doing FP”.

Generally I’d say Elixir’s lack of “hard” static typing is more than made up for by what you get from the BEAM VM, OTP, its concurrency model, supervisors etc.

That said if you’re interested in leveraging the platform whilst also programming with types I’d recommend checking out Gleam (https://gleam.run), which I believe uses an HM type system.

cess11

In Elixir specifically you've got structs that are somewhat similar to types, %MyStuff{}, and if you need to you can use Ecto to get some guarantees. You also tend to focus more on the shape of data than the name, e.g. with pattern matching in places like function declarations and case expressions, which emulates quite a bit of what you typically use a type system to accomplish.

Here's a summary of the type system they're exploring for Elixir: https://hexdocs.pm/elixir/main/gradual-set-theoretic-types.h...

tempodox

Fewer side effects.

pjmlp

Like Common Lisp Object System?

Besides, stuff like forEach, map/reduce was already present in Smalltalk collections, and was copied into Object Pascal, C++, even if without first class grammar for it.

Exactly because the underlying memory is mutable, there are mechanisms in ML languages for mutation, when really needed.

llm_trw

>At the end of the day, it’s grounding to remember there’s mutable memory and a CPU processing the code.

And yet goto is even less popular than Lisp despite being the only way that CPUs implement control flow.

What's even more bizarre is that despite caches being the only way to get any performance out of modern CPUs we still don't have a language that treats the memory hierarchy as a first class citizen. The closest is Linux flavored c with a sea of underscores.

taeric

Lisp has goto, if you want it. :)

I confess it has made transcribing some algorithms a bit easier.

renox

I don't find it bizarre: even assembly language doesn't give you much control over the memory hierarchy; there are prefetch instructions and not much else...

llm_trw

So much the worse for assembly language.

Lest we forget, modern x86 machine code is itself an assembly language, and the actual machine code, microinstructions in newspeak, is completely hidden from view.

marvinborner

They give a nice introduction to encoding state as pure functions. In fact, there are many more purely functional encodings for all kinds of data like trees, integers, sum/product types, images, monads, ...

The encodings can be a bit confusing, but really elegant and tiny at the same time. Take for example a functional implementation of the Maybe monad in javascript:

  Nothing = nothing => just => nothing
  Just = v => nothing => just => just(v)
  
  pure = Just
  bind = mx => f => mx(mx)(f)
  
  evalMaybe = maybe => maybe("Nothing")(v => "Just " + v)
  console.log(evalMaybe(bind(Nothing)(n => pure(n + 1)))) // Nothing
  console.log(evalMaybe(bind(Just(42))(n => pure(n + 1)))) // Just 43

solomonb

You can derive these implementations from the recursion principle for your type:

  data Maybe a = Nothing | Just a

  foldMaybe :: (Unit -> r) -> (a -> r) -> Maybe a -> r
The two higher order functions passed into `foldMaybe` are your `Nothing` and `Just` (modulo I added the Unit param to the Nothing case to be a little more precise).
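The same derivation can be sketched in Python rather than Haskell: each constructor is the function that picks its own handler, and the fold merely applies the value to the handlers (a Scott-style encoding; all names here are illustrative).

```python
# A Maybe value *is* its own case analysis: Nothing calls the first
# handler, Just v calls the second with v.
Nothing = lambda on_nothing, on_just: on_nothing()

def Just(v):
    return lambda on_nothing, on_just: on_just(v)

# The recursion principle, foldMaybe :: (Unit -> r) -> (a -> r) -> Maybe a -> r
def fold_maybe(on_nothing, on_just, maybe):
    return maybe(on_nothing, on_just)

assert fold_maybe(lambda: "Nothing", lambda v: f"Just {v}", Just(42)) == "Just 42"
assert fold_maybe(lambda: "Nothing", lambda v: f"Just {v}", Nothing) == "Nothing"
```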

gugagore

If you have any light to shed, I'm wondering about inductive data types and their expressive necessity here: https://news.ycombinator.com/item?id=42166709

SkiFire13

You can see this as replacing an inductive type with its recursor's function type. It's pretty cool in type theory, but not so good for actually programming stuff.

gleenn

Honest question: why is that bad for actual programming stuff? Is it because the type theory is interesting but doesn't really help? Performance?

SkiFire13

In this particular case IMO it's bad because it essentially removes nominal typing for arguably no benefit.

Even in Lean, a dependently typed language where recursors can be made explicit, people prefer using pattern matching instead of them. There is even sugar for transforming some recursors-like functions into pattern matching like syntax. FYI in Lean recursors are marked as non-computable due to performance concerns, so you can use them to write proofs but not programs.

Seen from yet another point of view, this is transforming inductive types in a function corresponding to a visitor. And yet functional programming folks spent years trying to convince people to replace visitors with proper inductive/algebraic data types and pattern matching, so this idea is a step backwards even for them.

jrvieira

I am guessing that most people think the cognitive-load cost is usually not worth the benefits.

I agree that the cognitive load in a language like JS, which is not prepared to accommodate this paradigm, is not worth it.

Even when deciding to use Haskell, we need to weigh the pros and cons wrt the project's goals.

akira2501

It may be elegant mathematically, but then conveyed through a language that is strictly in the ASCII character set without any alignment or internal justification, they're really just painful to look at.

tempodox

In short, the untyped lambda calculus is Turing-complete.

hinkley

[flagged]

marvinborner

I think it's all right if you're used to the notation. The first two lines are tagged unions and will be recognisable as such if you're familiar with encodings like Scott/Church pairs/lists/numbers. Once you understand the structure, the definition of `bind` becomes obvious, as its two arguments represent the cases "is nothing" and "is just", where in the first case Nothing is returned, and in the second case the function is applied to the value inside the Just.

I think that writing such code, if only for educational purposes, can be really helpful in actually understanding how the state "flows" during the monadic bind/return. Typical monad instantiations of Maybe do not give such deep insight (at least to me).

> Just because you can do a thing doesn’t mean you should.

Of course you should, where would be the fun in that?

salawat

>I think it's all right if you're used to the notation.

Higher mathematics in a nutshell.

>Of course you should, where would be the fun in that?

Also higher mathematics in a nutshell.

Narrator asks: Who should we put in charge of <<thing that will affect people in a tangible way>>?

Not the mathematicians! echoes the crowd in unanimity.

Narrator asks: Who will we delegate the task of <<abuse of notation>> to?

The crowd grumbles, arguing amongst themselves whether such a question even warrants an answer. A mathematician stands up, proclaiming "We'll take it!", following up with, "Once you understand the notation involved in my previous statement, you will understand why this outcome is inevitable."

The crowd, seeing the wisdom of not even embarking on that tribulation, assents to the delegation, given the task of undoing the abuse of notation for the legibility of the layperson is also delegated to the aspiring mathematician.

Scene opens on current day...

tmtvl

> The first two lines are tagged unions

Are they? But in the Nothing you have 2 identical members (`nothing' without arguments), won't that throw an exception?

To borrow Rust syntax (pun intended):

  enum Nothing {
    nothing,
    just,
    nothing
  };
That's just weird.

IshKebab

"It's alright once you get used to it" usually means it isn't alright, in my experience. There are exceptions of course.

williamcotton

It’s definitely easier to read in an ML language, that’s for sure!

undefined

[deleted]

6510

If there is a wrong way to do something someone will do it.

hinkley

There’s an old saying attributed to the Inuit: everyone enjoys the smell of their own farts.

zahlman

I've watched the actual SICP lectures before (the 1986 recordings on MIT OCW). They're often praised for the information density, but it actually still wastes a lot of time listening to students' Q&A, the lecturers drawing the class' attention to various attempts at "multimedia" presentation in the classroom, simply not having the entire lesson plan worked out in advance (i.e., not being able to preempt the Q&A) etc. For that matter, the sheer amount of time spent on writing things on a chalkboard really adds up.

And of course the order of the material could be debated and rearranged countless ways. One of my future planned projects is to do my own video series presenting the material according to my own sensibilities.

It's nice to hear that the course apparently still stays true to its roots while using more current languages like Python. Python is designed as a pragmatic, multi-paradigm language and I think people often don't give it enough credit for its expressive power using FP idioms (if not with complete purity).

rjagy

The course is using Python to implement a Scheme, then uses Scheme to implement a Scheme. Python could and should be removed from the course.

Python has very poor support for functional programming. Lists are not cons based, lambdas are crippled, pattern matching is horrible and not even expression based, namespaces are weird.

Python is not even a current language, it is stuck in the 1990s and happens to have a decent C-API that unfortunately fueled its growth at the expense of better languages.

nyrikki

While I am a huge lisp fan, oh... the irony of saying that python is stuck in the 1990s when CONS, CAR and CDR are artifacts from the IBM 704 and Fortran :)

While I do find it annoying that python used 'list' to mean 'dynamic array', it is a lot better than a ton of church encoding in the other common teaching language, Java.

Linked lists may not be native in python but it is trivial to implement them.

liontwist

The names are artifacts, but the concepts are not. It’s about lists being recursively definable.

hackboyfly

Interesting, I think this is the first time I have seen anyone bash Python this hard.

Why would a decent C-API fuel its growth? Also can you give me some examples of better languages?

Am no senior developer but I find python very elegant and easy to get started with.

linguae

I’m not the parent poster, but I’ve seen two major spurts of Python’s popularity: (1) the mid-2000s when Python became a popular scripting language, displacing Perl, and (2) beginning in the first half of the 2010s when an entire ecosystem of Python APIs backed by code written in C, C++, and even Fortran made up the infrastructure for machine learning code (e.g., NumPy, SciPy, scikit-learn, Pandas, etc.). If Python didn’t have a good way of interfacing with code written in languages like C, then it might not have been as popular among machine learning researchers and practitioners, who needed the performance of C/C++/Fortran for numerical computing but wanted to work with higher levels of abstraction than what is provided in those languages.

What drew me to Python back in 2006 as a CS student who knew C and Java was its feeling like executable pseudocode compared to languages that required more “boilerplate.” Python’s more expressive syntax, combined with its extensive “batteries included” standard library, meant I could get more done in less time. Thus, for a time in my career Python was my go-to language for short- and medium-sized programs. To this day I often write pseudocode in a Python-like syntax.

Since then I have discovered functional programming languages. I’m more likely to grab something like Common Lisp or Haskell these days; I find Lisps to be more expressive and more flexible than Python, and I also find static typing to be very helpful in larger programs. But I think Python is still a good choice for small- and medium-sized programs.

fiddlerwoaroof

I try to avoid python in production code bases as much as possible: dependency management issues alone are a good reason to do so for anything that will last longer than a year or two.

bitwize

It was relatively easy to lash Python as a higher-level orchestration layer to popular number crunching libraries, yielding NumPy and similar, which made Python popular for machine learning applications.

If you're used to Scheme, Common Lisp, or Haskell, Python's arbitrary decisions about e.g. lambda or how scopes work may be grating. But Python is the BASIC of the modern day, and people laughed at BASIC in the 80s too... except businesses ran on BASIC code and fortunes had been made from it.

tikhonj

Not very Schemey, but at least modern Python has basically full-on algebraic data types thanks to type hints, immutable dataclasses and structural pattern matching.

It's still not great for functional programming, but far, far better than it used to be.

nextos

IMHO the main problem is the fact that lambda expressions have been deliberately crippled. Ruby is often described as a good-enough Lisp despite not being homoiconic. That's because, like all modern Lisps, it makes pervasive use of blocks, procs and lambdas. Python could have been a very similar language, but Guido held a vocal anti-FP stance. Perhaps this can be addressed now, as other interesting features like the ones you outlined have been added to the language, but it'd have a very deep impact on nearly every API.

melodyogonna

Calling Python old in the same comment talking about a lisp dialect is funny.

Anyway, Python is intentionally not functional because Guido dislikes functional programming

zelphirkalt

FP in Python is rather weak; even JS does a better job there. Some of the code exercises will need completely different solutions than in Scheme, due to the lack of TCO. What do instructors do when their 1-to-1 translated code fails? Tell the students that, due to the choice of language, it does not work that way, and that they simply need to take it on faith? Or do they treat it all as externalize-the-stack problems and solve it that way?

It seems rather silly to force SICP into Python.

FreakLegion

CPython doesn't do TCO, but you can implement it yourself in Python.

This version is one of my all-time favorite StackOverflow answers: https://stackoverflow.com/questions/13591970/does-python-opt...

zelphirkalt

Looks like trampolines inside Y combinator. Interesting approach. Still of course with its own costs. I think some of the Scheme in JS implementation(s?) use trampolines as well, or at least considered doing that.
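A minimal version of the trampoline idea in Python, for readers who haven't seen it: tail calls return thunks instead of recursing, and a driver loop unwinds them, so the interpreter's call stack stays flat (names are illustrative; the StackOverflow answer linked above is a more elaborate decorator-based variant).

```python
# Trampoline: keep calling the result while it is a thunk.
def trampoline(f, *args):
    result = f(*args)
    while callable(result):
        result = result()
    return result

def fact(n, acc=1):
    if n <= 1:
        return acc
    # Return a thunk for the "tail call" instead of recursing directly.
    return lambda: fact(n - 1, acc * n)

trampoline(fact, 50000)  # plain recursion would hit the recursion limit
```

The cost is an extra closure allocation per "call", which is the kind of overhead the comment above alludes to.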

agent281

Pyret would be a good alternative. It's designed for education by racketeers.

https://pyret.org/

CodeArtisan

There is also the course from ArsDigita University. The site is offline now but the courses are available on archive.org.

https://en.m.wikipedia.org/wiki/ArsDigita#ArsDigita_Foundati...

https://archive.org/details/arsdigita_01_sicp/

They were selling USB keys with the entire curriculum, if someone could upload an iso, that would be amazing. https://web.archive.org/web/20190222145553/aduni.org/drives/

jyounker

The issue with Python and most other non-Lisp-family programming languages is that they don't allow you to easily treat programs as data. Things that are simple to express in Scheme become complicated exercises in most other languages. Instead of focusing on the underlying concept, students end up having to focus on the implementation language's details in a way that they don't with Scheme.

liontwist

The cons/car/cdr implementation as lambda was magical the first time I saw it. But it just shows that the language runtime must implement key/value dictionaries and you are able to borrow that implementation to make other data structures.
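The encoding in question, transliterated into Python closures:

```python
# cons returns a closure that remembers its two arguments and hands
# them to whatever selector it is later applied to; car and cdr are
# just the two selectors.
def cons(a, b):
    return lambda select: select(a, b)

def car(pair):
    return pair(lambda a, b: a)

def cdr(pair):
    return pair(lambda a, b: b)

p = cons(1, cons(2, None))
assert car(p) == 1
assert car(cdr(p)) == 2
```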

hinkley

I find the destructuring logic in elixir much more interesting, and the watered down version in ES6 much more practical.

In elixir you can pop off as many as you like.

liontwist

Can you share any resources about it?

hinkley

I’m a bit of an elixir noob, but Enum functions like slice let you cut up lists in various ways, and you can pattern match values in maps in function definitions:

http://www.skuunk.com/2020/01/elixir-destructuring-function....

Which can let you unroll function preambles, or apply different rules if for instance an admin user runs a function versus a regular user.

undefined

[deleted]

EuAndreh

Not key value dictionaries, just pointers are needed.

A closure with no behaviour is just a pointer to the enclosed variable. A closure with 2 pointers is a pair, which you can get the car and cdr.

The runtime needs to make the pointee available outside its definition, so escape analysis, garbage collection, etc. But no dictionary is needed.

gugagore

I recently came across the notion that you need inductive data types (and can't just use Church encodings) if you want to do theorem proving, like proving that `0 != 1`.

I threw up some content up here: https://intellec7.notion.site/Drinking-SICP-hatorade-and-why... , along with an unrelated criticism of SICP.

I'd like to better understand what the limitations are of "everything is just a function".

schoen

I think you could prove 0 ≠ 1 if you had some other concrete fact about inequality to make use of. You could reason from the theorem "f = g -> f x = g x" to create your inequality fact on the right side and then take the contrapositive.

It seems correct to me that you can't directly prove inequality between Church numerals without starting with some other fact about inequality. Whereas with inductive data types, a proof system can directly "observe" the equality or inequality of two concrete instances of the same inductive type, by recursively removing the outermost constructor application from each instance.

schoen

Looking at the linked references, I'm not sure what else is being assumed to be available or not available when considering proofs using the Church numerals. I guess that will matter a lot, and I don't know enough to make more general statements about what is or isn't sufficient here.

solomonb

Well for theorem proving you need Sigma and Pi types plus some notion of equality. Can you achieve those with Scott or Church encoding?

WillAdams

The book itself is currently being discussed at:

https://news.ycombinator.com/item?id=42157558

Is there a reason why the link goes to the discussion at the bottom of that page rather than the beginning?

Could this be folded into the other discussion? (I don't see that the link has been posted there yet)

lifeisstillgood

David Beazley is a bit of a legend in the Python world, and honestly this course seems a surprising idea, but it took about two seconds of thought before it seemed a perfect match, and I have signed up for the next one.

The relevant part is that this is basically what “software engineers’ continual education” is going to look like.

wrycoder

If he didn’t cover Chapter five on compiling, he didn’t cover SICP’s best part.

upghost

He has a whole course on compiling

https://www.dabeaz.com/compiler.html

AtlasBarfed

"legend in the python world"

That's a fun statement.

js2

There's a typo in the code in "the substitution model" section:

  ("+", ("fib", ("-", "n", 2)), ("fib", ("-", "n", 1))),
The two calls to `fib` are surely meant to be `fibonacci` since the latter is defined, but not the former. Indeed, the code is correct in the github repo:

https://github.com/savarin/pyscheme/blob/0f47292c8e5112425b5...

kurinikku

OP here. Thank you!

User23

I think SICP is wonderful.

As I've learned more and studied more math though, I've come to the conclusion that the relation really is the more fundamental primitive. Every function can be represented as a kind of restricted relation, but the converse is not true, at least not without adding considerable extra gadgetry.

While of course relational databases and SQL are the best known examples of relational programming, and are highly successful, I still believe it's a largely untapped space.

However, my interest currently is less in the space of programming language design and more in teaching very young children the basics of math. And for whatever reason it's considerably easier to teach a predicate like "is big" as a 1-ary relation and "is bigger than" as a 2-ary relation than trying to capture the same concepts as functions.

asah

Not a fan of everything-is-a-function because it's oversimplistic and often unhelpful. Some of the issues:

- functions that don't fit in cache, RAM, disk, etc.

- functions that have explosive big-O, including N way JOINs, search/matching, etc.

- functions with side effects, including non-idempotent. Nobody thinks about side channel attacks on functions.

- non-deterministic functions, including ones that depend on date, time, duration, etc.

- functions don't fail midway, let alone gracefully.

- functions don't consume resources that affect other (cough) functions that happen to be sharing a pool of resources

- function arguments can be arbitrarily large or complex - IRL, there are limits and then you need pointers and then you need remote references to the web, disk, etc.

(tell me when to stop - I can keep going!)

nephanth

Oversimplifying can be great at times. In this case, the lambda-calculus model (which is the basis for this kind of "everything is just a function" approach) is a great model of computation because it is so simple while still being easy to handle/reason about (compared to e.g. Turing machines), which is why it is at the base of most computer logic/proof systems.

lmm

Many of those things can be modelled as functions, they just need to actually be written that way (e.g. if you have a function that requires some resource, maybe it should require that resource! If it depends on the date/time, maybe it should depend on the date/time! If it returns a nondeterministic value, maybe it should return a nondeterministic value!). I think the functional programming approach shines partly because it forces you to take these things seriously: if you want to use e.g. implicitly shared resources, you need to model that, and "functions" that rely on implicitly shared resources are going to be explicitly distinct from actual function functions.

zelphirkalt

Half of what you call functions in that comment, are not actually functions and in the FP world many would not call them functions. Rather they are procedures. Functions are procedures, but not all procedures are functions.

SilasX

Which is also a problem with thinking this is a helpful abstraction: apparently, not everything you need to do can be captured by functions (in that sense)!

magicalhippo

I'm just used to the words from Pascal. What's the definition of a procedure in the FP world?

zelphirkalt

Afaik, the definition is just "a sequence of steps/instructions". That is of course much broader than a (simple) function, which among other things for one input gives you one output, and for the same input always the same output. A procedure can do that too, but it can also give you different outputs for the same input, due to side effects.
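A tiny illustration of that distinction (hypothetical names):

```python
# A function in the FP sense: same input, same output, no side effects.
def double(x):
    return 2 * x

# A procedure: a sequence of steps whose result depends on hidden state,
# so the same input can yield different outputs.
_counter = 0
def next_id():
    global _counter
    _counter += 1
    return _counter

assert double(3) == double(3)   # referentially transparent
assert next_id() != next_id()   # each call observes mutated state
```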

anon-3988

I think it's the only hope for theoretical purposes.

bob1029

vmilner

jazzyjackson

Cool, available on archive.org too

Did you read the book in isolation or was it a part of a class / MOOC ?

https://archive.org/details/all-the-mathematics-you-missed

vmilner

I just found it by chance somehow when trying to review gaps in my maths education - it’s excellent for reviewing complex analysis (say) if you did it years ago - or less obviously - doing a quick pass over a topic before you learn it for the first time. Then I found the excellent math sorcerer channel and his view is much the same as mine.

rapnie

Fellow is doing an aggressive function. I wouldn't dare put contrary functions against his output.

qrush

this guy has Chris Fleming energy.

ysofunny

alternative take: everything is just sets

both can be a foundation for mathematics, and hence, a foundation for everything

what's interesting is how each choice affects what logic even means?

MathMonkeyMan

I learned functions in terms of sets. Domain and codomain are sets. Function is a set of ordered pairs between them.

How could we go the other way? A set can be "defined" by the predicate that tests membership, but then how do we model the predicates? Some formalism like the lambda calculus?
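The first direction is easy to make concrete: a finite function literally is its set of ordered pairs, and application is just lookup. A sketch (`square` and `apply` are illustrative names):

```python
# A function as a set of ordered (input, output) pairs.
square = {(x, x * x) for x in range(5)}

def apply(fn_pairs, x):
    # Application = finding the pair whose first component is x.
    for (a, b) in fn_pairs:
        if a == x:
            return b
    raise ValueError("x not in domain")

assert apply(square, 3) == 9
```

The reverse direction, modeling sets via functions, is exactly the membership-predicate idea the comment raises, and it does push you toward a formalism like the lambda calculus.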

reuben364

Not sure of details to make it a mathematical foundation but:

A category can be defined in terms of its morphisms without mentioning objects and a topos has predicates as morphisms into the subobject classifier.

undefined

[deleted]

ysofunny

i think predicates are functions returning booleans

lambda calculus would provide a computational way to determine the truth value of the predicate, any computable predicate that is.
