"First a punch is just a punch, Then a punch is not only a punch, And finally a punch is just a punch" (heard from Bruce Lee)
Basically it means that in the beginning we punch however we can. Then a martial arts student learns the proper and different ways to punch, so a punch is no longer just a punch; it's a lot of instructions. Then, through sheer practice and repetition, the right way to punch becomes second nature; the right way is the only way, so it's just a punch again.
So coding is just ifs and loops... Then it's OOP and Functional and Logic... But once you transcend all those paradigms, you understand that it's all ifs and loops.
Before one studies Zen, mountains are mountains and waters are waters; after a first glimpse into the truth of Zen, mountains are no longer mountains and waters are no longer waters; after enlightenment, mountains are once again mountains and waters once again waters.
Translation from http://www.livinglifefully.com/zensayings2.htm
I started off writing a lot of 68000 machine code. I was always amazed what you could accomplish with a lot of loops and branches. I never lost that sense that at whatever higher level I was at later on, it was all loops and branches.
Have you seen this? https://news.ycombinator.com/item?id=25788317
Seems like we lose a lot of good technology and progress for random reasons, like the “ram” of society is finite.
Thank you GhettoComputers for the link! And thank you Zhyl for the summary.
Sounds like the 2 hard things in Computer Science returning! Cache invalidation, naming things, and off-by-one errors. 
* "Knowledge is ephemeral": Cache invalidation
* "tower of abstraction": naming things, indirection 
* "Reduce complexity, reduce dependencies": minimise entropy, maximise connectedness
* Learn: welcome newcomers; recurse to the next generation
"All problems in computer science can be solved by another level of indirection..." "...except for the problem of too many layers of indirection."
Yes, it's a good talk. Think of all the extremely talented internal combustion engine engineers that will be obsolete in 20 - 30 years.
Reminds me of biological metabolic systems. All loops and branches…
Stephen Wolfram thinks the entire universe is built from simple rules
With a lot of fuzziness, some state and temporal stuff.
Yes, one of my early jobs was writing Z80 code and I have the same sense.
When I eventually learned C, it took me a while to stop hand-optimizing and let the compiler do it.
Ultimately all the interesting stuff happens in loops and branches, the rest is just organization.
The organization is only needed for the inevitable bloat you add later to justify the rewrite.
It's something like the sage uncovers the boiler, points at it and announces: Look! It is a beautiful marvel of technology! Why would you want to cover it up? Look! You might learn something!
It’s not about how many loops you have; it’s about what you put in them and how deep.
Coding is not "ifs and loops, then $foo". That's a false premise. If you want to be fundamentalist about it (which I don't advise), a CPU is just a state machine.
assembly instruction + machine state => new machine state
Our idea of "jumping around code" and "control flow", as in for loops or if statements, are themselves abstractions of control state (an instruction pointer) and data state (perhaps a CPU register).
So coding is really "the execution of a notional interpreter (aka semantics), then $foo." That gets to the absolute bottom of what "code" even is: instructions for an interpreting device (be it physical or mathematical).
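That "instruction + state => new state" view can be made concrete with a toy interpreter. A sketch (the opcodes and encoding here are invented for illustration, not any real ISA): note that even the "jump" is nothing but an update to the instruction-pointer part of the state.

```java
import java.util.List;

public class ToyMachine {
    // Machine state: an accumulator and an instruction pointer.
    static int acc = 0;
    static int ip = 0;

    // Each step maps (instruction, state) to a new state. JMPZ shows how
    // "control flow" is really just data (the ip) being updated.
    public static int run(List<String[]> program) {
        acc = 0;
        ip = 0;
        while (ip < program.size()) {
            String[] insn = program.get(ip);
            switch (insn[0]) {
                case "ADD":  acc += Integer.parseInt(insn[1]); ip++; break;
                case "JMPZ": ip = (acc == 0) ? Integer.parseInt(insn[1]) : ip + 1; break;
                case "HALT": return acc;
                default: throw new IllegalArgumentException("unknown op " + insn[0]);
            }
        }
        return acc;
    }

    public static void main(String[] args) {
        // ADD 5; ADD -5; JMPZ 4 (acc is 0, so this skips ADD 100); HALT
        List<String[]> prog = List.of(
            new String[]{"ADD", "5"},
            new String[]{"ADD", "-5"},
            new String[]{"JMPZ", "4"},
            new String[]{"ADD", "100"},
            new String[]{"HALT"}
        );
        System.out.println(run(prog)); // 0
    }
}
```

An if statement or a for loop compiles down to exactly this kind of conditional ip update.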
Oh, but CPU instructions can be built up from a series of μops which are then executed out of order or in an overlapping fashion, making both "instruction" and "state" boundaries in your equation more fuzzy than it looks. So at the absolute-absolute bottom of "code", it is instructions for an abstract model of an interpretative device.
(I'm not sure if your "but" is a retort, or an addendum. Assuming retort...) We can consider one more level of implementation machinery, but I fail to understand how uops aren't another form of instruction, and CPU internal gobbledygook like register files, caches, etc. aren't another form of state. It doesn't seem so fuzzy.
> Oh, but CPU instructions ...
And who says your code runs on a CPU? :^)
That's more or less Shu-Ha-Ri: https://martinfowler.com/bliki/ShuHaRi.html. A very useful concept; I often refer to it, for example to argue why "just be agile" probably won't work when there are juniors or when the team is new-ish. That's skipping straight to ri; start with shu (Scrum or whatever set process).
That’s what disappoints me in modern Java. There are practically no ifs. It makes the code inaccessible to beginners on the project, and the streams are about twice as slow… just because devs think they are more clever if they pile up seven ".map()" calls to transform a list.
list.stream().filter(Objects::nonNull).map(User::getUsername).filter(Objects::nonNull).filter(name -> name.contains(searchString)).map(name -> "We have found your user: " + name).orElse("We haven't found");
for (User user : list) if (user != null && user.getName() != null && user.getName().contains(searchString)) return "Found it!";
return "Not found";
That’s a failure of Java’s language design, not a failure of the functional/declarative paradigm.
Your for loop can do all kinds of damage to the list, and you have to read it all to find out what it does. Saner languages make the functional version more expressive.
I’m not saying JS is a sane language; it has an anemic standard library without a robust "first()" or "contains()". But that code gets you the first user whose name matches, without resorting to indexes or null checks or worrying about how a for loop can mutate lists that it should not.
return first(users.filter(u => contains(u.name, searchString)))
EDIT: I should have used filter(list, fn) here; I was trying to write something plausible in JS instead of a purer functional language and missed that translation.
>"Your for loop can do all kinds of damage ..."
So can programming in general
>and you have to read it all to find out what it does.
Which isn't a problem, because as long as it's all loops'n'branches, the code is easy to understand.
JS has Array.some() for "contains"
the two snippets you posted do different things
In a list [John, Jonathan, Joy], searching for "Jo" returns all three of them in the first example; the second stops when the first match has been found.
Start including the tedious bits about adding found items to the list and the waste of intermediate variables and your "clear" code is wrapped around a lot of repetitions, that only add noise.
It just happens that you are more familiar with the second style, but pipelines are better in many other ways, clarity of intentions being one of them.
The code he posted isn't actually valid because you can't "orElse" a list. That being said, I would presume it was meant to include a "findFirst". Something like
list.stream().filter(Objects::nonNull).map(User::getUsername).filter(Objects::nonNull).filter(name -> name.contains(searchString)).findFirst().map(name -> "We have found your user: " + name).orElse("We haven't found");
The snippets actually do the same thing, as long as you add a .findFirst() to the first example to make it valid Java code.
Intermediate stream operations like map or filter are always lazy. And .findFirst() is a short-circuiting terminal operation, that does not need to consume all stream elements.
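That laziness is observable. A small sketch (the class and data are mine, for illustration) that counts how many elements a short-circuiting pipeline actually pulls:

```java
import java.util.List;
import java.util.Optional;
import java.util.concurrent.atomic.AtomicInteger;

public class LazyStreams {
    // Returns how many elements the pipeline pulled through before
    // findFirst() short-circuited.
    static int elementsTouched(List<String> names, String prefix) {
        AtomicInteger touched = new AtomicInteger();
        Optional<String> first = names.stream()
            .peek(n -> touched.incrementAndGet()) // counts elements as they flow by
            .filter(n -> n.startsWith(prefix))
            .findFirst();                         // terminal op stops at first match
        System.out.println(first.orElse("not found"));
        return touched.get();
    }

    public static void main(String[] args) {
        // Only Alice, Bob, Joy are examined; John and Zoe are never touched.
        System.out.println(elementsTouched(
            List.of("Alice", "Bob", "Joy", "John", "Zoe"), "Jo")); // 3
    }
}
```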
You forgot to add null-check ifs for `list` and `searchString`. Oh and check-wrap for exceptions too!
Modern Java looks horrible, to be honest. I create abominations like this in JS pretty often, but it isn't good code. Best if wrapped in a huge try/catch where the error case just prints "didn't work for some reason...".
https://esolangs.org/wiki/Pancake_Stack Pancake stack doesn't need this hipster crap like functional programming or ifs or fors.
I don't write java, but assuming it has a null coalescing operator you can likely do all that in a single map in a way that is (in my opinion) cleaner and easier to read than both of your examples.
Kotlin does but sadly Java’s Optional<T> has no syntactic sugar, so it ends up looking like
Optional.ofNullable(user).map(User::getUsername).map(name -> name.contains(searchString))
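For what it's worth, using `filter` instead of mapping to a boolean keeps the matched name inside the Optional, so the chain can produce the final message directly. A sketch with an illustrative `User` record (names are mine, not from the thread):

```java
import java.util.Optional;

public class OptionalChain {
    record User(String username) {
        String getUsername() { return username; }
    }

    // filter keeps Optional<String> rather than collapsing to Optional<Boolean>,
    // so the matched name survives for the final message.
    static String find(User user, String searchString) {
        return Optional.ofNullable(user)
            .map(User::getUsername)
            .filter(name -> name.contains(searchString))
            .map(name -> "We have found your user: " + name)
            .orElse("We haven't found");
    }

    public static void main(String[] args) {
        System.out.println(find(new User("Joy"), "Jo")); // We have found your user: Joy
        System.out.println(find(null, "Jo"));            // We haven't found
    }
}
```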
Java has never really liked to do in 10 characters anything that could be done in 30 characters instead, especially if it obfuscated things a bit more at the same time.
I just started using Java's functional side a few weeks ago. Lambdas are really nice and often make the code clearer. I like writing more declaratively.
But the stream interface is just horrible. I had to read a lot before I understood how to just filter a list and collect the results. And even after that, it still somehow didn't scream out what it was doing to me. I guess you get used to it.
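The filter-and-collect idiom in question, for anyone else who had to look it up, is roughly:

```java
import java.util.List;
import java.util.stream.Collectors;

public class FilterCollect {
    // The two-step idiom: an intermediate op (filter), then a terminal
    // collector to turn the stream back into a List.
    static List<Integer> evens(List<Integer> nums) {
        return nums.stream()
            .filter(n -> n % 2 == 0)
            .collect(Collectors.toList()); // or simply .toList() on Java 16+
    }

    public static void main(String[] args) {
        System.out.println(evens(List.of(1, 2, 3, 4, 5, 6))); // [2, 4, 6]
    }
}
```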
Coding is values, arrays, dicts, ifs, loops and functions. An expressive set-like operations library over arrays (& dicts) gives superpowers.
There's a concept in learning theory that perceived complexity is diamond-shaped: it peaks at intermediate skill.
I wonder whether you could apply this analogy to physics and science in general. If so it seems like we haven't reached the peak/middle yet.
Trying to explain Zen by starting with "basically" is a bit ambitious on your part haha :)
These are all from Zen philosophy, which says:
When getting started with Zen, I see the hill as a hill, the water as water. When experienced with Zen, I see the hill as more than a hill, the water as more than water. When finally mature with Zen, I once again see the hill as a hill, and the water as water.
That first reply is so funny to me because it hits too close to home
The more I do this, the more I gravitate towards the simple things.
New developers write simple but shortsighted code. It gets the job done, but it will be painful to work with in the future.
Intermediate developers, bitten by their past mistakes, decide to future-proof their work. But they don’t just look one or two steps ahead; rather, they try to look five steps ahead and identify problems that do not and may never exist. Consequently, they over-engineer, over-abstract, and over-complicate everything. Carmack’s recent “But here we are” speech resonated with me.
Advanced developers identify the right balance for each task between simple and complex, concrete and abstract, and pragmatic and idealistic. In my experience, they favor simple and pragmatic solutions, use abstraction effectively, and can satisfy near-term goals quickly without painting themselves into a corner. “As simple as possible, but not simpler.”
I try to avoid working for tech leads stuck in the second phase, which is not uncommon. If you suggest taking the “ifs and for loops” approach of solving a simple problem with a simple solution, they’ll assume you’re on the wrong side of the bell curve.
Had a boss once who insisted that all if statements should be pushed out to factory classes, and all control logic should be done by constructing a different instance of an interface. It was a pretty rigid system, but at least it did force me to write small focused classes that were easy to test.
Debated for a long time whether that methodology was stuck in the second phase or if it was actually the third. Still don't have an answer, but these days I think having a plan is better than just letting engineers run roughshod, as long as the conventions are easy to follow.
That just sounds like enterprisey Java to me, which I firmly believe is closer to 2 than 3.
Programming alone vs programming in a team are very, very different occupations. A lot of what applies to one doesn’t apply to the other. I’m still painfully learning this, after all these years.
The key insight to "at least destroying your architecture makes it easy to unit test" is that being able to unit test is not actually that important. There's other kinds of testing out there!
Q: What's the difference between a novice and an expert?
A: The novice thinks twice before doing something stupid.
I don't understand this saying.
There is another important difference: the expert thinks twice before doing anything at all.
And that's also why a lot of Architecture Astronauts that looooved Java didn't see Python coming.
Funny, I took over a modern python service and I was pretty shocked at what I inherited. Long gone are the days of "There's one way to do things".
Instead, this thing would give the most "enterprisey" Spring JEE application a run for its money with its endless annotations, dependency injection magic, all sorts of pseudo-types - both the "built-in" Python 3 ones like Set and List, but also the libraries like Pydantic. But unlike Java, these types aren't even really guaranteed by the language at compile time, so even if your IDE plugin can successfully detect them, things will still (silently) slip through at runtime.
The async functionality that's been bolted on to the language is worse than even the old and confusing Java multi-threading primitives, and the funny thing is it still doesn't actually run things in multiple threads. For that, your simple Rest API is running on layers of C programs like Uvicorn which itself is then wrapped by another few processes running Gunicorn which in turn is probably running behind NGINX. LOL, and we thought the Servlet stuff with Tomcat and JBoss was clunky - this is insane.
To be honest, if there ever was a sweet spot for Python, it would have been for smaller code bases that weren't complex enough for big "enterprisey" langs like .Net or Java, but were more permanent and complex than shell scripts or (back in the day) Perl could handle.
But nowadays, I don't think modern Python fits any use case real well. It's still dynamically typed, slow, single-threaded, and has a poorly managed ecosystem and hodge-podge of tools like virtualenv, pyenv, poetry, etc. that never quite become standardized and stable.
So unless you've got a bunch of Python experts who aren't interested in moving to a better lang, I'd find it hard to go with Python for new projects.
Recently posted this, which wasn’t well received.
Where does this fall?
That site is a piece of shit, because it uses google analytics, thus snitches to a big brother.
I'm currently working with a team of Advanced developers for the first time in my career. Nobody is padding their CV with fancy shit. Everyone has gotten the "new shiny" out of their system.
Everything is as simple as it can be, no simpler and no more complex. Sometimes a bunch of flat JSON files in an S3 bucket is enough of a database, you don't need a 42 machine Aurora cluster.
All fancy "Machine learning" stuff really is just a bunch of ifs and for loops internally :D
> Carmack’s recent “But here we are” speech resonated with me.
Watching this right now and all I can think about is Microsoft Bob.
Well said, this hits home. Also constant refactoring and maintenance of the codebase without adding any new features.
Despite being a joke, I know it's the "Ha Ha Only Serious"  sort. I can't help but think this is severely biased by the trends of "enterprise software," where you eventually "give up", and clock your 9–5 writing if+for making a fine living, but erroneously pass that off as a mature, Jedi-like way of thinking about programming, like the meme suggests. (And, consequently, you spend no less time debugging hairy, nested, imperative conditionals with for-loops that are off-by-1.)
I have no beef with if+for, but a large part of the reason they're "goto tools", if you will, is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years.
Simpler building blocks do not necessarily mean a simpler solution. If only!
> you eventually "give up", and clock your 9–5 writing if+for making a fine living, but erroneously pass that off as a mature
This comment sure indicates to me where you most likely are on the curve.
In all seriousness, I think this is considerably off the mark. After enough experience you realize that expressivity and convenience are antipatterns and don't actually simplify things but are harbingers of complexity, bugs, tech debt, even the downfall of organizations and products.
Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar.
I try to keep a healthy and balanced diet, myself.
> industry is slow to assimilate most state-of-the-art ideas, sometimes by as much as 40 years.
Most of those ideas are terrible. The industry is so incredibly young and has experienced so much change over those 40 years that I have a hard time accepting the notion that the industry is slow to adopt. The reason the old building blocks are still popular is because they are a thin abstraction over how computers work, and ultimately that is at the root of everything we do.
Really, there are no if+for, just compare and jump. Why don't we use what the metal uses, instead of these "expressive abstractions"?
If+for have no deeper foundational significance in the construction of programs or computations, literally, than say a lambda function. But because the latter is unfamiliar, it's spoken about in the same manner you present: as if it is some highly abstract, complicating, high-level feature (when truly that take is just baloney).
Personally I disagree. We should be using state machines and pure functions. If+for loops are just what's easiest to express in the major languages of today; they are no more or less computationally expensive, but due to the lack of tooling for the alternatives they are often cheaper to write.
In languages and libraries that allow FSM- and pure-functional-kernel-based designs you can get just as clear logic that is expressible not just to the programmer but also to business personnel. It's counter-intuitive to a certain extent because so much of programming is built around imperative programming, but FSM-based logic is and will continue to be easier to understand long term because you can trivially visualise it graphically.
This ultimately is what a lot of the functional paradigm is built around: use the mathematical and graphical representations we've used to understand systems for decades. They are well understood, and most people can grasp them with little to no additional education past what they learned in their business or engineering bachelor's degrees.
> Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar.
You could just as well say that ifs and for loops are just sugar for gotos and all programming is just gotos.
The reason ifs and for loops are used instead of gotos is that they are very useful abstractions that are easy to reason about and save the programmer lots of mental effort. But they are not the only such abstractions.
To the extent that other abstractions can create problems, it's not because they're really just sugar for ifs and for loops, it's because they are not well crafted abstractions so they are not easy to reason about and don't really save the programmer any mental effort. But there are plenty of abstractions other than ifs and for loops that are well crafted and do save the programmer mental effort, in many cases lots of it.
> After enough experience you realize that expressivity and convenience are antipatterns and don't actually simplify things but are harbingers of complexity, bugs, tech debt, even the downfall of organizations and products.
Suggesting that experience leads to jettisoning expressivity is at odds with my direct observations of experienced software engineers working in large teams. The more experience, the _better_ the engineer gets at picking the right level of abstraction to write code that can be maintained by others. Picking a single point on the abstraction spectrum (just above goto but not below it!) is far too rigid for the diversity of tasks that software engineers need to solve.
> Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar
You sound like the kind of person who thinks the ancient Greeks figured out all of math and everything that has happened since then is just fancy abstractions and sugar. Either software engineering can advance as a discipline, or it can't. You seem to be assuming it can't.
> This comment sure indicates to me where you most likely are on the curve
Never mind, you just sound like an asshole.
It’s just so sad that the lowest common denominator has become the standard now. When I first learnt Clojure it entirely changed the way I think and solve problems. The code really was elegant.
Obviously, it can only be read by someone who can also understand programming beyond ifs and fors. That’s a non-starter in most environments - enterprise or otherwise.
Funny enough, I see most innovations coming from consultants who do the same work for multiple companies and realise the repeating patterns and extract an abstraction.
Ifs and fors are the easiest concepts to explain to non-developers, so it makes sense to start there.
I wouldn't say that they are the standard now, but using and mastering all features in a language is hard.
Add to that design patterns, classes and code layout, and it becomes a full-time job to keep up.
I have been in contact with code most of my professional life, but I'm still not comfortable writing large amounts of code, for the simple reason that I don't do it full-time.
Here are the features in C#, just to illustrate how complex a programming language is.
>I have no beef with if+for, but a large part of the reason they're "goto tools", if you will, is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years.
For assimilation to happen, the state-of-the-art solution also has to result in a net gain over the existing solution, and the higher the differential in complexity between the two, the bigger that gain has to be.
Functionally, this looks like selling off your client base and closing the doors rather than rewriting internal tools that mostly still work.
There's no "rubber meets the road" in OP's position because there's no cost in their calculations.
And, these days, "net gain" in an industrial context is typically tied to almost no aspect of the quality of the code, but more to the management of large groups of people, as well as stability and growth of the business.
What this really means that once you get to a certain level of experience and seniority the actual code you write in the small is pretty much irrelevant. What matters is the overall architecture of what you’re building: the data structures and APIs. The challenge becomes about working together as a team, and with other teams within your ecosystem. Sophisticated language constructs don’t actually help you solve those problems, and imo their benefit is marginal where they do help.
I view map/filter as better abstractions than for loops, and do not consider them to be sophisticated language constructs. They correlate with descriptions of a program’s goals much more naturally than for loops do: get the name of each user, remove everyone under 21, only show accounts with positive balances, etc. Reduce is more arguable, but I think it also applies: show the total amount of money in all of this user’s accounts.
“If” however seems pretty fundamental.
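Those English descriptions map almost one-to-one onto stream operations. A sketch with made-up records (the field names here are assumptions for illustration):

```java
import java.util.List;

public class Pipelines {
    // Illustrative records, not any real domain model.
    record Account(double balance) {}
    record User(String name, int age, List<Account> accounts) {}

    // "get the name of each user"
    static List<String> names(List<User> users) {
        return users.stream().map(User::name).toList();
    }

    // "remove everyone under 21"
    static List<User> adults(List<User> users) {
        return users.stream().filter(u -> u.age() >= 21).toList();
    }

    // "show the total amount of money in all of this user's accounts" (a reduce)
    static double total(User user) {
        return user.accounts().stream().mapToDouble(Account::balance).sum();
    }

    public static void main(String[] args) {
        User ada = new User("Ada", 30, List.of(new Account(100.0), new Account(-5.0)));
        User bob = new User("Bob", 19, List.of(new Account(50.0)));
        List<User> users = List.of(ada, bob);

        System.out.println(names(users));         // [Ada, Bob]
        System.out.println(adults(users).size()); // 1
        System.out.println(total(ada));           // 95.0
    }
}
```

Each method's body reads in the same order as the sentence describing it, which is the claimed advantage over an index-based loop.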
With an additional level of abstraction you could say "goto jumps", but "ifs and loops" gives a commonly understandable logic for everyone; deeper abstractions increase reading complexity, while higher abstraction is achieved via functions and overall architecture.
Scaling up those "ifs and loops" is the challenge, whether as a team or solo, with the common goal being to keep the software under control.
Meh, most business logic really is "if" and "foreach". That doesn't mean it's not complicated, as you say. But all that category theory stuff, at the end of the day, really is just an attempt to manage the complexity of lots of conditional branching and iteration.
If and loop aren't mathematical in the same sense as set theory and set operations are.
> they're "goto tools"
I see what you did there.
> debugging hairy, nested, imperative conditionals with for-loops that are off-by-1
Isn't this just a complicated case of ifs and fors?
Sure, but the word "just" is doing a lot of work. It seems to be where a code base of uncomplicated ifs and fors leads to asymptotically, because both of those constructs don't prohibit you in any way from sneaking in "just another line" to either of them.
There are a lot of sophisticated problems dealing with enterprise software even in higher languages and even in situations where things like performance or resource usage is not a primary concern.
For example, how do you handle authorization, logging, and how do you make the code maintainable? That's a really tough problem that requires a lot of thought about the overall system design.
And of course it's always a lie to say that performance and resource usage aren't a concern -- they're not a concern until they are.
- it's okay to use printing instead of a debugger
- you don't need to write classes for everything
- it's okay to write something in a more verbose way to make it clear for other people
- your tools don't need to be perfect to get things done
I need more of these, maybe some that aren't as reductionist as Carmacks's original post.
The really useful 'print' debug lines might be kept at additional 'debug' flag levels. This is particularly useful for tracing the behavior of programs that no longer have debug symbols and are running in real environments.
This post by Aaron Patterson made me realize it's fine to debug with print statements
In rare cases I pull out a real debugger, but most of the time the right prints in the right places are just as good. I can also iterate much faster because I'm not jumping between the code and the debugger, or pulling the debugger out of the loop it's stuck in.
I've come to the conclusion that it's a good skill to have since it's (or logging which is basically the same) the only debugging method that's always guaranteed to be available. For example there's lots of build and CI tools out there that have no Real Debugger.
I'd never seen that meme before, but there's a Bruce Lee quote (maybe apocryphal) that has had a lot of meaning for me ever since I got over the same hump myself.
“Before I learned the art, a punch was just a punch, and a kick, just a kick. After I learned the art, a punch was no longer a punch, a kick, no longer a kick. Now that I understand the art, a punch is just a punch and a kick is just a kick.”
makes me think of the Buddhist "Before enlightenment: chop wood, carry water. After enlightenment: chop wood, carry water".
not my favorite source since it doesn't go into the 'scaling the mountain' bit, but every source that talks about that part seems to be... eh: https://buddhism.stackexchange.com/questions/15921/what-is-t...
I always took that to mean something like this:
Q: What is the difference between an enlightened person and an ordinary person?
A: There is no difference, but only the enlightened person knows this.
“Before enlightenment: if then. After enlightenment: if then.”
Funny, I posted this on another HN thread  recently, but it's perfectly relevant again:
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
T. S. Eliot - Little Gidding
How then am I so different from the first men through this way?
Like them, I left a settled life, I threw it all away
To seek a Northwest Passage at the call of many men
To find there but the road back home again
Stan Rogers, "Northwest Passage"
Sounds good. I still remember my BASIC.
That image[a] is so funny because it is so true:
Anyone who has ever written any software has felt like the unenlightened half-person in the middle of that distribution at least once; for example, when learning how to code with a different approach in a new language.
I have felt like that more than once. Everyone here has, I suspect.
I may be biased because I spent too much time arguing about this,
but you hear those $fancy_principles / FP / hard OOP / "clean code" evangelists, and then you go to any repo of real-world software (Linux, compilers, Kubernetes, Git, and so on) and everywhere you see for loops, gotos, ladders of if statements.
I mean, you cherry-picked with quite a criterion there. It’s all C and Go, which somewhat lack higher-level abstraction capabilities. On the other hand, compilers are often written in C++, or are bootstrapped in the very same language. Also, what about OpenJDK, Elasticsearch, all the web servers running the whole cloud? Linux might be the underlying OS, but that’s not the program actually doing business logic. If anything, it’s just another layer of abstraction.
Also, let’s be honest, C does all these “virtual method” magic on a per-project basis which will not be understood by any tool ever (all those function pointers to whole new implementations passed from God knows who, with barely any typing). At least FP and OOP knowledge somewhat transfers and is queryable by tools.
Ok, another example: C# Compiler, .NET internals/std/yadayada, same stuff.
My claim is:
Almost all real world, OSS, big, battle-proven code bases are nowhere even close to evangelist's "sanity".
They're full of loops, nulls, gotos, ifs and all ""bad stuff""
>all the web servers running the whole cloud
I literally opened the first file I saw in the nginx repo; take a look at this:
a shitton of ifs, for loops, and gotos, all of that in one method a few hundred lines long.
>On the other hand compilers are often written in C++
>or are bootstrapped in the very same language though.
I gravitate towards useful abstraction. I've written the same loops (find an element, find a maximum, do something to every element, filter elements, etc.) 10 000s of times by now. It got old after the first 100.
I also like "all web development is basically fancy string concatenation", and as a web dev I feel seen.
George Hotz said something once that most modern developer jobs are depressing because you're not doing any actual programming. I.e. you're not given a problem to solve with code, you're just taping together frameworks and pieces of code that someone else wrote to order. It's a bit like studying to be a chef for five years and then having to put together one of five types of burgers.
Like everything Hotz says it's spiced up of course, but there's a kernel of truth to it.
I love George, but he's a bit of a reactionary to a fault and this anecdote is a perfect example. A person with deep knowledge and thoughtfulness will make almost the exact same point with much more nuance, aka Jim Keller: https://www.youtube.com/watch?v=Nb2tebYAaOA&t=1363s
The choice quote in contrast to Hotz is "executing recipes is unbelievably efficient -- if it's what you want to do"
There's definitely an assumption by Hotz that programming and solving "real" problems is what everyone should aspire to, and that anything else is just meaningless. Like anything in life, what's meaningful is of course completely subjective, all the way from some people actually finding it fulfilling to others just not being interested in putting in that much effort into their career and preferring to do other things with their time.
Making crud apps feels like I'm doing a data entry job
I've for sure felt this. I'm impedance-matching rather than making things that work.
It's almost like "a series of tubes" that do nothing more than squirt text around.
"U+1F346 is an eggplant" and other oddities from the tubes we build modern society on
String concatenation has its own quirks & pitfalls of course.
Pretty much. Many frameworks exist to make it safe.
It may optimize down to string concatenation, or better yet streaming output, but you really shouldn't be doing that concatenation directly. https://glyph.twistedmatrix.com/2008/06/data-in-garbage-out....
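The difference between raw concatenation and an escaping layer fits in a few lines of Python (stdlib `html.escape` only; the markup here is illustrative, not a real template system):

```python
import html

user_input = '<script>alert("pwned")</script>'

# naive string concatenation: injects the markup verbatim
unsafe = "<p>Hello, " + user_input + "</p>"

# what template frameworks do under the hood: escape before concatenating
safe = "<p>Hello, " + html.escape(user_input) + "</p>"

print(unsafe)
print(safe)
```

The escaped version renders the payload as inert text instead of executing it.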
and file access
Well, quite: if you add that code must run sequentially, it's the Böhm–Jacopini theorem: https://en.wikipedia.org/wiki/Structured_program_theorem
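The folk version of that theorem's construction is that any goto-style flowchart can be rewritten as one loop driven by a program-counter variable. A sketch (the summing program and its labels are made up for the example):

```python
# Folk Böhm–Jacopini construction: goto-style control flow becomes
# a single while loop dispatching on a "program counter" variable.
def sum_to(n):
    pc, i, total = 0, 1, 0
    while pc != 3:                 # 3 = halt
        if pc == 0:                # L0: if i > n goto halt
            pc = 3 if i > n else 1
        elif pc == 1:              # L1: total += i
            total += i
            pc = 2
        elif pc == 2:              # L2: i += 1; goto L0
            i += 1
            pc = 0
    return total

print(sum_to(5))  # sum of 1..5
```

Only sequence, selection, and iteration appear, yet arbitrary jumps are simulated.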
Pure functional programming languages take this a step further and say everything is just composition. (From my understanding)
That's why I'm seeing DB schemas without indexes lately. At the end of the day, people with this kind of thinking make others fix the broken code they leave behind.
As a senior software engineer, I had to spend a lot of time at night fixing code written by junior devs and interns.
The code that the company and its devs (the "just ifs and loops" gang) were proud of was a pain in the ass for me, so I quit the job entirely and do freelancing these days.
I tried to explain what was wrong and why, but no one would listen; they all believed they were 10x developers. Life lesson learned: never ever try to correct an idiot.
Here are some of the practices they followed:
* No indexes on tables, not even unique ones
* No DRY, no KISS
* Switching to a new shiny framework/library every week
* No tests
* Keeping the entire codebase in a single file (4k LOC)
* No versioning
* Exposing DB credentials to the entire internet and not acting when warned
I think there's a general lesson to be learned here, which is that you should be wary of any colleague within any branch of IT that honestly claims the things you're working on are "simple". I'll take an insecure colleague that wants to do a good job over a confident colleague that refuses to listen any day.
Writing code that works is simple. Writing code that doesn't break is not simple.
Isn’t this just a rant that the senior management/devs failed to set up an environment to stop this sort of thing happening? I don’t really understand how it’s related to the linked tweet or the subtext of it.
Either a policy like strict code review, or perhaps a cultural change around quality, responsibility, or mentorship.
I was brought in to work on a two-year-old project built by junior devs and interns.
The company thought it was saving money, and the devs thought they were born coders, so nobody would read a book on software architecture/engineering, and hence I had to deal with the big pile of ....
Yup, when I encounter these massive abstract code bases, with some tyrant "genius" running the team, I just walk away quietly.
4K LoC is small.
And really, aren't loops just ifs and jumps under the hood? So coding is just ifs
Indeed. Jump-on-zero and integer manipulation are sufficient for Turing-completeness. For example: https://en.wikipedia.org/wiki/Counter_machine
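A toy interpreter makes the point concrete; this is a sketch, not a faithful Minsky machine (the `inc`/`dec`/`jz` instruction names and the addition program are assumptions of the example):

```python
# A toy counter machine: increment, decrement, and jump-on-zero
# suffice for Turing-completeness (given unbounded registers).
def run(program, registers):
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "inc":
            registers[args[0]] += 1
            pc += 1
        elif op == "dec":
            registers[args[0]] -= 1
            pc += 1
        elif op == "jz":            # jump to args[1] if register args[0] is zero
            pc = args[1] if registers[args[0]] == 0 else pc + 1
        else:
            raise ValueError(op)
    return registers

# r0 := r0 + r1, moving r1 into r0 one unit at a time
add = [
    ("jz", 1, 4),    # if r1 == 0, jump past the end (halt)
    ("dec", 1),
    ("inc", 0),
    ("jz", 2, 0),    # r2 stays 0, so this is an unconditional goto
]
print(run(add, {0: 3, 1: 4, 2: 0}))  # {0: 7, 1: 0, 2: 0}
```

Everything reduces to an if (the jump test), a loop (the dispatch), and data mutation.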
Can it compare two values and do one thing or another depending on that comparison? (IF)
Can it do something multiple times? (LOOP)
Can it change a piece of data?
Congrats, it's Turing complete.
it’s NANDs all the way down
It's NP-junctions all the way down.
all the way up, shurely?
Until you find your NANDs are made of XORs :)
Sequence, selection, iteration (or recursion if available).
Note that the child overlooked the assumption that it's sequential.
Haha, yeah I was thinking the same.
I started in GW-BASIC when I was a kid, and all I needed was IF and GOTO. You could do anything with that. No loops, no functions, no nothing. IF's and GOTO's!!!
Ach no! It's arrays, too.
iterating and conditionals?
If you want to go even lower than that, coding is basically just saying yes or no in response to a yes or a no.
Sure, that's oversimplifying it, but that's the smallest unit of information being changed during computation.
But yes, once you learn the basics that are shared between most programming languages and don't get distracted by the nuances, it doesn't take that long to pick up a different language. Being proficient is of course another question, but achieving a basic understanding shouldn't take all that long when you just focus on how to if-else, how to set up loops, and how to assign variables.
All chemistry is just sharing electrons.
Well, it's more complicated than that - you see, occasionally the electrons are not shared, but given up entirely.
but that's one-sided sharing.
and electrons are just quarks, and quarks are just a state in a quantum field, and a quantum field is just...
Not quite. It has to somehow store and later retrieve a few (infinite) “yes-nos” to be general enough.
All machines that satisfy a test of whether they can do yes-no, store the result of that yes-no, and then go to another yes-no are verifiably considered yes-no machines, that is, they are yes-no complete.
Yes, at the lowest level it's binary code, 0 and 1.
I like to think of myself, actually, as not a code writer, but an author. I just use zeros and ones instead of words 'cause words will let you down.
I like that. Someone on my team referred to us as (data) plumbers and I thought that was a pretty fitting analogy too.
This is funny, but it's like saying "Math is basically pluses and minuses".
I see coding as playing with hardware without having to use soldering iron.
A marathon is just putting one foot in front of the other, after all. What’s the big deal? I mean a two year old can do that, and they can’t even handle logic yet.
Mathematics is "just" set theory and everything else, including arithmetic, can be built on top of that.
Honestly that's not a great example given that you can't understand ZFC until you already know enough set theory to understand the motivations for ZFC.
Well, it is just counting natural numbers and making up placeholders for whenever a subtraction or division wouldn't work out.
Yeah we went beyond Presburger a long while ago
Maybe math is just equations and sets.
It is just sets! Set theories like Zermelo–Fraenkel can be the foundations for all of mathematics.
I think he knows.
Math is just writing on a blackboard. Equations and sets optional.
UGH. Back in my day the only language was BASIC and we only had IF and GOTO. Dijkstra has made these children SOFT and I'd piss on his grave if I could be arsed to get out of my rocking chair.
What version of Basic were you using that lacked FOR? Even the shitty one crammed into my Sinclair ZX80 had FOR.
Oh, I can't say that it didn't. But I was only 7, and more amused by things like

10 PRINT "POOP"
20 GOTO 10

than, what, reading a non-existent manual? My parents weren't programmers, so I learned by stumbling in the dark.
But, you've got me curious. I recall using BASICA, GWBASIC and QBASIC -- reading over their respective histories, I'd have gotten my start on BASICA on a hand-me-down 286. I'm not finding good docs on the language, but a familiar program uses FOR loops -- so they were supported. But I distinctly recall hand-rolling a for-equivalent loop with GOTO and IF, counting down to zero.
More visually pleasing, and several keystrokes shorter, for rapid input on an unattended store demo machine:
10 print "poop ";:run
FWIW, the very first version of BASIC in 1964 supported FOR (https://en.wikipedia.org/wiki/Dartmouth_BASIC#First_Edition). I never used DOS machines but https://gunkies.org/wiki/Microsoft_BASIC claims that BASICA was based on Microsoft's BASIC, and that definitely had FOR.
Young you must have arrived at the concept of a for loop without knowing there was a keyword in the language designed to support that, which is pretty cool!
In my twenties, I wanted to use all the cool PL techniques: meta-programming, reflection, monads, macros, type-level programming, etc. I'm getting older and now I want my programs to be 90–95% functions, arrays, structs, enums, and loops and only parsimoniously throw in more advanced language features.
Yes. The 'paradigms' have seriously diminishing marginal returns.
Super basic imperative typed language with maybe some kind of nice way of handling nulls and that doesn't use pointers ... is most of what we need.
Everything is very expensive optimization.
There’s a joke in the fp community I can’t find right now that describes the evolution of programs from imperative side-effectful statements to a for comprehension, with exception catching, that looks nearly identical.
Probably not the right one, but "The Evolution of a Haskell Programmer" sounds like a similar idea which goes from a Freshman Haskell programmer's simple factorial to the abstract heights of a Post-doc Haskell programmer, then back down to a Tenured professor's simple factorial.
Then what the fuck is a monad?!?
Seriously I still don’t know what a monad is and apparently it’s just a bunch ifs and for loops, so I guess I’m pretty stupid.
It's like if you wrapped every statement ending in ; in a C program with the same macro.
I’m a Python programmer. We don’t have macros … or semicolons.
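In Python terms the "just a bunch of ifs" reading can be made literal: a Maybe/Option-style bind is a single if that short-circuits on None. A sketch (all names here are made up for illustration):

```python
# One common monad, Maybe/Option, really is "just an if": bind
# short-circuits on None, so the None-check is written exactly once
# instead of after every step.
def bind(value, fn):
    return None if value is None else fn(value)

def parse_int(s):
    return int(s) if s.isdigit() else None

def reciprocal(n):
    return None if n == 0 else 1 / n

# A chain of fallible steps; any failure falls through as None.
print(bind(bind("4", parse_int), reciprocal))     # 0.25
print(bind(bind("zero", parse_int), reciprocal))  # None
```

The monad machinery just packages that repeated if so callers can chain steps without restating it.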
A monad is just a monoid in the category of endofunctors... duh!
It's not "the" monoid in the category of endofunctors though, just "a" monoid. That also describes Applicative (less powerful) and Arrow (more powerful).
A monad is like when your cat poops on the carpet, so you wrap your cat in toilet paper. Then your cat has kittens, and the kittens are born with toilet paper wrapped around them!
No here’s a better explanation:
A monad is like when you have a clock, but it’s broken and goes too fast. So you rewind it, and the next morning you wake up and it’s 5000 BC.
Or here’s a better explanation:
A monad is like when Donald Trump won the election, and then stacked the Supreme Court, and now we’re totally fucked!
That’s it! That’s exactly what a monad is!
It's what you add to a search to get more interesting answers on Google.
I don't understand monads either.
They look like concise programming for mathematicians, which, in my opinion, is tending towards set theory notation. https://xkcd.com/2545/
Personally I prefer more names, more XML tags, more comments, more parables, more hyperlinks, more different ways of expressing the same thing, to make it less ambiguous and easier to communicate and agree what we're all talking about.