Found in 10 comments on Hacker News
PaulHoule · 2025-06-27 · Original thread
There's closures and there's being able to transform the expression tree.

Graham's On Lisp is a really interesting book

https://paulgraham.com/onlisptext.html

which is allegedly about programming with macros, but I'd say 80% of the time he implements something with closures and then makes a macro-based implementation that performs better. That 80% can be done in Python, and the other 20% you wouldn't want to do in Python because Python already has those features... And if you wanted to implement meta-objects in Python you would do it Pythonically.
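To make that concrete, here is a minimal Python sketch (an illustration, not Graham's code) of the closure half of that pattern: state captured in a closure, no macros required. In On Lisp the macro version of this kind of utility exists mostly to claw back performance.

```python
# Closure-based memoization: the cache lives in the enclosing scope.
def memoize(fn):
    cache = {}                      # state captured by the closure

    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]

    return wrapper

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, computed in linear rather than exponential time
```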

Graham unfortunately doesn't work any examples that involve complex transformations on the expression trees, because those are hard, and if you want to work that hard you're better off looking at the Dragon book.

You can work almost all the examples in Norvig's Common Lisp book

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

in Python and today Norvig would advocate that you do.

PaulHoule · 2025-02-03 · Original thread
I read On Lisp by Graham recently and first thought "this is the best programming book I've read in a while", then had the urge to make copy-editing kinds of changes ("he didn't define nconc"), then thought "if he were using Clojure he wouldn't be fighting with nconc", and by the end thought "most of the magic is in functions; mostly he gets efficiency out of macros; the one case that really needs macros is the use of continuations" and "I'm disappointed he didn't write any macros that do a real tree transformation"

Then a few weeks later I came to the conclusion that Python is the new Lisp when it comes to metaprogramming (and async in Python does the same thing he coded up with continuations). I think homoiconicity and the parentheses are a red herring; the real problem is that we're still stuck with parser generators that aren't composable. You really ought to be able to add

   unless(X) { ... } 
to Java by adding one production to the grammar, a new node class for the AST, and a transformation for the compiler that rewrites it to

   if(!X) { ... } 
The actual code would probably be smaller than the POM file, if the compiler were built as if extensibility mattered.
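Python can't do the grammar half of that (its parser isn't extensible, which is exactly the complaint), but the tree-rewrite half really is a few lines with the stdlib ast module. Here's a sketch of an analogous rewrite: turning `if not X: A else: B` into `if X: B else: A`, which is the same shape of transformation a compiler-level `unless` would do.

```python
import ast

class NotIfSimplifier(ast.NodeTransformer):
    """Rewrite `if not X: A else: B` into `if X: B else: A`."""
    def visit_If(self, node):
        self.generic_visit(node)  # rewrite nested ifs first
        if (isinstance(node.test, ast.UnaryOp)
                and isinstance(node.test.op, ast.Not)
                and node.orelse):
            node.test = node.test.operand
            node.body, node.orelse = node.orelse, node.body
        return node

src = "if not ready:\n    wait()\nelse:\n    go()"
tree = NotIfSimplifier().visit(ast.parse(src))
print(ast.unparse(tree))  # prints the simplified source
```

The missing piece, as the comment says, is being able to add the surface syntax itself; the AST node and the rewrite are the easy parts.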

Almost all the examples in this book (which claims to be a tutorial for Common Lisp programming)

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

are straightforward to code up in Python. The main retort to this I hear from Common Lisp enthusiasts is that some CL implementations are faster, which is true. Still, most languages today have a big helping of "Lisp, the good parts". Maybe some day the Rustifarians will realize the wide-ranging impacts of garbage collection, not least that you can smack together an unlimited number of frameworks and libraries into one program and never have to think about making the memory allocation and deallocation match up.

PaulHoule · 2025-01-07 · Original thread
Lately I read Graham's On Lisp and at first felt it was one of the greatest programming books I'd ever read, so close to perfect that little things, like making me look "nconc" up in the CL manual (up to that point he'd introduced everything he talked about), made me want to go through and do just a little editing. And his explanation of how continuations work isn't very clear to me, which is a problem because I can't find a better one online. (The only way I think I'll understand continuations is if I write the explanation I want to read.)
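For whatever it's worth, the shortest explanation of continuations is continuation-passing style (a sketch, not from the book): nothing ever returns; each step hands its result to a function representing "the rest of the computation".

```python
# Continuation-passing style: instead of returning, call k with the result.
def add_cps(a, b, k):
    k(a + b)

def square_cps(x, k):
    k(x * x)

# (3 + 4) squared: the continuation of the add is "square it, then print".
add_cps(3, 4, lambda s: square_cps(s, print))  # prints 49
```

Once every call site is in this shape, the continuation is a first-class value you can store, invoke later, or invoke twice, which is what the fancier uses build on.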

Then I started thinking things like "if he were using Clojure he wouldn't be having the problems with nconc that he talks about", "I can work most of the examples in Python because the magic is mostly in functions, not in the macros", and "I'm disappointed that he doesn't do anything that really transforms the tree"

(It's still a great book that's worth reading, but anything about Lisp has to be seen in the context that the world has moved on... Almost every example in https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro... can be easily coded up in Python, because it was the garbage collection, the hashtables at your fingertips, and the first-class functions that changed the world, not the parens.)

Lately I've been thinking about the gradient that runs from the various tricks, such as internal DSLs and simple forms of metaprogramming (weak beer), up to what you can do if you know how compilers work.
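At the weak-beer end of that gradient, an internal DSL in Python can be as little as operator overloading that builds a description instead of evaluating (a made-up example, not from any particular library):

```python
# A toy query-fragment builder: comparisons on a Field produce strings
# describing the comparison rather than evaluating it.
class Field:
    def __init__(self, name):
        self.name = name

    def __gt__(self, other):
        return f"{self.name} > {other!r}"

    def __eq__(self, other):  # note: sketch only; this breaks hashing
        return f"{self.name} = {other!r}"

age = Field("age")
print(age > 21)   # age > 21
print(age == 21)  # age = 21
```

Real ORMs work on this principle; the compiler-level end of the gradient is being able to rewrite the tree itself, which no amount of operator overloading gets you.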

PaulHoule · 2024-09-06 · Original thread
A while back I concluded that you could code up most of the examples from

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

in Python without doing anything too out of the ordinary. I just finished

https://www.amazon.com/Lisp-Advanced-Techniques-Common/dp/01...

and came to almost the same conclusion, in that most of the macro use in that book is in the syntactic-sugar or performance-optimization category. If I didn't have a lot of projects in the queue and people demanding my time, I'd try coding up the ATN and Prolog examples in Python. (I sorta kinda did the "CLOS" example, in that I built a strange kind of meta-object facility that made "objects" backed by RDF triples.)
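The actual facility isn't shown, but the shape of the idea can be guessed at in a few lines of Python (purely a sketch, with made-up names): attribute reads resolve against (subject, predicate, object) triples.

```python
# Hypothetical sketch of a triples-backed "object": __getattr__ turns
# attribute access into a lookup over (subject, predicate, object) rows.
class TripleObject:
    def __init__(self, subject, triples):
        self._subject = subject
        self._triples = triples

    def __getattr__(self, name):
        # only called when normal attribute lookup fails
        for s, p, o in self._triples:
            if s == self._subject and p == name:
                return o
        raise AttributeError(name)

triples = [("ex:paul", "name", "Paul"), ("ex:paul", "lang", "Lisp")]
paul = TripleObject("ex:paul", triples)
print(paul.name)  # Paul
```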

In Java I did enough hacking on this project

https://github.com/paulhoule/ferocity/blob/main/ferocity0/sr...

to conclude I could create something that people who think "Java Sux" and people who think "Common Lisp Sux" would both really hate. If I went forward on it, the goal would be "code golf, not counting POM file size": dividing ferocity into layers (like ferocity0, which is just enough to write a code generator that can stub the whole stdlib) so I could use metaprogramming to fight back against the bulkification you get from writing Java as typed S-expressions. (I'm pretty sure type erasure can be dealt with if you aren't interested in JDK 10+ features like var and switch expressions; on the other hand, a system like that doesn't have to be able to code-generate syntactic-sugar sorts of structures, because ferocity is all about being able to write syntactic sugar.)

PaulHoule · 2024-07-25 · Original thread
I'm going to argue that Lisp already won.

That is, other programming languages have adopted many of the features that made Lisp special, such as garbage collection (Rustifarians are learning the hard way that garbage collection is the most important feature for building programs out of reusable modules), facile data structures (like the scalar/list/dict trinity), higher-order functions, dynamic typing, a REPL, etc.

People struggled to specify programming languages up until 1990 or so. Some standards were successful, such as FORTRAN, but COBOL was a hot mess that people filed lawsuits over, PL/I was a failure, etc. C was a clear example of "worse is better", with some kind of topological defect in the design such that there's a circularity in the K&R book that makes it confusing if you read it all the way through. Ada was a heroic attempt to write a great language spec, but people didn't want it.

I see the Common Lisp spec as the first modern language spec written by adults; it inspired the Java spec, the Python spec, and pretty much all languages developed afterwards. Pedants will consistently deny that the spec is influenced by the implementation, but that's absolutely silly: modern specifications are successful because somebody thinks through questions like "How do we make a Lisp that's going to perform well on the upcoming generation of 32-bit processors?"

In 1980 you had a choice of Lisp, BASIC, PASCAL, FORTRAN, FORTH, etc. C wound up taking PL/I's place. The gap between (say) Python and Lisp is much smaller than the gap between C and Lisp. I wouldn't feel that I could do the macro-heavy stuff in

https://www.amazon.com/Lisp-Advanced-Techniques-Common/dp/01...

in Python but I could write most of the examples in

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

pretty easily.

wkyleg · 2024-07-18 · Original thread
I like Peter Norvig's book "Paradigms of AI Programming", where you learn old-fashioned symbolic AI with LISP and Prolog. Is it outdated? Absolutely, but it is a classic read.

Maybe a use case for new AI models could be creating more old fashioned expert systems written in LISP or Prolog that are easier for humans to audit. Everything tends to come back full circle.
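A toy forward-chaining rule engine shows why such systems are easier to audit: every derived fact traces back to an explicit rule (a made-up example, not from the book).

```python
# Toy forward-chaining inference: apply rules until no new facts appear.
facts = {"has_fever", "has_cough"}
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        # a rule fires when all its premises are established facts
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
# ['has_cough', 'has_fever', 'possible_flu', 'recommend_rest']
```

Unlike a neural model, the chain of rules that produced any conclusion can be printed and inspected, which is the auditability argument.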

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

boysabr3 · 2018-01-05 · Original thread
Haven't read it personally but heard great things about: Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp - Peter Norvig

https://www.amazon.com/dp/1558601910

stiff · 2014-01-03 · Original thread
It is actively harmful to teach students that software architecture is something that somehow arises from diagrams or that those kinds of silly pictures capture anything important about it. Powerful architectures come out of powerful ideas that in turn come from accumulated hard work of many people in different disciplines. One can learn much more from walking through the actual source code of some classic projects and from trying to understand the ideas that make them tick:

https://github.com/onetrueawk/awk - UNIX philosophy of small tools, DSLs, CS theory: state machines / regular expressions, Thompson algorithm ...

https://github.com/mirrors/emacs - Both a program and a VM for a programming language, hooks, before/after/around advice, modes, asynchronous processing with callbacks, ... Worth thinking about the challenges of designing interactive programs for extensibility.

https://github.com/rails/rails - Metaprogramming DSLs for creating powerful libraries, again a lesson in hooks (before_save etc.), advices (around_filter etc.), ...

https://github.com/git/git - The distributed paradigm, lots of CS theory again: hashing for ensuring consistency, DAGs everywhere, ... By the way, the sentence "yet the underlying git magic sometimes resulted in frustration with the students" is hilarious in the context of a "software architecture" course.

One of computer algebra systems - the idea of a http://en.wikipedia.org/wiki/Canonical_form

One of computer graphics engines - Linear algebra

...

There are loads of things one can learn from those projects by studying the source in some depth, but I can't think of anything valuable one could learn by just drawing pictures of the modules and connecting them with arrows. There are also several great books that explore real software design issues rather than that kind of pretentious BS. They all come from acknowledged all-time master software "architects", yet almost none of them ever find diagrams or "viewpoints" useful for saying the things they want to say, and they all walk you through real issues in real programs:

http://www.amazon.com/Programming-Addison-Wesley-Professiona...

http://www.amazon.com/Paradigms-Artificial-Intelligence-Prog...

http://www.amazon.com/Structure-Interpretation-Computer-Prog...

http://www.amazon.com/Unix-Programming-Environment-Prentice-...

http://www.amazon.com/Programming-Environment-Addison-Wesley...

To me, the kind of approach pictured in the post seems like copying methods from electrical or civil engineering to appear more "serious", without giving due consideration to whether they are really helpful for real-world software engineering. The "software engineering" class that taught that kind of diagram-drawing was about the only university class I never got any use from; in fact, I had enough industry experience by the time I took it that it just looked silly.