Found in 3 comments on Hacker News
rektide · 2023-01-28 · Original thread
The very nature of what we do is so different. We have so many more interactions with so many different libraries & technologies than we used to have.

I'm missing some of the other good recent discussions, but there's a core idea I love & think is true: that modern software dev is much more about "glue" code, about combining existing libraries, services, systems. Some recent pieces: https://www.oreilly.com/radar/thinking-about-glue/ https://blog.metaobject.com/2021/06/glue-dark-matter-of-soft...

By comparison, software dev used to be a purer act of raw artifice: I remember we had our db.perl & db.php at the little webshack I started at. We authored/maintained/grew it, and it was just our little library of helpers, totally independent of the rest of the world, imported across the dozens of properties we webmastered. That's how software was! Independently developed, owned, & operated. Now there is much assumed context, countless tools & technologies in play in every situation. Rather than pulling big levers of our own making, we adjust large existing systems, or graft on new sub-machines.

Perl, in my mind, was the real starting place, the pivot where open source let us start to consume each other's code rapidly: CPAN was a great early way to start finding & using code, rather than writing it yourself. Since then, online package repos have only increased.

And some years after that, hosted services started growing in popularity. Cloud platforms started emerging. Rather than being a master-crafter pulling stuff in, now we were relying fully on others, "doing less" ourselves. The cloud made adopting much more complex offerings enticing & easy.

Returning to the article: most tech we discover & start learning about is quite well established, but little of it comes with a good history of what really changed when, or what the other influences were. Now that we encounter so many more varied systems & libraries, the abilities to orient, to gather context, to understand arbitrary things we're faced with, to hold models in our heads that we can apply & adapt quickly & ably, to come into new situations & understand them, and to experiment & explore our way forward, are hugely important.

It's just a guess, but to me, where we are, surrounded by so much potential but with so many long backstories lurking behind everything, strongly suggests the premise of Constructivism (an underlying philosophy embraced by many of the One Laptop Per Child[1] (OLPC/XOPC) innovators, such as Seymour Papert).

> Constructivism is the theory that says learners construct knowledge rather than just passively take in information.

[1] https://wiki.laptop.org/go/Constructionism

rektide · 2021-10-20 · Original thread
I would love to see more innovation in scripting languages. Nothing I've seen comes close to Perl for swizzling together a decent, capable, handy programming language with a variety of system binaries & other shell scripts.

There are a couple of efforts, like shelljs[1] or the far simpler & more contemporary zx[2], to give shell-scripting-like ease to JS. zx's core is basically just a tagged template string for running system() calls, e.g. $`echo 2+2`. I still find the experience pretty sub-par compared to shell & to scripting-first languages like Perl.
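As a minimal sketch of what that looks like in practice (assuming zx is installed globally, e.g. via npm i -g zx):

    #!/usr/bin/env zx
    // $ is zx's tagged template: it runs the command in a shell and
    // resolves with the captured output once the process exits.
    const out = await $`echo 2+2`;
    console.log(out.stdout.trim()); // prints "2+2" (echo is literal)

    // Interpolated values are shell-escaped automatically:
    const dir = 'My Documents';
    await $`ls ${dir}`;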

Perl is just super powerful, keeping a foot in a more userland-like environment, versus the totalizing experience of libraries. Why do you need to pick an http client library when everyone is already quite intimate with curl & it's already on the system? Somehow coding keeps taking more turf, but I feel like Perl's scripting was extremely glorious & largely un-reproduced. Perl is still some of the best "glue" the planet has seen. And glue, as we've read a couple of times on HN[3][4] (& I agree), is really important & dominates what we build.
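A rough sketch of that "curl is already on the system" idea, shown here in the same zx style as above rather than Perl (the endpoint is a hypothetical stand-in):

    #!/usr/bin/env zx
    // Lean on the system's curl instead of adding an HTTP client
    // dependency, then parse whatever it hands back.
    const res = await $`curl -s https://api.example.com/status`;
    const status = JSON.parse(res.stdout);
    console.log(status);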

To the topic itself: as a Debian user I appreciate Perl, but I wish it weren't on my OS taking up ~20MB of space & getting more & more offbeat, though it's probably never going away. Not sure why it irks me that my computers will all need that ~20MB of Perl + Debian perl libraries. And it is pretty cool how debuggable & visible the Debian tools often are, because by & large they're scripts, not compiled programs. But there are a lot of operators to look up & a lot of implicit variables to remember in Perl programs, and it's a reminder of how offbeat Perl is every time I poke my head in.

[1] https://github.com/shelljs/shelljs

[2] https://github.com/google/zx

[3] https://news.ycombinator.com/item?id=27486706 https://blog.metaobject.com/2021/06/glue-dark-matter-of-soft...

[4] https://news.ycombinator.com/item?id=27880183 https://www.oreilly.com/radar/thinking-about-glue/