Found in 2 comments on Hacker News
gregjor · 2024-03-06 · Original thread
I have almost 45 years of experience in software development, so I can describe the skills that led to my own longevity.

Hardware and the tools and languages available for writing software changed and improved in many ways during my career. My phone has more compute power and memory than the mainframes I worked on in the '70s and '80s. We have more expressive and safer languages, and better tools for writing and testing code. On the other hand I still use, every day, things that date from early in my career: C and C-family languages, relational databases and SQL, Unix in the form of Linux, modern versions of the vi editor. Very little about the practice of software development changed in the last 4.5 decades -- programming books written in the '70s teach the same lessons as those published last year; you just have to read PL/I in one and Javascript or Rust in another. Read anything by Brian Kernighan to understand how little software development has actually changed in 40 years, and how programmers continue to repeat the same mistakes, "reinvent the flat tire," as Alan Kay put it.

A big lesson learned way back in the history of software development: Getting the requirements right is the hard part. That hasn't changed, software projects still succeed or fail mostly due to success or failure in the requirements and specification steps. A lot of programmers don't have the experience or interest to learn the business domain or engage with management and users, so they can't write or even correctly interpret requirements. When I started my career I worked with analysts who translated business domain expertise and business requirements into actionable software development plans for the programmers to implement [1] [2]. The analyst role faded away in many companies as software development changed from an ancillary activity (supporting an accounting or logistics department, for example) to a standalone discipline producing products that businesses would adapt to. Today software often dictates requirements rather than the other way around -- if you've ever worked at a company that adopted Salesforce or SAP or Oracle you know what I mean.

Another lesson known at least as far back as Brooks[3], who wrote a book about it: People and organizational issues determine the success of software projects to a greater degree (I'd say much greater) than technical skill or choice of language. Learning how to manage teams and projects and work with all of the business stakeholders will prove more durable and valuable than learning the new language of the year.

Writing code got more mechanical over the years, and bugs and performance issues and resource usage became less critical because modern hardware can handle sloppy and bloated code, and modern operating systems and programming languages have a lot more guardrails and fault tolerance. That glosses over a lot of bad code, which in my opinion led to even more sloppy programming. Since LLMs will train mostly on modern code in modern languages I think we can expect more of the same -- workable code that just reiterates what someone else already wrote. That describes what a majority of programmers do now, plugging APIs together, so LLMs can join the team.

To answer your question: I would focus on the fundamentals, which have little to do with programming languages. The fundamental technologies that have stood up for decades with no sign of going away:

- Relational databases and SQL.

- Unix-family operating systems [4].

- C, because it influenced so many later languages, and expresses the essence of programming succinctly.

- Lisp, because it offers a counterpoint to the imperative C style that came to dominate, even if almost no one writes Lisp code today.

- Algorithms and data structures, and algorithmic complexity, which you can learn from '70s-era books by Knuth and Wirth.
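
To make that last point concrete, here is a minimal sketch in C (my own illustration, not an example from any of the books above) of why algorithmic complexity matters: a linear scan costs O(n) comparisons while binary search on a sorted array costs O(log n), a difference that dwarfs any choice of language or hardware once n gets large.

    #include <stdio.h>

    /* Linear scan: O(n) comparisons in the worst case. */
    static int linear_search(const int *a, int n, int key)
    {
        for (int i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }

    /* Binary search on a sorted array: O(log n) comparisons. */
    static int binary_search(const int *a, int n, int key)
    {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
            if (a[mid] == key)
                return mid;
            if (a[mid] < key)
                lo = mid + 1;
            else
                hi = mid - 1;
        }
        return -1;
    }

    int main(void)
    {
        int sorted[] = {2, 3, 5, 8, 13, 21, 34, 55};
        int n = sizeof sorted / sizeof sorted[0];
        printf("linear: index %d\n", linear_search(sorted, n, 21));
        printf("binary: index %d\n", binary_search(sorted, n, 21));
        return 0;
    }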

I'll add networking and security, things we didn't have or worry much about pre-internet, but today I see a lot of demand for actual experts and few people going that direction, preferring to join the front-end web dev React swarm because of the relatively low bar to entry.

I would also focus on learning business domains and the systems that keep businesses running. I can't get a job today because I know Javascript or Go, but I can get a job because I know a lot about enterprise-level logistics. Outside of the Silicon Valley tech meat grinder you find millions of companies solving business problems they can describe and attach actual costs and risks to, not looking for the next disruption (or grift) with a small chance of success. Businesses don't have requirements like "We need another thousand lines of Javascript by next month." They have requirements like "We need to reduce late deliveries and mistakes by 20%." So learn at least one business domain (best to learn on the job), pay attention to the whole organization and the people, make contacts and friends who aren't fellow Rust geeks, and hone your skills translating business needs into actionable software specifications.

Learn and practice speaking and writing. People who can effectively communicate have a significant advantage over those who can't. [5]

You face a tough job market, especially as someone trying to enter the field. Look beyond the big tech companies. You can't ignore AI (what we're calling "AI" this decade) and the possibility that it will eventually write entire working software applications, though today that's more hype and VC dreams than reality. But if that does happen I think businesses will still need actual experts who have experience and skills, not just a good memory and an algorithm for spewing out code. And I know businesses will need people who can solve business problems rather than just write code.

[1] https://www.amazon.com/Structured-Design-Fundamentals-Discip... (for example)

[2] https://www.amazon.com/Reliable-software-through-composite-d... (for example)

[3] https://en.wikipedia.org/wiki/The_Mythical_Man-Month

[4] http://doc.cat-v.org/bell_labs/utah2000/

[5] https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf

wallstprog · 2019-07-22 · Original thread
"Those who do not learn history are condemned to repeat it".

I came up writing code in BAL, COBOL, PL/1, and a bunch of other non-OOP languages, and those who think procedural programming is better have probably never done it. The starting point for OOP, at least for me, was the splendid book "Reliable Software through Composite Design" by Glenford Myers, which was the first to study coupling, cohesion, and other techniques that led directly to OOP. This book still maintains a position of honor on my shelf, and I strongly recommend it for anyone doing software development. (You can get a PDF from https://archive.org/details/reliablesoftware00myer or buy your own hard copy at https://www.amazon.com/Reliable-software-through-composite-d...).
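
To give a rough feel for what coupling and cohesion mean in practice, here is a small sketch in plain C (my own illustration, not an example from Myers's book): a hypothetical "account" module that keeps one cohesive responsibility behind a narrow interface, so callers depend on three functions rather than poking at a shared global.

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical account module, collapsed into one file for brevity.
     * High cohesion: everything here concerns one thing, the balance.
     * Low coupling: callers use only the three functions below; in a real
     * program the struct definition would be private to account.c, making
     * it opaque to callers. */
    struct account {
        long balance_cents;
    };

    struct account *account_open(long initial_cents)
    {
        struct account *a = malloc(sizeof *a);
        if (a)
            a->balance_cents = initial_cents;
        return a;
    }

    int account_deposit(struct account *a, long cents)
    {
        if (cents < 0)
            return -1;        /* the module enforces the rule, not every caller */
        a->balance_cents += cents;
        return 0;
    }

    long account_balance(const struct account *a)
    {
        return a->balance_cents;
    }

    int main(void)
    {
        struct account *a = account_open(1000);
        if (!a)
            return 1;
        account_deposit(a, 250);
        printf("balance: %ld cents\n", account_balance(a));
        free(a);
        return 0;
    }

The tightly coupled alternative -- every caller updating a shared global balance and re-implementing the validation -- is exactly the kind of structure the coupling and cohesion measures were meant to flag.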

Now, crappy programmers can write crappy code in any language, just as good programmers can write good code in any language -- the prevalence of crappy code speaks more to the number of crappy programmers than to crappy languages. I would suggest that OOP enables crappy programmers to write code simply because without OOP crappy programmers wouldn't be able to write any code at all.

When functional programming can deliver performance on a par with conventional imperative programming, we can talk about it. Until then it is a curiosity, not ready for prime time.

To paraphrase Winston Churchill's comments on democracy: "OOP is the worst programming paradigm ever invented -- except for all the other ones".
