Found in 49 comments
rpeden · 2018-05-23 · Original thread
This might seem like a roundabout way to start, but I'd recommend Code by Charles Petzold[0].

It starts out with just wires, switches, and relays, and as the book progresses he goes through the process of building up a simple CPU and RAM one step at a time. The book even walks you through coming up with opcodes and assembly language.
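To give a flavour of that progression, here's a toy Python sketch (my own, not from the book) that treats a relay as a controlled switch and wires relays into gates, the same move Petzold makes early on:

```python
# A relay is just a controlled switch: the control wire decides whether
# the signal passes through.
def relay(control: bool, signal: bool) -> bool:
    """Pass `signal` through only when the control wire is energized."""
    return signal if control else False

def and_gate(a: bool, b: bool) -> bool:
    # Two relays in series: the signal must pass through both.
    return relay(b, relay(a, True))

def or_gate(a: bool, b: bool) -> bool:
    # Two relays in parallel: either path can carry the signal.
    return relay(a, True) or relay(b, True)

# Truth tables fall out of the wiring:
for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", and_gate(a, b), "OR:", or_gate(a, b))
```

Everything else in the book is, in a sense, just more of these stacked on top of each other.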

Even if you already know this stuff, I found the book was helpful in developing an intuitive feel for how everything works and fits together.

After reading it, you'd probably have a good mental model of how you'd want to approach writing an emulator.

The Nand to Tetris courses and their accompanying textbook would probably be helpful here too[1][2].

[0] [1] [2]

fauria · 2017-12-27 · Original thread
Chapter 22 of "Code: The Hidden Language of Computer Hardware and Software" uses CP/M to illustrate the inner workings of operating systems. I highly recommend this book overall:
userbinator · 2017-10-14 · Original thread
Teaching CS is hard, but I think the root of the problem is teaching people how to think. Learning to program is easy compared to the problem of learning how to think algorithmically.

IMHO this is because a lot of CS courses start at a very high level with very abstract concepts, which might also be because they can't afford to start with the very basics of "how does a computer work" due to the limited time available.

On the other hand, I think CS should always start with a book like this one, which truly does start at the basics:

A large part of why beginners fail is that they don't comprehend how the machine works, and so expect too much of it. I believe that once they see how this complexity is actually the result of a large number of very, very simple operations, they will have a better understanding of what it means to instruct the machine.

userbinator · 2017-06-28 · Original thread
But software is so pervasive that within a generation or two, not understanding how it works will put you at a severe disadvantage.

Unfortunately the corporations seem determined to put a stop to that sort of pervasive knowledge, if only for the purpose of controlling and monetising their users. They don't want people to know how easy it is to do things like strip DRM or remove artificial limitations from software. [See Stallman's famous story, or the Cory Doctorow articles on the demise of general-purpose computing.]

And thus most of the "learn to code" efforts I've seen have seemed to focus on some ultra-high-level language, removed from reality and often sandboxed, so that while people do learn the very-high-level concepts of how computers work, they are no less unclear about the functioning of the actual systems they use daily --- or how to use them fully to their advantage. In some ways, they're trying to teach how to write programs without understanding existing ones.

The document said children should be writing programs and understand how computers store information by their final years of primary or intermediate school.

However, this sounds more promising. Especially if they're starting more along the lines of this book:

Qrius · 2017-06-09 · Original thread
I'm extremely grateful and was not at all expecting such an explanation.

I want to explain a few things.

Let me rephrase what I meant by "minimizing the time wasted". You see, there's a lot of great advice available online. You ask something on a subreddit or here, and people share great resources. I love this kind of learning. My concern is that sometimes these resources and advice are given along the lines of "although it's not completely necessary, it'll still be an experience in itself".

The problem here is that this kind of learning can waste too much time and leave you confused. People ask so many questions about CompSci every day, and you'll find books recommended ranging from the complete basics of computing (Code, the Nand2Tetris course, etc.) to something very sophisticated like AI. I hope you can understand that if a person spends too much time on these kinds of things, given that he's got a job or is a student at a university with a sweet CompSci curriculum (you know what I mean), then it's a problem. Although the above-mentioned resources are exceptional, there are others that teach the same things. Can a person read all of them one by one, "just to satisfy his curiosity, thinking that it'll help him in the future"?

RE is already an extremely sophisticated and vast field which requires computer mastery. I'm in college and it has made me hate things I loved. I'm an extremely curious guy and can easily spend 10-20 hours in front of a PC. I have ~6 years of experience with Linux. But I'm really not in a position to read two or three 400-800 page books on a single topic that I don't even know is required for RE. Some topics are quite difficult, but if we at least know that a topic IS mandatory for RE, then we can be sure about it and refer to other resources. If you don't even know what your syllabus is, how can you concentrate on it and master it, let alone learn it? RE requires you to study every minute detail of a computer system, but wasting too much time on those horrible digital logic and design courses is really not worth it.

So my purpose is to make it completely clear what I actually need to know, so that I can focus on it instead of reading each and every topic in complete detail, thinking that if I miss the direction of even a single electron in I/O I won't be able to do efficient reversing. I'm fed up with those architecture diagrams full of arrows, and with cramming the definitions of ROM, EPROM, EEPROM... again and again for tests and assignments.

I have a few questions for you:

You mentioned Computer Organization and Design, which I think is authored by Patterson and Hennessy and used by almost all universities. I'm just curious about its not-so-great Amazon reviews. Also, what's your opinion of Tanenbaum's books, which you mentioned in that Reddit link?

Now let me summarize what I've understood (PLEASE help me correct it if I'm wrong):

>>>> UNDERSTANDING the system you want to hack

> Learn the most-used fundamental programming languages (the way we TALK to computers):

1. C (also C++ in some cases)
2. Python or Ruby (given their dominance in the industry right now thanks to their productive nature; also used in exploit writing)
3. Java or C# (object-oriented programming, which along with the above languages completes our programming fundamentals)
4. Assembly (obviously needed in RE)

It probably need not be mentioned that we need a good grasp of Data Structures and Algorithms in the above languages (obviously not all of them).

> Understand each and every data flow and HOW a computer system works

Computer Organization and Design and Architecture

(OS fundamentals, memory management, virtual memory, paging, caching, etc.; I think Linux (macOS too) and Windows internals come in here)

You restored my faith in humanity when you said I can skip the hardware and microcode parts (please explain which specific topics; I swear I won't look at them again until I'm done with the required topics).

> Network fundamentals and programming: basics of HTTP, TCP/IP and other protocols... socket programming


> Learning WHAT loopholes exist in the above process of data reads and writes: types of attacks (buffer overflows, heap overflows...)

> HOW those loopholes are exploited

> Reverse engineering (learning the tools of the trade: IDA, gdb...), learning and practising reversing, fuzzing

> Exploiting the bugs and writing exploits.

Please review and correct. Thanks again.

ctrlp · 2017-06-05 · Original thread
If you consider C to be language-agnostic, here are some gems. These are personal favorites as much for their excellent writing as for their content.

The Unix Programming Environment was published in 1984. I read it over 20 years later and was astonished at how well it had aged. For a technical book from the '80s, it is amazingly lucid and well-written. It pre-dates modern Unix, so things have changed, but much that goes unstated in newer books (for brevity) is explicit in UPE. (Plus, the history itself is illuminating.) It gave me a much deeper understanding of how programs actually run on computer hardware. Examples in C are old-school and take a bit of close reading, but are oh so rewarding.

Mastering Algorithms in C. Another fantastically well-written book that shows (with practical examples) how to implement common algorithms. This is just such a great book!


Code (Petzold). This one is truly language-agnostic. Others have mentioned it already. Can't recommend enough if you're iffy on the internals of computers and programming.

Write Great Code (Volumes I and II). Randall Hyde's books are fantastic explications of the underlying computer operations. Examples are in assembly or pseudo-code but easy to understand.

relics443 · 2017-01-15 · Original thread
I still haven't read anything better than Code by Charles Petzold [1] and it's not even close.


jventura · 2016-12-28 · Original thread
After quitting a job with a lot of commute time in it, and having failed to monetize a side project, I finally landed a teaching position at a local technical university.

I always loved learning and teaching, and a side effect of this is that now I've regained the curiosity I always had about the fundamentals of our industry (I've a CS PhD). So now I'm back reading about the fundamentals of electricity and building 8-bit digital adders with basic AND/OR/XOR logic gates [0].
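Those 8-bit adders are fun to mirror in software. Here's a minimal ripple-carry adder built from the same AND/OR/XOR primitives (a sketch of the circuit's logic as I'd write it, not from any particular book or course):

```python
# A full adder made of two half adders: three XORs, two ANDs, one OR.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add8(x: int, y: int) -> int:
    """Add two 8-bit numbers by rippling the carry through 8 full adders."""
    result, carry = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # wraps modulo 256, just like 8-bit hardware

print(add8(100, 55))   # 155
print(add8(200, 100))  # 44 (the overflow carry is dropped, so it wraps)
```

Chaining eight of these is exactly the "one step at a time" build-up the hardware version demands.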

There are still lots of fundamental things that I want to re-learn, and for 2017 I'm thinking of writing a book about learning programming through exercises (with just enough theoretical concepts), starting from flowcharts and pseudocode, up to some basic algorithms and abstract data structures/types (probably using Python). My idea is that there are lots of students out there who could benefit from learning how to program by solving focused exercises, and learn enough about algorithms and structures to feel capable of doing more complex things (i.e., not feel the "impostor" syndrome).

[0] -

userbinator · 2016-12-24 · Original thread
I agree, the book isn't very "bottom-up" at all, perhaps with the exception of "Binary and Number Representation" being the second chapter; the rest of it looks like OS stuff.

This is what I'd consider "bottom up":

ranman · 2016-11-30 · Original thread
I have a physics background but not an EE background. I found verilog pretty easy to grasp. VHDL took me a lot longer.

To get some basic ideas I always recommend the book code by charles petzold:

It walks you through everything from the transistor to the operating system.

(Apparently I need to add that I work for AWS on every message so yes I work for AWS)

userbinator · 2016-11-27 · Original thread
I always recommend this book for those who would like to know how computers really work:

boatsock · 2016-11-20 · Original thread
Code by Charles Petzold would be the best in my opinion.


userbinator · 2016-10-12 · Original thread
I don't think we're near that many layers of abstraction (yet --- and hopefully never?), but certainly more than ten. Indeed it's all about switching binary signals eventually. There's a great book about that too:

Haven't read the book you're recommending, but I feel it's more or less close to Code[0] by Charles Petzold, which in itself is a fascinating read.


gooseus · 2016-08-07 · Original thread
This is the sort of thread that hits me right in the wallet.

Here are some books I've given as gifts recently:

* The Knowledge: How to Rebuild Civilization in the Aftermath of a Cataclysm, Lewis Dartnell[1]

* The Black Swan, Nassim Taleb[2]

* Siddhartha, Hermann Hesse[3]

* The Happiness Trap, Russ Harris and Steven Hayes[4]

* Code, Charles Petzold[5]






tonyonodi · 2016-05-09 · Original thread
I'm a programmer without a computer science degree and I'm quite aware that CS is a bit of a blind spot for me so I've tried to read up to rectify this a little.

I found The New Turing Omnibus[1] to give a really nice overview of a bunch of topics, some chapters were a lot harder to follow than others but I got a lot from it.

Code by Charles Petzold[2] is a book I recommend to anyone who stays still long enough; it's a brilliant explanation of how computers work.

Structure and Interpretation of Computer Programs (SICP)[3] comes up all the time when this kind of question is asked and for good reason; it's definitely my favourite CS/programming book, and it's available for free online[4].

I'm still a long way off having the kind of education someone with a CS degree would have but those are my recommendations. I'd love to hear the views of someone more knowledgable.

[1] [2] [3] [4]

lisper · 2016-05-04 · Original thread
> no one has time to learn everything

Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.

Here's a single, small, very accessible book that takes you all the way from switches to CPUs:

SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.

erichocean · 2016-03-03 · Original thread
Code: The Hidden Language of Computer Hardware and Software[0]

My kids enjoyed this book. It covers a similar topic, but it's fairly playful in how it's put together: an extremely gentle introduction that doesn't shy away from how things actually work. It's hard to imagine a reader not coming away with a much better understanding of what computing is all about. It starts at gates and works up to actual (machine) code by the end of the book. Very good diagrams throughout.

Despite being from 2000, I don't think it's become outdated. I'd love it if there was a sequel that covered putting things together with a cheap FPGA.


carise · 2016-01-20 · Original thread
This book was something I found very fun to read:

N.B. I did read the book when I was in university studying CS, but I felt like it was a good balance of history and tech information.

spb · 2015-06-11 · Original thread
> Intro articles like this do a lot to reveal biases and misunderstandings.

This is one of the reasons I barely recommend any intro articles in Lean Notes: almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.

Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)

It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.

The only time I've ever seen somebody actually qualified to write an introductory text doing so (that I can immediately recall) is Charles Petzold's [Code: The Hidden Language of Computer Hardware and Software][Code] (although I suspect, from the few excerpts of it I've seen, that Brian Kernighan's "D is for Digital" is good, too).


barking · 2015-05-23 · Original thread
A great book and accompanying software imo lacks just 2 things.

1: the building of the NAND gate itself, and 2: the building of a flip-flop.

Both these tasks can be easily accomplished with reference to the book 'Code' by Charles Petzold,

and software such as this
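Both pieces are small enough to simulate in software, too. Here's a rough Python sketch (my own, not from the book or the Nand2Tetris tools) of a NAND gate and the cross-coupled NAND (SR) latch, i.e. the flip-flop mentioned above:

```python
# NAND is functionally complete: every other gate can be built from it.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def sr_latch(s: int, r: int, q: int) -> int:
    """Active-low NAND latch: drive s=0 to set, r=0 to reset.
    `q` is the currently stored bit; with s=1 and r=1 the latch holds it."""
    q_bar = nand(r, q)
    # Iterate the cross-coupled feedback until the outputs settle.
    for _ in range(4):
        q = nand(s, q_bar)
        q_bar = nand(r, q)
    return q

q = 0
q = sr_latch(0, 1, q)  # set:   q becomes 1
q = sr_latch(1, 1, q)  # hold:  q stays 1
q = sr_latch(1, 0, q)  # reset: q becomes 0
```

The feedback loop is the whole trick: the latch "remembers" a bit only because each gate's output feeds the other's input.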

rcthompson · 2015-03-26 · Original thread
Here's a book that I had a random encounter with as a teenager, which gave me an excellent understanding of how computers work at the lowest levels:

It basically starts with a "code" of two friends talking to each other between houses at night via blinking flashlights, and gradually builds up from there to a full, if somewhat barebones, microprocessor, logic gate by logic gate. And it does so in a way that teenage me was able to follow.

gary__ · 2014-12-11 · Original thread
The Soul of a New Machine by Tracy Kidder, the classic book following the development of a new minicomputer in the late 70s.

Stealing the Network: How to Own the Box. This is a collection of fictional accounts of "hacking" written by hackers. Real-world techniques are described, though in lightweight detail; the aim of the book is more to give an insight into how an attacker thinks. It's quite an enjoyable read too.

Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen. This one's a true story.

Code: The Hidden Language of Computer Hardware and Software By Charles Petzold. I still have to read this one, but I expect it would fit in with what you're after quite well.

bprater · 2014-10-19 · Original thread
If you want a comprehensive read on this topic, check out the book Code by Charles Petzold:
akmiller · 2014-08-20 · Original thread
I haven't read that one, but I'll definitely check it out. I'd also recommend Charles Petzold's book called "Code". One of the best technical books I've ever read!

agumonkey · 2014-07-13 · Original thread
I was part of that generation. IIRC most kids didn't give a damn about it. At that age everything is just a weird random novelty. The device itself was exciting, though; the TO7 and light pen were cute. Besides, in the 80s computers weren't a thing; even video games were barely established at home. And LOGO didn't feel like programming: turtling felt more like geometry (left is down if facing left) than anything else, at least to me. We didn't really go into iterations and such.

I hope the new effort will use books like Code or something similar that don't take a macbook air for granted but instead use down to earth first principles that can be shown, built and tested with kids hands.

For slightly older kids, wishing for HtDP inspired classes.

angersock · 2014-01-08 · Original thread
One of the best books I've seen takes this approach:


Starting from either extreme (pure maths or pure electrical engineering) is quite healthy--starting in the middle, though, does a disservice.

sanderjd · 2014-01-03 · Original thread
You may enjoy a book he wrote called CODE[0], which is one of my absolute favorites, but I doubt it will convince you to share his political opinions.


joshvm · 2013-11-16 · Original thread
Some resources on making tiny Hello World programs down to the kernel level that may be useful:

A wee bit heavy, but it's comprehensive. It deals with what happens when you run code, how the architecture of the computer works (by and large) including at the logic level:

If you want to go lower (and higher).. look at Understanding the Linux kernel for a good understanding of how an OS is put together, with specific examples i.e. Linux.

Code, by Petzold, deals with logic and computers from the ground up. It starts with relays and builds them up into gates and usable arithmetic blocks.

The physics is fairly simple, at least from a CRT or LED display perspective. Gets more tricky dealing with interconnecting microprocessors because a good chunk is vendor specific.

I think this kind of project is well suited to a guide on how to build a computer from the ground up, starting with logic gates, writing a real time OS and developing a scripting language that will run and compile on it. Then you can skip a lot of largely extraneous stuff and have a solid understanding of how the hardware works.

planckscnst · 2013-09-16 · Original thread
Petzold's "Code: The Hidden Language of Computer Hardware and Software" aids in the understanding of how computers work at the lowest levels.

capkutay · 2013-08-19 · Original thread
Cool article! "Code" by Charles Petzold[0] talks about morse code while covering ways that we encode data. May be a good read for anyone interested in this topic.
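As a taste of that encoding theme, a minimal Morse encoder (letters only; my own sketch, not code from the book) is just a lookup table and a join:

```python
# International Morse code for the letters A-Z.
MORSE = {
    "A": ".-",   "B": "-...", "C": "-.-.", "D": "-..",  "E": ".",
    "F": "..-.", "G": "--.",  "H": "....", "I": "..",   "J": ".---",
    "K": "-.-",  "L": ".-..", "M": "--",   "N": "-.",   "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.",  "S": "...",  "T": "-",
    "U": "..-",  "V": "...-", "W": ".--",  "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def encode(text: str) -> str:
    """Encode letters as Morse; letters separated by spaces, words by ' / '."""
    words = text.upper().split()
    return " / ".join(
        " ".join(MORSE[c] for c in w if c in MORSE) for w in words
    )

print(encode("hello"))  # .... . .-.. .-.. ---
```

The variable-length codes (common letters like E and T get the shortest symbols) are the same compression idea Petzold keeps returning to.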


chrishenn · 2013-03-30 · Original thread
Thanks. The low level stuff certainly seems like a good thing to learn—I made an effort to read Code[0] a while back and most of it went over my head. The high level programming languages most people work in today are abstracted to a point where it's extremely hard to see their relation to the low level stuff. It would be nice to understand that link better.


pacaro · 2013-03-20 · Original thread
The best bottom up approach to this that I have seen is Charles Petzold's "Code: The Hidden Language of Computer Hardware and Software" [1] which starts with using a flashlight to send messages and walks up the abstraction chain (switch, relay, alu, memory, cpu...) to most of the components of a modern computer. It's very accessible.


tokenadult · 2013-01-12 · Original thread
The book Algorithmics: The Spirit of Computing doesn't read like a textbook to me, and it's quite interesting.

The New Turing Omnibus

is also good, as is Code by Charles Petzold.

AFTER EDIT: While I thought about the first three books I mentioned, I thought of another, Write Great Code, Volume 1: Understanding the Machine by Randall Hyde.

Irregardless · 2012-12-20 · Original thread
A great book for anyone curious about this topic (going from simple logic gates to more modern processing technology):
showerst · 2012-06-22 · Original thread
If you'd like a great non-technical tour of how computers really work conceptually, starting from simple morse-code switches through to assembler, Charles Petzold's "Code" is awesome:

Even having understood for years how computers work in principle, nothing quite put it together for me like this book.

There's a similarly great book on the history/methods of cryptography called "The Code Book" by Simon Singh that I recommend too - It's great because it traces the history but also walks you through how the cyphers actually worked, and provides the best intros I've ever seen to public key and quantum cryptography.
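For a flavour of the simple substitution ciphers that open that history, the classic Caesar shift can be sketched in a few lines (my own illustration, not code from the book):

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` places in the alphabet, wrapping around.
    Non-letters pass through unchanged; case is preserved."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

msg = caesar("attack at dawn", 3)
print(msg)              # dwwdfn dw gdzq
print(caesar(msg, -3))  # attack at dawn
```

With only 26 possible keys it falls to brute force instantly, which is exactly why the history marches on toward polyalphabetic ciphers and, eventually, public key.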

jasonlotito · 2011-12-06 · Original thread
Recently, I've come to know "Code: The Hidden Language of Computer Hardware and Software"[1], and I really enjoyed it. I don't think it's necessarily the first book every programmer should read, but I do think it's a book every programmer should read. It's an easy read, it's fun, and really does provide what it promises. Highly recommended.


DanBC · 2011-10-22 · Original thread
I love stuff like this. I really hope keen engineering youth are able to get involved with building toy CPUs. Maybe not that complex, but enough to grasp what memory mapping and registers actually are.

Anyone interested could read either this book {the intro is too gentle for too long, then bam, it's too hard for many people},

or the student lab manual for The Art of Electronics. Probably best with AoE, which is showing its age but still excellent.

kenjackson · 2011-08-02 · Original thread
If you want to learn it from the bottom up, read Code.

It's a page-turner, and you'll know more about computers than many developers. You still won't be a programming guru from this, but it's a great holistic approach that you can then supplement.

mindcrime · 2010-12-20 · Original thread
I don't necessarily know of any one book that meets all of your friends requirements, but...

Tracy Kidder's The Soul of a New Machine might be good for your friend.

Another good option might be Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

Or, how about Coders at Work?

Another one that I have (but haven't had time to read yet) is Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software by Scott Rosenberg. It might have something that your friend would find interesting.

Another one that may be inspirational, although it's more about personalities than computer science per se, would be Steven Levy's Hackers: Heroes of the Computer Revolution.

pan69 · 2010-11-24 · Original thread
pan69 · 2010-07-18 · Original thread
While you're at it, you might want to pick up a copy of "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold. It's one of the most compelling and well-written books on the subject I've read. Reading it alongside learning C and assembly language will make it all "click".

hga · 2010-04-27 · Original thread
It's a "general interest non-fiction" book with lots of historical stuff, e.g. coverage of the 8080 and 6500.

I can't find it now, but it reminds me of an MIT Press book for the educated layman that covered electronics, the various generations of semiconductors (and how, at that period, TI was the only company to negotiate all of them; this was written at the dawn of the LSI or VLSI era), the critical details of wafer yield and resultant profitability, etc.

Perhaps not the right book for the original poster, but for many people it could be very useful.

pan69 · 2009-10-24 · Original thread
OK. Depending on your programming background, you're facing a steep learning curve. That's why I recommend a bottom-up approach for you.

First read "Code" by Charles Petzold. This book will get you "in the mood" and in the right frame of mind:

Then I suggest you pick up a good book on Assembler. This might be a good choice:

Start writing some drivers for Linux. Like a memdrive or something. Do it all in Assembler! Oh, you need to read other books on how to do this...

Then pick up the K&R book on C. Now write your memdrive driver in C.

That should get you started. I think it will take you up to two years before you're past the learning curve and comfortable with this level of programming.

Oh, you need to be willing to do it for the love of it because it's highly unlikely that you will make a living using these sort of technologies (nowadays).

Good luck!

PS: I miss the old days...

listic · 2009-10-17 · Original thread
For this kind of thing, I recommend the book

"Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.

Being a CS major, I bought this book out of sheer amazement at how the gist of everything I had learned about computers at university could be conveyed in one book in such an easy and fun manner. The learning curve is so gentle, it just couldn't be easier. I cannot recommend this book enough to _every_ person who really wants to understand how computers work.
