Link: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
It's got a very slow, methodical onramp with a lot of diagrams and a light, breezy style. You end up building a (very simple) computer in the end, including instruction processing.
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
It goes really well with Elements of Computing Systems (2nd ed) [2] which I kind of think of as a "lab manual" where you get to build a computer from first principles.
[1] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
[2] https://www.amazon.com/Elements-Computing-Systems-second-Pri...
https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof...
I feel tempted to buy the current edition, but also to wait for the second.
Code by Petzold (https://www.amazon.com/Code-Language-Computer-Hardware-Softw...) - non-technical (in the sense it isn't something to "work through"), covers a lot of interesting topics. Especially approachable for that age.
Elements of Computing Systems by Nisan & Schocken (https://www.amazon.com/Elements-Computing-Systems-second-Pri...) - more technical (has content to work through). I've read the first edition, not the second. Has a companion site: https://www.nand2tetris.org. It's well-written, and a motivated high schooler could work through it.
The Code Book by Singh (https://www.amazon.com/Code-Book-Science-Secrecy-Cryptograph...)
The Codebreakers by Kahn (https://www.amazon.com/Codebreakers-Comprehensive-History-Co...)
I was always interested in ciphers and such as a kid so those two books got my attention when I found them in high school/college. I'm a bit fuzzy, now, about which one I was more interested in but both were good books. (I still have them, may give them a re-read next month.)
There are a few others I have in mind, but just can't recall the titles at the moment.
In general, I also echo some of the other comments. If you are helping to design the app, you shouldn't necessarily need to understand the implementation details. In my experience, when clients, whether external, internal, or colleagues, get too involved in what they think the implementation should be, it's usually a disaster. It puts pressure on the system to conform to how they think it should work, which is usually not how it should be built, and it adds unnecessary constraints. The real constraints should be what the software should do and the specifications around that, including how the software is intended to be maintained and extended.
Some thoughts on some specific courses and books that I think would be helpful to better understand the goals of software development and design and ways to think about it all:
Programming for Everyone - An Introduction to Visual Programming Languages: https://www.edx.org/course/programming-for-everyone-an-intro...
I think this course should be taken by managers, designers, and even software engineers. The primary result is that you'll come out of it knowing statecharts, which are an extension of state machines, and this will be very useful for thinking about software and organizing what the software should do. Handling state is one of the primary problems in software, and you might notice that the various programming paradigms (OOP, functional, imperative, actors, etc.) largely correspond to different ways people think about handling state in a computing system.
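To make the statechart idea a bit more concrete, here's a minimal finite-state-machine sketch in Python (the "media player" states and events are my own invented example, not material from the course). Statecharts layer nested states, guards, and parallel regions on top of this same state-plus-event core.

```python
# A minimal finite-state machine: state + event -> next state.
TRANSITIONS = {
    ("stopped", "play"):  "playing",
    ("playing", "pause"): "paused",
    ("playing", "stop"):  "stopped",
    ("paused",  "play"):  "playing",
    ("paused",  "stop"):  "stopped",
}

def step(state: str, event: str) -> str:
    """Return the next state, ignoring events that aren't valid in this state."""
    return TRANSITIONS.get((state, event), state)

state = "stopped"
for event in ["play", "pause", "play", "stop"]:
    state = step(state, event)
    print(event, "->", state)   # play -> playing, pause -> paused, ...
```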
How to Code: Simple Data and Complex Data:
https://www.edx.org/course/how-to-code-simple-data
https://www.edx.org/course/how-to-code-complex-data
https://www.edx.org/micromasters/ubcx-software-development
These courses are taught by a designer of the Common Lisp language and are based on the excellent book How to Design Programs. It is essentially a language-agnostic course that uses Racket to build up design paradigms, teaching you how to break your domain problem down into data and the functions that operate on that data. The courses are part of a MicroMasters program, so if you really want to get into Java, that is taught in the follow-on courses.
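For a taste of the design recipe those courses drill (data definition first, then signature, purpose, and examples, then the function body), here's a rough sketch translated into Python. The TrafficLight example and the names are mine, not taken from the book or the course.

```python
# Data definition: a TrafficLight is one of "red", "yellow", "green".
# Signature: next_light : TrafficLight -> TrafficLight
# Purpose: produce the light that follows the given one.
# Examples (written before the body, in design-recipe style):
#   next_light("red")    == "green"
#   next_light("green")  == "yellow"
#   next_light("yellow") == "red"

def next_light(light: str) -> str:
    """Template: one case per variant of the data definition."""
    if light == "red":
        return "green"
    elif light == "green":
        return "yellow"
    elif light == "yellow":
        return "red"
    raise ValueError(f"not a TrafficLight: {light!r}")

assert next_light("red") == "green"
assert next_light("green") == "yellow"
assert next_light("yellow") == "red"
```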
Based upon your last comment, here are some book suggestions on how computers work:
Code: The Hidden Language of Computer Hardware and Software: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
The Pattern On The Stone: The Simple Ideas That Make Computers Work: https://www.amazon.com/Pattern-Stone-Computers-Science-Maste...
But How Do It Know? - The Basic Principles of Computers for Everyone: https://www.amazon.com/But-How-Know-Principles-Computers/dp/...
The Elements of Computing Systems: Building a Modern Computer from First Principles: https://www.amazon.com/Elements-Computing-Systems-Building-P...
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
Which goes from bits on up without shying away from circuit diagrams. It's also really well written and you can read it from start to finish.
It also puts it all in a historical context, which makes it fun to read.
Once you read that, there'll be fewer unknown unknowns.
The humour is subversive, the illustration is lovely, and these ("This is not my hat" is another) are great books for younger children. My child loved it, and the people I've given this to have gone on to buy other books by the writer or illustrator.
"Mr Birdsnest and the House Next Door": https://www.amazon.co.uk/Birdsnest-House-Next-Door-Little/dp...
Little Gems are a set of books printed on reduced contrast paper, with a large clear font. They're short, simple, but fun. They're good for younger readers or for slightly older reluctant readers. My child enjoyed reading this book, and loved the illustration. The other child I gave this to took out other books in the Little Gems series from the library, and bought other Julia Donaldson books with her pocket money.
"Code: The Hidden Language of Computer Hardware and Software" https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof... I had a friend who knew a lot about the software, and knew a lot about hardware but their hardware knowledge was a bit patchy. Code helped solidify their knowledge. If I could have afforded it I would have given them The Art of Electronics and the companion Student Manual. (This was in the 1990s. I haven't read the new version and I don't know how well it works today.)
"Bomber Command" https://www.amazon.co.uk/Bomber-Command-Pan-Military-Classic... I liked this book because it describes how we (the UK) went into world war 2 with ethical notions around not bombing civilian populations and ended up fire-bombing several heavily populated German cities. It's also eye-opening about the scale of this part of the war, and the cost in lives of aircrew.
What actually got me there was the book „Code“ by Charles Petzold[2] which traces the development from early circuitry like light bulbs and telegraph wires to modern digital logic. I found that after being introduced to these concepts, learning about the fundamental physics was much more accessible since it was framed in the context of contemporary application.
1: https://youtu.be/LnzuMJLZRdU
2: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
1. https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
From binary to a full computer
In the book [CODE: The Hidden Language of Computer Hardware and Software](https://www.amazon.com/Code-Language-Computer-Hardware-Softw...), Charles Petzold talks about how it's foundational to the eventual invention of the computer.
Back then, it also meant coast to coast communications were almost instantaneous. And soon after, transatlantic cable-enabled telegraph boosted commerce between America and Europe.
[1] https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof...
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
I've just finished reading Code (https://www.amazon.com/Code-Language-Computer-Hardware-Softw...), which builds up a RAM array and an 8-bit CPU starting from relays. I'm familiar with the concepts. I'm just looking for something that explains how to express these concepts using Verilog.
I learned a lot of these things from taking Harvard's CS50 online, reading the blog posts I linked (and doing problems), and I would imagine this book (https://bigmachine.io/products/the-imposters-handbook/) would help out - a friend recommended it, but I never got around to it.
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
^Covers logic gates and the basics of how a computer is built to an incredible degree, but it's VERY time intensive. Still, I really recommend it.
It starts out with just wires, switches, and relays, and as the book progresses he goes through the process of building up a simple CPU and RAM one step at a time. The book even walks you through coming up with opcodes and assembly language.
Even if you already know this stuff, I found the book was helpful in developing an intuitive feel for how everything works and fits together.
After reading it, you'd probably have a good mental model of how you'd want to approach writing an emulator.
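As a rough illustration of where that leaves you, here's a toy accumulator-machine emulator in Python. The opcodes and instruction set are invented for the sketch, not the ones the book derives.

```python
# A toy accumulator machine, to make the "opcodes -> emulator" idea concrete.
LOAD, ADD, STORE, HALT = 0x10, 0x20, 0x11, 0xFF

def run(program, memory):
    acc, pc = 0, 0
    while True:
        op, arg = program[pc], program[pc + 1]
        pc += 2
        if op == LOAD:
            acc = memory[arg]
        elif op == ADD:
            acc += memory[arg]
        elif op == STORE:
            memory[arg] = acc
        elif op == HALT:
            return memory
        else:
            raise ValueError(f"unknown opcode {op:#x}")

# Compute memory[2] = memory[0] + memory[1]
memory = [7, 35, 0]
program = [LOAD, 0, ADD, 1, STORE, 2, HALT, 0]
print(run(program, memory))  # [7, 35, 42]
```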
The Nand to Tetris courses and their accompanying textbook would probably be helpful here too[1][2].
[0] https://www.amazon.com/Code-Language-Computer-Hardware-Softw... [1] https://www.coursera.org/learn/build-a-computer [2] https://www.coursera.org/learn/nand2tetris2
IMHO this is because a lot of CS courses start at a very high level with very abstract concepts, which might also be because they can't afford to start with the very basics of "how does a computer work" due to the limited time available.
On the other hand, I think CS should always start with a book like this one, which truly does start at the basics:
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
A large part of why beginners fail is that they expect too much of the machine and don't comprehend how it works. I believe that once they see how this complexity is actually the result of a large number of very, very simple operations, they will have a better understanding of what it means to instruct the machine.
Unfortunately the corporations seem determined to put a stop to that sort of pervasive knowledge, if only for the purpose of controlling and monetising their users. They don't want people to know how easy it is to do things like strip DRM or remove artificial limitations from software. [See Stallman's famous story, or the Cory Doctorow articles on the demise of general-purpose computing.]
And thus most of the "learn to code" efforts I've seen have seemed to focus on some ultra-high-level language, removed from reality and often sandboxed, so that while people do learn the very-high-level concepts of how computers work, they remain just as unclear about the functioning of the actual systems they use daily --- or how to use them fully to their advantage. In some ways, they're trying to teach how to write programs without understanding existing ones.
The document said children should be writing programs and understand how computers store information by their final years of primary or intermediate school.
However, this sounds more promising. Especially if they're starting more along the lines of this book: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
I wanna explain a few things.
Let me rephrase what I meant by "minimize the time wasting". You see, there is a lot of great advice available online. You ask something on a subreddit or here and people will share great resources. I love this kind of learning. My concern is that sometimes these resources and this advice are given along the lines of "although it's not completely necessary, it'll still be an experience in itself".
The problem is that this kind of learning sometimes wastes too much time and leaves you confused. People ask so many questions about CompSci every day, and you'll find books ranging from the complete basics of computing, like Code https://www.amazon.com/dp/0735611319 and the Nand2tetris course http://www.nand2tetris.com, to something very sophisticated like AI. I hope you can understand that if a person spends too much time on these kinds of things, given that he's got a job or is a student in a university with a sweet CompSci curriculum (you know what I mean), then it's a problem. Although the above-mentioned resources are exceptional, there are others too which teach the same thing. Can a person read all of them one by one "just to satisfy his curiosity, thinking that it'll help him in the future"?
RE is already an extremely sophisticated and vast field which requires computer mastery. I'm in college and it has made me hate things I loved. I'm an extremely curious guy and can easily spend 10-20 hours in front of a PC. I have ~6 years of experience with Linux. But I'm literally not in a state to read two or three 400-800 page books on a single topic which I don't even know is required for RE. There are some topics which are quite difficult, but if we at least know that a topic IS mandatory for RE, then we can be sure and refer to other resources. If you don't even know what your syllabus is, how can you concentrate on and master it, let alone learn it? RE requires you to study every minute detail of a computer system, but wasting too much time on that horrible digital logic and design is really not worth it.
So my purpose is to make it completely clear what I actually need to know, so that I can focus on it instead of reading each and every topic in complete detail, thinking that if I miss the direction of even a single electron in I/O I won't be able to do efficient reversing. I'm literally fed up with those architecture diagrams with arrows, and with cramming those definitions (ROM, EPROM, EEPROM...) again and again for tests and assignments.
I have a few questions for you:
You mentioned Computer Organization and Design, which I think is authored by Patterson and Hennessy and is used by almost all universities. I'm just curious about its not-so-great Amazon reviews. Also, what's your opinion on Tanenbaum's books, which you mentioned in that Reddit link?
Now let's summarize what I've understood (PLEASE correct me if I'm wrong):
>>>> UNDERSTANDING the system you want to hack
> Learn the most-used fundamental programming languages (the way we TALK with computers):
1. C (also C++ in some cases)
2. Python or Ruby (given their dominance in industry right now thanks to their productive nature; also used in exploit writing)
3. Java or C# (object-oriented programming, which along with the above languages completes our programming fundamentals)
4. Assembly (obviously needed in RE)
It probably goes without saying that we need a good grasp of Data Structures and Algorithms in the above languages (obviously not all of them).
> Understand each and every data flow and HOW a computer system works
Computer Organization and Design and Architecture
(OS fundamentals, memory management, virtual memory, paging, caching, etc.; I think the Linux (macOS too) and Windows internals part comes in here)
You restored my faith in humanity when you said I can skip the hardware and microcode part (please spell out which specific topics; I swear I won't look at them again until I'm done with the required ones).
> Network fundamentals and programming: basics of HTTP, TCP/IP and other protocols... socket programming
>>>> THE HACKING PART
> Learning WHAT loopholes exist in the above process of data read/write: types of attacks (buffer overflows, heap overflows...)
> HOW those loopholes are exploited
> Reverse engineering (learning the tools of the trade: IDA, gdb...), learning and practising reversing. Fuzzing.
> Exploiting the bugs, writing exploits.
Please review and correct. Thanks again.
The Unix Programming Environment was published in 1984. I read it over 20 years later and was astonished at how well it had aged. For a technical book from the 80's, it is amazingly lucid and well-written. It pre-dates modern unix, so things have changed but much that goes unstated in newer books (for brevity) is explicit in UPE. (Plus, the history itself is illuminating.) It gave me a much deeper understanding of how programs actually run over computer hardware. Examples in C are old-school and take a bit of close reading but oh so rewarding. https://www.amazon.com/Unix-Programming-Environment-Prentice...
Mastering Algorithms in C. Another fantastically well-written book that shows (with practical examples) how to implement common algorithms. This is just such a great book! https://www.amazon.com/Mastering-Algorithms-Techniques-Sorti...
Also:
Code (Petzold). This one is truly language-agnostic. Others have mentioned it already. Can't recommend enough if you're iffy on the internals of computers and programming. https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
Write Great Code (Volumes I and II). Randall Hyde's books are fantastic explications of the underlying computer operations. Examples are in assembly or pseudo-code but easy to understand. https://www.amazon.com/Write-Great-Code-Understanding-Machin...
[1] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
I always loved learning and teaching, and a side effect of this is that now I've regained the curiosity I always had about the fundamentals of our industry (I've a CS PhD). So now I'm back reading about the fundamentals of electricity and building 8-bit digital adders with basic AND/OR/XOR logic gates [0].
There are still lots of fundamental things that I want to re-learn, and for 2017 I'm thinking of writing a book about learning programming from exercises (with just enough theoretical concepts), starting from flow-charts and pseudocode, up to some basic algorithms / abstract data structures/types (probably using Python). My idea is that there are lots of students out there who could benefit from learning how to program by solving focused exercises and learning enough about algorithms and structures to feel capable of doing more complex things (i.e., not feel the "impostor" syndrome).
[0] - https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
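To give a flavour of the adder exercise mentioned above, here's a quick Python sketch of a ripple-carry adder built only from AND/OR/XOR on single bits (my own toy rendering, not code from the book):

```python
# A ripple-carry adder: the same construction as the book's relay-based
# 8-bit adder, but expressed with bitwise AND/OR/XOR on single bits.

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add8(x, y):
    """Add two 8-bit numbers bit by bit, least significant bit first."""
    carry, result = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result, carry  # carry is the overflow bit

print(add8(100, 55))   # (155, 0)
print(add8(200, 100))  # (44, 1) -- overflows past 255
```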
This is what I'd consider "bottom up":
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
To get some basic ideas I always recommend the book Code by Charles Petzold: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
It walks you through everything from the transistor to the operating system.
(Apparently I need to add that I work for AWS on every message so yes I work for AWS)
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
link: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
[0]: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
Here are some books I've given as gifts recently:
* The Knowledge: How to Rebuild Civilization in the Aftermath of a Cataclysm, Lewis Dartnell[1]
* The Black Swan, Nassim Taleb[2]
* Siddhartha, Hermann Hesse[3]
* The Happiness Trap, Russ Harris and Steven Hayes[4]
* Code, Charles Petzold[5]
[1] https://www.amazon.com/Knowledge-Rebuild-Civilization-Afterm...
[2] https://www.amazon.com/Black-Swan-Improbable-Robustness-Frag...
[3] https://www.amazon.com/Siddhartha-Hermann-Hesse/dp/161382378...
[4] https://www.amazon.com/Happiness-Trap-Struggling-Start-Livin...
[5] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
I found The New Turing Omnibus[1] to give a really nice overview of a bunch of topics, some chapters were a lot harder to follow than others but I got a lot from it.
Code by Charles Petzold[2] is a book I recommend to anyone who stays still long enough; it's a brilliant explanation of how computers work.
Structure and Interpretation of Computer Programs (SICP)[3] comes up all the time when this kind of question is asked and for good reason; it's definitely my favourite CS/programming book, and it's available for free online[4].
I'm still a long way off having the kind of education someone with a CS degree would have but those are my recommendations. I'd love to hear the views of someone more knowledgable.
[1] https://www.amazon.co.uk/New-Turing-Omnibus-K-Dewdney/dp/080... [2] https://www.amazon.co.uk/Code-Language-Computer-Hardware/dp/... [3] https://www.amazon.co.uk/Structure-Interpretation-Computer-E... [4] https://mitpress.mit.edu/sicp/full-text/book/book.html
Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.
Here's a single, small, very accessible book that takes you all the way from switches to CPUs:
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.
My kids enjoyed this book. Similar topic, but fairly playful in how it's put together, and an extremely gentle introduction that still doesn't shy away from how things really work. It's hard to imagine a reader not coming away with a much better understanding of what computing is all about. It starts at gates and works up to actual (machine) code at the end of the book. Very good diagrams throughout.
Despite being from 2000, I don't think it's become outdated. I'd love it if there was a sequel that covered putting things together with a cheap FPGA.
[0] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
N.B. I did read the book when I was in university studying CS, but I felt like it was a good balance of history and tech information.
This is one of the reasons I barely recommend any intro articles in Lean Notes (http://www.leannotes.com/): almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.
Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)
It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.
The only case I can immediately recall of somebody actually qualified to write an introductory text doing so is Charles Petzold's [Code: The Hidden Language of Computer Hardware and Software][Code] (although I suspect, from the few excerpts of it I've seen, that Brian Kernighan's "D is for Digital" is good, too).
[Code]: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
1: the building of the NAND gate itself, and 2: the building of a flip-flop.
Both these tasks can be easily accomplished with reference to the book 'Code' by Charles Petzold (http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...)
and software such as this http://logic.ly/
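If you want to poke at those two exercises in software first, here's a rough Python model (my own sketch, not code from the book or from logic.ly) of a NAND gate and of an SR latch, the simplest flip-flop, built from two cross-coupled NANDs:

```python
# In real hardware the feedback loop settles electrically; here we just
# iterate a few times until the outputs stop changing.

def nand(a, b):
    return 0 if (a and b) else 1

def sr_latch(s_bar, r_bar, q=0, q_bar=1):
    """Active-low set/reset latch made from two cross-coupled NAND gates."""
    for _ in range(4):  # let the feedback loop settle
        q, q_bar = nand(s_bar, q_bar), nand(r_bar, q)
    return q, q_bar

q, qb = sr_latch(0, 1)         # set
print(q, qb)                   # 1 0
q, qb = sr_latch(1, 1, q, qb)  # hold: remembers the last state
print(q, qb)                   # 1 0
q, qb = sr_latch(1, 0, q, qb)  # reset
print(q, qb)                   # 0 1
```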
It basically starts with a "code" of two friends talking to each other between houses at night via blinking flashlights, and gradually builds up from there to a full, if somewhat barebones, microprocessor, logic gate by logic gate. And it does so in a way that teenage me was able to follow.
http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...
Stealing The Network: How to Own the Box. This is a collection of fictional accounts of "hacking" written by hackers. Real-world techniques are described, though in lightweight detail; the aim of the book is more to give an insight into how an attacker thinks. It's quite an enjoyable read too.
http://www.amazon.co.uk/Stealing-Network-How-Own-Cyber-Ficti...
Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen. This one's a true story.
http://www.amazon.co.uk/Kingpin-Hacker-Billion-Dollar-Cyberc...
Code: The Hidden Language of Computer Hardware and Software By Charles Petzold. I still have to read this one, but I expect it would fit in with what you're after quite well.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
I hope the new effort will use books like Code http://www.amazon.com/Code-Language-Computer-Hardware-Softwa... or something similar that doesn't take a MacBook Air for granted but instead uses down-to-earth first principles that can be shown, built and tested with kids' hands.
For slightly older kids, I'm wishing for HtDP-inspired classes.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Starting from either extreme (pure maths or pure electrical engineering) is quite healthy--starting in the middle, though, does a disservice.
0: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
https://blogs.oracle.com/ksplice/entry/hello_from_a_libc_fre...
http://www.muppetlabs.com/~breadbox/software/tiny/teensy.htm...
http://timelessname.com/elfbin/
A wee bit heavy, but it's comprehensive. It deals with what happens when you run code, how the architecture of the computer works (by and large) including at the logic level:
http://www.amazon.co.uk/Computer-Systems-Programmers-Randal-...
If you want to go lower (and higher), look at Understanding the Linux Kernel for a good understanding of how an OS is put together, with specific examples, i.e. Linux.
Code, by Petzold, deals with logic and computers from the ground up. It starts with relays and builds them up into gates and usable arithmetic blocks.
http://www.amazon.co.uk/Code-Language-Computer-Hardware-Soft...
http://www.amazon.co.uk/Understanding-Linux-Kernel-Daniel-Bo...
The physics is fairly simple, at least from a CRT or LED display perspective. It gets trickier when dealing with interconnecting microprocessors, because a good chunk is vendor-specific.
I think this kind of project is well suited to a guide on how to build a computer from the ground up, starting with logic gates, writing a real time OS and developing a scripting language that will run and compile on it. Then you can skip a lot of largely extraneous stuff and have a solid understanding of how the hardware works.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
[0]: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
[0] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
[1] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
http://www.amazon.com/Algorithmics-Spirit-Computing-David-Ha...
The New Turing Omnibus
http://www.amazon.com/The-New-Turing-Omnibus-Excursions/dp/0...
is also good, as is Code by Charles Petzold.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
AFTER EDIT: While thinking about the first three books I mentioned, I thought of another: Write Great Code, Volume 1: Understanding the Machine by Randall Hyde.
http://www.amazon.com/Write-Great-Code-Understanding-ebook/d...
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Even having understood for years how computers work in principle, nothing quite put it together for me like this book.
There's a similarly great book on the history/methods of cryptography called "The Code Book" by Simon Singh that I recommend too - http://www.amazon.com/The-Code-Book-Science-Cryptography/dp/... It's great because it traces the history but also walks you through how the cyphers actually worked, and provides the best intros I've ever seen to public key and quantum cryptography.
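For a taste of the kind of thing it walks through, here's the classic Caesar shift, one of the earliest ciphers covered, sketched in Python (my own illustration, not the book's code):

```python
# Caesar shift: replace each letter with the one a fixed number of
# positions further down the alphabet; decryption is the negative shift.

def caesar(text, shift):
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

secret = caesar("attack at dawn", 3)
print(secret)                # dwwdfn dw gdzq
print(caesar(secret, -3))    # attack at dawn
```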
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
http://www.amazon.com/Elements-Computing-Systems-Building-Pr...
http://www.amazon.com/Structure-Interpretation-Computer-Prog...
http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...
http://www.amazon.com/Pragmatic-Programmer-Journeyman-Master...
Good luck!
1. http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Anyone interested could read either: (http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...) {the intro is too gentle for too long, then bam, it's too hard for many people.}
(http://www.amazon.com/Art-Electronics-Student-Manual/dp/0521...) the student lab manual for The Art of Electronics. Probably best with AoE, which is showing its age but still excellent.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
It's a page-turner and you'll know more about computers than many developers. You still won't be a programming guru from this, but it's a great holistic approach that you can then supplement.
Tracy Kidder's The Soul of a New Machine might be good for your friend.
http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...
Another good option might be Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Or, how about Coders at Work?
http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...
Another one that I have (but haven't had time to read yet) is Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software by Scott Rosenberg. It might have something that your friend would find interesting.
http://www.amazon.com/Dreaming-Code-Programmers-Transcendent...
Another one that may be inspirational, although it's more about personalities than computer science per se, would be Steven Levy's Hackers: Heroes of the Computer Revolution.
http://www.amazon.com/Hackers-Computer-Revolution-Steven-Lev...
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
I can't find it now, but it reminds me of an MIT Press book for the educated layman that covered electronics, the various generations of semiconductors (and how, at the time, TI was the only company to negotiate all of them; this was written at the dawn of the LSI or VLSI era), the critical details of wafer yield and resultant profitability, etc.
Perhaps not the right book for the original poster, but for many people it could be very useful.
First read "Code" by Charles Petzold. This book will get you "in the mood" and in the right frame of mind: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Then I suggest you pick up a good book on Assembler. This might be a good choice: http://www.amazon.com/Professional-Assembly-Language-Program...
Start writing some drivers for Linux. Like a memdrive or something. Do it all in Assembler! Oh, you need to read other books on how to do this...
Then pick up the K&R book on C. Now write your memdrive driver in C.
That should get you started. I think it will take you up to two years at least before you're past the learning curve and comfortable with this level of programming.
Oh, and you need to be willing to do it for the love of it, because it's highly unlikely that you will make a living using these sorts of technologies (nowadays).
Good luck!
PS: I miss the old days...
"Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Being a CS major, I bought this book out of sheer amazement at how the gist of everything I had learned about computers at university could be conveyed in one book in such an easy and fun manner. The learning curve is so gentle, it just couldn't be easier. I cannot recommend this book enough to _every_ person who really wants to understand how computers work.
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...