I was able to determine that the book is now open sourced; there is a PDF available online. If it's between reading the book and reading my post, read the book.
One of the best choices I ever made as a software developer was to read "Inside the Machine", a book that's almost entirely about hardware. As someone who is allergic to "textbooks", this book is an oasis in a sea of poorly worded, misguided attempts at explaining hardware to software developers.
Inside the Machine covers nearly the entire spectrum of the software/hardware relationship. The book starts by introducing you to the most primitive constructs of microprocessors, such as the ALU. By the end, you'll be understanding incredibly complex microprocessor designs and architectures.
Fun Fact: Inside the Machine is written by Jon Stokes, founder of Ars Technica and briefly editor of Wired!
Why This Book?
This book cuts the bullshit. The information is presented with the minimum amount of fluff needed to make it digestible. It's also incredibly consistent, using a few base analogies to explain almost every concept. Inside the Machine does what every great instructional book should do: cater to the reader. The intro of the book explains it best:
Inside the Machine is an introduction to computers that is intended to fill the gap that exists between classic but more challenging introductions to computer architecture, like John L. Hennessy’s and David A. Patterson’s popular textbooks, and the growing mass of works that are simply too basic for motivated non-specialist readers. Readers with some experience using computers and with even the most minimal scripting or programming experience should finish Inside the Machine with a thorough and advanced understanding of the high-level organization of modern computers. Should they so choose, such readers would then be well equipped to tackle more advanced works like the aforementioned classics, either on their own or as part of a formal curriculum.
The book’s comparative approach, described below, introduces new design features by comparing them with earlier features intended to solve the same problem(s). Thus, beginning and intermediate readers are encouraged to read the chapters in order, because each chapter assumes a familiarity with the concepts and processor designs introduced in the chapters prior to it.
TL;DR: If you're not a great programmer I recommend this book even more. I read this book very early on in my dev career (when I was bad) and I believe it heavily shaped my future growth.
It Gives You Respect For What's Happening Under the Hood
One of the greatest things about this book is that by the end, you'll really understand the "why" of modern programming. Even for very experienced developers, how the underlying computer works is essentially magic. Extending that analogy, this book is goddamn Hogwarts. You might go in not being able to cast a single spell, but by the end you'll be a full-fledged witch or wizard. For some, it might seem strange that I'm suggesting learning about hardware will help you write software. But you have to remember: at the end of the day, software is just a hardware abstraction.
The book is beautiful. As a bit of a perfectionist, I really appreciate the effort that went not only into the illustrations, but into the overall style and design of the book. The illustrations provide real, meaningful value, instead of the often-contrived "figures" found in normal textbooks. For example, here is one of the illustrations the author uses when teaching the reader about caching.
Another great example used to explain pipelining.
The amount of self-awareness woven into the book is one of my favorite aspects. The author, predicting that there might be some confusion about the previous picture, actually provides a very human rationalization of his choice.
It's Riveting At Points
I won't lie and say the entire book is a "thrill ride", but the sections that cover the history of processors are incredibly thrilling. Reading about PowerPC and x86 politics is surprisingly intriguing. To quantify it: I would watch the HBO series.
Whether you're just getting into software development or are already an experienced developer, this book has something for you. The information is presented so joyfully that you'll be dying (instead of dreading) to read the next chapter.
Update: I wanted to add a comment I received on Reddit from /u/DingBat99999, because it contains some amazing insight I wish I had written myself.
The suggested book looks great. But it illustrated an issue that I hadn't considered before now.
I've been working in software for about 40 years now. When I started, assembly language was unavoidable. There just wasn't as much between you and the bare metal in those days. As a result, everyone kind of got an idea of how a computer worked at the low levels naturally, even without hardware-focused courses.
It's been a long time since I went to university so I'm not sure what they teach these days, but given the kind of new grads I see, at least some of them are graduating with no exposure to hardware at all. And that's perfectly fine. You can do quite well in the industry without it. But, as the OP's book demonstrates, it can be interesting and fun to learn about it anyway.
To wrap this up: for me, there have been a couple of books that have affected my view of creating software. They are "Clean Code" by Bob Martin and "Working Effectively with Legacy Code" by Michael Feathers.
The author has even started porting the book to GitHub!
Top comments
You make this book sound really informative in a good way.
Could you give one example of how understanding the inside of a computer has enhanced your software skills/thinking?
Look forward to reading this.
When you know how computers work internally, you start appreciating and giving more importance to code optimisation and resource management. You understand why cyclomatic complexity exists, and why a computer cannot be sure whether a program is stuck (the halting problem): you can't ever solve that, you can only take a guess that it may be stuck. Just to name a few.
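To make the "you can only take a guess" point concrete, here's a toy sketch (not from the book; `probably_stuck` is a hypothetical helper name) of what real systems actually do in place of solving the halting problem: run the work with a watchdog timeout and treat "still running" as a guess, never a proof.

```python
import threading
import time

def probably_stuck(fn, timeout=0.5):
    """Heuristic 'halting check': since halting is undecidable in general,
    we just run fn with a watchdog and guess after a timeout."""
    done = threading.Event()

    def runner():
        fn()
        done.set()

    t = threading.Thread(target=runner, daemon=True)
    t.start()
    t.join(timeout)
    # True means "maybe stuck" -- it's only a guess; fn might finish later.
    return not done.is_set()

print(probably_stuck(lambda: None))           # halts immediately -> False
print(probably_stuck(lambda: time.sleep(5)))  # still running -> True (our guess)
```

This is exactly the shape of every watchdog, health check, and request timeout in production: a bounded guess, because a definite answer is provably impossible.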
You're absolutely right, it's a very hard phenomenon to describe to outsiders. Reading books like this one really gives you an underlying "love" for how things work.
This is cool. We just ordered it.
BUT! Does that make it The Best Book to Read as a (WEB) Developer??? "Software" is a huge field... but it seems like most of the talk around here is about 'web' stuff.
No matter how well it explains hardware, it seems like there are at least 20 more important books that a dev should read, for the sake of our future.
It just came in the mail! Excited to check it out.
Just like @stephangalea said, you start to give more thought to your code, optimisation-wise. I studied Electronic Engineering, so I know how the lowest levels (transistors and logic circuits) work. But when you know the whole stack, how things work from the basic elements (transistors) up to your code and all the intermediary levels, you have a more complete understanding of what's possible or not, and can go from there to inventing your own solutions.
That's a unique point that I haven't seen anyone else raise. I definitely agree that having a more end-to-end understanding helps you avoid "reinventing the wheel". Thanks for that perspective!
It's the same as any first principles argument: you build intuition.
ex: Studying algorithms allows you to write code that makes appropriate reuse of existing work and by extension arrive at the answer faster while writing more consistent, predictable, manageable code (re: classes of problems).
Similarly, studying computer architecture and processor design allows you to develop intuition about the behavior of processes implemented by the hardware you're using, how they work as abstract resources, what data you're supplying to them as a programmer and by extension allows you to "think" in hardware, which leads to more contextually appropriate code both in general and on a target-specific basis.
This is a great answer (better than mine). Intuition is one of the most fundamental aspects of programming but also the most neglected.
I would say that outside of the abstract value you've communicated, there can also be direct value from the book, depending on your context. If you ever plan on writing C/C++ code professionally, this book will be immensely valuable.
Thanks for leaving such a great insight!
Check out this talk from Scott Meyers on CPU caches. By knowing how the computer uses and accesses the cache, you can write code that takes advantage of it and refactor existing code to be more efficient! For me it was rare to write such code in practice, but it's nice to know.
Wow... Thanks, Denis.
Scott Meyers is such a great lecturer.
I didn't fully understand most of the concepts, but his easy-to-follow flow kept me engaged.
This video gave me a better 'picture' of how one might take advantage of their understanding of the computer hardware in some software cases.
Thanks again, Denis.
What a great resource!
Sorry for the late reply!
It's a hard question because a lot of what I gained from the book isn't a direct "skill" but instead a deep understanding.
There are obvious places where direct information from the book helped me, such as designing my game engine. What the book taught me about CPU caching inspired me to learn more about how the cache works. That eventually led to a partial rewrite of my engine, as I had learned about the entity-component pattern because of its cache-friendliness. I could list countless situations like these, from other work projects too (such as implementing high-performance AI algorithms).
Outside of high-performance computing, this book taught me a lot about basic programming patterns and the intricacies of branch prediction. Those are both incredibly relevant concepts, no matter what language you're working in.
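For readers wondering why the entity-component pattern is cache-friendly, the core idea is a data layout change: array-of-structs vs. struct-of-arrays. A minimal sketch (my own illustration, with hypothetical names; in Python the memory effect is muted since list elements are boxed, but in C/C++ the SoA arrays are contiguous primitives):

```python
# Array-of-structs: each entity is one object, so a system that only
# needs positions still drags every entity's other fields into cache.
class Entity:
    def __init__(self, x, y):
        self.x, self.y = x, y

aos = [Entity(i, 2 * i) for i in range(4)]

# Struct-of-arrays (the layout behind entity-component systems):
# each component lives in its own contiguous array, so a system that
# only touches positions streams through memory linearly.
soa = {
    "x": [i for i in range(4)],         # position components
    "y": [2 * i for i in range(4)],
}

def move_all(soa, dx):
    """A 'system' that updates one component array in a tight loop."""
    soa["x"] = [x + dx for x in soa["x"]]

move_all(soa, 10)
print(soa["x"])  # [10, 11, 12, 13]
```

The same per-entity update on `aos` would hop between scattered objects; the SoA version is the shape that makes the cache (and the prefetcher) happy.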
"software is just a hardware abstraction" 🤯
And with the popularity of FPGAs rising, things become even more inception-like.
Another great read to understand the machine is CODE by Charles Petzold
Yes. This is exactly what I was going to say. Great book, probably quite similar, it continually sucks you in like a novel and you don't even realize you're deep in the weeds of some hardware abstraction...
I literally Ctrl-F'ed "Petzold"; it's exactly what I would suggest. It covered a lot of what I learned in college, but in a much more user-friendly, readable way. I wish I'd read it before I entered the university's doors, to be honest.
Haha, I could have written the same comment. I now suggest that people interested in CS, and juniors, read CODE; they'll get the strong foundations everyone should have, as early as possible.
CODE is a wonderful book, thanks for adding that to the discussion.
Is that free PDF copy legitimate? I know the author is going to open-source the 2nd edition, but the PDF copy of the 1st seems to be on an unrelated site, and the copyright page doesn't say anything about it being released as open source.
I've posted a comment on the author's site, and I'll update here when/if I hear back.
I'm pretty sure the author is OK with it.
Not to be a buzzkill, but an author being excited about an article still isn't consent to give out their work for free. If the book is as influential and useful as you say, kicking some money (I'm seeing it at $40 CAD) towards it is a good habit.
Support the people who do good work 🙂
I've purchased at least 10 copies of this book (definitely not at $40 each, not sure where that's quoted). The author is fully aware of the content in my article.
Thanks for fighting for the content creator!
I noticed the book was published in 2006. I can't help but think about progress in hardware since then. But I don't know enough about hardware to know if that progress impacts the overall utility of the book. I suppose based on the positive review here and elsewhere, probably not much.
Any thoughts on this consideration?
It's a funny situation: in some ways we've progressed a lot since 2006, but in terms of the underlying mechanisms, much is unchanged.
I'm not saying hardware isn't improving, btw, just not as fundamentally as it was in the late '90s and early 2000s. Branch prediction is much better, and they keep adding a ton of extra instructions to the chips. Part of what "Inside the Machine" covers is the "4 GHz race", where Intel and AMD duked it out trying to be the first to ship a 4 GHz clock. What they ended up finding out (this is the early 2000s) is that clocks over 4 GHz tend to melt. This discovery changed the landscape quite a bit: instead of just adding more transistors and cranking up the clock, they had to find new areas to improve.
That event, in combination with a cultural shift towards mobile technologies, has had a very noticeable effect on microprocessor development. Intel now focuses more on saving power and improving task parallelism, as opposed to increasing clock speeds and raw processor power.
I would say the bigger changes to hardware are happening outside of traditional processors. FPGAs and ASICs are a real force these days, and Nvidia is obviously killing Intel in terms of recent GPU work. Overall, I think the book is still 95% as useful as it was when I read it.
Really appreciate your insights! I just wanted to make sure I wouldn't invest too much time learning stuff that is, or would soon be, outdated. Thanks for both write-ups!
I highly disagree with this headline. I've been building apps for ~10 years and not once have I needed to know what a CPU L2 cache is. Put your time towards real-world encounters, yourprimer.com/
Hi Robert, there's a difference between saying that a book won't help you, and reading it and then saying it didn't. Sometimes things we read change our understanding in ways we don't expect, and that sounds more like what Ryland is saying you can expect from reading this book. I've been developing for almost 20 years, and while I've never read this book, I have read other books that each contribute to my general understanding. I've also heard many people express the sentiment that all developers should learn to code in C so that they gain a proper understanding of memory management. While I have never learned C, I can appreciate that a better understanding of memory management would help my development in Python. My point is really just that the book deserves consideration.
I have been dabbling in Rust recently, which has certainly introduced me to different paradigms; I've even started writing my JS in a more organized way.
Thank you for your input friend, appreciate it! 🤗
My pleasure. A fiction book that I'd recommend is Bicentennial Man by Isaac Asimov. I won't give any spoilers but one thing it highlights is that we become resistant to change as we get older and while I'm pretty open to new things (at least, that's how I think of myself compared to my parents and other non-tech people my age) I nevertheless find myself acting more like the old person not trusting this "new fangled" technology than I ever thought I would. Internet of Things... meh... I'm concerned with the dangers of letting a dev (and his bug) ruin my toaster! AI... Terminator should end all discussion of that. Big Data - big brother is watching. So I find myself having to overcome a natural suspicion as I get older but I'm also not sure whether it's just because I've been in the industry longer so I know how things can go wrong or just my 40 years of age showing. I think a certain amount of time in dev shows you fads coming and going and you tend to want to sit out the first round while the fanboys and girls prove whether it will last. Right now, I'm happy that containerisation is going to stay and learning Docker is my new mountain to climb :-) Good luck with your own journey in learning Robert!
Which HBO series?
Sorry! That was meant to be a joke. There's not really an HBO series.
I had to think about it for a minute too before I realized it was a joke. :)
Oh no, I wish we had one though lol
Me too. Try "Halt and Catch Fire", not about microprocessors exactly, as close as you'll get.
Great post. Articles that are meaningful are more enjoyable, at least to me. It's interesting to read.
Thanks for the kind words.
Randomly landed on this page from a DEV summary in the email which I never look at. And look who I find here telling people to learn more about hardware? :)
Hope you guys are doing well and making a killing.
Thank you so much for sharing.
I'm really looking forward to reading this book as a self-taught 17 year old full-stack engineer.
To be honest, throughout my journey I've felt many, many times that understanding what's going on at the low level helps you a WHOLE LOT with higher-level stuff like JS or Python.
Thank you SO MUCH for this book and this article.
First off, you're very welcome.
This is the best time to start learning this stuff. If you master it now, you'll have that knowledge to compound (like interest) for way more years than most do. That gives you an advantage, trust me.
Glad you liked the post!
I didn't take a CS or Software Engineering degree; it was physics all the way through, where coding was essential and incidental at the same time. I did take a master's in HPC, where they taught us a LOT about processors, including some of the concepts above (especially pipelining and caching). When I eventually stumbled into a career as a programmer, these concepts were invaluable.
Will look forward to reading this. Thanks for the post.
I think that this book is a lot more "practical" as opposed to traditional books used in the CS curriculum. I know that I learned a lot of "related" stuff in college, but none of it ever seemed relevant like the information in this book.
Thanks for the great comment!
I'm a programmer who is self-taught through work experience, and I haven't done much reading of books beyond project tutorials.
Even when I know my code works logically, I'm blind to what's happening inside the computer.
I'll add this book to my reading list! Thank you for this great post!
I'm also self-taught. This sounds like the perfect book for you!
I'm a computer engineering student. These are all topics I've studied, just covered more deeply here. It makes me think back to the old days. Thanks!
Np! Glad you liked it.
Definitely! That's what helped me understand how computers work, and it's useful when coding, independent of any programming language.
I think you'll love "Computer Systems: A Programmer's Perspective" for the same reasons!
Bookmarked, thanks for the rec.