

Customer Reviews: 4.6 out of 5 stars (2,782 reviews)
L**R
explanation of computers for the layman
This book is quite possibly the best explanation of how computers work from a bottom-up perspective I have ever come across. Petzold takes us on a journey in short, easy-to-read, and occasionally humorous chapters, starting from two young children passing messages to each other after bedtime with flashlights, all the way up through circuitry, machine language, memory, and so on. I also enjoyed the historical aspect of Petzold's presentation. He gives names, dates, and sometimes faces. Personally, I find it amazing that we humans have come so far in so short a time period with all our technologies. Petzold requires no in-depth knowledge of computers from his readers; he gradually builds up on concepts that are always tied back to everyday occurrences in your life. Let's take the discussion of Morse code as a talking point: sure, you could flash a light bulb once for A, twice for B, three times for C, and so on, but your fingers would fall off after a few words. Instead, let's agree to vary the length of time the light is on, and combine different sequences of light blinks to correspond to different letters, which is much easier on the fingers, because we make commonly used letters easier to send. From there, we get into telegraphs, and oh, by the way, that is essentially the same way computers send data. This is typical of what Petzold does all throughout the book: start with what most people would call a reasonable solution to a problem, expose its flaws, then show the more thought-out solution, and then fill in the gaps to tie it back to modern digital computers. The book is ten years old now, but is still very relevant...which I find astounding, given how fast anything to do with computers changes these days. Petzold barely skims the surface of programming, so if you are looking for a book strictly on coding, search elsewhere (try the more software-oriented explanation of computer workings, Write Great Code, Volume 1: Understanding the Machine).
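The variable-length trick described above can be sketched in a few lines of Python. The letter patterns below are standard International Morse, but the `MORSE` table and `encode` helper are illustrative names of my own, not anything from the book:

```python
# Sketch of the variable-length encoding idea from the review:
# the most common letters (E, T) get the shortest patterns,
# rarer letters get longer ones, which saves flashes overall.
MORSE = {
    "E": ".", "T": "-",          # most frequent letters: one signal each
    "A": ".-", "N": "-.",
    "S": "...", "O": "---",
    "C": "-.-.", "Q": "--.-",    # rarer letters: up to four signals
}

def encode(word):
    """Encode a word as Morse, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in word.upper())

print(encode("seat"))  # -> "... . .- -"
```

The same idea (short codes for frequent symbols) reappears much later in computing as Huffman coding.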
The last few chapters do seem to be a hurried conclusion, but I would attribute this more to the fact that now that we know how computers work, what we can do with them is an infinitely broad category, so only a summary discussion is reasonable. Whether you are a computer scientist or not, I think you will benefit from this gem of a book; if you are not, then enjoy learning how computers really work. If you are an established computer scientist, you've probably seen everything in the book before, but it still warrants your time because of the clear analogies that are used throughout the book. I know I have had enough people ask me "how does (insert something about computers) work?", and I haven't been able to clearly explain the concepts; now I have some good analogies to use, as well as a book to heartily recommend.
M**1
Half "light read", half "I have to do that section again" - Great overview of how computers really work
As many other reviews said, the first part of the book is a brilliant, entertaining, easily understandable & accessible overview of underlying topics that relate to how "codes", electronic signals, alternate number systems & computers came about. The latter sections are considerably more dense, and required going back a few times, tracing the circuit diagrams with my finger, and Googling the finer points of electrical circuitry, how to do math in binary, octal, and hexadecimal, and other assorted topics covered in this wide-ranging work. I had classes a LONG time ago in electrical engineering, as well as a good amount of experience with binary, hex(adecimal), and programming, so given that background I could follow along fairly easily & connect the concepts to my existing knowledge. If you don't have any background at all in EE, Comp Sci, or programming, be prepared to re-read & re-re-read the chapters on logic gates, circuits, and how these bits of hardware physically compute & store basic arithmetic values in order to perform complex tasks. Those sections were the most challenging, but ultimately for me provided the most valuable information, because they helped fill in the missing pieces of the puzzle in my prior knowledge. For anyone who wants to "learn to code" but finds themselves confused by, or not really bothering to understand, concepts like pointers, memory addresses, Boolean logic, or esoteric & ancient magic spells like "XOR" or Assembly Language, this book does an excellent job of explaining in real, physical hardware terms exactly what those mean and how they work. My only critique is that after ALL that fine-grained detail & historical backstory for most of the book, the final chapter crams roughly the last 40 years of computing into a few short pages, covering everything from Graphical User Interfaces & image compression to the internet & (rather outdated) descriptions of web browsers. It felt rushed & tacked on.
I'd really like to see an update, or a companion book, that covers newer topics in such detail as the first half of this book. Overall though, this was a fantastic, educational if at-times-dense read. I had to work at it a bit, but that was the point. If you hate to read, don't want to learn, and are too lazy to work at it - buy it anyway to give him another $15 and then write a review explaining why it's the book/author's fault :P
R**A
Changes pretty dramatically after a few chapters.
Let me revise my previous review with a few brief things to say about this. First, this book has a lot of pictures and diagrams and you'll want to frequently go back to them and take your time. For this reason, it's probably a pretty bad idea to buy the Kindle version. Second, this book starts out as super-light reading (for me anyway) and then starts getting much harder and denser. For that reason I can't give it 5 stars. The preview you get for your Kindle might be misleading for that reason. The pacing, in my opinion, was too slow in the beginning and too fast in the middle. You'll want to take your time with this book. This is very close to a Malcolm Gladwell or Freakonomics style book, but it's not quite. But, on the other hand, this really is a good book if you want to learn about this sort of thing. Unlike Gladwell or Freakonomics, you really are learning stuff. I see no reason why this couldn't be used in a college course, but it won't feel like you're reading a textbook. For what it is, it's extremely accessible. And I don't think there's another book quite like this, certainly not of this quality. It is extremely well written. I did wind up taking a few flashcards, though, since it is harder than other books that follow the template of: "one-syllable-word: the amazing hidden side of superlative everythingness." But it's still done in a style that is very close to that. And you'll actually learn things that are true, and not figure out later, after you read the book, that studies were misrepresented and facts distorted to fit the narrative of the book. So that's why I revised this review. I know a bit more than I did when I first wrote it, and my expectations of what the book is have changed, too. I'm going to buy whatever this guy puts out next. Four stars only because the pacing was a little off and it might not be exactly what you expect it is from the first couple of chapters.
If you want to actually learn about how computers work, there has never been a book this well-written. But you do have to actually want to know how computers work.
A**N
Extremely well-motivated introduction to how computer hardware works
Code is a fantastic casual introduction to computer science. The author takes the interested reader through concepts like number systems, information encoding, electricity, computer hardware, computer architecture, software including operating systems, computer languages from Assembly to high-level languages, and some input/output device analysis. It is a remarkable accomplishment in a relatively short number of pages. The author starts by asking the reader how they would go about communicating with a friend at some distance with only a flashlight. From such a basic problem starting point the author introduces ways in which people have encoded information and how efficiently that information is encoded. The reader learns about Morse code and Braille. Then bits are introduced, and the author shows the reader how the base of a number system is quite arbitrary and then introduces ideas in binary. The reader is introduced to basic Boolean logic and how information can be encoded in binary very naturally. The author then goes on to show how one can build adding machines from basic logic devices and then introduces complement arithmetic to show how subtraction can be done using a binary adding machine. The author then moves on to some basic circuit elements and shows how latches can be made to store bits. From latches the author moves on to flip-flops and discusses how a clock can be used to sequence logic. The author then gets into registers and basic computer architecture. The reader is shown some early chip designs and what functions they included. In the process of learning Von Neumann architecture we learn about memory, the bus, the program counter, and eventually move into operating systems. After aspects of that are introduced, the reader learns about fixed- and floating-point numbers as well as computer languages and compilers.
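The build-up from logic gates to an adding machine that this review summarizes can be sketched in Python. This is my own minimal rendering of the standard construction, not the book's diagrams; the gate and function names are illustrative:

```python
# A one-bit full adder built only from Boolean gates, then chained
# into a ripple-carry adder -- the construction the review describes.
def XOR(a, b): return (a or b) and not (a and b)
def AND(a, b): return a and b
def OR(a, b):  return a or b

def full_adder(a, b, carry_in):
    """Add three bits; return (sum bit, carry-out bit)."""
    s1 = XOR(a, b)
    total = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return total, carry_out

def add(x, y, bits=8):
    """Add two integers by rippling a carry through full adders."""
    carry, result = False, 0
    for i in range(bits):
        s, carry = full_adder(bool(x >> i & 1), bool(y >> i & 1), carry)
        result |= int(s) << i
    return result

print(add(22, 19))  # -> 41
```

Every arithmetic operation in a real processor ultimately reduces to cascades of gates like these.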
Eventually the author introduces how graphical interfaces were developed such that computers became interactive devices for the users rather than pure computational devices. I read this book while taking a course on basic computer architecture, so I had the fortune of having two sources for instruction. In reading this alongside/after a more formal course, I feel like the material of the course is slightly more easily absorbed, as this book brings life to what can sometimes be a terse subject. Definitely think it can be read on its own or as an intro to what can become a very difficult subject very quickly.
J**G
Best Book I've Read This Year
I think that this is the best book that I have read all year. In some sense this is the book that I have been looking for for twenty-five years--the book that will enable me to understand how a computer does what it does. And--given the centrality of computers in our age--it has been a long wait. But now it is over. Charles Petzold (1999), Code: The Hidden Language of Computer Hardware and Software does a much better job than anything else I have ever seen in explaining computers--what they really are, and how they really work. Have you ever wondered just how your computers really work? I mean, really, really work. Not as in "an electrical signal from memory tells the processor the number to be added," but what the electrical signal is, and how it accomplishes the magic of switching on the circuits that add while switching off the other circuits that would do other things with the number. I have. I have wondered this a lot over the past decades. Yet somehow over the past several decades my hunger for an explanation has never been properly met. I have listened to people explain how two switches wired in series are an "AND"--only if both switches are closed will the lightbulb light. I have listened to people explain how IP is a packet-based communications protocol and TCP is a connection-based protocol, yet the connection-based protocol can ride on top of the packet-based protocol. Somehow these explanations did not satisfy. One seemed like answering "how does a car work?" by telling how in the presence of oxygen carbon-hydrogen bonds are broken and carbon dioxide and water are created. The other seemed like answering "how does a car work?" by telling how if you step on the accelerator the car moves forward. Charles Petzold is different. He has hit the sweet spot exactly. Enough detail to satisfy anyone. Yet the detail is quickly built up as he ascends to higher and higher levels of explanation. It remains satisfying, but it also hangs together in a big picture.
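The "two switches wired in series are an AND" explanation the reviewer recalls can be modeled directly. The function names here are my own; the circuit behavior they model is the standard one:

```python
# Two switches wired in series light the bulb only when both are
# closed (AND); wired in parallel, either one suffices (OR).
def bulb_series(switch_a, switch_b):
    return switch_a and switch_b   # current must pass through both

def bulb_parallel(switch_a, switch_b):
    return switch_a or switch_b    # current can take either branch

for a in (False, True):
    for b in (False, True):
        print(f"{a!s:5} {b!s:5}  series={bulb_series(a, b)!s:5}  "
              f"parallel={bulb_parallel(a, b)}")
```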
In fact, my only complaint is that the book isn't long enough. It is mostly a hardware book (unless you want to count Morse Code and the interpretation of flashing light bulbs as "software"). By my count there are twenty chapters on hardware, and five on software. In my view only five chapters on software--one on ASCII, one on operating systems, one on floating-point arithmetic, one on high-level languages, and one on GUIs--is about ten too few. (Moreover, at one key place in his explanation (but only one) he waves his hands. He argues that it is possible to use the operation codes stored in memory to control which circuits in the processor are active. But he doesn't show how it is done.) Charles Petzold's explanatory strategy is to start with the telegraph: with how opening and closing a switch can send an electrical signal down a wire. And he wants to build up, step by step, from that point to end with our modern computers. At the end he hopes that the reader can look back--from the graphical user interface to the high-level language software constructions that generate it, from the high-level language software constructions to the machine-language code that underlies it, from the machine-language code to the electrical signals that load, store, and add bits into the computer's processor and into the computer's memory. But it doesn't stop there. It goes further down into how to construct an accumulator or a memory bank from logic gates. And then it goes down to how to build logic gates--either out of transistors or telegraph relays. And then deeper down, into how the electrons actually move through a transistor or through a relay and a wire. And at the end I could look back and say, yes, I understand how this machine works in a way that I didn't understand it before. Before, I understood electricity and maybe an AND gate, and I understood high-level languages. But the whole vast intermediate realm was fuzzy. Now it is much clearer.
I can go from the loop back to the conditional jump back to the way that what is stored in memory is fed into the processor back to the circuits that set the program counter back to the logic gates, and finally back to the doped silicon that makes up the circuit. So I recommend this book to everyone. It is a true joy to read. And I at least could feel my mind expanding as I read it.
R**P
a wonderful introduction to computers
This book has a special place in my heart. I started my career working for one of the big-box software service companies, and the work I was exposed to was of a pretty boring nature, i.e. repetitive work with no room for any thought; the focus was on doing things mechanically without understanding what the thing is all about. Moreover, my educational background was mechanical/industrial engineering, and hence I entered the field without knowing much about the field of computing. So that was my introduction to the field of computing, and the combination of these factors was disastrous; soon I lost my interest in the field and I was terribly depressed. Later on I moved to other functional areas (product management, project management, product marketing, etc.) thinking that my salvation would be there, since I thought I would be able to get a broader view of the problem we are trying to solve (as opposed to what I was exposed to on the engineering side). After making a few trips along that route I realized my salvation was not going to come from that side either. Along the way I tried to figure it out myself, and after going through a lot of introductory books (with explanations in a piecemeal manner which I found to be unsatisfactory) I somehow stumbled on this book...and this is the book that kindled my interest. I have since then read numerous books on computing that I found to be extremely interesting, but without the spark provided by this book I would still be in the dark. The nice thing about this book is the approach of moving forward in time from the 1850s or so onwards to the end of the 20th century, starting from simpler technologies and progressively moving to complicated ones. The author starts with an explanation of Morse code, the Braille system, telegraphs (and even a primer on electricity), number systems, and Boolean logic before moving on to logic gates, half adders, full adders, doing subtraction, edge-triggered flip-flops and more.
All the components and ideas that form part of the hierarchical abstraction of the machine are explained in minute detail. I remember spending a lot of time thinking up many circuits using the concept of logic gates described here. The chapters on memory, automation and microprocessors are simply brilliant, and the detailed sketches of the varying states of these components make it very easy to comprehend. Towards the end of the book the author seems to increase the pace a bit, and the final chapters, such as those on the operating system and the graphical revolution, do not seem to have the same kind of magic that the middle ones had. I suspect the author was wary of making the book too huge and thereby losing the interest of the reader. This book is targeted at someone who truly wants to have an understanding of computers, i.e. if you believe that software, hardware, processors, networks, high level, low level, etc. etc. cannot be looked upon as isolated silos, then this book is for you.
R**Y
Excellent Book To Understand Computers at the Basic Level
I am a computer programmer by trade and sincerely wish I had this book 7 years back when I started formal education. During my 4+ years in the higher education system, and even after, I and my classmates (later colleagues) were taught how to program computers (in various languages) and many of the higher-level ideas in programming (Data Structures, Algorithms, Program Structure, Etcetera, Etcetera, Etcetera), but we never really learned how the computers worked inside. Even to many trained programmers, or at least me :), these beige boxes can be something of a magical black box which we don't really understand at a fundamental level beyond the point of it processing the instructions we give it in our chosen programming language. In school I received perhaps one single semester course that attempted to teach how these things worked inside, yet that course still skimmed on the inner workings; the teacher instead spent his time on how monitors drew pixels on the screen and how laser printers worked... Looking back on it, I would blame the ignorance of the inner workings of computers that some programmers have on the decline of having to learn Assembly language (starting in the early nineties?), the lowest-level programming language sans actual Machine Code, where one would be forced to deal with the raw inner workings of a computer naturally. I myself hope to learn it one day after reading this book :D Instead, I was taught the C programming language, and what we learned in school became only more abstract in regards to the actual hardware... This is where this wonderful book came into play. Since I received it half a year ago, it must have been read/devoured by me a dozen times or more - it goes from teaching the make-up of various codes (Morse, Braille, etcetera) to showing how some simple-to-understand concepts can be combined until a working computer, calculator, etcetera, can be built...
it gives one a great foundation for learning what Computer Science is all about, or gives a newer-generation programmer, like me, much-needed knowledge on how that beige box basically works, on a hardware level! The best thing is that those computer analogies can finally be thrown out the window - we've all heard them before - like how "RAM is like a table, or workspace. The bigger it is, the more things you can have ready and available at one time. The hard drive is like your drawers and cabinets. You can store more stuff there, but to use it, you have to take it out first and put it either on the table (RAM) or hold it in your hand (cache)." Petzold also uses analogies when he introduces topics but quickly moves beyond them, giving his audiences real understanding of the subject - which is very welcome, since analogies tend to explain function well but break down quickly when one is determined to learn more about a topic. It is probably one of the few computer books on my shelf that can't get outdated, and that's good, because it still will be there in 20 years.
T**D
fantastic read
I've been programming for about 10 years at this point. I started as a lowly Web Designer fresh out of High School, and through a series of jobs, opportunities, necessity, and general interest in computers I've become a full-time developer working in a multitude of languages for the past 8 years. Initially I had worked in higher-level languages; as I learned more and more about how they worked I drilled down through the various aspects of computers (reverse engineering, understanding assembly, etc.), but I didn't really have a firm grasp on hardware. My learning preference is to start at the basics and work my way up, adding layers of knowledge on layers of previous knowledge; it helps me visualize and understand abstractions more easily. I'd started reading some Electrical Engineering books - basic circuits, understanding Voltage, Current, Resistance, Ohms, Capacitors, Circuit Diagrams, etc. - but still felt a little fuzzy in a lot of regards about how they'd come together. I'd purchased and played with Arduinos and breadboards, built crude circuits, and used microcontrollers, and still felt fuzzy about how it all "really" came together, all the while this book sat on my Kindle (used to have a Kindle format, not sure why it's gone now...) unread as a "some day" goal. Recently, sick of reading EE books and CS-related stuff, I was looking for a more abstract "lighter" read during my lazy day in the park and decided to read the first few pages of CODE... I wish I had gotten around to it months ago! This book is fantastic; it's entertaining, yes, but more importantly it's very clear, concise, and really appeals to my aforementioned learning style. You literally start with the basics in each regard and work your way up through each chapter. This isn't a skim or reference book; it's a cover-to-cover read, and you'll be a better engineer/programmer/technophile for it.
I'd recommend everyone even remotely interested in computers and how they work give this book a read, and especially recommend anyone venturing into the CS field to give it a good once over. I'd give it 10 stars if I could.
Z**I
A book for learning the electrical workings of a computer
Ever since elementary school I had heard that computers run on electricity, that they calculate with electricity, and that binary numbers are represented by electricity being on or off, but I never knew the concrete mechanism. For example, a NOT gate outputs a "1" (current flows out) when its input is "0" (no current flows in), but where does that outgoing current spring from? Conversely, when the input is "1" the output is "0", so where did the incoming current disappear to? You cannot answer such questions just by staring at truth tables and logic-circuit symbols. This book, Charles Petzold's "Code", explains how to build logic gates, memory, a CPU, and so on (as models considerably larger than the real things) starting from the kind of electric circuits taught in elementary school, and makes the inside of the computer, previously a black box, visible. For the NOT gate example, you come to see that the wires connected to it actually include more than just the input and output lines, and that no magic is involved. Most of my other questions were also resolved by this book. In particular, it briefly touches on how peripherals (displays and printers), which feel far removed from a computer's "calculation", are controlled, and I thought, "whoever came up with this is amazing!" Of course, the author who explains it all so clearly is amazing too. So if you already understand the rough structure of a computer and want to know concretely how it is actually realized, I would love for you to read this book! It is in English, but written in plain language, so if you are familiar with computer terminology you should be able to read it smoothly. Check with the free sample. (The Preface alone may be a little difficult, so I recommend judging by the main text.)
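The NOT-gate question this review raises (where does the output current come from when no current flows in?) has a concrete answer: a hidden connection to the power supply. A toy relay model, with names of my own invention, makes the "extra wire" explicit:

```python
# A relay-based NOT gate (inverter): the output is connected to the
# power supply through a normally-closed relay contact.  The output
# current comes from the supply rail, not from the input wire.
SUPPLY = True  # the always-on power rail -- the "extra wire" in question

def not_gate(input_signal):
    # When the input energizes the relay coil, the contact opens and
    # the output is cut off from the supply; when the input is off,
    # the closed contact passes supply power to the output.
    relay_contact_closed = not input_signal
    return SUPPLY and relay_contact_closed

print(not_gate(False))  # -> True  (output fed by the supply)
print(not_gate(True))   # -> False (contact open, supply disconnected)
```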
M**M
Very, very good: especially if you are a high-level programmer and are interested in the fundamentals
Simply the best introduction to fundamental computing from an electronics perspective for those without a formal Comp. Sci. degree there is, in my opinion. As a Perl programmer - and one could say this of any high-level programming language - I am abstracted from the hardware, so have no real idea of what goes on 'in the engine compartment'. True, a lot of the information is now historic and is utterly unnecessary to know in these days of virtualised cloud computing on demand with pay-per-use billing...but for those interested it is an insight into that now passing (passed?) era of 'the Before Time'. This book has a very smooth, shallow learning curve - more a 'learning line' - and goes from a simple on-off telegraph relay used as a transmission device all the way through how n-bit adders and '1s complement' are used to interact with memory blocks to do subtraction, through to their implications and use in assembly language with the registers of modern processors. The section on the coding of language (Braille is used as an example) is also enlightening. From the discussions I've had with others who have done formal Computer Science degrees (I haven't - yet), this book covers a sizable chunk of the fundamental computing topics. After reading this book I will never look at a division operation in one of my programs in the same way again!
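The complement trick this review mentions, doing subtraction on nothing but an n-bit adder, can be sketched in a few lines. This is the standard construction, shown here for 8 bits with function names of my own choosing:

```python
# Subtraction on an 8-bit adder via complements: invert every bit of
# y (ones' complement), add 1 (two's complement), then add to x and
# discard the carry out of the top bit.
BITS = 8
MASK = (1 << BITS) - 1  # 0xFF

def subtract(x, y):
    ones_complement = y ^ MASK                # flip every bit of y
    twos_complement = (ones_complement + 1) & MASK
    return (x + twos_complement) & MASK       # plain adder + dropped carry

print(subtract(100, 42))  # -> 58
print(subtract(5, 9))     # -> 252, i.e. -4 in 8-bit two's complement
```

This is why hardware needs no separate subtraction circuit: the same adder, fed complemented inputs, does both jobs.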
L**S
If you know English, one of the best books for understanding the foundational concepts of computing.
It's a book that introduces you to many computing concepts in a very clear and playful way. It isn't a heavy book that you need to read while taking lots of notes and researching lots of terms. It really is an introductory book; if you are starting out, or already work in computing and want to learn these basics, this is probably one of the best books. Even at UFRJ the classes are not as clear and as in-depth as this book, which by its 12th chapter is already explaining an ADDER using logic circuits - something I never saw while studying computer science at UFRJ, even in courses that used logic circuits.
D**G
It starts off very basic; not sure how it ends.
Only a few chapters in, but I'm enjoying it so far. The first few chapters I've been skimming quite a bit because the concepts are very simple and some of the paragraphs are somewhat repetitive. It does give you a good understanding of how the simple concepts build up to form complex systems. But that's only what I've read so far.
C**N
A very good book
It is one of the best computing books I have seen. It starts from the most basic thing there is in computing, electricity, and progresses to the most complicated. It is in English, but it reads very well with a little knowledge of the language... a shame that it is not available in Spanish, because the book, with its extremely high quality, deserves it.