Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.
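To make the aphorism concrete, a small sketch of my own (not Kernighan's example): the XOR swap below is about as clever as C allows, while the version with a temporary is the one you can still follow when something goes wrong.

```c
#include <stdio.h>

int main(void) {
    int a = 3, b = 5;

    /* "Clever": swap without a temporary. Hard to read, and it
       silently zeroes both values if the operands ever alias. */
    a ^= b;
    b ^= a;
    a ^= b;

    /* Plain: one obvious temporary. Nothing to puzzle over later. */
    int t = a;
    a = b;
    b = t;

    printf("a=%d b=%d\n", a, b);
    return 0;
}
```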
Believe the terrain, not the map.
Don't document bad code - rewrite it.
I really enjoyed Princeton as a graduate student.
Each new user of a new system uncovers a new class of bugs.
Controlling complexity is the essence of computer programming.
Mechanical rules are never a substitute for clarity of thought.
If you've done something twice, you are likely to do it again.
Trying to outsmart a compiler defeats much of the purpose of using one.
An effective way to test code is to exercise it at its natural boundaries.
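As a hypothetical illustration of "natural boundaries", this sketch exercises a binary search at an empty array, a single element, the first and last positions, and values that are absent:

```c
#include <assert.h>
#include <stdio.h>

/* Return the index of x in sorted a[0..n-1], or -1 if absent. */
static int search(const int a[], int n, int x) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] < x)      lo = mid + 1;
        else if (a[mid] > x) hi = mid - 1;
        else                 return mid;
    }
    return -1;
}

int main(void) {
    int a[] = { 2, 4, 6, 8 };

    assert(search(a, 0, 2) == -1);   /* empty array          */
    assert(search(a, 1, 2) ==  0);   /* single element       */
    assert(search(a, 4, 2) ==  0);   /* first position       */
    assert(search(a, 4, 8) ==  3);   /* last position        */
    assert(search(a, 4, 5) == -1);   /* absent, inside range */
    assert(search(a, 4, 9) == -1);   /* absent, past the end */

    printf("all boundary tests passed\n");
    return 0;
}
```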
90% of the functionality delivered now is better than 100% delivered never.
If you're as clever as you can be when you write it, how will you ever debug it?
C is a razor-sharp tool, with which one can create an elegant and efficient program or a bloody mess.
I had spent the summer of 1966 working at MIT in the group that was the MIT component of the Multics effort.
The most effective debugging tool is still careful thought, coupled with judiciously placed print statements.
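A minimal sketch of "judiciously placed" print statements, assuming nothing beyond standard C: a debug macro that writes to stderr and records where each message came from, so the prints are easy to find and remove later.

```c
#include <stdio.h>

/* Debug print: goes to stderr and records its source location. */
#define DPRINT(fmt, ...) \
    fprintf(stderr, "%s:%d: " fmt "\n", __FILE__, __LINE__, __VA_ARGS__)

int main(void) {
    int total = 0;
    for (int i = 1; i <= 3; i++) {
        total += i * i;
        DPRINT("i=%d total=%d", i, total);  /* watch the state evolve */
    }
    printf("%d\n", total);
    return 0;
}
```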
Trivia rarely affect efficiency. Are all the machinations worth it, when their primary effect is to make the code less readable?
Do what you think is interesting, do something that you think is fun and worthwhile, because otherwise you won't do it well anyway.
It's important to be informed about issues like usability, reliability, security, privacy, and some of the inherent limitations of computers.
Every language teaches you something, so learning a language is never wasted, especially if it's different in more than just syntactic trivia.
For better or worse, the people who become leaders and decision makers in politics, law and business are going to come from schools like Princeton.
As we said in the preface to the first edition, C "wears well as one's experience with it grows." With a decade more experience, we still feel that way.
Get the weirdnesses into the data where you can manipulate them easily, and the regularity into the code because regular code is a lot easier to work with.
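A small illustration of that idea (a hypothetical example, not from the quote): irregular English plurals go into a table, and the code that consults the table stays uniform.

```c
#include <stdio.h>
#include <string.h>

/* The weirdness lives here, in data... */
static const struct { const char *sing, *plur; } irregular[] = {
    { "child", "children" },
    { "mouse", "mice"     },
    { "sheep", "sheep"    },
};

/* ...so the code stays regular: look up, else apply the default rule. */
static const char *plural(const char *w, char *buf, size_t n) {
    for (size_t i = 0; i < sizeof irregular / sizeof irregular[0]; i++)
        if (strcmp(w, irregular[i].sing) == 0)
            return irregular[i].plur;
    snprintf(buf, n, "%ss", w);
    return buf;
}

int main(void) {
    const char *words[] = { "cat", "child", "mouse", "book" };
    char buf[64];
    for (size_t i = 0; i < sizeof words / sizeof words[0]; i++)
        printf("%s -> %s\n", words[i], plural(words[i], buf, sizeof buf));
    return 0;
}
```

Adding another irregular plural changes one line of data and no code at all, which is the point of the quote.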
No matter what, the way to learn to program is to write code and rewrite it and see it used and rewrite again. Reading other people's code is invaluable as well.
A programming language is specifically for instructing a computer to carry out a particular sequence of steps. It's the very way you tell the machine what you want it to do.
... it is a fundamental principle of testing that you must know in advance the answer each test case is supposed to produce. If you don't, you are not testing; you are experimenting.
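A minimal sketch of the principle, assuming a plain C test harness: every case carries the answer it is supposed to produce, worked out by hand before the program ever runs.

```c
#include <assert.h>
#include <stdio.h>

static int gcd(int a, int b) {
    while (b != 0) { int t = a % b; a = b; b = t; }
    return a;
}

int main(void) {
    /* Expected answers computed by hand, in advance. */
    static const struct { int a, b, want; } cases[] = {
        { 12, 18,  6 },
        {  7,  5,  1 },
        {  0,  9,  9 },
        { 42, 42, 42 },
    };

    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++)
        assert(gcd(cases[i].a, cases[i].b) == cases[i].want);

    printf("all cases match their precomputed answers\n");
    return 0;
}
```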
Bell Labs was an astonishing place for many decades, though it fell on somewhat hard times during the telecom meltdown some years ago, as its corporate owner had to cope with shrinking markets.
No matter how non-technical your life and work, you're going to have to interact with technology and technical people. If you know something about how devices and systems operate, it's a big advantage.
Technology is mostly a force for good, but it has its downsides, too. I want my students - and my readers - to be intelligently skeptical about technology and be informed about the good and the not-so-good parts.
I seem to get totally wrapped up in teaching and working with students during the school year. During the summer, I try to spend time in the real world, writing code for therapy and perhaps for some useful purpose.
Even though most people won't be directly involved with programming, everyone is affected by computers, so an educated person should have a good understanding of how computer hardware, software, and networks operate.
If you don't understand viruses, phishing, and similar threats, you become more susceptible to them. If you don't know how social networks leak information that you thought was private, you're likely to reveal much more than you realize.
Anytime you want to hear about graph partitioning, I will be glad to tell you what I know about graph partitioning. It remains a standard problem. I think it's an interesting problem, because it shows up in a variety of guises in real life.
I want students to understand specific technologies, but the real goal is that they should be able to reason about how systems work and be intelligently skeptical about technology so that, when they're running the world in a few years, they'll do a good job.
Some compilers allow a check during execution that subscripts do not exceed array dimensions. This is a help, but not sufficient. First, many programmers do not use such compilers because "they're not efficient." (Presumably, this means that it is vital to get the wrong answers quickly.)
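Since most C implementations still perform no subscript checking at run time, here is a hedged sketch of doing the check yourself: route every access through one checked accessor, so an out-of-range subscript fails loudly instead of corrupting memory.

```c
#include <stdio.h>
#include <stdlib.h>

#define N 10

static int data[N];

/* Single checked access point for the array. */
static int *at(int i) {
    if (i < 0 || i >= N) {
        fprintf(stderr, "subscript %d out of range [0,%d)\n", i, N);
        exit(EXIT_FAILURE);
    }
    return &data[i];
}

int main(void) {
    for (int i = 0; i < N; i++)
        *at(i) = i * i;
    printf("data[3] = %d\n", *at(3));
    *at(N) = 0;   /* deliberately out of range: aborts with a message */
    return 0;
}
```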
Unix has, I think for many years, had a reputation as being difficult to learn and incomplete. Difficult to learn means that the set of shared conventions, and things that are assumed about the way it works, and the basic mechanisms, are just different from what they are in other systems.
Computers and computing are all around us. Some computing is highly visible, like your laptop. But this is only part of a computing iceberg. A lot more lies hidden below the surface. We don't see and usually don't think about the computers inside appliances, cars, airplanes, cameras, smartphones, GPS navigators and games.
Another effective [debugging] technique is to explain your code to someone else. This will often cause you to explain the bug to yourself. Sometimes it takes no more than a few sentences, followed by an embarrassed "Never mind, I see what's wrong. Sorry to bother you." This works remarkably well; you can even use non-programmers as listeners. One university computer center kept a teddy bear near the help desk. Students with mysterious bugs were required to explain them to the bear before they could speak to a human counselor.