Model building is the art of selecting those aspects of a process that are relevant to the question being asked. As with any art, this selection is guided by taste, elegance, and metaphor; it is a matter of induction, rather than deduction. High science depends on this art.

The book the Ziff folks sent me as an example of their art was 'Late Night VRML 2.0 with Java,' 700 pages + CD-ROM, published February 1997. I was personally acquainted with more movie stars than people who might conceivably have wanted to buy this book or any book like it.

A lot of people say that sometime around 2030, machines will be more powerful than the human brain in terms of the raw number of computations they can do per second. But that seems completely irrelevant. We don't know how the brain is organized, how it does what it does.

My work was fairly theoretical. It was in recursive function theory. And in particular, hierarchies of functions in terms of computational complexity. I got involved in real computers and programming mainly by being - well, I was interested even as I came to graduate school.

As soon as you have good mechanical technology, you can make things like backhoes that can dig holes in the road. But of course a backhoe can knock your head off. But you don't want to not develop a backhoe because it can knock your head off; that would be regarded as silly.

You've got to be very insightful about your brand, who you are, and what you mean to people. You've got to be able to inspire the whole organization behind that vision, so that every touch point the consumer experiences with the brand is reflective of that same brand promise.

The world's urban poor and the illiterate are going to be increasingly disadvantaged and are in danger of being left behind. The web has added a new dimension to the gap between the first world and the developing world. We have to start talking about a human right to connect.

Raise your quality standards as high as you can live with, avoid wasting your time on routine problems, and always try to work as closely as possible at the boundary of your abilities. Do this, because it is the only way of discovering how that boundary should be moved forward.

Imagine you are writing an email. You are in front of the computer. You are operating the computer, clicking a mouse and typing on a keyboard, but the message will be sent to a human over the internet. So you are working before the computer, but with a human behind the computer.

I've done a reasonable amount of travelling, which I enjoyed, but not for too long at a time. I'm a home-body and get fatigued by it fairly soon, but enjoy thinking back on experiences when I've returned and then often wish I'd arranged a longer stay in the somewhat exotic place.

I have always been convinced that the only way to get artificial intelligence to work is to do the computation in a way similar to the human brain. That is the goal I have been pursuing. We are making progress, though we still have lots to learn about how the brain actually works.

Everything that I've learned about computers at MIT I have boiled down into three principles: Unix: You think it won't work, but if you find the right wizard, they can make it work. Macintosh: You think it will work, but it won't. PC/Windows: You think it won't work, and it won't.

My vision when we started Google 15 years ago was that eventually you wouldn't have to have a search query at all. You'd just have information come to you as you needed it. And Google Glass is now, 15 years later, sort of the first form factor that I think can deliver that vision.

It's time to recognise the internet as a basic human right. That means guaranteeing affordable access for all, ensuring internet packets are delivered without commercial or political discrimination, and protecting the privacy and freedom of web users regardless of where they live.

The story of the growth of the World Wide Web can be measured by the number of Web pages that are published and the number of links between pages. The Web's ability to allow people to forge links is why we refer to it as an abstract information space, rather than simply a network.

Linear programming can be viewed as part of a great revolutionary development which has given mankind the ability to state general goals and to lay out a path of detailed decisions to take in order to "best" achieve its goals when faced with practical situations of great complexity.
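
The quotation above describes stating a general goal and letting the method lay out the detailed decisions. A minimal sketch of that idea, in Python with made-up numbers: a tiny two-variable linear program solved by checking the vertices of the feasible region (an LP optimum always lies at a vertex). This is an illustration, not a production solver.

```python
# Toy LP: maximize 3x + 2y  subject to  x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
# We state the goal and constraints; the procedure picks the decisions.
from itertools import combinations

# Each constraint written as (a, b, c), meaning a*x + b*y <= c.
constraints = [(1, 1, 4), (1, 0, 2), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    """Intersection point of the two constraint boundary lines, or None if parallel."""
    (a1, b1, d1), (a2, b2, d2) = c1, c2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None
    x = (d1 * b2 - d2 * b1) / det
    y = (a1 * d2 - a2 * d1) / det
    return x, y

def feasible(p):
    """Check that a point satisfies every constraint (with a small tolerance)."""
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in constraints)

# Candidate vertices: feasible intersections of constraint boundaries.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]

# The "best" decision under the stated goal.
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
print(best, 3 * best[0] + 2 * best[1])
```

Real solvers (the simplex method Dantzig invented, or modern interior-point methods) walk between vertices far more cleverly, but the structure is the same: goal, constraints, best vertex.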

I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.

While problems in a film are fairly easy to identify, the sources of those problems are often extraordinarily difficult to assess. A mystifying plot twist or a less-than-credible change of heart in our main character is often caused by subtle underlying issues elsewhere in the story.

Interestingly, modern science has estimated that the age of the earth is about 4 billion years. Scholars feel it is uncanny that the Vedic Aryans could have conceived of such a vast span of time over 3,500 years ago, one similar to the figure estimated by science today.

There are a couple of people in the world who can really program in C or FORTRAN. They write more code in less time than it takes for other programmers. Most programmers aren't that good. The problem is that those few programmers who crank out code aren't interested in maintaining it.

It is, therefore, possible to extend a partially specified interpretation to a complete interpretation, without loss of verifiability... This fact offers the possibility of automatic verification of programs, the programmer merely tagging entrances and one edge in each innermost loop.

Some compilers allow a check during execution that subscripts do not exceed array dimensions. This is a help, but not sufficient. First, many programmers do not use such compilers because they're not efficient. (Presumably, this means that it is vital to get the wrong answers quickly.)
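
The subscript check described above can be sketched in a few lines of Python (Python itself performs this check on every access; the explicit helper below just makes the compiler-inserted test visible). The function name is invented for illustration.

```python
def checked_get(arr, i):
    # Mirror of a compiler-inserted bounds check: require 0 <= i < len(arr)
    # before touching memory, so an out-of-range subscript traps instead of
    # silently reading whatever happens to sit past the end of the array.
    if not (0 <= i < len(arr)):
        raise IndexError(f"subscript {i} exceeds array dimension {len(arr)}")
    return arr[i]

data = [10, 20, 30]
print(checked_get(data, 1))   # in-range access succeeds

try:
    checked_get(data, 3)      # one past the end: caught at run time,
except IndexError as e:       # not a fast wrong answer
    print("caught:", e)
```

Languages like C famously omit this check by default for speed, which is exactly the trade-off the quotation is mocking.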

LISP has jokingly been described as "the most intelligent way to misuse a computer." I think that description is a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.

It is easy to predict that some of the discoveries of research directed towards Grand Challenges - but only the most unexpected ones, and at the most unexpected times - will be the basis of revolutionary improvements in the way that we exploit the power of our future computing devices.

We will soon create intelligences greater than our own ... When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding.

Computer programming is an art, because it applies accumulated knowledge to the world, because it requires skill and ingenuity, and especially because it produces objects of beauty. A programmer who subconsciously views himself as an artist will enjoy what he does and will do it better.

One of the things we did at PayPal was collaborative filtering and machine learning: looking at patterns of human behavior. We used it there to predict when people would try to cheat the system to get money. But you can predict pretty much any behavior with a certain amount of accuracy.
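
A minimal sketch of the collaborative-filtering idea mentioned above (illustrative only; this is not PayPal's actual system, and the users and behavior matrix are invented): users with similar behavior patterns are given similar predictions for actions that have not yet been observed.

```python
import math

# Hypothetical behavior matrix: rows are users, columns are actions (1 = did it).
behavior = {
    "alice": [1, 0, 1, 1],
    "bob":   [1, 0, 1, 0],
    "carol": [0, 1, 0, 0],
}

def cosine(u, v):
    """Cosine similarity between two behavior vectors (0.0 if either is empty)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def predict(user, action_idx):
    """Predict a user's propensity for an action, weighting every other
    user's observed value by their similarity to this user."""
    num = den = 0.0
    for other, row in behavior.items():
        if other == user:
            continue
        w = cosine(behavior[user], row)
        num += w * row[action_idx]
        den += w
    return num / den if den else 0.0

# bob hasn't done action 3; users who behave like bob have, so the score is high.
print(round(predict("bob", 3), 3))
```

The same machinery predicts fraud, purchases, or clicks, which is the quotation's point: given enough behavioral data, "pretty much any behavior" is predictable to some accuracy.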

The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.

We now think of internal representation as great big vectors, and we do not think of logic as the paradigm for how to get things to work. We just think you can have these great big neural nets that learn, and so, instead of programming, you are just going to get them to learn everything.

It is important to distinguish the difficulty of describing and learning a piece of notation from the difficulty of mastering its implications. [...] Indeed, the very suggestiveness of a notation may make it seem harder to learn because of the many properties it suggests for exploration.

I think the trick with knowledge is to “acquire it, and forget all except the perfume” - because it is noisy and sometimes drowns out one's own “brain voices”. The perfume part is important because it will help find the knowledge again to help get to the destinations the inner urges pick.

Unix has, I think for many years, had a reputation as being difficult to learn and incomplete. Difficult to learn means that the set of shared conventions, and things that are assumed about the way it works, and the basic mechanisms, are just different from what they are in other systems.

Surveying the shifts of interest among computer scientists and the ever-expanding family of those who depend on computers for their work, one cannot help being struck by the power of the computer to bind together, in a genuine community of interest, people whose motivations differ widely.

Any enterprise CEO really ought to be able to ask a question that involves connecting data across the organization, be able to run a company effectively, and especially to be able to respond to unexpected events. Most organizations are missing this ability to connect all the data together.

Overemphasis of efficiency leads to an unfortunate circularity in design: for reasons of efficiency early programming languages reflected the characteristics of the early computers, and each generation of computers reflects the needs of the programming languages of the preceding generation.

A single human brain has about a hundred billion nerve cells... and a computer program that throws light on the mind/brain problem will have to incorporate the deepest insights of biologists, nerve scientists, psychologists, physiologists, linguists, social scientists, and even philosophers.

My best advice came by examples. A supportive environment at home, school, and grad school. Support at the New York Institute of Technology, then George Lucas, Steve Jobs, and Bob Iger. The examples meant that I should support other people, even when things aren't going well. It will pay off.

The establishment of formal standards for proofs about programs... and the proposal that the semantics of a programming language may be defined independently of all processors for that language, by establishing standards of rigor for proofs about programs in the language, appears to be novel.

The world is colors and motion, feelings and thoughts and what does math have to do with it? Not much, if 'math' means being bored in high school, but in truth mathematics is the one universal science. Mathematics is the study of pure pattern and everything in the cosmos is a kind of pattern.

The computer field is intoxicated with change. We have seen galloping growth over a period of four decades and it still does not seem to be slowing down. The field is not mature yet and already it accounts for a significant percentage of the Gross National Product both directly and indirectly.

If you find that you're spending almost all your time on theory, start turning some attention to practical things; it will improve your theories. If you find that you're spending almost all your time on practice, start turning some attention to theoretical things; it will improve your practice.

I now have had my foggy crystal ball for quite a long time. Its predictions are invariably gloomy and usually correct, but I am quite used to that and they won't keep me from giving you a few suggestions, even if it is merely an exercise in futility whose only effect is to make you feel guilty.

My first program taught me a lot about the errors that I was going to be making in the future, and also about how to find errors. That's sort of the story of my life, making errors and trying to recover from them. I try to get things correct. I probably obsess about not making too many mistakes.

We need business leaders who have a respect for technical issues even if they don't have technical backgrounds. In a lot of U.S. industries, including cars and even computers, many managers don't think of technology as a core competency, and this attitude leads them to farm out technical issues.

We've seen a massive attack on the freedom of the web. Governments are realizing the power of this medium to organize people and they are trying to clamp down across the world, not just in places like China and North Korea; we're seeing bills in the United States, in Italy, all across the world.

It's a waste to chase the pipe dream of a magical tiny theory that allows us to make quick and detailed calculations about the future. We can't predict and we can't control. To accept this can be a source of liberation and inner peace. We're part of the unfolding world, surfing the chaotic waves.

Please don't fall into the trap of believing that I am terribly dogmatical about the go to statement. I have the uncomfortable feeling that others are making a religion out of it, as if the conceptual problems of programming could be solved by a single trick, by a simple form of coding discipline!

One term that's used in this industry a lot is this notion of 'feeding the beast.' You've got all of these people whose livelihoods are dependent on it. There are enormous pressures to keep material going into it, and the pressures to feed it are not irrational. They're the basis of your business.

Part of being the successful Pixar is that we will take risks on teams and ideas, and some of them won't work out. We only lose from this if we don't respond to the failures. If we respond, and we think it through and figure out how to move ahead, then we're learning from it. That's what Pixar is.

I get the most gratification in life by feeling like I'm doing something purposeful and meaningful that makes a difference to other people, and that comes with its pros and cons. I've accepted this is the kind of person that I am, and this is the path I've chosen in life - and I'm stickin' with it.