Having an intelligent secretary does not get rid of the need to read, write, and draw, etc. In a well-functioning world, tools and agents are complementary.

If the pros at Sun had had a chance to fix Java, the world would be a much more pleasant place. This is not secret knowledge. It's just secret to this pop culture.

Science requires a society because even people who are trying to be good thinkers love their own thoughts and theories - much of the debugging has to be done by others.

Some people worry that artificial intelligence will make us feel inferior, but then, anybody in his right mind should have an inferiority complex every time he looks at a flower.

Sun Microsystems had the right people to make Java into a first-class language, and I believe it was the Sun marketing people who rushed the thing out before it should have gotten out.

Because people don't understand what computing is about, they think they have it in the iPhone, and that illusion is as bad as the illusion that 'Guitar Hero' is the same as a real guitar.

Quite a few people have to believe something is normal before it becomes normal - a sort of 'voting' situation. But once the threshold is reached, then everyone demands to do whatever it is.

Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.

It's hard to change information in books, but if we have everything online, then a somewhat untrustworthy group of people controlling the thing - which I think is what we have - gives us '1984.'

Computer science inverts the normal. In normal science, you're given a world, and your job is to find out the rules. In computer science, you give the computer the rules, and it creates the world.

The future is not laid out on a track. It is something that we can decide, and to the extent that we do not violate any known laws of the universe, we can probably make it work the way that we want to.

There is the desire of a consumer society to have no learning curves. This tends to result in very dumbed-down products that are easy to get started on, but are generally worthless and/or debilitating.

I had the fortune or misfortune to learn how to read fluently starting at the age of three. So I had read maybe 150 books by the time I hit 1st grade. And I already knew that the teachers were lying to me.

When I first got to Apple, which was in '84, the Mac was already out, and 'Newsweek' contacted me and asked me what I thought of the Mac. I said, 'Well, the Mac is the first personal computer good enough to be criticized.'

The flip side of the coin was that even good programmers and language designers tended to do terrible extensions when they were in the heat of programming, because design is something that is best done slowly and carefully.

The result is - document destruction - we're really not going to be able to prove beyond a truth the negatives and some of the positive conclusions that we're going to come to. There will be always unresolved ambiguity here.

Understanding--like civilization, happiness, music, science and a host of other great endeavors--is not a state of being, but a manner of traveling. This great road has no final destination. The journey itself is the reward.

I've been a Fellow in a number of companies: Xerox, Apple, Disney, HP. There are certain similarities because all the Fellows programs were derived from IBM's, which itself was derived from the MIT 'Institute Professor' program.

In our society we have hard nerds and soft nerds. The hard nerds are the ones who used to have the slide rules at their belt; now they have calculators. The soft nerds are the ones who get violently ill whenever anybody mentions an integral sign.

All the companies I've worked for have this deep problem of devolving to something like the hunting and gathering cultures of 100,000 years ago. If businesses could find a way to invent 'agriculture,' we could put the world back together and all would prosper.

In the commercial world, you have this problem that the amount of research you can do in a company is based on how well your current business is going, whereas there actually should be an inverse relationship: when things are going worse, you should do more research.

[Computing] is just a fabulous place for that, because it's a place where you don't have to be a Ph.D. or anything else. It's a place where you can still be an artisan. People are willing to pay you if you're any good at all, and you have plenty of time for screwing around.

The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.

I think the trick with knowledge is to 'acquire it, and forget all except the perfume' - because it is noisy and sometimes drowns out one's own 'brain voices'. The perfume part is important because it will help find the knowledge again to help get to the destinations the inner urges pick.

Social thinking requires very exacting thresholds to be powerful. For example, we've had social thinking for 200,000 years, and hardly anything happened that could be considered progress over most of that time. This is because what is most pervasive about social thinking is 'how to get along and mutually cope.'

When the Mac first came out, Newsweek asked me what I [thought] of it. I said: Well, it's the first personal computer worth criticizing. So at the end of the presentation, Steve came up to me and said: Is the iPhone worth criticizing? And I said: Make the screen five inches by eight inches, and you'll rule the world.

I fear - as far as I can tell - that most undergraduate degrees in computer science these days are basically Java vocational training. I've heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification.

The computer is a medium that can dynamically simulate the details of any other medium, including media that cannot exist physically. It is not a tool, although it can act like many tools. It is the first metamedium, and as such it has degrees of freedom for representation and expression never before encountered and as yet barely investigated.

Most creativity is a transition from one context into another where things are more surprising. There's an element of surprise, and especially in science, there is often laughter that goes along with the 'Aha.' Art also has this element. Our job is to remind us that there are more contexts than the one that we're in - the one that we think is reality.

Basic would never have surfaced because there was always a language better than Basic for that purpose. That language was Joss, which predated Basic and was beautiful. But Basic happened to be on a GE timesharing system that was done by Dartmouth, and when GE decided to franchise that, it started spreading Basic around just because it was there, not because it had any intrinsic merits whatsoever.

Computer literacy is a contact with the activity of computing deep enough to make the computational equivalent of reading and writing fluent and enjoyable. As in all the arts, a romance with the material must be well under way. If we value the lifelong learning of arts and letters as a springboard for personal and societal growth, should any less effort be spent to make computing a part of our lives?

By the time I got to school, I had already read a couple hundred books. I knew in the first grade that they were lying to me because I had already been exposed to other points of view. School is basically about one point of view -- the one the teacher has or the textbooks have. They don't like the idea of having different points of view, so it was a battle. Of course I would pipe up with my five-year-old voice.

The idea that hardware on networks should just be caches for movable process descriptions and the processes themselves goes back quite a ways. There's a real sense in which MS and Apple never understood networking or operating systems (or what objects really are), and when they decided to beef up their OSs, they went to (different) very old bad mainframe models of OS design to try to adapt to personal computers.

When I first prepared this particular talk... I realized that my usual approach is usually critical. That is, a lot of the things that I do, that most people do, are because they hate something somebody else has done, or they hate that something hasn't been done. And I realized that informed criticism has completely been done in by the web. Because the web has produced so much uninformed criticism. It's kind of a Gresham's Law - bad money drives the good money out of circulation. Bad criticism drives good criticism out of circulation. You just can't criticize anything.
