A computer is not really like us. It is a projection of a very small part of ourselves: that portion devoted to logic, order, rule and clarity.
Introduced in the 1960s, multitasking is an engineering strategy for making computers more efficient. Human beings are the slowest elements in a system.
Has Google appropriated the word 'search'? If so, I find it sad. Search is a deep human yearning, an ancient trope in the recorded history of human life.
I think technical people now should learn literature, because literature teaches you a great deal about the depths and variety of human imagination.
When I am writing, and occasionally achieve single focus and presence, I finally feel that is where I'm supposed to be. Everything else is kind of anxiety.
It has occurred to me that if people really knew how software got written, I'm not sure they'd give their money to a bank or get on an airplane ever again.
The ability to 'multitask,' to switch rapidly among many competing focuses of attention, has become the hallmark of a successful citizen of the 21st century.
I came of technical age with UNIX, where I learned with power-greedy pleasure that you could kill a system right out from under yourself with a single command.
I was a girl who came into the clubhouse, into the treehouse, with the sign on the door saying, 'No girls allowed,' and the reception was not always a good one.
Reading code is like reading all things written: You have to scribble, make a mess, remind yourself that the work comes to you through trial and error and revision.
With code, what it means is what it does. It doesn't express, not really. It's a very bounded conversation. And writing is not bounded. That's what's hard about it.
When you lose your Visa card, you get a new card with a new number, and any new charges with the old number are blocked. Why can't we do the same with Social Security numbers?
When knowledge passes into code, it changes state; like water turned to ice, it becomes a new thing, with new properties. We use it, but in a human sense, we no longer know it.
I don't know where anyone ever got the idea that technology, in and of itself, was a savior. Like all human-created 'progress,' computers are problematic, giving and taking away.
Some people hit a profession and just keep going deeper into it, making a life and making it more and more stable. That's not been my experience. I always want to try something new.
It will not work to keep asking men to change. Many have no real incentive to do so. There's no reward for them. Why should they change? They're doing well inside the halls of coding.
I think many people have wonderful stories inside them and the talent to tell those stories. But the writing life, with its isolation and uncertain outcomes, keeps most from the task.
Multitasking, throughput, efficiency - these are excellent machine concepts, useful in the design of computer systems. But are they principles that nurture human thought and imagination?
I like the little semi-competencies of human beings, I realize. Governance, after all, is a messy business, a world of demi-solutions and compromise, where ideals are tarnished regularly.
To be a programmer is to develop a carefully managed relationship with error. There's no getting around it. You either make your accommodations with failure, or the work will become intolerable.
Y2K is showing everyone what technical people have been dealing with for years: the complex, muddled, bug-bitten systems we all depend on, and their nasty tendency toward the occasional disaster.
Internet voting is surely coming. Though online ballots cannot be made secure, though the problems of voter authentication and privacy will remain unsolvable, I suspect we'll go ahead and do it anyway.
Productivity has always been the justification for the prepackaging of programming knowledge. But it is worth asking about the sort of productivity gains that come from the simplifications of click-and-drag.
I broke into the ranks of computing in the early 1980s, when women were just starting to poke their shoulder pads through crowds of men. There was no legal protection against 'hostile environments for women.'
The web is just another stunning point in the two-hundred-thousand-year history of human beings on earth. The taming of fire; the discovery of penicillin; the publication of 'Jane Eyre' - add anything you like.
When I hear the word 'disruption,' in my mind, I think of all these people in the middle who were earning a living. We will sweep away all that money they were earning, and we will move that to the people at the top.
Each new tool we create ends an old relationship with the world and starts a new one. And we're changed by that relationship, inevitably. It changes the way we live, changes our patterns, changes our social organization.
Evolution, dismissed as a sloppy programmer, has seen fit to create us as a wild amalgam of everything that came before us: except for the realm of insects, the whole history of life on earth is inscribed within our bodies.
A computer is a general-purpose machine with which we engage to do some of our deepest thinking and analyzing. This tool brings with it assumptions about structuredness, about defined interfaces being better. Computers abhor error.
Closed environments dominated the computing world of the 1970s and early '80s. An operating system written for a Hewlett-Packard computer ran only on H.P. computers; I.B.M. controlled its software from chips up to the user interfaces.
Before the advent of the Web, if you wanted to sustain a belief in far-fetched ideas, you had to go out into the desert, or live on a compound in the mountains, or move from one badly furnished room to another in a series of safe houses.
The biggest problem is that people have stopped being critical about the role of the computer in their lives. These machines went from being feared as Big Brother surrogates to being thought of as metaphors for liberty and individual freedom.
My approach to being a self-taught programmer was to find out who was smart and who would be helpful, and these were both men and women. And without learning from my co-workers, I never could've gone on in the profession as long as I did.
I don't like the idea that Facebook controls how people express themselves and changes it periodically according to whatever algorithms they use to figure out what they should do or the whim of some programmer or some CEO. That bothers me a great deal.
What happens to people like myself, who have been involved with computing for a long time, is that you begin to see how many of the 'new' ideas are simply old ones coming back into view on the swing of the pendulum, with new and faster hardware to back them up.
I think that focusing all experiences through the lens of the Internet is an example of not being able to see history through the eyes of others, to be so enamored of one's present time that one cannot see that the world was once elsewise and was not about you.
Through the miracle of natural genetic recombination, each child, with the sole exception of an identical twin, is conceived as a unique being. Even the atmosphere of the womb works its subtle changes, and by the time we emerge into the light, we are our own persons.
It had to happen to me sometime: sooner or later, I would have to lose sight of the cutting edge. That moment every technical person fears - the fall into knowledge exhaustion, obsolescence, techno-fuddy-duddyism - there was no reason to think I could escape it forever.
Human thinking can skip over a great deal, leap over small misunderstandings, can contain ifs and buts in untroubled corners of the mind. But the machine has no corners. Despite all the attempts to see the computer as a brain, the machine has no foreground or background.
It's possible to let technology absorb what we know and then re-express it in intricate mechanisms - parts and circuit boards and software objects - mechanisms we can use but do not understand in crucial ways. This not-knowing is fine while everything works as we expected.
I used to pass by a large computer system with the feeling that it represented the summed-up knowledge of human beings. It reassured me to think of all those programs as a kind of library in which our understanding of the world was recorded in intricate and exquisite detail.
Writing was a way to get away from my life as a programmer, so I wanted to write about other things, but of course nobody wanted to publish another story about a family, unless it was extraordinary. When I began writing about my life as a programmer, however, people were interested.
If you've ever watched someone who is a mother talk on the phone, feed the dog, bounce the baby, it's just astounding to see someone manage, more or less well, to do all those things. But on a computer, multitasking is really binary. The task is either in the foreground, or it's not.
So many people for so many years have promoted technology as the answer to everything. The economy wasn't growing: technology. Poor people: technology. Illness: technology. As if, somehow, technology in and of itself would be a solution. Yet machine values are not always human values.
I hate the new word processors that want to tell you, as you're typing, that you made a mistake. I have to turn off all that crap. It's like, shut up - I'm thinking now. I will worry about that sort of error later. I'm a human being. I can still read this, even though it's wrong. You stupid machine, the fact that you can't is irrelevant to me.