Forget artificial intelligence - in the brave new world of big data, it's artificial idiocy we should be looking out for.
Time, presence and physical attentiveness are our most basic proxies for something ultimately unprovable: that we are understood.
We live in an age of miracles so commonplace that it can be difficult to see them as anything other than part of the daily texture of living.
In an age of constant live connections, the central question of self-examination is drifting from ‘Who are you?’ towards ‘What are you doing?’
As a medium, electronic screens possess infinite capacities and instant interconnections, turning words into a new kind of active agent in the world.
We are all amateur attention economists, hoarding and bartering our moments - or watching them slip away down the cracks of a thousand YouTube clips.
Mass literacy is a phenomenon of the past few centuries, and one that has reached the majority of the world's adult population only within the past 75 years.
If computers remain far worse than us at image recognition, a certain over-confident combination of man and machine can elsewhere take inaccuracy to a whole new level.
Even when they're not causing injury, human-controlled cars are often driven inefficiently, ineptly, antisocially, or in other ways additive to the sum of human misery.
As commentators like the American psychologist Gary Marcus have noted, it's extremely difficult to teach a computer to recognise cats. And that's not for want of trying.
For all the sophistication of a world in which most of our waking hours are spent consuming or interacting with media, we have scarcely advanced in our understanding of what attention means.
Above all, the translation of books into digital formats means the destruction of boundaries. Bound, printed texts are discrete objects: immutable, individual, lendable, cut off from the world.
Unlike us, machines do not have a 'nature' consistent across vast reaches of time. They are, at least to begin with, whatever we set in motion - with an inbuilt tendency towards the exponential.
Over tens and hundreds of thousands of years, we evolved to find certain things stimulating, and as very intelligent, civilized beings, we're enormously stimulated by problem solving and learning.
I spoke at TED Global 2010 about the ways that video games engage the brain, and in particular, the idea of reward structures: how a challenge or task can be broken down and presented to make it as engaging as possible.
Once the words of a book appear onscreen, they are no longer simply themselves; they have become a part of something else. They now occupy the same space, not only as every other digital text, but as every other medium, too.
The biggest neurological turn-on for people is other people. This is what really excites us. In reward terms, it's not money; it's not being given cash - that's nice - it's doing stuff with our peers, watching us, collaborating with us.
The really interesting stuff about virtuality is what you can measure with it. Because what you can measure in virtuality is everything. Every single thing that every single person who's ever played in a game has ever done can be measured.
From exam grading to health education to professional training to democratic participation, paths towards self-realization and success in the world are often daunting and obscure: journeys only the privileged feel confident setting off along.
The best teachers, one hopes, don't shout at their students - because they are skilled at wooing as well as demanding the best efforts of others. For the ancient Greeks and Romans, this wooing was a sufficiently fine art in itself to be the central focus of education.
The earliest known writing probably emerged in southern Mesopotamia around 5,000 years ago, but for most of recorded history, reading and writing remained among the most elite human activities: the province of monarchs, priests and nobles who reserved for themselves the privilege of lasting words.
Video games are a special kind of play, but at root, they're about the same things as other games: embracing particular rules and restrictions in order to develop skills and experience rewards. When a game is well-designed, it's the balance between these factors that engages people on a fundamental level.
I love video games. I'm also slightly in awe of them. I'm in awe of their power in terms of imagination, in terms of technology, in terms of concept. But I think, above all, I'm in awe at their power to motivate, to compel us, to transfix us, like really nothing else we've ever invented has quite done before.
Vast volumes of mixed media surround us, from music to games and videos. Yet almost all of our online actions still begin and end with writing: text messages, status updates, typed search queries, comments and responses, screens packed with verbal exchanges and, underpinning it all, countless billions of words.
Modern motor vehicles are safer and more reliable than they have ever been - yet more than 1 million people are killed in car accidents around the world each year, and more than 50 million are injured. Why? Largely because one perilous element in the mechanics of driving remains unperfected by progress: the human being.
In classrooms full of students who range from brilliance to sullen disaffection, it's games - and often games alone - that I've seen engage every single person in the room. For some, the right kind of play can spell the difference between becoming part of something, and the lifelong feeling that they're not meant to take part.
For the moment, machines able to 'think' in anything approaching a human sense remain science-fiction. How we should prepare for their potential emergence, however, is a deeply unsettling question - not least because intelligent machines seem considerably more achievable than any consensus around their programming or consequences.