Information is Surprise
One of the things that I became fairly comfortable with early on was that I wasn’t going to be able to offer a single definition of information. I was going to write a whole book about this thing and was going to title it The Information, and I was never going to say in a short, pithy way what information was because it’s taken me 500 pages to say it. And in fairness, it takes the Oxford English Dictionary not quite as big a space but something like 8,000 words to define this simple word.
Having said that, I don't mean to be flippant. The core of my book, the starting point for the book, is a scientific definition of information that emerged with the birth of information theory from the work of Claude Shannon and it’s a very mathematical definition, and yet it’s possible to talk about it in a human way. You can say that information is surprise. We know that information is something that we have a unit of measure for. We measure information in bits, and, of course, that wasn’t always true. That also began with Claude Shannon.
A bit, the smallest unit of information, the fundamental particle of information theory, is a choice, yes or no, on or off. It’s a choice that you can embody in electrical circuits and it is thanks to that that we have all this ubiquitous computing. But it’s also the flip of a coin or it’s the lighting of one or two lanterns in the Old North Church when Paul Revere had to convey a single bit of information by land or by sea, a choice. And again, it’s information because it’s surprise. If you already know the answer, there's no information there.
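A minimal sketch of the idea behind Shannon's measure (not part of the interview itself, and the function name is my own): the "surprise" of an outcome with probability p is −log₂ p bits, so an event you already know is coming (p = 1) carries zero information, and a fair coin flip, or one lantern out of two equally likely signals, carries exactly one bit.

```python
import math

def surprise_bits(p: float) -> float:
    """Self-information of an outcome with probability p, in bits: -log2(p)."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip (or Paul Revere's land-or-sea lantern signal,
# assuming both outcomes were equally likely): one bit of surprise.
print(surprise_bits(0.5))    # 1.0

# An answer you already know: zero bits -- no surprise, no information.
print(abs(surprise_bits(1.0)))    # 0.0

# A rarer event is more surprising, hence more informative.
print(surprise_bits(0.125))    # 3.0
```

The less likely the outcome, the more bits it carries, which is the mathematical form of "information is surprise."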
In Their Own Words is recorded in Big Think's studio.