…information is received in inverse proportion to its predictability.

Mike Ford, This American Life #258: Leaving the Fold. (Keep that one around the next time you’re underestimated.)

(via heather-rivers)
I was flicking through Cryptography: An Introduction by Nigel Smart this morning to keep my crypto-sec neurons from rusting over, and this quote rings true from an information-theory standpoint. Information is entropy: you gain information exactly when something you couldn’t predict happens.
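That is essentially Shannon's notion of self-information: an event with probability p carries -log2(p) bits, so the less predictable it is, the more you learn when it actually happens. A minimal sketch of the idea (plain Python; the function name is mine, not from the book):

```python
import math

def surprise_bits(p):
    """Shannon self-information: bits of information gained when an
    event with probability p actually occurs."""
    return -math.log2(p)

# The less predictable the event, the more information it carries.
for p in (0.99, 0.5, 0.01):
    print(f"p = {p}: {surprise_bits(p):.2f} bits")
# p = 0.99: 0.01 bits  (almost no news)
# p = 0.5:  1.00 bits  (a fair coin flip)
# p = 0.01: 6.64 bits  (a genuine surprise)
```

Entropy is just the expected value of that surprise over the whole distribution, which is why maximally unpredictable sources are exactly the ones cryptography wants.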