The incredible complexity of your computer stems from the equally incredible simplicity of binary logic. Everything you see on your screen is the result of what amounts to a long sequence of yes and no questions. Using this logic, we can store everything from text to music and perform the mammoth number-crunching that underpins Google's web search empire.
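To make that concrete, here is a minimal Python sketch (not from the documentary; the text and number chosen are just illustrative) showing how ordinary data reduces to sequences of ones and zeros:

```python
# A minimal sketch of how familiar data reduces to bits.
# Each character has a numeric code point, and every number
# can be written out as a string of ones and zeros.

text = "Hi"
for char in text:
    # ord() gives the character's code point; format it as 8 binary digits.
    print(char, "->", format(ord(char), "08b"))

number = 77
# 77 stored in binary is 1001101.
print(number, "->", format(number, "b"))
```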
As explained in this mini-documentary, all of these digital possibilities trace back to the work of the late mathematician Claude Shannon. As a graduate student at the Massachusetts Institute of Technology, he showed that the true/false operations of Boolean logic could be carried out by electrical switching circuits, the insight that makes complex digital computers possible. He also gave us the term "bit": the smallest unit of information in a digital system, a single one or zero.
Shannon went on to work on cryptanalysis with fellow computing pioneer Alan Turing during World War II, and he founded the field of information theory. Together with fellow mathematician Edward O. Thorp, Shannon also built the first wearable computer: a gadget the pair used to improve their odds at the roulette tables in Las Vegas.