
Decoding the Universe

1. Boltzmann wrote S = k log W, a formula for entropy. The first law of thermodynamics deals with explaining heat, work, and energy.
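Boltzmann's formula uses the natural logarithm: S = k ln W, where W counts the equally likely microstates of a configuration. A quick numeric sketch (constant value from CODATA; the function name is mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def boltzmann_entropy(W):
    """S = k ln W: entropy of a macrostate with W equally likely microstates."""
    return K_B * math.log(W)

# Doubling the number of microstates adds exactly k*ln(2) of entropy,
# no matter how large W already is.
delta = boltzmann_entropy(2) - boltzmann_entropy(1)
```

Doubling W always adds the same k ln 2 of entropy, which is why the same ln 2 reappears later in Szilard's per-bit energy cost.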

2. The industrial revolution needed more powerful engines. The steam engine starts with a fire that causes water to boil into steam, which takes up more room than the equivalent water: it expands. The expansion of the steam does work; it moves a piston which, in turn, can move a wheel or lift a rock or pump water. The steam then either flies away into the sky or moves into a cool chamber exposed to the air, condenses, and flows back toward the fire to begin the cycle again. The steam engine sits between a high-temperature object (the fire) and a cold-temperature object (the air). The system will tend toward equilibrium. In allowing the heat to flow, the engine extracts some of the energy and performs useful work. Work and heat are both ways of transferring energy.

3. Carnot imagined a super engine that extracts work as heat flows from the hot reservoir to the cold one, while a heat pump, driven by some of the super engine's work, moves the same amount of heat, Q, from the cold reservoir back into the hot reservoir. "All told, no net heat flows from the cold reservoir to the hot reservoir," yet work is left over: a perpetual motion machine. "But nothing comes for free. It's the law." "Energy cannot be created or destroyed. Energy is conserved." The second law of thermodynamics states that any time you do work, you are irreversibly increasing the entropy of the universe. The second law explains why a super engine cannot exist. "Entropy always increases." "Entropy captures the configuration of the entire collection of matter in terms of probabilities, in terms of the most probable configurations of a collection of atoms, or, in our box-and-marble example, the most likely outcomes when we dump marbles in a box. The higher the probability of a configuration of matter, the higher the entropy of that configuration."
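The box-and-marble picture can be made concrete: if each of n marbles independently lands on the left or right half of the box, the even split is the most probable (highest-entropy) configuration and the all-on-one-side splits are the least probable. A small sketch (the function name is mine):

```python
from math import comb

def config_probability(n, k):
    """Probability that exactly k of n marbles land on the left half,
    assuming each marble independently picks a side with equal chance."""
    return comb(n, k) / 2**n

n = 10
probs = {k: config_probability(n, k) for k in range(n + 1)}
# The even split (k = 5) dominates; all-left (k = 0) and all-right
# (k = 10) are the rarest, matching the entropy ordering in the notes.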

4. "Some of the most fundamental rules in physics (the laws of thermodynamics, for example, and the laws that tell how collections of atoms move in a chunk of matter) are, deep down, actually laws about information." Shannon helped translate differential equations into a form a computer could understand and designed circuits of electrical relays and flip-flop switches. Shannon applied Boolean logic, the mathematics of manipulating 0s and 1s, to these circuits. He used 0s and 1s to measure the flow of information, and he built compression into the model by exploiting redundancy in a given message. A question with N possible outcomes needs log2 N bits of information to distinguish among them. Information encoded in 1s and 0s can answer any question, so long as that question has a finite answer. Written language is a stream of finite symbols, and each symbol can be represented as a stream of bits; bits are the universal medium of information. Five bits can sometimes be compressed into one or two bits through a mapping rule; such rules make the string redundant. Shannon's channel capacity theorem explains how much information can be sent over a communication line. "Information is intimately related to entropy and energy. The function Shannon derived was, roughly speaking, a measure of how unpredictable a string of bits is. The less predictable it is, the less able you are to generate the entire message from a smaller string of bits, in other words, the less redundant. The less redundancy a message has, the more information it can contain, so by measuring this unpredictability, Shannon hoped to be able to get at the information stored in the message." In the marbles-in-the-box example, the distribution with half the marbles on each side had the highest entropy, and the distributions with all the marbles on one side had the lowest entropy.
The entropy of the distribution of 1s and 0s in a stream of symbols directly relates to the amount of information in the stream.
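Shannon's unpredictability measure can be sketched directly: the entropy H = -Σ p·log2(p) of the symbol frequencies is highest for a balanced, unpredictable stream and lower for a redundant one (the function name is mine):

```python
from collections import Counter
from math import log2

def shannon_entropy(stream):
    """Bits of information per symbol: H = -sum(p * log2(p))
    over the observed symbol frequencies in the stream."""
    counts = Counter(stream)
    n = len(stream)
    return -sum((c / n) * log2(c / n) for c in counts.values())

h_balanced = shannon_entropy("0101101001")   # half 0s, half 1s
h_redundant = shannon_entropy("0000000001")  # highly predictable
```

A perfectly balanced binary stream carries the maximum 1 bit per symbol; the predictable stream carries less, which is exactly the slack a compression rule can squeeze out.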

5. Shannon also figured out how much energy is required to transmit a bit from place to place under certain conditions. Information theory, the science of the manipulation and transmission of bits, is very closely tied to thermodynamics. Maxwell's demon can be analyzed with information theory instead of thermodynamics: the demon separates the hot atoms from the cold atoms, but the information it needs does not come free; it requires energy. Szilard calculated a cost of kT log 2 joules for every bit of information. The opening and closing of the shutter, guided by that information, decreases the entropy inside the box, but spending the energy to obtain and act on the information increases the entropy of the universe as a whole. Shannon's information entropy and thermal entropy are related. Once the supply of energy stops, the box returns to equilibrium. A Turing machine could act as the controller for the shutter, opening and closing it.
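Szilard's kT log 2 figure is easy to evaluate; at room temperature it comes to a few zeptojoules per bit (constant from CODATA; the function name is mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def bit_energy(T):
    """Szilard's minimum energy cost, in joules, associated with
    one bit of information at absolute temperature T (kelvin)."""
    return K_B * T * math.log(2)

room = bit_energy(300)  # roughly 2.9e-21 J per bit near room temperature
```

The cost scales linearly with temperature, so a colder demon pays less per bit but can never pay nothing.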

6. Memory reusability requires energy. "Bits can be added without consuming energy or increasing the entropy of the universe. You can multiply bits. You can negate them. But one action in a computer generates heat, which, when dissipated into the environment, increases the entropy in the universe. That action is erasing a bit."
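The quoted distinction can be illustrated with bit operations: XOR can be run backward, so no information is lost and in principle no heat need be generated, while AND maps several different inputs to the same output, which amounts to erasing a bit. A sketch under those assumptions (function names are mine):

```python
# Reversible: keep one input alongside the XOR result, and the other
# input can always be recovered, so nothing is erased.
def xor_forward(a, b):
    return a, a ^ b

def xor_backward(a, c):
    return a, a ^ c  # a ^ (a ^ b) == b

# Irreversible: three of the four input pairs all produce AND = 0,
# so the output alone cannot tell them apart. That lost distinction
# is the erased bit whose dissipation costs energy.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
and_outputs = [a & b for a, b in inputs]
```

Running every input pair through xor_forward and then xor_backward returns the original pair exactly, which is what it means for the operation to be information-preserving.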
