Entropy

Reading a book by Leonard Susskind, which gives me further perspective on the topic of abstraction and information loss—something I’ve written about but never fully understood. Or rather, I have trouble following the scientists when they say that nature is information. The suspicion won’t leave that they are using a metaphor borrowed from the computer industry. Anyway, Susskind says “entropy is hidden information.” What he means is that if heat is added to a gas, the particles move faster, so it’s harder to specify their locations; the information needed to fully describe the substance is no longer easily available. It must still be present—it’s just hidden.

“Hidden” information might work better than “lost” information, and it connects exactly with the allegorical dimension of abstract art, namely its “meaning.” Susskind goes on: “Entropy always increases.” An artist has to acknowledge that meaning becomes ever harder to specify, and the inability to pin it down has to be accepted as constitutional for abstraction, and brought into the method, whatever that is.

But an artwork, because it is a single thing, and all its parts are specifically what they are, and all its forms are just so, by definition has zero entropy. Somehow—and this is the history of modern art—a work sheds information as it condenses down into one single specific thing, so it maintains its actual zero entropy while the entropy of its meaning grows. Hey, just thinking out loud.
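For what it’s worth, the physics does support the zero-entropy intuition. In Boltzmann’s standard formulation (not quoted by Susskind above, but behind his account), entropy counts the number of microscopic arrangements consistent with what you can see:

\[
S = k_B \ln W
\]

where \(W\) is the number of microstates. A finished artwork, in which every part is exactly what it is and could be no other way, corresponds to \(W = 1\), and \(\ln 1 = 0\), so \(S = 0\). A hot gas—or a meaning—admits a vast number of configurations, so \(W\) is huge and \(S\) is large. This is an analogy, not a calculation, but the arithmetic of the zero case is exact.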

[Figure: A generalized plot of entropy versus temperature for a single substance. A work of art is at the bottom left corner of the diagram, its meaning at the top right.]

