This blog has occasionally commented on current affairs, particularly on the role of technology in the economy. I think this is relevant to art, not least because the most overused word today (or one of them) is creativity. What used to be the special role of artists has become the necessary possession of every business. Apparently. I intend to say much more about this because it’s important for abstraction as non-conceptual practice, but right now I want to offer some perspective on our current techno-ideology.
The assumption that artificial intelligence is possible rests on one fallacy and one very sinister misconception. The first is the fallacious assumption that every complex mental task—writing a symphony, judging whether someone is lying, scoring a goal in hockey, learning a language, playing the piano—can be broken down into a series of simpler ones. Many actions can, but not all. In some, such as playing a musical instrument, many lesser tasks are repeatedly practiced and built together into a new holistic accomplishment that can’t be retrospectively broken down.

Secondly, and more importantly, the definition of intelligence used by AI researchers is completely functional. If one can’t tell whether a given action was performed by a machine or a person, then the distinction is lost. This idea is the contribution of Alan Turing. The problem is that all measures of the adequacy of AI are based on finished accomplishments, on the past, and leave out of consideration the human capacity to find the new or the as yet unforeseeable. This ability to recognize the new could be called learning, and AI researchers are very concerned to make a machine that can learn. Yet artificial intelligence will arrive one day and pass the Turing test, because our human capacity to learn, our adaptability, which far exceeds that of the computer, has allowed us to adapt to it. We are reducing our own capacities to meet the machine on its level—because learning is above all social, and our society, and economy, have adopted the computer. Jaron Lanier has made important observations about this, from the inside.