The Open Work
the memory with the immediacy of a cry or a vision. Nowhere else have we thus savored the sweetness and violence of love and the languor of memory.

This communication allows us to accumulate a large capital of information about both Petrarch’s love and the essence of love in general. Yet from the point of view of meaning, the two texts are absolutely identical. It is the second one’s originality of organization—that is, its deliberate disorganization, its improbability in relation to a precise system of probability—which makes it so much more informative.

At this point, of course, one could easily object that it is not just the amount of unpredictability that charms us in a poetic discourse. If that were the case, a nursery rhyme such as «Hey diddle diddle / The cat and the fiddle / The cow jumped over the moon» would be considered supremely poetic. All I am trying to prove here is that certain unorthodox uses of language can often result in poetry, whereas this seldom, if ever, happens with more conventional, probable uses of the linguistic system.

That is, it will not happen unless the novelty resides in what is said rather than in how it is said, in which case a radio broadcast that announces, according to all the rules of redundancy, that an atomic bomb has just been dropped on Rome will be as charged with news as one could wish. But this sort of information does not really have much to do with a study of linguistic structures (and even less with their aesthetic value—further evidence that aesthetics cares more about how things are said than about what is said).

Besides, whereas Petrarch’s lines can convey a certain amount of information to any reader, including Petrarch, the radio broadcast concerning the bombing of Rome would certainly carry no information to the pilot who has dropped the bomb or to all those listeners who heard the announcement during a previous broadcast. What I want to examine here is the possibility of conveying a piece of information that is not a common «meaning» by using conventional linguistic structures to violate the laws of probability that govern the language from within. This sort of information would, of course, be connected not to a state of order but to a state of disorder, or, at least, to some unusual and unpredictable nonorder.

It has been said that the positive measure of such a kind of information is entropy; on the other hand, if entropy is disorder to the highest degree, containing within itself all probabilities and none, then the information carried by a message (whether poetic or not) that has been intentionally organized will appear only as a very particular form of disorder, a «disorder» that is such only in relation to a preexisting order. But can one still speak of entropy in such a context?

The Transmission of Information

Let us now briefly turn to the classic example of the kinetic theory of gas, and imagine a container full of molecules all moving at a uniform speed. Since the movement of these molecules is determined by purely statistical laws, the entropy of the system is very high, so that although we can predict the general behavior of the entire system, it is very difficult to predict the trajectory of any particular molecule.

In other words, the molecule can behave in a variety of ways, since it is full of possibilities, and we know that it can occupy a large number of positions, but we do not know which ones. To have a clearer idea of the behavior of each molecule, it would be necessary to differentiate their speeds—that is, to introduce an order into the system so as to decrease its entropy. In this way we would increase the probability that a molecule might behave in a particular manner, but we would also limit its initial possibilities by submitting them to a code.

If I want to know something about the behavior of a single molecule, I am seeking the kind of information that goes against the laws of entropy. But if I want to know all the possible behaviors of any given molecule, then the information I am seeking will be directly proportional to the entropy of the system. By organizing the system and decreasing its entropy, I will simultaneously learn a great deal and not much at all.
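The relation Eco describes between imposed order and decreased entropy can be made concrete with Shannon's entropy measure. The following sketch is purely illustrative (the eight "speeds" and their probabilities are invented, not drawn from any physical model): a system in which all states are equally probable has maximal entropy, while differentiating the states, as when a code is introduced, lowers it.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# High-entropy system: 8 possible speeds, all equally likely.
uniform = [1/8] * 8

# "Ordered" system: a code makes one speed far more probable.
ordered = [0.9] + [0.1/7] * 7

print(shannon_entropy(uniform))  # 3.0 bits: maximal uncertainty
print(shannon_entropy(ordered))  # well under 3 bits: order has reduced entropy
```

With the uniform distribution, each observation carries the full three binary choices of information; once the system is ordered, an observation usually tells us what we already expected, which is Eco's point that by decreasing entropy we "learn a great deal and not much at all."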

The same thing happens with the transmission of a piece of information. I shall try to clarify this point by referring to the formula that generally expresses the value of a piece of information: I = N log h, in which h stands for the number of elements among which we can choose, and N for the number of choices possible (in the case of a pair of dice, h = 6 and N = 2; in the case of a chessboard, h = 64 and N = all the moves allowed by the rules of chess).
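Assuming the logarithm is taken in base 2 (so that information is counted in binary choices, or bits), the formula can be checked directly against the two examples just given:

```python
import math

def information(N, h):
    """I = N * log2(h): N independent choices among h elements."""
    return N * math.log2(h)

# A pair of dice: h = 6 faces, N = 2 throws.
print(information(2, 6))   # about 5.17 bits

# Locating one square on a chessboard: h = 64, N = 1 choice.
print(information(1, 64))  # 6.0 bits
```

The chessboard case shows why h = 64 is convenient: six yes/no questions suffice to single out any square, which is exactly what "six binary choices" means.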

Now, in a system of high entropy, in which all the combinations can occur, the values of N and h are very high; also very high is the value of the information that could be transmitted concerning the behavior of one or more elements of the system. But it is quite difficult to communicate as many binary choices as are necessary to distinguish the chosen element and define its combinations with other elements.

How can one facilitate the communication of a certain bit of information? By reducing the number of the elements and possible choices in question: by introducing a code, a system of rules that would involve a fixed number of elements and that would exclude some combinations while allowing others. In such a case, it would be possible to convey information by means of a reasonable number of binary choices. But in the meantime, the values of N and h would have decreased, and, as a result, so would the value of the information received.
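The trade-off described in this paragraph can be sketched numerically (the alphabet sizes below are invented for illustration): a code that excludes most combinations shrinks h, so fewer binary choices are needed to transmit a selection, but the information value of each selection shrinks with it.

```python
import math

# Without a code: choose freely among 64 elements.
unconstrained_h = 64

# With a code that excludes most combinations: only 8 remain admissible.
coded_h = 8

bits_unconstrained = math.log2(unconstrained_h)  # 6 binary choices per selection
bits_coded = math.log2(coded_h)                  # 3 binary choices per selection

# Communication becomes easier (fewer choices to transmit),
# but the amount of information each selection carries has dropped.
print(bits_unconstrained, bits_coded)
```

This is the inverse relation Eco states next: the clearer and more economical the message, the smaller the quantity of information it can carry.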

Thus, the larger the amount of information, the more difficult its communication; the clearer the message, the smaller the amount of information. For this reason Shannon and Weaver, in their book on information theory, consider information as directly proportional to entropy.12 The role played by Shannon—one of the founders of the theory—in the research on this particular question has been particularly acknowledged by other scholars in the field.13 On the other hand, they all seem to insist on the distinction between information (here taken in its strictest statistical sense as the measure of a possibility) and the actual validity of a message (here taken as meaning). Warren Weaver makes this particularly clear in an essay aiming at a wider diffusion of the mathematics of information: «The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage.

In particular, information must not be confused with meaning . . . To be sure, this word information in communication theory relates not so much to what you do say, as to what you could say. That is, information is a measure of one’s freedom of choice when one selects a message . . . Note that it is misleading (although often convenient) to say that one or the other message conveys unit information. The concept of information applies not to the individual messages (as the concept of meaning would), but rather to the situation as a whole . . .

A mathematical theory of communication deals with a concept of information which characterizes the whole statistical nature of the information source, and is not concerned with the individual messages . . . The concept of information developed in this theory at first seems disappointing and bizarre—disappointing because it has nothing to do with meaning, and bizarre because it deals not with a single message but rather with the statistical character of a whole ensemble of messages, bizarre also because in these statistical terms the two words information and uncertainty find themselves to be partners.»14

Thus, this long digression concerning information theory finally leads back to the issue at the heart of our study. But before going back to it, we should again wonder whether in fact certain concepts can legitimately be applied to questions of aesthetics—if only because it is now clear that «information» has a far wider meaning in statistics than in communication. Statistically speaking, I have information when I am made to confront all the probabilities at once, before the establishment of any order. From the point of view of communication, I have information when (1) I have been able to establish an order (that is, a code) as a system of probability within an original disorder; and when (2) within this new system, I introduce—through the elaboration of a message that violates the rules of the code—elements of disorder in dialectical tension with the order that supports them (the message challenges the code).

As we proceed with our study of poetic language and examine the use of a disorder aiming at communication, we will have to remember that this particular disorder can no longer be identified with the statistical concept of entropy except in a roundabout way: the disorder that aims at communication is a disorder only in relation to a previous order.

II Poetic Discourse and Information

The example of Petrarch should have helped us understand that the originality of an aesthetic discourse involves to some extent a rupture with (or a departure from) the linguistic system of probability, which serves to convey established meanings, in order to increase the signifying potential of the message. This sort of information, characteristic of every aesthetic message, coincides with the basic openness of
