
This chapter argues that information is never neutral or stable. It is a common denominator across many forms, and each form reveals something about how we understand systems, communication, and the meaning behind them. The chapter explores virtuality, language, entropy, noise, and feedback, all of which highlight that information is not only stored but also reshapes the environments that carry it. Information sometimes fades and sometimes persists, but most often it transforms. Theories from Saussure, Shannon, Boltzmann, and Bateson show that information is defined less by permanence than by probability, relation, and adaptation. Whether through the structural logic of language, the physical principles of entropy, or the creative possibilities of noise and feedback, information proves to be both constrained and filled with possibility.
Information: Everywhere and Nowhere
One of the most intriguing parts of the chapter on virtuality is the contrast between how information was stored in the past and how it is stored today. Physical traces such as carvings and manuscripts have survived for centuries. Digital information, by contrast, is extremely fragile: a file can be duplicated without limit, but it can also disappear with a click or a forgotten password. Information is thus both everywhere and nowhere. It has no real weight or size, yet it structures the way we perceive the world, and this is where its power lies. Information can be re-coded or transformed to fit new contexts instantly. This reminds us of how much of what we create exists only temporarily, such as Snapchat stories or disappearing messages. Blake’s idea that these moments still carry meaning even without leaving a lasting trace applies here. What matters is not permanence but the way information can adapt and reshape systems.
Communication and Language
The most striking aspect of the “Language” chapter is the comparison between Shannon and Weaver’s theory of communication and Saussure’s theory of language. Although they are very different theories, both show that systems matter more than individual messages: meaning emerges from the structures that shape it. Shannon and Weaver’s model treats information in terms of probability, focusing on how predictable or unpredictable a message is within the system. Saussure makes a similar point, since speech only makes sense within the larger structure of language, where words gain value through their positions relative to one another. The text notes, for example, that English is roughly 50% statistically predictable. Redundancy may sound like a flaw, but it is actually what makes communication work: if every word were unpredictable, we would not be able to follow along. Even though the system constrains us, that constraint is what gives meaning and clarity, as the short sketch below tries to illustrate.
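A rough way to feel this redundancy is to strip letters out of a sentence and see how much of it remains readable. The snippet below is only an illustration of the general idea, not Shannon’s actual measurement of English; the sentence and the choice to drop vowels are our own.

```python
# Illustration only: drop the vowels from a sentence and see whether a reader
# can still reconstruct it. Redundancy is what makes that reconstruction possible.
sentence = "information is never neutral or stable"
stripped = "".join(ch for ch in sentence if ch not in "aeiou")
print(stripped)  # "nfrmtn s nvr ntrl r stbl" -- still largely decipherable
removed = 1 - len(stripped) / len(sentence)
print(f"about {removed:.0%} of the characters were removed")
```

Even with a large share of the characters missing, the sentence can be recovered, which is roughly what it means for a redundant system to tolerate unpredictability.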
What Is Entropy? How Is It Connected to Energy?
Looking at how information connects to physical systems, the theorist Ludwig Boltzmann demonstrated that entropy, which measures disorder, is a physical property of a system. Over time, systems tend toward maximum entropy, which means more disorder and less usable energy; ordered systems are low in entropy and contain more information and less unusable energy. The chapter uses the example of finding a hot cup of coffee on a table in a cool room, rather than the usual cup at room temperature, to explain entropy. The hot coffee in the cold room is a low-entropy state because it is an improbable, ordered situation: the coffee will not stay hot for long. A cup at room temperature, in contrast, is the more probable and expected state. Boltzmann defined the entropy of a physical system as a function of its possible energetic configurations: the more ways there are to distribute the system’s energy, the more probable, random, and high-entropy its state. He quantified this entropy law with the equation S = k log P, where S stands for the entropy, k is a constant, and P is the number of different possibilities.
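As a minimal sketch of what S = k log P does (our own toy numbers, not an example from the chapter), the snippet below shows entropy rising with the number of possible configurations:

```python
import math

# Boltzmann's S = k log P: entropy grows with the number of possible
# configurations P. The configuration counts below are made up for illustration.
k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(configurations: int) -> float:
    """Entropy of a system with the given number of equally probable configurations."""
    return k * math.log(configurations)

# An ordered state (few ways to arrange the energy, like the hot coffee in a
# cool room) has lower entropy than a disordered one (many ways, like the cup
# that has settled to room temperature).
print(boltzmann_entropy(10))      # few configurations  -> lower entropy
print(boltzmann_entropy(10**6))   # many configurations -> higher entropy
```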

How We Use Entropy in Communication Systems
Looking further at the link with statistical mechanics, Shannon defined information as the mathematical inverse of probability: the more surprising a message is, the more information it contains. For example, choosing from a binary set (yes/no, on/off) provides one bit of information, which is not very much, because the options at the source are so limited. The value of information is calculated in this way: the more choices a sender has, the more information the message carries. To capture how many choices the sender has and how much information a message may contain, Shannon created a mathematical formulation of information, H = −∑ pᵢ log pᵢ, where pᵢ is the probability of each possible choice and H measures how unpredictable the overall choice is (a short numerical sketch follows the list below). Information theory explains how unlikely order shifts to probable disorder in physical systems, and it distinguishes between signals (useful) and noise (waste) in communication. In physical systems, thermodynamic entropy is the amount of energy unavailable for further work, or “wasted”. In communication systems, the informational entropy of a message measures the message’s probabilities from three different perspectives:
– The source: how many choices are possible
– The channel: the amount of signal transmitted versus the amount lost in noise
– The destination: how much uncertainty the message resolves for the receiver.
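As a minimal sketch of the formula (our own toy probabilities, not figures from the chapter), the snippet below computes H in bits for a few distributions of choices:

```python
import math

# Shannon's H = -sum(p_i * log2(p_i)), measured in bits. A fair yes/no choice
# carries exactly one bit; a heavily skewed choice is more predictable and
# therefore carries less information.
def shannon_entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair binary choice          -> 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # predictable binary choice   -> ~0.47 bits
print(shannon_entropy([0.25] * 4))   # four equally likely choices -> 2.0 bits
```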
Later developments such as reception theory, reader-response theory, and cognitive science built on this by focusing on how people interpret information through different modes of communication.
Media Systems and Noise
The author, Bruce Clarke, positions media not as neutral tools for transmission but as dynamic systems embedded in material and environmental contexts. His affinity for systems theory is evident in the way the chapter frames all meaningful communication as occurring within cycles of transmission and reception, where signals and noise are always present. Noise is defined as anything in a received message that was not originally sent. It was first treated as an obstacle to efficiency, described by Shannon as random interference that disrupts transmission. However, he and Weaver also recognized that noise introduces new probabilities into the system and can be understood as information itself. Gregory Bateson sharpened this point with his definition of information as “[…] a difference that makes a difference” (165). Noise unsettles transmitted messages, creating the potential for new forms and new information to emerge.
Norbert Wiener’s definition of cybernetics as the study of messages and control highlights why noise matters. Both machines and biological organisms regulate themselves through circuits of transmission and feedback. Signals travelling through these circuits are never perfectly stable; real-world channels, whether nerves or telephone wires, always contain random fluctuations, or noise. Just as thermodynamic systems tend toward entropy, communication systems face the inevitability of noise. What begins as a problem of error or interruption can quickly become a creative opportunity: systems learn and evolve because they must adapt to noise.
This is most clearly visible in the media arts. What engineers once feared as breakdowns or flaws often became the raw material of innovation. Musicians like Jimi Hendrix transformed screeching feedback into controlled musical effects, while tape manipulation and distortion gave artists like the Beatles new expressive vocabularies. In these cases, noise was not a loss but a generative supplement to the message. Visual media show similar outcomes: glitches in video or digital photography can become aesthetic choices, reframing errors as features. Media art reveals that meaning is often made through the manipulation of noise, not its elimination.
Early communication systems, such as the telegraph and telephone, prioritized noiseless transmission, aiming to reduce distortion, while inscription media such as the phonograph or the photograph captured and preserved both signal and noise. Friedrich Kittler shows how these technologies disrupted the dominance of writing, which reduces speech to 26 symbols (letters) and filters out the messy world of accidental sound. In contrast, sound recording and photography preserved continuous reality, complete with its imperfections. This created a conceptual divide between symbolic systems, which treat information as immaterial code, and material systems, which capture the world’s natural, noisy textures. However, symbolic and material systems are never fully separate. Information only comes into being when it takes material form, whether in carved stone or cloud servers. The qualities of these materials determine what endures, what decays, and what remains accessible.
Noise is not the enemy of communication but a structural feature of it. Sometimes it destroys order, but just as often it enables creativity, adaptation, and novelty. Media systems are ecological: they consist of signals, noise, and the environments that sustain them. Recognizing this prepares us for the cybernetic concept of feedback, where noise and uncertainty are not filtered out but re-circulated through the system, enabling regulation, adaptation, and, at times, the discovery of entirely new patterns.
Introduction of Feedback in Information Systems
Information is no longer only stored or transmitted; it also takes on a new function: feedback. Information theory defines information as a mathematically inverse function of the probability of a predictable message, and Bateson stated that noise is the only possible source of new patterns or information, highlighting how noise always carries meaning even if it seems meaningless to the audience. During the 1940s, as computers were being developed, feedback became a crucial part of control mechanisms. With sensors, a system’s output can be routed back into its input, creating a feedback loop that uses its own output as an input. This results in either negative feedback, which stabilizes order, or positive feedback, which leads to growth or disorder. Uncertainty about messages allowed noise to serve as a source of additional information, introducing unexpected patterns useful for creative or adaptive purposes. Feedback transforms transmission into a dynamic process that enables systems to self-regulate, discover new patterns, and produce art.
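As a minimal sketch of a negative feedback loop (a thermostat analogy of our own, not an example from the chapter), the snippet below shows a system feeding its output back into its input until it stabilizes:

```python
# A thermostat as negative feedback: the system measures its own output (the
# room temperature), compares it to a target, and feeds the difference back in.
def thermostat_step(temperature: float, target: float = 20.0, gain: float = 0.5) -> float:
    error = target - temperature        # the "difference that makes a difference"
    return temperature + gain * error   # negative feedback corrects toward the target

temp = 30.0
for _ in range(6):
    temp = thermostat_step(temp)
    print(round(temp, 2))  # 25.0, 22.5, 21.25, ... converging on the target of 20.0

# Positive feedback would instead amplify the deviation (e.g. temperature - gain * error),
# producing runaway growth or disorder rather than stability.
```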
Key Takeaways
Media should be understood not merely as channels for transmitting information, but as dynamic, ecological systems in which signals and noise interact within material and environmental contexts. Clarke emphasizes that meaning arises from relationships within these systems, what Bateson calls the “context principle”, and that communication is impossible without context. Information is never neutral: it moves, transforms, and reshapes the systems it inhabits, and new meaning emerges from the appearance of noise. Noise is not simply interference but a generative force that shapes, disrupts, and enriches meaning. By situating media within ecological systems, Clarke challenges the notion of isolated tools and instead presents media as active participants that both shape and are shaped by the information around them.
Work Cited
Title cover and images created in Canva by Alisha and Sam
Clarke, Bruce. “Information.” Critical Terms for Media Studies, edited by W. J. T. Mitchell and Mark B. N. Hansen, University of Chicago Press, 2010, pp. 157-171. Accessed 1 October 2025.


















