While some observe and lament the way digitization necessarily strips communication of its interesting imperfections, others contend that digitization, by reducing communication to its basic components, produces a lingua franca, capable of facilitating universal communication (van Dijk, 2006).
Stripped of errors, repetitions, and static, digitized information can be easily stored and transferred, permitting the “easy manipulation and display of these data” (Verhulst, 2002: 433).
Digital bits admit only two possible values; there are no in-betweens. This leads many to argue, in the words of Robert Pepperell (2003: 126), that “digital information is discrete and ‘clean’, whilst analogue information is continuous and ‘noisy’.” Robinson (2008: 21) defines analog as “smoothly varying, of a piece with the apparent seamless and inviolable veracity of space and time; like space and time admitting infinite subdivision, and by association with them connoting something authentic and natural, against the artificial, arbitrarily truncated precision of the digital (e.g., vinyl records vs. CDs).” One example is the synthesizer, which in the mid-1960s and 1970s produced sound through “continuous variables such as changing voltages” rather than binary 1s and 0s (Pinch and Trocco, 2002: 7).
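The contrast Pepperell and Robinson draw between the continuous and the discrete can be sketched in a few lines of Python. This is purely illustrative; the sine-wave signal, the sixteen levels, and the value range are invented for the example, not drawn from the sources cited above:

```python
import math

def quantize(value, levels=16, vmin=-1.0, vmax=1.0):
    """Map a smoothly varying value onto one of a fixed number of discrete levels."""
    step = (vmax - vmin) / (levels - 1)
    index = round((value - vmin) / step)
    return vmin + index * step

# A smoothly varying, "analog" signal: one cycle of a sine wave...
analog = [math.sin(2 * math.pi * t / 100) for t in range(100)]
# ...becomes a staircase of discrete values once digitized.
digital = [quantize(v) for v in analog]

# The digitized signal can take at most 16 distinct values,
# while the analog signal admits infinite subdivision in principle.
print(len(set(digital)))
```

The "arbitrarily truncated precision" Robinson describes corresponds here to the choice of `levels`: everything between two adjacent steps is discarded.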
While popular and scholarly accounts often describe digitization as a purely technical process, humans have delegated to algorithms the particular decisions about which signals should be kept and which should be thrown out.
Negroponte stresses the universality of digitized information, arguing that “because bits are bits” they have the ability to “commingle effortlessly” (1995: 18).
A bit can interact with any other bit, regardless of “the forms that were initially transformed into digits, or what the digits represent when accessed by the end-user” (Flew, 9).
Yet, the universality of digital information requires that it be stripped of any non-essential “additional information” (Dretske, 1982: 137), or of any “intrinsic redundancies and repetitions” (Negroponte, 1995: 16).
Somewhat later, Leibniz’s ideas came to form the basis of the Morse alphabet, and therefore Morse code, which became the standard system of the telegraph.
Morse code, as a binary system based on only two different states, proved far more resistant to transmission, coding, and decoding errors than alternatives (Vogelsang, 2010: 7).
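The error resistance Vogelsang attributes to two-state systems can be illustrated with a minimal sketch of a noisy channel. The voltage levels, the threshold, and the noise model are invented for the example; the point is only that a receiver deciding between two widely separated states can tolerate considerable distortion:

```python
import random

def transmit(bits, noise=0.3, seed=1):
    """Send bits as one of two voltage levels through a noisy channel,
    then recover them by thresholding at the midpoint."""
    rng = random.Random(seed)
    received = []
    for b in bits:
        voltage = 5.0 if b else 0.0        # two states, no in-betweens
        voltage += rng.gauss(0, noise)     # channel noise perturbs the signal
        received.append(1 if voltage > 2.5 else 0)
    return received

message = [1, 0, 1, 1, 0, 0, 1, 0]
# With noise small relative to the gap between states,
# every bit is recovered despite the distortion.
print(transmit(message) == message)
```

A continuous encoding has no such margin: any perturbation of the voltage is a perturbation of the signal itself, which is why the binary telegraph proved comparatively robust.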