How does intertextuality change our use of the English language? Its influence reaches far beyond the addition of new words and the ways we form sentences, stretching into the shapes of fonts and scripts. The ability to publish words and phrases instantaneously means that we, as user/generators, are shaping the English language every day—changing it, adding to it, subtracting from it (or condensing it; see Acronymity), molding it, and affecting “common usage”—all with a few clicks of buttons.
I teach English 101 and 102 (College Writing) at a community college, and one of my favorite lectures to give at the start of the semester is about how English is a “living” language. What does it mean to be a living language? It means not only that the language is currently used by billions of speakers, but that its lexicon expands—branching out to include new words and phrases (including slang) and absorbing words from foreign languages as well.
Around eighty years ago, when the game that would become Scrabble was invented in the United States, the Great Depression was holding the world in an economic headlock. In the average American home, families might have been slapping down tiles, spelling out words like “beat,” “goons,” and “aces” (“Scrabble,” Merriam Webster). Fast forward to this decade in America: you see far more use of obscure dictionary words, high-scoring words built around Qs and Zs, and strategy played around the bonus-point squares. Not only have Americans and Scrabble players worldwide become adept at Scrabble strategy; our collective vocabularies have expanded, and surely the number of acceptable Scrabble words has expanded, too.
This brings me to the point of acronyms and how they end up in our common word pool. Throughout the life of Scrabble, there have been debates about whether or not to count acronyms such as AWOL, radar, and SCUBA as legal words. However, as these terms became more widely used, they tended to supplant the phrases they abbreviate. AWOL (absent without leave) and radar (radio detection and ranging) stand alone as “words,” each accepted for its own acronymic meaning (“Desertion,” “Radar”). At one time, these were uncommon, unfamiliar terms. But through regular usage, they became common. The speed at which such terms are absorbed has quickened, and we find new contributions to the lexicon at every turn.
In the age of user/generators and sender/receivers, those of us who are plugged in and logged on textually are directly affecting the use of language. In mere milliseconds, a new phrase or term can “go viral” via SMS texts, Facebook posts, Tweets, or on the wave of myriad other digital delivery systems. We shape our language by effectively voting for which terms appear again and again in common usage, simply through our speech and textual communications.
According to Henry Jenkins, a media studies professor at MIT, in his book Convergence Culture, the old definition of convergence centered on the idea that all of our media devices, players, entertainment gadgets, text-based work machines, and communications devices might one day be combined into one centralized communication/entertainment “blackbox” to service the household (Jenkins 17). Though technologically possible, this hasn’t exactly happened yet (although the advent of multi-purpose cell phone technologies has brought us darn close). For most Americans, our social nature still compels us to have other forms of entertainment and personal communication outside of, or in addition to, the “blackbox.”
One thing Jenkins gets right on the money is that the new tactics of modern media strongly reinforce and encourage a higher degree of audience participation and authorship. For the most part, in Convergence Culture, he’s talking about the fandom world, using examples like Survivor spoilers, American Idol message-board die-hards, and the multi-level consumerism of The Matrix franchise (Jenkins). But this idea of universally accessible authorship and user/generator creation goes above and beyond the consumerism cult and can also be applied to the adaptation and evolution of language.
Take LOL, OMG, FYI, and the textual heart (♥) as examples. This year, the Oxford English Dictionary announced the addition of LOL, OMG, and FYI to its listings of officially recognized words (“Is English Changing?”, “New Words: New Initialisms…”).
♥ to heart
The new sense added to heart v. in this update may be the first English usage to develop via the medium of T-shirts and bumper-stickers. It originated as a humorous reference to logos featuring a picture of a heart as a symbol for the verb love, like that of the famous ‘I ♥ NY’ tourism campaign. Our earliest quote for this use, from 1984, uses the verb in ‘I heart my dog’s head’, a jokey play on bumper stickers featuring a heart and a picture of the face of a particular breed of dog (expressing a person’s enthusiasm for, say, shih-tzus) which itself became a popular bumper sticker. From these beginnings, heart v. has gone on to live an existence in more traditional genres of literature as a colloquial synonym for ‘to love’. (Oxford English Dictionary Online)
On its website, OED Online, the publisher of the Oxford English Dictionary explains that these “initialisms” were voted into the dictionary because they surface more and more often in common usage. The OED explains that while these acronyms are most commonly associated with electronic textual communication, its research has revealed that many of their first appearances predate the Internet (OMG appears in a letter dated 1917, and LOL previously stood for “little old lady”) (“New Words: New Initialisms…”). These terms are recognized as informal, even “gossipy,” language; their usage is considered most useful for shortening phrases to fit a longer expression into a limited character count, to save time, or perhaps to parody more serious, thoughtful communication (“New Words: New Initialisms…”).
“In the age of media convergence, consumer participation has emerged as the central conceptual problem: traditional gatekeepers seek to hold onto their control of cultural content, and other groups… want to give consumers the skills they need to construct their own culture,” Jenkins writes (204).
But what happens to the “gatekeepers” of old? Do they simply go away?
I’ve found myself already in this double-bind, playing my role as English teacher but also as poet and creative writer. I use slang in my classroom, and in my writing, frequently. But my younger students get me. If I say to a class of freshmen, “I’m really feelin’ that definition of an adjective,” or that I dig the introduction to a student’s paper, they understand what I mean and accept that I am using their language. However, in my duties as instructor and grammarian, I am often compelled to turn around and scold them for using this type of informal language in their papers, since the purpose of the course is to help prepare them for college-level writing. They are in my classroom to learn the nuances of formal writing: to become voices of authority on assigned topics or on concepts conceived from their own original research, by using appropriate terminology, style, and voice.
I still maintain the outlook that traditional standard-bearers of the English language should embrace the idea that slang and new terminologies (such as the text-speech acronyms now incorporated by the OED) are not wrong; they are simply new. English is a dynamic, living language. It is growing and expanding—absorbing new terms and features—every day.
The rules dictate formality and set a baseline standard for the English language because they were adopted by our predecessors. Ultimately, a language must have a set of standards for building expression, or its users have no rules to go by. No guidelines means no cohesion: nothing holding the language together.
Though we must maintain these general laws and commonalities, the instruments of change are in our hands, and therefore, we should feel empowered to use them, not only to de-center the previous forms of power/ownership over language, but to author our own new terms/language/structure, and create a new, more useful, more accessible and democratic model.
Hypertext’s deepest contribution to language is the immediate, meaningful, widely accessible transfer of new terminology and information. But, as Jenkins notes:
Convergence doesn’t just involve commercially produced materials and services traveling along well-regulated and predictable circuits. It doesn’t just involve the mobile companies getting together with the film companies to decide when and where we watch a newly released film. It also occurs when people take media into their own hands. (Jenkins 17)
The widespread access to language through the web, the ubiquity of WiFi, and the expanding use of immediately web-accessible devices help streamline language. Bulky terminology is reduced to acronyms, abbreviations, and shortcuts that facilitate easier, quicker communication. This streamlining produces webby spinoffs of the language: enhanced digital versions of composition, rife with jargon and replete with hidden, intertextual meanings (coded language). “[Pierre Levy] suggests that ‘the distinction between authors and readers, producers and spectators, creators and interpreters will blend to form a circuit … of expression,’ with each participant working to ‘sustain the activity’ of others” (Jenkins 95).
Coding languages and programs, such as HTML, CSS, Java, binary code, and others, have their own nuances and influence as well. Thirty years ago, there was no such thing as a Maxobjects or an Lobjects; eighty years ago, the word “scripts” referred to writing that was purely textual, and codes were linguistic systems processed orally to hide or scramble information in case it fell into enemy hands. There was no Elite English (l337). How did these come into being? As proponents of l337 explain, eventually all languages begin to adapt, with longer, bulkier terms getting shortened. Futurists might predict that the best way for all of us to communicate with the greatest ease and efficiency would be to invent or adopt a universal lexicon broad enough to include any and every word in existence, one that would also be “living,” expanding and growing to incorporate words invented for things never previously defined, such as the modern terms staycation and defriend (Oxford Dictionaries Online).
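The glyph substitution at the heart of l337 can be sketched in a few lines of code. This is an illustrative toy, not a standard: the particular letter-to-glyph mapping below is only one common convention among many.

```python
# A toy "l337" transliterator: coded language as simple glyph substitution.
# The mapping is one common convention, chosen for illustration only.
LEET_MAP = {"l": "1", "e": "3", "t": "7", "a": "4", "o": "0", "s": "5"}

def to_leet(text: str) -> str:
    """Replace each mapped letter with its l337 glyph; leave the rest as-is."""
    return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

print(to_leet("leet"))         # → 1337
print(to_leet("hello world"))  # → h3110 w0r1d
```

The point of the sketch is how little machinery coded language needs: a shared lookup table is enough to make text opaque to outsiders yet instantly readable to initiates.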
“If we spoke fewer languages, communication between different peoples would be easier… in [the era of global, instant communications] a common language may seem desirable” (Language).
This plan for a common, universal language has been attempted in the past, with little success. In 1887, for example, the Polish physician Ludwik Zamenhof created a new language called Esperanto. It was a simple combination of words and word bases commonly found in the dominant European languages. Zamenhof designated just a few official words and practical, simple grammatical rules that could be picked up quickly and easily by many users. However, it never really took root anywhere because it had no connection to place and no tie to a common people (Language).
The English language, by virtue of its wide use and its dominance in the languages of commerce and transportation, has become a universal standard for industry and politics. Even though more than 1 billion people on the planet speak Chinese—the most widely spoken language—English is growing faster and is in wider circulation, simply because of the historical imperial power of the United Kingdom and the political power of the United States. It is not likely the world will ever boil things down to one universal language. However, some linguists theorize that by the middle of the 21st century, we could get down to as few as 100 languages (Language).
A text’s ability to be converted to other formats and used interactively in other programs speaks directly to intertextuality. Lately, and more ubiquitously, this has been used to cross-market entertainment products, from video games to movies, to software, to teaching tools and other support tools. But user/generators also find success in connecting various services and information resources in order to reach wider audiences while adapting their messages to be more specific. Social networking is the most obvious vehicle not only for self-expression, but also for sharing physical location (GPS, foursquare.com, etc.) and for data sharing and collaborative work (Google Documents, SharePoint, Acrobat Pro, etc.).
Jenkins advocates that for each intertextual component and hypertext link to fulfill its use and full potential, every separate (though interlinked) item must perform its own exclusive function (106). It is wasteful, in terms of resource management and even data and server storage, to have too much information repeated, and redundancy within a transmedia, multi-linked product also creates disinterest and boredom in the user.
A transmedia story unfolds across multiple media platforms, with each new text making a distinctive and valuable contribution to the whole. In the ideal form of transmedia storytelling, each medium does what it does best—so that a story might be introduced in a film, expanded through television, novels, and comics … Each franchise entry needs to be self-contained so that you don’t need to have seen the film to enjoy the game, and vice versa. (Jenkins 97)
Adapt or Die!
Warnings About the Hyper-Dynamic Nature of Intertext Languages
Just as Socrates feared the advent and widespread use of a written language might cause a dramatic shift away from spoken intellectual conversation and dramatic debate, many theorists now warn of a shift away from face-to-face communication to a mode of communiqué that is entirely digital (Landow 47). As Landow explains in Hypertext 3.0, there has long been a tendency for academics and linguistic theorists to distrust technology and its impact on language. Landow points to academics and new theorists who are sounding the alarm that computers, cell phones, networks and digital devices are separating users by even more artificial means.
If we become a global community of screens, relying on our digital devices to create, distribute, and share all communication, what is the cost? What is the worth of such subtle but meaning-packed elements of human communication as body language, cadence, tone, inflection, and dialect, and how might we preserve them?
Landow says that for those of us “who live in an age in which educators and pundits continually elevate reading books and attack television as a medium that victimizes a passive audience,” it is strange to hear criticisms that cheap, easily distributed information could have any ill effect on society (47). But the criticism is not that digital information-delivery devices are bad for people; rather, it is that the behaviors around their use, and the extra layers of distance imposed between humans by lens after lens and screen after screen, are what render us “passive,” adding distance and making us, as communicators, less articulate.
To some, this idea of “speaking” purely through devices seems absurd and impossible. We could never erase or forget about the value of real, human contact, they say.
While many maintain that networking supports community, and that digital information transfer can only promote further-reaching dialogue, more avenues for free expression, and expansions of the collective experience, others feel, as Socrates did, that the ubiquity of private-user web communication culture fosters an unhealthy, un-human effect on language: “an anonymous, impersonal ‘denaturing’ of human speech” (47).
Another worry stemming from the immediacy of web-borne changes and additions to the lexicon is that discovering the difference between truth and fallacy (verifiable, provable truths versus arguments based on bias, misinformation, and/or fallacy) will become impossible. As information gets uploaded to the web, the stack of claims grows thicker and thicker, without a system of validity checks for all information posters. Just as I could come along and claim that the web opens the door to immediate, permanent changes to our language, someone else could offer a rebuttal claiming that the web doesn’t foster or influence growth and change in the language. Without support and evidence, neither claim holds true. But who is out there policing unsupportable claims or “incorrect” uses of language? The web (especially the web of Western, English-speaking cultures) is too large to police; no one could possibly check every fact posted each millisecond of the day. Therefore, with the hyperactive nature of linguistic expansion, truth gets harder and harder to prove.
Finally, and related to the worry of “digital denaturing” of language and the uncontrollable, wildfire-like spread of misinformation, some linguistic theorists also worry that if the rules and terms of our language change too fast, we will no longer have a common ground upon which to stand—no shared base of language with which to understand one another.
“Things fall apart; the centre cannot hold…”
Because human beings can easily adapt, our language(s) will continue to grow, spread, and evolve. “If, in the future, we lack the words to describe [things around us], chances are, we’ll invent the words we need” (Language).
Hypertext changes how we speak and how we are spoken to by leveling the field and democratizing texts. As user/generators, we become interconnected; we all become “textual.” (With some obvious exceptions: the Third World, victims of oppressive regimes, the incarcerated.)
In this essay, I have gone into little detail about how hypertext can (and does) change the rules of authorship and the structure of the narrative. As the examples above show, the rapid accessibility of texts lends itself to immediate influence and the lending of ideas across mediums and, potentially, across cultures and ideologies. Hopefully, this great accessibility can help foster more discussions about diversity, tolerance, and difference, while simultaneously initiating positive change: a culturally inclusive, dynamic, and progressive evolution of the English language that still maintains a baseline standard of grammar, usage, and mechanics for all English-language speakers and writers to adhere to. Without this basic linguistic foundation, it is warned, some parts of society would lose their basic, common cultural ground, and the effects could be as devastating as those prophesied in Ray Bradbury’s short story “A Sound of Thunder.”
Please see Attached PDF Works Cited Page for all sources.