Abolishing the Red Squiggle
“At any one time, language is a kaleidoscope of styles, genres, and dialects.”
— David Crystal
“Linguistic prestige is not an indication of intrinsic beauty in languages but rather of the perceived status of those who speak them.”
— Sarah J. Shin
If you’ve been around children for any amount of time, you’re sure to have heard some unique utterances — asking what you “buyed” when you “go-ed” to the store, telling you their drawing was “gooder” than yours. While we often associate such occurrences with children, I suspect most of us have caught ourselves realizing we’ve misspelled a word because it “seemed” correct — “calender” rather than “calendar”, swapping “affect” and “effect”, or writing “principle” instead of “principal”. We clock these phrases and spellings as incorrect because of “standardization”, a process in which a codified version of a language or group of language varieties is created for use in some official capacity. Standardization is so familiar to most of us that it operates like a fish-in-water scenario — that is, we’re so submerged in it from the beginning that it’s difficult to spot. What would destandardized language even look like? While it may be hard to imagine, I’m here to tell you not only that we can destandardize our language, but that we must.
Part I: The Creation of the Western Standard
In order to understand how and why we must counteract standardization, it’s necessary to have a baseline understanding of where the modern “Standard Language” came from. I imagine that the average person has a rather neutral, harmless, perhaps even positive view of the Standard Language; however, this is far from reality. While this section may seem a bit heavy on exposition, believe me, it’s non-exhaustive.
Western Standardization
While you may be familiar with definitions or spellings changing over time, this process is commonly imagined as a series of piecemeal adjustments to an ancient standard that has existed since time immemorial — a mechanic tweaking a car over the years. However, standardization is not the default state of languages. “Ancient Greek”, as many refer to it, existed as a cluster of dialects, none considered more correct or proper than the others, until the emergence of Koine Greek in the 4th century BC. Language negotiation is present as far back as the period of contact between Sumerian and Semitic languages such as Akkadian. The English language didn’t standardize until the 15th century, when the printing press permitted the proliferation of the Chancery Standard — a version of English used by the bureaucracy in documents intended for the king — beyond the walls of Westminster.
English wasn’t alone in this endeavor. The Spanish language underwent its first form of standardization in the 13th century under King Alfonso X of Castile and León. Though other standards arose at the same time, Castilian won out in the end. Italian was crafted from the Florentine Tuscan dialect employed by Dante Alighieri in the writing of the Commedia in the 14th century. The reach of this new standard would remain limited until the passing of the Coppino Act in 1877, which made schooling compulsory throughout the newly unified Italy. Prior to this, Standard Italian had mainly been a written language, spoken by practically no one. The French language emerged out of local northern dialects and saw instances of standardization in the form of several dictionaries and grammars throughout the 16th century — including the Ordonnance de Villers-Cotterêts (1539), in which François I declared French the official language of the Kingdom of France. German wouldn’t see standardization until the turn of the 20th century, when the Second Orthographic Conference drew inspiration from the works of academics Wilhelm Wilmanns and Konrad Duden in order to form one language for “all German-speaking states”.
In the middle of the 17th century, the Peace of Westphalia contributed to the emergence of many modern European states. With the groundwork laid for this new nationalism (defined by uniting the aristocracy and government of a place), the goal of these nascent nation-states shifted: generate social, political, and economic capital by convincing citizens to contribute through the manipulation of their loyalty.
Linguistic Nationalism
As I detailed in my post on conlang ethics, languages are inextricably linked to the culture of their speakers. This is true for vocabulary, such as the word “candidate”, which comes from the Latin “candidus” (shining white) because of the white togas that candidates for office wore. We also see it in phrases which reveal differences in perspective — such as asking someone what they “do for a living” versus what they “do for work”. These kinds of cultural facets are often the most salient features that the average person uses to distinguish between dialects. A dialect may have a unique term for geographical features characteristic of that locale (holler, in Appalachian English — meaning a small sheltered valley, usually with some water source), or terms that reveal historic interactions (agbada, in Nigerian English — meaning a loose-fitting robe worn by men).
When a standard language is created and propagated by a State, it is removed from the context in which it co-evolved. Instead, these organic influences are replaced by the values and perspectives of the State which has co-opted it, and the language becomes a mechanism for spreading the State’s socio-political culture. The first step of this strategy involves the formation of regulatory bodies. The trend started with the 1583 founding of the Accademia della Crusca in Florence, which went on to publish the standard Italian Vocabolario in 1612, based largely on the work of Dante. Inspired by the Italians, the Académie Française was founded under King Louis XIII in 1635 and granted legal recognition in 1637 to “give certain rules to [our] language and to make it pure, eloquent and capable of treating the arts and sciences”, where “rules will be issued for spelling that will be imposed on all”. Prompted by increasing French influence, the Real Academia Española followed a similar trajectory a bit later, in 1713. To this day, the latter two institutions make regular attempts to assert their dominion over their respective languages. In the case of the Académie, their battles include gender-inclusive language as well as North and West African influence. Similar disputes have been raised by the Real Academia, particularly regarding gender inclusivity.
Once these regulating bodies were in place, the next step was to minimize local languages. This was explicitly called for by the bishop and revolutionary Abbé Grégoire, who in 1794 insisted that the “annihilation of patois” was necessary for the universalization of French. Similarly, the Italian left-wing philosopher Antonio Gramsci expressed that Italians should be willing to abandon their “dialects”, lest they be left behind by modernity. This would prove to be a driving philosophy in the centuries that followed, as the West sought to modernize. Though State-run public schooling was driven largely by the class conflict that followed in the wake of several revolutions, it was swiftly re-tooled as a means of statecraft. As historian James B. Collins states:
“We see in this history a transformation of literacy: from a plurality of scriptal practices embedded in a commonplace working-class culture of political dissent, to a unified conception and execution, centered on the school, with deviations from the school norm attributed to deficiencies and deviations in working-class homes, communities, and minds”.
He goes on to explain how this “universal literacy” becomes synonymous with literacy itself, despite the persistence of other literacies (perhaps I’ll write a post on Freirean literacy in the future…). This reined in the revolutionary attitude, re-establishing power in the State.
In a similar vein, Dr. Stephanie Hackert discusses (in her article Linguistic Nationalism and the Emergence of the English Native Speaker) how language became the single “constituting element of national belonging”. Essentially, the concept of a “native speaker” functioned to validate being born into the citizenry of a nation-state. In order for this to work, States must adhere to what is called a “monoglossic language ideology”, where, in order to maintain the veneer of monolingualism in a society, instances of multilingualism are ignored or punished. Standard languages came to be used as a litmus test of belonging to the nation — since the State culture was largely artificial. In their article Undoing Appropriateness: Raciolinguistic Ideologies and Language Diversity in Education (2015), professors Nelson Flores and Jonathan Rosa discuss how this construction of the native speaker is often racialized. This categorization led to Irish, Scottish, and Welsh English speakers being treated as second-class citizens due to their non-standard Englishes. Dr. Tomasz Kamusella demonstrates the inverse of this idea in the context of Central Europe, where the newly standardized German language was used to stake claims concerning which nations “should” be included in the German nation-state. In his doctoral thesis, Dr. Pontus Lindgren Ciampi demonstrates a similar 19th-century state-building strategy in Eastern Europe through the example of Serbia and Bulgaria. Even today, the French state rejects the “arabization” of the French language through words like wesh or kiffe, ignoring older Arabic loanwords such as algèbre, alcool, or magasin.
Linguistic Imperialism and Linguistic Violence
This strategy was integral beyond the bounds of Europe as well, as newly centralized Western states pursued new sources of capital through colonization. In the U.S., there has been a history of linguistic violence from the country’s inception — including cutting off the tongues of enslaved Africans who spoke their first languages, the restriction of and punishment for using Indigenous languages in boarding schools, and the forced implementation of English as the language of instruction in Puerto Rican schools.
The Welsh Not, a physical token of shame for those who spoke Welsh in some schools, mirrored practices in other, later British colonial holdings. The use of English in colonial institutions established it as the language of prestige and, later, of mobility — mirroring the establishment and proliferation of standard languages by the aristocracy a few centuries earlier. Along with the languages themselves, European colonization also exported concepts of language standardization.
Violence also took the form of crafting writing systems for African languages that manipulated those languages’ sounds, in the hope of eliminating sounds unfamiliar to Europeans — sounds they had deemed “less than human”. Since standardized languages were adopted by States as vehicles for spreading their ideologies, their use as tools of cultural genocide, which became rampant in the age of imperialism, is predictable. This system wasn’t unique to the West either, as evidenced by the use of “hōgenfuda” dialect cards in post-Meiji Japan — particularly in Okinawan schools in the Ryukyu Islands — to shame speakers of non-Tokyo dialects.
This kind of violence didn’t end with the 20th century; it continues to this day. The European Union receives criticism for its exclusion of minority languages and selective preference for specific, standardized languages — the Council of Europe’s European Charter for Regional or Minority Languages expressly excludes “dialects” of official languages and the languages of migrant communities. Though the United States lacks an official language, many attempts have been made to render English the official language of the federal government — including in 2005 and 2021.
The appropriation and misuse of AAVE (African American Vernacular English) continues a long tradition of decontextualizing Black speech and canonizing it into Standard American English. All the while, Black Americans face racist, material disadvantage for speaking in a manner regarded as “less professional” or “less intelligent” than the Standard. Latinx communities in the United States face a similar discrediting when Chicano English is dismissed as “incorrect English”, and simultaneously as “incorrect Spanish”, rather than recognized as a distinct dialect. Casual hatred of the discourse filler “like”, or of colloquial quotatives like “to be like” — which originated in the Southern Californian sociolect “Valleyspeak” — is regularly employed to discredit women. This tactic is also used against vocal fry, which can be found in other languages as a phonemic feature. (Known as “laryngealization”, creaky voice is used to distinguish the meanings of words in languages such as Jalapa Mazatec, as well as Danish.) Southern American English, Appalachian English, and various other dialects are regularly associated with a lack of intelligence or naivety, due to their connection to working-class and/or rural communities, and a similar tendency exists in the United Kingdom with regard to its regional and working-class varieties. Despite innovating terms such as “clock”, “slay”, “shade”, and “tea”, queer and trans communities face regular policing of the words they use to describe themselves.
Part II: Understanding Destandardization
So now that I’ve established the history of Western language standardization, and the harm born of it, I can present the alternative — destandardization.
What Is Destandardization?
In my experience, people’s initial reaction to the concept of destandardizing is to jeer. “What, so we should just let anyone spell anything the way they want?” This misunderstands destandardization as a series of individual acts of linguistic rebellion, rather than a systematic shift in our perception of and relation to language. Standardization IS a system. When the language of a select demographic of people is established as the “right” form of the language, said demographic will be afforded an advantage withheld from others. In the case of the United States, maintaining white, middle class American English as the standard means that white, middle class Americans are more likely to have already encountered the language of media, of education, of the workplace, of politics — of society. Meanwhile, speakers of other Englishes are left to learn a second form of their language all while being chastised for the version they were raised with.
If standardization serves to spread and reinforce State power, destandardization is the movement to remove the control that the State wields over the use, perception, and validity of language. We weaponize this State-backed power all the time. An interesting challenge to pose to someone whose gut instinct is to defend standardization is to point out that in order to “correct” someone, you have to understand what they meant. When someone says “Where are you at?” and you reply “You mean, where are you”, you’re not conveying that you didn’t understand them — you clearly did — because you were able to tell them what they “should” have said. This adherence to the Standard serves only to position you above them.
A core tenet of destandardization is the communicative principle of language, which emphasizes the ability to communicate ideas, expressions, or thoughts as the primary function of language. As a result, language can’t be deemed “right” or “wrong”; rather, it can be deemed “successful” or “unsuccessful”. This is the systemic shift in priorities needed to create changes across the board. It leads to more effective communication across dialects and accents — and eliminates the preferential treatment of any one.
(Interestingly enough, when this principle is applied and State influence is diminished, mutual intelligibility becomes one of the key markers of the language/dialect distinction. This can really rock our perception of languages, as demonstrated in NativLang’s video on asymmetrical intelligibility, and Name Explain’s video on mutual intelligibility between Scandinavian languages.)
What Does It Look Like?
What might destandardization look like in practice? The classroom — the domain I’m most familiar with — is rife with opportunities to destandardize. A child telling a story uses the word “chile” (AAVE) rather than “child” (“Standard”), or incorporates “güey” (Chicano English) to refer to a friend. Whether we reinforce or dismantle standardization comes down to how we respond to these scenarios.
Saying “Say child instead” or “Don’t use that, that’s not English” invalidates and deprioritizes the student’s language and culture for the sake of the Standard. If there’s any confusion, approach it like you might any linguistic confusion: “I’m sorry, I’m not familiar with that term, what does it mean?” or “Some English speakers use ‘chile’ instead of ‘child’. It’s similar in many ways and different in others.” This can be applied to grammar as well — communicatively, there’s simply no need to correct “Where are you at?” or “I ain’t had no dinner”.
You may have been on board so far, but what about writing? This is where many people have a hard time wrapping their minds around destandardization, because “correcting” is so ingrained in our reading brain. Look no further than the comments section of a YouTube video, or the replies to a tweet, and you’ll see several snappy “*you’re” and “*there” corrections. Despite being among the most normalized targets of correction, homophones (words that sound the same but are spelled differently) are some of the most glaring sirens of the absurdity of standardization. For many English speakers, there is little if any functional difference in the pronunciation of their, there, and they’re — and even less between to, two, and too. Yet we don’t normally struggle to ascertain which is meant in speech. I’d argue that this is similarly true in writing. If the message is communicated, why “correct”?
What About…?
Now that the idea is out there, I can address some common critiques of destandardization.
What if I don’t understand what they said?
This is certain to happen. Truly, it’s a wonder that we’re able to communicate at all, ever. Language is a massive game of telephone in which speakers try to externalize their thoughts to others through limited biological (vocal cords, manual gestures, and cavities) and tactile (writing implements) means. Not only that, it is done across space, across time, and across contexts — complicated by social norms and expectations. You WILL miscommunicate. The goal is not to avoid miscommunication; it’s to mitigate the effects of a miscommunication. Your objective isn’t to establish dominance over another speaker by flexing your employment of the arbitrary standard. Rather, it’s to understand and be understood.
When what you mean is “I didn’t understand what you said”, say just that. “I’m not familiar with that term” or “I’ve never heard that word before” both demonstrate your willingness to learn from others and establish that you are not looming over them. This happens within dialects all the time; it’s not too different cross-dialectally. When encountering a non-standard spelling, “I think you spelled that wrong” wields the Standard against your interlocutor. If they pronounce the word that way, is that spelling wrong for them? Instead, “I’m unfamiliar with this spelling, what are you referencing here?” indicates that the goal is to understand, and that you do not believe your dialect to be better or more “right” than theirs.
Flores and Rosa, mentioned earlier, define this outlook as “additive approaches” which “promote the development of standardized language skills while encouraging students to maintain the minoritized linguistic practices they bring to the classroom”.
What if I’m an academic, or a professional?
The contexts I’ve mentioned so far have mostly been classroom settings, but how does destandardization work elsewhere? For example, I’m writing this article in what many would consider Standard American English — why don’t I write it in my own dialect?
Throughout history, lingua francas (common languages) have been established for the purposes of commerce, diplomacy, and other intercultural interactions. Though people may retain their own language in their everyday lives, or even in their communities, an additional language may be used to facilitate communication across spaces where many languages are spoken. While some may claim that standardized languages serve this purpose for dialects, this is not usually the case. As we’ve seen, standards elevate the language of the elite at the expense of others. This happened first in proto-colonial contexts within France and the British Isles, and again later in (post-)colonial contexts, where a Standard was imposed. If a standard is imposed in this way, erasing local languages rather than coexisting with them, how could it ever operate as a sort of lingua franca?
Co-Creation
As we’ve covered, the main issues with Standardization are 1) it operates contrary to the fluid, evolving nature of language, and 2) it implicitly tends to reinforce State hegemony through imposition. Would it be possible for standardization to occur without these two problems?
In the 1970s and 80s, the opening of a new school for deaf children in Managua attracted students from many different areas. While these children primarily used a variety of different home sign systems, they quickly co-created a new, standardized system known as Lenguaje de Signos Nicaragüense (LSN), which later evolved into Idioma de Signos Nicaragüense (ISN) as new students arrived and acquired it.
Lorenzo Dow Turner, alongside the work of many others, demonstrated how the influences of several African languages spoken by enslaved Africans trafficked to the East Coast of the United States were standardized into Gullah — a language still spoken by Black communities along the coast today. A similar process occurred between French and several Niger-Congo languages in Ayiti, becoming Kreyòl. Turner would go on to be known as a seminal figure in dialectology, creole linguistics, and African American studies.
Closed Practice
A significant point is frequently raised concerning destandardization in the context of Indigenous and Minoritized languages. On one hand, the creation of standardized forms for the purpose of revitalization is often criticized for decontextualizing Indigenous languages — subjecting them to a Western-style language-as-resource treatment that co-opts them. In many cases, Indigenous languages carry Indigenous knowledge, whose dissemination is meant to be closely guarded by specific community members. However, some Indigenous activists and linguists argue that without standardization, Indigenous languages are deliberately kept subordinate to the influence of standardized Western languages.
Here, I return to the emphasis that co-creation must be a communal practice. The standard should not be imposed. In (post-)colonial contexts, this means that settler communities should not dictate the standardization status of Indigenous or Minoritized languages. A lot of advocacy is being done for models of learning Indigenous and Minoritized languages in ways that do not strip them from their context or commodify them — including this article by linguists Susan Chiblow and Paul Meighan.
Applications
As stated earlier, the implementation of standardized languages carries in itself a cultural violence. Today, however, in the United States, we’re seeing that cultural violence morph into physical and structural violence in several areas as the State weaponizes language further and further: ICE has arrested US citizens for not speaking English; a college has threatened to fail students who include pronouns in their profiles; English has been made the sole official language of 28 states, despite the United States having no official language. Queer, trans, and non-conforming people continue to have the way they speak about themselves and each other policed and punished by the State.
Destandardization not only serves to defang the State’s weaponization of language but also to facilitate productive communication — where we see each other and hear each other as we are instead of how we’re told we should be. Paulo Freire’s book Pedagogy of the Oppressed is an excellent read for anyone interested in understanding how the State sees and defines “literacy”, and to what end language and education can be used to empower participation in society and self-emancipation.
~Jon