The life and death of words 20.V.2009 14:29
Many people - from Sapir and Whorf, to Orwell, to Wittgenstein - have argued that the language we use determines the way we perceive reality; to quote the Austrian philosopher, 'the limits of my language are the limits of my world.' In reality, however, neither the prescriptivist view of a pure, ideal language, nor the more common view of language as a natural phenomenon changing like the tide, is really accurate. Language is, instead, best viewed as an evolutionary phenomenon. Evolution - just, supposedly, a theory - is one of the most powerful ideas available to us for understanding the world. Its force extends far beyond its traditional domain of biology - any phenomenon that exhibits heredity is best viewed evolutionarily, and one of the most important of these is human culture, passed, with some mutations, from parents to their children. Our view of language tends to be static and permanent, but it is evolving, and not just in the sense of 'changing incrementally' - it is subject to a whole host of evolutionary pressures external to the language itself. Yet not one of the books I've read on linguistics mentions evolution at all - and this, I think, is a mistake.

Take the fate of the word 'niggard,' meaning, roughly, miserly or cheap. Unfortunately, it also sounds very similar to a word that's no longer acceptable in polite society. Even though it's a fairly high-status word only likely to be used among people with good vocabularies, it's simply too dangerous to take the risk, as the Washington, D.C. aide David Howard found out in 1999, when he was barraged by criticism and hate mail for using it. The word, for all intents and purposes, is dead in the English language, because whatever semantic need it may have filled, it has been brutally superseded by the need for political correctness - and, to an extent, the ignorance of the public. Just as the pressure on elephants' tusks to become longer has been brutally reversed by the ivory trade - they are now shorter and shorter in each generation - so 'niggard' has been driven, very swiftly, out of existence. The pressure is sometimes purely phonetic - in this way, the word 'immanent' has been driven out of the spoken language, because English vowel reduction has made it too hard to distinguish from the (slightly) more common 'imminent'. There were simply too many synonyms - 'inherent', 'quintessential' - and its use was too rare to make survival possible.

Sometimes, though, the source of the pressure isn't trivial, but is driven by a genuine change in the society's ontology - broadly, the kinds of things it needs to describe. This is evident not only in uncontroversial words such as 'webpage' and 'internet', needed to describe concepts that simply weren't important decades ago, but also in that bugbear of prescriptivists, political correctness. Take, for example, the singular 'they', as in 'would everyone please make sure they have their luggage?' To be honest, although I probably ranted against it in high school English classes, I've made my peace with this: it reflects a genuine change in the way society perceives itself, namely, that context rarely suffices to determine gender. In ages past, you'd pretty much know whether a woman or a man was being talked about based on, say, their profession, but now - with the exception of a few careers, such as CEOs or serial killers - either gender is likely enough to participate in any of them to need generic reference. Because only a single word in English indicates gender, the pressure to preserve gender reference is quite low, so the singular 'they' has made steady progress. This isn't to say all the proposals introduced by political correctness reflect our movement to a more equal society - I hope never to hear an absurdity like 'herstory' in conversation, and I'm fairly confident I never will. The point is, there was a gap between how we perceived the world and how we described it, and something, in this case a reuse of the plural pronoun for the singular, came in to fill it. In a similar way, 'you guys' has become the second person plural pronoun, to alleviate the evolutionarily disfavoured ambiguity of 'you' that evolved out of medieval politeness.

In addition to these relatively 'natural' processes, there's the more insidious one railed against by Orwell in his famous 'Politics and the English Language': favouring the survival of certain terms by manipulating the influential pressure of the printed media. The most recent example of this is the abominable 'enhanced interrogation' - torture, to any generation before the current news cycle. Its sadly successful creation was prompted by the Bush administration's desire to avoid the negative connotations of the word 'torture', as well as the Geneva Conventions. What's sad is that it probably couldn't have gained currency without some support in society - broadly speaking, the belief that there is a real difference between when We do it and when They do it; We're the good guys, so it must have been somehow justified. This is far from a new phenomenon - Ambrose Bierce defined impiety as 'your irreverence for my deity' - and the fact is such words reflect prejudices in (in this case, American) society whether we want to believe they're there or not. As a matter of fact, many languages have a kind of division of synonyms to indicate how distantly we identify with the subject, as I read recently in a Steven Pinker book: I'm slim, you're skinny, he's scrawny; I'm uninhibited, you're promiscuous, she's a slut.

Moreover, when such neologisms don't reflect a corresponding change in society, they tend to be overwhelmed by their previous meanings. Take the endless failed PC efforts to come up with new words for people with disabilities: 'crippled', 'invalid', 'disabled', 'handicapped', 'differently-abled', and whatever else they might have thought up recently. The reason for each new word is that the previous one, whatever its PC origins, came to take on a meaning that reflected societal perceptions of disability as something negative; no matter how many positive overtones you add, this is inevitable, which is why, thankfully, language change by edict almost never works. Similarly, when a negative perception disappears, we often don't need the euphemism anymore, which is why 'African-American' hasn't survived - when we say 'black', we don't mean anything racist by it, even if previous generations did; the long shadow of racism, however, ensured the eclipse of well-meaning words such as 'coloured' or 'Negro', the former still extant in the NAACP.

Wittgenstein may have thought that the limits of his language were the limits of his world, but the fact is that when there is a disagreement between the two, it's the language that tends to change. The 'eskimo-words-for-snow' myth is well debunked, but Inuktitut place words are a more instructive case. Instead of simple words like 'here' and 'there', it has much more precise ones like 'up there' and 'down here by me' - exactly the kind of concepts you'd need to communicate quickly when hunting in a landscape with few landmarks. The Sapir-Whorf hypothesis said a society's thought is shaped by the language it uses, but the opposite is at least equally true - our language reflects our society, and when society changes, words, or phrases, or syntactic structures that are no longer useful die out, and new ones are born to take their place. What's fascinating is how seemingly irrelevant and chaotic the process can be, as in the case of 'niggard'. Dawkins argued that there was probably no case for an evolutionary theory of language, citing such examples as whether certain sounds carry better in the Himalayas and so shaped Tibetan, which is almost certainly not relevant. But in the case of lexicon, and syntax, and sometimes phonetics too, evolution remains the best tool we have for understanding this quintessentially human phenomenon.

Sana'a, Yemen

Languages and dialects 14.XI.2005 05:06
The most intelligent thing ever said on the subject of languages and dialects remains Max Weinreich's comment from 60 years ago: 'a language is a dialect with an army and a navy.' Nevertheless, linguists and non-linguists alike devote an inordinate amount of time and print to discussions of whether this or that language is actually a language at all, seemingly oblivious to the fact that such a distinction teaches us nothing at all about linguistics. I think that this is one manifestation of the tendency towards overclassification that is present in dominant modern thought: we want to draw neat, sharp lines around everything, and always be able to say that it's one or the other.

The dialect/language issue is important enough that Wikipedia advises that articles on Chinese languages be appended with the generic label 'linguistics' so as not to offend anyone. But let's look at a particularly interesting example in the language/dialect debates: Hindi and Urdu. Now, I can pretty much guarantee that the majority of educated speakers will insist that these are separate languages with separate traditions, but this is clearly not the case. The problem is that too many linguists have accepted the very political divide of language vs. dialect, into which Hindi and Urdu don't fit at all.

Picture this situation in English. There are two towns, and in everyday matters everyone speaks the same language: with family, with friends, in the shops, the casual language is basically the same. However, sometimes in life the situation calls for an 'elevated' level of speech, with varying degrees of pretentiousness. So, in Town A, when people want to speak dramatically, they use words of Latin origin, as Anglo-Saxons have done since the Battle of Hastings when trying to sound intelligent. But the residents of Town B are purists: they think English should never have abandoned its natural Germanic origins, and continue to create and appropriate Germanic words, considering these to be the best-fitting and best-sounding ones for the language. Additionally, while Town A has adopted the Roman alphabet, Town B continues to use a variation on Germanic runes.

So, what do we have here? I sincerely doubt we have two languages - these people sound exactly the same in most everyday situations. And yet it's not just a matter of dialects - the differing elevated vocabularies aren't a matter of different evolutions of the same thing, but wholly divergent traditions. This is essentially the situation with Hindi and Urdu - collectively, they form the Hindustani language, but while Hindi takes its educated vocabulary from Sanskrit, Urdu's comes from Persian and Arabic, very much like what happened to English at the Norman conquest. Yet Hindi's commonplace vocabulary comes in good part from Persian and Arabic as well - Bollywood films, for example, are in a variety of common Urdu which is perfectly natural and comprehensible to residents of Delhi. Because of the unnatural evolution of these languages, the language/dialect divide ceases to have meaning.

As is so common nowadays, instead of admitting the limitations of the terms 'language' and 'dialect', linguists created a baffling new term - 'diasystem' - to describe the special case of one language with two different standardisations. This is ridiculous. And yet we see it throughout modern science - the debates over species and subspecies in biology, for one. I could talk for hours about the philosophical origins of such thinking, but I'll save that for another day - the point for the moment is that these artificial divisions take away from real scientific pursuits. Everyone knows evolution, be it linguistic or biological, is a fluid process - and yet when we look at contemporary matters, we want a perfect division, as if there were some microsecond when a dialect became a language, when a subspecies became its own species, where we can draw a line in the sand. These lines in the sand are a cancer on scientific thought, because they refuse to admit that it's all relative. A language is defined by the languages which have affected it, not by whichever artificial category we're best able to fit it into. But alas, fuelled by science on the one hand and politics on the other, it seems this kind of forced divide has many long years ahead of it.
Colorless green ideas sleep furiously 25.X.2004 21:07
PHIL 355 (Theories of Reality aka Metaphysics) is definitely the most worthwhile academic endeavour in which I've ever taken part. It just seems like, for once, I'm taking a course that actually discusses, and forces me to think more rationally about, things which are important to me. Now, obviously, to most people in the world, metaphysics is not exactly up there in terms of concerns, but to me, nothing could be more important, at least from an intellectual-analytical perspective. I really do want to try to understand the world as best I can, and metaphysics deals with the most basic level of this: what it means for things to exist. Almost all the sciences have tacit metaphysical assumptions that they never explore, because they are simply widely assumed, based either on common sense or observation. However, it seems to me to be at least equally important to consider, and justify, these assumptions, and this is exactly what metaphysics is concerned with doing. I recommend anyone even remotely interested in philosophy take a metaphysics course. It's the foundation upon which all else is laid.

Specifically, I'd like to write about a problem we've discussed in class (it was on my recent midterm, which I think went pretty damn well), this being: is 'x exists' a predicate? This is mostly intended as a rebuttal to the ontological argument, but it has a number of interesting side consequences when it comes to talking about things from a formal logic standpoint.

First of all, we must see what we are really saying when we make a predicate statement like 'the Prime Minister of Canada is uncharismatic,' which he most surely is. Logically, what we are saying is 'there is something that is the Prime Minister of Canada and uncharismatic, and nothing else is the Prime Minister of Canada.' This sort of formulation, although farther from the natural language equivalent, saves us from having to dismiss false statements as nonsense. For example, if we say 'the present King of France is bald,' on the naive reading we cannot say that this statement is true or false, but are forced to dismiss it as nonsense. However, if we recognise this statement for what it really is, 'there is something that is the present King of France and bald, and nothing else is the present King of France,' we can easily see how this statement is false: there is nothing which is the present King of France.
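
As a rough sketch in first-order notation (the predicate letters are just my own shorthand, nothing official from the course or the textbook), this Russellian reading looks something like the following:

% 'The Prime Minister of Canada is uncharismatic'
% P(x): x is Prime Minister of Canada; U(x): x is uncharismatic
\exists x \, \big( P(x) \wedge U(x) \wedge \forall y \, ( P(y) \rightarrow y = x ) \big)

% 'The present King of France is bald'
% K(x): x is presently King of France; B(x): x is bald
\exists x \, \big( K(x) \wedge B(x) \wedge \forall y \, ( K(y) \rightarrow y = x ) \big)
% The second sentence comes out false rather than nonsensical,
% because nothing satisfies K(x).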

In this case, though, what can we do about existence as a predicate? Let us take an example from my textbook. We define a bachelor as something which is an unmarried male, and a superbachelor as something which is a bachelor and also exists. The question is: have we added anything to the definition of the former by positing existence as a necessary trait? Note that this is one thing that supporters of the ontological argument, from Anselm to Plantinga, have said of God: any maximally perfect being must have existence as one of his attributes. Thus they define God into existence, and existence being a predicate would allow them to do that. But is existence really a predicate?

If it is, then a problem arises when we try to make statements about non-existence. When we say 'x does not exist', this is a statement of the form 'subject-predicate', which is to say, logically, 'there exists a subject such that the predicate holds of it', which would lead, in our example, to 'there exists an x such that x does not exist,' which borders on nonsensical. There are a couple of ways to dance around it, most notably wading into the murky realm of 'unactualised possibles', objects that could exist, but don't. So, saying 'x does not exist' is really saying 'there exists an x such that the existential possibility of x is not instantiated.' However, such evasion becomes impossible when dealing with truly impossible objects.
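
To see the problem in the same rough notation (E here is a hypothetical existence predicate of my own, not anything from the textbook):

% Treating existence as a predicate E, 'x does not exist' must be read as
\exists x \, \neg E(x)
% i.e. 'there is a thing which lacks existence' - the borderline-nonsense reading above.
% The 'unactualised possibles' dodge keeps this form but reinterprets E as
% 'is instantiated', so that some members of the domain are merely possible objects.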

The title of this post comes from Chomsky, being his famous sentence which he used to show that impossible sentences do, in fact, make a kind of sense to us as long as they're syntactically valid, even though they refer to nothing in the world, or even to anything possible. Similarly, we could discuss something like, to use Quine's example, 'the round square cupola atop Berkeley College.' Needless to say, there is no such thing: most of us would agree that it is true to say 'the round square cupola atop Berkeley College does not exist.' Yet this is not an unactualised possible, it is a plain impossibility. Thus, there must be some logical way of stating that it does not exist.

The solution, of course, is that existence is not a predicate. To say 'x exists', and consider it a subject-predicate statement, is as ridiculous as saying that in 'unmarried bachelors are unmarried', the predicate portion adds meaning to the proposition, which it obviously doesn't. This lends coherence to statements about incoherence: it is easy to see how the statement 'there does not exist an object which is round and square and a cupola and atop Berkeley College' can be plainly true. Thus, it seems pretty logical that existence is in fact not a predicate, and thus the ontological argument, or at least one of its central moves, fails. If anyone's still reading at this point, give yourself one gold star!
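
Put in the same rough notation (R, S, C, A are again my own shorthand for 'round', 'square', 'is a cupola', and 'is atop Berkeley College'), the negative existential becomes:

% 'The round square cupola atop Berkeley College does not exist',
% with existence carried by the quantifier rather than by a predicate:
\neg \exists x \, \big( R(x) \wedge S(x) \wedge C(x) \wedge A(x) \big)
% Straightforwardly true - no unactualised possible, and no existence predicate, required.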

On an interesting sidenote, one of the CompEng profs here at UW, who happens to be president of the Canadian Islamic Congress, made the news recently by declaring that all Israelis over the age of 18 are eligible targets for the intifadeh. His argument is that since military service is compulsory, all adults should count as soldiers of the enemy, but come on. It's not like you'd say war veterans are legitimate targets once they've been discharged. And why would you even make statements like that? To piss off the Jews? To make the government lean further towards Israel? To increase the distrust of Muslims in the west? Whom the hell do comments like that help? Nobody, that's who.
Art, language, and beauty. 24.VI.2004 00:23
To see the world in a grain of sand,
And Heaven in a wildflower,
Hold infinity in the palm of your hand,
And eternity in an hour.

The aim of art – of all art, but especially of the linguistic arts, of literature, of poetry – is to express something larger than the piece itself. All art aspires to this; in fact, it cannot help but do so. The skill of the artist is to be able to draw out from the audience certain associations from the artwork, to make them think of that which is not explicitly included in the work, but is included implicitly.

Ultimately, this is most true of the arts of language, simply because of the nature of language itself. Language acts as a web, each word associated with dozens, even hundreds of other words, each signifying something else, the totality of the associations making up the primary word’s definitions. At its core, the art of language is to craft a sentence of these associations, and take their aggregate as a larger meaning. Every word in a sentence, or in a poem, or in a book conjures the spectre of a thousand others, to varying degrees, and it is these which form the experience of the book.

This interpretation can be shown to lead to the varying traditions of literary criticism. In classical criticism, appreciation of literature is directly linked to education. This school of thought assumes there is, at the apex of literary learning, a common language, a common set of interwoven associations, superior to any other, which can be taught to those willing to learn. The ultimate in language-craft, then, is to write in the way most reflective of these associations, and anyone who studies literature sufficiently will be able to arrive at the same conclusion. A common language can exist, which will lead to commonality of writing and commonality of interpretation.

This is, of course, to some degree true. Much of the body of literature is self-referential, and ignorance of the foundations of literary tradition has led to much of the reduced literary appreciation in modern society. It is impossible to appreciate much of even Shakespeare’s work without being familiar with the thousands of years of literature which preceded him. A single word can refer to another passage in another book, or even to the entirety of an epic and all associated with it. The title of a book or a character therein has a greater multiplicity of linguistic associations than almost any other word. Many of these remain engrained in our consciousness, and the meanings of authors who employ them are preserved: there are few unfamiliar with the symbolism of Adam and Eve as primeval humans. What, though, of Antigone, or of Ariadne? And what of noble Oedipus, reduced to a mere Freudian metaphor from a once evocative name standing for virtue, self-sacrifice, and the ultimate futility thereof?

As associations of language die, so does our power of interpretation of the works that employ them. This is true not only of literature, but of painting and sculpture as well: how can one feel the power of Leda and the Swan, or Laocoön and his Sons, without knowing the stories from which they flow? However, it cannot be said either that a commonality of associations could ever be arrived at: even those most familiar with these stories will have associations different from one another, and no amount of education or study can remove these. There is too great a diversity of stimuli in the world for a truly common language to be shared between even two people, let alone an entire society, let alone divergent societies spanning millennia.

It is from this realisation that the post-modern tradition stems: that the reader is as integral a part of the art as the writer, if not more so. This claim has a fairly reasonable foundation: upon reading a book, its totality of associations emerges from the reader, not the writer. It is therefore reasonable to assume that, since each person has a language of associations different from that of any other, the perception of art cannot but be an individual experience. It would follow that no opinions, no true criticisms, can exist of art, because there is no commonality, no correctness, on which criticisms could draw.

The fallacy in this argument can, again, be exposed through the examination of the nature of language itself. It has been said, and it is true, that language is individual, and cannot be identical for any two people. It is easy to postulate, from this, that language is utterly independent for each person, but this is not true either. If a private language such as this existed, there could be no communication at all, yet it is obvious that communication of associations exists: how else could one follow directions, or comprehend the events of a story? Though language cannot be identical, it has to be similar for it to constitute a language at all: it must carry a commonality of symbols and associations with high (but not total) degrees of similarity. It is because language functions as a web of associations that two people may speak languages which are not identical, but are the same. Of the billions of associations in one’s mind, millions may be disparate, but the commonality is still overwhelming. Language is a societal phenomenon with individual inflection: its total definition is individual, but its general definition is common. Thus neither the classical nor the post-modern interpretation can be correct: the truth lies in between, with both learned associations and synthetic ones being indispensable.

It is in understanding this, and in the failure to understand its implications, that much of the success and failure of modern art lies. The success, of course, lies in its popularity: modern art has proven to be largely popular with those who are not mired in the traditional definitions of art, and it is natural that this should be so. This is because modern art is devoid of association by its own nature; it presents images that hold no common associations for society, and are vague enough that there is nothing with which they cannot be associated. Thus, the artistry lies entirely with the audience: their associations are the only ones projected onto the work, and none of the artist's. The art would be equally beautiful if generated from the void, or merely imagined, or independent of any society, simply because it is the viewer whose associations are presented. And it is for this reason that modern art succeeds: anyone approaching modern artwork openly will find it speaks to him, but only because he is speaking to himself.

Obviously enough, this vacuum is also the failure of modern art. Art is created, and the process and thought of creation is integral to it. Modern art, being devoid of association, fails even to be an expression: it does not express anything greater than itself, because it itself is nothing. The expression is that of the audience, and only of the audience, and thus the art itself aspires to nothing or, at the very least, fails to achieve to any degree any aspirations it may have had. This sort of art is not interpreted, but merely projected upon. To praise its ‘artists’ is like seeing a painting, then praising the maker of the canvas for its beauty. The truly brilliant masterworks of art are ones which conjure a plethora of associated concepts with a minimum of expression, yet are never so incomprehensible as to force the audience to create the entirety of the meaning. A wide and subtle web is woven by high art, as opposed to a coarse and small one, or none at all.

For the most part, art is created which speaks to a society’s natural associations: ones which are acquired by most of the potential audience simply through their experience, or by a commonality of education, as is the case with mythological reference. However, most ‘revolutions’ in art are said to occur when a new set of associations is introduced. This was the case, in painting, with Picasso or Dali, whose visuals were evocative enough to create associations as rich and multifarious as those of any Renaissance work, yet had enough seeming familiarity, though one could never quite put one’s finger on it, that they were not devoid of interpretation. In literature, a perfect example of such an ambition that overleapt itself is Joyce’s Finnegans Wake. Never has there been a book that so infuriatingly exposed the ignorance of its reader as compared to its writer. Yet, while truly revolutionary in the scope of the evocation of its words, it fails in the respect that one cannot grasp even the majority of its associations without happening to be James Joyce. The phonemic association, while instinctive to a degree in terms of cognate words, is utilised to a point where it cannot be grasped, and, with each word evoking so many others, it loses the natural associations of a sentence, that of one word with the words surrounding it. While a revolution in terms of theoretical art, its impossibility of interpretation is what has prevented it from being as celebrated as Ulysses, whose brilliance relies on both the learned associations of literary tradition and the synthetic associations of the natural stream of thought.

Yet the aspiration of Finnegans Wake is that which is purest in art: to express that which is immeasurable in something finite. This, too, is its failure: it expresses but fails to be interpreted, because its web is so subtle, and so delicate, that to master some associations is to lose sight of, and break, others. While art must always aspire to that which is greater than itself, mere quantity of expression is not what makes great art succeed. There must be a balance of creation and interpretation, an understanding of one's own linguistic associations, of one's society's, and of one's potential audience's. Thus, through the power of certain words, which evoke untold thousands of others, can heaven in a wildflower be seen.