
Article/Chapter Review: Introducing English Linguistics

Written for an Applied Linguistics Master’s Course: Introduction to Linguistics

 

Introduction

     This section of Meyer’s (2009) sixth chapter details the lexical semantic methods used by lexicographers and semanticists. We are first presented with the differing methods, approaches, and motivations of the two groups. Kay (2000) raises concerns about relying solely on lexicographers’ methods, arguing that because their main goal is to sell dictionaries, the ways they define word meanings (1) are not based on modern semantic theory, (2) lack needed linguistic grounding, and (3) cater to a readership in ways that are not relevant to semanticists (Meyer, 2009, p. 158). Despite these differences in method and motivation, however, lexicographers and semanticists share an interest in analyzing and understanding the meanings of words.

Lexical Semantic Methods: Dictionaries and Lexicographers

     The process of making English language dictionaries introduces us to various methods used in lexical semantic studies. The main focus here is the construction of the monolingual English dictionary, though other types of dictionaries are mentioned (bilingual and unabridged/abridged dictionaries, thesauruses, and specialized dictionaries). Lexicographers follow two basic steps to create a monolingual English dictionary: first, they study the word(s) being used in context; second, they craft definitions that the target readership can understand.

     The first step, studying words in context, has traditionally relied on collecting text samples. The earliest dictionaries depended heavily on written texts to establish word meanings. The process, as illustrated by the creation of the Oxford English Dictionary, was meticulous: readers located texts that illustrated a word’s meaning(s) and recorded those words in a detailed, specific manner on citation slips (Meyer, 2009, pp. 160-161). The text samples collected to help define a word in context had to be large because of Zipf’s Law (1932), which showed that content words appear far less frequently in texts than function words. To find enough occurrences of a content word to establish its meaning, larger text samples were needed.
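
     To make Zipf’s point concrete, consider the small sketch below (the sample passage is invented purely for illustration): counting word frequencies in any English text shows function words like the and of crowding the top of the list, while most content words occur only once or twice, which is why such large samples were needed.

```python
from collections import Counter
import re

# Invented sample passage for illustration only; any English text shows the same skew.
text = """The history of the dictionary shows that the meanings of the words
in a language shift over time, and that the evidence for a meaning must be
drawn from the contexts in which the word is actually used."""

tokens = re.findall(r"[a-z']+", text.lower())
freqs = Counter(tokens)

# The most frequent tokens are overwhelmingly function words (the, of, a, in, that...).
for word, count in freqs.most_common(8):
    print(f"{word:>10}  {count}")

# Most content words (dictionary, evidence, contexts...) appear only once,
# so a reliable sample of a content word's uses requires a much larger corpus.
rare = [w for w, c in freqs.items() if c == 1]
print(f"{len(rare)} of {len(freqs)} distinct words occur exactly once")
```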

     Though these methods were thorough, many issues have been raised about the original processes. First, the time periods chosen for gathering words, though significant, did not fully represent the development of English, and as Landau (2001) noted, the words drawn from these periods represented only the educated upper classes (Meyer, 2009, p. 160), producing a biased picture of the English language. Additionally, Labov (1973) pointed out the difficulty of determining whether a word has one meaning or several. In his work, he showed how the meaning of a word like cup can change according to how the object is perceived. Viewed neutrally, participants looking at images of various cup-like containers judged each one to be a cup. However, when the context of use changed (as in his “food” condition, where the container was said to hold food), many participants switched from calling the object a cup to calling it a bowl (Meyer, 2009, pp. 164-165).

     Many solutions have been offered in response to these issues. Labov (1973) proposed a solution to the problem of words changing meaning with context: defining words in scalar terms rather than with absolute definitions, which allows lexicographers to articulate a word’s definition more precisely and concisely in a dictionary (Meyer, 2009, p. 165). Additionally, because the method of handwritten citations requires very large textual samples, modern lexicographers have turned to digital samples from corpora such as the Collins Word Web. Databases like these provide both written and spoken English and are continually updated as new words enter the language.
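
     As a rough sketch of what a scalar, context-sensitive definition might look like (the ideal ratio, threshold, and food penalty below are invented numbers, not Labov’s actual findings), category membership can be scored on a gradient so that context shifts the judgment rather than a binary label deciding it:

```python
def cup_score(width_to_depth: float, contains_food: bool = False) -> float:
    """Return a 0-1 gradient of 'cup-ness' for a container.

    Illustrative only: the 1.0 'ideal ratio' and the 0.25 food penalty
    are invented values, not figures from Labov (1973).
    """
    # Containers close to an idealized cup shape (width roughly equal
    # to depth) score high; wide, shallow containers score low.
    score = max(0.0, 1.0 - abs(width_to_depth - 1.0) / 2.0)
    # Context matters: a container used for food is judged more bowl-like.
    if contains_food:
        score = max(0.0, score - 0.25)
    return score

def label(score: float) -> str:
    return "cup" if score >= 0.5 else "bowl"

# The same object can cross the cup/bowl boundary when the context changes.
for ratio in (1.0, 1.5, 2.0):
    print(f"ratio {ratio}: neutral -> {label(cup_score(ratio))}, "
          f"with food -> {label(cup_score(ratio, contains_food=True))}")
```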

     The second step in making a dictionary is creating the word definitions. Traditionally, most dictionaries present the same standard information for a word (spelling, pronunciation, definition, etymology). How that information is presented depends on the lexicographer’s philosophy of defining: some take an Aristotelian approach, identifying words by genus and differentia; others take a more encyclopedia-inspired approach (Meyer, 2009, p. 165). In any case, definitions are organized under headwords, the base forms of words (for example, dog is a headword and dogs is a variant of it).
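
     A minimal sketch of the headword idea, assuming a corpus tool that folds inflected forms back to their base forms before filing citations (the tiny lemma table is invented for illustration; real dictionaries and lemmatizers use full morphological analysis):

```python
# Invented miniature lemma table: inflected form -> headword.
LEMMAS = {
    "dogs": "dog",
    "dog's": "dog",
    "took": "take",
    "taken": "take",
    "taking": "take",
    "takes": "take",
}

def headword(form: str) -> str:
    """Map a word form to its dictionary headword (base form)."""
    return LEMMAS.get(form.lower(), form.lower())

# Citations for 'took', 'taking', etc. are all filed under the single entry 'take'.
print([headword(w) for w in ["Dogs", "took", "taking", "cat"]])
# -> ['dog', 'take', 'take', 'cat']
```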

     Issues have also been raised about how dictionaries define words. First, Moon (2007) argued for the inclusion of collocations in dictionaries, since certain words are used more prominently in phrases (e.g. take a step, take a long time) than in isolation (e.g. take meaning remove or steal). There are also problems for non-native speakers using monolingual English dictionaries. Restricting the vocabulary used in definitions makes it difficult to write clear ones, and technical terms make plain language harder still; as a result, non-native speakers may face definitions that are too lengthy, too technical, or unclear. To address some of these issues, lexicographers now order a word’s senses by frequency of use. For example, dictionaries now define impact first by its abstract rather than its concrete meaning, since the abstract meaning is the most commonly used (Meyer, 2009, p. 167). Corpora (e.g. the Collins Word Web) supply illustrations of words from real sources, and definitions are written to avoid circularity (Meyer, 2009, p. 165).
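
     A brief sketch of how corpus frequencies can support both Moon’s argument and frequency-based sense ordering (the corpus lines below are invented; a real corpus such as the Collins Word Web supplies millions of attested sentences): counting what immediately follows take shows its phrasal uses dominating.

```python
from collections import Counter
import re

# Invented corpus lines for illustration only.
lines = [
    "she decided to take a step back",
    "it will take a long time to finish",
    "please take a seat by the window",
    "they take a break every hour",
    "did someone take my umbrella",
]

# Count the word immediately following 'take' in each line.
following = Counter()
for line in lines:
    tokens = re.findall(r"[a-z]+", line.lower())
    for i, tok in enumerate(tokens[:-1]):
        if tok == "take":
            following[tokens[i + 1]] += 1

# 'a' dominates, pointing at phrasal patterns (take a step / seat / break...)
# rather than the isolated 'remove/steal' sense.
print(following.most_common())
```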

Lexical Semantic Methods: Componential Analysis and Semanticists

     Componential analysis, or lexical decomposition, is the semanticist’s way of defining words: breaking them down into their essential parts using abstract principles (Meyer, 2009, p. 167). This approach to lexical semantics is illustrated through three theorists. Leech (1981) used various scales to break a word down into its meaningful components: the +/- scale for binary contrasts (e.g. ±HUMAN), the →/← scale for “relation to opposition,” and the ↑/↓ scale for “polar oppositions” (Meyer, 2009, pp. 168-169). Cruse (2004) presented a system of “lexical contrasts” in response to Leech’s (1981) scalar model, which could not account for abstract words; it is a more general model of definition that derives a word’s specific meaning by comparing it with other words (Meyer, 2009, p. 169). Wierzbicka (1996) defined words in terms of “semantic primes,” building on the theoretical claim that a small set of abstract semantic features occurs in some form in all languages (Meyer, 2009, pp. 169-170). For example, the “descriptor” primes BIG and SMALL provide the scale against which size is defined.
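
     A minimal sketch of Leech-style binary decomposition (the feature matrix below is a textbook-style illustration, not Meyer’s or Leech’s exact example set): if words are bundles of +/- features, the contrast between two words falls out as the features on which they differ.

```python
# Illustrative +/- feature matrix in the spirit of Leech (1981);
# True = '+', False = '-'. Not Meyer's or Leech's exact examples.
FEATURES = {
    "man":   {"HUMAN": True,  "ADULT": True,  "MALE": True},
    "woman": {"HUMAN": True,  "ADULT": True,  "MALE": False},
    "boy":   {"HUMAN": True,  "ADULT": False, "MALE": True},
    "girl":  {"HUMAN": True,  "ADULT": False, "MALE": False},
}

def contrast(word1: str, word2: str) -> dict:
    """Return the features on which two words differ."""
    f1, f2 = FEATURES[word1], FEATURES[word2]
    return {feat: (f1[feat], f2[feat]) for feat in f1 if f1[feat] != f2[feat]}

# man vs. woman differ only in MALE; man vs. girl differ in ADULT and MALE.
print(contrast("man", "woman"))  # {'MALE': (True, False)}
print(contrast("man", "girl"))   # {'ADULT': (True, False), 'MALE': (True, False)}
```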

Questions:

  • Earlier dictionaries represented a specific class of people, and modern dictionaries can still present clarity/usability issues for non-native English users:
    • Do you think dictionaries have been modified significantly enough to represent different groups of English speakers?
    • How can we define words in dictionaries more effectively to meet the needs of non-native English users?
  • Corpora such as the Collins Word Web have contributed greatly to advancing the lexicographer’s ability to examine and define words, drawing on spoken and written texts from large, easily accessible databases.
    • In what ways would you like to see technology and/or the types of language data collected advance further to increase our understanding of lexical semantics in English?
    • Which approach(es)/method(s) do you think contribute the most useful knowledge to our understanding of lexical semantics in English?
    • What are some of the advantages/disadvantages to the methods used in lexical semantics to define words?
    • Do you prefer the lexicographer’s approaches to the semanticist’s? Why or why not?
