The lexical syllabus is based on the results of the computational analysis of English, which was first carried out to compile data for the production of dictionaries. John Sinclair, Editor-in-chief of the project, suggested that the data collected could also be used for language teaching in the form of a lexical syllabus. Antoinette Renouf, who also took part in the project, helped him develop the new type of syllabus. In their view, with the data provided by computational analysis, it now seemed feasible to put into practice what observation, experience and intuition had taught: that 'for any learner of English, the main focus of study should be on: a) the commonest word forms in the language; b) their central patterns of usage; c) the combinations which they typically form' (In: CARTER and McCARTHY 1988:148). The idea of concentrating learning on vocabulary is justified by SINCLAIR and RENOUF (155) when they discuss grammar, saying that 'if the analysis of the words and phrases has been done correctly, then all the relevant grammar, etc. should appear in a proper proportion. Verb tenses, for example, which are often the main organising feature of a course, are combinations of some of the commonest words in the language.' But for SINCLAIR and RENOUF (146), '[a] simple list of words is not nearly explicit enough to constitute a syllabus. In order to construct an adequate syllabus, it is necessary to decide, in addition to which words we want to include in our syllabus, such things as what it is about a word that we want to teach, and what counts as a word.' Yet the discussion around the definition of 'word' is long and controversial. In their words, '[t]he conventional view is an inclusive one: that the term 'word' denotes a unit of language comprising a base form, such as give, and an 'associated' set of inflexions, such as gives, giving, gave, given. Sometimes derivations will be included, e.g. gift' (In: CARTER and McCARTHY 1988:147).
They go on to say that this is the same concept used in computational linguistics, "where all forms, including the base form, can be subsumed under the term 'lemma'" (147). SINCLAIR (1991:173-174) defines lemmatization as 'the process of gathering word-forms and arranging them into lemmas or lemmata. So the word-forms give, gives, gave, given, giving, and probably to give, will conventionally be lemmatized into the lemma give. Any occurrence of any of the six forms will be regarded as an occurrence of the lemma.' The main implication of this concept is that, in traditional vocabulary lists, the presence of the base form suggests that all the other forms of that word are covered in the material. This is misleading, because it does not necessarily happen. Often, 'particularly with the commoner words of the language, the individual word forms are so different from each other in their primary meanings and central patterns of behaviour (including the pragmatic and stylistic dimensions), that they are essentially different 'words', and really warrant separate treatment in a language course' (SINCLAIR and RENOUF in: CARTER and McCARTHY 1988:147). The lexical syllabus therefore proposes a different kind of treatment of the 'words' of the language, taking into account the peculiarities detected by the computational analysis in an attempt to 'break the traditional expectations that all forms of a word are equally important and that all will behave in the same way' (SINCLAIR and RENOUF: 159).
Given that, for SINCLAIR and RENOUF (150), a syllabus is more than a word list, the lexical syllabus they proposed ended up being built upon three different aspects: word forms which 'can be subsumed under their base form or full form in a teaching list', and which can be identified according to their frequency; central patterns of usage, such as those provided by delexical verbs, which are transitive verbs that 'carry particular nouns or adjectives which can in most cases themselves be transitive verbs [showing that] [i]n general, the more frequent a word is, the less independent meaning it has, because it is likely to be acting in conjunction with other words, making useful structures or contributing to familiar idiomatic phrases' (SINCLAIR and RENOUF in: CARTER and McCARTHY 1988:153); and typical word combinations, such as lexical collocations. SINCLAIR and RENOUF (In: CARTER and McCARTHY 1988:154) believe that '[i]n these ways, the essential patterns of distribution and combination in modern English will be included in the lexical syllabus.' But when analysing the first attempt to define a lexical syllabus for beginners, based on a list of around 700 words, one realises that lower-frequency items and 'utility' words are also included in the lexical syllabus. According to Renouf, the lower-frequency items would account for some common lexical sets, such as days of the week, months of the year, the seasons, the points of the compass and kinship terms, which 'amounted to around 80 words' (RENOUF in: SINCLAIR 1987:170). The 'utility words' were selected, still according to RENOUF (170), 'for their utility value in the writing of the Course materials. These [...] would serve to contextualise a very common word, or contribute to the treatment of a certain topic, or bring some interest value. They would also be teaching words, and receive analytical treatment.' They total around 150 words.
According to SINCLAIR and RENOUF (In: CARTER and McCARTHY 1988:151), the main aims are both to achieve a balance 'between natural usage and utility' and to highlight 'the common uses of the common words' (154). They go on to say that '[o]ther languages may be different, [but] English makes excessive use, e.g. through phrasal verbs, of its most frequent words, and so they are well worth learning' (155). Another important issue SINCLAIR and RENOUF (155) discuss is the distinction between syllabus and methodology. They put forward the lexical syllabus as 'an independent syllabus, unrelated by any principles to any methodology'. They state clearly that one cannot mistake a methodology for a syllabus, and they try to restore the relevance of the role of a syllabus in language teaching. They favour a comprehensive syllabus which should specify '[t]he exact nature of the content, the sequence of events and the pattern of coverage' (SINCLAIR and RENOUF: 145). In this way, the lexical syllabus intends to provide coverage of structures, notions and functions through the analysis of words and phrases. To them, '[i]n the construction of a balanced and comprehensive course, the designer will no doubt keep a tally of structures, notions and functions, as well as vocabulary. But in the presentation of materials based on a lexical syllabus, it is not strictly necessary to draw attention to these check lists. If the analysis of the words and phrases has been done correctly, then all the relevant grammar, etc. should appear in a proper proportion' (In: CARTER and McCARTHY 1988:155). SINCLAIR and RENOUF (156) highlight the main advantages of a lexical syllabus, stating that it is 'a much more detailed inventory of the possibilities of the language.' Besides that, '[o]ne big advantage of a lexical syllabus is that it only offers to the learner things worth learning. [...]
So instead of building up phrases, the learner will be gradually breaking them down, sensing the variability' (155-156). Moreover, they state that '[t]he emphasis shifts from constructing messages to delivering them, and delivering them to maximum effect, and to achieving communicative goals' (156). And '[i]n the lexical syllabus, such things as lists of structures and notions and functions would be secondary, and would come out of the implementation of the lexical syllabus rather than constrain it' (SINCLAIR and RENOUF: 160). It is in this sense that the lexical syllabus represents a meaningful and important change in the approach to syllabus design, both in terms of content (in that it concentrates on vocabulary) and of use (in that it intends to cover what is really used in the language).