Writing is the least useful skill for language learners, and as internet translators become ever more sophisticated, it is likely that the vast majority of learners will never have to construct written language in the foreign language at all. We should downgrade its importance at all levels, but especially at GCSE, where the current allocation of 30% of marks to writing is too high.
That said, we shall no doubt continue to teach and assess writing, largely to support the other skills and to provide useful classwork and homework tasks.
So how should we clear up the current mess which is assessment of writing at GCSE?
When it was decided that exam boards would have to mark written GCSE assignments, this issue came sharply into focus, with hundreds of schools unhappy with grades. A thread on the TES MFL forum about problems with marking writing at GCSE has been running for over a year.
Mistakes in marking essay-based questions occur where the examiner is inexperienced, careless or simply tired. Mark schemes which are difficult to interpret do not help.
One way around this would be to abandon the current, open-ended essay assessment format. This type of "direct assessment", although it allows for authentic communication and does not encourage undesirable backwash effects, is hard to score accurately. (In its current "controlled assessment" form it is also too easy to cheat, and therefore unreliable.)
We could assess grammatical skill using "indirect assessment" such as multiple-choice or cloze tests. These methods, although reliable, are not "authentic" or communicative in nature. The ability to construct sentences with correct syntax could be tested by setting translation sentences with a clear-cut mark scheme. The problem with this, however, is that teachers would then be encouraged to do endless prose translation sentences in the run-up to exams, thereby undermining good MFL teaching methodology.
Is there a way of testing syntax and vocabulary which is accurate, encourages good teaching methodology, is not open to excessive interpretation, reflects good classroom practice and does not lead to undesirable "backwash" effects? The answer is probably no, but we need to find a good compromise which allows for the most accurate assessment possible and which promotes good methodology.
One compromise solution would be a very closely guided composition task with bullet points in English. Here are a couple of examples of tasks for a higher-tier exam paper which might require about 120 words each. Glossaries could be provided; I would not favour the use of dictionaries, because it is not possible to provide the same dictionary to all candidates and, in any case, they encourage poor practice among weaker students.
1. You are writing a Facebook message to your French friend about an upcoming exchange
- Ask him/her how they are
- Say how you are
- Ask him/her what they have been doing recently
- Say you have just been into town with your friends
- Ask him/her when they are going to arrive
- Explain that you will meet them at the school car park
- Tell him/her to bring warm clothes because the weather forecast is not good
- Explain THREE things you are going to do during the stay
- Explain THREE things you did last weekend with your friends or family
- Sign off
2. You are writing a letter to a French camp site to apply for a summer job
- Explain who you are, your age and that you are looking for a summer job in France
- Explain why you would like to work at the camp site
- Tell them about yourself: mention THREE pastimes or interests
- Describe a job you have done (e.g. as part of work experience)
- Describe what aspects of your personality would make you a good candidate
- Ask them how much money they would pay you
- Ask them where you would stay
- Ask them if you would have any free time
- Sign off
I would be tempted to impose a word limit to encourage sensible timing and to allow for fair comparison of accuracy between candidates. (Some candidates might write a lot, quite well, but make many mistakes.)
You will say, no doubt, that this format resembles one which was used at GCSE for a number of years. Yes, it does, and the exam boards had good reasons for doing it that way.
The above tasks remain difficult for the weakest students, though glossaries could mitigate this.
Some of this may become, as the Americans say, moot, if 16+ exams disappear, but we shall always need to assess.