Exam reports - General comments



#16 BabyGrand

    Advanced Member

  • 783 posts
  • Member: 144315
    Joined: 27-October 10
  • UK

Posted 11 June 2019 - 22:45

[soapbox]

 

Some time ago I did a big research project, looking into the concept of music performance assessment and different approaches to it. I read a huge number of studies and commentaries, and conducted some research of my own. The broad conclusion we came to was that following a strictly criteria-based assessment system - with no room for any subjective judgement - did succeed in bringing every assessor into line. Everyone gave the same performers the same results, and as such there was consistency and "fairness", of a sort. However, in a number of cases, the assessors felt they were giving results that they did not actually agree with - as in, by following the criteria, the system churned out a number - and each assessor, following the same system, came to the same number - but none of them agreed that that number was what the performance actually deserved!

 

The conclusion of the project was that music performance simply cannot just be broken down to a sum of its parts.  You can't create a system that is both 100% objective and "safe", and also accurate or "musical".  Yes, allowing an examiner to make an expert judgement brings about a greater risk of mistakes or bias (and complaints!), but it is an infinitely more musical way to approach things.  So any good system needs to hold these two things in tension.  Yes, absolutely, put in safeguards, give training, use criteria / grade descriptors, aim for fairness and every examiner applying the same principles, and so on.  But let the examiners do their job.  If you have examiners marking too harshly, too kindly, or inconsistently, the answer is to better train or replace the examiners, not to give them a word bank and reduce their job to a box ticking exercise. 

 

If an examiner is afraid to make a comment for fear of a complaint, something has gone seriously wrong with the system. Especially because, what harm can an encouraging comment ever do? Are they afraid of complaints if someone gets a low mark and yet the examiner says "Well done"? Better to say nothing in case their positive comment could be used against them? I agree with maggiemay - an empty box looks like something is missing. If the comment box is only or primarily used in "borderline" cases and failures, then it has essentially become a box for the examiner to justify their decision, rather than a place to offer some encouragement. Why are they being actively discouraged from giving even a "Well done" - do the board not trust their own examiners to write appropriately? If an examiner is not trusted to write something suitable in that little box, why are they trusted to fill in the rest of the marksheet? I can totally see elemimele's point that it's a response to current culture. But there has to be a better way.

 

[end soapbox]

 

I could talk about this for a lot longer, but I'd better not! Interestingly, at the point I wrote the paper, I praised ABRSM for having a more musical approach than some other assessors. It's sad to see that changing.


  • 7

#17 ma non troppo

    Prodigy

  • 1378 posts
  • Member: 76027
    Joined: 23-September 09

Posted 11 June 2019 - 22:53

What a shame abrsm don't appear to read this forum. They could learn a lot from it. More than from paying for the useless "Teacher's Voices" surveys.
  • 7

#18 Yet another muso

    Advanced Member

  • 355 posts
  • Member: 103420
    Joined: 22-May 10

Posted 11 June 2019 - 23:59

Really interesting response, Baby Grand, thank you for sharing the summary of your research, which strikes at the very difficult balancing act involved when trying to assess musical performance. 

 

Maybe someone can correct me if I am wrong, but ABRSM are essentially answerable only to themselves and their customers - unlike, say, schools and academic exam boards, whose crazy policies are heavily controlled by the government. Hence they are responsible for their own decisions and have the power to change course on this.

 

Personally I totally sympathise with why they take the steps they do, but still doubt if this is truly the best way to uphold consistent standards. The best way I can think of is simply to record all exams (which I believe they may be doing soon anyway?) and use this for completely random moderation - absolutely any recording could be listened to and checked against the mark form. But examiners would only be called into serious question if marks are way off where they should be, as happens occasionally. That respects the different perception of actually being in the room, and still affords the examiners enough trust to do their jobs, while guarding against the odd complete maverick.

 

I should reiterate that my source for this info was one person, second hand, but another poster backed it up so it does seem to be so. Another trainee commented that the best way to pass the training was to keep your head down and keep any individual thoughts to yourself! I can't help feeling that examiners who feel like that are not going to get much job satisfaction, and that will ultimately impact on the quality of the job they do. I am one of those 'musicians with broad experience' who regularly have people telling me I should apply to become an examiner. I got as far as sending off for an application form a few years ago and have been in two minds about it ever since. I must admit the idea is getting less appealing!


  • 2

#19 ma non troppo

    Prodigy

  • 1378 posts
  • Member: 76027
    Joined: 23-September 09

Posted 12 June 2019 - 00:19

I feel likewise. Several people have tried to persuade me to apply to be an examiner for abrsm, but, on much reflection, there is no way I would.

Recordings don't really communicate a live performance, although they may be a basic safeguard.
  • 0

#20 Banjogirl

    Virtuoso

  • 2808 posts
  • Member: 39509
    Joined: 12-September 08

Posted 12 June 2019 - 09:35

There is an interesting comparison with the scoring of barbershop competitions. Our judges complete a long and rigorous training programme. In a typical competition there are nine judges, three per category, and they each score each song out of 100. Within a category, if any two judges' marks differ by more than ten they have a conflab to discuss their decisions. They don't have to change their mark, but they discuss why they gave it, with the opportunity to change it if they wish. In practice it is very rare for scores to vary by anything like as many as ten points.

The system is fair and objective. By anyone's measure the 'best' chorus wins. But further down the rankings there can be choruses whose performance was a bit dire scoring higher than choruses whose performance overall was more enjoyable and therefore, to my mind, a better performance. The system is incredibly robust, and the scoring very consistent, but I believe that the very thing that makes it so consistent is the same thing that can give a subjectively 'worse' performance a higher score than an apparently better one. There is no score for 'enjoyability by a normal (non-barbershop) audience' so that doesn't get measured, but at least the measurement, as far as it goes, is as fair as it could be.
  • 1

#21 ma non troppo

    Prodigy

  • 1378 posts
  • Member: 76027
    Joined: 23-September 09

Posted 12 June 2019 - 10:01

Put it to the public vote?!
  • 0

#22 HelenVJ

    Virtuoso

  • 2202 posts
  • Member: 1265
    Joined: 03-May 04
  • South-East London ( OK - Penge)

Posted 12 June 2019 - 10:36

Some things can be measured accurately by numbers and percentages, but I don't think a musical performance is one of them, even with a list of criteria. Somehow music just doesn't 'go' naturally into a number, any more than a dance performance, a dramatic rendering, a painting etc. Why would one piece be marked 23 and another 24? At least with Trinity there is a breakdown into categories - Fluency and Accuracy / Technical Facility / Communication and Interpretation.

I think marking is always going to be subjective to an extent, despite examiners being moderated to within an inch of their lives these days. If indeed they are discouraged from any individuality, that would explain the bland, characterless and almost meaningless comments.

I think dance, and possibly drama, exams usually have an overall grade, but not a mark or percentage. Some music festivals also give Merit, Distinction, Outstanding categories, but not a mark. Somehow I can't see the exam boards moving towards this. It would be a massive re-think - but GCSE and A level exam boards changed their marking systems and everyone coped.


  • 2

#23 elemimele

    Prodigy

  • 1233 posts
  • Member: 895612
    Joined: 17-July 16

Posted 12 June 2019 - 11:36

on recording, and errors:

An exam result is a measurement, like any other, and measurements contain errors, in the sense of random variations, which are hopefully small. People have to accept this. Life isn't perfect. If one examiner gives 23, another 24, then if this makes a pass/fail difference to a student, the probable truth is that they're sitting very close to the pass/fail boundary, and it's random which side they'll fall. If you want to pass an exam, the correct strategy is to be far enough above the pass-mark that you're outside the random zone, not to sit on the fence and complain if you fall off the wrong side.

 

The sorts of errors that ought to be appealed, and corrected, are the large ones - the things where it's not a matter of musical opinion, but where something silly has happened, like an examiner has got his candidates in a muddle, or someone's messed up in the paperwork, and the grade is completely the wrong student's. Recordings are a good way to do a sanity check when there is suspicion of a complete mismatch. They will also help to identify a seriously "badly-calibrated" examiner, but minor variations in calibration are to be expected. An engineer has a safety-factor - so should a student.


  • 1

#24 Aquarelle

    Virtuoso

  • 7801 posts
  • Member: 10531
    Joined: 05-April 07

Posted 12 June 2019 - 19:07

Wouldn't it be nice if a parent of every child, and also every adult candidate, who didn't get a remark in the General Remarks space wrote to the ABRSM to complain that they hadn't got their money's worth? I can imagine the first twenty or so getting a reply of the "Board's policy is ...." kind, and after the 200th letter I suspect there would be some fast back-pedalling.


  • 2

#25 musicposy

    Advanced Member

  • 163 posts
  • Member: 25798
    Joined: 25-February 08
  • West Sussex

Posted 12 June 2019 - 23:25

Still not sure it would make a difference, Aquarelle, but one can dream!

This is what I like about LCM. They tend to comment much more frequently and will say "well done on gaining a merit" or "congratulations on passing your Grade 4". This is much better in my view. ABRSM's approach makes it seem as though only a distinction is worth having. I had a young Grade 2 last session who got 110 and I was thrilled because she'd worked so hard and it hadn't come easy to her. Only my two distinction candidates got comments, but this child would have benefited from the encouragement so much more.
  • 4

#26 violin star

    Newbie

  • 21 posts
  • Member: 892916
    Joined: 26-January 15

Posted 18 June 2019 - 18:01

This thread made me laugh as I recently unearthed my own Grade 1 piano comment sheet from 1967 (I was 6) which I passed with 106 marks. That long-dead examiner,  who was probably an elderly organist, certainly didn't see the need for encouragement.  Perhaps things have just come full circle.

 

Scales              12    Passable

Broken Chords       11    Adequate

Pieces              19    Rather slack, in tempo. Fair, in management

                    20    Had more conviction

                    19    This was not very certain. Audible counting!

Playing at sight    11    Weak

Aural tests               A fair, B weak, C good

General remarks     -

 

I survived.


  • 1

#27 Sautillé

    Advanced Member

  • 129 posts
  • Member: 897314
    Joined: 15-February 17

Posted 18 June 2019 - 20:52

That is JUST AWESOME... Thank you so much for sharing ;)
  • 1

#28 ma non troppo

    Prodigy

  • 1378 posts
  • Member: 76027
    Joined: 23-September 09

Posted 18 June 2019 - 20:54

That is hilarious!
  • 0

#29 jenny

    Virtuoso

  • 2802 posts
  • Member: 7686
    Joined: 16-September 06
  • Manchester

Posted 19 June 2019 - 08:24

I have kept all of my mark sheets and certificates from when I started exams. Only a few of them have anything written in the general comments box. The final marks are very varied - from a 106 to some good distinction marks. I remember clearly that examiners in those days were nearly all men and that they were always brusque and intimidating. One of them (I think it was for Grade 6) stood right behind me for all of my pieces and I can still remember how uncomfortable I felt. How things have changed! I often tell my pupils about the comment for my Grade 3 scales, which just said 'Trouble with B minor!'


  • 2