It is less than two days since our Year 12 Drama students received their final grades, and there appears to be a growing number of teachers confused by marks that look out of kilter this year.
Trends so far indicate middle-band students being elevated too high (e.g. C-grade students receiving B+’s and A’s), while upper-band students have been receiving results significantly below their standard (e.g. anticipated A+’s being awarded B’s, or lower still).
This appears to be the case in all sets of grades:
- Grade Result 1: combined written and performance internally assessed SACs
- Grade Result 2: externally assessed solo performance examination
- Grade Result 3: externally assessed written examination
Edit: But the main concern appears to be with the solo performance examination grades being lower than expected.
I won’t be creating a storm in a teacup if there isn’t a case to be heard, but if there are other teachers out there who would like to comment on a similar situation of their own from 2008, one that also appears inconsistent with previous years’ results, please add your comments to this post.
As President of Drama Victoria, it is my role to represent Victorian Drama teachers. If this issue is more than a few isolated examples and appears significant, I will discuss the matter with the VCAA in the new year.
Please encourage any colleagues of yours at other schools with similar concerns to add their comments to this post.
“suffice to say that it is now my understanding that it is UNcommon for assessors not to give 7/7 for criterion 1 on the solo exam. So, that may throw the theory about assessors possibly “missing” aspects of the task, out the window.”
I just looked through the solo exam report given to me by the VCE coordinator, and out of the 21 maximum marks allotted, my class scored 20.6. Which means that one student was marked down on the first criterion … which means my theory on my A+ student going down to a B is more than plausible … I guess I will find out when I get to see her statement of marks. That solo was NOT missing anything, and I am furious! If it is indeed the case, I will do an annotated analysis of her solo, burn a DVD of her rehearsal tapes and send it in to the VCAA … that’s how annoyed I am!!!
I concur with all of the above. I found myself in agreement within half a mark until the last couple of years, but now I feel I have no relationship to the examiners’ evaluation. This will make it harder for top students to choose Drama, as they cannot afford to be hit and miss in their scores.
One of my students expressed it the best: OMG WTF?!? That just about echoed my response to their results this year. In previous years I (generally) agreed with the ordering of my students from strongest to weakest, but this year … I was shocked and appalled!
Put it this way, my top student, with a very sophisticated solo, strong expressive skills, richly non-nat, got a B, whilst my weakest, who struggled the week before to remember lines, delivered a stand-up routine and got an A! Not only that, but the girl who went away to Thailand before her solo, who was quite behind, got a B as well!? All three went to different assessors within the same examination centre!!
Now I haven’t been teaching VCE Drama for that long, but in my 6 years I have had 50 Unit 3/4 students, have seen over 120 Top Class performers, have gone to VCE inservices, read the SD back to front and upside down, seen solos at 4 other schools ranging from Frankston to Mac Rob, cross-assessed with 2 other schools on 3 different occasions, had kids in Top Class, had a student with a study score of 50 … and as I stood looking at my students’ results yesterday, all I could think was … how could we (I / they?!?) have gotten it so wrong? I knew the abilities of these kids, and yet my ranking had been turned on its head?
And I know that I’ll go to Top Class next year and just walk out thinking, OMG WTF!!
The only way that my student could have received a B for her performance is if one or more of the assessors did not give her full marks on one or more criteria because they thought that the solo was missing something. I have gone through and made sure that it met ALL the criteria! Which means that despite a clear statement of intention, the assessors did not UNDERSTAND a part of her performance … I’m a little confused as to what … but it’s the only explanation I can come up with.
Maybe there was a technical aspect to one or more criteria that was not clearly expressed in the study design or the assessment guide?! Or maybe there are assessors who are fixated on a certain stylistic aesthetic that is skewing their judgement?! Maybe there are preconceived ‘rules’ that are not being communicated to teachers! I have heard several ‘gems’ over the last couple of years … e.g. “the student must cross the space three times to score highly for use of space …”
Where is THAT in the study design, the assessment guide or the chief assessor’s reports?? Don’t bother to look, it isn’t. Are assessors going into the assessment space with such notions that are simply wrong?
1. Something has to be done about the inconsistency of the assessment panels.
2. Whatever the assessors are being told in their inservice needs to be clearly communicated to teachers.
Meanwhile, I’m applying to assess next year, just to see what assessors get told, because I’ve read EVERY exam report from the chief assessor and I’m obviously still incompetent at judging what makes a good solo.
I too have been thinking about what went wrong! One of my students received a C+ for her solo performance, which is highly uncharacteristic for her. She received A+’s for her other marks in Units 3/4, and she has received A+’s and A’s for all her work in Units 1/2. I expected her to get an A for her solo as she fulfilled all the dot points and other criteria to a very high level and in a sophisticated and clear way. The lengths she went to to ensure her solo met the dot points were impressive. She is a very confident and impressive performer, so I do not believe that she would have buckled under the pressure of performing for a VCAA panel.
I triple-checked her solo and made sure that it met ALL the criteria. I also ensured that the student’s statement of intention helped to clarify any necessary issues. So the best explanation I can come up with is that a mistake has been made. I believe the assessors either did not fully understand (or properly read) the statement of intention, or they somehow misunderstood the performance, which seems crazy as the solo performance was very clear, well constructed and impossible to misunderstand. I feel as though one or more of the assessors had a fairly subjective view, and this has resulted in a lower grade.
I am disappointed that there is no appeal process, as this leaves teachers and students disempowered and at the mercy of what seems like a subjective examination situation.
I feel so much better after reading all of your comments! I have been sitting at home wondering where things went wrong. My three possible A+ students were only given B+’s, and my weaker students received C’s (one should have been an E in my mind!). This year seems far more inconsistent compared to other years. Very stressful for teachers – it makes us second-guess ourselves and question our understanding of the course. I also agree with Micoke – how do you get an assessor “golden ticket”?
It would be fair to say I’m definitely in the same boat as everyone else here. My students’ grades this year had one common theme: clear and obvious examples of middle band students being elevated and high band students being brought down across all three sets of grades.
This resulted in anticipated “A” or “A+” solo performances being awarded “B”s, and the reverse with a few students on the written examination who were no doubt pleasantly surprised with “A”s (while other anticipated “A+” students – you know, the ones who were getting 14/15 and 15/15 on SACs all year – got “B”s on the written exam instead).
My internal SAC grades (Grade Result 1) came back as all “A”s and “A+”s, which was surprising to say the least. Overall, anticipated 40+ students ended up with 38’s and 37’s.
I’ve been doing some homework for everyone over the past few days and I now suspect these issues lie in the moderation process at the VCAA, not in the exam room with assessor interpretation (though apparent inconsistency from room to room at exam centres is a different issue, again, and I know quite a few people concerned about this one).
I have no intention of speaking “out of school” (and I am not an assessor for solos), suffice to say that it is now my understanding that it is UNcommon for assessors not to give 7/7 for criterion 1 on the solo exam. So, that may throw the theory about assessors possibly “missing” aspects of the task, out the window.
Sandie, your query above about the same grades receiving 43 and 36 in different classes at different schools has been explained to me as VCAA cross-checking between various grades, including the GAT, but more importantly as a “cohort” issue, with different cohorts ending up with different scores. Suffice to say, beyond that, I don’t understand any more, but you may find a colleague who can explain it better for you – the reasons apparently lie in the differences between cohorts.
I’d like to share a moderation example with everyone. The grade range for an “A” or “A+” in the solo performance exam changes every year, and I understand it varies more for the Drama solos than for the Theatre Studies monologues. Someone who may have posted above explained to me (we shall leave it anonymous) that a couple of years ago, after applying for one student’s solo performance result transcript for a breakdown of the grades, they discovered that a 227/231 had been dropped to an A that year.
With three assessors each awarding a potential 77 marks, making a total exam score of 231, in that year only a perfect 231, a 230, a 229 or a 228 was awarded an “A+” for the solo. If you ask me, that’s crazy, but as someone who failed Year 10 Maths at school and never did it again(!), I’m assuming that’s all about the VCAA getting their bell curve. So, in some years it’s a tough call to get an “A+” in the solo performance, after scaling.
Looks like 2008 was one of those years…….
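For anyone who, like me, is shaky on the maths, here is a rough sketch of the arithmetic behind that anecdote. The per-assessor maximum of 77 and the 228 cut-off come straight from the example above; everything else (the function names, the way the cut-off is applied) is just my illustration, not any published VCAA formula.

```python
# Rough sketch of the solo exam arithmetic described in the anecdote above.
# Assumptions: 3 assessors, 77 marks each (231 total), and an A+ cut-off of
# 228 for that particular year; the real cut-offs apparently change annually.

A_PLUS_CUTOFF = 228  # assumed from the 227/231 -> "A" example above

def solo_total(assessor_marks):
    """Sum the three assessors' marks (each out of 77)."""
    assert len(assessor_marks) == 3
    assert all(0 <= mark <= 77 for mark in assessor_marks)
    return sum(assessor_marks)

def solo_grade(total):
    """Map a raw total out of 231 to a grade under the assumed cut-off."""
    return "A+" if total >= A_PLUS_CUTOFF else "A (or below)"

# The student from the anecdote: 227/231, one mark short of an A+.
marks = [76, 76, 75]
print(solo_total(marks), solo_grade(solo_total(marks)))  # -> 227 A (or below)
```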
Well, it was the first time for me, and I have worked very much alone, being at a remote school. However, with 4 of my 6 students competing for dux of the college, we have had a brilliant year. Four of my students went to Melbourne to do a solo intensive at the MTC, which gave me reassurance that we were on the right track… all were assured that they had covered the structure well. So I was very disappointed when my hardest-working student came out with a C+ for Fortuna; I can only assume they felt she had not met the structure – despite her tutor at the MTC assuring her that she had. By far my best student, with Greed, attained only an A, which I was disappointed with as he is a truly exceptional performer, and of course my natural comedian, who tried to add 2 missing dot points to his ever-changing Prisoner in the last two days, gained a B+. So I have come out questioning whether I should be teaching it, especially for my C+ girl. She achieved an A+ in the written exam.
I wonder at Sandie’s A+ A A student who got a 43; mine with the same result only got 36.
I would also like to point out the difficulties rural and remote students face. Seeing more than one of the prescribed plays is virtually impossible: attending a Melbourne performance takes us two days, and other teachers don’t appreciate the absence. Addressing this by touring to Warrnambool (a 10-hour drive from here) is not enough! A good look at a map should identify a few more centres accessible to remote areas. It is also very difficult for me and my students to see other students’ work, but in that area I have to get more organised!
I am also one very confused and freaked-out Drama teacher. I am very experienced and I have had very successful students every year. I thought I knew exactly how this works. Now I don’t get it. I heard alarm bells after reading the posts last night and speaking to a few colleagues, who are excellent and experienced teachers and assessors. I am faxing my enquiry to the VCAA today, as I feel there has been a mistake. While my own students have done very well and, relative to each other, the marks make sense, there is an inconsistency across the state. For example: how does one student get A+ A+ A+ and get 42 (which is a lot lower than three A+’s would have got before – but perhaps the standard was lower), while another gets A+ A A and gets 43? And what about A+ B+ A+ and 43? What about a 50 with an A+ A A+???? Let’s not even get into the ones who should have scored A+ but ended up with B+. As teachers and assessors we need to be demanding an explanation. There cannot be people teaching a subject who do not understand how it is assessed. Do we need to film the solos? How can we have more consistency between assessors? I am not keen to continue teaching it unless the situation is clarified. We need a unified voice. Justin, can you help us?????
I’m so very disappointed.
I worked my butt off, sat with people who had done the assessor training, tried to clarify all the requirements, read through the examiner’s reports for the written exams and spent hours checking and rechecking criteria.
I agree with Duggan’s post – comedy appears to win over dramatic / research-heavy performances.
When we went to Top Acts many of my past students noticed that there were criteria not addressed in those performances – so I’m just confused.
Depressed and confused.
Clearly I require a drastic re-think of my approach to the course and the assessment guide.
Perhaps Drama Vic could run a PD like the Assessment Training for the Solo Panels so that everyone has the opportunity to clarify criteria in a meaningful way.
I’m off to start again.
Well, once again, I must say I live by my comments to students and parents that there is ‘little to no system’ behind marking the solos, or, this year, even the written. I find it interesting that my students who clearly earned A’s (using our calculator from the Drama Vic forum) were bumped back to B’s (I’m talking consistent 13/15’s here), while another girl somehow pulled a B+ with an average of 9/15’s.
Hmmm….. and the confusion continues. My weakest student received a B+ on her exam while my brightest (averaging 14/15’s) got the same mark. (The weaker girl missed one whole question, she informed me.)
And with solos, well, it proved once again that doing a comical or overly dramatic solo means better marks: my top two had no dramatic merit but were the most entertaining. One guy nearly missed an entire dot point and still got an A. I’m really disappointed I didn’t get any A+’s for the solo, though realistically I didn’t think that anyone was quite that amazing. Though some Top Cats and Top Acts prove again it’s better to be lucky than good.
I really think that this whole system is just a little too subjective, though I think that I’m just spitting sour grapes since my highest study score was only 36.
As a side note, I agree with everything said here and in the original forum post.
Better get thinking about how to ‘fix’ the problems then.
Thanks Justin for this opportunity. First of all, let me say that I was fairly happy with my results overall; however, I was concerned about individual students. A student who was ‘just a B’ scored an A for her solo, and two potential A+ students received B+’s.
I want to stay constructive here. I agree that ALL VCE DRAMA TEACHERS should receive the notes that are given to the solo performance examiners. This keeps the playing field level for all. A reduction in the number of criteria may also help with the inconsistencies we are seeing from room to room. Could I also add that becoming a VCAA solo performance examiner is like winning Tattslotto. A colleague of mine has applied 3 times and been knocked back, and every year I see the same faces. Maybe some new blood would help?
Could our indicative marks come into play? Maybe 15 or 20% of the overall mark. That would stop the kids who ‘slap it together’ from getting an A+ and reward the hard workers.
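Just to show what that suggestion might look like on paper (the 20% weighting and the percentage-scale marks below are purely my own illustration, not anything the VCAA actually does):

```python
# Illustration only: blending a teacher's indicative mark with the exam mark
# at an assumed 80/20 split. This is a sketch of the suggestion above, not a
# description of any real VCAA moderation process.

def blended_mark(exam_pct, indicative_pct, indicative_weight=0.20):
    """Combine exam and indicative marks (both given as percentages)."""
    return (1 - indicative_weight) * exam_pct + indicative_weight * indicative_pct

# A hard worker the teacher rated at 95% whose exam came back at 78%:
print(round(blended_mark(78, 95), 1))  # -> 81.4
```

Even a small weighting like that would lift the hard worker’s result a few points, while doing very little for the student who slapped it together.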
I will keep thinking…..