Cambridge Assessment International Education
We appreciate you may have questions about how we mark and grade portfolios of evidence. We have answered some of your most common questions below.
We have received a wide variety of types of work in portfolios of evidence. This is what we expected: we wanted schools working in difficult circumstances to be able to send us what work they had available, so that their students would be able to get a result and move on with their lives.
The method of marking the portfolio of evidence is designed to cope with this wide variety of work. It differs from the way we mark exams: we mark exams using a mark scheme, whereas we mark portfolios of evidence against a set of exemplar work. It would be impossible to write a mark scheme that could be applied to such varied work.
Portfolio of evidence examiners read through a candidate’s piece of evidence and form a holistic view of its quality. They then select the piece of exemplar work which is closest in overall quality to the candidate’s work. The exemplar has a mark associated with it: if the candidate’s work is of exactly the same quality, then the candidate will be given the same mark as the exemplar. If the candidate’s work is a little better or a little weaker than the exemplar, then the examiner can give a mark higher or a mark lower than the exemplar.
The same process is followed for each piece of evidence in the candidate’s portfolio.
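The exemplar-matching approach described above can be sketched in a few lines of code. This is purely an illustration, not Cambridge's actual system: the quality scores, exemplar set and marks below are invented, and real examiners form a holistic judgement rather than computing a numeric distance.

```python
# Illustrative sketch of exemplar-matching marking.
# All quality scores and marks are invented for this example.

def mark_against_exemplars(candidate_quality, exemplars):
    """Pick the exemplar closest in overall quality to the candidate's work,
    then move one mark up or down if the work is a little better or weaker."""
    # Select the exemplar closest in overall quality to the candidate's work
    closest = min(exemplars, key=lambda e: abs(e["quality"] - candidate_quality))
    mark = closest["mark"]
    if candidate_quality > closest["quality"]:
        mark += 1   # a little better than the exemplar
    elif candidate_quality < closest["quality"]:
        mark -= 1   # a little weaker than the exemplar
    return mark

# Invented exemplar set: each has an overall quality score and an agreed mark
exemplars = [
    {"quality": 3.0, "mark": 12},
    {"quality": 5.0, "mark": 18},
    {"quality": 7.0, "mark": 24},
]

print(mark_against_exemplars(5.0, exemplars))  # same quality as an exemplar -> 18
print(mark_against_exemplars(5.4, exemplars))  # a little better -> 19
```

The same routine would simply be run once per piece of evidence in the portfolio.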
We are following our usual processes to make sure that all examiners mark in the same way.
Before they are allowed to start marking, each examiner must successfully complete a process called standardisation.
Once an examiner is marking during the live series, their work is checked at regular intervals, through two types of checks, to make sure that they are continuing to mark accurately.
If we find that an examiner is marking erratically, we will stop the examiner and cancel all of the marks that they have awarded. Other examiners will then re-mark those portfolios of evidence.
Yes. At the same time, remember that there is a scale of reliability – some assessments are more reliable than others. Exams are generally more reliable than the portfolio of evidence approach – that's one of the reasons why portfolios of evidence are only available when exams can't happen. However, our early evidence indicates that the marking reliability of portfolios of evidence is comparable with the reliability of a typical exam paper. The difference is that candidates taking exams usually take more than one exam paper in each subject, and this increases the reliability of exams.
No. When we ask examiners to make a judgement about the quality of a piece of work, we ask them to take into account the difficulty of the task they were given. Good answers to easy questions don’t indicate a high level of performance, and our examiners recognise that. In fact, giving really easy tasks to students may disadvantage them, because it doesn’t allow them scope to show what they can do.
We asked schools to select pieces of evidence that were typical of the general level of the student’s work. Sending us a student’s best pieces of work would be cheating. Where we have reason to think that has happened, we will carry out a malpractice investigation and, if necessary, we will take action.
No. Many of our syllabuses offer different options (different combinations of papers that candidates can take). We are used to setting grade thresholds that ensure the different options are aligned – that is, that it is no easier to get a good grade on one option than on another. The portfolio of evidence route is simply another option, and we will set grade thresholds that keep it aligned with the exam options. It is a normal part of what we do.
When we choose grade thresholds for June 2022, we will do so in two steps (this applies equally to exams and to portfolios of evidence). The first step is to choose grade thresholds which match the exam standard of June 2019, before the Covid pandemic. The second step is to adjust that standard to the easier standard of June 2022.
This first step, for exams, will follow our well-established grading procedures. These have been described elsewhere.
The second step needs explaining. It is necessary because in June 2020 we awarded schools' predicted grades and in June 2021 we awarded a mixture of exam grades where exams could take place and school-assessed grades where they could not. For entirely noble reasons, teachers' judgements of their students tend to be more generous than exam results, and we saw this in both the predicted grades of June 2020 and in the school-assessed grades of June 2021. We could not allow our June 2021 exams to be tougher than school-assessed grades, as this would be unfair on exam-route candidates. Therefore, we temporarily eased the standards of our exams in June 2021 so that they aligned with school-assessed grades. In common with other international exam boards, we will make a staged return to the pre-pandemic standards. June 2022 standards will be at the mid-point between the easier June 2021 standard and the pre-pandemic standard of June 2019. Hence the need for the adjustment at step 2.
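The mid-point adjustment is simple arithmetic, which a small example makes concrete. The threshold values below are invented for illustration only – they are not real Cambridge thresholds.

```python
# Illustrative only: these threshold marks are invented, not real Cambridge data.
june_2019_threshold = 80  # pre-pandemic standard for a given grade
june_2021_threshold = 70  # temporarily eased standard for the same grade

# June 2022 sits at the mid-point between the two standards
june_2022_threshold = (june_2019_threshold + june_2021_threshold) / 2
print(june_2022_threshold)  # -> 75.0
```

In this invented case, a grade that required 80 marks at the June 2019 standard and 70 marks at the eased June 2021 standard would require 75 marks in June 2022.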
Portfolios of evidence will be graded using the same two-step approach. At step 1 we have two main types of evidence to help us choose appropriate grade thresholds. These are benchmark baskets and notional thresholds.
Having used these two types of evidence to choose grade thresholds at step 1, we will then make the same adjustment to thresholds at step 2 as we make for our exam routes.
So, both exams and portfolio of evidence will be graded to maintain the June 2019 standard at step 1, and both will be adjusted in the same way at step 2. That is why we can be confident that the standards of exams and portfolio of evidence will be aligned with each other in June 2022.
Every year, most candidates receive grades which are broadly what they expected. And every year some candidates are pleasantly surprised by their exam grades and some are disappointed. We can be sure that in June 2022 this will still be the case, whether the candidates have taken exams or portfolio of evidence.
You can request a copy of a candidate's script to see what they wrote, and a review of marking if you want to have the result double-checked.
For portfolio of evidence, the pieces of evidence were selected by the school. Students and parents who face disappointing grades may attempt to blame the school for not selecting the best work available. The best response is to point out that selecting the best pieces of work rather than typical pieces would have constituted malpractice, with potentially serious consequences for both the candidate and the school. If they complain that the work selected as evidence was among the student's worst pieces, you may want either to point out the other requirements which also needed to be taken into account in the selection (such as providing as broad a coverage of assessment objectives and subject content as possible), or to show from the student's records that the work selected was not in fact their worst. You can also point out that there is an Enquiry about Results service, through which students may query whether or not the selection of evidence met Cambridge requirements. However, this service will not consider whether or not the work was typical of the candidate's general level of performance.
The marking of portfolios of evidence can also be reviewed through the enquiries about results services, in the same way that exam marking can be reviewed. This may help to allay the concerns of some parents and candidates.