User Group

We spoke to Mark Pedroz, Principal Examiner at OCR, to find out how regular feedback from examiners who use RM Assessor is enabling RM Results to improve the user experience through product development.

Cambridge Assessment and RM Results have brought together a user testing group of examiners to gather and apply valuable insight into the e-marking platform RM Assessor.

Can you tell us about your background and how you became involved in assessment?

I was an English teacher for 25 years, and Head of Department for most of that time. I left that role to become a Principal Examiner at OCR.

As a teacher, you are judged by your ability to help students achieve good grades. I became involved in assessment with a view to improving my own students’ results, and to help them achieve the outcomes they deserve.

I enjoy my role at OCR – I find it stimulating to be on the other end of the examination process and I still feel I’m involved with, and able to make a contribution to, what schools are doing.

How did you find the transition from pen-and-paper marking to e-marking?

To begin with, I wasn’t sure if e-marking would work, especially for subjects like English Literature. Fortunately, I had a brilliant team leader who went through the process with me step by step, and I learnt how the system can benefit markers, students and schools. I was impressed with how straightforward and accurate RM Assessor is in capturing annotations and calculating marks.

Tell us about the RM Assessor user testing group.

The testing group consists of around 30 members who have joined forces to conduct user testing of RM Assessor. The team is made up of experienced assessment professionals, including examiners, e-marking team leaders and members of senior management from Cambridge Assessment. It’s a great experience to work alongside other dedicated professionals and consider ways that the end-to-end assessment cycle could be improved.

What has the user group found so far?

We started the user group testing process by gathering feedback from the helpdesk and Yammer group so that we could find out what examiners struggled with, and what their frustrations were. Together with RM Results, we reviewed this feedback and prioritised changes that would have the greatest benefit to the widest range of academic subjects.

It’s rewarding to know that what we’re doing here is really valuable, and already we’ve seen significant progress with the platform. For instance, we’ve seen our feedback appear as new product features, particularly improvements to the user interface. We’re also continually working towards a simpler, more user-friendly platform that helps examiners to mark accurately and efficiently. What’s most important to us is that markers are able to focus on marking rather than spending too much time getting to grips with technology.

The group really puts users at the forefront of the software, and we’ve found that RM Results has been very quick to respond to markers’ needs. It’s been a pleasure to contribute to the ongoing improvement of a product that has evolved throughout the duration of a long partnership between OCR and RM Results.

How do you feel technology can improve assessment?

I think that technology has already improved the quality of assessment with regard to e-marking. One of the major differences between e-marking and traditional pen-and-paper marking is that examiners can hold a dialogue about the script. There is also a function to link a message to a particular script, so both markers can view the message alongside the text it refers to. This allows markers to engage with each other when they are unable to meet face to face, and leaves a clear audit trail to ensure that communications are transparent.
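To make the idea concrete, here is a minimal sketch of how a script-linked message thread with an append-only audit trail might be modelled. This is purely illustrative and does not reflect RM Assessor’s actual implementation; all names and fields are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class ScriptMessage:
    """A hypothetical message between markers, anchored to a script page."""
    script_id: str  # which candidate script the message concerns
    page: int       # page of the script the message refers to
    author: str     # marker who wrote the message
    body: str       # the message text
    sent_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


@dataclass
class ScriptThread:
    """An append-only thread of messages; the history doubles as an audit trail."""
    script_id: str
    messages: list[ScriptMessage] = field(default_factory=list)

    def post(self, author: str, page: int, body: str) -> ScriptMessage:
        msg = ScriptMessage(self.script_id, page, author, body)
        # Messages are never edited or deleted, keeping communications transparent.
        self.messages.append(msg)
        return msg


# Example: two markers discussing a script without meeting face to face.
thread = ScriptThread(script_id="SCRIPT-001")
thread.post("examiner_a", page=2, body="Is the quotation on p.2 creditworthy?")
thread.post("team_leader", page=2, body="Yes - it addresses the criterion directly.")
for m in thread.messages:
    print(f"{m.sent_at:%Y-%m-%d %H:%M} [{m.author}] p.{m.page}: {m.body}")
```

Anchoring each message to a page is what lets both markers read the dialogue alongside the text it refers to, while the timestamped, append-only list provides the audit trail described above.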

The RM Assessor3 platform will feature improved tools for standardisation and for reviewing scripts after they have been marked. The additional flexibility of the platform (being hosted in the cloud and accessible from different devices) will be great for busy examiners needing to mark at different times of day or from different secure locations.

What are the main barriers awarding bodies face in transitioning to e-assessment?

I think the biggest barrier to transitioning to full e-assessment is getting centres to submit work online. It would require a huge shift in the infrastructure of the education system, which currently lacks the resources to allow large numbers of students to sit exams on computers simultaneously. Security is also a major problem – schools and awarding bodies have to make sure that computers used during examinations are disconnected from the internet.

E-marking is a great interim solution while exams are still being conducted using paper scripts. With RM Assessor, schools and assessment centres can get quick access to annotated scripts, and students can receive their grades much more quickly than they would through the traditional system.

How can we help organisations to recruit more e-markers and make the role more appealing?

Examiners develop very valuable skills during their time in the role, but they need to feel that those skills are rewarded and respected, and they need a chance to pass on what they’ve learned while developing their own careers. To move forward, schools and awarding bodies need to work together to recognise marking and assessment as part of professional development.