When this site was first conceived, the Royal College used 'norm-based' assessment (NBA), whereby a fixed proportion of candidates pass, determined by the performance of the cohort as a whole. In other words, around 80% (or whatever the figure is) will pass regardless of the absolute standard reached. Cynics would argue that this is essential for the Royal College to maintain their revenue stream from the exams.
The shift in assessment in recent years has been towards 'criterion-based' assessment (CBA), where candidates are marked according to whether they meet specific criteria agreed beforehand. You can get an overview of the differences between the two on this page at the University of Toronto. This is arguably a much fairer way of determining the skills of the candidates: under NBA you could score 95% on the exam and still fail if 80% of the candidates score 98%, whereas under CBA a 95% would comfortably clear any sensible pass mark. The College really have to sort this out, in my humble opinion.
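To make the contrast concrete, here is a minimal sketch of the two marking schemes. The figures are invented for illustration only (an 80% pass quota for NBA, a 50% pass mark for CBA); they are not the Royal College's actual numbers, and the function names are my own.

```python
# Illustrative only: the pass quota and pass mark are made-up figures,
# not the Royal College's real ones.

def norm_referenced_pass(scores, pass_quota=0.8):
    """NBA-style: roughly the top `pass_quota` fraction pass, whatever their marks."""
    cutoff_index = int(len(scores) * pass_quota)
    ranked = sorted(scores, reverse=True)       # best marks first
    cutoff_score = ranked[cutoff_index - 1]     # mark of the last candidate inside the quota
    return [s >= cutoff_score for s in scores]  # ties at the cutoff all pass

def criterion_referenced_pass(scores, pass_mark=50):
    """CBA-style: anyone meeting the agreed standard passes, regardless of the rest."""
    return [s >= pass_mark for s in scores]

# A cohort where eight candidates score 98, one scores 95 and one scores 60:
cohort = [98] * 8 + [95, 60]
print(norm_referenced_pass(cohort))       # the candidate on 95 still fails
print(criterion_referenced_pass(cohort))  # the candidate on 95 passes comfortably
```

Run on that cohort, the norm-referenced version fails the candidate on 95% simply because eight others scored 98%, while the criterion-referenced version passes everyone who meets the standard.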
Examiners have intimated that the Royal College is looking into how the OSCEs are marked, and, to be fair, it is endeavouring to introduce a new exam format. Hopefully, in a few years' time all of these teething problems and inconsistencies will have been resolved.