Forget A levels – it’s aptitude tests that need scrutinising

Julia Rampen asks whether we should be scrutinising the use of aptitude tests by employers, and discusses some of the flaws in this approach.

Below is an abbreviated example of a question from perhaps the most career-defining exam many of today’s students will ever sit – and it’s not A-levels or even university finals, but the far less scrutinised and increasingly prevalent verbal reasoning test. It is one of many aptitude tests used by employers as varied as KPMG, the RAF and Unilever.

In Japan, companies generally expect their employees to put in long hours of overtime. But it is difficult for women, who also have household chores to do and children to take care of, to work at the same pace as men, who are not burdened with such responsibilities.

“Japanese men do not share household chores and childcare with their wives.” Based on the passage above, is this statement definitely true, definitely untrue, or do you have insufficient information?

Why do employers use aptitude tests?

There are obvious benefits to aptitude tests. For employers expecting thousands of applications, they are a handy sieve to separate the entitled, the lazy and the deranged from the sensible. For applicants, they are a clear sign the company is interested in their ability and not in whether they play polo or knew their future boss’s old tutor.

But given how passionately the redesigned A-levels have been argued over, it seems strange that we don’t question aptitude tests more often.

Here it’s time for my disclaimer: I’m terrible at aptitude tests. The first time I came across them was when I decided to apply for the Civil Service Fast Stream, circa 2011. I immediately failed them.

However, it being shortly after a recession, I decided it was still worth my while to master aptitude tests. The moment I gave up came during a verbal reasoning test about a couple who missed their early morning flight and were given doughnuts by the airline while they waited for the next one. I could imagine this couple, already dressed for their summer holiday, huddled in the corner of the airport. “The couple had a free breakfast – true or false?” the test asked. I chose “false” – which is what I would say if I were handed a deep-fried, chocolate-covered ring of dough at six in the morning and told to eat my breakfast. I was wrong.

Gaming the system

So, now you know my bias, here’s the problem with relying too much on aptitude tests. First, they can be gamed. In some cases, where the test can be done remotely, this literally means someone else is doing them. One friend, who is something of an Einstein at aptitude tests, told me: “I’ve done Deloitte, PwC and KPMG despite never actually having applied to these companies.” For him, there is no mystery: “It’s just understanding what these companies do, and who they’re looking for. I could just see.” For those who don’t have such accomplices, there are forums and practice tests online, including “packages” that cost £49.

Of course, there’s nothing wrong with trying to improve at passing aptitude tests, if this is genuinely what employers think will make you the best fit for the job (I knew one applicant who spent three years taking such tests and eventually succeeded). But less confident applicants may already have fallen by the wayside. The aptitude test genius I mentioned earlier did them for a friend. He reflects: “My friend probably could have got through them if they had tried, but was nervous about their quick maths skills.” While most employers do not share their application process with the world, we do have data from the now-defunct Civil Service Fast Stream aptitude test. This data shows there is a pattern to who is intimidated by aptitude tests: in 2014, graduates from BME or working-class backgrounds were more likely to ditch their application rather than face the test, compared to those who were middle class and white.

Aptitude tests and bias

As this suggests, aptitude tests may eliminate overt prejudice, but they do not always create a level playing field. A 1995 experiment with black and white students in the United States found that black students performed worse in exams when reminded of their race – and the stereotypes associated with it. In the 2016 Civil Service Fast Stream test, the success rate of ethnic minority applicants was 3.2 per cent, compared to a success rate of 4.9 per cent for white applicants.

The Civil Service Fast Stream test also revealed a huge failure in recruiting applicants from working-class backgrounds. A 2016 report looking into this problem found that just 4.4 per cent of successful applicants were from lower socioeconomic backgrounds – making the Fast Stream less diverse in this regard than Oxbridge.

This is worrying when you consider that people from that demographic are the most likely to rely on public services and experience the brunt of initiatives implemented by the civil service, like welfare reform. Would the roll-out of Universal Credit – a one-size-fits-all benefit that has, due to errors, in some cases left claimants on the brink of starvation – be different if more of the top honchos in the Department for Work and Pensions had direct experience of what it was like to be reliant on benefits?


Even if aptitude tests were representative of graduates as a whole, there’s still a risk of institutionalisation. Browsing aptitude tests online suggests high scorers are likely to be sensible, good with numbers, and able to anticipate what their boss wants. This person sounds like a great employee. They also sound like the kind of person who, if faced with a fundamental crisis in the company, could be adept at massaging the numbers and reassuring their boss that everything’s fine.

Aptitude tests tell us nothing about an individual’s ability to call out groupthink. Nor, for that matter, do they capture the intricacies of day-to-day office life – the ability to manage other people, to manage mood swings or charm a potential client on first meeting. According to the Institute of Psychometric Coaching, verbal reasoning tests become more difficult because they use “higher language” and more complex text. But is that really necessary? Shouldn’t companies focus on trying to make their internal communications clearer, rather than indulging in corporate jargon?

The answer to the question at the start of the article is “true” (apologies to Japanese male feminists). As for me? I think with more perseverance, I could have passed an aptitude test eventually, and I think it’s also fair to say that the civil service has functioned perfectly well without me. But there is one thing I will maintain till my dying day: doughnuts are not breakfast.

Julia Rampen

Julia Rampen is the digital night editor at the Liverpool Echo, a former digital news editor at the New Statesman, and a former financial journalist.
