When you read scientific research, you should be left feeling as though you gained knowledge and/or have something new and shiny that can be applied to the real world. But once in a while you finish an article and there is nothing but unpoppable “What did I just read?!” bubbles floating in your brain.
This article focused on how applicants’ personality types might affect their reactions to assessment tests within a hiring process. Specifically, candidates for firefighter, dispatcher, and rescue management roles had to complete a series of personality and cognitive assessments as part of the selection process. Immediately after, they were asked to complete a voluntary survey about their reactions to the tests. The researchers found that personality type had no impact on applicants’ perceptions that the assessments were job-related or that they could predict future job performance. Applicants with one personality type did perceive the tests as less fair than the others did, but the difference may not have been large enough to have real meaning.
As I read the article, I kept wondering how this information would be applied, or even how it would be useful, because the authors never told me. They briefly mention previous research showing that applicant reactions can influence whether a candidate accepts a job offer and even their future performance on the job. Yet they never relate their own findings to that previous research. I was left hanging.
The study also had a number of confounds, a few of which the authors acknowledged. Looking solely at rescue applicants isn’t representative of most jobs and applicants. Candidates had to first pass a physical test before they were allowed to begin the personality and cognitive assessments. The reactions survey only asked for their reactions to the personality and cognitive tests, but wouldn’t their perceptions of the physical test muck up their thoughts a bit?
Also, participants completed the reactions survey voluntarily, and not everyone did. Wouldn’t the thoughts of those who did NOT want to share their reactions be critical? Finally, the researchers found different reactions to the assessments based on gender and age, but they never investigated further, which I found disappointing.
Now I have to be fair and say that no research is perfect. All research has confounds. But when you feel as though you don’t get the “so what?” of the entire study and there are also lots of confounds, how are you supposed to react?
After reading this article I was left feeling a little icky inside. But it reminded me that reading research with a discerning skepticism is not only healthy, it is mandatory. It also brought to mind a wonderful quote from the philosopher George Santayana: “Skepticism, like chastity, should not be relinquished too readily.”