For companies hiring staff, pitches from online security firms sound appealing enough: Running a credit check before signing up a new employee will “offer insight into an applicant’s reliability and a sense of their personal responsibility,” insists employeescreen.com.
Another security firm swears employers using credit checks will “find out what you need to know.”
It is no wonder nearly half of all employers bite. Credit checks are cheap, costing as little as $20 apiece, and what business wants to hire irresponsible, unreliable employees?
And then there are the discredited but lingering myths that even the credit-reporting companies aren’t peddling anymore: On surveys, many employers still speak as if probing job applicants’ past financial struggles serves as a crystal ball, predicting which prospective hires will steal from customers and defraud their employer.
It’s all gravy for the companies selling credit reports, which were designed as a tool for lenders but have found a lucrative new market: anxious employers. But the evidence isn’t there.
You’d think that for a practice so prevalent in business, there’d be reams of studies showing that checking credit history is an effective way to build a reliable, responsible workforce. No such luck: Almost no research evaluates credit checks at all, and even fewer studies consider how they are actually used in hiring.
Instead, the research shows something very different. A 2013 Demos study I worked on found that poor credit history is associated with unemployment, lack of health coverage, and heavy medical debt among middle-income households with credit card debt. Medical costs were a major contributor to debt for more than half of households with poor credit, while nearly 1 in 3 experienced a bout of extended unemployment. In many cases, the job applicant with unpaid bills isn’t irresponsible, but simply needs steady employment.