The State of “Skills-First” Hiring

Of the MANY themes that were addressed at #SIOP23 (the Industrial/Organizational Psychology Conference), I was particularly keen to learn more about the “Skills First” trend that has become ubiquitous in HR media over the last 6-12 months. I’ve read LOTS of articles on this and certainly understand the gist… we should hire for SKILLS rather than education and experience. The salient points…

  • Education and Experience don’t work (they correlate with job performance at just .10 and .07, respectively, per the latest meta-analytic data from Sackett et al.)
  • Education and Experience introduce bias (screening people out on either criterion disproportionately eliminates candidates from certain racial and socio-economic groups)
  • We use Education and Experience as a “PROXY” to make certain assumptions about traits and capabilities. We do this because it’s easy, but it’s time to recognize that it’s doing more harm than good.
  • We need to recognize that people can gain skills in lots of different ways, so let’s ditch the outdated and ineffective education and experience requirements and broaden our minds.

OK, great. But what I have YET to see in any of these articles is HOW to do this. It feels like “hire for skills” is just a euphemism for “eliminate your education requirements,” with no instruction on how to actually verify skills or otherwise operationalize this fantastic notion. So, I attended every session at SIOP with the word “skills” in the title. And, after 5 hours of listening to over 30 I/O Psychologists and practitioners from leading organizations, I’ve concluded… it’s kind of a cluster___.

Basically, when the word “skills” is bandied about in the “real world,” it is actually something of an umbrella term that includes:

  • Knowledge (MS Office, OSHA, phlebotomy)
  • Skills (attention to detail, communication, organization)
  • Abilities (mechanical aptitude, learn quickly, critical thinking)
  • Other Characteristics (independent, agile, comfortable with ambiguity)

Anyone who has been in HR for a while will recognize something interesting there (hello KSAOs). So, is this “skills first” thing even NEW? I think… kind of yes. Why:

  • It’s necessary. For most organizations, applicant pools are tight, and shifting our mindset away from traditional criteria toward a broader strategy for determining whether someone is capable of doing the work opens up lots of options that weren’t available to us before.
  • It’s fair. With intense attention being paid to DEI, the idea of removing screens with embedded bias and putting everyone on equal footing is appealing.
  • It’s progressive. Rather than staffing for “jobs”, we staff for “capabilities”. We create agile talent marketplaces within our organizations that allow us to get stuff done in less stodgy ways. It also increases talent mobility (and, likely, retention) within organizations.

All good stuff. Implementing skills-first hiring, then, requires a fundamental shift in how we screen candidates. We ditch the education and experience requirements, sure. But then we have to actually understand WHAT skills (and knowledge, abilities, and other characteristics) are needed for certain jobs (or job families, or even companies as a whole). This requires developing skills ontologies or taxonomies (or whatever terminology you prefer), and doing so is hard, time-consuming, and expensive (if you’ve ever been through competency modeling, this is like that, but even more granular). Not to mention… IT CHANGES. FAST. The half-life of learned skills is generally believed to be about 5 years. And the number of skills required for a given job has been increasing by 10% year over year since 2017.
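To make the “granular” part concrete, here’s a minimal sketch (in Python, with a hypothetical job and skill names I made up for illustration) of what a single entry in a skills taxonomy might look like. Real ontologies run to thousands of skills, with proficiency scales, relationships between skills, and a review cycle to keep them from going stale.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal taxonomy entry. Real skills ontologies track
# thousands of skills plus proficiency scales, relationships, and versioning.

@dataclass
class Skill:
    name: str
    category: str        # "knowledge", "skill", "ability", or "other"
    proficiency: int     # required level, e.g. 1 (basic) to 5 (expert)
    last_reviewed: str   # taxonomies go stale fast, so track review dates

@dataclass
class JobProfile:
    title: str
    requirements: list[Skill] = field(default_factory=list)

# Example entry (all values invented for illustration)
maintenance_tech = JobProfile(
    title="Maintenance Technician",
    requirements=[
        Skill("OSHA lockout/tagout", "knowledge", 3, "2023-04"),
        Skill("Mechanical aptitude", "ability", 4, "2023-04"),
        Skill("Attention to detail", "skill", 4, "2023-04"),
    ],
)
```

Even this toy version hints at the maintenance burden: every field is something somebody in your organization has to define, defend, and keep current.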

Even if you manage to get a good grasp on the skills needed for a job (today), you still need to figure out how to measure them. That means either developing some measurement method(s) internally (there’s time, money, and risk involved here, not to mention you need pretty large populations to do local validation) or using an off-the-shelf skills test (usually pretty quick, easy, and cheap, but you have to make sure the vendor is keeping the test content up-to-date and relevant, especially in technology-related testing).
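If “local validation” is unfamiliar: at its core, it means checking whether scores on your measure actually predict later job performance for your own people. Here is a toy sketch (Python, entirely made-up numbers) of the basic arithmetic; a real study needs a far larger sample, a defensible performance criterion, and ideally an I/O psychologist running it.

```python
from statistics import correlation  # Python 3.10+

# Made-up illustration: test scores at hire vs. later performance ratings.
# A real local validation study needs a much larger sample and careful
# criterion measurement; this only shows the basic calculation.
test_scores = [72, 85, 60, 90, 78, 66, 88, 74, 81, 69]
performance = [3.1, 4.2, 2.8, 4.5, 3.6, 3.0, 4.0, 3.4, 3.9, 3.2]

r = correlation(test_scores, performance)  # Pearson r
print(f"Observed validity coefficient: r = {r:.2f}")
# For context, the Sackett et al. meta-analysis cited above puts
# education at about .10 and experience at about .07.
```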

Then, even if we nail it and verify that an applicant has the skills and knowledge they need to come in the door and hit the ground running, there is still a lot we don’t know about them. What about attitude, work ethic, interpersonal skills, motivation, willingness to be coached, temperament, emotional intelligence, commitment, ability to learn and adapt to change?

Get why I’m saying it’s a bit of a cluster? I think philosophically this is the right direction for companies to be going. Get rid of restrictive, ineffective, and outdated proxies, and jump to actually validating that candidates have the “right stuff” to do what needs to be done. But, operationally… it’s incredibly complex. Most of the huge organizations addressing this at SIOP admitted their skills-first programs are in their infancy and they don’t really have any data yet to know if it even works.  

So, where does that leave us? Here are my recommendations:

  1. Let’s go all-in on changing the paradigm of how we hire. Stop relying on education and experience as a top-of-funnel screen for most jobs. This will take some deliberate change management, especially with your recruiters & talent acquisition teams. Verifying skills and knowledge via testing, simulations, work samples, or other methods is far better (more effective, fair, and inclusive) than assuming someone possesses certain skills and abilities just because they’ve completed a degree or held a certain job for a certain amount of time.
  2. Let’s get better at understanding what actually correlates with success. In my experience, very few companies have methodically identified what matters most—is it reliability, mechanical aptitude, initiative, empathy, intellectual curiosity, adaptability, critical thinking (the list is infinite)? Once you have a clear picture of what you’re selecting for, the selection process falls into place.   
  3. Let’s leverage assessment tools that are meticulously selected based on their ability to scientifically measure the things that matter most to you. This mitigates human bias, increases consistency and accuracy, and sends the message to candidates that you’re making decisions based on data rather than subjective opinions (like what they wore to the interview or whether they sent you a handwritten thank you note).

While skills and knowledge testing definitely has its place in selection (especially if it’s taking the place of flawed education- and experience-based assumptions), I am still a fan of also measuring some of the more stable personality traits that may be required for success, along with the ability to learn (especially given the rapidity of change in our VUCA world), and potentially some other aspects such as interests and values.
