HRchat Podcast

Hiring Science: Why It's Broken and How to Fix It with Dr. Deborah Kerr

The HR Gazette Season 1 Episode 802

In this eye-opening conversation, assessment expert Dr. Deborah Kerr dives into why the science of hiring remains "broken" despite decades of research and how validated assessment tools can transform results.

Dr. Kerr reveals startling statistics: managers get hiring decisions wrong approximately 80% of the time, according to Gallup research, with human cognitive biases being the primary culprit. Your brain's natural tendencies toward confirmation bias, overconfidence, and anchoring on first impressions sabotage objective decision-making in ways most hiring managers never recognize.

Perhaps most surprisingly, many common hiring practices lack scientific validity. Experience as indicated on resumes shows almost no correlation with future job performance. Those "years of experience required" in your job descriptions? Research suggests they're nearly worthless as predictors of success.

What actually works? Validated, predictive assessments that measure cognitive ability (the strongest predictor across all jobs), followed by personality traits and work culture fit. When these assessments form the foundation of a structured hiring process - complete with targeted interview questions and job simulations - organizations experience dramatically better outcomes.

For HR professionals, this episode, hosted by Career Club's Bob Goodwin, provides actionable insights to transform hiring processes, reduce costly turnover, and make the business case for evidence-based hiring practices. The technical knowledge exists - now it's time to close the research-practice gap and revolutionize how we match people to jobs.


Speaker 1:

Welcome to the HR Chat Show, one of the world's most downloaded and shared podcasts designed for HR pros, talent execs, tech enthusiasts and business leaders. For hundreds more episodes and what's new in the world of work, subscribe to the show, follow us on social media and visit hrgazette.com.

Speaker 2:

Hey everybody, this is Bob Goodwin, President of Career Club and very happy guest host of HR Chat. Thank you, Bill Banham. Today I am really pleased to be joined by Deborah Kerr, PhD, who is the president of an assessment company called Affintus. We're going to be looking at best practices in hiring, some technical aspects of hiring and, maybe a bit more broadly, the job market today.

Speaker 3:

So, with that, Deborah, welcome. Thank you. It's always good to have a chat with you, Bob.

Speaker 2:

An HR chat with me, no less.

Speaker 3:

I know it's awesome. Come on.

Speaker 2:

So, just so people can get to know you a little bit, Deborah, do you mind just telling us a bit about Affintus? How long has Affintus been around? And maybe give us the two- or three-sentence description of what you guys do.

Speaker 3:

Yeah, Affintus is a pre-hire assessment company. We provide an assessment that evaluates an individual's personality, cognitive ability and work culture fit preferences, and then we take those preferences and compare them to the success factors for specific jobs. There are about a thousand jobs that exist in the world. You can call them a lot of different things, but really there are not many more than a thousand. So what we have is information about what the key success factors are. We're going to compare candidates to that, and we'll be able to statistically analyze and identify who's most likely to be a high performer.

Speaker 2:

Excellent. No, that's phenomenal. And I know we're going to get into more of this. It's not so much an infomercial for Affintus per se, but really to help folks understand best practices in hiring, maybe separate a little bit of fact from fiction. And I thought it might be helpful, Deborah, if you could help us set some context for the current hiring environment that we find ourselves in, as you guys see it at Affintus.

Speaker 3:

Yeah. Right now, as of the last day of December 2024, we had 7.6 million job openings, and this is from the Bureau of Labor Statistics.

Speaker 2:

Pardon, this is in the US?

Speaker 3:

Yes. Also in the US, as of the end of January 2025, we have 6.8 million unemployed people. So we've got jobs, we've got people. That's where we are right now. How do we match up those people with the right jobs? It's interesting that since 1975, authors have been writing articles on how hiring is broken.

Speaker 3:

Now, 50 years later, we're still talking about how hiring is broken. And I did a scholarly search, not a Google search, but a scholarly search, on the phrase "hiring is broken," and I got over half a million scholarly articles.

Speaker 2:

Wow, 500,000 articles.

Speaker 3:

Yeah.

Speaker 2:

Well, actually it's 508,000. There you go. But hiring is broken, and that's since 1975?

Speaker 3:

Yeah, wow, isn't that crazy. It's not like we don't know there's a problem, so.

Speaker 2:

Well, I want to pick up on something, particularly for people who are listening who don't know you, which is probably most everyone. I did introduce you with your proper title: you have a PhD. These are things that you have studied deeply, and you have the research and the academic background to not just have a theory or an idea or a feeling. So where does your research take you, Deborah? In your view, and don't let me limit you to the top three things, why is it still messed up? How come we haven't seemed to make a ton of progress?

Speaker 3:

There's one reason that it's messed up, and that's an absence of knowledge in the marketplace. We know what works. We've got 80 years of research. My goodness, Frederick Taylor wrote about this stuff: productivity, and how do we get productivity? But what we don't seem to do in the field is look at what actually works, at what we can demonstrate predicts success for an individual in a job. It's that absence of knowledge. We call it technical knowledge because it is evidence-based, research-based knowledge.

Speaker 3:

And so we end up with, I think, Gallup's estimate that when hiring a manager or promoting somebody to a manager, people get it wrong about 80% of the time. That is so expensive. 80% of the time, according to Gallup. And there's other research that finds that when managers make a hiring decision, about half the time they're not happy with that decision. So we've got the bad hire, the hiring mistake, and what that means is that after the hire, and after training and onboarding and appropriate management, you've got employees who are not meeting the goals of the job. And we've known about this for at least 50 years.

Speaker 2:

This has been written about, and you've got 80 years' worth of data that indicates we kind of know what does work. Is it willful ignorance? Is it a change management issue?

Speaker 3:

No, it's not willful ignorance. What it is is a research-practice gap. We've got the 80 years of research. I love that stuff. Most people don't love that stuff, right? Most HR professionals, for example, don't love reading the research. So we have that gap: I do what I am told works, without evidence, and I think that because everybody's doing it this way, it must be right.

Speaker 2:

I just don't understand how, after 50 years and 500,000 articles... But I think I'm a good judge of character, Deborah. You know, this is my company, or my function at this company, and I've seen who does well at this, and I'm just going to go hire people that my intuition tells me, or candidly my experience tells me, that if she's got these qualities, she's probably going to be pretty good at this job.

Speaker 3:

A couple of things are going to get in the way of a good decision based on the "I can tell by talking to you or reading your resume" kind of approach. When managers talk about their own hiring mistakes, here's what they say about the approaches that they believe lead them to making those mistakes. One: they rely too much on the candidate's own description of knowledge and ability. The LinkedIn profile, the resume. They rely on that instead of finding a way to confirm the veracity, the truthfulness, of that resume.

Speaker 3:

So a resume can say somebody was assistant director of operations when what the job actually was was assistant to the director of operations, right? It doesn't take much to present a whole different framework that managers then assume is right. We also see that managers say they don't use objective data consistently. And with objective data, you want to look at data that are unfakeable.

Speaker 2:

What's an example of that?

Speaker 3:

Pre-hire assessment tools provide you with unfakeable data.

Speaker 2:

I can't study for the CCAT, or Predictive Index, or some of the other ones that are out there, to kind of shape my responses? Because I know, say I'm in sales, they want me to say I'm outgoing, they want me to say I'm competitive, I'm goal-oriented, whatever, and I can kind of tell from the questions what the right answer is.

Speaker 3:

Well, here's the difficulty with knowing what the right answer is: across those thousand jobs, the patterns of response that we know lead to success are so different. And you're an N of one. You're one person. You have your experience, your lived experience. That's great. But recognizing that your individual experience and assumptions may not be true all the time is important. When we look at our 1,000 jobs, each one of those jobs is the result of surveys of, let's say, 2,000 people, and those 2,000 people are subject matter experts, incumbents in the job and managers, so people who have knowledge of that job from a variety of viewpoints. So we've got your personal experience versus what we know several thousand people say about the job. That's one issue. The other is bias, cognitive bias. Do you want to talk about that a little bit?

Speaker 2:

Well, I was going to say confirmation bias, but you're the expert, so, yes, that was my next question.

Speaker 3:

Confirmation bias is absolutely a thing and it is influencing decisions, even though we humans don't know that our brains are influenced.

Speaker 2:

Can you explain confirmation bias real quick for somebody who may know the term but can't define it?

Speaker 3:

Yeah. Confirmation bias reflects a belief that you hold, or a hypothesis that you have, about whether or not to hire me, right? And what you do then is seek information that confirms that belief. You're going to look for characteristics in my resume, in my interview, that seem to indicate to you that your belief is true. So we're looking for people like ourselves. And what happens then? We get into what's called cognitive tunneling, which is an interesting factor, and that means your ability to seek and see new information is impaired, because you've already been captured, because I am like you in these certain areas.

Speaker 3:

It is pretty uncomfortable for humans to say, I might be really wrong about this, so let me go ahead and look at the counterfactuals. And we tend not to do that. So confirmation bias is: I have a belief, I have a hypothesis about what works, and so I'm just going to look at what supports it.

Speaker 3:

So, confirmation bias is pretty important. Then there's the overconfidence bias, and this is an interesting literature; the research has covered all kinds of professionals. The overconfidence bias, I call it the "I'm not as smart as I think I am" bias.

Speaker 2:

Hey, hey, hey, that strikes too close to home, but go ahead.

Speaker 3:

We have a tendency and this has been looked at in philosophy as well as psychology we have a tendency to overestimate our knowledge and our ability to predict outcomes, and I think of Garrison Keillor's tagline. I just thought of that. What was it in Lake Wobegon? All the men are handsome, all the women are, was it beautiful, and all the children are above average.

Speaker 2:

It's the above average one that I remember.

Speaker 3:

Yes, yeah. So that's just a function of how our brains work: we do set ourselves apart, and we see ourselves as being slightly better than everybody else. It's like those surveys that ask, are you a good driver, and how good a driver are you compared to everybody else? Well, of course I'm a good driver.

Speaker 2:

And I'm better than everybody else, and all of our surveys say that everybody's better than everybody else.

Speaker 3:

Yeah. I've got to tell you a story about a study that was done at Duke University that just underlines this overconfidence. They did a survey of chief financial officers at large companies, and they asked the financial officers to estimate the results of the S&P index over the coming year. These are people who are highly trained, highly experienced, highly knowledgeable. The Duke researchers collected over 11,000 forecasts and then looked at the actual results, at how well the CFOs predicted the market's behavior. What they found was that financial officers of large corporations didn't have any clue about short-term performance. Not a clue. In fact, when the correlation was calculated, the true value of their estimates was less than zero. And what that means is...

Speaker 3:

Less than zero. Okay, we're in minus numbers now. So here's what that means: when the prediction was that the market was going down, and these are smart people, it was slightly more likely that it would go up instead of down.

Speaker 2:

So they're negatively correlated. All right. Well, first of all, before I move on, is there anything else? So there's the overconfidence bias, there's the confirmation bias.

Speaker 3:

There was the lack of information, the N equals one, and that's the availability bias, that last thing you talked about. What it means is that we make judgments about the likelihood of an outcome based on our own personal experiences.

Speaker 2:

Which is the N equals one. Yep.

Speaker 3:

Yep, if I can think of a time when this thing happened, then I'm going to believe it.

Speaker 2:

Yes, yes. Because I do want to move on to what a ray of sunshine might look like here in just a second. But is there anything else on the problem-statement side of this equation, Deborah, before we move on?

Speaker 3:

Maybe the anchoring bias is important, because that comes into play particularly later in the hiring process. The anchoring bias is that we as humans tend to fixate on the first piece of information that we get. Our first impressions: we're really hanging onto those as reality. And once that anchor is set, every other decision relates back to that anchor. So that can be problematic.

Speaker 2:

The first wedding dress phenomenon.

Speaker 3:

Yeah.

Speaker 2:

Right. Everything goes back to the first wedding dress, and that's why that one gets picked.

Speaker 3:

That's great, all right. So, bottom line, that anchor is going to contaminate future decisions.

Speaker 2:

So what we figured out is the little box inside of our head doesn't work quite as well as we wish that it did.

Speaker 3:

Oh, it's a mess up there.

Speaker 2:

And it has a very, very long track record, apparently, of not being very good.

Speaker 3:

According to those 500,000 articles.

Speaker 2:

Clearly. Okay, so with that, I want to move into assessments, because assessments are not a new idea. If you'll indulge me, it's a category that I think a lot of listeners will be familiar with, whether it's Myers-Briggs, Predictive Index, I don't know what else is out there.

Speaker 3:

There's DISC, something spelled D-I-S-C.

Speaker 2:

DISC, which is a communication style instrument, both your own style and being able to read other people's. Hogan. I mean, there's tons and tons.

Speaker 3:

I forgot about Hogan yeah.

Speaker 2:

There's tons and tons of stuff out there, and I think what people are saying is: Deborah, I hear you. That's why we're trying to use some assessment tools, to bring a level of objectivity. We know that we've got bias, and so that's why we've been using a tool that we hope and believe is going to work. Before we get into the puts and takes of what makes for a good assessment, a valid assessment if you will, just more generically: when in the hiring process can an assessment be best utilized? And, maybe as importantly, what is a bad way to use an assessment?

Speaker 3:

Let me start by making a distinction within the category, please. We've got two kinds of assessments. One is descriptive, and that really goes to the DISC and the Hogan, and what that description does is give us information about ourselves. Then there's the other category, which is predictive, which means that the assessment is validated, tested to show that it in fact is related to future performance on a job. Okay, so we've got the ones that describe, and they're really useful.

Speaker 3:

For hiring purposes, we don't use those, and most of those that are descriptive will have statements saying: don't use this for hiring.

Speaker 2:

Okay, yeah, okay, and I think another one, maybe that would be StrengthsFinder.

Speaker 3:

Oh, StrengthsFinder is great, and it is descriptive.

Speaker 2:

And it's very descriptive, yeah. And for people who know who I am on this call: we work a lot with both executive coaching and job seekers, and we find that these descriptive tools are quite good for understanding myself, doing some self-reflection, and understanding a little bit better what are some of the attributes that make me me. And, to your point, being able to give words to it, which is super helpful. Again, it isn't necessarily predictive of anything, but it is helpful in terms of me being able both to understand for myself and to describe to somebody else some attribute of me.

Speaker 2:

So it sounds like, if I'm following, the descriptive instruments are probably not very helpful in a hiring context, and if listeners want to go explore assessments in general, they should be focusing on the ones that are purported to be predictive. Is that true?

Speaker 3:

Well, and validated. Somebody can say they're predictive, but what you want to look at is that technical report. That means external people, not associated with the assessment company, come in and test your instrument to make sure that you are in fact delivering what you say you are. We had some folks from Auburn University come in and do our validation study.

Speaker 2:

Gotcha.

Speaker 3:

So yeah that's what you want to look for, and you can tell that by looking at that technical report.

Speaker 2:

Okay. So, just to put it out there and not dance around it: using the instrument as a filter, where I'm not even going to talk to you if you don't score some way on the instrument. Would that be a mistake, or is that actually a valid use of one?

Speaker 3:

It's a mistake if it's a descriptive tool. It is a valid use of a predictive tool.

Speaker 2:

Okay.

Speaker 3:

Because you actually have the proof that the decisions you're making on candidates relate to the requirements of the job.

Speaker 2:

With this level of confidence, with this much wobble, yeah, yeah, yeah.

Speaker 4:

Right.

Speaker 3:

We've got the numbers for that. Would you like to look? You know, I have some slides. There's one slide I'm thinking might be useful.

Speaker 2:

Okay, but we're going to need to talk it out too, if you don't mind, because most people are probably listening to this as an audio podcast.

Speaker 3:

Oh, that's right, that's right, that's right.

Speaker 2:

Okay.

Speaker 3:

So here's what you want to look at: what is the validity of the source of data that you're using? For example, experience is a very big part of what people look at, and they learn about it through resumes, through LinkedIn, and again, you can't confirm any of that. Then we look at the research that's been done on the relationship between reported experience and success on the job. There is no correlation there. The number of years of experience does not predict success on a job. The classic study on this was in 1980, Greenberg and Greenberg, and they looked at salespeople. You know, there's an assumption that if somebody has two years of experience, they're probably not going to be as proficient as somebody with six years of experience.

Speaker 2:

And they did a controlled study on that and found out that experience does not relate to success in the future. And, okay, I don't want to get too deep into this one, but that would have some future horizon, some window, right?

Speaker 3:

Oh, the future success.

Speaker 2:

Yes.

Speaker 3:

Yeah, that would be determined by the company, after they do the onboarding, training and management and observe performance, certainly within a year. And here's why the research is important: the researchers have the data over time, and they're analyzing it over time. It takes about six months for people to settle into a job if you're a frontline worker, and probably two years if you're a CEO. But it's important to remember that experience does not necessarily lead to success in your job, even if the job title is the same.

Speaker 2:

How can you use what you learn from these assessments, and again we'll narrow this down to predictive, validated assessments, for sourcing candidates? Let's not get into vetting candidates yet, but can I even use it in a job description, or somehow make sure that I'm sourcing better candidates that will be a good fit for the role?

Speaker 3:

I want you to say a few more words about that, to make sure I'm saying relevant stuff.

Speaker 2:

If I've got 100 applicants, I can run 100 applicants through an assessment and narrow them down to 10 people or whatever. What I'm asking is, can I get 100 better candidates through how I phrase the job description, or where I go fishing for candidates? If I know certain things are true about future success in this kind of role, and in your example, experience in sales, according to that study, didn't matter a lot, well, I would massively change how I wrote that job description, the "you must have XYZ." How can we change the job descriptions, and/or where we're fishing for talent, based on what we know about future performance and what the assessment might yield?

Speaker 3:

I think the beauty of using this science and research is that you can cast the net as wide as you want, and the assessment is going to point you to the people who are statistically more likely to fit your job, to learn the job faster, stay longer and produce more.

Speaker 3:

Those are all of the outcomes that we see when we're using validated assessments. So yes, you could. Instead of my own version of a decision, with my biases all in place, I can make a decision based on the assessment, which helps me get away from that, so that I am not relying on my own biases. An example: a client, a user of our service, was in electric lighting, they did all kinds of lighting things. When we talked to the CEO of the company, he said, you know, one thing that's really important to me when I read these sales resumes is that they don't have any mistakes in spelling, any typos or any commas out of place. When we talked about it with him, we found out that top salespeople really are not very interested in commas.

Speaker 3:

And he said: you mean I could be missing my best employees? And the answer is: yeah, you probably are.

Speaker 2:

Okay, fair point I want to real quickly because I mean we've got a lot of hr professionals who'll be listening to this and one of the things that I this is my bias now coming in okay, that I think that that interview processes are either super structured and maybe not based on research and knowledge, or they're very organic, and my definition of organic is ain't got no plan, so it's just sort of the wild west, right, you know? And? And so I'm curious, how can an assessment be used, a predictive, validated assessment, be used, to shape better questions, to have just a better interview process? That's going to enhance the likelihood of getting the best candidate for this role at this company, of getting the best candidate for this role at this company.

Speaker 3:

One thing and this relates to something you said just a minute ago clarity in that job description is essential.

Speaker 3:

And it is essential because if you can't define what the job is, and here I'm thinking of the essential job functions, it's so important that we know what those are and have validated them within our company. If you don't clarify that job description, then no one's going to be able to get the right information to make a decision, and you may not even have the opportunity to interview those top performers, those top candidates. So that clarity in the job description is really important.

Speaker 3:

Now, with the interview process, you actually said several things that are very useful for people. One, and there is a legal component to this which HR folks are aware of, is the idea of basing the interview questions on those essential job functions. An interview is a data collection process, it's not a conversation, and you can get yourself, or your organization, into a bit of a tough spot if you're doing the Wild West approach: oh, I just want to get to know you. Actually, you don't. What you want is the opportunity to collect good data.

Speaker 3:

So, having those questions designed, and this is where HR is so important in the systems area. HR can design those questions, and you've got to then train the managers to use the questions, and you've got to hold them to that, and you also have to teach them to take notes. That is the evidence of your appropriate consideration of the characteristics of any job candidate. Now, with some valid psychological assessments, including ours, we provide a report to the hiring manager. You've got your standard questions, right, that HR is going to pull together; add to those questions based on the results of the assessment and the job match. It can be that somebody is not an exact job match, but bringing the manager's knowledge in and having the manager assess that during the interview process can be extremely helpful. We happen to provide those customized questions to hiring managers so they can check on the factors where there's not a strong match.

Speaker 2:

But to your point: one, I need to know the questions to ask; two, I need to actually ask them; and three, I need to record the responses. Just to be clear, let's say somebody is using a quality assessment like we've described, and they've got three candidates who all did, you know, appropriately well, if that's even the right adjective or adverb for how somebody does on an assessment.

Speaker 3:

Oh, they matched the job, they matched the job.

Speaker 2:

Thank you, they matched the job.

Speaker 3:

Yep.

Speaker 2:

You talked about, and I hope I'm not asking two questions in one by accident, but at the beginning we talked about cognitive assessment, personality assessment and work culture fit. So I've got three people who, through the assessment, seem to be strong fits for the role, and we're interviewing them. How do I use those three different lenses to actually pick my best candidate? Or is there even a best candidate?

Speaker 3:

There's going to be a strongest-fit candidate, and I think that's the three you're talking about: they really have the inherent qualities that match the qualities needed for success in that job. So there is that. Now, we never, ever would recommend that somebody use any assessment as the only driver in decision making. You're going to have that statistically valid information, and the interview brings in the knowledge of the managers, which is extremely helpful. So if you're going to pick from the three, you will have had those manager conversations. The other thing that can be very helpful, if it's well structured, is some kind of job simulation, where you ask people to demonstrate they can do some part of the job, some real part of the job.

Speaker 3:

This is not like "why are manhole covers round?" Those kinds of trick questions, we've known for a long time, are completely invalid and really don't give you any useful information. But a well-designed job simulation takes one part of the job. And don't ask somebody to do a complete marketing plan, come on. Ask someone to do a certain kind of work that's limited, that is constrained, so that you can begin to see how they apply those personality and cognitive factors on the job. So that's another source of information.

Speaker 2:

Is there a stack ranking of those three attributes: cognitive, personality, culture?

Speaker 3:

What we know is that cognitive functioning, and everybody talks about the same three aspects of cognitive functioning: verbal reasoning; numeric problem solving, what kind of math can you do; and then logic, deduction and reasoning, that problem-solving piece. We know that cognitive functioning is the single most accurate predictor of success on the job across all jobs. So if you get a match on that cognitive area, that's a very strong indicator. The second strongest is personality.

Speaker 2:

And let's double-click on personality. What does that even mean? That could mean anything anybody thinks it means.

Speaker 3:

Well, actually, no, it can't.

Speaker 2:

I know. That's why I want you to tell us.

Speaker 3:

There are at least 80 or 90 years of research at this point. Personality factors are characteristics of an individual that tend to stay stable over the lifespan. So once I become an adult and have developed my personality, those factors and the way they operate within me are statistically unlikely to change for the rest of my life. My personality is a combination of those factors.

Speaker 3:

For example, introversion and extroversion is a factor that most people have heard of. That is a proven construct; we have, again, that hundred years of research that shows it. People tend to be on a sliding scale: some tend to be naturally more introverted, some naturally more extroverted, and again, there's a scale in there.

Speaker 3:

Another one is how I approach others at work. It's a characteristic called agreeableness. So when you're looking at the job: is this a job where it's important for the team to essentially work together, to draw on each other's knowledge and experience? Or is this a job like the one you described early on, where I'm a strong salesperson, do not make me try to coach somebody else? That's the other end of that agreeableness scale: do I prefer a group orientation or an individual approach? And there are about 18 of those factors that have been proven. They do not co-vary, and what that means is that if I am an extrovert, that does not affect in any way that notion of agreeableness. I can be a group-oriented person and be an introvert. They don't change when you compare the two of them. So these are all independent factors.

Speaker 2:

So it's interesting, because bringing up the rear, though that's not to say it's unimportant, is culture and culture fit.

Speaker 2:

And yet, and this is probably just anecdotal on my part, it feels like culture fit is this big thing: we're looking for people who are going to be a great fit with our culture, and that's the most important thing, because if you're not going to fit into our family, it's probably not going to work out. And that's not your number one. No. It's not your number two. No. It's your number three. Tell me about culture.

Speaker 2:

What does that mean, and why is it third, not first or second?

Speaker 3:

Well, that's the research. The research has shown us that culture fit is not as strong a predictor as cognitive or personality. That's all that means, and it doesn't preclude culture fit as something to look at and consider. But it is important to understand that the research on it is pretty mixed. I've read the same things you have, that culture fit is the most important thing. It's actually not the most important thing. It is an important thing, and it's worth looking at, but it is far behind cognitive and personality match.

Speaker 2:

What are some of the attributes or diagnostics of culture fit?

Speaker 3:

When we're looking at culture fit, we're looking at something like perfectionism, which is a culture attribute of a job. If I'm an accountant, and a good one, a good fit means I'm detail-oriented. I'd better be, because there are laws, there are standards, there are all kinds of requirements. Now take that same factor for a salesperson: you don't want a salesperson who's that detail-oriented; you want someone who's stronger in other characteristics. A detail-oriented salesperson is a bit of a mismatch.

Speaker 2:

Yeah, and how are those different from personality factors?

Speaker 3:

They're actually sub-factors of personality, but we call them culture fit because they're the ones that reflect the ethos of the organization, I guess you could say. Another one of the culture fit factors...

Speaker 3:

it just left my head. We've got perfectionism. Oh, hierarchy: what's the structure of the organization? Do I prefer to work in an organization where the hierarchy is very clear in terms of who makes what decision? Or, at the other end of the scale, is it an organization where there's more of a group dynamic, where we do shared decision making? Both of those approaches work; it's just a question of where the individual fits in that work culture.

Speaker 2:

Let's start bringing this home a little bit. I'm looking at the clock and I want to make sure we're being respectful of your time. If I'm an HR professional listening to this, Deborah, you're making a really good case that, left to our own devices, we can screw it up; more often than not we don't get it right, because of the different biases. How do I go and make the case to my CFO that we need to be investing in this? Because the cost of getting this wrong seems to be high,

Speaker 2:

but I'm not sure how to dimensionalize that.

Speaker 2:

How would you recommend they do that?

Speaker 3:

Well, I think that cost is an excellent place to start. Most organizations do not report financial information directly related to the human resources in their organizations, so they don't know how much it costs to hire someone, or how much it costs when we lose that person within seven months of the hire. So it begins with cost. I've got estimates: for a $30,000-a-year job, it's probably going to cost about $18,000 to hire somebody. For a $60,000 job, it's about $48,000. And for a $100,000 job, it's probably $128,000.
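The figures Dr. Kerr cites imply that hiring cost grows faster than salary (0.6x of salary at $30K, 0.8x at $60K, 1.28x at $100K). A minimal sketch of an estimator built only from those three cited points, where the linear interpolation between them is my assumption rather than anything from her 44-item calculator:

```python
# Cost-to-hire figures cited in the episode: (annual salary, estimated hiring cost).
CITED_ESTIMATES = [
    (30_000, 18_000),
    (60_000, 48_000),
    (100_000, 128_000),
]

def estimated_hiring_cost(salary: float) -> float:
    """Estimate hiring cost by interpolating linearly between the cited points.

    Outside the cited range, scale by the nearest point's cost-to-salary ratio.
    This interpolation scheme is an illustrative assumption, not Affintus's model.
    """
    points = sorted(CITED_ESTIMATES)
    if salary <= points[0][0]:
        return salary * points[0][1] / points[0][0]
    for (s0, c0), (s1, c1) in zip(points, points[1:]):
        if salary <= s1:
            t = (salary - s0) / (s1 - s0)  # fraction of the way between the two points
            return c0 + t * (c1 - c0)
    return salary * points[-1][1] / points[-1][0]

# For a $45,000 job this estimator suggests roughly $33,000 to hire.
```

The point of the sketch is the business case itself: even at modest salaries the estimate is a large fraction of a year's pay, which is the number to put in front of a CFO.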

Speaker 3:

We have a calculator sheet of 44 items having to do with hiring. So getting those costs down and using those numbers is key to talking to the CFO. There's no better partnership than HR and finance working closely together to really impact the bottom line of the organization. So we've got that cost data that people need to have at their fingertips. And the second thing, and HR is so good at this, is getting clarity on the job description and those essential job functions, and hammering those home to the managers. There's research showing that, left to their own devices, managers are not going to have a very good hiring system. But if HR can design a system and then go through the change process of implementing it, outcomes are much better. The first step in any hiring process is to use a valid assessment. Step one. It's like: hey, we got your resume, here's my automatic reply, here's a link.

Speaker 3:

The next step is: go complete this questionnaire, so that you're not wasting time on people who are statistically unlikely to be successful.

Speaker 2:

Yeah, it's interesting. My last commercial job before I started Career Club was at a company owned by a private equity firm, and I don't have enough knowledge to say whether they used a predictive, validated tool or not.

Speaker 3:

I'm sure my guess would be no.

Speaker 2:

They believe they did. What I do know is that they believed their religion: everybody had to take the assessment, and if somebody couldn't clear the threshold on, like you were saying, cognitive or personality, it didn't matter how impressive they otherwise seemed. They never violated their religion, because their experience was that when we make exceptions, it doesn't work. More often than not it doesn't work; we've just created a new problem for ourselves.

Speaker 2:

So, your point on making sure it's a validated tool, I think that's number one. Is there anything else? Because, again, I do want to be mindful of the time, and I do want people to know how to find you. Is there any major point we haven't made, Deborah, that we'd be remiss not to cover?

Speaker 3:

You know, there's one more piece, and this is a tough one. I think you and I have talked about this at some point: providing feedback to applicants.

Speaker 2:

Oh, please yes.

Speaker 3:

Yeah. What we know is that most companies do not provide any feedback to applicants. It's what people talk about online: the black hole, where I put stuff out there and get nothing back. What the research shows is that your employer brand in the marketplace can be enhanced if you provide some kind of feedback; people then see the application as more valuable. That gets you away from "I only apply when I can click twice and my application is done." You want to increase the number of candidates who have the intention to apply to your job. So that feedback loop, and finding a way to do it, which is challenging. We at Affintus have actually stood up a whole section just for applicants. But that can be really useful in your reputation management.

Speaker 2:

Yeah, there's something called the Candidate Experience Awards, the CandEs, and it's longitudinal, it's quantitative. One of the clear drivers of employer brand is feedback: quick feedback and helpful feedback. So I'm a huge advocate of what you're saying there, as long as people can stay between the legal-liability lines and not get in trouble. But I completely agree with what you said, so I appreciate you bringing up the feedback point. If people want to get in contact with you, learn more about Affintus, learn more about how to use validated, well-researched tools to help them make better hiring decisions, what's the best way for them to do that?

Speaker 3:

Go to our website, www.affintus.com. A-F-F-I-N-T-U-S dot com.

Speaker 2:

A-F-F-I-N-T-U-S dot com. Deborah, this has been so much fun. What I love about this is that it's a huge problem, so you've taken on a big ocean to tackle.

Speaker 2:

It's one that is well researched; this is not snake-oil territory. I'm afraid people may have had a negative experience with assessments that were either merely descriptive, or purportedly predictive and weren't, because they weren't validated. I appreciate what you share with people. The takeaway is: can you show me your validation? Can I please have the technical document that

Speaker 2:

shows validation by a third party who's not affiliated with you? I think that's really, really important. This is an important issue both for the hiring companies and for the applicants. And, to make this clear to people, there's a section on the Affintus website for job seekers. Something we highly advocate at Career Club is that people go take that assessment. It will help you better understand what a good fit looks like for you, and it gives you language to describe, when you're interviewing, why you believe you'd be a good fit for this job, which you're going to get asked. Here's a free tool to help you be better informed and more articulate in how you respond to a question like that. So, Deborah, as always, I love our conversations. I learn something every time, so thank you for that. Thank you.

Speaker 2:

Thank you, Bill Banham, thank you HRchat and HR Gazette, thank you for the opportunity to guest host, and we look forward to seeing you on the next episode. Bye-bye.

Speaker 4:

Bye. This episode of the HRchat podcast is supported by Deel. Are you looking for an all-in-one solution to manage your global workforce? With Deel, you can easily onboard global employees, streamline payroll processes and ensure local compliance, all in one flexible, scalable platform. Join the thousands of companies who trust Deel with their global HR needs. Visit deel.com today, that's D-E-E-L dot com, to see how you can start managing your global team with unmatched speed, flexibility and compliance. Thanks, and we hope you enjoy this HRchat episode.
