People Analytics

Adam Grant and Replacing Intuition with Evidence

The Washington Post wrote a piece in early April profiling Adam Grant, a professor at Wharton (my alma mater) and author of Originals. The piece describes him as Malcolm Gladwell meets academia and is very flattering. He is at the forefront of a new trend called “people analytics”, a term that just keeps growing in popularity. Here is what Google search interest for “people analytics” looks like:

People Analytics trend over time

It’s not a hockey stick, but it is a strong upward trend year after year starting in about 2008.

My wife actually worked for a company called VoloMetrix that tried to be the leader in people analytics. They had a product that sat on top of a company’s Outlook and combined with their HR system. It would pull the data on who was meeting with whom, how much time they were spending in meetings, and who was emailing whom. Even how often people were cc’d vs being in the ‘to’ line. The idea wasn’t to be Big Brother watching individual workers (there are far easier and more accurate ways to find out if someone is slacking off). Instead they wanted to use it to find trends in the data to help improve company management and structure. Some use cases they pitched included:

  • Look at communication patterns before big B2B sales. See if there is a difference between the successes and failures (maybe the extent of communication between product and sales? How much communication with the client?)
  • Look at communication patterns before big product launches to see differences between successes and failures
  • Look at the communication patterns of companies during a merger process to measure quantitatively the extent of the integration

The coolest anecdote I heard was when they showed a giant graph of all the connections in the organization to the CEO. He pointed to one cluster of people that did not seem to have any communication with the rest of the company. He asked, “Who are those people?”

“That’s your IT team in India”

“We don’t have an IT team in India”

“Yes you do.”

It turned out that the company had added a handful of Indian IT employees during one of their acquisitions but then promptly forgot about them. They were still on payroll, but they were in their own little world with almost no contact with the rest of the company. Who knows what they were working on (if anything).
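For the curious, here is a rough sketch (in Python, using networkx) of how that kind of “find the disconnected cluster” analysis might work. The sample data, field names, and the idea of flagging everything outside the largest connected component are my own illustration of the general technique, not how VoloMetrix’s actual product was built.

```python
# Illustrative sketch: build a communication graph from email/meeting
# metadata and flag groups with no ties to the rest of the organization.
# All data below is made up for the example.
from collections import Counter
import networkx as nx

# (sender, recipient) pairs pulled from email or calendar metadata (hypothetical)
messages = [
    ("alice@co.com", "bob@co.com"),
    ("bob@co.com", "carol@co.com"),
    ("carol@co.com", "alice@co.com"),
    ("dev1@co.in", "dev2@co.in"),   # the "forgotten" cluster from the anecdote
    ("dev2@co.in", "dev1@co.in"),
]

# Count how often each pair communicates (direction ignored)
pair_counts = Counter(tuple(sorted(pair)) for pair in messages)

# Build an undirected, weighted graph of who talks to whom
G = nx.Graph()
for (a, b), count in pair_counts.items():
    G.add_edge(a, b, weight=count)

# Groups with no edges to the largest component are the
# "who are those people?" clusters
components = sorted(nx.connected_components(G), key=len, reverse=True)
for group in components[1:]:
    print("Isolated group:", sorted(group))
```

In practice the interesting signal is usually subtler than a fully disconnected component – a team whose only ties to the rest of the company run through a single person, say – but the basic graph-of-communications approach is the same.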

 

The product was very cool. (My wife joined as Director of Product partly because of the amazing potential it had.) The problem was that no one was buying it. “Cool” isn’t enough.

Actually, cool often IS enough to sell something:

“We have a paid search solution that runs on big data that will lower your costs 20%”

“We have a personalization engine that runs on big data that will increase your conversion 20%”

“We have a social media monitoring tool that will drive your engagement up 20%”

(It’s always 20%. I think that’s because it’s a big enough number to be very interesting, but a small enough number that managers believe it. If the salesman said he could halve their cost of SEM or double their conversion, too many questions would be asked. But 20% seems reasonable – and still impressive.)

 

People analytics was new enough, and unusual enough, that people didn’t know what to use it for. There was no +20% to promise. They had to create a whole new category. And that’s really hard. Especially when you don’t know what you are going to do with the data.

 

Which is the problem with a lot of these new data-driven solutions. Many provide interesting insights, but the real challenge, as most on-the-ground managers know, is applying those insights. The bottleneck in most companies isn’t insights, it’s the ability to execute on the right insights.

 

Adam says, “I think we are leaving the age of experience and moving into the age of evidence. One of my big goals professionally is to get more leaders to stop acting on intuition and experience — and instead be data-driven.”

I agree. But the challenge is that most people’s intuition is that they should go and collect more data.

 

Some of Adam’s interesting conclusions from the article:

  • “His best-known study examined how much performance improved when workers in a call center — widely thought of as a tedious, thankless job — met the students who benefit from their sales pitches.”
  • “He found that they [Goldman Sachs associates] wanted to spend less time on rote tasks like making “pitchbooks” and to gain more exposure to clients as well as have more time to learn tradecraft, such as how mergers and acquisitions get done.”
  • About a JetBlue program where employees can give cash rewards to other employees: “Employee ‘engagement’ scores not only improve for the people who get the rewards but also for those who give them”

Let’s tackle the first one:

Call Center employees meeting students

Call center employees who met with the people who benefited from their pitch improved their performance. That sounds great. Why isn’t this being done in every call center on the planet?

Well, sometimes it is. At A Place For Mom we used to have all new employees – including call center staff – visit nearby properties so they could see what the “product” actually was (most call center staff were very young, so they otherwise might never have seen an assisted living community). Many companies have call center employees do some sort of immersion to understand their product and customers. But many don’t. Why?

First: It takes time. That is time that can’t be used for other opportunities to improve call center performance. Maybe you could use that time to teach selling skills. Maybe you need that time to teach phone etiquette (really). My wife spent time helping a call center in India; she took them through a training on how to identify American sarcasm.

Second: Call center employees in America tend to turn over at a high rate, so if you invest in their skills, that ROI had better happen fast. If it doesn’t, you are better off living with the lower performance.
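To make that concrete, here is a back-of-the-envelope payback check. Every number in it is made up for illustration; the point is the structure of the decision, not the values.

```python
# Back-of-the-envelope payback check for a call center training investment.
# All numbers are hypothetical placeholders.
training_cost_per_agent = 800.0    # e.g., a day out visiting customers, plus travel
extra_margin_per_month = 150.0     # assumed lift in contribution per agent per month
expected_tenure_months = 9         # US call center turnover tends to be high

payback_months = training_cost_per_agent / extra_margin_per_month
net_value = extra_margin_per_month * expected_tenure_months - training_cost_per_agent

print(f"Payback in {payback_months:.1f} months")
print(f"Net value over expected tenure: ${net_value:,.0f}")
if payback_months > expected_tenure_months:
    print("The average agent leaves before the investment pays off.")
```

With those made-up numbers the program pays back in about five months and clears roughly $550 per agent over an expected tenure, but shorten the tenure or shrink the lift and it quickly goes negative. That is the calculation most call centers are implicitly making when they skip this kind of program.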

The issue is that the cost of having employees meet customers, and the size of the improvement, will vary dramatically from industry to industry and company to company. Sometimes it will make sense, and sometimes it won’t. Which means that if you decide to do it at your company, you are first going to have to run a test (just like the test Adam did). And running a test takes managerial (and sometimes executive) time and effort, so now you need to add in that opportunity cost as well. You can also only run so many tests at once. Which do you want to run first: meeting customers, or a product knowledge session, or a new sales training session, or a lean operations initiative, or… The chances are the exciting and interesting one (getting your call center employees out of the call center and visiting customers) is going to be less effective than the tried and true ones that have been proven for decades. If, after you have mastered the tried and true stuff, you need to find a new idea to try, this seems like a good one. But most companies have trouble picking up the phone when you call and getting back to you immediately, let alone trying to be cutting edge.

 

This is the problem in general with cutting edge management science. It is super interesting to read and it is inspiring for managers and executives. But it is also distracting from the boring stuff that we know matters. It’s why I say that the quest for excellence often gets you to a worse place because it distracts you from being good.

 

But don’t just take my word for it. My book has stories like this, but it mixes those stories with data. Like Adam, I too am very data-driven. That natural desire for data tends to get people like us excited by cool new analytical insights. So we need to be especially careful not to be distracted by “the fancy”.

You can read the first chapter of my book, Good Enough: Why Good is Better than Excellent, by submitting your email on the right-hand side of the page.
