Your RFP is Useless

with Paul Leonardi, Duca Family Professor of Technology Management, UCSB


Paul Leonardi is an expert in digital transformation and organizational change and the director of UC Santa Barbara's Organization Studies Ph.D. program. He is the author of four books on technological innovation and consults for major tech companies like Google and Microsoft.

Satyen Sangani, Co-founder & CEO of Alation


As the Co-founder and CEO of Alation, Satyen lives his passion of empowering a curious and rational world by fundamentally improving the way data consumers, creators, and stewards find, understand, and trust data. Industry insiders call him a visionary entrepreneur. Those who meet him call him warm and down-to-earth. His kids call him “Dad.”

Producer: (00:02)
Hello and welcome to Data Radicals. In today's episode, Satyen sits down with fellow data radical Paul Leonardi. As an expert in digital transformation, Paul trains the biggest companies in the world to improve their data culture. Today, he'll help you figure out which tools to buy, how to analyze and use data productively, and how to completely overhaul your data culture.

Producer: (00:26)
This podcast is brought to you by Alation. We bring relief to a world of garbage in, garbage out, with enterprise data solutions that deliver intelligence in, intelligence out. Learn how we fuel success in self-service analytics, data governance, and cloud data migration at alation.com. That's A-L-A-T-I-O-N dot com.

Satyen Sangani: (00:50)
Today, I'm excited to have on the podcast Paul Leonardi. Paul is an expert in digital transformation and organizational change. He's also the author of four books on technology innovation, including the award-winning Car Crashes Without Cars. He's also a consultant for Google, Microsoft, Cisco and YouTube, a popular keynote speaker, and today, he's the Director of UC Santa Barbara's PhD program in Organizational Studies.

Satyen Sangani: (01:15)
His new book, The Digital Mindset, is available now, and we're excited to talk about that and many other topics with him today. Paul, welcome to Data Radicals.

Paul Leonardi: (01:24)
Thank you so much, Satyen. I'm excited to be here.


The 80% rule in software users

Satyen Sangani: (01:28)
So maybe let's start a little bit from the beginning. Tell us a little bit about who you are, and I guess much more critically your background in organizational behavior. Because that in some ways is a lot of what we talk about here on Data Radicals: people's behavior, how they behave with data. And so tell us about that science and how you got into it.

Paul Leonardi: (01:47)
I guess my interest really started when I was an undergrad and I was doing an internship at a search company. Search was really popular, as you know, in the late '90s. And this particular search company was an early competitor to Google, and I was fascinated by the ways that they thought about how users would envision using their search technology.

I would sit in marketing meetings, and we would construct user personas and try to identify who the key users were on the B2B side and the B2C side. And it always struck me that we had lots of projections about who these people were going to be, but no real in-depth insight into what their motivations would be: why they would wanna be searching, and why searching in particular ways would be most useful for them.

Paul Leonardi: (02:43)
I was sort of struck by the fact that we created a lot of strategic plans based on very little insight about actual user behavior, and that got me to really want to understand: How do organizations think about who their users are, or who their customers are, and how do we develop better plans and strategies for engaging those users or customers?

And I really started to dedicate my research interests, my teaching, and my early consulting to trying to unravel some of these questions and really understand: How is it that we can design and build and implement technologies that are doing interesting new things with data, in ways that get people to use them and change their work in productive ways?

Satyen Sangani: (03:28)
Yeah. Which is really fun and interesting, because you have a lot in common with one of our other guests, Seth Stephens-Davidowitz, who wrote a book basically about how people use Google as a technology, and it wasn't quite all of the idealistic things that people would have originally envisioned, or maybe entirely envisioned, when they were developing the tech.

Satyen Sangani: (03:49)
So, you have thought about this idea of how people use and apply technology, and how they might actually ultimately behave, which is obviously not always what the designers of the tech itself expected.

Paul Leonardi: (04:04)
In fact, most often unexpected by the designers, I would say.

Satyen Sangani: (04:09)
Oh really? So do you do work to sort of compare what intention might be in terms of usage with actual outcomes? And is it generally that people use it differently than people would have expected?

Paul Leonardi: (04:19)
I would say 80% of the time. I have spent a lot of time over the last couple of decades working with companies that have ideas about who their users are gonna be and how these tools are gonna change the way that people work, and with leaders within companies that are implementing these tools, and they've got ideas, too, about the kinds of changes that are gonna happen.

Paul Leonardi: (04:39)
And really, I think about 80% of the time, the most impactful and important changes tend to be ones that people didn't foresee in advance. And I've come to recognize that I'm not sure we can always foresee the sorts of changes that are gonna take place. Because it's really difficult to predict – especially when we're working with data-intensive tools – the kinds of ways that people are gonna use those data, the surprises that are going to unfold, and how they'll change their work because they have access to new information.

Paul Leonardi: (05:10)
And our best course of action, I think in many cases, is to just prepare for lots of change that we can't anticipate and be flexible in our responses to it.

Satyen Sangani: (05:19)
Yeah, that's so interesting. Do you have some examples, just to make it more concrete for the listeners around what would be some examples of some great technologies that might have been used differently from what their designers might have expected?

Paul Leonardi: (05:32)
Yeah, absolutely. This example comes from a major automobile maker that I worked with, and they were really interested in building a tool that was gonna help streamline vehicle testing in the product development process. And like with most products, we build a prototype and we test to make sure that it meets the parameters and the performance objectives that we've set out in the initial program...

…And of course, automakers really wanna digitize as much of that testing process as they can because it's so expensive to build one-off models of the vehicle and crash them into walls or put them on a shake table or do noise and vibration testing. So the organization that I worked with was really interested in trying to develop some digital modeling capability to enhance this process and really speed up product development.

They had this vision in mind that part of the problem was that most of the engineers that were working with these tools were not using the modeling requirements of the organization in uniform ways, and so this tool would really seek to allow them to standardize what they were doing.

Paul Leonardi: (06:40)
That was the idea that developers had, and so they built a system with a set of features that would help to standardize the work. That was the idea that managers and leaders of this organization had, and when they rolled out the tool, they talked about how important it's gonna be to standardize the way people work and that the goal of that standardization was gonna be speed: “We would develop faster models and we would reach product development decisions much quicker.”

But most of the engineers who started using this tool looked at it and said, "I hear that I'm supposed to work faster when I use this, but it actually slows my work down and the way I was doing things before seemed to be much more efficient. And I keep hearing from senior leadership that the most important thing is that we move through these iterations and design much faster. So I'm not gonna use this particular tool for that purpose."

Paul Leonardi: (07:26)
"Instead what I'll do is I'll use the tool to sort of debug my models, because it's really good at allowing me to submit my model to the solver, and then it spits back if there's any problems with my model." So they didn't actually use it to standardize anything, but they were able to use it to figure out how to debug their models much faster.

That actually proved to be a pretty important improvement in their product development, but it was not one that anybody could foresee, and initially, many managers tried to clamp down on them using it for that purpose, because they're like, "We want you to be standardizing your work, not using it to test things out when you submit to the solver."

And it took a fair amount of recognition that this was an emergent property of using the tool that had real advantages for the product development process, and for those changes to sort of creep into the organization and make positive change.

Satyen Sangani: (08:18)
Which is a really interesting use case, because it's such a specific technology that you would think there is no chance these people could build a tool that would be used in any way other than what the designers had intended. And yet, here we are using a tool meant to drive commonality for debugging instead, which I guess in the broadest definition, you could say maybe debugging is a form of driving commonality, but...

Paul Leonardi: (08:42)
Perhaps.

Satyen Sangani: (08:44)
A reach, right? And to say that this is what happens 80% of the time, I think might be really surprising to — it's almost like the entire script got flipped on everybody involved, like the people who are buying technology, the people who are designing the technology, the people that are using the technology.

Satyen Sangani: (09:05)
Because the people who are using it are like, "Well, of course, the way that I use it is exactly how everybody should be using it," and the people who are designing it are like, "Well, of course, this is what I intended and therefore that's exactly what people are doing." And then the people who are buying it are like, "Well, I have this problem, so that's what we intended to solve."

Paul Leonardi: (09:20)
Right. [chuckle]

Satyen Sangani: (09:22)
And so it almost seems like everybody's missing the joke. How much awareness is there amongst any of these populations that the world isn't quite how they conceived it to be?

Paul Leonardi: (09:33)
Well, not a lot of awareness. And let's kind of break down the chain of events here, because I think it'll help explain why you end up with this 80%. At first brush, you have the fact that design practices are often decoupled from users and what users are doing. And certainly the big push in lean start-up methodologies and market validation in the last 20 years, especially in SaaS-based companies, has been around, "We need to recapture an understanding of what users do, what they need, what they want."

And it's pushed market validation out to the edges of the organization to try to recapture some of that knowledge and insight. But still, when you're developing a tool that's disconnected from the users who are going to be potentially using it, or you're developing a tool for users who have never had experience with this, it's impossible to predict with 100% certainty how they're going to incorporate it into their work. So that's where I think one of the first disconnects happens.

The second is that you have a marketing team at those developers that's selling tools to user organizations, and even though — and I've seen this happen time and time again — even though the product designers and product developers may have a fairly robust understanding of what the use cases would be, that doesn't always map on to how marketers decide they're gonna sell these tools to recipient organizations.

And the people that buy the tools at those recipient organizations aren't the people that are gonna be using them either. They're either kind of high-level executives that think that this is gonna be useful for changing work in a positive direction, and they're sold by the rhetoric that's coming from those marketers at the product development organization, the technology company.

Or often it's IT that's bringing in these kinds of tools, and IT sees the value of putting this in the ecosystem, but doesn't necessarily understand exactly how users are making sense of it either. Right?

Then on top of that, next link in the chain, you have a set of, let's call them mid-level managers who are socializing these tools into the organization.

And they're making lots of pronouncements about how the tool is gonna be useful, for what kinds of work it should be applied, what kinds of outcomes they imagine they're going to see from it, and those are starting to shape the way that the users themselves interpret and make sense out of that tool.

And then I would say the final kind of link in the process is that there are always, always important influencers in the informal social network within the organization that are early adopters, that form opinions about how they think that the tool actually works on the ground, and they're perhaps the most persuasive persuaders for everyone else in the organization about how to use it and if you're gonna use it at all.

Paul Leonardi: (12:23)
Because they're actually involved in that work, they are respected and they can tell others, "You know, that tool: really crappy for actually standardizing how you set up a model, but really good for figuring out how to debug it." And that's what people listen to.

So if you think about all those steps in the chain that we talked about along the way here, there are many, many places where the intended plan can get morphed, and I think that obscures, all the way back to the product developers, how differently their technologies end up being used in practice.

Satyen Sangani: (12:58)
Which is super interesting and, I guess, so clearly then underscores why new software companies are built under this kind of product-led growth moniker, where so much of the software is developed in response to what people are actually doing inside of it, as opposed to the theories of what some enterprise budget-holder might have or what some really great salesperson might be able to convey.

Paul Leonardi: (13:28)
But like I said, even if they do that, sometimes there's still that disconnect, that they're not the ones that are selling the tool, and so the message gets morphed and transformed somewhere along the way, and then you end up with these disconnects at the ground level.

Satyen Sangani: (13:40)
Yeah. Which explains so much of why there are these massive enterprise go-to-market organizations inside of these companies and why it's so...

Paul Leonardi: (13:49)
Huge.

Satyen Sangani: (13:50)
Why these IT organizations exist and why there's so much complexity. It's a really interesting problem and it's a really fun problem and there's a feedback loop within all of it. Right? I think what I find fascinating is I was listening to a CEO, in fact talking to one the other day who's like, "Look, I don't like... " Very popular category, very popular tool.

Satyen Sangani: (14:08)
In this case, he was talking about I think customer success, and he was talking about one of the major tools in the world of customer success and he said, "I actually don't like the fact that that team uses this particular tool. And the reason why I don't like it is because it actually, it makes that tool the center of information or gravity for all this information that I'd like to have distributed throughout my organization."

Satyen Sangani: (14:29)
"And so even though it's supposed to capture knowledge, it actually captures knowledge but it silos it in a way that means that a lot of people can't access it. And so while one team has been treated really well, a whole bunch of other folks are now really insulated from the knowledge that I want them to have."

Satyen Sangani: (14:45)
And so there's all of these interesting cause and effects that are not predictable, and that's I think such an interesting thing about implementing technology. If you're somebody who's responsible for administering or managing technology in an organization, how do you form awareness of this? And what do the best leaders do?


The power of beta testing

Paul Leonardi: (15:04)
What I've seen the best leaders do is they think about how they beta test the frames and the user experiences in the same way that they beta test the technology's functionality. We are really good these days at making sure we don't release things into the wild that don't work.

There are always gonna be bugs, but we're really good at controlled testing before mass deployment. We do this on the product development side, we do this in the organization: we get a small group of users, and they test it out on a series of use cases. The goal with that, typically, is that the technology piece, the technical functionality, works as we expect.

What we don't do very much, and what the best organizations do, is also beta test the use cases, and beta test how we talk about those particular use cases to our users, knowing that those comments, that rhetoric, that discourse, is gonna socialize them into how they see the tool and how they experience it, and shape how they ultimately evaluate whether or not it's good at doing the job they want it to do.

Paul Leonardi: (16:15)
So I've worked with a number of different companies that have set up a team, or a small series of teams, to try out these new tools and really investigate how they are making sense of them, how they are interpreting what these tools are doing, and whether they find the tool better for job A than for job B.

And once they have a sense of how these technologies are being used, made sense of, and deployed in the actual practice of work in the company, then they can develop more of an internal marketing program that sells that tool broadly across the organization. But simply thinking that we can roll it out because technically it works just fine is not dealing with any of these issues that we've just talked about.

Satyen Sangani: (17:04)
Yeah, it's so interesting. It reminds me of a story from early in the evolution of my company, Alation, where we made these claims early on that you'd be able to search for data more easily. And that would seem to be factually obvious, because we'd have a search engine that would work on datasets, and so one would think that it would allow you to search for data.

And yet one of our early customers basically said, "Well, we wanna make sure that's actually true, so what we're gonna do is take a user and we're gonna allow them to go search for data outside of Alation, and then we're gonna give them Alation and we're gonna take a stopwatch and we're gonna figure out whether they actually can do it faster."

And thank God we passed the test, but we hadn't actually done that testing because we didn't have their datasets, nor did we know exactly how they'd wanna use it or what their environment was, what their alternative environment was. So it was really hard to recreate that, but it was a great lesson for exactly what you're talking about.


The top 3 skill areas digital leaders need to succeed

Satyen Sangani: (18:04)
So maybe switching gears a little bit: you were in part talking because you've written and released this book, The Digital Mindset: What It Really Takes to Thrive in the Age of Data, Algorithms, and AI. What does it really take to thrive in the age of data, algorithms, and AI? And what is a digital mindset?

Paul Leonardi: (18:21)
I think what it really takes, if we get right down to brass tacks, is humility and curiosity. It's easy to think that we know a lot about how technology works and how data works, especially for someone who's worked in a large organization that is powered by information technologies. So I think that humility is the first step.

And then curiosity is really the second, that the tools we use and the data landscape in which we're embedded is constantly changing, and if we're not curious consistently enough to try to understand how it's changing, why, in what directions it's moving, it's really easy to fall behind.

So I don't say those two things exactly in the book in that way, but I really do think that it's about these, these two elements of humility and curiosity, that are ultimately what it takes to thrive.

Now, on top of that, a digital mindset we define as a set of approaches for thinking about the interaction of data and technology with our social and organizational lives.

We argue in the book — and I hope that we're pretty convincing about this — that you need to have approaches to three different areas to really be competent and skillful as a practitioner in the digital world. The first is an approach toward collaboration, that recognizes that the kinds of collaboration that we have with people is increasingly mediated by digital technologies, and there are different strategies and skills you need to be successful in communicating and interacting in that environment, and how we think about collaboration with machines is gonna be part of that approach to collaboration.

The second major approach is what we call computation, and this is the one that scares most people by its title, because they think, "Do I need to be really advanced in non-parametric statistics to understand how my company is operating?" And I think the short answer is no, but what you do need to recognize is that we now operate in a computational infrastructure.

Paul Leonardi: (20:25)
Data powers everything that we do, the decisions that we make, and we have to understand where those data come from and how those data are produced in order to make meaningful decisions. And that mention of data being produced is extremely important, because data don't just exist out there in the wild waiting to be discovered. We create them through the technologies with which we capture the physical world and put it into ones and zeros, through the way that we then classify the data, how we query it, and how we decide that it's important for decision-making.

All of those are products of data production, and we have to understand the undercurrent of how data are being produced in order to use them in productive ways. And then the final approach is one that we call change, and the idea there is that we often have this metaphor, I think, of change as something that punctuates an otherwise fairly stable existence.

And that's just not the way the world works, and it hasn't been for a while, and it certainly isn't going to be in the future. The rapidly evolving nature of our tools and the increasing prevalence and availability of different kinds of data mean that we're in a constant process of change.

And so we call this “transitioning”: we're always transitioning from one moment to the next, and how do we manage an organization and how we manage ourselves in a way that respects the fact that change is a constant? And so we need to think through how we create the culture of our organizations, we need to think through how we're deploying tools, kind of in the manner that you and I already talked about, and how we do things that respect the fact that change is a constant.

Paul Leonardi: (22:10)
So a digital mindset really encompasses how we approach collaboration, computation and change, and that's what we talk about throughout the book.


Why did you write The Digital Mindset?

Satyen Sangani: (22:17)
What inspired you to write it? Was there a singular insight that led you to think, "There's a book here that I need to write"? Or was it a culmination of a set of learnings? Or what was ultimately the catalyst for you to put this down on paper?

Paul Leonardi: (22:33)
We kept hearing the same thing over and over again when we talked to people at all organizational levels and across different industries, and that refrain was like this. They would say, "Paul, Tsedal" — Tsedal is my co-author — "digital transformation. I keep hearing about it, I get we need to do it."

"I generally have a sense of how I might think about what a new business model or operating model could be based on the influx of data that we have available, but I don't really know how to get there, and I don't know what I need to know in order to get there, and I don't know what my employees need to know to get there."

"Do all of my individual contributors now need to be software developers? Do my senior managers need to be data scientists? What level of sophistication do we need and different kinds of skills in the digital economy to be a successful organization?"

Paul Leonardi: (23:26)
So Tsedal and I sat down and thought, "You know, there's no shortage of books that you can buy and keynote speeches that you can go listen to that talk about the importance of digital transformation and the strategy associated with that, but there's almost no content out there about how you create the right skill set and the right mindset to ask the kinds of questions and engage in the sorts of behaviors that might make you successful in a digital business." And that was really the driver for this book.

Satyen Sangani: (23:57)
A lot of the themes that you mentioned strike me, at least as you mentioned humility and curiosity, that harkens back to a lot of the Carol Dweck growth mindset work, that it's sort of like, "Look, I don't know what I don't know, but all I'm gonna do is work really hard and be open to learning and therefore just get to the next level consistently."

And I think this idea that digital transformation isn't like a one... There's not a... It's like digital transformations, like the singular seems to be a little bit of a misnomer, because the idea is like, "Oh, well I'll transform, then I'm all done."

Paul Leonardi: (24:33)
Yeah, agreed. That's a great point.

Satyen Sangani: (24:34)
It almost seems like that's actually just a continuous process and you're on that journey and you just need to go start the journey. How do you then therefore know where to start? So let's assume that I'm a luddite, don't have any of this understanding. How do you think about even just taking the first step? That's gonna be the hardest one on some level.

Paul Leonardi: (24:56)
I would say grab the book and start reading. That's the best first step. I sort of mean that jokingly. But also that's the way we wrote the book, was what is the thing that we could hand to somebody and say, "You wanna kinda know the basics of what an algorithm does? You wanna know the basics of why people use different kinds of programming languages? You wanna know how best to interact and to sort of give commands to a machine that's acting as your teammate? We can tell you what those things are in this book."

So if you've mapped out, let's say, a topography of different kinds of topics, I think that the ones that we cover are the places that you need to go to get started. You need to understand what's happening inside the machines that are processing the data that we use.

You need to be able to ask the right questions about how those data are collected and produced and stored. You need to be able to question when someone tells you there's a high likelihood that if we do X, we'll get Y outcome. You need to be able to really understand how security concerns are made manifest in our organizations today based on the way our technologies are developed, and our products are used.

These are the kinds of places you need to go — and of course, given all the thrust toward remote work, you need to really understand how to think about collaborating, and what tools you need to be an effective collaborator with people and with objects that you're not co-located with.

If you can understand those things as a start, then you're ready to be able to build on top of that more specialized learning in areas that are more important to you. So if you're constantly working with a team of data scientists that are giving you prediction after prediction, you can bone up on your skills around statistical reasoning to be able to really...

Paul Leonardi: (26:55)
Like if they tell you, "You know the P value on this is like .005," you can say, "Well, that's great, but what's the confidence interval? Because that matters." So you'll be able to develop the kinds of skills on top of the basic foundation that are more pertinent to the various areas that you might work in.
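
To make Paul's point concrete: here is a minimal sketch in Python (hypothetical numbers; SciPy and NumPy are assumed) of why a tiny p-value alone can mislead without the confidence interval around the effect:

```python
import numpy as np
from scipy import stats

# Hypothetical A/B data: task completion times (minutes) for two groups
rng = np.random.default_rng(42)
control = rng.normal(loc=10.0, scale=3.0, size=5000)
treatment = rng.normal(loc=9.8, scale=3.0, size=5000)

# With large samples, even a tiny difference comes out "significant"
t_stat, p_value = stats.ttest_ind(control, treatment)

# 95% confidence interval for the difference in means
diff = control.mean() - treatment.mean()
se = np.sqrt(control.var(ddof=1) / len(control)
             + treatment.var(ddof=1) / len(treatment))
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

print(f"p = {p_value:.4f}")
print(f"difference in means: {diff:.2f} min, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
# A p-value near .005 can coexist with an effect of a few seconds:
# statistically significant, practically negligible.
```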

Satyen Sangani: (27:15)
Right. So the book then is a primer, and it is a primer for people who wanna take the first step and want to be more digitally literate as it were, but don't quite know where to start, and that's obviously super helpful. You have a concept of the book that you refer to as the "30% rule." Tell us more about that and how that factors into the equation?


The 30% rule (or how to survive in a digital landscape)

Paul Leonardi: (27:42)
Sure. I'll start by saying that I think that our 30% rule is more anecdotal than it is scientific, and the reason that we arrived at this concept is that when we would talk to folks at lots of different companies and say, "Don't freak out. It's actually not that hard to develop the skills you need to be successfully literate in the digital economy," the question that we always got was, "Okay, well, like how much do I need to know?"

And it's funny: if you asked somebody who, let's say, was an executive at a software company that came up through a non-technical track, didn't have a technical background, "How much do you need to know?" they would often say, "Well, you absolutely need to know how to code. You just need to know that, or you're done for."

And then we would ask an executive at a software company who came up through a technical track, "Well, to be a successful senior leader at your company, do you need to know how to code?" And they'd say, "Oh, absolutely not." [chuckle] So I think we kind of take for granted, on both sides, based on our backgrounds, what things are important and what things are not.

What we came to after interviewing thousands of people about this topic was that you need enough knowledge in a handful of different areas to be an effective interlocutor. So you need to be able to talk and understand, but you don't need to be a master or an expert in most of these domains to be successful.

So the analogy that we like to use in talking about "Well, how much do you need?" is the analogy of English language learning, if you're not a native English speaker. In the workplace, somebody with almost native-level fluency has a vocabulary of about 12,000 to 12,500 English words. There have been a lot of linguistic studies that sort of demonstrate this is where you get to native-level fluency.

But if you want to be an efficient practitioner of English in the workplace, meaning that you can get by just fine, understand what people are saying, maybe you have a few more, "I'm not sure what that word is" moments, you need about 3,000 to 3,500 words, so about 30% of the total. Somewhere in that area.

And that analogy, I think, works really well for the digital mindset: if you can develop enough understanding about statistics, about data science, about where data come from, about programming, and about how algorithms work to recognize the concepts, ask intelligent questions, interpret the answers that you're getting, and think critically about them, then you've reached the threshold where you have that basis to build more specific learning on top of, in areas that are important to you.

Paul Leonardi: (30:27)
So that's what we really call the "30% rule": if you wanna be comfortable in a variety of topics in the digital environment, you need to get to that 30% threshold. And that's ultimately what we try to do in the book: to get people there, to tell them, "This is what you need to know about these topics, so you can start, you can participate, and you can maintain those conversations, and then build on top of that any more mastery that you want in a certain lexical area."

Satyen Sangani: (30:52)
But the book is intended to get you to that first 30% level?

Paul Leonardi: (30:56)
That's right.

Satyen Sangani: (31:00)
We had another podcast guest early on, David Epstein, who wrote this book called Range. And a lot of what you articulated harkens back to what he talked about: this idea of generalists being a superpower. In a world where you'd think technology requires a lot of specialization, you're sort of saying, "No, you actually don't. What you need is the ability to take concepts, string them together, and build relationships."

I think that's a really important insight to constantly stress, because often the perfect is the enemy of the good, and you find that people are like, "Oh, well, I can't become a data scientist." And I think an observation that says, "Well, you don't really have to," is really helpful.

Now, this is a podcast about data culture, and so we talk a lot about how do you build data culture, what does it mean to have a data culture. I guess maybe a question for you is that... Is "data culture" a familiar term or one that you thought a lot about? And I guess how do you think about that term and its relationship to this concept of a digital mindset?

Paul Leonardi: (32:10)
Yeah, it's a wonderful question. A big thrust of my research has been, how do we take advantage of all of the data sources and signals that are constantly being produced in our organizations as a result of using digital tools, to try to arrive at better decision-making?

Many of our organizations today are not used to thinking about data as a by-product of actions that we take on the suite of tools that we use to do our work. And a data culture for me is one that recognizes that almost everything we do produces data in some way, shape or form.

We can use those data maybe to our advantage, but at least we can use those data to explore: can they tell us things that we might not know or otherwise have access to, if we are perceptive and thinking about where those data exist and how we might use them? So, a concrete example. When you're using one of these kinds of tools that we discuss, an enterprise social networking tool or a project management tool, it's throwing off a bunch of data.

Paul Leonardi: (33:22)
A key question that a lot of leaders in organizations that I work with have is, "We're trying to develop a team of individuals. We're not quite sure who are the right people to put on that team, given the objective that we want that team to achieve, and we wanna make sure that that team is healthy and they're doing the right things and the right people are being included."

"So can the data — the data byproducts, the digital exhaust of these tools — be useful in helping us overcome some of the problems of not knowing who to exactly put on a team?" So as one research project that we did, we built a dashboard that ingested data from all kinds of digital technologies that were being used across the organization.

And what we were able to do with those data was to construct patterns of informal social networks that existed across the company. It's cumbersome to collect, it's cumbersome to analyze, but those networks are already existing as a byproduct of our digital tool use within organizations.

It's very easy for me to construct a network of how many times you and I talk, perhaps even about what topics or with what frequency, and to then use those to run some analyses that may be predictive of different kinds of outcomes that we care about. Most organizations have a very difficult time first understanding everything that I just said: "You could do that? How do you get those kinds of data? How are those data predictive? Where might I use them? Where would I even get them? And even if you could do that, what about privacy?"

Not that these are bad questions, but these questions sort of blow people's minds open a bit and make it very difficult for them to try to say, "What could we learn from the data that exists around us and how might we incorporate that into effective decision-making? "

That's what a good data culture does, in my opinion: it recognizes and is open to the possibility that so much data exists around us that can tell us things that we really wanna know. It can't make decisions for us, nor should it, but at least we can use those data to arrive at certain predictions that might inform how we as managers and leaders operate in organizations.

Paul Leonardi: (35:46)
And to the case in point, how maybe we might assemble a team that's very different than what we might assemble if we just ask someone, "Who do you think should be the five people that might be best on this team?"
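
What Paul describes here can be sketched in a few lines. The following is a minimal, hypothetical example in Python using the networkx library; the message metadata is invented, and a real system would ingest logs from chat, email, and project tools, but the mechanic of turning digital exhaust into an informal network is the same:

```python
import networkx as nx
from collections import Counter

# Hypothetical metadata thrown off by collaboration tools: (sender, recipient)
messages = [
    ("ana", "ben"), ("ana", "ben"), ("ben", "cruz"), ("cruz", "ana"),
    ("dev", "ben"), ("dev", "cruz"), ("ana", "dev"), ("erin", "dev"),
]

# Count how often each pair communicates, ignoring direction
pair_counts = Counter(tuple(sorted(pair)) for pair in messages)

G = nx.Graph()
for (u, v), count in pair_counts.items():
    G.add_edge(u, v, weight=count)

# Betweenness centrality flags informal brokers who bridge groups,
# one input to the team-assembly questions described above
for person, score in sorted(
    nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1]
):
    print(f"{person}: {score:.3f}")
```

On real data, the hard parts are exactly the ones Paul names: the collection is cumbersome, and the privacy questions are not optional.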


How curiosity gave rise to the modern data catalog

Satyen Sangani: (35:58)
It makes a ton of sense, and I think it gets back to the initial topic, which is, you think you know what's going to happen, but you don't actually know what's gonna happen. This is basically, I won't be able to resist telling one of the founding stories in Alation 'cause I think it's actually so relevant.

So, the Alation Data Catalog: when we were doing early research, we were trying to figure out what to build, and we'd run around and talk to these people who had data. And the idea was that people didn't understand data, so we're like, "Do you have a problem understanding data?"

And by and large people would say to us, "No, we don't really have a problem understanding data." We're like, "But you have to, because there's all these big projects and people spend millions of dollars on understanding data, how can you not understand data?" And these analysts who we would talk to would say, "Yeah, well, we really kind of understand data, dude, but what we really don't know how to do is write queries, and that's really hard to go do."

And we'd say, "Well, what do you mean by this?" And they’d say, "Well, you know, we have to write all these SQL queries into these big databases, and we don't know how to do that." And so then one of my co-founders came and said, "What if we wrote a query tool to help you write these queries?"

And then one of my other co-founders said, "What if we used the queries that everybody else wrote in the past to help future people write these queries? Isn't that using that metadata you're referring to, to write some of these future queries?"

What was interesting about that was, every database on some level has this structure: it has tables, and people write to these tables to populate data into them. But what's interesting, of course, is that the actual queries that are written against the database basically read from these tables and write other tables out.

Satyen Sangani: (37:41)
What we realized and discovered was that some tables were never used, and other tables were used really, really often, and this query log actually showed what people were doing with the data. And I think what it led us to do over the course of time was do exactly what you said, which was use data around how people are using data to inform how people ought to use data moving forward.

Paul Leonardi: (38:03)
I love the way you put that. [chuckle]
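
For readers who want to see the mechanic behind Satyen's story, here is a minimal, hypothetical sketch in Python. A production catalog would parse SQL properly rather than use a regex, but mining a query log for table usage looks roughly like this:

```python
import re
from collections import Counter

# Hypothetical query log; in practice this comes from the database's own logs
query_log = [
    "SELECT user_id, SUM(amount) FROM payments GROUP BY user_id",
    "SELECT p.user_id FROM payments p JOIN accounts a ON a.id = p.user_id",
    "SELECT * FROM accounts WHERE created_at > '2022-01-01'",
    "INSERT INTO daily_revenue SELECT dt, SUM(amount) FROM payments GROUP BY dt",
]

# Crude heuristic: tables read appear after FROM/JOIN, tables written after INTO
read_pattern = re.compile(r"\b(?:FROM|JOIN)\s+([A-Za-z_]\w*)", re.IGNORECASE)
write_pattern = re.compile(r"\bINTO\s+([A-Za-z_]\w*)", re.IGNORECASE)

reads, writes = Counter(), Counter()
for query in query_log:
    reads.update(read_pattern.findall(query))
    writes.update(write_pattern.findall(query))

print("most-read tables:", reads.most_common())  # the de facto popular tables
print("written tables:", writes.most_common())   # derived outputs
# Tables that never appear in the log are candidates for deprecation.
```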

Satyen Sangani: (38:06)
And it very much was a collaboration tool from the start. But I think this idea was really foreign, because there is this concept, certainly within the world of data, that we ought to know what people are gonna ask, and we ought to know what structures we need to create.

And the reality of it is, most questions are unpredictable, because if they were predictable then we'd already know the answers to them on some level. So it's a really interesting dichotomy.

Paul Leonardi: (38:33)
Yeah.

Satyen Sangani: (38:33)
Now, you mentioned this concept of the digital mindset, and now I've got this concept of a data culture, which means that data informs my activity. How does that relate... Just give us the hard relationship between that and the digital mindset, because I think driving that home for people would be pretty useful.

Paul Leonardi: (38:52)
Sure, yeah. Well, I think part of having a digital mindset is this recognition that data can be really useful for the decisions that we want to make. That's one piece of it. Right? But then the second piece would be, how and under what conditions? And what are the kinds of questions that you would need to ask so that you could arrive at this understanding of the reasons for and the occasions under which I might wanna use those data?

I mentioned this I think a couple of times, but I don't think that questions around data are all that amazing, foreign, sexy to think about. They're just really fundamental and important and basic, and we don't often think to ask them. There's a whole lot of interactions that are captured in a data point. You need to know that kind of thing in order to start making a good decision and to recognize in what kinds of decision domains these data would be useful.

Paul Leonardi: (39:46)
So again, I don't think many of these questions around how to understand data that constitute a digital mindset are particularly mind-blowing or exciting, but they're so fundamental for whether or not we can actually use data to drive decision-making and help us achieve the organizational goals that we want.

Satyen Sangani: (40:05)
Yeah. In grad school, we talked a lot about econometrics, about the data generating process, and I just remember this professor that I had, a statistics and economics professor, who'd basically talk all about this process as if it were this enviable thing.

Satyen Sangani: (40:25)
But obviously, that's the real world, and the real world is really complicated. Realizing that the technology that you use to capture information about the real world might distort it, might change it, might be inadequate because it doesn't capture all the information, or captures a lot of only some information, and that this obviously shapes how you interpret the data, is really thoughtful and useful.

Paul Leonardi: (40:54)
One of the examples that we give in the book, and I just love this example, comes from basketball, USA basketball. And for those who are rabid consumers of basketball and basketball statistics, you know that most players are... Player behavior is captured with a really, really old technology called the box score, and ultimately those statistics are generated for lots of players, and they have come to be a reflection of the value that a player brings to the sport.

This is less true in the NBA, but certainly it's true in high school basketball, let's say in the United States. You kind of rank and evaluate players based on how well they score and all these metrics that are captured in this box score. Well, there are a whole lot of other things that people do on the basketball court that make them good players that just aren't captured in that box score, and we do this all the time, in many areas.

We have specific data capture and encoding mechanisms that highlight certain things that we care about as a culture and devalue other things by not incorporating them into our evaluation rubric. We have to understand what kinds of data we're missing out on and not capturing if we want a more complete picture of how things in our organization work, how people...

Paul Leonardi: (42:20)
If we wanna have a better understanding of how things in our organizations work, or how people end up being productive or not productive, then unless we know what data we're collecting and what data we're not collecting, it's really hard to have a good handle on some of those issues.

Satyen Sangani: (42:37)
Yeah, the modern-day player that came to mind when you were speaking was Draymond Green of the Golden State Warriors. And of course, you have Draymond Greens inside of companies, these glue players that may not be the measurable sellers, or, in the case of a software company, the measurable people who are developing code, but gosh, you know that they are critical to the operations of the company.

And in some ways, this lack of measurability often extends to the technology itself. You're studying an area — collaboration tools and knowledge management tools — where it's really hard to quantify the return in value to the institution.

Because they obviously have a lot of power, but at the same time, there's all these externalities that exist, and beyond the unintended effects it's hard to even measure the intended effects. How do you think about counseling your clients when they're saying, "Well, I have this tool, I don't know if it's valuable or not. How do I think about value? I really need to address the bottom line"?

Satyen Sangani: (43:47)
In particular, one of the hot topics today is, we're entering a down economy in theory, where lots of people are looking at which tools are valuable and which tools aren't. How do people assess this and decide whether this technical stuff is useful?

Paul Leonardi: (44:02)
What I counsel decision-makers to keep in mind, especially when it comes to thinking about any sort of digital tools that we might be using, is that there are direct and indirect effects of these tools, of any tool that you're gonna use. And if you wanna target the best investment possible, you wanna make sure that you're taking advantage of both the direct and the indirect effects.

So what do I mean by "direct and indirect effects"? If you go back to a tool like Asana — I'm just gonna use Asana as the example here — the direct effect of Asana would be that you have a more efficient project management capability within your organization.

Because you have visibility into who's working on what assignments, where people are in different phases of the project, whether I need to adjust my timeline. So it's helping to achieve a very functional objective. That's the direct effect of using the technology. But keep in mind the conversation that you and I just had: every kind of tool that we're using within our organization is creating metadata in the form of this digital exhaust.

The indirect effect is: is the data that's being thrown off in pursuit of that primary objective also going to be useful for us? And do we have a capability in our organization to take advantage of those metadata, to do something useful and interesting with them?

So in a time where we're getting increasingly concerned about efficiency and about whether tools are gonna be valuable, my recommendation is that you conduct an audit that really helps you to think about, "The intended and stated purpose of this tool is X. And the direct effect would be this," and that's valuable.

Paul Leonardi: (45:54)
But what would be the metadata these tools are producing? And what would be the indirect effect that we could generate by using some of those metadata? If that's also valuable to us, then you have a situation where you can start to say, "I think this tool would be very useful for our workforce." So that's the one way of approaching that.

Satyen Sangani: (46:11)
Yeah. And for the more skeptical, maybe hard-nosed financial types that are like, "I need to see a hard-dollar ROI," how do you respond? Or do you adjust the track at all? Or how do you convince those skeptics?


How to convince the C-suite skeptics

Paul Leonardi: (46:28)
I've certainly had this conversation a lot over the years, and in the book, actually, I think it's in that second-to-last chapter on transitioning, we lay out a model. It's something that we call the work digitization process. And what that does is it basically says, "Okay, so you implement some kind of new digital capability within your organization."

Ultimately, there's some ROI that you're expecting from that, but the implementation of that tool doesn't lead directly to ROI. It's not like we implement Asana and boom, we save like $300 million a year. So what we do in that work digitization process is we outline like, here are the paths that happen from implementation of the tool to changes in work, to changes in who people interact with, to some local performance improvements that could result, to ultimately some higher-level corporate objective, which would be the ROI that we really care about.

So the objective then of implementing this tool is: does it help us to reach that final product faster? And that's something that you can actually measure. But one of the failures that I often see is that companies are good at thinking about that big-picture objective, that faster time-to-market, and they have no way of assessing whether or not this tool is gonna lead to that. You can never fully assess it; there are just too many variables at play.

But what you can do is pick a much smaller, more intermediate variable that we know is correlated with the big ROI that we care about, and then track that. And I don't think this is particularly hard to do; it just takes some creativity. In fact, I find, and I'm surprised, that many of my consulting jobs actually end up being around precisely this problem, which is helping a company to figure out, "You'll know success if you see this happen."

And it's not something that I bring from the outside, usually. It's understanding how work gets done, what the priorities are, and what relationships the firm already knows about between different intermediate steps and final objectives or outcomes, and then being able to articulate a path: "Here's how this technology is likely to get us there, so this is the thing to measure."

Paul Leonardi: (48:42)
So I think it takes some creativity, but it's really sort of backing away from huge-picture ROI, to more specific intermediate steps.
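
As a minimal sketch of the measurement Paul describes (the metric and numbers here are invented), the test is simply whether tool adoption moves an intermediate variable the firm already believes is correlated with its top-line ROI:

```python
from scipy import stats

# Hypothetical weekly data for one team after a tool rollout
tool_usage_hours = [2, 3, 5, 4, 6, 8, 7, 9, 10, 12]    # adoption signal
review_cycle_days = [9.0, 8.5, 8.2, 8.4, 7.6,
                     7.1, 7.3, 6.8, 6.2, 5.9]           # intermediate metric

r, p = stats.pearsonr(tool_usage_hours, review_cycle_days)
print(f"adoption vs. cycle time: r = {r:.2f} (p = {p:.3f})")
# A strong negative correlation is evidence, not proof, that adoption is
# moving the intermediate metric the larger ROI story depends on.
```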

Satyen Sangani: (48:50)
Yeah, and you said prioritization, and to me that's the big insight: people have a hard time selecting metrics. Because metrics, as we discussed, have this problem of oversimplification, but they have this fundamental benefit of focus.

Paul Leonardi: (49:08)
Correct, and specification.

Satyen Sangani: (49:11)
There's sort of that phrase that, "If you don't know where you're going, take any road. It will get you there." And I think that's a really critical point. And people have these sort of RFI-based views of technology selection: "If I just check all the boxes and all the features, then I'm gonna get the best software."

Satyen Sangani: (49:27)
But I think one of the many basic lessons out of this conversation is, look, you don't really know, and you probably ought to go test, and test relative to this one or two or three metrics that you're trying to optimize for. And that requires both very specific selection and choice, but also that you test relative to whatever that outcome happens to be. That's really insightful.

Paul Leonardi: (49:51)
Yeah, and remember earlier, when you asked the question, "What do good companies do around this problem of not being able to completely anticipate what outcomes you're gonna get from implementing digital tools?" And I said beta testing how the tool is used and how it's incorporated into the work is really important.

Paul Leonardi: (50:11)
And once we can show empirically with data from your own company, from people that are working on the same projects that you're working on, that because they use this tool, they're able to do this? Boy, there's no better selling point for people inside the company than that.

Paul Leonardi: (50:28)
Because it's not some abstract thing, "This could lead to this." It's not at some other company this leads to this. It's, "Here, at my company, with people just like me, I use this, it leads to this. And the good news is we've tested it empirically."

And if you sort of work that out in a small group before you roll it out across the organization, you not only have the benefit of some assurance about the relationship between the tool use and your ROI, but you can also get everybody else on board much more quickly, because they see the immediate benefits.

Satyen Sangani: (51:01)
Makes total sense. So before we break, where is your future work leading you and what are the areas of inquiry that you're pulling on now?

Paul Leonardi: (51:10)
I think that for me, one of the things that really excites me these days is uncertainty. [chuckle] And I know that sounds like pie in the sky, but I think it's relevant to so much of the conversation that you and I have had today — which I've really enjoyed, by the way — that we just don't know what the future is going to bring. And we never have been able to.

But I think one of the differences today is that we seem to be more under the illusion that we could know what the future brings, and that's because our models are much more complex than they've ever been. We have machine learning algorithms that are able to really pull from so many disparate data sources and combine them.

We have tools that are able to generate predictions and incorporate those with very detailed visualizations and lots of data points, so we can make the future seem much more knowable, even though it's not. And from a research standpoint, I find that to be really fascinating.

How do you proceed when the whole world, and the tech sector in particular, imbues in you a false sense of confidence that you can actually know what's going to happen in the future, and still stay open to the possibility that things aren't gonna go the way you planned?

So on a research side, I think that's a really important set of questions. Because when I first started doing the kind of research and consulting work that I do in the late '90s, early 2000s, most key decision makers in companies weren't under any illusion that they could accurately predict what was gonna happen in the future.

Paul Leonardi: (52:54)
They were like, "The best we know is that there are gonna be some macro trends happening and things could change, but we're gonna sort of plot this course and we'll stay flexible." And that rhetoric has really shifted, at least in what I've heard in the last decade: there's so much more perceived certainty. So how do you navigate that? That's one area that excites me.

Satyen Sangani: (53:15)
Super interesting and reminds me of one of our former guests, Margaret Heffernan, who talks a lot about this idea of complexity. And the way she describes it is, in complex environments, small changes make a massive difference in outcomes. And there are many different factors that could be small changes in any given process.

And to me, that's the idea: all this data gives us this false view that we can predict the world, and yet here we are realizing that the data actually just reveals all this complexity.

Paul Leonardi: (53:46)
Right.

Satyen Sangani: (53:48)
So I love the parallels between your work and obviously what we're trying to do, and I think what a lot of the listeners of this podcast are trying to do, so it's just been a great lens on how we all see the world. Thank you for taking the time and thank you for educating us.

Paul Leonardi: (54:04)
My pleasure, my pleasure. And thank you.

Satyen Sangani: (54:12)
As data radicals, we know that data is valuable, we know that technology is valuable. But this general belief doesn't translate into specific value. So as we enter a down economy, most data leaders wanna know, what exactly is our data worth? And what's the ROI of all of these data projects? So that leads me to ask, how do you measure the ROI of all of your digital projects and technology?

To borrow from Paul's wisdom: simplify, simplify. Pick two, three, or four goals that you're trying to impact with your data — that you're trying to impact with your technology — and no more, and then determine if your efforts are actually working. "Gross simplification?" you might say.

Sure, but as Lewis Carroll once said, "If you don't know where you're going, any road will get you there." You may think you know what you need, and approach tech buying as a deterministic process: "I'm gonna list all of my needs and just pick the vendors that check the most boxes."

This couldn't be farther from what actually yields a good result. Hell, even the people making the technology don't know how it's going to be used. At Alation, we started as a search engine, but then we learned how people actually used our software. We went from a search engine to an understanding engine, to a query writing tool, to an intelligent SQL editor, to a data intelligence platform.

This evolution was the product of a lot of feedback, but the goal was always the same: We wanted to get people to engage with and adopt our products and data. This meant that we had to put our tool in the hands of users and learn from them, and without this learning, our products would have died when they needed to evolve.

As Paul shared, 80% of users don't use your products as their creators intended. So whether you're a product creator or a buyer, this means you must observe your users in the wild. You have to try before you buy, because that feedback will reveal one of two things: either you have something useful, or you need to learn and pivot.

Satyen Sangani: (56:15)
Thank you for listening to this episode of Data Radicals. And thank you, Paul, for joining. I'm your host, Satyen Sangani, CEO of Alation. And data radicals, stay the course, keep learning and sharing. Until next time.

Producer: (56:31)
This podcast is brought to you by Alation. Your entire business community uses data, not just your data experts. Learn how to build a village of stakeholders throughout your organization to launch a data governance program, and find out how a data catalog can accelerate adoption.

Producer: (56:49)
Watch the on-demand webinar titled “Data Governance Takes a Village” at alation.com/village.
