Humanizing AI: Authentic Storytelling
with Jepson (Ben) Taylor, Chief AI Strategist, Dataiku
Jepson (Ben) Taylor
Chief AI Strategist, Dataiku
Jepson previously served as chief AI evangelist at DataRobot, which acquired his deep learning startup Zeff.ai. He’s built game-changing AI systems for HireVue and Micron and is a frequent speaker at industry events. Today, he’s focused on advancing innovation and enhancing what’s possible through AI, deep learning – and the human soul.
Satyen Sangani
Co-founder & CEO of Alation
As the Co-founder and CEO of Alation, Satyen lives his passion of empowering a curious and rational world by fundamentally improving the way data consumers, creators, and stewards find, understand, and trust data. Industry insiders call him a visionary entrepreneur. Those who meet him call him warm and down-to-earth. His kids call him “Dad.”
Hello and welcome to Data Radicals. In today's episode, Satyen sits down with Ben Taylor, an AI expert who's recently made waves in tech for his approach to being a data radical.
In this episode, Ben dives deep into vulnerability. He touches on overcoming modes of failure in your org, how to get stakeholders on board, using storytelling to enrich your data, and so much more. So please enjoy this interview between Jepson (Ben) Taylor and your host, Satyen Sangani.
This podcast is brought to you by Alation. The act of finding great data shouldn't be as primitive as hunting and gathering. Alation Data Catalog enables people to find, understand, trust, and use data with confidence. Active data governance puts people first, so they have access to the data they need with in-workflow guidance on how to use it. Learn more about Alation at alation.com.
Satyen Sangani: (01:05)
Ben Taylor is chief AI strategist at Dataiku. He most recently served as chief AI evangelist at DataRobot. Prior to this, he co-founded deep learning startup, Zeff.ai, which was acquired by DataRobot in 2020. He spent years building AI systems for HireVue and Micron, and frequently speaks at industry events and works closely with the larger machine learning and data science communities to advance innovation and enhance what's possible through AI and deep learning in the future. Ben, it's great to have you on the show. Welcome to Data Radicals.
Jepson (Ben) Taylor: (01:36)
Thanks for having me. I'm excited to be here.
Satyen Sangani: (01:39)
So tell us what a chief AI strategist is.
Jepson (Ben) Taylor: (01:42)
That title can sometimes be interchangeable with evangelist. A chief AI strategist has activities in marketing and thought leadership, but also in sales. If I had to simplify where I spend most of my time, probably 80% is focused on marketing and sales and building up executive awareness of Dataiku in the US.
And then I do care a lot about partnerships, and I also have a selfish interest in moonshot projects: think of AI marketing R&D projects that awaken the inner child. That's definitely something I'm passionate about, and it's also part of my role here as well.
Satyen Sangani: (02:23)
So does that mean you're spending, if you're evangelizing AI, more time trying to convince executives that they should be doing more AI, or are you spending more time talking to them about how they're doing the work that they're already doing, or is it both?
Jepson (Ben) Taylor: (02:41)
I lean toward the former. So I spend more time convincing executives that AI has defensible ROI and short-term impact. Because it's interesting: if I go back in time to when I was the chief data scientist at HireVue, I didn't know how to talk to an executive. I actually had some awkward interactions that confused me at the time, and I didn't really know how to react to those conversations. It wasn't until I went and did my own startup and felt the pain of payroll, and dealt with my own customers, my own marketing, my own burn, that I was finally able to appreciate what an executive needs to hear. So I love talking to executives about AI. I think, unfortunately, many of them hear talks that are too technical: lots of jargon, no urgency...
Satyen Sangani: (03:25)
You're getting to the answer of the question that I was about to ask which is, what is it that an executive needs to hear that's different from others, or that's unique, or how does one talk to an executive, and what are the special ingredients of that?
Jepson (Ben) Taylor: (03:39)
Rather than just giving a simple answer, I'll open with a very short story. I was having a meeting with Mark Newman, the CEO and founder of HireVue, and Loren Larsen. It was an hour-long meeting with the CEO, which is embarrassing given what I was talking about, and he interrupts me in the middle and says, "Ben, we know you're smart, that's why we hired you. Please stop reminding us." And it wasn't a compliment. His tone of voice, the way it came across, I knew I was in trouble, I knew I was doing something wrong.
And what I was doing wrong is I did not understand Mark's life. I did not understand the role of a CEO. I did not understand the nightmare that he's constantly dealing with, managing 150, 200 employees, but also managing 7-figure accounts that are at risk going into a board meeting. And so when I talk to executives, I don't mention any jargon, I don't mention how things work, because they actually don't care, they care about outcomes.
Jepson (Ben) Taylor: (04:31)
I was on a panel a couple of days ago, and other panelists were diving into how things work, talking about scaling and “big compute,” but they're not really asking the “why.” So when I talk to execs, the discovery process that I have found that is the most effective is I convince them that they have 3 fundamental problems.
Jepson (Ben) Taylor: (04:49)
So if you're an executive, I'll explain that if I came in and audited your business, we would find that many of your processes are stale. Most of them are stale. They haven't changed in 10, 15, or 20 years. And they'll nod their head in agreement: "Yeah." But you don't wanna just say that; you wanna talk about why. People are busy, but also the consequence of change is significant, and you need to respect that. You can't change a running process on a Friday without serious existential threats to your business. So processes are stale, and they agree.
Jepson (Ben) Taylor: (05:17)
The other issue you have is you have a knowledge loss problem. Junior people come in, you pay their tuition and they leave. You can't retain all that knowledge. And a lot of times that's trapped in people.
And then the third problem is they're not leveraging their data. The business is producing massive amounts of data, and most of them don't use it to make better decisions. And you'll notice that in that conversation, I haven't said anything about AI. But they are in agreement that these are problems that exist.
And then when I lean into discovery, the conclusion of the discussion is that they have high confidence we will find defensible ROI within a few quarters, that we're gonna fail as quickly as possible, and that we have experience and process to bring to the table. Because one of the issues you run into in this industry is naive optimism. You really want more of a seasoned realist. And when a CEO realizes that, it opens many doors.
Satyen Sangani: (06:14)
Yeah. It's funny, one of our advisors and long-time board members is a guy named Dave Kellogg, and Dave would say, "Look, for you to sell something, three things need to be true.
First of all, they have to believe you understand their problem, and that's 60% of it.
Second thing is they have to like you. That's probably 20% of it.
And then the last piece of it is that you actually have a solution to their problems, and that's probably only 20% of it." And what was obviously striking is, most people skip over the first thing altogether and go straight to the end. Maybe it's not 20, but it's certainly a smaller percentage than most people think.
In this world, you obviously have a lot of shifting trends in AI, the macro environment, the start-up ecosystem, but particularly in the macro environment, a lot of people are inspecting their data budgets with even more care and scrutiny. Do you think that's made your job that much harder? Have you had to work a little bit harder in the last 6 months, or has there been any noticeable change from your perspective?
Jepson (Ben) Taylor: (07:16)
I think we do see people pulling back on innovation generally in the industry. That's natural for a recession. They focus on efficiency. But I think it's not pulling back as much as people think, because we're at this inflection point right now. Every industry has transformational AI use cases that are well known.
So if you're in retail, telecom, oil and gas, if you haven't done anything in AI, it's pretty straightforward to come to the table with realized use cases from the market, from your vertical, that are significant.
But to give some validity to your point or concern, there are some data science teams that we see being cut a little more aggressively in some of these layoffs. And if you looked into the situation, and I'm not saying this is true for all of them, I would think many of them don't have a roll-up attribution number or a very strong ROI use case. So they were seen as a luxury. Having an innovation center can be seen as a luxury going into a recession, but if that innovation center is producing defensible outcomes, it's much less likely to be cut by a CEO or CFO. And so it's really important to have that urgency...
Jepson (Ben) Taylor: (08:31)
It's funny, for almost 10 years of my career, I didn't understand how a business works. I didn't understand that my paychecks have consequences. I never really asked those internal questions: "Am I delivering a multiple on what I'm being paid?" And I don't expect most employees to do that.
Satyen Sangani: (08:46)
Yeah. But I think it's an important point. When you speak to an executive, most CEOs generally feel some sense of urgency. But in times when things are harder, when you're sometimes fighting for survival, every investment gets more scrutiny because there is more competition for every given dollar. And I do think there's a sense in the data community that data is good for its own sake, as a means toward knowledge. But the reality that you have to constantly prove your worth and adjust, that's something people can forget or tend to forget.
Jepson (Ben) Taylor: (09:21)
Yeah, definitely. I'm a big nerd on storytelling. And one of the things that's often missing in the executive conversation is emotion. It's the speeds-and-feeds equivalent in sales: I'm telling you about my product, I'm telling you about the industry, I'm sharing case studies. There's no emotion in it. You're listening, you're being informed. You didn't ask for the meeting; we asked for the meeting. So from an emotional perspective, it's very flat, very rational. And I have a lot of fun on the emotional side.
One of the things I talk about is, how do you disarm executives? How do you bring down those anti-sale walls? And one of the most effective ways is to be vulnerable and to be human out of the gate. And so the most condensed opener that has worked for me has been, "I know what it feels like to be a junior data scientist at Intel and Micron, hoping to prove I'm smart, but anxious I might not be, dealing with impostor syndrome. I know what it feels like to be a chief data scientist where I'm hiring and firing data scientists and I'm trying to communicate value to you, but you and I are not communicating and I honestly don't understand why. I don't know why we can't communicate.
Jepson (Ben) Taylor: (10:22)
“And now I know the worst thing of all. Now I know what it feels like to pay data science payroll out of my personal bank account while my spouse is fighting with me, telling me to fire them instead.”
And now, once you have that brief opener and you have an honest conversation about it, when you get into value, when you get into experience, there's a lot of trust in the room, because they actually feel like, "Oh, so you get it. You understand that every employee has a number. The urgency is real." Because that's what I'm hitting on: urgency. Because if you're wiring money out of your personal bank account to pay payroll, what is your urgency on value? Well, it's 30 days. It's pretty specific. I think it's funny 'cause I didn't realize that before.
Satyen Sangani: (11:07)
And are you maybe transitioning to the end of the story? Do you see data science projects and AI projects often delivering value in 30 days?
Jepson (Ben) Taylor: (11:16)
Yes, absolutely. It's interesting, because you definitely wanna be wary of people who would promise that. If you're a CEO and I show up and say, "We can deliver value in 30 days," maybe, but it's gonna depend on you. It's gonna depend on your data, on your stack, on whether it's ready to go. And there's also internal politics. But I would argue that very frequently we see examples where people can get defensible value in 30 days. It's probably not the norm, but it is out there; I've seen it before. It used to be years. If you rewind the clock 5 to 7 years, it was 2 years.
The success of failure(s)
Satyen Sangani: (11:52)
I think it also depends largely on the problem you're trying to solve and how much of a moonshot, to use your terminology, you're taking with the project, and therefore the prior risk assessment and likelihood of success that goes into it.
That being said, I guess one question I always sort of wonder about is, what percentage of these projects actually succeed? If you're a maybe somewhat novice CEO in the world of AI, what should be your expectation from your team in terms of how many processes they might improve, or what their project success rate should be?
Jepson (Ben) Taylor: (12:26)
We could write a book on this question. It's a very profound question, because if I just look out at the AI market, I would argue the vast majority fail. And by fail, I mean they don't justify the time, they don't justify the head count. And they fail for different reasons. One of the reasons is they're being run by people who are trying to stand up their own stacks and haven't done it before. They look at the market compensation of someone who's seasoned, compare it to a junior grad out of Stanford, and there's less anxiety in hiring the junior grad at that salary.
Jepson (Ben) Taylor: (13:01)
There's a long list of failure points. One of them is that people work on the wrong problems, which is kind of shocking. You get all the way to the end of the project, it's being consumed, and then you ask someone, "What is this worth to the business?" And you can hear their eyes blinking. And I get it. AI is the shiny new toy, these are cool projects, and a lot of times we're off to the races. But seasoned CDOs always work backwards. They always partner with the business.
Jepson (Ben) Taylor: (13:28)
One of the other profound points of your question is, let's say you have proper process, you have a center of excellence, you're investing with the right "build it versus buy" mentality, and you bring the right experience to the table. Now, to repeat the question, what fraction of your projects should fail? I would argue: be very aggressive on delivery out of the gate — crawl, walk, run — and don't have three failures in a row. You've got political capital that will go away very quickly, and I've seen cases where multiple failures out of the gate kill interest in AI and actually do more damage to the company.
So find the low-hanging fruit that's delicious: projects that are very feasible and high value. Know your industry, get a few wins. But once you have a mature data science practice where you're delivering thousands of use cases, if none of your projects are failing, I would now fault you on the other side. If nothing is failing, it's not a very innovative company, not a very innovative culture. So there is a fraction of failure that a mature organization should celebrate. But with failure, you have the time urgency: How can we fail faster? I'd rather fail this week than four months from now. And there are some innovative, creative ways you can fail faster on projects.
Satyen Sangani: (14:43)
Yeah, I think there's a lot of intelligence in that. This idea of starting with data projects that are high reward, high likelihood, or even moderate reward, high likelihood, is probably better than taking on the high reward, high risk project from the outset, because there is a learning process to getting to use data.
One of the things you said earlier was that a lot of times, executives don't ask about the how. And maybe it's my defect as a founder, but that doesn't quite describe me, in the sense that often what I find is that teams have problem drift, scope drift, feature drift, so you get all this drift in a project. And often what I find is that people are trying to do too much, or their work diverges from whatever metrics we're trying to affect. Is that just about being very focused as an executive on the goal? How do you recommend execs keep their teams accountable to what matters to them? How does one do check-ins? How does one think about driving those outcomes?
Jepson (Ben) Taylor: (15:48)
Great question. One of the failure modes that exists on the experimental side is not involving the business. And by involving the business, I mean from beginning to end.
And this is something I've made mistakes with before. I spent a few months on an AI project years ago where we realized, months in, that the actual subject matter expert had not been invited to the meetings because they weren't senior enough. They weren't a VP, they weren't a director; it turns out they were a technician. So when involving the business, you wanna have a subject matter expert in the room. There's an interesting thing that happens here, where some data scientists take on a level of arrogance. They can actually be pretty intimidating to other people in the business. When I was at HireVue, every member of my team had a PhD in physics; that can be a little intimidating. And I was definitely a part of that before. So if you're telling me about your customer success problem, I'll listen for 20 minutes, and then I'll turn around and say, "Okay, we've got it from here."
Jepson (Ben) Taylor: (16:45)
There are a lot of issues with that. I don't know all the features, all the data. I'm not the right one to innovate on new ideas going into it. But I'm also the wrong one to determine value, because I'm gonna come back with, "The AUC looks really good," or the accuracy, and call that the value.
But the customer success rep, they're gonna be able to translate that into a defensible ROI number that rolls up in their P&L, that they can now talk to a CFO about. The other thought that shows up here is you never wanna bet on one horse. That's a very scary thing. Even if we decide we're gonna work on this AI project, and it's proven, it's in our industry, we have the data, we have the business, you can still fall prey to the data. What if you have issues with the data? What if you don't have enough? What if there are issues with the targets? So I really like prioritizing: come up with 10, 15, or 20 project ideas, have a brainstorming workshop where you're really engaging the business and the process owners, then prioritize based on feasibility and value, and start with 3 or 5 if you have the resources. That's much more likely to succeed, at least with one.
Satyen Sangani: (17:54)
So maybe one last question on this, Ben. You speak with a lot of executives; what is the most common way you fail? When you're not convincing, when people say, "Yeah, maybe not now. Maybe we'll consider an AI project with Dataiku later," what are the most common rejection patterns that you see from those people who are not yet buying?
Persuading with storytelling: two key rules
Jepson (Ben) Taylor: (18:13)
The two most important rules — and you're gonna see me keep coming back to storytelling rules — are authenticity and knowing your audience. Often, if you really know your audience, if they're saying "Not right now," you can quickly see from their body language that they are not interested at all. And if you lean into the story, you'll find out they've had 3 false starts internally over the last 2 years. So I don't blame them. They've had promises and failures, and yet again, here's one more person showing up to the table with promises. But those failures they had previously fall into different modes. There are so many modes of failure.
So for an executive who is quite determined that AI is not a top 10 priority this year, that it's a "science project, nice to have, not the biggest priority," I would say, "Is there a bias that you have? Have you had failure before internally? Or do you have friends who have had failure?" If they admit to that, we can discuss it. If they haven't had failure, then I would say, "We are seeing transformational gains in your industry," and we can actually dive in.
Jepson (Ben) Taylor: (19:22)
If you're in retail, telecom, or manufacturing, we can dive in and say we're seeing transformational gains, and let's do discovery quickly. What's a number in your business where a small change makes you excited? Where is your growth bottlenecked by human capital? We're quickly gonna dive into some human CEO discussions that are a little different for them. They haven't had someone from AI take that angle of attack, where the conclusion of the story is that I'm building trust that we have the most applied experience in the world. We've had the most failures, but we've had the most wins. And I'm also respecting who they are: I'm not selling them on a moonshot, I'm not trying to give them something that's not gonna offer impact, and I'm gonna back out if I don't see, through a workshop, a path to a meaningful win for them.
Satyen Sangani: (20:10)
Two things stand out from what you mentioned. The first is this idea that people have had bad experiences with AI, which speaks to the idea that most people have either run or tried a project and many of them have failed, which I think would be par for the course.
The other thing you mentioned, though, which was really interesting, is that probably the biggest thing that might sell somebody is social proof, and perhaps the feeling of being left behind, or competition. I think that's super interesting — that you need to do data because other people are doing data and AI. And there's probably enough early evidence in the ecosystem that one of the reasons I do believe, and agree, that AI is gonna accelerate through this down cycle is that there's still enough competition around it: if you're not smarter, your competitor might get in front of you. That may not be the most noble of motivators, but it's certainly one, I would say.
Embrace what you fear: the ChatGPT opportunity
Jepson (Ben) Taylor: (21:04)
I love that you bring that one up, because it's one that people remember. Remember the big data hype cycle in 2012: "If you have big data, you need Hadoop. If you don't get it, your competitors are gonna crush you, and you need these data lakes." And there were a lot of movers! A lot of people bet big on a data lake, and many of them were disappointed. It became the data swamp.
And so selling based on fear, "You need this or else," doesn't work very well these days. But that said, there are surveys with executives showing it's harder to grow your company these days, for different reasons. They're worried about new competitors coming up. I actually had a fascinating conversation with a CEO a couple of days ago. I was on a ChatGPT panel, and she came up at the end; she owns a copywriting business. And she said she is terrified of ChatGPT. I loved talking to her, because I told her, "If you are running away from ChatGPT, you should be terrified.
Jepson (Ben) Taylor: (21:57)
"But if you, the business owner, the subject matter expert, the executive, begin practicing with ChatGPT two to three hours a week in earnest, trying to see if you can accelerate a unit of work — because you, as the CEO of your company, should become the internal expert — you can find where you cross the line to where it's economic." Because the economics will constantly improve. The moment she decides it's economic, even break-even, that's a transformational shift: then she has to train her employees, then she has to potentially risk capital. I was almost envious of her position, because I'm not a copywriting expert. I don't have a copywriting business and I don't have that network. But maybe for someone who's technical, I'd push that back on you: "Would you want to own a copywriting business right now, if you believed you could be a first mover and raise capital with traction?"
Satyen Sangani: (22:56)
Probably. I don't know enough about the use cases, and certainly the problems to have a specific opinion about that question. But it would seem to me that if you were able to be a first mover, there's a lot that you could do and a lot you could learn.
Actually, ChatGPT is now one of the most talked-about trends in tech, perhaps the biggest I can think of since crypto came up. It just feels like something every single person is talking about. What do you see as the early use cases? What stands out to you as the most proximate applications of the technology that people are actually looking at right now?
Jepson (Ben) Taylor: (23:31)
For the early use cases, there are some hurdles, I think. One of the ones I'm most excited about is education. I'm a PhD dropout, but I do have a master's in chemical engineering, and there were some classes I absolutely hated: thermodynamics, physical chemistry. I don't know if I truly understood them to the depth of the experts who wrote those theories, but I would love to. Or electrodynamics; I would love to.
With ChatGPT, you can ask it to explain string theory or general relativity; name dozens of very complicated topics, and it will explain them to you like you're 5. And so there's a scenario where ChatGPT is going to completely change education. Some quick wins I think you could see popping up: how many business books do we read on Audible? Play Bigger, Lean Startup...
Jepson (Ben) Taylor: (24:20)
We could probably share a list of books that we've liked recently. It's one thing to read a book like Harry Potter or Lord of the Rings that has setting and character development; those books will not be compressed. But if you say, "I read this book Play Bigger," even on Audible, I don't wanna read the whole thing. I would rather ask ChatGPT, "Based on the knowledge I've consumed, which you are aware of because you're consuming it with me..." This becomes quite interesting when it's individualized. If ChatGPT is consuming my books and my experience, I can say, "Based on what I've consumed, and these 10 books that were recommended, please summarize the most profound points or new ideas in 2 pages or 3 minutes."
I would love that. I think a lot of people would love that. They'd say, "Great, I am consuming the wealth of knowledge, or any profound idea in business, 10 times or 100 times faster than other people."
Jepson (Ben) Taylor: (25:11)
I hate email. Wouldn't you love to have an AI executive assistant that was summarizing and giving you "too long; didn't read" versions, such that your email answer rate just went up 10 times? I see some CEOs who are complete monsters, in a good way, on their phones. They run their entire business from their phones. I can't do that; I get frustrated. If I'm trying to answer important emails on my phone and it's really important, I have to go sit at my laptop. And I think ChatGPT can make me behave like some of those CEOs I admire.
Satyen Sangani: (25:42)
I didn't realize that ChatGPT had this sort of contextual capability where you could train it on a limited set of data and then have it give you contextualized answers. Is that something that is available? And how much context can you feed it? And how do you feed it that context?
Jepson (Ben) Taylor: (25:56)
That's still not widely available to people. We know the process of how they did it; there are some very good tutorials online. And there are a lot of articles out there where people are trying to poke holes in it, saying it does this, it doesn't do that. I'm not that concerned about that. It shows a crack in the dam. It shows a breakthrough, where now you need to deal with this being a thousand times better on your data, a thousand times better on your experience. And that's very different. That's why this gets so much excitement and attention from me. Consuming text is a unit of work that hasn't been compressed very well, and ChatGPT is compressing it. So it's a massive time saver.
Satyen Sangani: (26:39)
Yeah, it does feel like it could have huge implications for so many different fields: documentation, copywriting, education. So much code is incomprehensible. If you could take blocks of code that are otherwise inscrutable and translate them into something commonly understood, that could be really useful. Gosh, there's a ton.
Satyen Sangani: (27:02)
There's a ton. Has this guided where many of your conversations at Dataiku are going? Is this something that has spiked in terms of level of mention and interest?
Jepson (Ben) Taylor: (27:10)
There's definitely interest.
Satyen Sangani: (27:13)
Or is it...
Jepson (Ben) Taylor: (27:14)
I would say it's mostly internal geek interest, because Dataiku is very much focused on the Global 2000 and on applied use cases. There's still a lot of work to be done when you look at oil and gas. I don't want people to get the false impression that this is a pivot for every AI company in the world. It's not. It's something that AI professionals can appreciate.
Yeah, it's interesting. One of the things people might not know is that before I joined Dataiku, I used to compete against them. And it's interesting, because you form opinions of your competitors. You have to. It took a few months to unwind that, but when I did, I was really blown away. There is a spectrum in the AI platform play: you have services companies, which I won't name, and then you have platform companies. And I'll place Dataiku on that spectrum.
Jepson (Ben) Taylor: (28:03)
Dataiku, in my mind, is one of the ones doing the best on the platform play. And by platform play, I mean democratization: your employees are doing the work. There's a story I'm allowed to share about GE, a 100-year-old company that needed to hire 100 data scientists. They were able to bring in a platform and deliver 2,500 use cases worth hundreds of millions of dollars. Those miracle stories evoke a lot of emotion, but they also build a lot of trust in the executive sales cycle. But it's kind of interesting, the biases you have to unpack.
Satyen Sangani: (28:37)
So people are looking to get into AI, and they see things like Dataiku, and certainly your former employer DataRobot, and companies like Databricks. They all have data in their name, which is good. So what does Dataiku do, and how does it position itself in a market that, to the uninitiated, is fairly confusing?
Jepson (Ben) Taylor: (29:00)
Yeah, a lot of these companies play different roles. Dataiku has really taken the collaboration approach. So if you look at their marketing and their messaging, to them it requires the human element. They're leaning into collaboration. And like we were talking about earlier, you desperately need collaboration: on what features are going into a model, but also on the outputs, understanding what the drivers are, or at the prediction level, what the outcomes are. Building an AI model that the business can't inspect, that's a very scary thing. That's a liability.
And so Dataiku, I've been very impressed by their ability to retain their customers and expand their accounts. But more importantly, look at the scale at which users are using the product. I think SaaS products are very complicated. You have customer success, you have users, you have sales, you've got technical depth. There are so many things that go into it, and having done it myself and then sold my company, it's really hard to do. It's incredibly hard to build the right features for the right customers with the right investment, to not overengineer it, but also to have the appropriate amount of technical depth. It's very hard to build a good platform company, and I see a good platform company at Dataiku.
Advice for joining a startup
Satyen Sangani: (30:05)
So you've now had the opportunity to work at a couple of different AI companies. You've obviously founded your own AI company. You've done the work of data science. And at least one interesting thing... The moment that you left DataRobot, and we won't delve into the specifics around what exactly happened, was a little bit of a disillusionment moment.
You kind of were at a place where you believed, and then you found that, "Hey, maybe this isn't exactly what I thought it was." There's a lot of learning there, particularly for people who are interested in or thinking of joining startups. If a friend of yours came along and said, "Hey, I wanna join a startup," what would you tell them? How would you advise them to think about that problem of joining a company that's early stage, or even mid-stage, where it's not a Google, it's not an Amazon, so you don't necessarily have a brand that everybody knows about?
Jepson (Ben) Taylor: (31:05)
I would encourage everyone to join a startup because you're going to learn more lessons, wear more hats. It might be a little bit more stressful.
There's also this transition. Early on in my career I was... This might be a little bit more provocative, but I'll say it anyway. I think it's true. I was told the lie of job security. Job security is a lie. The thing I had to graduate to was this idea of market security. And in startups, you will find market security because it's a high-risk venture. So really, you are deciding your element of risk. If you are joining first-time founders that you like: early stage, high risk. If you're joining repeat founders, if they're like Series A or B, much lower risk.
The other thing, too: I'm very close friends with Jeremy, the founder of DataRobot. He's the one that bought my company. So there's a personal aspect there. So when he moved on... You wanna work with people you like, and I really liked working with Jeremy. So that was maybe part of it as well.
Satyen Sangani: (31:58)
I love the market risk statement, because I think what it tells you is: it's not really so much about the company you choose, but about the domain that you wanna learn about.
Satyen Sangani: (32:12)
And I think that empowers you to pick your learning journey and then say, "Look, I'm gonna live in this world and I'm gonna learn something. And if that thing is not gonna make me bajillions of dollars, fine, and maybe I'll win the lottery." For a lot of people, that's a motivation for doing a startup, though hopefully not the primary one for most. But if you can say, "Look, I know that I'm gonna win because I'm going to learn something," that's really powerful. That's a really powerful way to think about it.
Jepson (Ben) Taylor: (32:38)
There's an extension there. It's not just what you're going to learn. Our careers are a random walk. If I really challenge people on what they do for their jobs, especially if they've been stuck in one industry, they haven't had enough exposure to know if they'd like anything else. And I've been very fortunate to work in semiconductor manufacturing, finance, HR, my own startup, and then joining DataRobot as the chief evangelist, where I really sat in marketing and sales. And I loved marketing and sales.
If you had told me in college or high school that I was gonna love marketing one day, I would've had a pretty aggressive response: "Absolutely not." 'Cause I was very much focused on this technophile ego comparison: I know more math, I know more compute, I'm a high-performance computing nerd, which is so dumb. But at the time, I was very much focused on that self-validation, that "Look how well I'm doing at math, look how well I'm doing at high-performance compute. This means I'm smart."
Jepson (Ben) Taylor: (33:32)
And I've talked about this publicly. I used to not put creatives on par with analytical people. If you'd said, "Who's smarter, a data scientist or a creative director?" I had that mentality years ago. Now I see there's genius in both, but I would argue there's a little bit more courage on the creative side.
Because on the analytical side, it's black and white. You do this model this way, you do that. On the creative side, how many ways are there to do a video segment? How many ways are there to do words, to do storytelling? You're playing in the gray. And I've been so impressed by some of these creatives over here. But also, I didn't know I had this passion. So a startup will force people to touch new skill sets that they would've never had access to because jobs in big companies tend to be so focused.
Satyen Sangani: (34:19)
Yeah, especially in the early stages because most of these problems are so creative that you don't necessarily know what the right response or answer is, or what's going to influence the market. And so I totally agree that there's many different types of intelligence.
You said something that I completely agree with, which is this idea of the myth of job security. It's funny, 'cause you look at the news and there have obviously been layoffs at Meta and layoffs at Google, and those were formerly thought to be the most stable environments. And often it's the thing that is new that has the most generative power and capability.
One of the things that you mentioned was market selection when working at a startup, and the other thing you mentioned was people. You talked about experienced, multi-time founders, but I might argue that sometimes the "special sauce" that makes up a startup means the person who is naive and ignorant is perhaps more likely to succeed in a given market or space, because they just don't know what they don't know, and maybe they're a little bit hungrier.
Evaluating the next step of your career
Satyen Sangani: (35:20)
The people-selection process to me is always the hardest thing. How do you, walking into a startup, think about evaluating that, especially if you don't know the people that are starting the company or the leadership?
Jepson (Ben) Taylor: (35:30)
When I posted on LinkedIn that I was looking for a job, it had three million impressions across LinkedIn, TikTok, and Y Combinator, and that led to 110 jobs in 48 hours. Most of them were interviews with CEOs and executives. The whole thing is a little dumb.
But I had two people from Dataiku show up. On the second day, I think, one of them said, "I don't know who you are. I'm not trying to convince you to join Dataiku, but whatever you're going through seems really crappy and I think you need a hug." And I'm in war mode, just hammering through these interviews, and this individual shows up. I'm like, "Thank you." And then there was someone else who showed up amid the chaos, just saying Dataiku is the best company. And when I finally met the French leadership team, they're all very charming. That bias that I'd had was completely eroded away. And as I've joined the company, the more people I've met, the more that's been reinforced. Life is too short to work somewhere where you don't like the people you work with, or you don't like the product or the vision you're focusing on. I would encourage people not to fall prey to being stuck in a job, especially a job you don't like.
Satyen Sangani: (36:36)
And what is it about those leaders? 'Cause you've seen now multiple leadership styles. You've seen Jeremy, and you've seen Florian [Douetteau], the guy who founded Dataiku, who's been there for a long time and who, as you mentioned, is French. You've seen the regime change at DataRobot. What do the great ones do?
Jepson (Ben) Taylor: (36:49)
I don't know if people have talked about this, but a lot of CEOs will wear a second hat. So people joke that Jensen [Huang], the CEO of NVIDIA, wears the CMO hat. He's very much a marketer. Florian cares so much for his culture and people, so you could argue he's wearing an HR executive hat. And that shows in the company: it's been named a best place to work in four or five different places. That's not a fluke. The best people can give the best work, and they can attract better people. But there could also be a long conversation about what makes a good executive. There are a lot of different aspects of what makes a good CEO.
Satyen Sangani: (37:29)
Yeah, and it comes in all different shapes and sizes. And do you have a bias? I mean, having founded a company yourself and worked with Jeremy and now Florian, do you have a bias towards founder-led companies over professionally-led ones? Is that a pattern that you would be personally looking for? Because that's a big debate in the Valley, and it's gone on for a while.
Jepson (Ben) Taylor: (37:47)
Yes, I do. But people change over time. So if you run into me 10 years from now, I really have no idea what my opinions would be. But right now, I put a lot of value in a founder-led company.
Satyen Sangani: (37:58)
Yeah. And any further detail on what it is about founder-led companies that's different for you as an employee, or for you personally?
Jepson (Ben) Taylor: (38:04)
Founders are inspiring. There's the hero template. They've got a vision. They've got people. They care. Founders care for their employees. They care a lot for them. Employees are family to them. I admire them. But I think having gone through the valley of suffering myself, I have a massive amount of respect for founders because they carry a weight that most people will never realize. So it's hard for me not to like them.
Satyen Sangani: (38:28)
You had this statement on LinkedIn and you said something to the effect of, "To be a founder, you have to get as close to the line of lying and never cross it." That struck me. As a founder, I look at it, I'm like, "Oh, I don't know. Do I get close to lying? I'm not quite sure that I do that." [laughter]
Jepson (Ben) Taylor: (38:46)
That's actually not my line. That's why I had it in quotes. But I liked that line. I'm blanking on her name; she was a chief strategy officer at Decode Health who has advised 100 startups. What she's essentially saying is: imagine if you were truly honest with your employees, your customers, your partners, and the people around you about what it feels like on a day you're being sued, or a day you have to fire someone, or a day you lose an account, or when you're worried about burn, or when you're thinking about what it'd be like to be employed again. A lot of people talk about this. There are other CEOs and thought leaders who'll talk about having the worst call of the month, the worst call of the quarter, and then on the next call, with high-value prospects, they're smiling. So we talked about the pit in the stomach. It's not really talking about lying in a way that's shady, like I'm going to nearly lie to manipulate you. Some people might misinterpret it that way.
Jepson (Ben) Taylor: (39:34)
It's about really being guarded with your emotions, but also trying to lean in. You're lying to yourself. It's reality distortion. If we really talk numbers, the likelihood of your startup succeeding is near zero. So you're lying: you really think you're just going to succeed? I think those are the elements that come out of that quote, which I think are fun. But hopefully people don't misinterpret it as "sell snake oil" or "do things that are so close to lying that you can succeed," because I definitely don't agree with that.
Satyen Sangani: (40:05)
I think you do have to believe. I think you probably don't think you're lying to yourself, even if statistically, it's likely that you are. You probably think of yourself as an outlier, definitionally. And then of course, there's this question of, at what point does that become so disconnected from the data? Elizabeth Holmes is a great example, a totally degenerate case.
Jepson (Ben) Taylor: (40:36)
That's an example of crossing the line. So get close, don't cross. Because there are examples... If you sell a more aggressive contract than you should have, are you truly prepared to support it? And a lot of founders, they'll figure it out. They'll figure it out. Once they get the cash, they'll hire, they'll build a failsafe for the partner, they'll do something. But if you are being truly honest on a bigger contract, you're leaning in. And that's part of a startup, leaning in.
Satyen Sangani: (40:58)
You have to lean in, because of the optimism. If you think about life as a Monte Carlo simulation, there's a whole set of circumstances you could have, some bad and some good. But if you as a founder, or a leader in general, are constantly thinking about all the bad circumstances, that almost becomes a self-fulfilling prophecy, because you won't invest enough, you won't trust enough, you won't inspire enough, and all of those things are necessary. So there is this, I think the Steve Jobs term you used was "reality distortion field," and knowing that crossover point, which I think you've seen in your many experiences, is a really tough thing to do.
Satyen Sangani: (41:36)
Maybe I'll switch over to... Go ahead.
Jepson (Ben) Taylor: (41:38)
I was gonna throw an emotional bomb to the audience, to give them a glimpse into the horror of being a founder. I was driving in the car with my wife, and she was talking about family or something, something had happened, and she asked me, "Are you afraid of death?" And I answered in 200 milliseconds, a knee-jerk reaction: "Absolutely not." And the reaction was so bold and so fast that she had to inquire. She said, "Explain this." And I said, "At least I would get a break."
I was never suicidal. But hopefully that gives the audience a glimpse: if death is a break, that's a pretty heavy load. I'm working every possible hour, I'm honestly working too much, I'm failing at home, I'm failing at being a dad. I'm doing much better now. And that's why you need a network of founders. They've been there before. I've been there before. You've been there before. You can have these conversations.
Satyen Sangani: (42:30)
Yeah. Look, it's funny, when you're early, you're probably not as busy, but there's always this existential fear of "God, is this gonna just not exist? Am I a fraud?" All of that stuff. And then later it just becomes a lot. To your point, at least I'd get a break. It's just an inexorable amount of work, but it is a tremendous privilege.
And I think the thing that I try to come back to is that, man, life gave you all of this money and all of these smart people to go work on a problem that's worthwhile and useful and... Because one of my board members constantly reminds me, "A good life is about good problems."
Jepson (Ben) Taylor: (43:04)
Good problems with driven people. If you can work with smart, driven people on good problems, especially if they're worth something...
There was a little level of arrogance when I left HireVue, because HireVue was seen as a success. When I was there, we delivered 14 patents, we delivered an AI product. That product was accelerating growth in the company. That product was accelerating the evaluation. I did that for four years, loved everything. But I think you get that itch as a founder: "What could I do if I was in charge? How fast could I go if I was in charge?"
What you find is that with the problems that exist in the business, it doesn't matter how smart you are. Intelligence is not one-dimensional. You can be brilliant in one domain and an idiot with product-market fit, elevator pitch, pricing, networking. We see that with technical founders: they really struggle with the business side. They wanna tell you how and why they're so smart and how they built this amazing thing, and then when they realize they have to sell it, they're in trouble.
Remembering (and celebrating) the human soul
Satyen Sangani: (44:08)
Maybe circling back to AI, I wanted to inspect a phrase. There's a phrase I didn't quite understand, or I don't think I understood it with the profundity it seems to intend, which is this saying that there's no soul in the machine, only in front of it. Tell me a little bit about what that phrase means, and why does it even matter? Why are you even talking about it?
Jepson (Ben) Taylor: (44:30)
So there are a lot of things behind that statement. At Dataiku, we don't celebrate the machine, we celebrate the user. Here's a Tron reference for you: we celebrate the soul, the human behind it, because the human is gonna understand how to quantify the value, the human's gonna come up with the best problem, and the human is ultimately the creator, the artist, the creative. So one thing: Dataiku puts an incredible amount of focus on the user experience. Users are delighted, users use the product, they continue to use the product, they don't have to rely on outside AI experts to do everything for them. They love the product.
The other thing that stood out to me: I talk a lot about soul. If you have some marketing you don't like, a sales deck you don't like, or a talk that's forgettable and you're worried about it, and you reach out to me, I'll say this has no soul. So let's add a soul to it. And really, it's just leaning into emotions.
Jepson (Ben) Taylor: (45:21)
So I love... If you look at some of Dataiku's marketing, it's got soul, it's got passion. They celebrate the human element. And that's what a human is: a creative, passionate soul with emotion. But unfortunately, in software, we typically forget that. And so I think that messaging is really tipping the hat to the human.
Climbing (literal) mountains
Satyen Sangani: (45:40)
Yeah, who you're trying to empower. Before we end: obviously, you founded companies, you joined them, but you also apparently have this passion for mountain climbing, which has always fascinated me. Climbing literature like Climb, Into Thin Air, and Free Solo, these individual feats are just magical. What have you learned from this experience? How did you get into it? What's your journey there? Tell us a little bit more about this. Is it a hobby? How would you describe it?
Jepson (Ben) Taylor: (46:09)
I'd say it's a disorder. [chuckle] It is a miracle that I'm alive today, because I've been in multiple avalanches, often I'm solo. As a young kid, I remember hiking in mountain lion territory by myself, in the snow, big distances. This is all for backcountry snowboarding, but there is that element of, "I'm gonna conquer this mountain. I'm gonna conquer this mountain." And I think I'm built for it. So for me to put crampons on and kick up a 3,000-foot couloir, it's awful. I lose toenails doing it, but for whatever reason, I find myself doing it over and over again.
I put it in the same category as meditation. So some people do meditation. If I am thinking about traversing this rocky ridge, or if I'm making sure I don't step too far to drop a cornice, or if I'm really worried about this slope sliding and I've got my mountain ax and making sure it hits things below, I'm not thinking about anything else.
Jepson (Ben) Taylor: (47:02)
It's so dumb to say because there is this mental health aspect, but it's also the physical reality, and I talk about this a lot with my wife, that my biggest concern is one day I won't come home. The likelihood of me dying young is high and there's an element of extreme selfishness to it because I have three beautiful kids. As long as they're taken care of financially, I do hope to see them raised, I do hope to see my youngest leave the house. And I also am friends with mountaineers that are 60.
Jepson (Ben) Taylor: (47:31)
They just avoided the odds. But it's hard to describe the feeling when you're standing on top of Mount Superior. It's 5 degrees, you've been hiking through the dark, you've sworn a few times because your crampons are slipping on rock, but you've succeeded and now you're watching the sunrise. I can't transfer that feeling to someone. I can't even do it justice explaining it. When you're standing up there, it feels sacred.
I will often find myself on tops of mountains where you feel this sense of... It's almost like the mountain is telling you you should not be there. "You should not be here, you as a human, but here you are. Here you are with your crampons and your mountain ax, and death is in the air. There are avalanches out there." I don't wanna overdramatize it, because a lot of outings are a little bit more casual, "Hey, we're gonna hit Superior with some friends." But there are some aspects... I could show people photos where they see the photo and they're impressed, but underneath the hood, there's a bit of selfishness.
Satyen Sangani: (48:31)
There's a lot of presence. The meditation analogy sounds about right, because you've gotta be present and in the moment in order to be able to do it. And it's true that there are decidedly dramatic moments in the work, but it's probably the case that a lot of the times when people get hurt or pass away, or there's some major accident, it happens in mundane moments when people weren't fully expecting it.
Jepson (Ben) Taylor: (48:56)
And it does happen to the most experienced people too, all the time. Because sometimes the more experienced you are, the more risks you're willing to take, because you know the snow. I'm not worried about avalanches if it falls into the category of new snow instability, because I can do a ski cut. I've caused big avalanches, like 1,500 vertical feet, and I'm not caught in them because I'm expecting it. And then there's persistent deep slab. That's the stuff that will catch anyone off guard.
I do wanna broaden this beyond mountaineering. There are a lot of executives and people in tech who want to push the human soul, who want to push the body.
Satyen Sangani: (49:33)
An aberration, I mean, not as a pejorative. Even though it sounds like one, it's not. It's an aberration, because the average person wouldn't do these things, but there's obviously a reward out of it.
Jepson (Ben) Taylor: (49:41)
Yeah. I do combine work and play. So I will have business meetings while I'm mountaineering, if I know I have coverage. These aren't meetings I'm running, and they're not external, so I can be a passive listener. I sent photos to our Japanese team of where I was during the call. I'm touring 2,000 vertical feet, I'm listening, I'll pause and comment out of breath, and I'll keep listening, and then I'll send them a photo of what I was doing, which is quite funny.
Satyen Sangani: (50:07)
It's pretty awesome. So there's cell coverage on the mountain. You're just...
Jepson (Ben) Taylor: (50:11)
Well, maybe one day you'll hear a story that Ben dropped off an important Zoom call, and people thought it was my network, but it was actually something much more serious.
Satyen Sangani: (50:19)
Yeah. That feels like a good place to cut the podcast. [laughter] Let's hope that we don't hear that.
Satyen Sangani: (50:26)
Ben, thank you for taking the time. This has been a really fun conversation and a lot of territory covered, so I appreciate your coming on.
Jepson (Ben) Taylor: (50:34)
Absolutely, thanks. That was a lot of fun.
Satyen Sangani: (50:43)
Of the many insightful tips that Ben shared, the most powerful are about vulnerability. Vulnerability means you have the bravery to be open about your flaws and to ask for feedback from others. It means that you go climb a mountain with the full awareness that you might not come back from the trip.
As data radicals, we often get caught up in the numbers. Ben reminded us that storytelling and the adventure of the work are their own rewards. They also allow us to enjoy our work, find meaning, and have real impact. We can all stand to incorporate more vulnerability into our professional lives. We're all human, after all.
Thank you for listening to this episode. And thank you, Ben, for joining. I'm your host, Satyen Sangani, CEO of Alation. And data radicals, stay the course, keep learning and sharing. Until next time.
This podcast is brought to you by Alation. Alation achieved eight top rankings and 11 leading positions in two different peer groups in the latest edition of the Data Management Survey 23 conducted by BARC, the Business Application Research Center. Read the report at alation.com/barc23.