The Death and Rebirth of Data Privacy

With Michelle Finneran Dennedy, co-author of The Privacy Engineer’s Manifesto, CEO of PrivacyCode, and one of the world’s first chief privacy officers

Michelle Finneran Dennedy
#1 CPO and CEO of PrivacyCode

Michelle Finneran Dennedy is the co-author of The Privacy Engineer’s Manifesto, and one of the world’s first chief privacy officers. She was the CPO at Cisco, Intel, Oracle, and Sun Microsystems, and today is CEO of PrivacyCode and partner at Privatus Consulting.

Satyen Sangani
Co-founder & CEO of Alation

As the Co-founder and CEO of Alation, Satyen lives his passion of empowering a curious and rational world by fundamentally improving the way data consumers, creators, and stewards find, understand, and trust data. Industry insiders call him a visionary entrepreneur. Those who meet him call him warm and down-to-earth. His kids call him “Dad.”



Satyen Sangani: (00:03) At the advent of the internet, Sun Microsystems CEO Scott McNealy made a famous declaration. “Privacy is dead,” he pronounced. “Get over it.” It’s a statement that still reverberates more than two decades later. Today, we spend more of our lives online than ever before, and the data on all of our digital activity dictates almost everything we see. As a result, it sometimes feels like your computer is spying on you, like it knows you just a little bit too well. And there can be benefits to sharing your data. Netflix suggests the next amazing show to binge. Amazon can tell you which TV to buy, and a certain incredible enterprise software company can tell you which data you need to use and how to write better queries. Amazing. Yet in the wrong hands, data can be used maliciously. When does suggestion cross over into deception?

Satyen Sangani: (00:54) Historian Yuval Harari warns us about the dangers of hacking humans, making people do something they otherwise never might have done. As data radicals, we have a responsibility to make sure we use data ethically and responsibly while also giving our teams the information they need to create results. So, no, privacy isn’t dead, but it is really, really complicated.

And do you know who else didn’t think that privacy was dead? Scott McNealy. Shortly after his infamous quote, he named Michelle Finneran Dennedy Sun’s first chief privacy officer.

Michelle is our guest on today’s episode and a leading expert in privacy. She began her career as a patent lawyer and is the co-author of the book The Privacy Engineer’s Manifesto. Today, Michelle is the CEO of PrivacyCode and a partner at Privatus Consulting. She also was one of the first chief privacy officers and served as the CPO at Cisco and Intel. If you’re having issues navigating the complexity surrounding privacy or just want to understand more about the history of privacy on the internet, then this episode is for you.

Producer Read: (02:05) Welcome to Data Radicals, a show about the people who use data to see things that nobody else can. This episode features an interview with Michelle Finneran Dennedy, CEO of PrivacyCode. In this episode, she and Satyen discuss the evolution of the chief privacy officer role, designing privacy policies, the blockchain, and much more. This podcast is brought to you by Alation. Data citizens love Alation because it surfaces the best data, queries, and expertise instantly, so you can stop hunting and gathering data for days and start driving value for your business in minutes. See why more than half of all Fortune 100 companies today use Alation. Learn more about Alation at alation.com.

“You Have Zero Privacy. Get Over It.”

Satyen Sangani: (02:49) Michelle and I started by discussing Scott’s famous quote.

Michelle Finneran Dennedy: (02:53) The exact story came up at a press conference. It was the launch of Jini — not Atlassian Jira, but the Java-based connectivity tool that allowed you to do wireless printing. (Ooh!) And the fear from one of the journalists was, “What if I’m in a healthcare office, and you can just send my medical records to a printer? Ah! Then they’re going to be off in the wild.” And so that’s when Scott retorted, “You have zero privacy. Get over it.” And I think he stands by those words, which I think is adorable.

He did say them last century — they are a little bit dated — but we still hear it today. We hear it today that we have no privacy because we have all of this technical capability, because data science has progressed so radically, because data sets are so accessible. And so there’s one sort of camp that believes that privacy — which I define in a fundamentally practical sense as the authorized processing of personally identifiable information, PII, personal data, however you want to tag it, according to moral, ethical, legal, and sustainable principles — is dead.


Satyen Sangani: (04:50) This is an interesting point. So you’re saying, in some ways, Scott kind of had a point in the sense that, look, if you’re an actor in society, whether you’re swiping a credit card or visiting a website or doing anything that a modern human being would do, stuff’s going to get recorded. You can’t just go live like a hermit in the forest and be a part of modern society, which I think is a fairly interesting point. But you’re making, then, a second point, which is to say, so then privacy is really not about whether or not the data exists, it’s whether or not it’s treated in the appropriate way and processed in the appropriate way.

Michelle Finneran Dennedy: (05:28) That’s right.


From Patent Lawyer to Privacy Pro

Satyen Sangani: (05:29) This is obviously a topic that you viscerally latched onto early in your career moving from being a patent lawyer to this new domain of privacy, which I think is well known now, but at the time probably wasn’t as well known, right?

Michelle Finneran Dennedy: (05:44) Not at all.

Satyen Sangani: (05:45) What drew you to it personally? What was it that made you say, “Okay, I care about this, and this really matters to me.”

Michelle Finneran Dennedy: (05:54) I joke that I was raised on a raised floor. My dad was in the Navy in the late 50s, early 60s, and got his hands on computing, which is why he joined the Navy — in addition to the fact that everyone was getting sent to Vietnam at the time. And so I literally grew up on a raised floor where he would take me to work on the weekends with him. And working in the data center — he put the first firewall into Corning Glass Works, he did the entire identity stack for United Airlines 20 years ago — he had this very deep, technical side. His degree was in mathematics because they didn’t have a degree called electrical engineering or computer science. And so later in life, I was at Sun, handling trademark portfolios alongside this very unwanted thing called privacy, because our leader at the top had said it was something worthless.

Michelle Finneran Dennedy: (06:47) And so I looked at it through the lens of my own personal experience, understanding how difficult it is to keep a data center running, how complex and difficult the glorious, under-lauded field of data quality is, and then everything came to a head for so many of us on 9/11. I was out on maternity leave. My older daughter was six days old. I had booked my mother on Flight 93 out of Newark — they were living in Princeton — and I said, “Mom, the baby’s late. I really just needed a week alone with the baby, so why don’t I get you off of the September 11 flight and put you on the 14th?” And that little data decision preserved her life for the next 16 years that she lived. And so there I was: I’m pumped up on hormones, I’ve got this six-day-old baby, my mother has just narrowly escaped being murdered by terrorists. I’m a New Yorker, so I’m watching Cantor Fitzgerald — my roommate was at Cantor Fitzgerald in New York City — so there’s all this stuff.

Michelle Finneran Dennedy: (07:57) And I’m thinking, for all of the stuff and the noise and everything else that was going on, there were sort of two messages in my head. One was that the individuals I watched die at Cantor were people to me; each one was an individual. They weren’t casualties or murder victims or terror victims, they were people with stories that had ended. And the other side is McNealy and Larry Ellison are on TV talking about how privacy’s dead — obviously everyone has to give up their data, we have to hunt terrorists. And I thought, first of all, I’ve been in tech for long enough to know that more doesn’t always equal more. More data was not the problem on 9/11. The lack of appropriate sharing and critical thinking about intelligence and risk in a modern society was ultimately what the congressional report concluded.

Michelle Finneran Dennedy: (08:49) And so it was literally on that day and the day after that Jonathan and I got on the phone and talked about pulling together a business offering that asked: why have encryption if people don’t matter? Why are we buying an identity management stack with audit capabilities if it doesn’t matter? What are we doing with RFID, which was the sensor IoT of the moment, if it doesn’t matter? And the answer was that it mattered very much commercially. Billions and billions of dollars were being spent to know things in context. Building up the ad networks that now look like a Google or a Facebook or an Amazon — those ideas were already in the wind, and they were already in the data center, and the data quality and data science people already knew about them.

Michelle Finneran Dennedy: (09:37) And so we really were sort of sitting in this moment of time where it made sense to put together a business plan, and that’s what we did. And then Scott — I have to find the email, it’s somewhere in the world — he said, “Here’s some rope, kid. Go hang yourself. You’re the first CPO of Sun Microsystems.” So that’s how it all started for me.


How Has the CPO Role Changed Over the Years?

Satyen Sangani: (09:56) As the discussion around privacy has increased in importance, so too has the CPO role. I asked Michelle how the role has changed over the years.

Michelle Finneran Dennedy: (10:05) As the role has evolved over the years, there’s a real differentiation. There are some chief privacy officers who are very content to be compliance people in the legal department and advise. So they sort of sit back and they say, “Here’s my proclamation. Here’s what the law says.” The law is incredibly complicated. That’s a super tough job, but I think the interesting thing — and this is where it crosses over with the data radicals community — is there’s a new, true C-suite role emerging here, where our chief privacy officer is thinking about who should be authorized. What are our service offerings to create a community where my user is just as engaged in the provenance of their data as in the capability of my data stores and systems and algorithms?

Michelle Finneran Dennedy: (10:56) The person has to be pushed back into the center of the equation, and when we do that, and when we have done that over the ages for many, many, many different types of technology, there’s always business value there as well as your values that you bring to the table. So I think the real CPOs today are business leaders, they’re risk takers, they actually are challenging norms of things that people expect about secrecy and sequestration and pure compliance and cost centers, and they really are leading forward into the next generation of this which is just getting more and more complicated.

Satyen Sangani: (11:32) It’s so interesting to me that you describe the job as one where a privacy officer — which would seem to be a role rooted in conservatism — needs to take risk. And so, just thinking from a personal perspective, as people think about growing their careers, this idea of taking risk in roles where maybe risk is not the first order of operations is super interesting.

Satyen Sangani: (11:55) And I think the second is this idea of giving people the ability to opt in — to declare and say where their data should and can be used. I think it’s probably helpful then to bridge to — so maybe start with a very basic question. What is a data policy? What is a policy? What is a privacy policy? What is an example of that? Give us like three examples. People use these words, words like data governance, and they get really meta, and it’s hard for people to understand what these things are.

Michelle Finneran Dennedy: (12:29) This is where I go way old school. I go back to UML or C4. So if you’re thinking about a strategy or what a data policy should be, a data policy is a business rule. And sometimes that business rule is because an external force says, “I am the king of the EU, and I do not like the U.S. government, so transferring information from the EU to the U.S. should be, by default, prohibited, and here are the rules if you would like to get on my little data ship and go across the pond.” So having a policy from the highest level is “thou shalt not, or there will be a large penalty, or I can restrict your processing of this stuff if you don’t.” So that’s an outside, external thing saying when you transfer data, you have to have a series of reasons and attestations and rules and practices in place so that, A, that process is legal.

Michelle Finneran Dennedy: (13:31) Now remember, my four horsemen of the non-apocalypse are: one, it has to be moral. Second, it has to be ethical. Third, it has to be legal. And fourth, it has to be commercially sustainable. So when we think about it, there’s one rule that says there are rules about data transfer. Before you just start transferring stuff, you think about: what is it that I’m transferring? So I have a drone, or I’m a journalist, and I have a close-up photograph of a wounded young woman who’s lying on the ground. There are standards around being a journalist and what that means, and so the moral and ethical codes sort of blend here. I think of morality as the thing that is so anathema that everybody — if you’re not a horrible person — kind of agrees this is a bad thing.
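To make the “default-deny” transfer rule Michelle describes concrete, here is a minimal, hypothetical sketch of a data policy expressed as a business rule in code. The region names, safeguard labels, and function are illustrative assumptions, not any real framework or regulation.

```python
# Hypothetical sketch: an EU -> US transfer is prohibited by default and
# allowed only when documented safeguards (the "reasons and attestations")
# are attached to the request. Names and safeguard labels are illustrative.

APPROVED_SAFEGUARDS = {"standard_contractual_clauses", "adequacy_decision", "explicit_consent"}

def transfer_allowed(source_region: str, destination_region: str, safeguards: set[str]) -> bool:
    """Return True only if the transfer satisfies the business rule."""
    if source_region == "EU" and destination_region == "US":
        # Default-deny: the burden is on the requester to document a basis.
        return bool(safeguards & APPROVED_SAFEGUARDS)
    return True  # other routes would fall through to their own rules in a real system

print(transfer_allowed("EU", "US", set()))                             # False
print(transfer_allowed("EU", "US", {"standard_contractual_clauses"}))  # True
```

The point of the sketch is the ordering Michelle describes: the rule says no by default, and the requester has to attach the reasons and attestations before the transfer can proceed.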


The Distinction Between Morals and Ethics

Satyen Sangani: (14:22) You mentioned morals and ethics. Give us the distinction.

Michelle Finneran Dennedy: (14:26) Yeah. So ethics — I apologize to the philosophers because it’s probably totally wrong in their vernacular — but the way I look at it is that morality is more that gut-punch, grandma thing. The ethics for me, in business, I think of as your brand. So if you think about it — I was at Cisco, so the brand: stable, engineering-led, network stability — these kinds of big, loud, slow words are their ethics. So I would think about that: if the data transfer or the data activity or processing was something awful, no. If it’s something that is not within what my brand is, or could be, or is desired to be, no. Then you get to the legal. So now you’ve already got a set of business rules that become your data policy, and then commercially sustainable is: are we the kind of organization that can suck in a petabyte of data a day and do an analysis that is meaningful?

Michelle Finneran Dennedy: (15:27) Do I have the capabilities, and would it cost more to protect that than it would to do it? So you have that conversation — should we be doing this? — before we figure out, can we do this? Then the “can we do this” is more of a fundamental culture and capability thing, and the law. And so once you’ve figured that out, those business rules are your policy. We will transfer information because we have these capabilities that are known, and then they divide into sort of two paths. One we call a mechanism — that’s technology: software, hardware, I don’t care, all of it.

Michelle Finneran Dennedy: (16:03) The other one is people stuff. What do people have to know? Are you doing training? How often do I post something to remind people? That people part of it is really critical, too, but you can code the mechanisms, and then you ensure that your business rules are respected by the policies and procedures and the mechanisms, and then you get to quality. When those things are in place and the system starts running, it’s a soft system: there’s always change in it, and there are a number of different constituencies who are impacted by or impacting that system. And so you’re constantly not just looking at quality in terms of “Did I do the things I set out to do?” but also looking at quality continuously, asking whether the system is still operating, because a soft system creates problems as soon as it solves them.
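The split Michelle describes — one business rule implemented through technical mechanisms on one path and people procedures on the other, with quality checked continuously rather than once — can be sketched roughly as below. The field names and example controls are illustrative assumptions, not a prescribed structure.

```python
from dataclasses import dataclass, field

@dataclass
class DataPolicy:
    """A business rule plus the two paths that implement it."""
    rule: str                                            # the business rule itself
    mechanisms: list[str] = field(default_factory=list)  # technology: software/hardware controls
    procedures: list[str] = field(default_factory=list)  # people: training, reminders, sign-offs

policy = DataPolicy(
    rule="Cross-border transfers require documented safeguards",
    mechanisms=["TLS on all transfer endpoints", "region tags on every dataset"],
    procedures=["annual transfer training", "quarterly attestation by data owners"],
)

def quality_check(p: DataPolicy, controls_observed: set[str]) -> list[str]:
    """Continuous quality: report which declared controls are not actually in place."""
    declared = set(p.mechanisms) | set(p.procedures)
    return sorted(declared - controls_observed)

# Because the soft system drifts, this check runs on a schedule, not once.
print(quality_check(policy, {"TLS on all transfer endpoints", "annual transfer training"}))
```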


The Interplay Between Privacy and Privacy Engineering

Satyen Sangani: (16:55) That makes a ton of sense. And so it gives people who have these very sophisticated, complicated jobs a framework with which to act and navigate. Tell me a little bit about the interplay then between privacy and privacy engineering. So there are all of these techniques that you can use. It seems like the state of the art is changing constantly. There are all these words like anonymization and encryption and salting, and you get into a room with folks who are doing this stuff, and you’re just like, I … like, you know.

Michelle Finneran Dennedy: (17:27) Yeah.

Satyen Sangani: (17:27) There’s a lot of stuff. What are these things? How does it work? How do you decide when to use one versus the other? How do you think about these things? And I realize I’m asking you to summarize an entire field in like one answer.

Michelle Finneran Dennedy: (17:39) Thing number one is figuring out, with people that have a lot of passion around this stuff, what is real in the lab and what is real in the wild. And that doesn’t mean you always go second. For a lot of people, their answer to that question is to make sure someone has done it first, and then you go second. Sometimes it’s been a little lonely out there, but I am not afraid to go first and try things — and know that you’re trying something first, and try to do it with as much sandboxing or safety as you can, because you know you’re running an experiment. But do try some of this stuff that is now not just fiction or someone’s PhD dissertation, and figure out: what would that mean in the wild? What would that mean to govern? I almost look at it as another project.

Michelle Finneran Dennedy: (18:28) And so this goes to your point: it’s exactly the process that I’m leading you down, from business rules to privacy policy, which is your data business case, to mechanisms, and to privacy policies and procedures on the word-and-people side versus the zero-and-one side. That process I’m constantly running, and so I’m thinking about privacy by design as the outcome that public policy needs and wants. There are concepts like: it should be private by default. What does that mean if you’re on an airplane, or if you’re banking, or if you are just walking by a magazine rack that has a security camera? What does privacy by default mean? It shouldn’t be a zero-sum game.


The Definition of Privacy Engineering

Satyen Sangani: (19:13) A commonly misunderstood term in the privacy space is privacy engineering. Since she literally wrote the book on it, I asked Michelle to define it for us.

Michelle Finneran Dennedy: (19:23) There used to be — well, I think it still exists — professional engineering associations that engineers would be part of, and they would sign onto a code of conduct including morality clauses saying that, like a doctor, one of your first rules as a professional engineer — a PE — is not to do any harm, and to look at the materials available and solve the problem in the best way possible without doing harm. Privacy engineering, and the reason we picked the term privacy engineering, to me is taking stock of all the tools available — the people tools as well as the tech tools — and trying to get to that outcome of privacy by design. And so that is what engineers do. And I don’t mean double-Es and mechanicals and ceramics. I mean people who solve problems with the tools available, so privacy engineering for me is a very inclusive term.

Michelle Finneran Dennedy: (20:20) There are definitely folks out there who are sort of the Berkeley School of Privacy Engineering, and it’s the really heavy mathematical encryption and obfuscation — they can’t even pick a word that I can say — working on the probability of re-identification so that we can sort of prove the unprovable: that data is anonymous enough. So that’s one school, but I think privacy engineering is actually incredibly broad. Anyone who would protect the authorized processing of personal data according to moral, ethical, legal, and sustainable principles — you’re a privacy engineer in my book.
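One concrete, commonly used proxy for “anonymous enough” is a k-anonymity check: every combination of quasi-identifiers in a released data set must describe at least k people. This is a minimal sketch of that idea under assumed column names and an assumed threshold, not a description of how any particular school or product measures re-identification risk.

```python
import pandas as pd

def k_anonymity(df: pd.DataFrame, quasi_identifiers: list[str]) -> int:
    """Smallest group size across all quasi-identifier combinations."""
    return int(df.groupby(quasi_identifiers).size().min())

records = pd.DataFrame({
    "zip":  ["94107", "94107", "94107", "10001"],
    "age":  [34, 34, 34, 52],
    "diag": ["flu", "flu", "cold", "flu"],
})

k = k_anonymity(records, ["zip", "age"])
print(k)       # 1 here: the single 10001/52 record is unique, so re-identification risk is high
print(k >= 3)  # an illustrative release threshold; the real threshold is a policy decision
```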

Satyen Sangani: (20:58) Yeah. And so you’d have to be able to use all of these various techniques and algorithms somewhat flexibly in order to be able to understand how to apply them in the appropriate context, given the appropriate requirements, which makes a ton of sense.

Michelle Finneran Dennedy: (21:14) And there’s a lot of specialties. Just like you have an anesthesiologist with your heart surgeon, I’m not expecting electrical engineers to understand the nuances of the European Data Protection Board. And you shouldn’t expect me to understand the latest and greatest version 3.2.1.3 of whatever you’ve got going on in your world. It’s a matter of what are the utilities of the things that are real and are not still academic cocktail conversations.

Satyen Sangani: (21:41) What strikes me, though, as you think about this: one of the things we talk a lot about, and talked a little bit about with Bob Seiner, who was on a prior episode of the podcast, is this idea of data governance being sort of an agile, constantly changing, constantly evolving thing, right? And you think about privacy, and it’s again this agile, constantly changing, constantly evolving thing. And I think often the people who come to these roles think of themselves as rule makers and rule enforcers, and there’s this mindset that we just need to do it right once. And we talked about that theme earlier, but it seems like it keeps on emerging that there has to be this mindset of flexibility in the interaction pattern between these disciplines of data governance and data privacy and security and, of course, the data individual.

Satyen Sangani: (22:31) You said this really funny thing when we were talking earlier, prior to the recording where you sort of said, “Ah, sometimes people think of the privacy officer as the antichrist for data people.” And so you can just think of all of this as just more work like, “Ah, I’m just trying to get my algorithm to run.” Convince us that these two things are not at odds. Convince us that you can be a data organization with a rampant data culture while still really respecting privacy.

Michelle Finneran Dennedy: (22:58) I think there are guardrails for a reason, and sometimes we realize it’s nighttime, there’s no one else on the freeway, and we can go 85 miles an hour without endangering ourselves or the people around us. Other times you’re in a school zone and it’s dropoff time, and you probably shouldn’t go 85 miles an hour. I think that’s much clearer in the “meat world” where these rules are. In the data world, you’re so hungry to get the most data possible, to get the most different varieties of algorithm processed, to find either the anomaly or the hard norm, that I think we sometimes lose sight of the question: just because it is interesting for your algorithm, is it meaningful for the outcome? And shouldn’t we be on the same page, not just on rules and what you can’t do, but on why you are there? If you are there to get someone a better price on their reservation, does that mean you really need all of my data from age 20 till now, or will the last two years do?

Michelle Finneran Dennedy: (24:16) And that’s intentionally drawn around the pandemic. So the data around my hotel stays in the last two years: not great if you want to look at my past 25 years on the road. So sometimes the answer is, I do desire it — I won’t say I need it — but you want more data because you want a wider aperture of examination. And if you can make that case, and make the case that you’re not harming that person, “ooging” them out (technical term), creeping them out, and that you’re still serving the point of it, right? So again, in our hypothetical scenario, you’re trying to recommend hotel rooms and offerings, or maybe you’re the hotelier trying to figure out what your profits might be this next year, because money is even more complicated than data privacy — we’re just used to it. So understand that everybody’s got sort of a data budget as well as a monetary budget, and sometimes it’s spent in credibility, sometimes in the nuisance of how many times I’ve come to train you about a new thing that may or may not be relevant to you.

Michelle Finneran Dennedy: (25:24) So, too, is it true that there are limitations on what we can, should, and ought to be doing with data sets. And the question of ethics really comes in, I think. And data scientists more and more are being confronted with these issues. Do we use the Mengele medical data? Do we use that, or is it unethical not to use it, because these poor humans were put through torture? Do we take pictures to document what’s going on in a war zone? Or are we aware that young children are being transported into human slavery right now on those same battlefields, and that taking a picture of that injured young woman could be sacrificing her life for your need to have a view into a place while you’re sitting at home, safe, watching? So I think we need more of these ethical conversations about what the source of that data is.


The Sigma Rider Problem

Satyen Sangani: (26:18) The third component to this is something Michelle calls “the Sigma rider problem.”

Michelle Finneran Dennedy: (26:22) I used to have a podcast called the Privacy Sigma Rider. And it came out of a conversation I had with my very young child, who was going through some therapy for an immune disease. She was getting immunotherapy that made her vomit and her hair fall out. And she said, “Mama, why can’t I be normal? I just want to be normal. I don’t want to be the kid with special handling at school.” And so, as a dorky mom in Silicon Valley does, of course I drew her a bell curve, and I said, “Here’s normal. Normal is under this hump where we all learn how to brush our teeth and fold our clothes and make our beds.” Not so much for her at that time and that age, but capable.

Michelle Finneran Dennedy: (27:05) A lot of our algorithms are looking for that norm, but the reality is I’m not paid for how well I brush my teeth — I do a darn good job. I’m paid for what happens in the deviations. I’m paid for the extraordinary. And so if you’re only gathering more and more and more data to get to the norm, because we’re going to have offerings around the norm, you are missing the opportunity to get to the Sigma rider who loves being extraordinary and having extraordinary experiences. So that’s the other thing I say to my data science people: don’t forget that not everything has to go to the mean. Sometimes we want to go to the max.
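A rough illustration of the “Sigma rider” point: instead of modeling only the hump of the distribution, flag the observations that sit out in the tails, because that is often where the value — and the person — is. The two-standard-deviation threshold and the spend example are arbitrary assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(7)
spend = rng.normal(loc=100, scale=15, size=1_000)  # most customers cluster under the hump

mean, sigma = spend.mean(), spend.std()
sigma_riders = spend[np.abs(spend - mean) > 2 * sigma]  # the extraordinary cases in the tails

print(f"mean customer: {mean:.0f}")
print(f"riders beyond 2 sigma: {len(sigma_riders)} of {len(spend)}")
```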

Satyen Sangani: (27:45) There’s that great phrase: why be ordinary when you can be extraordinary? That story brings to mind the idea that data organizations have a rational self-interest — if they want to exist for the long term and keep the freedom they need — in doing this, both for reputational reasons and, obviously, out of the desire to do right, because most people come to science from that place. And that’s really helpful.

Michelle Finneran Dennedy: (28:13) And you can involve the data subjects a lot more than we have been. I mean, I think that consent is a very weak legal object and informed consent is hard to get, but I do think that as we create contextual experiences for people and have them participating knowingly — not having their activities sold without their knowledge, which is what has really caused all of this pushback and backlash against the known wrongdoers in that field — people do want to have that sense of community. People do want to participate in your context. They just don’t want to be taken advantage of. So I think we will have more data sets available to us if we are designing them in a place where we do have the notion of functional privacy in place.


How Blockchain and Web3 Will Influence Privacy in the Future

Satyen Sangani: (29:05) To close our conversation, I asked Michelle how new technologies like Blockchain and Web3 will influence privacy in the future.

Michelle Finneran Dennedy: (29:11) There’s so much promise and opportunity. I mean, there are sort of two minds. There’s the Twitter version of Web3, where we’re going to magically have democracy, and this is going to be: everyone gets to get rich. And we know that’s not true, just because we have 3,000 years of data that shows us that no, when people can take and hoard and exploit, they do. So I don’t think it’s going to be magical in that way. Where I’m really excited is that there are things — one of our funding partners and one of our clients at the same time, ABEX technologies, they do two things. So there’s the ABEX commodities side of the business, and then there’s ABEX technologies, and they do this incredible stuff with self-sovereign identities.

Michelle Finneran Dennedy: (30:01) With self-sovereign identity, rather than having the key-fob notion of “this is me as a soccer mom, this is me as a professional, this is me as a patient,” I think having a distributed ledger of known and permissioned artifacts, experience, and provenance is helpful. So for example, instead of having these EHRs that, to my knowledge, have never served anyone yet except a lot of lawyers who are getting wealthy — and I love my legal friends, and I’m glad you’re driving nice cars — the promise of an EHR, or electronic health record, was that your caregivers in every context would know that you’re allergic to this and that this is your disposition for that, even if you can’t speak for yourself, on and on, and you wouldn’t have to carry around a bunch of paperwork to show them these things, and you would have access to more information. I think there are aspects where you don’t pin your “Oh, I’ve got a wart that needs removing” onto the blockchain.

Michelle Finneran Dennedy: (31:06) First of all, it’s an eco disaster waiting to happen. So I think there are things that should go on a distributed ledger, and there are things that truly go on a blockchain. And for the blockchain things, we really have to think about a different sort of notion of permanency, I think. So, things like your house. Instead of getting a stack of paper that you sign — and I guarantee you, even as an attorney, I didn’t read everything that I signed when I bought my house — there are aspects of who owns this house now. Certain things are operational and don’t need to go on the blockchain, like whether I’ve paid my taxes or not, but there are certain truths that will always be true — that during this time period, Michelle Finneran Dennedy and her children lived here, on this place on the planet — and that is something that you can tag.

Michelle Finneran Dennedy: (32:00) Now, you don’t need to have my name on that necessarily. Person X owns this property. It is owned by a private entity. That goes on the blockchain. So I think that there’s so much stuff, and I’m looking forward to a day at PrivacyCode where, if you go into our world and you take down these artifacts of policy, you convert them into usable tasks, your developers develop, your quality people quality. When you publish a project or an update or whatever, you now have artifacts that are capable of going on your distributed ledger internally, so that everyone knows there’s a change that has to happen for your identity strategy, or what data sets are allowable and available, or a new data product that your data scientists come up with to do a different kind of analysis on a visual plane.

Michelle Finneran Dennedy: (32:54) Those types of things could very well go on a distributed ledger that is talking to other data sets without human intervention, because it is repeatable and it makes sense. Similarly, instead of regulators hiring tons of engineers for short-term stints doing public service — because they don’t pay as well as the private sector — you could actually have them scanning a distributed ledger and saying, “Are you who you say you are? What are the things that you’ve built in, and what are the versions of things?” And we can do a lot of good with that. So it’s a very, very long-winded answer, but I think I’ll have like 10 more years of answers to that.
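The idea of publishing policy artifacts to an append-only record that a regulator (or an internal reviewer) could scan can be sketched as a simple hash chain. This is an illustration of the concept only — not how PrivacyCode or any ledger product actually works — and the artifact names are made up.

```python
import hashlib, json, time

ledger = []  # in a real system this would be a shared, append-only ledger

def publish_artifact(artifact: dict) -> dict:
    """Append a policy artifact, chained to the previous entry by hash."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    body = json.dumps({"artifact": artifact, "prev": prev_hash, "ts": time.time()}, sort_keys=True)
    entry = {"body": body, "hash": hashlib.sha256(body.encode()).hexdigest()}
    ledger.append(entry)
    return entry

publish_artifact({"policy": "identity strategy v2", "status": "approved"})
publish_artifact({"policy": "new visual-analysis data product", "status": "in review"})

# A reviewer or regulator can verify nothing was altered after publication:
ok = all(e["hash"] == hashlib.sha256(e["body"].encode()).hexdigest() for e in ledger)
print(ok)  # True
```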

Satyen Sangani: (33:33) To me, it’s a huge opportunity in so many ways in the world of data, but also in the world of privacy, because ultimately it’s just a database, and whether you choose to have it totally encrypted or just acting as a more public ledger, it gives you a lot more transparency into how people are using data, and a lot more control.

Well, what a phenomenal conversation. And there’s so much more to explore, whether it’s blockchain or the interplays with data or with security and culture. And so it was great to have you on. Thank you, Michelle, for taking the time, and I look forward to seeing you again.

Michelle Finneran Dennedy: (34:09) Yeah. Thank you so much. It’s an absolute pleasure.

Satyen Sangani: (34:16) If you’re responsible for privacy, the decisions you make are not always going to be popular. Some people may feel like you’re revealing too much, and yet others may feel like it’s too little, but we’ve all heard the rule that with great power comes great responsibility. So if handling data in a moral and ethical way makes you a radical at your organization, then embrace it. Because if you’re like me, the most important stakeholder isn’t your shareholders or your data engineers. It’s your customers, the people whose data you’re using and the people you’re building your services and products for. This is Satyen Sangani, CEO and co-founder of Alation. Thank you for listening.

Producer Read: (34:57) This podcast is brought to you by Alation. Are you curious about metadata management? If so, then this white paper is for you. If not, then you should be. Learn how to evaluate leading metadata software by clicking the link in the podcast notes.
